DW Architect Job at ThinkPalm Technologies Pvt. Ltd.

DW Architect

Job Summary

Data Futures knowledge and experience of AI and ML techniques and practices, in order to evangelise, facilitate, and extend the use of ML across all Docs and products.

  • End-to-end lifecycle of data warehousing, data lakes, and reporting.
  • Experience with maintaining and managing data warehouses.
  • Responsible for the design and development of large, scaled-out, real-time, high-performing data lake / data warehouse systems (including big data and cloud).
  • In-depth experience in data modeling and with business intelligence systems (dimensional modeling, data mining, predictive analytics).
  • Comfortable in multi-terabyte production environments.
  • Experience with distributed data management and data storage, including databases (relational, NoSQL, big data), data analysis, data processing, data transformation, high availability, and scalability.
  • Experience in end-to-end project implementation in the cloud (Azure / AWS / GCP) as an architect.
  • Big data specialist working in and with data lakes, data warehouses, streaming platforms, and cloud platforms.
  • Well-versed in the data domains (Data Warehousing, Data Governance, MDM, Data Quality, Data Catalog, Analytics, BI, Operational Data Store, Metadata, Unstructured Data, ETL, ESB).
  • Experience in or knowledge of data mining, machine learning, and data science.
  • Experience handling reporting tools such as BO and Tableau.
  • Knowledge of HDFS architecture.
  • Exposure to cloud DBs like Snowflake.
  • Good understanding of ETL tools like Informatica.
  • Understanding of scripting languages like Python is an added advantage.
  • Good experience in monitoring server usage and taking corrective actions.
  • Troubleshooting execution plans, advising on structuring the data warehouse, managing a team alongside vendors, knowledge of agile execution, resizing environments, and capacity planning.
  • The candidate must be flexible to work in different time zones and support user reporting requests.
  • Work across teams to troubleshoot application anomalies and fix data, user, and application issues by coordinating activities across technical teams, ETL teams, and business users.
  • Strong communication and interpersonal skills.
  • Experience with Scrum and Agile project methodologies.
  • Expertise in accurately estimating work effort and timelines.
  • Prepare appropriate documentation during every phase of the project life cycle.
  • Hands-on experience with Kafka, Flink, Spark, Snowflake, Airflow, NiFi, Oozie, Pig, Hive, Impala, and Sqoop, and with storage such as HDFS, object storage (S3, etc.), RDBMS, MPP, and NoSQL databases.
  • Strong data modelling expertise for data warehouse, data lake, delta lake, and data hub (Kimball or Inmon, Data Vault 2.0).
  • Hands-on experience in building real-time and batch processing solutions using open-source and cloud-managed services.
  • Strong experience with script-based ETL tools, preferably in Python, Scala, or Java.
  • Strong agile mindset and ability to collaborate efficiently with large teams.
  • Experience processing huge amounts of data, performance tuning, cluster administration, high availability and failover, and backup/restore.
  • Rich experience in data governance, data security, data quality, and provenance/lineage.
  • Experience in managing demanding stakeholders and working on multiple projects in parallel.
  • Prior experience of working with practices and horizontal functions is desirable; the candidate is expected to contribute to pre-sales and solutioning opportunities.
  • Good understanding of industry trends and products in DataOps, continuous intelligence, augmented analytics, and AI/ML.
  • Excellent communication and stakeholder management skills.
Experience Required:

Minimum 12 Years

Vacancy:

2 - 4 Hires
