Senior Data Engineer Job at Harry International

Senior Data Engineer

Job Summary
Experience: 6+ Years
Salary: 25 LPA
Location: Pune, Hyderabad
Job Description

Job Duties and Responsibilities:
  • Design and develop scalable solutions to store and retrieve high volumes of data in the cloud.
  • Deliver project and customer success by meeting deadlines, managing expectations, and delivering good-quality solutions.
  • Develop and maintain scalable data pipelines and build out API integrations to support continuing increases in data volume and complexity.
  • Perform the data analysis required to troubleshoot data-related issues and assist in their resolution.
  • Design data integrations and data quality frameworks for medium-complexity modules.
  • Ingest data from files, streams, and databases, and process it with Hive, Hadoop, and Spark (a sketch of this kind of work follows this list).
  • Design and develop distributed, high-volume, high-velocity, multi-threaded event-processing systems using Big Data technologies.
  • Implement scalable solutions to handle ever-increasing data volumes using big data and cloud technologies such as Apache Spark, Hadoop, and cloud computing platforms.
  • Investigate, identify, and establish new tools and processes for data warehousing, data quality assurance, reporting, business intelligence, data governance, and data cataloguing.
  • Set up reliable data ingestion pipelines for new data sources and integrate them effectively with existing data sets.
  • Work on migrating on-prem data warehouses to cloud platforms.
  • Assist in the development of reusable code and accelerators.
  • Train and mentor other members of the team.
  • Be open to learning new technologies quickly.
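
To give a concrete flavor of the ingestion-and-processing work described above, here is a minimal PySpark sketch. The storage paths, Kafka broker, topic, and column names are illustrative assumptions, not details from this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingestion-sketch").getOrCreate()

# Batch ingestion: read raw CSV files from cloud storage (path is hypothetical).
orders = (
    spark.read.option("header", "true")
    .option("inferSchema", "true")
    .csv("s3a://example-bucket/raw/orders/")
)

# Basic data-quality gate: drop rows missing the key, flag negative amounts
# (order_id, amount, and order_date are assumed columns).
clean = (
    orders.dropna(subset=["order_id"])
    .withColumn("is_valid", F.col("amount") >= 0)
)

# Persist as partitioned Parquet for downstream consumers.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/orders/"
)

# Streaming ingestion: the same API also consumes event streams, e.g. from
# Kafka (broker and topic are hypothetical; requires the spark-sql-kafka
# connector package on the classpath).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)
```
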
Required Skills and Experience:
  • Advanced SQL knowledge and experience working with a variety of databases.
  • Hands-on experience with Big Data tools and with designing and developing data pipelines on any cloud platform.
  • In-depth knowledge of Big Data design patterns and data processing patterns (batch/NRT/RT), with the ability to provide design and architecture for typical business problems.
  • Ability to work independently on modules (estimate, organize work, test, and contribute to the SDLC).
  • Collaborate with architects and leads to develop the application architecture and ensure coherence.
  • Ensure conformity to the non-functional requirements (NFRs) established for the project.
  • Design reusable components and implement good-quality code.
  • Hands-on experience with programming languages such as Python, Scala, and Java.
  • Experience with Big Data technologies such as Hadoop, Kafka, NoSQL databases
  • Familiarity with AWS/GCP cloud deployment models
  • Self-starter, challenger, and analytical thinker
  • Experience working on cloud data migration projects.
  • Working experience with at least one cloud platform (GCP/AWS/Azure) is a must.
  • Working experience with at least one of BigQuery, Snowflake, or Redshift.
  • Experience developing data pipelines using tools such as Spark, Python, AWS Glue, and Azure Data Factory.
  • Hands-on experience with Dataflow, Apache Beam, Spark, Hadoop, Kafka, BigQuery, Data Catalog, Airflow, etc. (see the orchestration sketch after this list).
  • Dynamic individual who demonstrates a very positive attitude.
  • Strong analytical skills and excellent written and verbal communication skills.
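
As a small illustration of the orchestration tooling listed above, the sketch below defines a daily Airflow DAG that submits a Spark job like the one sketched earlier. The DAG id, schedule, and job path are assumptions for illustration, not details from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Submit the Spark ingestion job (the script path is illustrative).
    run_spark_ingestion = BashOperator(
        task_id="run_spark_ingestion",
        bash_command="spark-submit /opt/jobs/ingest_orders.py",
    )
```
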


Salary: 25 LPA

Experience Required: Minimum 6 Years

Vacancy: 2 - 4 Hires
