Data Engineer (AWS or Snowflake)
Cloud Kinetics Technology Solutions Private Ltd
4+ weeks ago
- Hyderabad, Telangana
- Not Disclosed
- Full-time
- Permanent
Job Summary
- Identify, propose, and implement advanced, best-of-breed solutions to complex data engineering problems at scale
- Shape and advise on detailed technical design decisions for the design, build, and launch of efficient, reliable distributed data pipelines that move and transform data
- Design and develop new systems in partnership with software engineers to enable quick and easy data consumption
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Guide and direct junior developers
- Be a technology ambassador: contribute via Git, write blogs, and act as a spokesperson at customer and industry events
Requirements
- Possess a strong technology foundation
- Platform Architecture, including containerization
- Databases
- Data Pipelines Engineering and Architecture
- Infrastructure Deployment and Management
- Have extensive hands-on experience across the data pipeline
- Ingestion: batch, streaming or both
- ETL/ELT
- Repository architecture and implementation
- Deep expertise, with the ability to mentor, in at least one programming language (such as Python, Scala, or R) as well as SQL and Bash
- Proven ability to envision and build platforms that can scale far beyond their initial footprint
- Prior knowledge of CI/CD and agile development methodologies would be a plus
- Prior experience migrating to the cloud from legacy Hadoop, Teradata, or Cloudera environments would be a big plus
- Great aptitude for collaboration combined with excellent communication and presentation skills
- Have deep experience and associated certifications in at least one of the platform tracks below
- Required: AWS Glue, AWS Redshift, AWS S3, Athena, MSK or Apache Kafka
- Preferred: AWS Certified Data Analytics Specialty or AWS Certified Big Data Specialty
- Must Have: Any combination of Snowflake-native tools (COPY, Snowpipe, Streams, etc.), third-party tools (dbt, Fivetran, Matillion, Immuta, Collibra, etc.), or open-source tools (Airbyte, Debezium, etc.)
- Preferred: SnowPro Core & SnowPro Advanced: Data Engineer
- Nice to Have: AWS Data Pipeline, Lake Formation, EMR, Data Exchange, or any of the third-party data integration tools (dbt, Fivetran, Matillion, Immuta, Collibra, etc.) or open-source tools (Airbyte, Debezium, etc.)
- Nice to Have: AWS Solutions Architect Professional or Associate
- Nice to Have: AWS Glue, Informatica ETL, or other data integration and ETL/ELT tool sets
- Nice to Have: SnowPro Advanced: Administrator or Architect
Experience Required: 3 to 7 Years
Vacancy: 2 - 4 Hires