Data Engineer (GCP) Job at Tiger Analytics
Data Engineer (GCP)
Tiger Analytics
4+ weeks ago
- Bengaluru, Bangalore Urban, Karnataka
- Not Disclosed
- Full-time
- Permanent
Job Summary
Job Description
About the role:
As a Data Engineer, you will build a variety of big data analytics solutions, including big data lakes. More specifically, you will:
- Design and build scalable data ingestion pipelines to handle real-time streams, CDC events, and batch data (see the sketch after this list)
- Execute high-performance data processing and harmonization for structured and unstructured data
- Schedule, orchestrate, and validate pipelines
- Design exception handling and log monitoring for debugging
- Make decisions on the tech stack and tooling
- Collaborate with business consultants, data scientists, and application developers to develop analytics solutions
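For illustration only, here is a minimal sketch of the kind of streaming ingestion pipeline the first responsibility describes, using PySpark Structured Streaming over Kafka. The broker address, topic name, schema fields, and GCS paths are all hypothetical placeholders, and the Kafka source assumes the spark-sql-kafka package is on the classpath.

```python
# Minimal sketch: ingest a real-time Kafka stream into a data-lake path.
# Broker, topic, schema, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("ingest-events").getOrCreate()

# Illustrative schema for the incoming JSON events.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("payload", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw stream from Kafka ("events" is a placeholder topic).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Parse the JSON payload and drop malformed records (basic validation).
parsed = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .where(col("event_id").isNotNull())
)

# Write to the lake with checkpointing so the pipeline can recover on failure.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "gs://my-bucket/lake/events/")
    .option("checkpointLocation", "gs://my-bucket/checkpoints/events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```

The checkpoint location is what makes the stream restartable after an exception, which is the usual starting point for the exception-handling and monitoring work the role also mentions.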
Job Requirement
Required Experience, Skills & Competencies:
- Hands-on experience with:
  - Hadoop ecosystem: HDFS, Hive, Sqoop, Kafka, ELK Stack, etc.
  - Spark, Scala, Python, and core/advanced Java
  - NoSQL databases, e.g. HBase, Cassandra, MongoDB
  - Relevant GCP components required to build big data solutions (see the sketch after this list)
- Good to know: Databricks, Snowflake
- Ability to develop and manage scalable Hadoop cluster environments
- Good understanding of data warehousing concepts, distributed systems, data pipelines, and ETL
- 5-10 years of tech experience, with at least 3 years in big data engineering

Designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
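As a hedged illustration of the GCP side of this stack, the sketch below reads a BigQuery table with the spark-bigquery connector (available by default on Dataproc), applies a simple harmonization step, and writes the result back. The project, dataset, table, and bucket names are hypothetical placeholders.

```python
# Minimal sketch: batch read/transform/write against BigQuery from Spark.
# All project, table, and bucket names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("harmonize-orders").getOrCreate()

# Read a raw source table via the spark-bigquery connector.
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.raw.orders")
    .load()
)

# A simple harmonization step: normalize a column name, drop bad rows.
clean = (
    orders.withColumnRenamed("ORDER_TS", "order_ts")
    .where(col("order_id").isNotNull())
)

# Write the curated result back; the temporary GCS bucket is required by
# the connector's indirect write path (placeholder name here).
(
    clean.write.format("bigquery")
    .option("table", "my-project.curated.orders")
    .option("temporaryGcsBucket", "my-temp-bucket")
    .mode("overwrite")
    .save()
)
```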
Experience Required:
Minimum 3 Years
Vacancy:
2 - 4 Hires