Data Engineer - Bigdata Job in Xento Systems
4+ weeks ago
- Pune, Pune Division, Maharashtra
- Salary: Not Disclosed
- Full-time
- Permanent
Job Summary
What We Have In Store For You:
- Evaluate business needs and objectives.
- Gather, verify, and organize data from many sources.
- Create and maintain optimal data pipeline architecture.
- Help to build and manage data warehouse strategies.
- Perform the data analysis required to troubleshoot data-related issues and deliver state-of-the-art solutions to the problems identified.
- Use SQL, Python, Node.js, and other languages with modern cloud architectures to transform data into usable assets.
- Identify ways to improve data reliability, efficiency, and quality.
- Deploy sophisticated analytics programs, machine learning, and statistical methods.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Collaborate with analytics and business teams to improve data models.
- Implement processes and systems to monitor data quality, ensuring production data is always accurate.
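The pipeline and data-quality responsibilities above can be pictured, in miniature, as an extract/validate/load flow. The sketch below uses only plain Python; every function and record in it is an illustrative assumption, not something from the role itself:

```python
# Miniature extract -> validate -> load sketch (illustrative assumptions only).
import json

def extract(raw_records):
    """Parse raw JSON strings into dicts, skipping unparseable rows."""
    out = []
    for line in raw_records:
        try:
            out.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # in production: route bad rows to a dead-letter store
    return out

def validate(records):
    """Keep only records carrying the fields downstream consumers need."""
    required = {"id", "amount"}
    return [r for r in records if required <= r.keys()]

def load(records, sink):
    """Append validated records to a sink (a list stands in for a table)."""
    sink.extend(records)
    return len(records)

raw = ['{"id": 1, "amount": 9.5}', 'not json', '{"id": 2}']
table = []
loaded = load(validate(extract(raw)), table)
# loaded == 1: the malformed row and the row missing "amount" are both dropped
```

In a real pipeline the same three stages would sit on frameworks like the ones named later in this posting (Spark, Kafka, ETL tooling), with the monitoring step feeding the data-quality checks described above.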
You Fit the Bill If:
- Should have roughly 6-8 years of industry experience, with at least 3 years of hands-on data engineering experience.
- Should have practical, foundational knowledge of creating and managing data pipelines in current frameworks such as Spark, Kafka, and Hadoop.
- Should have a strong understanding of data warehouse techniques.
- Experience in data warehouse development and architecture.
- Should have experience working with ETL tools.
- Full-cycle implementation of data solutions on any major cloud (AWS, GCP, or Azure) is strongly desired; this covers ingestion, storage, compute, data exploration, and data serving at a minimum. Experience with AWS Glue is a plus.
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Should have the skills for modular programming and application development, preferably in Python. This is a key skill for data sharing/provisioning via APIs and data visualization using frameworks such as Flask, Django, or Falcon.
- Good understanding of SDLC and Agile Methodologies.
- Strong problem-solving skills and analytical skills.
- Ability to communicate verbally and in technical writing to all levels of the organization in a proactive, contextually appropriate manner.
- Strong teamwork and interpersonal skills at all levels.
Good to Have:
- Git, Docker
Experience Required: 2 to 4 Years
Vacancy: 2 - 4 Hires