Data Engineer Job in Elasticrun
Data Engineer
Elasticrun
4+ weeks ago
- Pune, Pune Division, Maharashtra
- Not Disclosed
- Full-time
- Permanent
Job Summary
You will be someone who loves analysing raw data and applying algorithms to extract
crucial insights and tackle complex problems. You will be responsible for developing data
pipelines, creating clear reports and visualizations, and building real-world machine
learning applications. You will own the management of a range of database platforms (SQL
and NoSQL) in the cloud, scaling them to the needs of our internal applications and SaaS
platforms.
Responsibilities:
Develop and build software solutions.
Conduct data analysis and generate reports.
Perform thorough requirement analysis.
Coordinate projects end to end.
Skill Sets:
Data pipeline development: batch, streaming and distributed processing fundamentals with
on-the-job experience. Working knowledge of ETL/ELT tools, MapReduce and Spark (a minimal
pipeline sketch follows this list).
Information model design and development: analyse, design and build relational,
dimensional and document models.
Reports and visualization: should be able to convey a data story to end users with
impact. Hands-on experience with at least one of Tableau, Qlik, Power BI or D3.
Distributed databases: experience with Hadoop, MongoDB, and full-text search engine
configuration and management.
Machine learning: Python, scikit-learn, Pandas (see the second sketch after this list).
Knowledge of or experience with any security standards such as ISMS or PCI DSS would
be preferred.
Candidates should be aware of information security procedures and processes.
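To illustrate the batch pipeline skills listed above, here is a minimal PySpark ETL sketch. The file paths, column names and aggregation are hypothetical examples chosen for illustration, not part of the role description.

# Minimal PySpark batch ETL sketch: read raw CSV orders, clean them,
# aggregate per day, and write the result as Parquet.
# All paths and column names here are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_orders_etl").getOrCreate()

# Extract: load raw order events from a landing zone.
orders = spark.read.csv("/data/raw/orders/*.csv", header=True, inferSchema=True)

# Transform: drop malformed rows and aggregate order value per day.
daily = (
    orders
    .dropna(subset=["order_id", "order_ts", "amount"])
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"),
         F.countDistinct("order_id").alias("order_count"))
)

# Load: write the aggregate partitioned by date for downstream reporting.
daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_orders")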
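Similarly, for the machine learning skills named above, a small illustrative sketch using pandas and scikit-learn; the dataset, features and target are made up for the example.

# Illustrative pandas + scikit-learn sketch: train a simple classifier
# on tabular data. The CSV path, columns and target are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

df = pd.read_csv("orders_features.csv")  # hypothetical feature table
X = df[["total_amount", "order_count", "days_since_last_order"]]
y = df["churned"]                        # hypothetical binary target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate on the held-out split.
print(classification_report(y_test, model.predict(X_test)))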
Experience Required:
3 to 8 Years
Vacancy:
2 - 4 Hires