Cloud Data Warehouse Architect Job at Ideas2IT
Cloud Data Warehouse Architect
- Chennai, Tamil Nadu
- Not Disclosed
- Full-time
- Permanent
About the Open Role
We are looking for a Cloud Data Warehouse Architect who has worked on large-scale data systems and has a strong understanding of data models, databases, and data pipelines. The architect will work with our software developers, data analysts, and data scientists to ensure that the platform architecture is robust and scalable enough to handle our analytical, BI, and data science requirements. The ideal candidate will have experience designing data platforms that use varied databases and incorporate complex data pipelines as part of large analytical systems, and must be self-directed and comfortable learning new concepts and technologies to support emerging data needs. You will understand product requirements and design a solution and data architecture that supports and scales with the product roadmap.
What's in it for you?
- A robust distributed platform to manage a self-healing swarm of bots on unreliable network/compute
- Large scale Cloud-native applications
- Document Comprehension Engine leveraging RNN and other latest OCR techniques
- Completely data-driven Low code platform
- You will get to learn cutting-edge technologies as you work with leading Silicon Valley startups.
- Work in a culture that values capability over experience, with continuous learning as a core tenet.
- You will get to bring your ideas to the table and make a significant difference to the success of the product instead of being a small cog in a big wheel.
If you have relevant experience, great! If not, it doesn't matter. We believe in hiring people with high IQ and the right attitude over ready-made skills. As long as you are passionate about building world-class enterprise products and understand whatever technology you work on in depth, we will bring you up to speed on all the technologies we use. Oh, BTW, did we mention that you need to be super smart?
What you will be doing here
- Create and maintain optimal data architecture, including data models/data structures and data pipelines
- Assemble large, complex data sets that meet functional/non-functional business requirements
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, NoSQL, and AWS big data technologies
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
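The extract-transform-load flow described in the responsibilities above can be sketched in a few lines. This is a minimal, illustrative example only: it uses Python's built-in sqlite3 module to stand in for production stores such as RDS or Redshift, and all table and column names are hypothetical.

```python
import sqlite3

# Hypothetical source and target stores; sqlite3 stands in for RDS/Redshift.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

# Extract: read rows from a small source table of raw events.
source.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
source.executemany("INSERT INTO events VALUES (?, ?)",
                   [(1, 10.0), (1, 5.5), (2, 7.25)])
rows = source.execute("SELECT user_id, amount FROM events").fetchall()

# Transform: aggregate spend per user.
totals = {}
for user_id, amount in rows:
    totals[user_id] = totals.get(user_id, 0.0) + amount

# Load: write the aggregate into the target (warehouse) table.
target.execute("CREATE TABLE user_spend (user_id INTEGER PRIMARY KEY, total REAL)")
target.executemany("INSERT INTO user_spend VALUES (?, ?)", totals.items())
target.commit()

print(target.execute("SELECT user_id, total FROM user_spend ORDER BY user_id").fetchall())
# → [(1, 15.5), (2, 7.25)]
```

In a real system the extract and load steps would target the services named below (Glue, DMS, Redshift), but the shape of the pipeline is the same.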
What could make you a great candidate for this position
Mandatory Requirements
- Advanced knowledge of and experience working with SQL and NoSQL databases as part of BI/analytical systems; experience implementing analytical/machine learning algorithms is a plus
- AWS Services: Glue Jobs, Data Migration Service, RDS (Postgres), S3
- Technologies/Languages: Oracle, SQL, Python, Spark, PySpark
- AWS Redshift
- Experience with data pipeline and workflow management tools such as Azkaban, Luigi, or Airflow is a plus
- Experience with healthcare data is a plus but not required.
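The dependency and workload management idea behind tools like Azkaban, Luigi, and Airflow, mentioned in the requirements above, can be sketched with the standard library's graphlib (Python 3.9+). The task names and dependency graph here are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run(name):
    # Placeholder for real work (a Glue job, a Spark step, a SQL script).
    return f"ran {name}"

# Execute tasks in dependency order, as a workflow scheduler would.
order = list(TopologicalSorter(deps).static_order())
log = [run(name) for name in order]
print(order)
# → ['extract', 'transform', 'load', 'report']
```

Real schedulers add retries, backfills, and parallel execution on top of exactly this topological-ordering core.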
- Experience: 7 to 10 years
- Number of hires: 2 - 4