Lead Data Analyst Job at Optum
Lead Data Analyst
- Noida, Gautam Buddha Nagar, Uttar Pradesh
- Not Disclosed
- Full-time
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group and Optum, we want to make healthcare work better for everyone. This depends on hiring the best and brightest. With a thriving ecosystem of investment and innovation, our business in India is constantly growing to support the healthcare needs of the future.
Our teams are at the forefront of building and adapting the latest technologies to propel healthcare forward in a way that better serves everyone. With our hands at work across all aspects of health, we use the most advanced development tools, engineering, data science, AI and innovative approaches to make the healthcare system work better for everyone.
As a Fortune 4 business, we're one of the world's leading healthcare companies. There are no limits here on the resources you'll have or the challenges you'll encounter.
As a Lead Data Analyst, you will be responsible for developing complex data sources and pipelines into our data platform (i.e., Snowflake), along with other data applications (e.g., Azure, Terraform), automation and innovation.
Shift time: 3:30 pm to 1:00 am IST
Location: Noida, Sector 144
Primary Responsibilities:
- Create & maintain data pipelines using Azure & Snowflake as primary tools
- Create SQL stored procedures and functions to perform complex transformations (see the illustrative sketch after this list)
- Understand data requirements and design optimal pipelines to fulfil the use cases
- Create logical & physical data models to ensure data integrity is maintained
- Code management and CI/CD pipeline creation & automation using GitHub and GitHub Actions
- Tune and optimize data processes
- Design and build best-in-class processes to clean and standardize data
- Deploy code to the production environment and troubleshoot production data issues
- Model large-volume datasets to maximize performance for our Business Intelligence & Data Science teams
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
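The hands-on work described above centres on SQL transformations and SCD-style loads into Snowflake, orchestrated from Azure. The sketch below is a minimal, hypothetical illustration of one such step, assuming the snowflake-connector-python package and invented table, column, warehouse, and credential names; it is not an Optum implementation.

```python
# Hypothetical sketch: expire the current version of changed rows in a
# dimension table (SCD Type 2 style) by running a MERGE in Snowflake from
# Python. All identifiers below are illustrative assumptions.
import os

import snowflake.connector  # pip install snowflake-connector-python

SCD2_EXPIRE_AND_INSERT_NEW = """
MERGE INTO dim_member AS tgt
USING stg_member AS src
  ON tgt.member_id = src.member_id AND tgt.is_current = TRUE
WHEN MATCHED AND tgt.address <> src.address THEN UPDATE SET
  is_current = FALSE,
  valid_to   = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT (member_id, address, valid_from, valid_to, is_current)
  VALUES (src.member_id, src.address, CURRENT_TIMESTAMP(), NULL, TRUE)
"""

def run_scd2_step() -> None:
    # Credentials come from the environment here; a production pipeline would
    # typically resolve them via a Service Principal and Key Vault instead.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="EDW",
        schema="CORE",
    )
    try:
        cur = conn.cursor()
        # Expire changed current rows and insert brand-new members; a full
        # SCD Type 2 load would then INSERT the new versions of the rows
        # just expired.
        cur.execute(SCD2_EXPIRE_AND_INSERT_NEW)
        cur.close()
    finally:
        conn.close()

if __name__ == "__main__":
    run_scd2_step()
```

In practice, a step like this would typically be wrapped in an ADF pipeline activity or a Snowflake stored procedure and promoted through a CI/CD pipeline.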
Required Qualifications:
- Undergraduate degree or equivalent experience
- Bachelor's degree in Computer Science or similar
- 8-10 years of total experience, including a minimum of 4-6 years of strong hands-on relevant experience
- Excellent verbal and written communication skills
- Excellent knowledge of SQL
- Excellent knowledge of Azure services such as Blob Storage, Functions, Azure Data Factory, Service Principals, Containers, Key Vault, etc.
- Excellent knowledge of Snowflake architecture, features, and best practices
- Excellent knowledge of data warehousing & BI solutions
- Excellent knowledge of change data capture (CDC), ETL, ELT, slowly changing dimensions (SCD), etc.
- Hands-on experience with the following technologies:
- Developing data pipelines in Azure & Snowflake
- Writing complex SQL queries
- Building ETL/ELT/data pipelines using SCD logic
- Query analysis and optimization
- Analytical and problem-solving experience applied to big data datasets
- Data warehousing principles, architecture, and their implementation in large environments
- Experience working in projects with agile/scrum methodologies and high-performing team(s)
- Knowledge of different data modelling techniques such as star schema, dimensional models, and data vault is an advantage
- Experience in code lifecycle management and repositories such as Git and GitHub
- Exposure to DevOps methodology
- Good understanding of access control and data masking
Preferred Qualifications:
- Knowledge of and experience with Terraform, CI/CD pipelines, and automation is an advantage
- Automation and orchestration using Azure Data Factory (ADF)
- Experience creating real-time analytics pipelines using Snowpipe Streaming
- Experience developing optimized data models for visualization tools (e.g., Tableau, Power BI) is an advantage
- Exposure to and experience with other programming languages and frameworks such as Python, Spark, etc. is an advantage
- Hands-on experience with CI/CD pipelines using Git and GitHub Actions
- Understanding of United States Healthcare data and applicable regulations
Qualification: Undergraduate degree or equivalent experience
Experience: 8 to 10 years
Hires: 2 - 4