Data Architect Job in Growtharc Technologies
Data Architect
- Bengaluru, Bangalore Urban, Karnataka
- Not Disclosed
- Full-time
Job Summary:
We are looking for a highly skilled and experienced Data Architect to join our team. The ideal candidate will have a deep understanding of big data technologies and experience working with Hadoop, Python, Snowflake, and Databricks. As a Data Architect, you will be responsible for designing, implementing, and managing complex data architectures that support our business needs and objectives.
Key Responsibilities:
- Design and Architecture:
  - Design scalable and efficient data architecture solutions to meet the business's current and future data needs.
  - Lead the development of data models, schemas, and databases that align with business requirements.
  - Architect and implement solutions on cloud platforms such as AWS, Azure, or GCP.
- Data Management:
  - Develop and maintain data pipelines and ETL processes using Hadoop, Databricks, and other tools.
  - Oversee data integration and data quality efforts to ensure data consistency and reliability across the organization.
  - Implement data governance and best practices for data security, privacy, and compliance.
- Collaboration and Leadership:
  - Work closely with data engineers, data scientists, and business stakeholders to understand data requirements and translate them into technical solutions.
  - Provide technical leadership and mentorship to junior data engineers and architects.
  - Collaborate with cross-functional teams to ensure data solutions align with business goals.
- Optimization and Performance:
  - Optimize existing data architectures for performance, scalability, and cost-efficiency.
  - Monitor and troubleshoot data systems to ensure high availability and reliability.
  - Continuously evaluate and recommend new tools and technologies to improve data architecture.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field. A Master's degree is preferred.
- 10+ years of experience in data architecture, data engineering, or a related field.
- Proven experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, HBase).
- Strong programming skills in Python for data processing and automation.
- Hands-on experience with Snowflake and Databricks for data warehousing and analytics.
- Experience with cloud platforms (AWS, Azure, GCP) and their data services.
- Familiarity with data modeling tools and methodologies.
Skills:
- Deep understanding of big data technologies and distributed computing.
- Strong problem-solving skills and the ability to design solutions to complex data challenges.
- Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.
- Knowledge of SQL and database performance tuning.
- Experience with CI/CD pipelines and automation in data environments.
Preferred Qualifications:
- Certification in cloud platforms, such as AWS Certified Data Analytics, Google Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate.
- Experience with additional programming languages such as Java or Scala.
- Knowledge of machine learning frameworks and their integration with data pipelines.
Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field. A Master's degree is preferred.
Experience: Minimum 10 years
Hires: 2 - 4