Data Architect Job at Toll Group
Data Architect
- Pune, Pune Division, Maharashtra
- Not Disclosed
- Full-time
- Permanent
Job Description: At Toll Technology Centre, Pune (India), the Data Architect is responsible for designing and implementing innovative, high-performance and scalable reporting/analytics solutions tailored to solving business issues by leveraging data across the Toll Group. The Data Architect will focus on managing, transforming, analysing and visualising large sets of data to turn information into insights using multiple platforms.
As a Data Architect, you will be responsible for:
- Developing an Enterprise Data Model and data architecture patterns for data movement and storage within the Toll organisation, and supporting BU projects in using these artefacts to build more effective data systems.
- Defining and building the data pipelines that will enable faster, better, data-informed decision-making within the business.
- Architecting, designing, implementing and/or supporting complex application architectures.
- Applying technologies to solve big data problems and developing innovative big data solutions.
- Designing and developing highly scalable distributed analytics solutions for efficient analysis and insight generation, using a range of open source and proprietary tools.
- Providing input into BI & Analytics strategy based on identified trends and predicted outcomes.
- Adopting and implementing a continuous improvement program to enhance the BI & Analytics service offering and standards, in line with the principle of better, simpler and more efficient.
- Keeping up with industry trends and best practices on new and improved data engineering strategies that will drive the BI & Analytics CoE's performance, leading to operational efficiencies and improvements.
- Creating design specifications that demonstrate an understanding of most interfacing systems and supported business processes.
- Developing prototypes and proofs of concept for the selected analytics solutions.
- Leading innovation through exploration, benchmarking, making recommendations, and implementing big data technologies for BI & analytics platforms.
- Testing and validating the accuracy of data transformations and data verification used in BI & analytics solutions, including unit testing, user acceptance testing and performance testing.
- Promoting and supporting proper data governance and quality.
- Analysing complex data elements and systems, data flows, dependencies and relationships in order to contribute to conceptual, physical and logical data models.
- Identifying data sources relevant to solving business problems and helping design the optimal combination of data sources and analytical techniques.
- Understanding the quality of sourced data and its management, and liaising with data scientists, report developers and other BI & Analytics CoE experts to manage the impact of data quality issues.
- Driving the collection of new data and the refinement of existing data sources.
- Understanding and safeguarding Toll master data.
- Evaluating the needs and requirements of projects or problems and providing technical expertise in the development of solutions within the BI & Analytics CoE.
- Ensuring solutions meet functional and non-functional requirements, including reliability, scalability, maintainability, cost to deliver, etc.
- Interactively analysing and manipulating data using a variety of data analysis and data mining tools.
- Translating visualisation designs into physical solutions.
- Ensuring effective communication and stakeholder management through the delivery cycle.
- Maintaining solution documentation as appropriate.
- Complying with Toll policy and procedures in line with risk and security protocols.
To be successful in this role, you will need:
- Total relevant experience in the range of 8 to 12 years.
- Proficiency in ADLS Gen 2, Azure Data Factory, Azure Databricks, Azure Synapse, Azure Event Hub and Azure Stream Analytics; familiarity with SQL Server, Azure Blob & File Storage, Azure DevOps, and Azure Purview or an equivalent data catalogue tool.
- Experience in the architecture, design, implementation and/or support of complex application architectures (i.e. an architectural sense for connecting data sources, data visualisation, structured and unstructured data, etc.).
- Experience working with relational databases such as SQL Server and Oracle.
- Experience with relational data modelling on database platforms such as Oracle, SQL Server, Sybase or DB2.
- Prior experience with ETL tools such as Informatica, SSIS or ODI will be a plus.
- Experience managing ETL and reporting metadata, and migrating it to cloud services such as AWS/Azure, is a big plus.
- Experience working in an agile environment.
- Experience working on both on-premises and cloud-based platforms.