Key Responsibilities:
- Design and implement Data Lakehouse architectures that meet business requirements and are scalable, reliable, and secure.
- Gather and document Master Data Management (MDM) requirements, liaising with business stakeholders to develop MDM strategy and architecture patterns.
- Analyze business requirements, then design and develop highly efficient, highly scalable data ingestion processes.
- Design and build modern data pipelines, including ETL mappings; write complex SQL queries with fluency.
- Develop and maintain data models, data dictionaries, and data flow diagrams.
- Work hands-on with virtualization, performance tuning, optimization, and scaling solutions from a storage/processing perspective.
- Maintain batch processing jobs, respond to critical production issues, and communicate proposal recommendations clearly to stakeholders.
- Research emerging technology solutions that reduce costs and risks, increase efficiency and security, and provide more value and capabilities.
- Implement and maintain data security and access controls.
- Support the Lead Data Architect by contributing improvements to the Data Architecture strategy and evolving best practices, templates, and processes.
- Work collaboratively with Solution Architects, Developers, Testers, and other Data Analytics team members to understand constraints and opportunities that may shape and influence Data Architecture deliverables.

**Experience**:
- Bachelor's degree in computer science, computer engineering, electrical engineering, systems analysis, or a related field of study, or equivalent job experience.
- Significant experience in Data Architecture, with a deep understanding of the data models, standards, and tools used across architecture and engineering.
- At least 2 years' experience designing and implementing Data Lakehouse architectures in Azure or AWS; hands-on experience with Databricks is desirable.
- Experience transforming traditional Data Warehousing approaches into Big Data-based approaches.
- Demonstrated experience using ETL tools such as IICS, ADF, AWS Glue, and SSIS with traditional approaches; experience with more modern real-time data pipeline solutions is desirable.
- Demonstrated effectiveness in executive presentation and escalation management.
- Demonstrated data modelling and data mapping experience.
- Experience with data virtualization and cataloguing is desirable.
- Experienced with Agile methodologies.

Skills and Attributes:
- Language Fluency: Must be fully fluent in English.
- Robust Data Solutions: Ability to manage complex data requirements and develop robust code that ensures data availability, integrity, quality, and security.
- Interpersonal Excellence: Outstanding interpersonal, teamwork, and facilitation skills to collaborate effectively across diverse teams.