Job Details

What you will do
Join us in the Procurement Execution Center (PEC) as a Data Engineer, as part of a diverse team of data and procurement professionals. In this role, you will be responsible for deploying and supporting the end-to-end management of our data, including ETL/ELT, DW/DL, data staging, data governance, and the different layers of data required to ensure successful BI & Reporting for the PEC. This role works with multiple types of data spanning multiple functional areas of expertise, including Fleet, MRO & Energy, Travel, and Professional Services, among others.

How you will do it
- Deploy data ingestion processes through Azure Data Factory to load data models as required into Azure Synapse.
- Build and design ETL/ELT processes with Azure Data Factory (ADF) and/or Python which, once deployed, will be executed daily and weekly.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using Azure SQL and ADF.
- Develop data models that enable DataViz, Reporting, and Advanced Data Analytics, striving for optimal performance across all data models.
- Maintain conceptual, logical, and physical data models along with corresponding metadata.
- Manage the DevOps pipeline deployment model, including automated testing procedures.
- Deploy data stewardship and data governance across our data warehouse to cleanse and enhance our data, using knowledge bases and business rules.
- Perform the necessary data ingestion, cleansing, transformation, and coding of business rules to support annual Procurement bidding activities.
- Support the deployment of a global data standard for Logistics.
- Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
- Support Rate Repository management as required (including Rate Card uploads to our DW).
- Other Procurement duties as assigned.

What we are looking for
- Bachelor's degree in Engineering, Computer Science, Data Science, or a similar field.
- Advanced working SQL knowledge and experience working with relational databases.
- Knowledge of DW/DL concepts, data marts, data modeling, ETL/ELT, data stewardship, distributed systems, and metadata management.
- Experience building and optimizing data pipelines, architectures, and data sets.
- Azure Data Engineering certification preferred (DP-203).
- ETL/ELT development experience (3 years); SSIS or ADF preferred.
- Ability to resolve ETL/ELT problems by proposing and implementing tactical and strategic solutions.
- Strong project management and organizational skills.
- Experience with object-oriented/functional scripting languages: Python, Scala, C#, etc.
- Experience with NoSQL databases is a plus to support the transition from On-Prem to Cloud.
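
For illustration only, the sketch below shows the kind of daily ETL/ELT step described above: a minimal Python job that extracts a flat-file export, applies simple cleansing rules, and bulk-loads the result into a staging table in an Azure Synapse dedicated SQL pool via pyodbc. This is not PEC code; the connection string, file name, column names, and table name (SYNAPSE_CONN_STR, spend_extract.csv, stg.supplier_spend) are hypothetical, and in practice such a step would typically be orchestrated and scheduled through ADF.

    # Illustrative sketch only; all names are hypothetical placeholders.
    import os

    import pandas as pd
    import pyodbc

    SYNAPSE_CONN_STR = os.environ["SYNAPSE_CONN_STR"]  # ODBC connection string to the Synapse SQL pool
    SOURCE_CSV = "spend_extract.csv"                    # daily extract from an upstream source system


    def extract() -> pd.DataFrame:
        """Read the raw extract produced by the upstream system."""
        return pd.read_csv(SOURCE_CSV, parse_dates=["invoice_date"])


    def transform(df: pd.DataFrame) -> pd.DataFrame:
        """Apply simple cleansing / business rules before staging."""
        df = df.dropna(subset=["supplier_id", "invoice_date", "amount"])
        df["category"] = df["category"].str.strip().str.upper()
        return df


    def load(df: pd.DataFrame) -> None:
        """Bulk-insert the cleansed rows into a staging table in Synapse."""
        rows = [
            (str(r.supplier_id), r.invoice_date.to_pydatetime(), r.category, float(r.amount))
            for r in df.itertuples(index=False)
        ]
        conn = pyodbc.connect(SYNAPSE_CONN_STR)
        try:
            cur = conn.cursor()
            cur.fast_executemany = True
            cur.executemany(
                "INSERT INTO stg.supplier_spend (supplier_id, invoice_date, category, amount) "
                "VALUES (?, ?, ?, ?)",
                rows,
            )
            conn.commit()
        finally:
            conn.close()


    if __name__ == "__main__":
        load(transform(extract()))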