We Are PepsiCo

Join PepsiCo and Dare for Better! We are the perfect place for curious people, thinkers, and change agents. From leadership to the front lines, we're excited about the future and working together to make the world a better place.

Being part of PepsiCo means being part of one of the largest food and beverage companies in the world, with our iconic brands consumed more than a billion times a day in more than 200 countries.

Our product portfolio, which includes 22 of the world's most iconic brands, such as Sabritas, Gamesa, Quaker, Pepsi, Gatorade, and Sonrics, has been a part of Mexican homes for more than 116 years.

A career at PepsiCo means working in a culture where all people are welcome. Here, you can dare to be you. No matter who you are, where you're from, or who you love, you can always influence the people around you and make a positive impact in the world.

Join PepsiCo, dare for better.

Responsibilities

The Opportunity

The Data Engineer will work within a team of database developers, data analysts, data engineers, and software engineers to produce simpler, faster, and more insightful solutions for supply chain performance managers, decision makers, specialists, and frontline personnel.

Your Impact

As a Data Engineer, your responsibilities will include:

- Develop ETL pipelines, database tables, views, and indexes to transform, prepare, and stage data for user-facing tools, dashboards, and reports using cloud technologies (Databricks, Azure, DevOps).
- Perform data validation to ensure data pipelines meet specifications for the business applications.
- Work closely with other team members (business analysts, data engineers, Tableau and Power BI developers) to collaborate, design, plan work, and develop complete business solutions.
- Develop resilient data processes that include failure email notifications and detailed logging.
- Proactively monitor deployed ETL pipelines and resolve failures or issues as they arise.
- Support the team with ad hoc analytics and tool troubleshooting, and work directly with end users to resolve tool issues.

Qualifications

Who Are We Looking For?

Key skills and experience required:

- Bachelor's degree in information technology, engineering, mathematics, physics, or a related field.
- Fluency in English is a must.
- Intermediate SQL skills.
- Intermediate Python programming skills.
- Familiarity with Azure Databricks, PySpark, or Azure Data Factory.
- Experience using common ETL patterns to process data; understands the best practices and pitfalls of various ETL methods.
- Ability to translate business requirements or ideas into data architecture and data process designs.
- Familiarity with supply chain or logistics analytics is a plus.
- Experience with Tableau and/or Power BI is a plus.

If this is an opportunity that interests you, we encourage you to apply even if you do not meet 100% of the requirements.

What You Can Expect From Us:

- Opportunities to learn and develop every day through a wide range of programs