MercerTech DnAi Platform engineering team is seeking candidates for the following position, based in the CDMX office and onsite 3 days a week: Sr. Data Platform Engineer

MercerTech DnAi Platform engineering team is seeking an experienced Data Platform Engineer with AWS cloud expertise and a strong background in the administration, development, and implementation of large-scale data projects on a hybrid data platform (both on-prem and cloud).

What can you expect?

- You will oversee the design, implementation, and maintenance of the data platform built on Kubernetes, AWS services, and other SaaS solutions.
- You will set up the platform and its toolkit (Databricks, Snowflake, IDMC) to the best possible standards so that different projects can work on it.
- You will collaborate with projects' business teams to understand their requirements and technical needs, and help them design their architecture using a combination of data tools aligned with MMC's technology and security standards.
- You will be involved in onboarding new technologies onto the platform and ensuring smooth operations through regular maintenance, governance, and guideline definition.
- You should be an expert in ETL processing toolsets, with the capability to guide developers in optimizing their jobs.

What is in it for you?

- A company with a strong brand and strong results to match.
- A culture of internal mobility, collaboration, and valued partnership with the business.
- Employee Resource Groups, which provide access to leaders, relevant volunteer and mentoring opportunities, and interactions with counterparts in industry groups and client organizations.
- Entitlement to vacation, floating holidays, time off to give back to your community, sick days, and national holidays (with early dismissal).

We will count on you to:

- Work with our partners to understand business and technology problems or opportunities.
- Conceptualize and implement the setup and management of the data platform built on Kubernetes, AWS services, and other cloud SaaS solutions such as Databricks, Snowflake, IDMC, and MongoDB Atlas across different regions.
- Configure and optimize the tech stack, including Spark, NiFi, Python, R, Databricks, IDMC by Informatica, Snowflake, AWS S3, Glue, and Athena, for efficient performance and scalability, creating a best-in-class data ingestion and processing platform.
- Understand the data landscape (i.e., tooling, tech stack, source systems, etc.) and work closely with development teams to improve data collection, quality, reporting, and analytics capabilities.
- Provide guidance on selecting the appropriate tools in the appropriate AWS regions to meet project objectives and performance requirements.
- Build a skilled team that can support projects in developing efficient data pipelines.