Senior Data Engineer – Data Fabric Practice

Gluo, an Orium Company, is a leading eCommerce agency with over 20 years in the market. We implement eCommerce platforms, payment gateways, search engines, content management systems, ERPs, and CRMs, and we also develop new commerce functionality from scratch. We specialize in Headless and Composable Commerce, creating flexible, connectable, and scalable stores that enhance the user experience.

We're a remote company based in Mexico, and our team of 100+ brings together strategists, architects, designers, and developers to craft innovative digital experiences.

About the Opportunity

We are looking for a Senior Data Engineer to join our team. This role will play a critical part in designing and implementing data fabric solutions, enabling real-time data orchestration, and supporting enterprise analytics.

Responsibilities

- Shape the Data Fabric Practice: Contribute to the development of accelerators, frameworks, and best practices for our data fabric offerings.
- Build Modern Data Architectures: Design and deploy data fabric solutions that unify and simplify access to enterprise data; enable seamless integration of data across diverse systems to support analytics, AI/ML, and decision-making.
- Enable Real-Time Data Synchronization: Develop event-driven architectures using tools like Kafka, EventBridge, or Pub/Sub to ensure real-time data flow across platforms.
- Collaborate with Cross-Functional Teams: Partner with stakeholders across engineering, data science, and business teams to align data solutions with strategic goals; work with cloud providers (e.g., Google Cloud, AWS) and integration partners to deliver innovative solutions.
- Optimize the Data Ecosystem: Ensure the scalability, reliability, and performance of data pipelines and infrastructure to meet enterprise needs; automate complex data integration processes.

Skills and Qualifications

- 5+ years of experience in data engineering, with a strong focus on modern data ecosystems.
- Expertise in cloud platforms such as Google Cloud, AWS, or Azure.
- Hands-on experience with ETL processes, real-time data pipelines, and integration tools (e.g., Apache Kafka).
- At least one relevant certification is required; examples include AWS Certified Data Analytics – Specialty, Google Professional Data Engineer, and Confluent Certified Developer for Apache Kafka.
- Deep knowledge of data modeling, data architecture, and performance optimization.
- Familiarity with AI/ML use cases and with enabling analytics-ready data pipelines.
- Exceptional communication and problem-solving skills.
- Ability to collaborate with diverse teams, aligning technical execution with business priorities.
- Leadership experience, or an interest in mentoring junior team members.
- Fluency in both Spanish and English, with excellent written and verbal communication skills.

Something else…

We care a lot about our culture and the people who are part of the team, so there are certain characteristics we look for in each person.