Company Description

Technology is our how. And people are our why. For over two decades, we have been harnessing technology to drive meaningful change.

By combining world-class engineering, industry expertise, and a people-centric mindset, we consult and partner with leading brands from various industries to create dynamic platforms and intelligent digital experiences that drive innovation and transform businesses.

From prototype to real-world impact - be part of a global shift by doing work that matters.

Job Description

Our data team has expertise across engineering, analysis, architecture, modeling, machine learning, artificial intelligence, and data science. This discipline is responsible for transforming raw data into actionable insights, building robust data infrastructure, and enabling data-driven decision-making and innovation through advanced analytics and predictive modeling.

As a Senior Data Engineer at Endava, you will be responsible for developing and implementing data pipelines, collaborating with data architects, and ensuring data quality and integrity.
You will also support data migration projects and contribute to agile development processes.

Responsibilities:

- Work closely with the Data Analyst/Data Scientist to understand evolving needs and define the data processing flows or interactive reports.
- Discuss with stakeholders from other teams to better understand how data flows are used within the existing environment.
- Propose solutions for the cloud-based architecture and deployment flow.
- Design and build processes, data transformations, and metadata to meet business requirements and platform needs.
- Design and propose solutions for the relational and dimensional models based on platform capabilities.
- Develop, maintain, test, and evaluate big data solutions.
- Focus on the production status and data quality of the data environment.
- Pioneer initiatives around data quality, integrity, and security.

Qualifications:

- 5+ years in data engineering.
- Proficiency in SQL and relational databases (e.g., MySQL, PostgreSQL).
- Expertise in ETL processes and tools (e.g., Apache NiFi, Airflow).
- Strong programming skills in Python, Java, or Scala.
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka).
- Cloud platform expertise (AWS, Google Cloud, Azure) and their data services.
- Skilled in data warehousing tools (e.g., Snowflake, BigQuery, Redshift).
- Knowledge of serverless processing.
- Strong analytical skills for structured and unstructured data.
- Some experience leading IT projects and effective stakeholder management.
- Advanced English level (B2 or higher).