We are seeking a versatile and experienced professional to fill the role of Data Engineer, with a focus on analytics and Big Data technologies.
This position requires a candidate who can move seamlessly between designing robust data pipelines and constructing efficient data warehousing solutions, with experience in reporting and data modeling.
The ideal candidate will have a passion for data architecture, a strong technical background, and the ability to contribute to both data engineering and data warehousing initiatives. We are looking for experience in analytics and Big Data, data modeling, and integration with different systems in the cloud.
We would like a candidate with experience in Spark, SQL, Python, Terraform, Oracle Cloud, Oracle Database, scripting, and data warehousing.

Career Level: IC4

Responsibilities

Data Engineering:
- Design, develop, and maintain scalable data pipelines and ETL processes to collect, process, and store large volumes of data from diverse sources.
- Implement and optimize distributed computing frameworks such as Apache Hadoop, Apache Spark, or Apache Flink to handle big data processing tasks efficiently.
- Collaborate with cross-functional teams to understand data requirements and deliver innovative solutions for data ingestion, transformation, and storage.
- Monitor and troubleshoot data pipeline performance issues, implementing solutions to improve reliability and efficiency.

Data Warehousing:
- Design dimensional data models and database schemas optimized for analytical queries and reporting needs.
- Develop and maintain ETL workflows to extract, transform, and load data from source systems into the data warehouse, ensuring data quality and consistency.
- Implement and optimize data warehouse architectures and indexing strategies for efficient data storage and retrieval.
- Collaborate with business analysts and stakeholders to understand reporting and analytics requirements and translate them into technical solutions.

Qualifications:

Technical Skills:
- Proficiency in programming languages such as Python, with experience in SQL databases.
- Hands-on experience with big data technologies such as Apache Spark or Apache Flink, and familiarity with related ecosystem tools.
- Experience designing and implementing dimensional data models for data warehousing solutions.
- Knowledge of ETL tools and techniques for data integration and transformation.

Analytical Skills:
- Strong analytical and problem-solving skills, with the ability to understand complex data requirements and translate them into technical solutions.
- Experience working with stakeholders to gather and analyze business requirements and deliver solutions that meet their needs.

Communication and Collaboration:
- Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
- Ability to communicate technical concepts to non-technical stakeholders and collaborate with colleagues from diverse backgrounds.