**Data Engineer**
**Responsibilities**:
- Develop and maintain robust, efficient, and scalable ETL processes for data extraction, transformation, and loading tasks.
- Design, implement, and optimize data models to meet business requirements and ensure data integrity and accuracy.
- Collaborate with cross-functional teams, including Data Scientists, Software Engineers, and Business Analysts, to understand data needs and provide data-related solutions.
- Perform data analysis to identify and resolve data quality issues, inconsistencies, and performance bottlenecks.
- Build and maintain the data pipeline architecture to enable data ingestion from various sources into the data warehouse or data lake.
- Work closely with stakeholders to understand business objectives, identify data-related opportunities, and provide actionable insights.
- Develop and maintain documentation related to data processes, data models, and system architecture.
- Collaborate with the Data Governance team to ensure compliance with data privacy regulations and implement appropriate security measures.
**Skills and Qualifications**:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven work experience as a Data Engineer or in a similar role.
- Fluent in English.
- Strong knowledge and experience with ETL processes, data modeling, and data warehousing concepts.
- Proficient in SQL, with the ability to write complex queries and optimize their performance.
- Expertise in Python programming language and its associated data libraries (e.g., Pandas, NumPy) for data manipulation and analysis.
- In-depth understanding of relational database systems (e.g., PostgreSQL, MySQL) and experience in query optimization techniques.
- Hands-on experience with Databricks for data engineering, including building and managing data pipelines, and optimizing data processing workflows.
- Strong problem-solving and analytical skills, with the ability to troubleshoot and resolve data-related issues.
- Excellent communication skills and the ability to effectively collaborate with cross-functional teams.
- Knowledge of data governance and data security principles.
- Attention to detail and ability to adhere to project timelines and deadlines.
**Nice to Have Skills**:
- Experience with Informatica Intelligent Cloud Services (IICS) or similar cloud-based integration platforms.
- Experience designing and developing data integration workflows using IICS or similar tools.
- Experience with data integration performance tuning and optimization.
- Familiarity with the Azure cloud platform and experience with its data services (e.g., Azure Data Factory).
- Experience with big data technologies (e.g., Hadoop, Spark) and distributed computing frameworks.