Client: Our client is a global investment firm that manages tens of billions of dollars in assets and provides advisory services focused on credit products.

Position overview: We are looking for a skilled Data Engineer to join our team and support the design, development, and maintenance of data solutions that drive critical business insights. This role requires expertise in data integration, transformation, and database management to enable high-performance data pipelines and analytics. The position is fully remote, offering flexibility and the opportunity to work with a dynamic team on impactful projects.

Technology stack: C#, Python, React, SQL, Azure

Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes to support analytics and reporting needs
- Collaborate with data scientists, analysts, and other stakeholders to define data requirements and implement solutions
- Ensure data quality, integrity, and accuracy through efficient data cleansing, transformation, and validation processes
- Monitor and optimize data systems for performance, reliability, and cost efficiency
- Maintain documentation for data architecture, pipelines, and ETL processes

Requirements:
- 3+ years of experience as a Data Engineer or in a similar role
- Proficiency in Python for data processing and automation; experience with libraries such as Pandas, Polars, NumPy, and SQLAlchemy
- Strong understanding of data warehousing concepts and architectures; experience designing, building, and maintaining data warehouses
- Strong proficiency in SQL for data querying, transformation, and optimization
- Experience with data integration tools and ETL processes (e.g., Apache Airflow, Talend, Informatica)
- Proficiency with dbt (data build tool) for transforming data in the warehouse; familiarity with dbt Cloud and its integration with various data warehouses
- Skilled in using Microsoft Fabric for data integration and orchestration
- Experience with reporting and analytics platforms (e.g., Power BI, Looker)
- Familiarity with cloud platforms such as AWS, Azure, or GCP and their data services (e.g., S3, Redshift, Azure Data Factory, BigQuery)
- Experience implementing data governance best practices to ensure data compliance and privacy
- Knowledge of data security practices and regulatory standards such as GDPR or CCPA
- Strong analytical skills with attention to detail; able to troubleshoot data issues efficiently
- Excellent collaboration and communication skills for cross-functional teamwork
- Self-driven and proactive, with the ability to work independently in a remote setting
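To illustrate the kind of cleansing, transformation, and validation work described above, here is a minimal sketch of a Pandas-based transform step. The table name, column names, and validation rule are illustrative assumptions, not details from the role.

```python
import pandas as pd

def transform_trades(raw: pd.DataFrame) -> pd.DataFrame:
    """Minimal cleanse/transform/validate step (illustrative only):
    drop incomplete rows, normalize types, and validate before loading."""
    # Cleanse: discard rows missing required keys
    df = raw.dropna(subset=["trade_id", "notional"]).copy()
    # Transform: normalize column types
    df["trade_date"] = pd.to_datetime(df["trade_date"])
    df["notional"] = df["notional"].astype(float)
    # Validate: notional amounts must be positive before loading downstream
    if (df["notional"] <= 0).any():
        raise ValueError("non-positive notional detected")
    return df

raw = pd.DataFrame({
    "trade_id": [1, 2, None],
    "trade_date": ["2024-01-02", "2024-01-03", "2024-01-04"],
    "notional": ["1000000", "2500000", "500000"],
})
clean = transform_trades(raw)
print(len(clean))  # only rows with a trade_id survive
```

In a production pipeline this step would typically be one task in an orchestrator such as Apache Airflow, with the same cleanse/transform/validate contract expressed as a dbt model where the warehouse is the execution engine.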