**Required Qualifications**
- Bachelor's degree in Computer Science, Engineering, or a related field
- Previous experience in data engineering, with a focus on the extract and load stages of the pipeline
- Experience with GCP, dbt, Dagster, BigQuery, Cloud Functions, Adverity, Cloud Storage, Pub/Sub, Cloud Build, Cloud Logging, Airtable, and GitHub
- Strong knowledge of advanced SQL, Python, JavaScript, and data modeling principles
- Experience with data quality and validation checks
- Experience with data pipeline optimization for scalability, reliability, and performance
- Hands-on experience with structured and unstructured database design
- Experience with digital marketing data sets, including Google Ads, Microsoft Ads, Facebook Ads, Pinterest Ads, TikTok Ads, theTradeDesk, and Google Analytics, is a plus
- Experience creating and maintaining APIs is a plus

**Skills**
- Strong problem-solving skills and the ability to troubleshoot complex issues
- Excellent communication and collaboration skills
- Able to quickly learn new systems and software programs with minimal training, documentation, or guidance
- Works well in a team environment as well as independently

**Responsibilities**
- Design, develop, and implement pipelines and integrations to connect to various client databases, ingesting into a single base for our cross-client, agency-wide database
- Build and maintain DAGs for orchestration and scheduling (see the illustrative sketch at the end of this posting)
- Develop and implement data quality and validation checks
- Optimize data pipelines for scalability, reliability, and performance
- Collaborate with analysts and analytics engineers to ensure data is properly transformed and loaded into BigQuery
- Work with other teams to define requirements for data pipelines and integrations
- Monitor and troubleshoot data pipelines for errors and performance issues
- Implement security and compliance best practices to protect our data
- Maintain and enhance the existing technology stack
- Develop, enhance, and maintain end-user computing or semi-automated tools using Salesforce Datorama, Adverity, Python, SQL, Google BigQuery, Airtable, Google Sheets, Excel, Looker Studio, Tableau, or other relevant BI tools
- Document the data dictionary and data model that support the enterprise data lake
- Maintain reference lists, business rules, and other data documentation
- Maintain high standards of software quality via code standardization, code review, testing, deployment automation, and tooling
- Respond to support requests and troubleshoot data and pipeline issues

Job type: Full-time

Salary: $25,000.00 - $30,000.00 per month

Schedule:
- Monday to Friday
- 8-hour shift

Benefits:
- Flexible hours
- Option for a permanent contract

Experience:
- SQL: 2 years (Preferred)
- Python: 2 years (Preferred)
- JavaScript: 2 years (Preferred)

Language:
- Fluent English (Required)

Work location: Hybrid remote in 66220, San Pedro Garza García, N.L.
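
To give candidates a feel for the stack, here is a minimal sketch of the kind of Dagster pipeline this role builds: an extract step, a data quality check, and a load into BigQuery, wired into a scheduled DAG. All names (the project, dataset, table, and sample fields) are hypothetical placeholders, not the agency's actual implementation.

```python
# A minimal, hypothetical Dagster pipeline sketch: extract -> validate -> load.
from dagster import Definitions, ScheduleDefinition, asset, define_asset_job
from google.cloud import bigquery


@asset
def raw_ads_data() -> list[dict]:
    # Placeholder extract step; in practice this would pull from a client
    # source (e.g. an Adverity export or an ads platform API).
    return [{"campaign_id": "c-1", "spend": 125.0}]


@asset
def validated_ads_data(raw_ads_data: list[dict]) -> list[dict]:
    # Simple data quality check: drop rows with missing or negative spend.
    return [row for row in raw_ads_data if row.get("spend", -1) >= 0]


@asset
def ads_data_in_bigquery(validated_ads_data: list[dict]) -> None:
    # Load the validated rows into a hypothetical BigQuery table.
    client = bigquery.Client()
    client.insert_rows_json("my_project.agency_lake.ads_data", validated_ads_data)


daily_job = define_asset_job("daily_ads_job", selection="*")

defs = Definitions(
    assets=[raw_ads_data, validated_ads_data, ads_data_in_bigquery],
    jobs=[daily_job],
    # Run the whole DAG every morning at 06:00.
    schedules=[ScheduleDefinition(job=daily_job, cron_schedule="0 6 * * *")],
)
```

In a real deployment the quality checks would be richer (schema validation, freshness, row counts) and the load step would typically stage through Cloud Storage, but the asset-graph structure shown above is the core orchestration pattern.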