This is a 6-month project, with the possibility of a 6-month extension and, ultimately, the possibility of direct hire by a company in the USA.
**2-3 YEARS OF EXPERIENCE**
**Required Qualifications**
- **Bachelor's degree in Computer Science, Engineering, or a related field**
- **Previous experience in data engineering, with a focus on the extract and load stages of the pipeline**
- **Strong knowledge of advanced SQL, Python, JavaScript, and data modeling principles**
SUPPORT WILL BE PROVIDED FOR CONTINUED LEARNING:
- Experience with data quality and validation checks
- Experience with GCP, dbt, Dagster, BigQuery, Cloud Functions, Adverity, Cloud Storage, Pub/Sub, Cloud Build, Cloud Logging, Airtable, and GitHub
- Experience with data pipeline optimization for scalability, reliability, and performance
- Hands-on experience with structured and unstructured database design
- Experience in digital marketing data sets, including Google Ads, Microsoft Ads, Facebook Ads, Pinterest Ads, TikTok Ads, theTradeDesk, and Google Analytics is a plus
- Experience creating and maintaining APIs is a plus
**Skills**:
- Strong problem-solving skills and ability to troubleshoot complex issues
- Excellent communication and collaboration skills
- Able to quickly learn new systems and software programs with minimal training, documentation, or guidance
- Works well in a team environment, as well as independently
**Responsibilities**
- Design, develop, and implement pipelines and integrations to connect to various client databases, and ingest that data into a single repository for our cross-client, agency-wide database
- Build and maintain various DAGs for orchestration/scheduling
- Develop and implement data quality and validation checks
- Optimize data pipelines for scalability, reliability, and performance
- Collaborate with analysts and analytics engineers to ensure data is properly transformed and loaded into BigQuery
- Work with other teams to define requirements for data pipelines and integrations
- Monitor and troubleshoot data pipelines for errors and performance issues
- Implement security and compliance best practices to protect our data
- Maintain and enhance the existing technology stack
- Develop, enhance, and maintain end-user computing or semi-automated tools using Salesforce Datorama, Adverity, Python, SQL, Google BigQuery, Airtable, Google Sheets, Excel, Looker Studio, Tableau, or other relevant BI tools
- Document the data dictionary and data model that support the enterprise data lake
- Maintain reference lists, business rules, and other data documentation
- Maintain high standards of software quality via code standardization, code review, testing, deployment automation, and tooling
- Respond to support requests and troubleshoot data and pipeline issues
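The data quality and validation responsibilities above could take many forms; as a minimal sketch, here is one way row-level checks might gate records before loading into a warehouse table. The field names (`campaign_id`, `spend`, `date`) and the dict-based record format are illustrative assumptions, not details from this posting.

```python
# Hypothetical sketch: row-level data-quality checks that quarantine bad
# records before a warehouse load. Field names are illustrative only.
from datetime import datetime

REQUIRED_FIELDS = ("campaign_id", "spend", "date")

def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for one row (empty list = valid)."""
    errors = []
    # Presence check: every required field must be non-empty.
    for field in REQUIRED_FIELDS:
        if row.get(field) in (None, ""):
            errors.append(f"missing required field: {field}")
    # Type/range check: spend must be a non-negative number.
    spend = row.get("spend")
    if spend is not None and (not isinstance(spend, (int, float)) or spend < 0):
        errors.append("spend must be a non-negative number")
    # Format check: date must parse as YYYY-MM-DD.
    date = row.get("date")
    if date:
        try:
            datetime.strptime(date, "%Y-%m-%d")
        except (TypeError, ValueError):
            errors.append("date must be in YYYY-MM-DD format")
    return errors

def partition_rows(rows):
    """Split rows into (valid, rejected); rejected rows keep their errors."""
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejected.append((row, errs))
        else:
            valid.append(row)
    return valid, rejected
```

In a real pipeline, `partition_rows` would run inside an orchestration step (e.g. a scheduled DAG task), with valid rows loaded onward and rejected rows written to a quarantine table for review.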
Job type: Full-time, fixed-term contract
Contract duration: 6 months
Salary: $25,000.00 - $30,000.00 per month
Schedule:
- Monday to Friday
- 8-hour shift
Benefits:
- Flexible hours
- Option for a permanent contract
Experience:
- SQL: 2 years (Preferred)
- Python: 2 years (Preferred)
- JavaScript: 2 years (Preferred)
Language:
- Fluent English (Required)
Workplace: hybrid remote in 66220, San Pedro Garza García, N.L.