Cloud Data Engineer

Company:

Maswer

Location:

México

Offer details

We're hiring: Cloud Data Engineer // Advanced English level

Education and qualifications:

- BA/BS degree in Computer Science, Applied Mathematics, Information Technology, Data Science, or a similar field.
- Proven experience as a Cloud Data Engineer.
- Certifications like Google Certified Professional Data Engineer or AWS Certified Big Data are a plus.
- Advanced English is required.

Work experience:

- 5 or more years of demonstrated success in the specialized technical area.
- 5 or more years of experience working with international teams.
- 5 or more years of participation in projects and initiatives.

Responsibilities of the role:

- The specialized profile will be responsible for designing, building, and managing the information management architecture in a cloud environment. This includes setting up and maintaining large-scale data processing systems, creating ETL pipelines, and ensuring the data's security, scalability, and accessibility.
- Design, construct, install, test, and maintain highly scalable data management systems in the cloud.
- Build high-performance algorithms, prototypes, predictive models, and proofs of concept using ETL techniques.
- Develop and implement data collection systems and other strategies that optimize statistical efficiency and data quality.
- Build automated data validation and error-tracking systems to ensure data integrity and accuracy (a minimal pipeline sketch follows this list).
- Design cloud-based data storage infrastructure and ensure it meets the needs of our organization.
- Migrate data from legacy systems to cloud storage solutions.
- Implement effective metrics and monitoring processes.
- Ensure the data architecture supports the requirements of the business and complies with regulatory and security standards.
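
To make the scope of the ETL and validation responsibilities above concrete, here is a minimal, hypothetical sketch of an Airflow DAG with an extract, validate, and load step. The DAG id, the sample records, and the completeness rule are illustrative assumptions, not part of the role's actual stack.

```python
# Hypothetical sketch only: a minimal Airflow DAG with an extract -> validate -> load
# flow. The sample rows and the validation rule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (stubbed here as an in-memory list).
    rows = [{"id": 1, "amount": 10.5}, {"id": 2, "amount": None}]
    context["ti"].xcom_push(key="rows", value=rows)


def validate(**context):
    # Automated data validation: drop rows that fail a simple completeness check
    # and fail the task if every row was rejected.
    rows = context["ti"].xcom_pull(key="rows", task_ids="extract")
    valid = [r for r in rows if r["amount"] is not None]
    if not valid:
        raise ValueError("all extracted rows failed validation")
    context["ti"].xcom_push(key="valid_rows", value=valid)


def load(**context):
    # Load step: in a real pipeline this would write to Redshift, BigQuery, etc.
    valid = context["ti"].xcom_pull(key="valid_rows", task_ids="validate")
    print(f"loading {len(valid)} validated rows")


with DAG(
    dag_id="example_etl_with_validation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> validate_task >> load_task
```

In practice the extract and load steps would talk to the source systems and the cloud warehouse directly; XCom is used here only to keep the example self-contained.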

Knowledge:

- Cloud Platforms: Expertise in major cloud platforms such as AWS (Amazon Web Services), Microsoft Azure, or Google Cloud Platform (GCP), including their managed ETL services (e.g., AWS Glue, GCP Dataflow, Azure Data Factory). Mastery of at least one platform's core services, such as EC2 or S3, and deep knowledge of its ecosystem.
- ETL Tools: Experience with ETL tools such as Informatica, Talend, DataStage, or Microsoft SSIS; cloud-native tools such as AWS Glue or Google Cloud Dataflow are also relevant.
- Big Data Technologies: Familiarity with distributed systems such as Hadoop, Hive, Spark, or other big data technologies.
- Data Warehousing: Knowledge of data warehousing concepts and cloud-based solutions such as Amazon Redshift, Google BigQuery, or Snowflake.
- Programming Languages: Proficiency in languages such as Python, Java, or Scala for scripting and automation; strong SQL is essential for database interactions.
- SQL and NoSQL Databases: Deep understanding of SQL and NoSQL databases such as MySQL, PostgreSQL, Oracle, MongoDB, or Cassandra.
- Data Pipeline Tools: Experience with data pipeline tools such as Apache Beam, Airflow, or Luigi (see the short Beam example after this list).
- Data Modeling: Knowledge of data modeling and data architecture principles, including conceptual, logical, and physical models, and the ability to design, implement, and maintain optimal data pipeline architecture.
- Data Security: Understanding of data privacy standards, encryption methods, and data security principles and techniques in cloud environments.
- DevOps and Agile Methodologies: Familiarity with Agile practices and CI/CD principles and tools (e.g., Jenkins, Git, Docker, Kubernetes) to automate ETL pipelines.
- Performance Tuning: Ability to optimize and improve the performance of ETL processes, databases, and data queries.
- Data Visualization Tools: Although not strictly an ETL task, knowledge of data visualization tools such as Tableau, Power BI, or Looker is helpful for presenting data insights.
- Real-Time Data Processing: Familiarity with real-time data processing tools such as Apache Kafka, Apache Flink, or Google Cloud Pub/Sub.
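
As a short, hedged illustration of the pipeline tooling listed above, the following Apache Beam sketch reads delimited text, drops malformed rows, and writes the clean records back out. The file paths, the two-column schema, and the parsing rule are assumptions made for the example.

```python
# Hypothetical sketch only: a minimal Apache Beam batch pipeline that reads CSV
# lines, filters out malformed records, and writes the clean rows back out.
# The input/output paths and the "id,amount" schema are illustrative assumptions.
import apache_beam as beam


def parse_line(line: str):
    # Expect "id,amount"; return a dict, or None when the row is malformed.
    parts = line.split(",")
    if len(parts) != 2:
        return None
    try:
        return {"id": int(parts[0]), "amount": float(parts[1])}
    except ValueError:
        return None


def run(input_path: str = "input.csv", output_path: str = "clean"):
    with beam.Pipeline() as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText(input_path)
            | "Parse" >> beam.Map(parse_line)
            | "DropInvalid" >> beam.Filter(lambda row: row is not None)
            | "Format" >> beam.Map(lambda row: f"{row['id']},{row['amount']}")
            | "Write" >> beam.io.WriteToText(output_path)
        )


if __name__ == "__main__":
    run()
```

Run locally this uses Beam's default DirectRunner; on GCP the same pipeline could be submitted to Dataflow by changing the pipeline options.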

Offer:

- Base salary.
- Grocery vouchers.
- Major medical insurance (SGMM).
- Life insurance.
- Savings fund.
- Punctuality and attendance bonus.
- Statutory benefits from day one.

