Data Engineer GCP

Company: Sequoia Connect


Job details

Our client represents the connected world, offering innovative and customer-centric information technology services and solutions, enabling Enterprises, Associates, and Society to Rise. Our client is a USD 4.0 billion company with 107,100+ professionals across 90 countries, helping over 800 global customers, including Fortune 500 companies. Its innovation platforms and reusable assets connect across a number of technologies to deliver tangible business value to its stakeholders. Our client is also among the Fab 50 companies in Asia per the Forbes 2014 list. Our client is part of the USD 16.9 billion Group that employs more than 200,000 people in over 100 countries. The Group operates in the key industries that drive economic growth, enjoying a leadership position in tractors, utility vehicles, information technology, after-market, and vacation ownership.

We are currently searching for a **Data Engineer GCP**.

**Responsibilities**:
- Design and build production data engineering solutions that deliver data pipeline patterns using the following Google Cloud Platform (GCP) services.
- In-depth understanding of Google's product technology and underlying architectures.
- BigQuery (warehouse / data marts): thorough understanding of BigQuery internals to write efficient queries for ELT needs, create views and materialized views, create reusable stored procedures, etc.
- Dataflow (Apache Beam): reusable Flex templates and data-processing frameworks in Java for both batch and streaming needs (a minimal sketch follows this section).
- Pub/Sub, Kafka, Confluent Kafka: real-time streaming of database changes or events.
- Experience designing, building, and deploying production-level data pipelines using Kafka; strong experience working with event-driven architecture.
- Strong knowledge of the Kafka Connect framework, with experience using several connector types: HTTP REST proxy, JMS, File, SFTP, JDBC, etc.
- Experience handling huge volumes of streaming messages from Kafka.
- Cloud Composer (Apache Airflow): to build, monitor, and orchestrate pipelines.
- Knowledge of Bigtable.
- Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, App Engine, Cloud Storage.
- Experience with open-source distributed storage and processing utilities in the Apache Hadoop family.
- Extensive knowledge of processing various file formats: ORC, Avro, CSV, JSON, XML, etc.
- Knowledge of or experience with ETL tools such as DataStage or Informatica; ability to understand existing on-premises ETL workflows and redesign them on GCP.
- Experience and expertise with Terraform to deploy GCP resources through CI/CD.
- Knowledge of or experience with connecting to on-premises APIs from Google Cloud.

**Language**:
- Advanced English.

If you meet these qualifications and are pursuing new challenges that focus on delivering innovative solutions that increase business value, we'd like to talk with you today. Come be a part of the action at Sequoia Connect.

Keywords: ELT, BigQuery, SQL, JMS
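The posting names Java as the language for Dataflow (Apache Beam) pipelines feeding BigQuery. Below is a minimal, hypothetical sketch of a batch Beam pipeline that reads CSV lines from Cloud Storage and appends them to a BigQuery table; the bucket, project, dataset, table, and column names are illustrative placeholders, not details taken from the posting.

```java
// Hypothetical sketch of a batch Dataflow (Apache Beam) pipeline in Java:
// read CSV lines from Cloud Storage and append them to a BigQuery table.
// All resource names below are placeholders.
import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class GcsToBigQueryBatch {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline pipeline = Pipeline.create(options);

    // Destination schema: two string columns.
    TableSchema schema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("id").setType("STRING"),
        new TableFieldSchema().setName("value").setType("STRING")));

    pipeline
        // Read raw "id,value" lines from a placeholder bucket.
        .apply("ReadCsv", TextIO.read().from("gs://example-bucket/input/*.csv"))
        // Parse each line into a BigQuery TableRow.
        .apply("ToTableRow", MapElements.into(TypeDescriptor.of(TableRow.class))
            .via((String line) -> {
              String[] parts = line.split(",", 2);
              return new TableRow()
                  .set("id", parts[0])
                  .set("value", parts.length > 1 ? parts[1] : "");
            }))
        // Set the coder explicitly so the pipeline does not rely on registry inference.
        .setCoder(TableRowJsonCoder.of())
        // Append rows into a placeholder dataset.table, creating it if needed.
        .apply("WriteToBigQuery", BigQueryIO.writeTableRows()
            .to("example-project:example_dataset.example_table")
            .withSchema(schema)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    pipeline.run().waitUntilFinish();
  }
}
```

A production Flex template would parameterize the input path and destination table as pipeline options rather than hard-coding them, so the same template can be launched for different datasets.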


Source: Jobtome_Ppc

