Remote Senior Data Engineer or Data Architect (GCP)

Our client, founded in Poland, is a leading consulting and technology company specializing in data analysis, business intelligence (BI), and Big Data solutions. The company is dedicated to transforming data into valuable information that drives strategic decision-making, enabling organizations to optimize their operations and gain a competitive edge in the market.

With a solid reputation built on an innovative approach and the high quality of its services, our client has collaborated with some of the most recognized brands worldwide. Their team of experts employs cutting-edge technologies, including artificial intelligence and machine learning, to solve complex problems and improve operational efficiency for their clients. This ability to handle large volumes of data and extract actionable insights positions our client as a strategic and valuable partner across various sectors.

In addition to their technical expertise, our client is distinguished by a corporate culture focused on innovation and continuous development. The company invests significantly in staff training and in the research and development of new technologies, ensuring they are always at the forefront of emerging technological trends. This proactive approach allows our client to anticipate market needs and offer solutions that address current challenges while preparing their clients for the future.

We are currently searching for a Remote Senior Data Engineer or Data Architect (GCP).

Responsibilities
- You will be part of the team responsible for the design, modeling, and development of the entire GCP data ecosystem for one of our clients (Cloud Storage, Cloud Functions, BigQuery).
- Involvement is needed throughout the process, starting with gathering, analyzing, modeling, and documenting business and technical requirements. The role includes direct contact with clients.
- Modeling data from various sources and technologies.
- Troubleshooting and supporting the most complex and high-impact problems to deliver new features and functionalities.

Requirements
- At least 8 years of experience as a Data Engineer, including at least 3 years of experience working with GCP cloud-based infrastructure and systems.
- Strong knowledge of cloud computing platforms, specifically Google Cloud; the candidate should be able to design, build, and deploy data pipelines and applications in the cloud.
- Proficiency in data modeling techniques and database optimization. Knowledge of query optimization, indexing, and performance tuning is necessary for efficient data retrieval and processing.
- Proficiency with SQL and NoSQL database management systems (BigQuery is a must). The candidate should be able to design, configure, and manage databases to ensure optimal performance and reliability.
- Experience with data integration tools and techniques, such as ETL and ELT. The candidate should be able to integrate data from multiple sources and transform it into a format suitable for analysis.
- Excellent communication skills to collaborate effectively with cross-functional teams, including data scientists, analysts, and business stakeholders.
- Ability to convey technical concepts to non-technical stakeholders clearly and concisely.
- Programming skills (SQL, Python, other scripting languages).
- Openness to learning new technologies and solutions.
- Experience in a multinational environment.
- Certifications in big data technologies or cloud platforms.
- Experience with BI solutions (e.g., Tableau).
- Experience with Azure cloud-based infrastructure and systems.
- Experience with ETL tools (e.g., Talend, Alteryx).
- Experience working in distributed teams.

Languages
- Advanced oral English.
- Native Spanish.

Note: Fully remote.

If you meet these qualifications and are pursuing new challenges, start your application to join an award-winning employer. Explore all our job openings | Sequoia Career's Page.

Salary: $12,000.00