**The Company**

Wizeline is a global technology services provider that partners with Fortune 500 companies and startups to build high-quality digital products and platforms that accelerate time-to-market. Our diverse and adaptive teams provide the right combination of solutions and methodologies to deliver results while collaborating with our customers' teams to foster innovation through continuous learning.

**Our People**

At Wizeline, we understand that great people and teams build great software. All team members are empowered to take ownership, raise their hands, and develop innovative solutions to our customers' most challenging problems. To retain and develop top talent, we foster a working environment that celebrates creativity, encourages skill development, and allows for multidisciplinary collaboration.

**Community Impact**

We are invested in making a positive impact in our communities. That's why we created Wizeline Academy, a free, community-based education program that teaches high-value skills to workers looking to advance their tech industry careers. To date, Academy has served more than 32,000 students with 250 instructors across 194 courses. Wizeliners have the opportunity to upskill by taking Academy courses and can also share their expertise by delivering classes to students.

**What you will bring to the team**

We are looking for Senior Data Engineers to drive the architectural design, implementation plans, best practices, and testing plans for projects involving terabytes of data, which will serve as the foundation for the advanced analytics and machine learning work that data scientists will perform on top of that infrastructure.

**Your day-to-day activities**

- Design and implement product features in collaboration with product owners, report developers, product analysts, architects, and business partners within an Agile/Scrum methodology.
- Design and implement data platforms that meet large-scale, high-performance, and scalability requirements, integrating data from several sources and managing structured and unstructured data while melding them with existing warehouse structures.
- Analyze, diagnose, and identify bottlenecks in data workflows.
- Participate in client demos, as well as requirements elicitation and translation into system requirements (functional and non-functional).
- Constantly monitor, refine, and report on the performance of data management systems.

**Are you a fit?**

To be successful in this role, you must have:

- Strong general programming skills.
- Experience with Spark.
- Solid engineering foundations (good coding practices and architectural design skills).
- Experience working with SQL in advanced scenarios that require heavy optimization.
- 4+ years of experience with large-scale data engineering, with an emphasis on analytics and reporting.
- 2+ years of experience developing on a Hadoop-like ecosystem.
- Experience building scalable, real-time, high-performance data lake solutions in the cloud.