**Hi there!**
We are the biggest European Python Powerhouse, with over 18 years of experience, several offices in Poland and a deep commitment to Agile principles. Join a group of 500+ professionals dedicated to helping customers build outstanding products.
**Are you the NEXT one?**:
**SALARY**
- Regular+: up to 52 000 MXN gross
- Senior: up to 80 000 MXN gross
**Job description**:
**Core values**
We believe that every problem has a solution, often hidden and not so obvious. Our job is to work them out, and the best solutions are born from imagination, cooperation and craftsmanship.
**How do we work?**
We work with clients for their benefit and the benefit of their target users. We often act as consultants and architects, people who tear down the existing order, introducing changes and innovations. But just as often we act as craftsmen who must deliver software of the highest quality.
**What will your daily work look like?**
You will be assigned to a team working on a project for one of our foreign clients. Your responsibilities will include designing, developing and implementing solutions for processing large-scale or distributed data.
**What is expected of you?**
As a Data Engineer, you will work daily with systems that process data, often on a large scale.
**Requirements**:
**We expect knowledge of**:
- Best practices for designing scalable data processing systems, including data pipelines, advanced ETL processes, data warehouses and data lakes.
- At least one cloud platform (AWS, Azure, GCP) and its solutions related to data processing.
- SQL and at least one relational database management system like MySQL or PostgreSQL.
- At least one NoSQL database like HBase, DynamoDB or MongoDB.
- Working principles of distributed data processing systems (Apache Spark, Apache Flink or similar); a minimal example follows this list.
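To give a rough flavour of this kind of work, below is a minimal PySpark sketch of a batch ETL step. The bucket paths, column names and aggregation are invented for illustration and are not part of any actual project.

```python
# Minimal PySpark batch ETL sketch; all paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders-etl").getOrCreate()

# Extract: read raw events from a (hypothetical) data-lake location.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: keep completed orders and aggregate totals per customer and day.
daily_totals = (
    orders
    .filter(F.col("status") == "completed")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_total"))
)

# Load: write the result back, partitioned for downstream consumers.
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/warehouse/daily_order_totals/"
)

spark.stop()
```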
**Software development practices**:
- Experience with Python 3.x.
- More advanced Python constructs such as lambda functions, generators and list comprehensions (see the sketch after this list).
- Core principles of object-oriented programming.
- Understanding of threading and multi-process computation.
- Experience in using code versioning tools, such as Git.
- Day-to-day work experience with Docker.
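For orientation, here is a small, self-contained sketch of the Python constructs listed above (list comprehension, lambda, generator, multi-process computation); all names and values are invented for the example.

```python
# Illustration of the constructs named in the requirements; everything here is made up.
from concurrent.futures import ProcessPoolExecutor

# List comprehension: squares of the even numbers below 10.
squares = [n * n for n in range(10) if n % 2 == 0]

# Lambda function: sort records by one of their fields.
records = [{"id": 2, "size": 30}, {"id": 1, "size": 10}]
records.sort(key=lambda r: r["size"])

# Generator: yield running totals lazily instead of building a full list in memory.
def running_total(values):
    total = 0
    for value in values:
        total += value
        yield total

print(list(running_total(squares)))

# Multi-process computation: fan CPU-bound work out across cores.
def cpu_heavy(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(cpu_heavy, [10_000, 20_000, 30_000]))
    print(results)
```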
**Soft skills**:
- Good communication skills in English (minimum B2).
- Eagerness to develop yourself and learn new technologies.
- Problem solving and analytical thinking.
**Your experience rating will also take into account other skills, such as**:
- Understanding of message broker systems such as Apache Kafka or AWS Kinesis.
- Experience in using orchestration tools like Apache Airflow or Dagster.
- Knowledge of search systems like ElasticSearch or Solr.
- Experience in using CI/CD tools like GitHub Actions.
- Experience in implementing solutions using frameworks like Hadoop, Hive or Presto.
- Experience in machine learning or statistics.
- Other development skills such as REST APIs or GraphQL.
- Data scraping experience.
**Benefits**:
**Work-life balance**:
- We are open to discussing individual needs. Set up your working hours and a limited remote-work schedule with your team and manager, in a way that works for both sides.
**Reimbursed private medical care (Medicover) and Multisport**:
- We care about the health and well-being of our colleagues. Choose a sports card and dedicated medical care for yourself and your relatives.
**Leader's support**:
- Work with true enthusiasts and professionals who will support you along the way. You can count on leaders and experts who are willing to share their knowledge so that you too can join their ranks someday.
**Technology focus**:
- Python and JavaScript are not our only strengths; we are also very good at React Native, IoT, Machine Learning, .NET, DevOps and Blockchain.
**Growth review**:
- Junior, Regular or Senior? Every year we have a chance to discuss acquired skills and prepare a development plan for the upcoming months.
**Events**:
- Attend exciting internal webinars, celebrate special days with us, and join us at conferences and meetups as a listener or speaker!
**Vacation**:
- 20 days of vacation to help you maintain your work-life balance.