Canonical is building a comprehensive automation suite to provide multi-cloud and on-premise data solutions for the enterprise. The data platform team is a collaborative team that develops a full range of data stores and data technologies, spanning from big data, through NoSQL, cache-layer capabilities, and analytics, all the way to structured SQL engines. We are facing the interesting problem of fault-tolerant, mission-critical distributed systems, and intend to deliver the world's best automation solution for delivering data platforms.

We have a number of openings ranging from junior to senior level, and will help you identify a suitable position depending on your experience and interests. Engineers who thrive at Canonical are mindful of open-source community dynamics and equally aware of the needs of large, innovative organisations.

Location: This is a globally remote role.

What your day will look like

The data platform team is responsible for the automation of data platform operations. This includes ensuring fault-tolerant replication, TLS, installation, and much more; the team also provides domain-specific expertise on the actual data systems to other teams within Canonical.
This role is focused on the creation and automation of features of data platforms, not on analysing the data in them.

- Collaborate proactively with a distributed team
- Write high-quality, idiomatic Python code to create new features
- Debug issues and interact with upstream communities publicly
- Work with helpful and talented engineers, including experts in many fields
- Discuss ideas and collaborate on finding good solutions
- Work from home, with global travel for 2 to 4 weeks per year for internal and external events

What we are looking for in you

- Proven hands-on experience in software development using Python
- Proven hands-on experience in distributed systems
- A Bachelor's degree or equivalent in Computer Science, STEM, or a similar field
- Willingness to travel up to 4 times a year for internal events

Additional skills that you might also bring

You might also bring a subset of experience from the following, which will determine the exact role and level we consider you for:

- Experience operating and managing data platform technologies like PostgreSQL, MySQL, MongoDB, OpenSearch, Kafka, Yugabyte, Trino, Superset, Atlas, Ranger, and Redis
- Experience with Linux systems administration, package management, and operations
- Experience with the public cloud or a private cloud solution like OpenStack
- Experience operating Kubernetes clusters, and a belief that Kubernetes can be used for serious persistent data services

What we offer you

Your base pay will depend on various factors, including your geographical location, level of experience, knowledge, and skills. In addition to the benefits above, certain roles are also eligible for additional benefits and rewards, including annual bonuses and sales incentives based on revenue or utilisation.
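To give a flavour of the "automation of data platform operations" this posting describes — such as fault-tolerant replication — here is a minimal, hypothetical sketch of one small decision such automation might make: picking a promotion candidate during a failover. The data model, function name, and selection rule are illustrative assumptions, not Canonical's actual code.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Replica:
    name: str
    healthy: bool
    lag_bytes: int  # how far this replica trails the primary

def pick_promotion_candidate(replicas: list[Replica]) -> Optional[Replica]:
    """Return the healthy replica with the least replication lag,
    or None if no replica is eligible for promotion."""
    eligible = [r for r in replicas if r.healthy]
    if not eligible:
        return None
    return min(eligible, key=lambda r: r.lag_bytes)

# During a failover, the automation would promote the least-lagged
# healthy replica rather than an arbitrary one.
replicas = [
    Replica("replica-a", healthy=True, lag_bytes=2048),
    Replica("replica-b", healthy=False, lag_bytes=0),
    Replica("replica-c", healthy=True, lag_bytes=512),
]
candidate = pick_promotion_candidate(replicas)
```

In production such logic sits inside a larger operator that also fences the old primary and reconfigures the surviving replicas; the sketch only shows the candidate-selection step.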