Akvelon is an American company with offices in Seattle, Mexico, Ukraine, Poland, and Serbia. We are an official vendor of Microsoft and Google, and our clients also include Amazon, Evernote, Intel, HP, Reddit, Pinterest, AT&T, T-Mobile, Starbucks, and LinkedIn. Working with Akvelon means being connected with the best and brightest engineering teams from around the globe, and building Enterprise, CRM, LOB, Cloud, AI and Machine Learning, cross-platform, mobile, and other applications on a modern technology stack, customized to each client's needs and processes.
Required Skills & Experience:
- Bachelor's or Master's degree in a quantitative discipline: engineering, statistics, operations research, computer science, applied mathematics, etc.
- 5+ years of experience working with large-scale ETL systems, with a focus on building and maintaining clean, maintainable code (Python preferred) in a production environment.
- Strong programming proficiency in Python, SQL, Spark, Scala, etc.
- Experience with data modeling, ETL concepts, and maintaining data governance for large-scale structured and unstructured data.
- Expertise with data workflows (e.g., Airflow), data modeling, and both front-end and back-end engineering for data systems.
- Experience in data visualization and dashboard design, using tools such as Looker, Tableau, or other visualization libraries.
- Proven ability to execute in a cross-functional environment, collaborating with engineers, data scientists, and other stakeholders.

Key Responsibilities:
- Develop, maintain, and optimize data pipelines: build and support robust pipelines for data ingestion, processing, and transformation, ensuring high reliability and performance to meet evolving business needs.
- Ensure data quality, accuracy, and consistency: monitor, troubleshoot, and resolve issues related to data pipeline performance, data integrity, and consistency. Ensure accurate and timely data availability across ads data systems.

Data Tools & Reporting:
- Support the creation and maintenance of reporting systems: centralize key business metrics and marketplace metrics, ensuring they are consistently tracked, reported, and accessible across teams.
- Contribute to data visualization and reporting tools: assist in implementing data visualization tools like Looker and Hex, ensuring seamless integration of data sources and actionable reporting for both technical and non-technical stakeholders.
- Continuously improve reporting data models to enhance scalability and ease of use.
- Enhance internal data tools: develop and maintain internal applications that streamline data analysis, making it easier for cross-functional teams to generate insights. Ensure these tools are optimized for performance and are user-friendly.
- Work closely with data engineering and data science teams to ensure accurate data collection and consistent advertiser channel history.
- Collaborate across teams to ensure the effective integration of data tools and metrics for business tracking and analysis.
- Troubleshoot data issues.

Overlap time: from 8 AM to 12 PM PST.

- Paid vacation and sick leave (no sick note required);
- Official state holidays: 11 days considered public holidays;
- Professional growth through challenging projects, with the possibility to switch your role and master new technologies and skills with company support;
- Flexible working schedule: 8 hours per day, 40 hours per week, subject to the project's operational hours. Weekend or overtime work happens only at the customer's request and is paid additionally;
- Personal Career Development Plan (CDP);
- Employee support program (discount, care, health, and legal compensation);
- Paid external training, conferences, and professional certification that meets the company's business goals;
- Internal workshops & seminars;
- Corporate library (paper/e-books) and internal English classes.