Our tech stack:
- Snowflake and Postgres for DWH
- Airflow for ETL management and monitoring
- Python/dbt and SQL
You will:
- Migrate existing data pipelines to Airflow using Python/dbt, with PostgreSQL as the final destination (see the sketch after this list)
- Create ETLs that move data from our CRM and marketing platforms (Segment, Google Analytics, Facebook Ads, etc.)
- Create and maintain new pipelines for company data projects
- Explore and implement ways to improve data quality and reliability (ETL monitoring, dataset checks, etc.)
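To give candidates a feel for the day-to-day work, here is a minimal sketch of the kind of pipeline described above: an Airflow DAG that pulls data from a REST API and then runs dbt models. Every name in it (the DAG id, the endpoint URL, the dbt project path) is illustrative, not our actual setup.

```python
# A minimal sketch (Airflow 2.4+): extract from a hypothetical CRM REST API,
# then run dbt to transform the data in the warehouse.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_crm_contacts():
    """Pull contacts from a hypothetical CRM endpoint (illustrative URL)."""
    resp = requests.get("https://crm.example.com/api/contacts", timeout=30)
    resp.raise_for_status()
    # A real pipeline would load these rows into a Postgres staging table.
    print(f"Fetched {len(resp.json())} contacts")


with DAG(
    dag_id="crm_to_postgres",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_crm_contacts",
        python_callable=extract_crm_contacts,
    )
    transform = BashOperator(
        task_id="run_dbt_models",
        # dbt handles the transform step; the project dir is illustrative.
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    extract >> transform
```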
Requirements
- 3+ years of experience in the field
- Familiarity with data warehousing (e.g., BigQuery, Snowflake), ETL/ELT solutions, and data pipeline orchestration (e.g., Airflow)
- Proficiency in Python and SQL
- Prior experience with Airflow and dbt
- Experience working with REST APIs
- Experience with relational databases; Postgres is an advantage
- Experience with performance tuning of long-running SQL queries
- Basic knowledge of statistics (averages, medians, etc.)
- Upper-intermediate English
Benefits
- A secure job in a successful, internationally active company
- Remote work and flexible work schedule
- 28 vacation days and 10 sick days per year, paid at 100%
- Career and professional growth
- Competitive salary
- Free English classes with SkyEng
- Mental health support through Oliva
- An international environment with a multicultural team
Ready to apply for this role?
Apply now