Quadcode

Data Engineer

We are Quadcode, a fintech company specializing in financial brokerage and delivering advanced financial products to clients worldwide. Our flagship product, an internal trading platform, is offered as a Software-as-a-Service (SaaS) solution to other brokers.

We are currently looking for a Data Engineer to join our Data Platform team. This team plays a crucial role in building and maintaining the company’s analytical platform, enabling data-driven decision-making across the business.

Our current team consists of Data Engineers and a Team Leader.

We follow Agile practices, with daily stand-ups at 12:00 PM (GMT+3) and regular peer code reviews, and we use Slack, Google Meet, and Zoom for collaboration.

Tech Stack:

  • Databases: Greenplum, PostgreSQL, ClickHouse
  • ETL & Orchestration: Airflow, dbt
  • Programming: Python, Scala
  • Streaming & Messaging: Apache Kafka, Apache Flink
  • BI & Visualization: Metabase
  • Storage & Tools: S3, Datahub, Linux

Tasks:

  • Integrate new data sources into the platform;
  • Respond to internal user requests and incidents;
  • Co-develop data marts in collaboration with analysts;
  • Ensure the completeness and consistency of analytical data.

What We Expect From You:

  • 1+ years in Data Engineering or 2+ years in Data Analytics;
  • Working knowledge of relational databases (Greenplum, PostgreSQL, Oracle, MySQL, MS SQL);
  • Solid understanding of DBMS and ETL concepts (ACID, normalization, CAP theorem, OLTP vs OLAP, scaling strategies);
  • Proficiency in SQL and query optimization;
  • Experience with Linux environments and Docker;
  • Good Python skills (OOP, data structures, decorators, venv, PEP8);
  • Experience with Airflow and message brokers like Kafka;
  • BI tools experience;
  • Fluency in Russian plus English at B1 level, or English at B2 level or above.

Nice to Have:

  • Familiarity with NoSQL databases (Cassandra, Redis, Infinispan)
  • Experience with GitLab CI/CD, Grafana, Ansible
  • Knowledge of Flink, Spark, Scala, Kubernetes
  • Formal education or certifications in Data Engineering or Data Science

What we offer:

  • Full-time remote work model (Service Provider).
  • Competitive remuneration.
  • 20 paid days off annually.
  • Flexible working hours.
  • Training and development opportunities.
  • A friendly, enjoyable, and positive work environment.

Ready to apply for this role?

Apply Now →
