Brainsome

Data Engineer + ML

We operate in the marketing industry, generating leads for clients across a range of sectors, and we specialize in solving complex challenges in processing and analyzing large-scale data.

Our system focuses on two key objectives:

  • Matching users with the best offers by analyzing their needs and aligning them with partner solutions.
  • Driving high-quality, relevant traffic to our partners’ offers, ensuring maximum conversion rates.

We leverage a data-driven approach, utilizing analytics, machine learning, and automation to:

  • Optimize the lead generation process
  • Improve personalized recommendations
  • Enhance marketing campaign efficiency
  • Increase conversion rates and lead quality

Your Responsibilities:

  • Data Collection & Ingestion: Acquire and ingest data from diverse sources, including OLTP/OLAP databases, event streams, logs, third-party APIs, and file/object storage.
  • Connector Configuration: Set up and manage data source connectors to ensure seamless data flow into the Data Lake.
  • Data Modeling & Documentation: Design, document, and maintain data models to support efficient storage, retrieval, and processing.
  • ETL/ELT Pipeline Development: Build, optimize, and maintain batch and streaming data pipelines for efficient data processing.
  • Data Warehousing & Lake Management: Organize and manage Data Warehouses and Data Lakes to ensure scalability, security, and performance.
  • Data Preparation for BI & ML: Transform and curate data to support Business Intelligence and Machine Learning workflows.
  • Data Quality & Observability: Implement testing, monitoring, and observability frameworks to ensure data accuracy, reliability, and consistency.

Your Experience:

  • Fluent Russian
  • Understanding of ETL processes
  • Experience preparing and maintaining data for ML models
  • Experience in deploying ML models as API endpoints
  • Programming skills in Python (Pandas)
  • Strong proficiency in SQL
  • Experience with analytical databases (BigQuery)
  • Experience with Cloud Platforms (GCP, others)
  • Experience with object storage (GCS / AWS S3 / Azure Blob Storage)
  • Fundamentals of mathematical statistics and data processing algorithms
  • Docker / Kubernetes

We Offer:

  • Top-rate pay
  • Compensation for language classes
  • Flexible working hours
  • Senior-level team

Ready to apply for this role?

Apply Now →
