Data Engineer (AWS)

Job description

About us

Who are we? We are Big Data experts, working with international clients, creating and leading innovative projects in the Big Data environment. We offer tailor-made solutions, whether that means building a Data Lake, conducting training in data management, or performing detailed Big Data analysis. We don't focus on a single technology; instead, we specialize in a whole range of open-source and public cloud tools. Our team brings together over 130 specialists in their fields. We have participated in dozens of conferences, written countless lines of code, and we organize Big Data Tech Summit Warsaw, the largest Polish conference on Big Data topics. We run webinars, share knowledge on blogs, create whitepapers, and more. Why? Because we believe that Big Data is an indispensable part of the future of business. Thanks to this breadth, we can always select the best-fit Big Data solutions.


We are working on a project for a technology company from the US East Coast (working hours aligned with Central European Time). The customer is the global market leader in analyzing the value of digital advertising placements and processes a hundred billion events daily.


Tasks in the project involve migrating pipelines that process ad impressions from social media and streaming services to the AWS cloud. The goal is also to improve the pipelines' observability and, over time, to build modern reusable components for the whole data platform.


  • Analysis of the current code

  • Development of high-quality code applying engineering best practices (tests, reviews, CI/CD)

  • Reengineering parts of the process to remove technical debt and optimize resource use

  • Preparing detailed designs and implementation proposals

Technologies used:
  • Python
  • Spark
  • AWS (S3, Glue, ECS, EMR, CloudFormation)
  • GCP (BigQuery, GCS)
  • Airflow
  • Elastic Stack


Requirements
  • At least 3 years of experience in Data and Software Engineering roles
  • Python programming experience
  • Strong data engineering background, especially with distributed data processing (e.g. Spark), distributed datastores (e.g. Hive), and tools for building data pipelines (e.g. Airflow)
  • Working knowledge of data engineering on AWS
We offer
  • Salary: 100-150 PLN net/h + VAT (B2B contract, depending on knowledge and experience)

  • 100% remote work

  • Flexible working hours

  • Option to work from our office in the heart of Warsaw

  • Opportunity to learn and develop with the best Big Data specialists in Poland

  • International projects

  • Opportunity to conduct workshops and training

  • Clear career path and certifications

  • Co-financed sports card

  • Co-financed health care

  • All equipment needed for work