Senior Data Engineer (Azure / Databricks)
- Remote
- Warsaw, Mazowieckie, Poland
- PLN 160 - PLN 200 net + VAT per hour (B2B)
- Data Engineering
Job description
About us
GetInData | Part of Xebia is a leading data company working for international Clients, delivering innovative projects related to Data, AI, Cloud, Analytics, ML/LLM, and GenAI. The company was founded in 2014 by data engineers and today brings together 120 Data & AI experts. Our Clients are both fast-growing scaleups and large corporations that are industry leaders. In 2022, we joined forces with Xebia Group to broaden our horizons and bring new international opportunities.
What about the projects we work with?
We run a variety of projects in which our sweepmasters can excel: Advanced Analytics, Data Platforms, Streaming Analytics Platforms, Machine Learning Models, Generative AI, and more. We like working with top technologies and open-source solutions for Data, AI, and ML. Our portfolio includes Clients from many industries, e.g., media, e-commerce, retail, fintech, banking, and telcos, such as Truecaller, Spotify, ING, Acast, Volt, Play, and Allegro. You can read some customer stories here.
What else do we do besides working on projects?
We run many knowledge-sharing initiatives, such as Guilds and Labs. We also build a community around Data & AI through our conference Big Data Technology Warsaw Summit, the meetup Warsaw Data Tech Talks, the Radio Data podcast, and the DATA Pill newsletter.
The Data & AI projects we run, combined with the company's philosophy of sharing knowledge and ideas in this field, make GetInData | Part of Xebia not only a great place to work but also a place that offers a real opportunity to boost your career.
If you want to stay up to date with the latest news from us, please follow our LinkedIn profile.
About the role
A Data Engineer designs, builds, and maintains the data architecture, tools, and processes that enable an organization to collect, store, transform, and analyze large volumes of data. The role involves building data platforms on top of commonly provided infrastructure and establishing a streamlined path for the Analytics Engineers who rely on the system.
Responsibilities
- Working together with Platform Engineers to assess and choose the most suitable technologies and tools for the project
- Developing new functionality and contributing to open-source tools
- Implementing complex data ingestion processes
- Defining and enforcing policies in line with the company's strategic plans regarding technologies used, work organization, etc.
- Ensuring compliance with industry standards and regulations on security and data privacy in the data processing layer
- Conducting training and knowledge-sharing
Job requirements
- Proficiency in a programming language such as Python, Scala, or Java
- Knowledge of Lakehouse platforms, particularly Databricks
- Experience working with dbt
- Familiarity with version control systems, particularly Git
- Solid programming experience and knowledge of good software engineering principles, practices, and solutions
- Extensive experience in Microsoft Azure
- Knowledge of at least one orchestration and scheduling tool, for example, Airflow, Azure Data Factory, Prefect, Dagster
- Familiarity with DevOps practices and tools, including Docker, Terraform, CI/CD, Azure DevOps
- Ability to actively participate/lead discussions with clients to identify and assess concrete and ambitious avenues for improvement
Salary: 160 - 200 PLN net + VAT per hour, B2B (depending on knowledge and experience)
- 100% remote work
- Flexible working hours
- Possibility to work from the office located in the heart of Warsaw
- Opportunity to learn and develop with the best Big Data experts
- International projects
- Possibility of conducting workshops and training
- Certifications
- Co-financed sports card
- Co-financed health care
- All equipment needed for work