DevOps

Job description

About us

Who are we? We are Big Data experts working with international clients, creating and leading innovative projects in the Big Data environment. We offer tailor-made solutions, whether that means building a Data Lake, conducting training in data management, or performing detailed Big Data analysis. We don’t focus on a single technology; instead, we specialize in a whole range of open-source and public cloud tools, which is why we can always select the best-suited Big Data solution for each client.

Our team brings together over 130 specialists in their fields. We have participated in dozens of conferences and written countless lines of code, and we organize Big Data Tech Summit Warsaw, the largest Polish conference on Big Data topics. We run webinars, share knowledge on blogs, create whitepapers, and more. Why? Because we believe that Big Data is an indispensable part of the future of business.


Project
The solution also encompasses building a self-service analytics workbench for data analysts, an observability platform, and the whole infrastructure needed to make the solution robust, scalable, fault-tolerant, and high quality, in line with the best DevOps and engineering practices. The platform is being built from scratch; the project is just starting.

Responsibilities

  • Deploy and maintain stateful applications on Kubernetes

  • Use best engineering practices: DevOps, continuous integration and delivery, infrastructure as code

  • Work collaboratively within a team of cross-functional engineers

  • Troubleshoot the platform across framework components, including the JVM, Kubernetes, and the OS

Technologies used:
  • Kubernetes
  • Docker
  • Flink/Ververica
  • Ceph
  • Jenkins/ArgoCD
  • Terraform
  • Helm

Requirements

  • Knowledge of CI/CD principles
  • Experience with cloud-native Kubernetes applications and their packaging (Helm)
  • Good understanding of DevOps principles
  • Hands-on experience with service orchestration and management
  • Good understanding of process isolation, virtualization and containerization
  • Strong tendency to keep things simple and maintainable (stick to KISS + YAGNI)
  • Experience in programming and system administration on Linux environments
  • Experience with Ansible and/or Terraform

Nice to have:

  • Experience with operating distributed systems, applications, or services
  • Experience with Kerberized big data environments, especially HDFS
  • Programming experience in at least one modern programming language, such as Go, Java/Scala, or Python


We offer
  • Salary: 120–180 PLN net + VAT per hour on a B2B contract (depending on knowledge and experience)

  • 100% remote work

  • Flexible working hours

  • Option to work from our office in the heart of Warsaw

  • Opportunity to learn and develop with the best Big Data specialists in Poland

  • International projects

  • Opportunity to conduct workshops and training

  • Clear career path and certifications

  • Co-financed sports card

  • Co-financed health care

  • All equipment needed for work