What will you be doing at DeepL?
As a data platform engineer, you act as a key link between our software developers, data analysts, product managers, and the DevOps team. Together with your dedicated team, you develop, maintain, and manage a constantly growing internal analytics platform (PaaS-like).
Your responsibilities
- Creating and testing data pipelines, and setting up alerting for them (see the short sketch after this list)
- Building out our data infrastructure and managing dependencies between data pipelines
- Implementing software for internal tooling and our streaming pipelines
- Defining and implementing metrics that provide visibility into our data quality
- Managing and maintaining our Kafka and ClickHouse clusters
- Defining new requirements and constantly challenging our ideas
- Last but not least, you are not afraid of trying new technologies and approaches, even if Stack Overflow says it can't be done
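To give a flavour of the pipeline-monitoring work described above, here is a minimal, hypothetical sketch in Python of a freshness check that fires an alert. The table name, column, threshold, and the run_query/send_alert helpers are illustrative assumptions for this sketch, not part of the actual DeepL stack.

```python
from datetime import datetime, timedelta, timezone
from typing import Callable

# Hypothetical helpers: in a real setup these would wrap a ClickHouse
# client and an alerting channel (e.g. Slack or PagerDuty).
QueryFn = Callable[[str], list]
AlertFn = Callable[[str], None]


def check_freshness(run_query: QueryFn, send_alert: AlertFn,
                    table: str = "events",
                    max_lag: timedelta = timedelta(hours=1)) -> bool:
    """Alert if the newest row in `table` is older than `max_lag`."""
    rows = run_query(f"SELECT max(event_time) FROM {table}")
    latest = rows[0][0]  # timestamp of the most recent event
    lag = datetime.now(timezone.utc) - latest
    if lag > max_lag:
        send_alert(f"{table} is stale: last event {lag} ago (threshold {max_lag})")
        return False
    return True


if __name__ == "__main__":
    # Dummy stand-ins so the sketch runs without any real infrastructure.
    three_hours_ago = datetime.now(timezone.utc) - timedelta(hours=3)
    check_freshness(lambda sql: [(three_hours_ago,)], print)
```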
What we offer
- Data at scale from products used by more than 100 million people worldwide
- Our own analytics and experimentation platform - far beyond the limitations of standard web analytics platforms
- You get the chance to work and play with: data warehouses (ClickHouse), configuration management (Ansible), container orchestration (Kubernetes), and CI/CD (Jenkins/GitLab CI)
- Interesting challenges: design and programming at the highest level
- A friendly, international, and highly committed team with a lot of trust and very short decision-making processes
- Meaningful work: We break down language barriers worldwide and bring different cultures closer together
- A comfortable office in Cologne or Paderborn (or suitable equipment for your home office) and a lot of flexibility
About you
- 2+ years of industry experience as a data engineer
- A university degree in computer science, information systems, or a similar technical field, or an equivalent qualification
- Expert knowledge of Python and basic knowledge of a compiled language (e.g. C#, C++, or Java)
- A solid understanding of SQL
- Hands-on experience with event streaming and common technologies like Apache Kafka or RabbitMQ
- You know your way around Linux and can debug a failing system
- Familiarity with processing structured and unstructured data at scale
- You have worked with containers (like Docker) and automated their creation
- Good communication and teamwork skills
- Fluent in English, German is a plus
We are looking forward to your application!
Link: https://www.deepl.com/de/jobs.html