You see data as fundamentally important for business decisions. With your experience in data lakes, event streams, and data pipelines, you can readily set up new infrastructure and processes, make necessary configuration changes, and ensure reliable application and user access to the data. Your desk will be in Munich or Berlin, but your impact will be global.
What you will be doing:
- Bring data to life to enable growth opportunities across the company and for our clients
- Design, implement and maintain a data lake and data pipelines fed from various sources
- Work with data analysts to provide consolidated, prepared data for their projects
- Create and maintain a highly available, sustainable, flexible and machine-learning-ready data infrastructure
- Implement new data requirements from all business areas
- Ensure data protection and security compliance
You’ll need to have:
- 3+ years of experience in a data organization
- Sound expertise in cloud infrastructure
- Ability to understand and model complex information and translate it into a data infrastructure
- Knowledge of and experience with data architecture and data catalogues
- Experience with SQL, Python, source control, data streaming and data pipeline tools (e.g. Git, Glue, Spark, Apache Airflow, Azkaban, …)
- Skills in Scala, Kafka, Flink and data lake technologies such as Apache Hudi or Apache Iceberg are beneficial
- Experience in a multi-cloud environment is beneficial