Big Data Developer or Architect (Cloud, DevOps, Open Source)
Job description
As a Big Data Developer, you will design architectures for applications, on-demand workflows, on-demand data labs, and microservices using open-source and cloud tools, and implement them together with the client.
We are therefore looking for Developers and Architects at different levels with different focus areas:
Generally good to have: Docker, Kafka, Hadoop, Spark, Kubernetes/Mesos, Version Control.
Developer focus: Java/Python/Scala/Go, Unix, shells, Testing.
DevOps focus: Ansible, Puppet, Chef, Jenkins, Prometheus, Nagios, ELK stack…
Cloud Expert: AWS, Azure, GCP, OpenStack.
Your tasks
- As part of our dynamic team, you will support our clients during the successful development and implementation of complex Big Data solutions.
- Through the power of distributed open-source and cloud technologies, you provide scalable, cost-efficient, and flexible solutions to our customers.
- You automate workflows and work seamlessly across various technology domains (e.g. cloud-native and open-source technologies).
- You are in regular contact with customers and stakeholders and you communicate efficiently and professionally.
Why us?
- Voted Best Place to Work in Reply Germany in 2016, 2017, 2018.
- The right to 10 days of training each year for your professional growth; the training topics are chosen by you with your manager's approval.
- Bonuses based on your performance and billability. Targets are set by you together with your manager and must be achievable.
- Work in a great multicultural team, on interesting projects with new technologies.
- Several team-building events throughout the year (Family Fridays, XChange, Summer & Winter Workshop, etc.).
- German lessons.
- Easy involvement in the topics of running a consulting business: HR, Sales, Marketing, Project Staffing.
- Buddy Program: you will be assigned a buddy to help you get started.
Requirements
- A strong technical background, ideally a master’s degree in Computer Science or similar.
- Experience with Java, C++, or other object-oriented languages.
- Experience with one or more major public cloud providers (AWS, Google Cloud Platform, Azure).
- Experience with different distributed computing technologies (e.g. Kafka, HDFS, Hive, Spark).
- Basic experience with Infrastructure as Code frameworks (e.g. Ansible, Chef, Puppet).
- Experience with containerization technologies and container orchestration (Docker, Mesos, Kubernetes…).
- Experience with Unix and shell scripting.
- Experience with Git or other version control tools.
- Experience with CI/CD (e.g. in Jenkins).
- Proficiency in English.
- Proficiency in German or willingness to become proficient within 12 months.
Are you up for the challenge?
Apply now via the application form.
Agency calls are not appreciated.