The product: a containerized data science environment. Our ambition is to create a platform that gives data scientists a flexible, consistent, and simple environment based on Docker containers, where their code can be written in a wide variety of languages (Python, R, Go, Scala). The tool then turns their code into stateless functions that can easily be deployed into powerful data pipelines. The stack: Kubernetes, OpenFaaS, Docker. The challenge: strong DevOps engineering support is crucial to guarantee that our microservice-based platform runs smoothly and reliably, no matter where it is deployed (we support both cloud and on-premise deployments).
The challenge: building services around a containerized data science environment. Our mission is to combine sophisticated data science with a great user experience. Our flexible data science environment enables businesses to create interactive, data-driven decision tools and automations. This environment gives data scientists a flexible, consistent, and simple collaborative tool based on Docker containers orchestrated in Kubernetes. Use cases can be implemented in a wide variety of languages (Python, R, Go, Scala) and are deployed as stateless functions that can easily be composed into powerful data pipelines.
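To make the "stateless function" idea concrete, here is a minimal sketch in Go of what one pipeline step might look like. This is purely illustrative: the `Handle` signature follows the style of common serverless Go templates and is an assumption, not the platform's actual API.

```go
package main

import (
	"fmt"
	"strings"
)

// Handle is a stateless pipeline step: it receives a raw request body,
// returns a response, and keeps no state between invocations, so it can
// be scaled and composed freely. (Illustrative sketch only; the real
// platform's function signature is an assumption.)
func Handle(req []byte) string {
	// Example transformation: trim and normalize an incoming record
	// before handing it to the next stage of the pipeline.
	return strings.ToUpper(strings.TrimSpace(string(req)))
}

func main() {
	// Invoking the step directly, as a gateway or pipeline runner might.
	fmt.Println(Handle([]byte("  sensor reading  ")))
}
```

Because each step owns no state, a pipeline is just a chain of such functions, and the orchestrator (Kubernetes here) can restart or replicate any step without coordination.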
Your job – performing as smoothly as our cars: You will be responsible for designing, implementing and operating our analytical data layer and end-to-end distributed systems on the cloud. You will work on automating our event stream pipeline and moving data workloads into a data layer that serves all analytical demands in the company. You will assist other engineers in designing the right data models to meet requirements for both operational and analytical purposes.
THE CHALLENGE We are building a data science environment supporting data scientists all the way from exploring data and experimenting with different models to deploying them in decision tools or automations. At the core of our data science environment are small, reusable “data science bundles”, built using the tools of your choice. We’re mainly using Go with the occasional bit of Python thrown in. We are planning to open-source our data science environment during the course of this year, with the goal of eventually making it an Apache project.
Sixt is a leading mobility provider offering premium service in over 100 countries. Since the beginning technology has been an integral part of our business. Our Engineers are responsible for designing, building and running the applications making up the core of our mobility ecosystem. They are versatile technologists who have a lot of experience up and down the stack and are comfortable tuning the Linux kernel, automating cloud infrastructure, writing resilient and high-performance code or scaling large distributed systems.
Business Intelligence Manager / Data Scientist (m/f) Urban Sports Club is an online-based sports club that provides an all-inclusive membership for sports venues across the city. With a single monthly subscription, customers can do sports at more than 1500 studios offering more than 40 different types of sports and thousands of activities every week. So why not fitness on Monday, yoga on Tuesday, climbing on Wednesday, tennis on Thursday, martial arts on Friday, team sports on Saturday, and on Sunday have a massage and relax at one of the many saunas?
Responsibilities include but are not limited to designing, developing and supporting big data systems and machine learning models; creating statistical models; and analyzing and processing various kinds of data. Applicants are also expected to participate in after-hours work. All candidates will have:
- a Bachelor’s or higher degree in a technical field of study
- a minimum of two years’ experience designing, developing and supporting big data systems, machine learning models and end-to-end data pipelines
- excellent knowledge of at least one modern programming language, such as Go, Java, C++, Python or Scala
- experience with big data technologies, such as Kafka, Spark, Storm, Flink and Cassandra
- excellent troubleshooting and creative problem-solving abilities
- excellent written and oral communication and interpersonal skills
Ideally, candidates will also have
Delivery Hero is building the next generation global online food-ordering platform. Our awesome international team already operates in over 40 countries worldwide to ensure hungry customers get their favorite takeaway food the fastest way possible. The company has grown from its inception in 2011 to become the world’s largest food-ordering network. This is an exciting time for Delivery Hero, with huge and rapid growth in countries, market size and opportunities.
Join our team and actively shape the technical future of the world’s largest online hotel search. Enjoy the freedom to question established processes, work with top-notch technologies and develop new tools to impress our 120 million users per month! We’re looking for an experienced developer to manage and further improve our core hotel data. We want someone to increase the amount and quality of metadata we have by analysing, investigating and interpreting data delivered by our partners and to match these to our hotel data.