Growth from Knowledge
Our world is changing fast. Consumers, users, and buyers are calling the shots. New things become possible every second, and they become more complicated, too. Our clients are businesses around the globe. To make the best possible decisions every day, they need to really know what is going on, now and in the future. We don’t have a crystal ball either. But we love data and science, and we understand how to connect the two. We care about attention to detail and accuracy. We are digital engineers who build world-class research powered by advanced technology. Because people who know best lead the way. This is why GfK means Growth from Knowledge.
Position: Senior Machine Learning Engineer (m/f)
Country: Germany
Location: Nuremberg
Job description
In our Technology & Data business unit, we are responsible for the entire lifecycle of our data-driven products, and the Global Data Science division is an essential part of it.
As Machine Learning Engineers, we are part of the Global Data Science division and focus on developing scalable, reusable machine learning components and solutions at enterprise scale.
Now we are looking for you - a passionate hands-on engineer who is eager to code in high-end data science ecosystems!
Our data science solutions are designed to be effective, highly efficient, highly available, reliable, scalable, maintainable and, of course, highly automated. Hence, you will play in the premier league of data science: at the intersection of computational statistics, state-of-the-art software development, and DevOps culture.
You’ll be looked after by Markus, a passionate techie with a background in statistics and an excellent track record in building actionable data-driven solutions and enterprise-scale big data platforms.
Sounds challenging? Well… it is! But don’t be afraid! Nobody knows everything! You will closely collaborate in agile cross-functional teams where you will learn from scientists, software engineers, big data engineers and infrastructure experts.
Of course, the learning curve in this multi-disciplinary environment is steep, but we will always take care of your personal development in order to make you a master of productizing data science.
Purpose/Activities
- Design, develop and implement data science and machine learning models in proof-of-concepts and prototypes
- Develop, optimize, standardize and implement data science and machine learning solutions at scale in data pipelines and distributed systems (e.g. Hadoop/Spark/Kubernetes ecosystem)
- Engineer at scale using service-oriented architecture, containerized applications and functions as a service, especially in cloud service environments (i.e. IaaS, PaaS, SaaS, FaaS)
- Optimize data science and machine learning models using high performance computing (e.g. GPGPU) and real-time techniques (e.g. messaging/streaming services, reactive programming)
- Represent GfK’s machine learning and data science expertise at workshops and conferences
Requirements
Skills:
- Expert skills with regard to code performance optimization and scalability (e.g., parallelization, sharding, scattering/gathering etc.)
- Solid understanding of service-oriented architectures (e.g. microservices) & distributed systems (e.g. Hadoop, Kubernetes)
- Solid knowledge of cloud computing environments and tooling (AWS, Azure)
Experience:
- Hands-on experience with scalable machine learning frameworks and general data science tooling required
- Hands-on experience with continuous integration and big data environments required
- Hands-on experience with Docker & Kubernetes required
- Background in a statistical or mathematical field, ideally with a computational element, such as Physics, Data Science, or Computer Science, is a must
- Experience in development of custom machine learning models from prototyping to scalable implementations would be a strong plus
Our toolset/ecosystem:
- OS: Linux (Ubuntu, Debian, CoreOS)
- Programming languages: Java, Scala, C#, C++
- Analytical languages: Python, Spark (Java, Scala, Pyspark, SparkR), R
- ML-Frameworks: H2O, TF, Spark MLlib, Theano, Torch, Caffe, MXNet, Seldon.io
- HPC: CUDA, Numba, C/C++
- Presentation: Jupyter, Dash, Shiny, Vaadin
- Backend: Flask, Dash, Eve, Spring, Spring Boot
- Big Data: Hadoop, Hive, Spark, Accumulo, Nifi
- Messaging/Streaming: Kafka, Flink
- DevOps: Docker, Kubernetes, Bitbucket, Bamboo, Nexus, ELK, Prometheus, Ansible, Terraform
We offer:
- Various training and development programs
- Sports and health management
- 30 days holiday per year (plus 24.12. and 31.12.)
- Occupational pensions
- Canteen subsidy
- Accident insurance that also covers your private life
- Childcare during the summer holidays
- Free German and English lessons
- Free coffee… of course :)
We value skills and talents, and will support your development within our international teams. We offer an exciting work environment that brings people together and encourages an entrepreneurial and innovative spirit. We passionately focus on addressing our clients’ needs and improving their knowledge through the best digital research solutions in the world. We do this by integrating data from all sources and by providing prescriptive analytics giving insightful answers to their key business questions. We call this Growth from Knowledge.
If this sounds interesting to you, do not hesitate to reach out to Maximilian.Osterkamp@ext.gfk.com.
We would love to hear from you!
Link: http://www.gfk.com/careers/jobs-at-gfk/