What we’re looking for
At Smallpdf, you will build a state-of-the-art data pipeline on AWS that handles terabytes of data. Together with the data team, you will lay the foundation for our data infrastructure and help our engineering teams run experiments and create personalized Smallpdf experiences.
- Build a scalable, fast data pipeline that our engineers and PMs can use in a self-service model to run experiments and make data-informed decisions
- Design the data and analytics architecture.
- Ensure our data is reliable
- Automate manual analytics workloads
What we offer
- The chance to personally impact a successful & rapidly growing startup
- Tricky challenges scaling a high-traffic web application
- Work in small teams that have direct impact on tools used by millions of users
- Become part of a highly motivated and international team that pushes boundaries
- Fun company events, such as snowshoe hikes in the Swiss Alps, wake-surfing on Lake Zurich, after-work BBQs, and more!
- Free German language course
- Regular Hack Days to challenge yourself
- Nice rooftop office in central Zurich
Do you like the job? Please apply here: https://apply.workable.com/smallpdf/j/B8DE52D52D
By sending your application, you allow Smallpdf to handle and store your data.
Requirements
- BSc or MSc in Computer Science or a related field
- Real passion for collaboration and strong interpersonal skills
- Experience crafting and implementing high-performance services handling millions of requests per day and stream-processing of large data sets
- Strong grasp of database structure, design, and query languages (e.g. SQL), as well as large data sets, distributed systems, and fundamentals of mathematics and statistics
- Experience in building data pipelines, processing services and models in Python (scikit-learn, pandas know-how is a huge plus)
- Machine-learning experience
- AWS experience
- Command-line and Unix tools experience
- Experience in at least one statically typed language; we use Go