At Bonial, the Data Team is one of the hubs between Online Marketing, Product, IT, Operations and Sales. It's a unique position, as we can explore data from different angles and bring valuable insights to the company. Our mission is to turn data into the source of truth and thereby enable Bonial employees to make smarter decisions. We turn data into information and then into knowledge, help teams understand their capabilities, support them in shaping the right questions for their challenges, and teach them how to use the tools we provide. Our success would be to transform Bonial's decision-making process from gut feeling into a scientific approach.

We're looking for an experienced Data Scientist with a curious, analytical mindset and strong product intuition to join our Data team. You will consult different teams on both technical and non-technical applications of data science. You will be responsible for:
- Designing and building production-ready predictive models or machine learning algorithms;
- Optimizing our personalization algorithms;
- Crunching, analyzing and investigating large amounts of data to discover trends and patterns;
- Investigating new machine learning techniques and tools, educating others on the possible applications of these techniques;
- Visualizing your analysis in a way that everyone understands it at first glance;
- Collaborating with different teams/departments to proactively suggest improvement points, using your initiative to develop new topics.
What you should bring to the table:
- A ‘can-do’ attitude with a passion for analytics and the insight it can provide;
- At least 2 years of experience as a data scientist or data analyst;
- A degree in Mathematics, Statistics or Computer Science; a graduate degree in Data Science or another quantitative field is preferred;
- Good knowledge of Python, SQL or R;
- Experience in practical application of machine learning or predictive analytics;
- Ability to convey complex issues to a non-technical audience at an appropriate level of abstraction;
- Experience in data visualization;
- Experience with PySpark or SparkR;
- Experience building APIs;
- Experience using Git, Docker, Airflow, AWS.