Job description
In the Data Services department, we develop the company’s data strategy and data landscape. Our solutions are used across the entire company and form the basis for internal reporting as well as customer-facing data products.
In this role, you are responsible for internal data products such as our data lake, which enables the whole company to perform analytics. In addition, you will work closely with the product department to design and implement the analytics solutions we offer to our customers.
We’re looking for an experienced developer to join our Data Platform team, which is responsible for the efficient, reliable, secure, and compliant propagation of all datasets across the company to our data warehouses. At SoundCloud, we rely heavily upon data-driven, informed decision making, and the Data Platform team is key to enabling that.
As a lead data engineer, you will work with on-premises data infrastructure (HDFS, Spark, Airflow, Kafka, Kubernetes) and with data services in AWS (Redshift, S3) and GCP (BigQuery, GCS).
We are looking for a Senior Data Engineer who is passionate about building modern, scalable, and well-engineered data infrastructure products that enable product teams at idealo to work autonomously with data. You will be part of a team responsible for developing, running, and maintaining the idealo data lake end-to-end, and you will help ensure that this product, among others, is successful at idealo.
In the Data Management area at idealo, we are building a cutting-edge, cloud-based data platform that supports self-service analytics as well as machine learning, data products, and our enterprise reporting.
Your tasks:
As a Data Science Engineer at Holoplot, you will empower a data-driven approach across the entire organisation. You will work within agile, cross-disciplinary teams for manufacturing, sales, marketing, SCM, and hardware development, fundamentally enhancing their work experience with data and insights. You will architect, build, and introduce cloud-based infrastructure for collecting, analysing, and visualising data from these various sources for applications like predictive maintenance/quality, automation, robotics, growth, QA, and others.