YOUR TASKS
- Migration of the existing data pipelines implemented in Scala.
- Continuous development of an analytics platform as a product.
- Administration of the Tableau Server and the data warehouse.
- Identifying data quality deficiencies in source and target systems and improving them.
- Implementation and setup of the automation and orchestration pipeline with Prefect.
- Performance optimization of the existing data warehouse, infrastructure and DBT data models.
- Introduction and setup of logging and monitoring of running processes.
- Setup, optimization and extension of ETL pipelines.
- Connection of new data sources.
YOUR PROFILE
Must have
- At least 3 years of experience in data engineering, data warehousing or ETL.
- Very good knowledge of implementing and operating data pipelines for data analytics.
- Very good knowledge of at least one programming language, preferably Python or Scala.
- Good knowledge of SQL and experience working with relational database systems; PostgreSQL is an advantage.
- Knowledge of optimizing and tuning database systems and queries.
Nice to have
- Basic knowledge of document-based or NoSQL databases, especially MongoDB, is an advantage but not a must.
- Knowledge of DBT, AWS, Kubernetes, Docker, REST/SOAP is an advantage.
- Knowledge of Tableau or similar reporting solutions.
- Knowledge of Tableau Server administration.
- Knowledge of working with Linux distributions.
- Scala knowledge would be an advantage, but not a must.
Tech Stack
- Tableau/Tableau Server for reporting.
- PostgreSQL data warehouse.
- DBT as SQL transformation and modelling tool.
- Python and SQL as primary languages.
- Prefect for automation and orchestration.
- AWS as infrastructure.
- MongoDB as main online database.
- Bitbucket/Git as source control.
- Scrum/Kanban as agile methods for cross-functional teams.
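For illustration only (this sketch is not part of the posting): one way the listed tools might fit together is a Prefect 2.x flow that triggers DBT runs against the PostgreSQL warehouse. The flow name, project path and retry settings below are assumptions, not details from the role.

import subprocess

from prefect import flow, task


@task(retries=2, retry_delay_seconds=60)
def run_dbt(command: str) -> None:
    # Invoke dbt as a subprocess; the project directory is a hypothetical placeholder.
    subprocess.run(
        ["dbt", command, "--project-dir", "/opt/analytics/dbt"],
        check=True,
    )


@flow(name="nightly-warehouse-refresh")
def nightly_refresh() -> None:
    # Rebuild the DBT models, then run the schema tests.
    run_dbt("run")
    run_dbt("test")


if __name__ == "__main__":
    nightly_refresh()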
Link: https://www.joinveact.net/