Data Engineer (m/f/d)
Dresden / full-time
We offer
- Attractive, competitive salary and profit sharing
- The chance to shape your own areas of responsibility by implementing your own ideas and taking the lead on independent projects
- Modern offices with a pool table, foosball and ergonomic office equipment right in the heart of the cities of Dresden and Berlin
- Flexible working hours, working from home when needed and a healthy work-life balance
- Professional and personal development through feedback talks, language classes, MOOC fee reimbursements (e.g. Coursera) as well as in-house events such as ThunderTalks, Showtime, Open Space Days or the LOVOO Compass
- A motivated and multicultural team with a family spirit
- Cross-departmental communication and knowledge-sharing, learning directly in cross-functional teams, and a US exchange program with our parent company ‘The Meet Group’
- Opportunity to attend and speak at international conferences and local meet-ups
- Assistance with relocation for internationals
- A rich culinary variety with regular lunch and dinner offers as well as a comprehensive selection of drinks
- Diverse sports and health offerings plus our own LOVOO Gym
Your tasks
- Development of backend software in Go, Python or PHP to enhance our anti-spam, anti-fraud and recommendation microservices and improve our moderation tools
- Rewriting and improvement of Kubernetes deployments for existing and new microservices, using Docker, Helm and Google Cloud Build
- Devising new strategies, experimentation with new technologies and integration of new services to react to increasingly sophisticated spam, scam and fraud attacks, developing new anti-spam components in Go using gRPC, Apache Kafka and Docker
- Analysis of anti-spam metrics and past predictions to update our machine learning models, using SQL in BigQuery together with pandas and scikit-learn
- Updating of documentation and continuous improvement of software configurations, build files, code-generation definitions, etc.
- Enhancement of Goka, our open-source Go library for Apache Kafka
- Extension of the main data pipeline to stream and anonymize app events to our BigQuery data warehouse, using Airflow (Cloud Composer) and Apache Beam (Cloud Dataflow, Python, Java)
- Writing of analytical SQL queries to process, analyze and aggregate warehouse data
- Writing and extension of Python 3 programs to fetch missing data from external service providers
- Solving incidents by reading and interpreting Prometheus metrics in Grafana and Datadog dashboards and by analyzing service logs
Our requirements
- University degree in computer science, mathematics or a comparable subject
- At least one year of professional experience as a data engineer, software engineer or in a similar position
- A strong passion for software development and the enthusiasm to work with the latest technologies
- Experience in writing and deploying production software for backend systems or data-loading pipelines
- Initial experience writing, deploying and maintaining software on Kubernetes in particular, or in Google Cloud in general, is desirable
- Enjoyment of discussing technical and non-technical challenges and presenting solutions
- A transparent and open communication style and a strong team-player mentality
- A high willingness to learn and a pronounced hands-on mentality
- Excellent command of English, both verbal and written
Applying via our online tool takes about five minutes. Please have your complete application documents ready.
Saskia Gebhard | Human Resources
Office Dresden
Phone: +49 (351) 41887777
Prager Straße 10 | 01069 Dresden
Germany