We are looking, starting immediately or at a later date, for a Senior Machine Learning Engineer.
Your gender does not matter to us; what counts is that you are passionate about technology.
Part-time or full-time.
Locations: Karlsruhe, Pforzheim, Stuttgart, München, Köln and Hamburg.
Who we are:
In the Data Management & Analytics business unit, we help our customers generate economic value from their data. We integrate heterogeneous data sources into classic data warehouse structures and multidimensional analysis models, design and implement big data platforms and real-time scenarios, and use innovative search technologies.
Senior Software Engineer (f/m), 80-100%
ARGUS DATA INSIGHTS® Schweiz AG is the leading provider of integrated business intelligence solutions in Switzerland, headquartered in Zurich. The basis of its success is the unique combination of comprehensive global media coverage, innovative technology, personal consulting and more than 100 years of experience. Around 180 media, communications, analysis and data experts find, condense and analyse relevant media content into high-quality media reviews, analyses and insights for well-founded decisions in communications, marketing and strategy.
Deutsches Elektronen-Synchrotron DESY
A Research Centre of the Helmholtz Association
For our location in Hamburg we are seeking:
Scientist – Data Science for Accelerator Controls
DESY
DESY, with its 2700 employees at its two locations in Hamburg and Zeuthen, is one of the world’s leading research centres. Its research focuses on decoding the structure and function of matter, from the smallest particles of the universe to the building blocks of life.
As a Software Engineer for Business Intelligence & Data Warehouse, you will be part of our Business Intelligence team. Together with the team, you will implement business intelligence requirements using agile methods. Your tasks:
- Further development of the existing BI/DWH system (end to end)
- Development of ETL processes
- Extension and refactoring of the Data Vault model of the enterprise data warehouse
- Technical definition, extension and implementation of data marts
- Technical construction of QlikView components for data analysis
- Implementation of automatically generated reports using Pentaho Report Designer
- Collaborative implementation of user stories together with Scrum teams, including pair programming, code reviews, test data creation, etc.
<p>For the application of modern machine learning techniques to complex NLP problems, we are looking for an exceptionally experienced and creative engineer / scientist. In a small interdisciplinary Scrum team you will apply state-of-the-art deep learning methods to real-world problems, process large amounts of data and deploy production-quality models at scale.</p><p><strong>Your main tasks include</strong></p><ul><li>Conception, design and implementation of mission-critical ML & AI solutions (analysis, architecture, design, implementation & deployment of end-to-end systems)</li><li>Design and implementation of data acquisition, ingestion, validation, transformation, augmentation and visualization pipelines</li><li>Investigating new approaches and evaluating new technologies and tools</li></ul><p><strong>Preferred qualifications</strong></p><ul><li>Master's / PhD in Computer Science, Data Science or equivalent work experience</li><li>Practical experience in machine learning technologies related to NLP for tasks such as Embeddings, Named Entity Recognition, Classification, Taxonomy Construction, Sentiment Analysis, Text Similarity and Predictive Modelling</li><li>Hands-on experience with ML toolkits such as Gorgonia, TensorFlow, Keras, PyTorch etc.</li></ul>
Oracle Developer / Business Analyst - Investment / Asset Management - This is a long-term contract opportunity to join a global Financial Services company to support the development and administration of an existing data warehouse.
Your experience/skills:
- 5+ years of practice in database development with Oracle 11g+, including Oracle functions, procedures, triggers, packages, SQL, PL/SQL and performance tuning
- Work experience in financial markets and products along with investment and risk management
- Good understanding of data modelling concepts and BI tools such as Tableau, Analysis Services and Business Objects
- Know-how of Azure Cloud is beneficial
- Familiarity with BlackRock Aladdin Analytics is an advantage
- Languages: English, fluent in written and spoken
Your tasks:
Commerce Quality Analyst with Chinese: vacancy at a globally operating, Zurich-based company in the technology sector.
Your experience/skills:
- Relevant working experience as a Quality Operations Analyst in the e-commerce sector is a must
- Proficiency in using docs/spreadsheets
- Sufficient knowledge of SQL database management, as well as the ability to handle big data with SQL statements
- Good web research and analytical skills to collect, organize and analyse significant amounts of information with attention to detail
- Strong work ethic, positive attitude, and excellent collaboration skills
- Languages: English and Chinese, fluent in written and spoken
Your tasks:
PartnerRe Ltd. is a leading global reinsurer, providing multi-line reinsurance to insurance companies. It is a dynamic, challenging, and rewarding place to work. We are always looking for bright, proactive people with expert knowledge, skills, and integrity to join our international team. Our culture is based on trust, responsibility, openness, and initiative, and we pride ourselves on delivering the best possible reinsurance solutions for our clients.
Data Scientist: Predictive modelling (80-100%)
We are seeking, for our Zurich office, a technically strong and solution-oriented Data Scientist to join our global analytics team within the Life & Health department.
Our goal @Zenjob is to match people with work that interests them by giving them the freedom to decide when and where they want to work through automation and digitalization. We focus on this effort by working in value streams with engineers embedded directly into cross-functional teams. As a Senior Backend Engineer for Machine Learning, your main objective is to identify the relevant information we need and provide methods to use this knowledge to provide the best possible experience for our customers.
The Cloud Information Factory is Wincasa's central analytical platform. The Factory comprises the central data warehouse, the central BI platform and the data lake. As part of our growing delivery team at the Zurich site, we are looking for an interested and committed
Technical Project Manager DWH/BI (m/f/d)
Your tasks
- Management of IT projects in line with the corporate strategy
- Analysis, design, implementation and roll-out of DWH/BI solutions in direct collaboration with the business units and customers
- Further development of the Cloud Information Factory (BI and analytics solution) based on the latest Microsoft technologies/services (PaaS, Azure)
- Interdisciplinary design of additional components (for process control, approvals, data quality)
- Creation and maintenance of standards and guidelines
- Gathering, assessing and specifying internal and external requirements (analytics, reporting, interfaces)
- Data migration for new customers and mandates
- Coaching and mentoring of the data engineering feature team
Your profile
Senior UX/UI Designer (f/m), 80-100%
For KI labs we are looking for product managers experienced in managing software products and services, to join our rapidly growing team in Munich. As our Product Manager you will actively shape the product vision for our internal and external projects, continuously drive and deliver user-centered features and services, and push the boundaries of possibilities together with our development teams. Since most of our clients are not based in Munich, please note that you might need to travel up to 2-3 days a week (mostly within Germany).
Role objectives and accountabilities:
SUPPORT our business with its data, research and analytics needs
- Provide prescriptive insights to all business functions that improve decision-making.
INFORM our business through reporting and ROI evaluation
- Develop intelligent reporting frameworks that define appropriate metrics and identify risks and opportunities for our business
- Develop tools in Tableau that enable teams to access data for themselves, helping to create a 'data culture' throughout the organization
Data Engineer with analytical skills (Python, PySpark): vacancy at a globally operating, Zurich-based company in the financial sector.
Your experience/skills:
- 6+ years of experience as a Data Analyst or Data Engineer, including 3+ years of practice with Python, ideally within the banking industry
- Expertise in data models and database design, as well as in data discovery and data cleaning on new data with ETL
- Know-how of distributed computing tools such as PySpark, Cloudera and HDFS is a must
- Good understanding of business analysis and machine learning is preferable
- Capability in Java and SQL as well as experience with Foundry and Palantir tools is a plus
- Being an Agile practitioner with excellent stakeholder management skills
- Languages: English, fluent in written and spoken
Your tasks:
Benefits
- Work in a lean, agile and highly motivated team
- Besides the default social benefits, we offer a number of additional perks
- An attractive workplace in Zurich Seefeld
- We offer you computing power to develop your hobby projects
Role and Responsibilities
- You work with various datasets from different Ringier companies, in different business divisions across publishing and marketplaces, and in different regions across continents
- You choose, improve, and apply various methods from statistics, machine learning and deep learning in order to solve business problems
- You model the data of our users, their interests and online behavior in order to serve business demands and enable future business models
- You work closely with our data engineering team and other cross-functional teams to develop prototypes for cross-portfolio use cases, and support the deployment of your models or algorithms into the cloud-native production environment
- You communicate and visualize the value of data by describing your findings, or how your techniques work, to both technical and non-technical audiences
- You closely follow research and engineering developments such as TensorFlow, Torch and Amazon SageMaker, and roll in the relevant technologies and methodologies to keep our modern technology landscape up to date
Education and Experience
- You have a degree (MS, Ph.D.)
<ul> <li>Drive the BI & data component of Bank-wide transformation and business projects </li> <li>Manage the timelines, dependencies and resources of the projects assigned</li> <li>Perform data functional analysis, analyze data business requirements</li> <li>Data warehouse (DWH) design, data modelling, ETL design and development</li> <li>Definition and implementation of Key Performance Indicators</li> <li>Transform raw data into smart data and meaningful insights</li> <li>Participate actively in the definition of the road map and future of BI & DWH in the Bank</li> <li>Identify opportunities to optimize and automate data processes across the Bank</li> <li>Ensure the availability, integrity and quality of the data in the DWH</li> <li>Work closely with business, IT and offshore production support and development teams</li> </ul>
We are looking for a Senior Data Engineer who is passionate about building modern, scalable and well-engineered data infrastructure products that enable product teams at idealo to work autonomously with data. You will be part of a team responsible for developing, running and maintaining the idealo data lake end-to-end. You will contribute to ensuring that this product, among others, is successful at idealo.
As the Data Management area at idealo, we are building a cutting-edge, cloud-based data platform suitable for self-service analytics as well as for machine learning, data products, and our enterprise reporting.
The Role
Data is a core advantage of OTT/Unicast TV over traditional broadcast. As a platform operator we know exactly what content our users watch, how popular a show is, and how many seconds of each and every ad campaign are viewed in our different applications. With roughly 3 million active users every month, the data we collect allow for efficient and precise business processes, including billing, forecasting and performance measurement, as well as planning for our product development and measurement.
Are you curious and eager to learn something new every day? Are you looking for a challenge and want to finally put into practice what you learn in your studies? Then start with us and work internationally, agilely, digitally and with a focus on the future. You will have many networking opportunities and, in addition to fair compensation, receive a job ticket, free drinks, meal subsidies and more. Sounds good? Then apply!
Your tasks:
Are large and/or complex data volumes your thing? Do you enjoy development & automation, and do you want to network data as well as yourself?
As part of our developer team, you will develop the HbbTV applications that run on 10 million TV sets in German households. To ensure that our cutting-edge products run perfectly on all TV sets in the wild, we need to better understand how they behave on the different TV sets provided by the manufacturers. Your part is to combine the two worlds of our JS development team and the data we use to measure performance.