Would you like to shape the future of urban mobility? Join our team of experts in software development, data science, computer vision and geoinformation systems. At Parkbob we work on enabling a seamless last mile experience by delivering digital curb-side data as well as mobility optimization services. We are excited about the future of urban mobility and hope you will join us on this journey.
For our StreetCrowd service, we are building a continuously growing user base and engaging users to enable more effective mobility services across cities through car-sharing fleet optimization.
At FinCompare we are on a mission to simplify SME finance by using technology to offer businesses a convenient one-stop destination for all their financing needs. In an opaque financing market, we provide access to modern, suitable financing products and the ability to compare them. We are therefore looking for smart team members to join us on our journey to become Europe’s biggest, most advanced and most reliable SME financing platform.
European XFEL is an international non-profit company located in the Hamburg area in Germany. It operates a 3.4 km long X-ray laser, which produces X-rays of unique quality for studies in physics, chemistry, the life sciences, materials research and other disciplines. The diverse scientific facilities at European XFEL enable scientists from across the globe to carry out a wide range of experimental techniques. Early user operation started in September 2017.
Would you like to shape the future of urban mobility? Join our team of experts in software development, data science, computer vision and geoinformation systems, and work on products that enable a seamless last mile experience by delivering context-aware parking rules & restrictions as well as real-time parking information.
For our StreetCrowd service, we are building a continuously growing user base and engaging users to enable more effective mobility services across cities through car-sharing fleet optimization.
As an engaged, application-oriented data engineer, you will implement state-of-the-art biotech software focused on data repository and collaborative analytics functionality at mid-sized to large biotech companies. You will leverage their existing data infrastructure and align it with Exputec’s architecture in order to successfully deploy the software at the customer.
Principal Roles & Responsibilities
Assist clients with the installation of the inCyght software on their Linux- and Windows-based IT infrastructure. This may proceed via SSH access, web conference, remote desktop, or an on-site visit.
IMAGINE - What you will work on:
Our Transportation Infrastructure Technologies Competence Unit (https://www.ait.ac.at/en/research-topics/road-condition-monitoring-assessments/) is at the forefront of technology development, for example in road condition assessment. Our fleet of measuring vehicles includes the RoadSTAR, a mobile laboratory that provides accurate and objective data on the condition of road surfaces, such as motorway pavements. You will be part of a multidisciplinary expert team, responsible for developing the software for the measurement and monitoring systems, and you will conduct in-depth data analyses.
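Road-condition assessment of this kind is typically derived from longitudinal surface profiles. As a hedged illustration only (this is not the RoadSTAR's actual processing, which the posting does not describe), a simple roughness proxy can be computed as the standard deviation of surface elevations over a sliding window:

```python
from statistics import pstdev

def roughness_profile(elevations_mm, window=5):
    """Rolling roughness proxy: population standard deviation of
    surface elevations (in mm) over a sliding window of samples.
    A purely illustrative metric, not an IRI implementation."""
    if len(elevations_mm) < window:
        raise ValueError("profile shorter than window")
    return [pstdev(elevations_mm[i:i + window])
            for i in range(len(elevations_mm) - window + 1)]

# A smooth stretch of road followed by a rough one (invented values).
profile = [0.0, 0.1, 0.0, 0.1, 0.0, 2.0, -2.0, 2.5, -1.5, 2.0]
scores = roughness_profile(profile)
print(scores[-1] > scores[0])  # the rough stretch scores higher
```

In practice a standardized metric such as the International Roughness Index would be used, but the windowed statistic shows the shape of the computation.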
Development and testing of efficient CFD methods for digital aircraft design
Aerospace engineer, mathematician, or similar
Your mission:
The Institute of Software Methods for Product Virtualization in Dresden develops the software foundations for realizing the virtual product on the basis of a highly accurate mathematical-numerical description. To this end, tailored HPC computation methods are developed for the efficient unsteady simulation of the aircraft in flight.
Within the framework of a partnership agreement with Airbus Operations GmbH in Bremen, the next-generation CFD code RHEA, jointly developed by DLR, ONERA and Airbus, is to be extended with an efficient linear solution method for time-periodic flows in the frequency domain (LFD).
Software Engineer (m/f/d) – Machine Learning / Deep Learning (Bachelor's degree) (AS-2019-089)
Application deadline
27 September 2019
Working hours
Full-time
Career track
Higher intermediate civil service (gehobener Dienst)
Place of work
Gauting (postal code 821)
Reference number
AS-2019-089
Requirements profile
- Completed Bachelor's or university of applied sciences degree in computer science, electrical/information engineering, or a comparable field
- IT expertise in the following areas:
- Good knowledge of at least one object-oriented high-level programming language (e.g. Java, C++, …) as well as knowledge of Python
- Basic knowledge of machine learning or pattern recognition methods, or of information extraction from spoken language, is desirable
- Confident use of common operating systems, especially Linux
- Knowledge of relational and non-relational databases
- Initial experience with agile software development processes (e.g.
We are looking for reinforcements for our extraordinary team at “The unbelievable Machine Company.” As experts with hands, brains and heart, we develop precise IT solutions for our customers' individual challenges. Our specialties are Big Data, Cloud Services and Hosting. We cut complex problems down to size with new ways of thinking and acting and with powerful technologies.
We don't wear ties, yet we are highly professional. That is why we achieve extraordinary results and have been growing continuously for years. If you share our values and bring that “certain something,” we look forward to your application!
Are you passionate about creatively using technology? Do you want the chance to work with databases operating at 10^12 scale? Our Hadoop development team is looking for a skilled developer and systems-level expert who can help build and maintain our big data architecture. We want someone who can not only use but also build a complex Hadoop cluster and data pipeline infrastructure. Interested? Apply today!
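The processing model behind a Hadoop pipeline is MapReduce. A minimal in-memory sketch of that model (purely illustrative; on a real cluster the map and reduce phases are sharded across many nodes and HDFS blocks, not run in one process):

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Emit (word, 1) key-value pairs, as a Hadoop mapper would.
    return [(word.lower(), 1) for word in line.split()]

def reduce_phase(pairs):
    # Group the pairs by key and sum the counts, as a reducer would.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

lines = ["big data big pipelines", "big clusters"]
word_counts = reduce_phase(chain.from_iterable(map_phase(l) for l in lines))
print(word_counts["big"])  # 3
```

The same mapper/reducer contract scales from this toy example to petabyte-sized inputs because each phase only ever sees independent chunks of data.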
What you’ll do:
For Munich we are looking for a Machine Learning Engineer (m/f/d)
Do you believe electric cars are the future? Is programming more than just a job for you, and can you point to projects you are especially proud of?
Using a digital twin, we make it possible to precisely analyze the overall state of lithium-ion batteries in electric vehicles in real time and to predict their lifetime.
Does that sound exciting, and do you want to support us? Then apply to join the TWAICE family!
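To give a flavor of the kind of calculation involved (a hypothetical sketch, not TWAICE's actual model): battery state of health (SoH) is commonly expressed as measured capacity over nominal capacity, and a naive lifetime estimate extrapolates the capacity fade per cycle linearly:

```python
def state_of_health(measured_capacity_ah, nominal_capacity_ah):
    """SoH as the fraction of nominal capacity still available."""
    return measured_capacity_ah / nominal_capacity_ah

def remaining_cycles(soh_now, cycles_so_far, soh_end_of_life=0.8):
    """Naive linear extrapolation of cycles until end-of-life SoH.
    Assumes a constant fade per cycle, which real cells do not obey;
    an actual digital twin uses far richer electrochemical models."""
    fade_per_cycle = (1.0 - soh_now) / cycles_so_far
    return (soh_now - soh_end_of_life) / fade_per_cycle

soh = state_of_health(45.0, 50.0)                 # 0.9 after some aging
print(remaining_cycles(soh, cycles_so_far=500))   # 500.0 cycles to 80% SoH
```

The 80% end-of-life threshold is a common industry convention; the capacity figures above are invented for illustration.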
Your role
To strengthen the Health Insurance/Business Intelligence department at our Aachen or Hamburg location, we are looking for you, at the earliest possible date, as a Big Data Platform Architect (m/f/d) with a focus on BI and Analytics.
- You are responsible for building and evolving a big data platform as part of an agile team
- You conceive, design and develop components, processes and frameworks to build a central data hub (data lake) for structured and unstructured data, with a focus on enabling BI and analytics
- You are responsible for the provisioning and operation (DevOps) of all platform components in compliance with the agreed SLAs
- You take over monitoring regarding
The Role
As a Data Engineer at KONUX you will be responsible for managing a large research data resource that is growing rapidly in both size and complexity. You will take pride in ensuring that the data science team has the best possible access to well-organized data sources. You will also act as an interface with our production team, ensuring alignment of model data requirements. You will, of course, be automating data management processes and be responsible for data cleansing.
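Data cleansing of the kind described usually means normalizing and deduplicating records before they reach the science team. A minimal, hypothetical sketch (the record fields are invented for illustration, not KONUX's actual schema):

```python
def cleanse(records):
    """Normalize string fields and drop duplicate records,
    keeping the first occurrence (a common, simple policy)."""
    seen = set()
    cleaned = []
    for record in records:
        normalized = {k: v.strip().lower() if isinstance(v, str) else v
                      for k, v in record.items()}
        key = tuple(sorted(normalized.items()))
        if key not in seen:
            seen.add(key)
            cleaned.append(normalized)
    return cleaned

raw = [{"sensor": " S1 ", "value": 3},   # same record twice, differently cased
       {"sensor": "s1", "value": 3},
       {"sensor": "S2", "value": 5}]
print(len(cleanse(raw)))  # 2
```

Automating steps like this in a scheduled pipeline is what keeps a fast-growing research data resource well organized.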
Who we are
We create software that puts users in control over their online browsing experience. Our products, such as Adblock Plus, Adblock Browser and Flattr, help sustain and grow a fair, open web because they give users control while providing user-friendly monetization. Our most popular product, Adblock Plus (ABP), is currently used on over 100 million devices.
Our multi-cultural team wants to change the Internet for the better, and you can become an important part of it.
Your Tasks – Paint the world green
- Holistic cloud-based infrastructure automation
- Distributed data processing clusters as well as data streaming platforms based on Kafka, Flink and Spark
- Microservice platforms based on Docker
- Development infrastructure and QA automation
- Continuous Integration/Delivery/Deployment
Your Profile – Ready to hop on board
- Experience in building and operating complex infrastructure
- Expert-level Linux and system administration
- Experience with cloud services; expert level with either AWS or GCP
- Experience with server and operating-system-level virtualization is a strong plus, in particular practical experience with Docker and cluster technologies like Kubernetes, AWS ECS, OpenShift
- Mindset: “Automate Everything”, “Infrastructure as Code”, “Pipelines as Code”, “Everything as Code”
- Hands-on experience with Infrastructure-as-Code tools: Terraform, CloudFormation, Packer
- Experience with provisioning/configuration management tools (Ansible, Chef, Puppet, Salt)
- Experience designing, building and integrating systems for instrumentation, metrics/log collection, and monitoring: CloudWatch, Prometheus, Grafana, DataDog, ELK
- At least basic knowledge in designing and implementing Service Level Agreements
- Solid knowledge of network and general security engineering
- At least basic experience with systems and approaches for test, build and deployment automation (CI/CD): Jenkins, TravisCI, Bamboo
- At least basic hands-on DBA experience, including data backup and recovery
- Experience with JVM-based build automation is a plus: Maven, Gradle, Nexus, JFrog Artifactory
Link: https://www.
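To give a flavor of the “Infrastructure as Code” mindset listed above, here is a minimal Terraform fragment; all values (region, AMI ID, instance type, tags) are placeholders for illustration, not anything from this posting:

```hcl
# Provision a single build node; every name and value here is illustrative.
provider "aws" {
  region = "eu-central-1"
}

resource "aws_instance" "build_node" {
  ami           = "ami-00000000000000000" # placeholder AMI ID
  instance_type = "t3.medium"

  tags = {
    Name    = "ci-build-node"
    Managed = "terraform"
  }
}
```

Declaring the machine in version-controlled code like this, rather than clicking it together in a console, is what makes the infrastructure reviewable, repeatable and disposable.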
Project A is the operational VC that provides its ventures with capital, an extensive network and exclusive access to a wide range of operational expertise. The Berlin-based investor uses its 260m in assets under management to back early-stage companies in the digital technology space. With its unique organizational structure featuring 100 operational experts, Project A offers its portfolio companies hands-on support in the areas of Software Engineering, Digital Marketing, Design, Communications, Business Intelligence, Sales and Recruiting.
Data Engineer (f/m) - Universal Music Entertainment GmbH
Universal Music is the world’s leading music company. We own and operate a broad array of businesses engaged in recorded music, music publishing, merchandising and audiovisual content in more than 60 countries. Universal Music Group is a subsidiary of Vivendi, a globally active media and communications company.
Digitalization is influencing almost every part of life, especially the music industry. As the number of sales units grows ever faster due to streams and downloads, we must handle bigger data feeds, process them faster, and analyze them more precisely to gain more and better insights.
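Handling growing feeds of stream and download events typically means incremental aggregation rather than recomputing totals from scratch on every batch. A hedged Python sketch (the event field names are invented for illustration):

```python
from collections import Counter

def aggregate_plays(events, totals=None):
    """Fold a feed of play/download events into running per-track
    totals, so each new batch updates the counts incrementally."""
    totals = totals if totals is not None else Counter()
    for event in events:
        totals[event["track_id"]] += event.get("units", 1)
    return totals

# Two successive batches arriving from the feed (invented data).
batch_1 = [{"track_id": "T1"}, {"track_id": "T2", "units": 3}]
batch_2 = [{"track_id": "T1", "units": 2}]
totals = aggregate_plays(batch_1)
totals = aggregate_plays(batch_2, totals)
print(totals["T1"], totals["T2"])  # 3 3
```

Because each batch only touches the keys it contains, this pattern keeps per-event work constant even as the feed volume grows.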