Are you passionate about creatively using technology? Do you want the chance to work with databases operating at 10^12 scale? Our Hadoop development team is looking for a skilled developer and systems-level expert who can help build and maintain our big data architecture. We want someone who can not only use but also build a complex Hadoop cluster and data pipeline infrastructure. Interested? Apply today! What you’ll do:
Project A is the operational VC that provides its ventures with capital, an extensive network and exclusive access to a wide range of operational expertise. The Berlin-based investor makes use of the 260m in assets under its management to back early-stage companies in the digital technology space. With its unique organizational structure featuring 100 operational experts, Project A offers its portfolio companies hands-on support in the areas of Software Engineering, Digital Marketing, Design, Communications, Business Intelligence, Sales and Recruiting.
Your Tasks – Paint the world green
- Holistic cloud-based infrastructure automation
- Distributed data processing clusters as well as data streaming platforms based on Kafka, Flink and Spark
- Microservice platforms based on Docker
- Development infrastructure and QA automation
- Continuous Integration/Delivery/Deployment

Your Profile – Ready to hop on board
- Experience in building and operating complex infrastructure
- Expert level: Linux, system administration
- Experience with cloud services, expert level with either AWS or GCP
- Experience with server and operating-system-level virtualization is a strong plus, in particular practical experience with Docker and cluster technologies like Kubernetes, AWS ECS, OpenShift
- Mindset: “Automate Everything”, “Infrastructure as Code”, “Pipelines as Code”, “Everything as Code”
- Hands-on experience with “Infrastructure as Code” tools: Terraform, CloudFormation, Packer
- Experience with provisioning/configuration management tools (Ansible, Chef, Puppet, Salt)
- Experience designing, building and integrating systems for instrumentation, metrics/log collection, and monitoring: CloudWatch, Prometheus, Grafana, DataDog, ELK
- At least basic knowledge of designing and implementing Service Level Agreements
- Solid knowledge of network and general security engineering
- At least basic experience with systems and approaches for test, build and deployment automation (CI/CD): Jenkins, TravisCI, Bamboo
- At least basic hands-on DBA experience, including data backup and recovery
- Experience with JVM-based build automation is a plus: Maven, Gradle, Nexus, JFrog Artifactory

Link: https://www.
Big Data Developer
If you love writing high-quality code, Docker is your daily business, or you know Spark like the back of your hand, we have just the right challenges for you! At Ultra Tendency you would:
- Make Big Data useful (build program code, test and deploy to various environments, design and optimize data processing algorithms for our customers)
- Support your development team with feature implementation and automated testing of data-driven applications
- Make a difference by bringing in new ideas to improve the quality or development methodology of the team

Ideally you have:
Data Engineer (f/m) - Universal Music Entertainment GmbH Universal Music is the world’s leading music company. We own and operate a broad array of businesses engaged in recorded music, music publishing, merchandising and audiovisual content in more than 60 countries. Universal Music Group is a subsidiary of Vivendi, a globally operating media and communications company. Digitalization is influencing almost every part of life, especially the music industry. As the number of sales units grows ever faster due to streams and downloads, we must handle bigger data feeds, process them faster and analyze them more precisely to gather more and better insights.
Delivery Hero is building the next generation global online food-delivery platform. We’re a truly global team, working across 45 countries to ensure our customers are able to find, order and receive their favourite food in the fastest way possible. Since we started our journey in 2011, Delivery Hero has become the world’s largest food-delivery network, and we’re focused on a culture of growth, in both size and opportunities. If you’re an enthusiastic, innovative problem solver hungry for a new adventure, an exciting job and an international workplace are waiting for you in the heart of Berlin!
We are looking for a new position as Data Engineer (m/f) for our client KONUX in Munich: At KONUX, we are builders, problem solvers and team players. We combine German engineering with Silicon Valley speed and innovation to create the best possible predictive analytics solutions for our customers. Our mission is to help industrial companies improve their decisions on maintenance and performance in a data-driven world. Working with us means opening up new paths with both tech-savvy engineers and creative business pros, and being part of a fast-paced, high-performance environment.
mbr targeting uses patent-pending algorithms for highly efficient real-time advertising. We are 100% science- and technology-focused and process and analyze massive amounts of data. We are working at the cutting edge of big data, machine learning and real-time technologies, and we operate large-scale deployments of real-time web services. We are looking for a Data Engineer (m/f). You will …
- Implement, optimise and maintain ETL processes.
- Manage our Big Data infrastructure, including the petabyte-scale Hadoop cluster.
As Data Warehouse Engineer (m/f) in Online Gaming you will take a key role in the ongoing development, maintenance and architectural evolution of our data warehousing systems. You will be shaping the future of our data warehouse solution based on big data technologies and be responsible for collecting, processing and providing data in accordance with decisive business requirements. You join a supportive team of specialists in a pragmatic environment, with short communication processes and flat hierarchies, perfect for developing your skills and competencies and having a high impact on our business results.
We are looking for a Data Engineer. You will …
- Implement, optimise and maintain ETL processes.
- Manage our Big Data infrastructure, including the petabyte-scale Hadoop cluster.
- Collaborate with Data Scientists on building scalable and reliable data pipelines.
- Develop new and improve existing business intelligence tools and dashboards.
- Build data expertise and take responsibility for data quality in our data warehouse.

You have … proficiency with one or more programming languages we use every day (e.g. Java, Scala, Python).
We are looking for a backend developer with reverse-engineering experience to join the team in our Berlin office. You like a good challenge and don’t give up easily? You will work together with our data scientists to provide and process game data. We will teach you a lot, but ideally, you already have some hands-on reverse-engineering experience. Our team consists of passionate gamers who value a culture of clear communication and constant improvement, and we look forward to welcoming you to the DOJO.
We are looking for: a new team member for our A-Team. What will keep you challenged?
- Help our business intelligence team build data-driven applications for our ventures: data warehouses, recommendation engines and CRM systems (developed in-house, based on open-source technologies). See https://www.youtube.com/watch?v=GdtFuOah-5c for an overview of how we do this.
- Integrate data from various systems into flexible and consistent representations. You will make sure that all people and IT systems in the organization have easy access to the data through various front-ends and interfaces.
- Advance our software architecture and toolset to meet growing challenges and data volumes (performance, scaling, data quality).
- Work in an agile software development process in close collaboration with the product management team.

Which traits contribute to your success?
As Data Warehouse Architect (m/f) you will take a key role in the evolution of our data warehousing systems. You will be shaping the future of our data warehouse solution based on big data technologies and be responsible for driving its implementation in accordance with decisive business requirements. Your Job:
- You will take control of the present data warehouse solution and be in charge of designing, validating and implementing improvements to the system architecture.
- You will make sure the system reliably handles large volumes of data and fulfills the necessary analytical processing load, while keeping maintenance efforts at bay.
- You keep an eye on providing robust, easy-to-use solutions that allow efficient day-to-day operations by yourself and your team members, covering system maintenance as well as the implementation of ETL pipelines and business reports.
- You will be conceiving and implementing ETL jobs, integrating and aggregating data from many disparate internal and external sources, advancing data quality and consistency.
- You will be in direct contact with stakeholders and analysts from business departments and game studios, understanding their business requirements and translating them into technical solutions.

Your Profile:
Join us! Are large volumes of data your passion, and do you find everything to do with Big Data exciting? Great, so do we! To strengthen our Big Data team, we are looking for motivated and independent colleagues who want to actively shape the technological future with us:
- You have already gained initial experience in the Hadoop environment and know the components of its ecosystem.
- You are comfortable working with Linux and, where applicable,
View job and apply on the corporate career site here. Tasks include:
- Push the frontiers of what is possible in the area of Machine Learning, working with real data
- Explore, understand, and implement the most recent machine learning algorithms and approaches
- Fulfill both functional and non-functional (availability, throughput, latency) requirements
- Create excellence both in terms of results quality and system scalability through continuous evaluation, analysis, and refinement
- Communicate the relevance of implemented systems and achieved results in a visual and consistent way

Required skills & qualifications:
What you can expect We are a dynamic and open-minded IT team of system administrators, software developers, data scientists and data analysts with a fully DevOps-oriented mindset and working style. Our goal is to ensure high standards of quality through continuous optimization. In this role, you’ll work collaboratively with software developers to deploy, debug and operate our systems, automating and optimizing our operations and processes as much as possible. You will build and maintain tools for deployment, monitoring and operations.
Major duties and responsibilities:
- Lead the development and optimization of Python and C/C++ software executed on cloud and deep learning infrastructures, using state-of-the-art practices in collaboration with the project’s technical lead
- Define and implement QA and software best practices throughout the organization
- Conduct technical surveys of C/C++ and Python standards

Essential skills and experience required:
- Deep knowledge of data structures and algorithms
- Strong proficiency in the Python and C/C++ programming languages
- Professional experience in software engineering, software design patterns and software testing
- Professional experience using UNIX/Linux operating systems
- Good team player and fluent English
- Strong work ethic

Preferred:
As you know, SAP’s vision is to help the world run better and improve people’s lives. As THE cloud company powered by SAP HANA®, SAP is a market leader in enterprise application software, helping companies of all sizes and industries Run Simple. We empower people and organizations to work together more efficiently and use business insight more effectively. SAP applications and services enable our customers to operate profitably, adapt continuously, and grow sustainably.
HERE is a leader in navigation, mapping and location experiences. We combine highly accurate and fresh maps with cloud technology to enable rich, real-time location experiences in a broad range of connected devices, from smartphones and tablets to wearables and vehicles. To learn more about HERE, including our work in the areas of connected and autonomous driving, visit http://360.here.com
The Client Data Service team aims to provide HERE customers with the most relevant and the most up-to-date maps.
Your Tasks:
- Installation and operation of hardware for Big Data systems, installation and operation of software for Big Data applications (mainly from the Hadoop ecosystem), and provisioning of the Big Data environment for customers
- Consulting and supporting customers in carrying out Big Data projects
- Supporting activities at information events in the Big Data field and activities to increase the awareness and visibility of IT activities and services in the Big Data area

Your Profile: