
Big Data DevOps Engineer

At the Volkswagen Electronics Research Library (VWERL), this hands-on position is responsible for helping to design, develop, deploy, and maintain cutting-edge big data systems that enable the VW Group to better understand and serve our customers. This position will work with local and international colleagues to develop global-scale infrastructure supporting acquisition, analysis, and machine learning using data from future vehicle releases of the VW Group (brands such as Volkswagen, Audi, and Porsche).
Products impacted by this role range from customer insights informing new services to the most advanced autonomous driving systems in the industry.
Responsibilities:
Lead the DevOps efforts for a petabyte-scale data collection and machine learning infrastructure project that will power future intelligent driving systems.
Update, deploy, and monitor systems developed by the team, proactively looking for opportunities to improve performance, reliability, and automation.
Help move implementations through automotive industry development phases, from research to production deployment supporting millions of cars on the road.
Analyze and support DevOps requirements from infrastructure developers and machine learning specialists, focusing on the most important topics in the rapidly evolving automotive industry.
Oversee day-to-day usage and provisioning of cloud resources, both in terms of fixed resource needs, as well as helping build systems for dynamic scaling based on project needs.
Required Skills and Experience:
Experience with key DevOps technologies such as Docker, and hands-on DevOps experience in AWS, including CloudFormation.
Broad skills and experience in deploying, debugging, and managing systems in cloud environments such as AWS.
Ability to build and maintain continuous integration (CI) and continuous deployment (CD) systems for complex, distributed applications, using tools like Jenkins.
Experience diagnosing and debugging applications in complex, distributed, heterogeneous computing environments.
Mastery of key development tools such as Git, and familiarity with collaboration tools such as Jira and Confluence.
At least 4 years of industry experience working with cloud technologies and handling DevOps functions for complex systems.
Bachelor's degree or equivalent required.
Desired Skills and Experience:
Familiarity with AWS CodePipeline, AWS CodeDeploy, and AWS CodeCommit CI/CD services.
Hands-on development with relevant distributed computing languages, frameworks, and libraries.
Experience with highly scalable, distributed databases and the key issues affecting their performance and reliability.
Experience or understanding of developing machine/deep learning systems in a distributed environment.
Conventional backend development experience.


