Published 29 May 2024, 9:00 am

Software Engineer II, Perception at Latitude AI

Latitude AI (lat.ai) is an automated driving technology company developing a hands-free, eyes-off driver assist system for next-generation Ford vehicles at scale. We’re driven by the opportunity to reimagine what it’s like to drive and make travel safer, less stressful, and more enjoyable for everyone.  

When you join the Latitude team, you’ll work alongside leading experts across machine learning and robotics, cloud platforms, mapping, sensors and compute systems, test operations, systems and safety engineering – all dedicated to making a real, positive impact on the driving experience for millions of people. 

As a Ford Motor Company subsidiary, we operate independently to develop automated driving technology at the speed of a technology startup. Latitude is headquartered in Pittsburgh with engineering centers in Dearborn, Mich., and Palo Alto, Calif.

Meet the team:

The Perception Anomaly Detection team at Latitude is responsible for detecting and handling scenarios outside the vehicle's known operational domain. This includes identifying out-of-distribution or unknown objects and scenarios, as well as environmental conditions such as precipitation and low visibility. Equipping our perception system with the ability to say 'no' when faced with unfamiliar input is essential, particularly in safety-critical applications.

Our team integrates these systems onboard and evaluates their performance offboard to ensure they align with our safety case needs. We work closely with systems engineering to develop a formal validation story and to ensure the vehicle operates correctly while requesting that the user retake control in the situations described above. As the lead for this team, you will have a unique opportunity to transform research concepts into practical solutions that have a significant impact on our product.

What you’ll do: 

  • Develop innovative and scalable solutions for anomaly detection, using cutting-edge technologies and best practices
  • Collaborate closely with internal perception experts and external stakeholders to ensure seamless integration of software and systems
  • Champion safety and quality standards throughout the software development lifecycle, from design to deployment
  • Continuously evaluate and optimize software performance, using data-driven insights and feedback from stakeholders
  • Stay up-to-date with emerging trends and technologies in the field, and share knowledge with the team
  • Build and maintain industry-leading software practices and principles

What you'll need to succeed:

  • Bachelor's degree in Computer Science, ECE, Robotics, Physics, Mathematics, or related field and 2+ years of experience
  • 2+ years of experience in computer vision, machine learning and deep learning
  • 2+ years of experience in C++ or Python software development
  • Track record of research and development in deep learning, machine learning, and/or AI
  • Track record of product development culminating in successful launches or releases

Nice to have: 

  • Master’s or PhD degree in Robotics, ECE, Computer Science, or a related field
  • Track record of technical leadership (not limited to management) taking products or large features from research to consumer release
  • Experience in the field of autonomous driving

What we offer you:

  • Competitive compensation packages
  • High-quality individual and family medical, dental, and vision insurance
  • Health savings account with available employer match
  • Employer-matched 401(k) retirement plan with immediate vesting
  • Employer-paid group term life insurance and the option to elect voluntary life insurance
  • Paid parental leave
  • Paid medical leave
  • Unlimited vacation
  • 15 paid holidays
  • Complimentary daily lunches, beverages, and snacks for onsite employees
  • Pre-tax spending accounts for healthcare and dependent care expenses
  • Pre-tax commuter benefits
  • Monthly wellness stipend
  • Adoption/Surrogacy support program
  • Backup child and elder care program
  • Professional development reimbursement
  • Employee assistance program
  • Discounted programs that include legal services, identity theft protection, pet insurance, and more
  • Company and team bonding outlets: employee resource groups, quarterly team activity stipend, and wellness initiatives

The expected total salary range for this full-time position in California is $240,960-$300,320 USD. Actual starting pay will be based on job-related factors, including exact work location, experience, relevant training and education, and skill level. Latitude employees are also eligible to participate in Latitude’s annual bonus programs, equity compensation, and generous Company benefits program, subject to eligibility requirements.

Learn more about Latitude’s team, mission and career opportunities at lat.ai!

Candidates for positions with Latitude AI must be legally authorized to work in the United States on a permanent basis. Verification of employment eligibility will be required at the time of hire. Visa sponsorship is available for this position.

We are an Equal Opportunity Employer committed to a culturally diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, color, age, sex, national origin, sexual orientation, gender identity, disability status or protected veteran status.




The advertising company is responsible for the content of this page / job posting.

Source: Remote Ok
