Applied Scientist - Perception (SLAM/VIO), Fauna
Amazon.com, Inc.
New York, NY (In Person)
Full-Time
Job Description
We are seeking an Applied Scientist to develop and optimize Visual Inertial Odometry (VIO) and sensor fusion systems for our intelligent robots. In this role, you will design, implement, and deploy state estimation and tracking algorithms that enable robots to understand their position and motion in real time, even in challenging and dynamic environments. You will own the full pipeline from algorithm development through embedded deployment, ensuring that perception systems run efficiently on resource-constrained robotic hardware. You will also leverage modern machine learning approaches to push the boundaries of classical perception methods, combining learned representations with geometric techniques to achieve robust, real-time performance.

This is a deeply hands-on role. You will work directly with sensors, hardware, and real-world data, while prototyping, testing, and iterating in physical environments. The ideal candidate has strong foundations in VIO and sensor fusion, practical experience optimizing algorithms for embedded platforms, and familiarity with how modern deep learning is transforming perception.

Key job responsibilities
- Design and implement Visual Inertial Odometry algorithms for robust real-time state estimation on robotic platforms like Sprout
- Develop multi-sensor fusion pipelines integrating cameras, IMUs, and other sensing modalities for accurate pose tracking
- Optimize perception and tracking algorithms for deployment on embedded hardware (e.g., ARM, GPU-accelerated edge devices) under strict latency and power constraints
- Apply modern ML-based perception techniques (learned features, depth estimation, neural odometry) to complement and improve classical geometric approaches
- Build and maintain calibration, evaluation, and benchmarking infrastructure for perception systems
- Collaborate with hardware, controls, and navigation teams to integrate perception outputs into the robot's autonomy stack
- Lead technical projects from research prototyping through production deployment

Basic Qualifications
- PhD, or Master's degree and 3+ years of applied research experience
- Proficiency in at least one programming language such as Python, Java, or C++
- Hands-on experience developing and deploying Visual Inertial Odometry or visual-inertial SLAM systems
- Strong understanding of multi-sensor fusion (cameras, IMUs, odometry) and state estimation (EKF, factor graphs)
- Experience optimizing perception algorithms for embedded or resource-constrained hardware
- Demonstrated hands-on experience with real sensor data, calibration, and physical robot platforms
- Familiarity with modern ML approaches to perception (learned feature extraction, depth prediction, end-to-end odometry)

Preferred Qualifications
- Experience leading technical initiatives and key deliverables
- Publication record at major robotics or computer vision conferences (e.g., ICRA, IROS, RSS, CVPR, ECCV)
- Experience with real-time systems programming and performance profiling on ARM/GPU platforms
- Experience with state estimation on legged robots
- Experience with stereo vision systems, camera-IMU calibration, time synchronization, and sensor characterization
- Track record of shipping VIO or SLAM systems to production on physical robots at scale
- Experience with NVIDIA Jetson, Qualcomm RB5, or similar embedded AI platforms
- Familiarity with ROS/ROS2
- Experience integrating learned perception modules (e.g., neural depth, feature matching networks) into geometric estimation pipelines
- History of technical leadership and cross-functional collaboration

Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status.
Pay range: $172,400.00 to $223,400.00