GET racing and VA Imaging: Innovative Vision-Only Autonomous Vehicles

GET racing, the Formula Student team from the Technical University of Dortmund, is setting new standards in autonomous vehicles with their innovative vision-only approach. Traditionally, autonomous vehicles have relied heavily on LiDAR sensors to perceive their environment with high precision. GET racing instead relies solely on advanced industrial cameras and computer vision software to guide their car around the track, eliminating the need for LiDAR altogether. This breakthrough is powered by VA Imaging and marks a significant leap forward in autonomous racing technology.

The Formula Student Competition: From LiDAR to Vision-Only Autonomous Vehicles
Formula Student is an international competition in which university teams design, build, and race autonomous vehicles. Since 2005, GET racing has built over eleven race cars, each incorporating cutting-edge technology. The team transitioned from combustion engines to electric vehicles in 2022 and introduced their second electric car in 2023. Now, GET racing is focusing on developing vision-only autonomous vehicles, in which cameras and software replace LiDAR and other sensors.
From LiDAR to Vision-Only: The Shift in Autonomous Vehicle Technology
In 2024, GET racing used a combination of LiDAR and cameras to detect the track cones. However, a failure of the LiDAR sensor during a thunderstorm in Hungary forced the team to switch to a camera-only system shortly before the Formula Student Germany event. This quick shift led to the development of a vision-only autonomous system that earned them the Real-Time Video Processing Award 2024.
The 2025 Breakthrough: Replacing LiDAR with Vision-Only Autonomous Vehicles
For the 2025 season, GET racing made a major leap by designing their car around a single central camera system, moving away from the traditional LiDAR-first approach. By using a wide-angle fisheye lens, the team achieved a 185° field of view, covering the front, sides, and even slightly behind the car.
The key benefits of the vision-only approach are:
- a single compact camera system with a 185° field of view instead of a heavy LiDAR unit,
- lower weight, power consumption, and cost than a traditional LiDAR setup,
- low-latency perception that keeps up with racing speeds.

For processing, the system uses a low-power embedded computer that can process the images more than 80 times per second. This would not be possible without the team's optimized GET Vision 2 perception software, which uses the hardware to its full extent. This low-latency, low-power solution outperforms the heavier, slower LiDAR systems still used by many teams in autonomous racing.
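To put the 80 Hz figure in perspective, it leaves a budget of roughly 12.5 ms per frame for the entire perception pipeline. The sketch below shows a generic way to check such a budget in Python; `process_frame` is a hypothetical stand-in workload, since GET Vision 2 itself is not public.

```python
import time
import numpy as np

TARGET_HZ = 80
BUDGET_S = 1.0 / TARGET_HZ  # ~12.5 ms available per frame at 80 Hz

def process_frame(frame: np.ndarray) -> None:
    """Hypothetical stand-in for a perception pipeline (detection, planning, ...)."""
    np.clip(frame.astype(np.float32) / 255.0, 0.2, 0.8)

# One synthetic camera frame (downscaled RGB image as a placeholder).
frame = np.random.randint(0, 256, size=(1024, 1224, 3), dtype=np.uint8)

# Time many iterations and compare the worst case against the per-frame budget.
durations = []
for _ in range(200):
    start = time.perf_counter()
    process_frame(frame)
    durations.append(time.perf_counter() - start)

worst = max(durations)
print(f"worst frame: {worst * 1e3:.2f} ms (budget {BUDGET_S * 1e3:.1f} ms)")
print("meets 80 Hz target" if worst < BUDGET_S else "misses 80 Hz target")
```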
Software Innovation: Vision-Based Planning for Autonomous Vehicles
GET racing also completely rewrote their driverless software. Instead of mapping the track during the first lap (a common approach in autonomous racing), their system uses machine learning to calculate the driving trajectory directly from camera data. This allows the car to operate at full speed from the very first lap, unlike many teams that use slower exploration laps.
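As an illustration of what "calculating the trajectory directly from camera data" can look like in code, here is a minimal, hypothetical PyTorch sketch of a network that regresses a handful of waypoints from a single frame. The architecture, input resolution, and waypoint format are assumptions made for the example, not GET racing's actual model.

```python
import torch
import torch.nn as nn

class TrajectoryNet(nn.Module):
    """Toy CNN mapping a downscaled camera frame to N trajectory waypoints.

    Purely illustrative: layer sizes and the waypoint parameterisation are
    assumptions, not the team's production network.
    """

    def __init__(self, num_waypoints: int = 10):
        super().__init__()
        self.num_waypoints = num_waypoints
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, num_waypoints * 2)  # (x, y) per waypoint

    def forward(self, frame: torch.Tensor) -> torch.Tensor:
        features = self.backbone(frame).flatten(1)
        return self.head(features).view(-1, self.num_waypoints, 2)

# One downscaled RGB frame (batch of 1, 3 x 128 x 256 pixels).
frame = torch.rand(1, 3, 128, 256)
waypoints = TrajectoryNet()(frame)  # shape: (1, 10, 2), e.g. metres in the car frame
print(waypoints.shape)
```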
Challenges and Solutions in Developing Vision-Only Autonomous Vehicles
Building such a system posed several engineering challenges. During testing, the team discovered that their lightweight waterproof camera housing caused the cameras to overheat. To solve this, they added a cooling fan and heatsink, ensuring the system maintained optimal performance during high-intensity racing conditions.
Another challenge was ensuring optical clarity while protecting the camera lens. A custom acrylic dome was integrated into the 3D-printed housing to prevent the image distortion that flat protective windows would cause.
Vision Hardware: Key Components for Vision-Only Autonomous Vehicles
The 2025 GET racing car integrates advanced camera technology from VA Imaging, making their vision-only system possible. These components enable real-time processing of essential track features.
Key components include:
- MER3-506-58G3C-P industrial camera: a 5 MP GigE camera with a Sony IMX547 sensor that captures 58 frames per second. The camera is designed for high-speed data processing and is connected and powered through a single Power over Ethernet (PoE) cable, which keeps testing setups simple.
- VA-LCM-5MP-1.8MM-F1.4-015-FISH fisheye lens: a 5 MP C-mount fisheye lens providing a 185° field of view. The lens offers an F1.4 aperture, ensuring excellent image clarity even in low-light conditions.
These components enable fast, high-quality image processing, allowing the car to navigate autonomously with precision. For more information on the hardware components available from VA Imaging, go here.
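As a rough illustration of how detections from such a fisheye setup can feed the planner, the sketch below uses OpenCV's equidistant fisheye model to convert cone pixel positions into bearing angles. The intrinsics, distortion coefficients, and pixel coordinates are placeholders rather than a real calibration, and the pinhole normalization only covers the forward part of the 185° field of view.

```python
import cv2
import numpy as np

# Placeholder intrinsics for an equidistant fisheye calibration (cv2.fisheye model).
# Real values would come from calibrating the actual sensor/lens pair.
K = np.array([[650.0, 0.0, 1224.0],
              [0.0, 650.0, 1024.0],
              [0.0, 0.0, 1.0]])
D = np.array([[-0.04], [0.01], [-0.002], [0.0004]])  # k1..k4

# Pixel positions of two detected cones (e.g. output of a cone detector).
cones_px = np.array([[[500.0, 1300.0]],
                     [[2000.0, 1250.0]]], dtype=np.float64)

# Map distorted pixels to normalized pinhole coordinates on the z = 1 plane.
cones_norm = cv2.fisheye.undistortPoints(cones_px, K, D).reshape(-1, 2)

for (u, v), (x, y) in zip(cones_px.reshape(-1, 2), cones_norm):
    bearing = np.degrees(np.arctan2(x, 1.0))     # left/right angle from the optical axis
    elevation = np.degrees(np.arctan2(-y, 1.0))  # up/down angle
    print(f"cone at pixel ({u:.0f}, {v:.0f}) -> "
          f"bearing {bearing:+.1f} deg, elevation {elevation:+.1f} deg")
```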

Integration and Collaboration with VA Imaging
The collaboration with VA Imaging was crucial to GET racing's success in developing a vision-only autonomous system. VA Imaging provided technical consultation, assisting with camera and lens selection, and the team rated the collaboration highly.
This collaboration enabled GET racing to build a vision-only system that is compact, reliable and more cost-effective than traditional systems relying on LiDAR.
How Vision-Only Systems are Replacing LiDAR in Autonomous Racing
The partnership between GET racing and VA Imaging is a prime example of how industry-academia collaboration can drive technological innovation. Together, they have developed a vision-only autonomous vehicle that is faster, lighter and more efficient by moving beyond traditional LiDAR-based systems. This vision-only approach is setting the stage for the future of autonomous vehicles in motorsports, demonstrating that technology and ingenuity can push the limits of what is possible.
Interested in how VA Imaging's advanced vision systems are enabling autonomous vehicle innovations like the GET racing collaboration? Whether you're exploring similar applications or want to learn more about our cameras and lenses, reach out to us today. Fill out the contact form below to connect with our experts.

Gaspar van Elmbt