A multi-sensor traffic scene dataset with omnidirectional video
The development of vehicles that perceive their environment, in particular those using computer vision, indispensably requires large databases of sensor recordings obtained from real cars driven in realistic traffic situations. These datasets should be time-stamped to enable synchronization of sensor data from different sources. Furthermore, full surround environment perception requires high frame rates of synchronized omnidirectional video data to prevent information loss at any speed.
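Time-stamped streams from different sensors rarely sample at the same instants, so synchronization typically means aligning each camera frame with the nearest sample from every other sensor. The paper's own format and API are not detailed here; the following is a minimal sketch of such nearest-timestamp alignment, with hypothetical names (`align_to_frames`, `max_skew`) chosen for illustration.

```python
import bisect

def align_to_frames(frame_ts, sensor_ts, sensor_vals, max_skew=0.02):
    """For each camera frame timestamp, pick the nearest sensor sample.

    Hypothetical helper: frame_ts and sensor_ts are sorted timestamps in
    seconds; a sensor sample is used only if it lies within max_skew of
    the frame time, otherwise None is recorded for that frame.
    """
    aligned = []
    for t in frame_ts:
        i = bisect.bisect_left(sensor_ts, t)
        # Candidates: the sample just before and just after t.
        best = None
        for j in (i - 1, i):
            if 0 <= j < len(sensor_ts):
                if best is None or abs(sensor_ts[j] - t) < abs(sensor_ts[best] - t):
                    best = j
        if best is not None and abs(sensor_ts[best] - t) <= max_skew:
            aligned.append((t, sensor_vals[best]))
        else:
            aligned.append((t, None))  # no sensor sample close enough
    return aligned

# Example: 20 Hz camera frames aligned with ~40 Hz IMU samples.
frames = [0.00, 0.05, 0.10]
imu_ts = [0.001, 0.026, 0.051, 0.076, 0.101]
imu_v = ["a", "b", "c", "d", "e"]
print(align_to_frames(frames, imu_ts, imu_v))
```

Nearest-neighbor matching is the simplest choice; smoothly varying quantities such as velocity could instead be interpolated between the two bracketing samples.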
This paper describes an experimental setup and software environment for recording such synchronized multi-sensor data streams and storing them in a new open-source format. The dataset consists of sequences recorded in various environments from a car equipped with an omnidirectional multi-camera, height sensors, an IMU, a velocity sensor, and a GPS. The software environment for reading these datasets will be provided to the public, together with a collection of long multi-sensor and multi-camera data streams stored in the developed format.
Philipp Koschorrek, Tommaso Piccini, Per Öberg, Michael Felsberg, Lars Nielsen and Rudolf Mester
2013