The present thesis addresses 3D range imaging in a twofold way: it proposes new methods both for range data simulation and for the accumulation of range images into a consistent data representation, namely 3D environment models, for high-quality 3D object reconstruction.
In recent years, inexpensive Time-of-Flight (ToF) range imaging devices have become an alternative to traditional depth measuring approaches. ToF cameras measure full-range distance information by actively illuminating a scene and measuring the time until the back-scattered light is detected. The final distance information is computed from multiple raw images. This thesis proposes a method for simulating the ToF principle in real time along with the major sensor characteristics. The approach is motivated by physically based illumination models and is applied to the simulation of Photonic Mixing Devices, a specific type of ToF sensor.
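The underlying measurement principle can be illustrated with a short sketch: a continuous-wave ToF sensor samples the correlation between the emitted and the received signal at several phase offsets (four in the common four-phase scheme) and recovers distance from the resulting phase shift. The function name, sampling convention, and modulation parameters below are illustrative assumptions for a minimal demonstration, not the simulator developed in the thesis.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_distance(a0, a1, a2, a3, f_mod):
    """Recover distance from four correlation samples taken at phase
    offsets of 0, 90, 180, and 270 degrees of the modulation signal
    (assumed sampling convention: a_k = cos(phi - k*pi/2))."""
    phi = math.atan2(a1 - a3, a0 - a2)  # phase shift of the returned light
    phi %= 2.0 * math.pi                # wrap into [0, 2*pi)
    return C * phi / (4.0 * math.pi * f_mod)

# Synthesize the four raw samples for a target at 2.5 m with 20 MHz
# modulation, then invert them back to a distance estimate.
f_mod = 20e6
d_true = 2.5
phi_true = 4.0 * math.pi * f_mod * d_true / C
samples = [math.cos(phi_true - k * math.pi / 2.0) for k in range(4)]
d_est = tof_distance(*samples, f_mod)
```

Note that the phase wraps every half modulation wavelength, so a 20 MHz signal yields an unambiguous range of about 7.5 m; real sensors also contend with noise, multi-path effects, and intensity-dependent errors, which is what makes faithful simulation non-trivial.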
Furthermore, this thesis presents new methods for accumulating range data in real time. While the hierarchical volumetric approach supports merging and subtraction of sub-volumes at arbitrary resolutions, the point-based fusion method overcomes the spatial limitations of previous approaches and targets high-quality 3D reconstruction at extended scales. Additionally, dynamically changing scenes are supported, which results in improved camera pose estimation as well as reduced drift errors. The algorithms are evaluated on simulated data as well as real camera data from structured light and ToF devices.
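At the heart of such accumulation schemes lies a per-element weighted running average: each stored surface sample carries a confidence weight, and new depth measurements are blended in proportionally. The following one-dimensional sketch shows only this basic update rule; the function name, per-measurement weight, and weight cap are illustrative assumptions rather than the thesis implementation.

```python
def fuse(point_depth, point_weight, measurement, max_weight=20.0):
    """Blend a new depth measurement into an existing surface sample.

    Returns the updated (depth, weight) pair. Capping the accumulated
    weight keeps the model responsive to change, which matters when
    the observed scene is dynamic.
    """
    w_new = 1.0  # per-measurement confidence (e.g. from a sensor noise model)
    depth = (point_weight * point_depth + w_new * measurement) / (point_weight + w_new)
    weight = min(point_weight + w_new, max_weight)
    return depth, weight

# Repeatedly fusing noisy observations of the same surface converges
# toward their mean while the confidence weight grows (up to the cap).
d, w = 1.0, 1.0
for m in (1.02, 0.98, 1.01, 0.99):
    d, w = fuse(d, w, m)
```

Because every stored sample is updated independently, this rule maps naturally onto a data-parallel GPU implementation, with one thread per surface sample.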
The algorithms presented in this thesis feature an extensive data-parallel implementation on current graphics processing units in order to ensure their online capability, without restricting the algorithms to hardware-specific features.