360º Color and Depth Map Panorama
Used a Microsoft Kinect 2.0 to capture images of an indoor room and stitch them together to form a 360º panoramic image.
In my senior thesis project at Stony Brook University, I composited color and depth images into a 360º panorama using data collected from a Microsoft Kinect 2.0. With the Kinect at the center of rotation, approximately the center of the room, sixteen images (eight color and eight depth) were captured, two at each 45º interval of rotation. Using MATLAB, the depth images were normalized and aligned to the coordinate system of the color camera. Features were then detected, extracted, and matched between adjacent color images. Using the matched features, the redundant overlap between adjacent photos was removed, stitching them together seamlessly. The same transformations were then applied to stitch the corresponding depth images. Repeating this computation for each captured view produced a 360º color and depth panorama of an indoor room.

The paper documenting this project was published through the Long Island Systems, Applications and Technology Conference (LISAT) in IEEE Xplore, receiving the Best Student Paper award in 2018.
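The depth-normalization step can be sketched as follows. This is a minimal illustration in pure Python (the project itself used MATLAB), and the function name and range constants are assumptions, not the project's actual code; it clamps raw Kinect depth readings to the sensor's reliable range and rescales them to 8-bit values suitable for compositing alongside color images.

```python
def normalize_depth(depth_mm, d_min=500, d_max=4500):
    """Hypothetical sketch: clamp Kinect 2.0 depth readings (in mm) to the
    sensor's nominal reliable range (~0.5 m to ~4.5 m) and rescale each
    value linearly to the 8-bit range [0, 255]."""
    out = []
    for d in depth_mm:
        # Clamp readings outside the usable range to its endpoints.
        d = min(max(d, d_min), d_max)
        # Linearly map [d_min, d_max] onto [0, 255].
        out.append(round((d - d_min) * 255 / (d_max - d_min)))
    return out

# Example: values below/above the range saturate at 0 and 255.
print(normalize_depth([400, 500, 2500, 4500, 5000]))
```

In the full pipeline, each normalized depth frame would then be registered to the color camera's coordinate system before feature matching and stitching.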
Read The Paper