Immersive Virtual Reality (VR) Visualization System for Use in Civil Engineering and Geomatics
Friday, 16 January 2015

The Civil and Construction Engineering Geomatics Research Lab at Oregon State University (OSU) has constructed an immersive Virtual Reality (VR) visualization system for viewing digital three-dimensional (3D) data, specifically lidar point clouds and digital elevation models (DEMs). The VR system was built on a hardware configuration developed by UC Davis research scientist Oliver Kreylos (http://idav.ucdavis.edu/~okreylos/ResDev/LowCostVR/). It consists of a 65-inch active 3D LED television coupled with an OptiTrack infrared tracking array and VR software (Vrui). The array of three OptiTrack cameras tracks custom-made tracking antlers mounted on the user's stereo imaging glasses and a Nintendo Wii remote used for data interaction. The Virtual Reality User Interface (Vrui) software, also created by Mr. Kreylos, is an open-source VR toolkit for developing VR applications on Unix/Linux-based machines.

The most common way for exploring a point cloud involves navigating with a mouse in a 3D scene visualized on a two-dimensional (2D) computer monitor. Tasks such as detailed editing, collection of measurements, and visual inspection of registration results can be troublesome when using this method of visualization. The presence of depth in a given point cloud can be quite apparent when actively navigating; however, as soon as movement ceases, the static image projected on the 2D monitor loses its sense of depth. This lack of depth combined with the limitations of a mouse interface device can make accurate point selection or a detailed measurement difficult to perform.

Use of a visualization system that supports stereo imaging solves part of this problem by creating a sense of depth even when the point cloud is not being visually manipulated. Add to these visualization capabilities the ability to track the user's point of view (POV) and the use of an interface device (Wii remote), and you have an immersive VR system. Position and orientation tracking of the user's head and interface device allows the VR system to adjust the displayed stereo 3D scene for the user's POV. Optimizing the visualization for the user's POV enables the re-creation of realistic geometry and perspective, and also makes it possible to place a virtual cursor floating off the end of the handheld interface device, which the user needs in order to locate themselves within the dataset.
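The floating-cursor idea can be sketched in a few lines: given the tracked position and orientation of the handheld device, project a point a fixed distance along the device's forward axis. This is an illustrative Python snippet, not Vrui code; the `cursor_position` helper, its (w, x, y, z) quaternion convention, and the 0.15 m offset are assumptions made for the example.

```python
import numpy as np

def cursor_position(device_pos, device_quat, offset=0.15):
    """Place a virtual cursor a fixed distance off the tracked device.

    device_pos: (3,) position of the tracked interface device.
    device_quat: (w, x, y, z) unit quaternion giving its orientation.
    offset: how far the cursor floats beyond the device tip (metres).
    """
    w, x, y, z = device_quat
    # Rotate the device's local forward axis (0, 0, -1) into world
    # coordinates using the quaternion's rotation matrix, third column.
    forward = np.array([
        -2.0 * (x * z + w * y),
        -2.0 * (y * z - w * x),
        -(1.0 - 2.0 * (x * x + y * y)),
    ])
    return np.asarray(device_pos, dtype=float) + offset * forward
```

With an identity orientation the cursor sits 0.15 m straight ahead of the device; as the tracking array reports new poses, recomputing this point each frame keeps the cursor glued to the end of the remote.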

Our research group's current work with the VR system includes quality control of registered point clouds composed of numerous laser scanner positions, review of seismically induced damage inflicted upon residential structures, and refinement of visualization performance for individual users of the system.

Identification of registration misalignment in dense terrestrial lidar point clouds can be challenging, especially in complex natural environments lacking planar manmade objects. Subtle offsets in registered point clouds that are difficult to detect using common visualization techniques are easily identified within the VR system. The realistic viewing experience delivered by an immersive VR system helps the user quickly spot unrealistic aspects of the scene, such as point cloud registration error.
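Such offsets can also be quantified numerically: for each point in one scan, find the distance to its nearest neighbour in the other registered scan, and inspect summary statistics. The sketch below is a brute-force illustration (fine for small demo clouds, not production-scale data); the `misalignment_stats` function and its percentile choices are assumptions for the example, not part of the OSU workflow described above.

```python
import numpy as np

def misalignment_stats(scan_a, scan_b):
    """Brute-force nearest-neighbour distances from scan_a to scan_b.

    Returns (median, 95th percentile) of the distances; values well
    above the scanner's noise level suggest a registration offset.
    """
    scan_a = np.asarray(scan_a, dtype=float)
    scan_b = np.asarray(scan_b, dtype=float)
    # (N, M) matrix of all pairwise distances between the two scans.
    d = np.linalg.norm(scan_a[:, None, :] - scan_b[None, :, :], axis=2)
    nn = d.min(axis=1)  # distance from each scan_a point to its nearest neighbour
    return float(np.median(nn)), float(np.percentile(nn, 95))
```

A 2 cm translation between two copies of the same cloud, for instance, shows up directly as a 2 cm median nearest-neighbour distance, even when the shift is hard to see on a flat monitor.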

Using the VR system, we are able to perform virtual fieldwork for inspection of residential structures damaged by seismic activity. Some observations and measurements made in the VR system would simply not be possible in the actual structure due to safety concerns. In addition, the ability to apply differing color schemes, real-time lighting, and plane-fitting to groups of points, all while immersed in a stereo VR environment, allows the user to identify details and observe relationships that would not be possible in the real world.
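Plane-fitting of the kind mentioned above is commonly done by least squares: centre the selected points and take the direction of least variance as the plane normal. A minimal sketch, assuming an SVD-based fit (the `fit_plane` name and interface are illustrative, not the system's actual routine):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a selected group of points
    (e.g. points picked off a damaged wall in the VR scene).

    Returns (centroid, unit normal); the normal is the right singular
    vector of the centred points with the smallest singular value.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # Rows of vt are ordered by decreasing singular value, so the last
    # row spans the direction of least variance: the plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]
```

The fitted plane then supports measurements such as out-of-plane deviation of individual points, which is one way to put numbers on wall deformation observed during the virtual inspection.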

We are fast approaching an exciting time for VR hardware, with systems like Oculus Rift (www.oculus.com), Open Source VR (www.razerzone.com/osvr), and zSpace (www.zspace.com) becoming commercially available. As expected, most VR software development will take place in the video game industry; however, we can expect elements of innovative VR software to trickle down to software packages specific to civil engineering and geomatics.

Future goals here at OSU include continued use of this visualization tool to enhance lidar point cloud analysis and processing as well as development of custom Vrui software for further applications. We hope to build upon Vrui’s open source platform, opening the door to performing tasks such as point cloud registration and 3D modeling in an immersive VR environment.


Matt O’Banion is a Ph.D. student at Oregon State University studying geomatics and computer science. Research interests include scientific visualization, virtual reality, and development of algorithms for processing and analyzing terrestrial lidar data.
