A TECHNOLOGY project in WA has demonstrated the ability to combine video, virtual reality and real-time data to deliver enhanced information for process control. This is the result of months of work done in collaboration between software company Sentient Computing and Torq Software employee Tyson Stolarski as part of his CEED (Co-operative Education for Enterprise Development) project.
Stolarski’s augmented reality concept treats video feed information in a similar manner to industrial SCADA type data — the video is just another source of information. To bring together such diverse data sources, Stolarski used MVX, a 3D visualisation/HMI package developed by Sentient Computing.
MVX allowed him to develop augmented reality scenes involving video, 3D models, real-time data and historical data. Using a 3D model of his office, he was able to synchronise historical real-time data with stored video files, switching seamlessly between viewing values in 3D and viewing the video feed with the augmented data overlaid.
“The most challenging aspect was to get the camera calibrated and merge the 3D virtual scene with the video data to get realistic rendering,” Dr Du Huynh told PACE. Dr Huynh is an Associate Professor in the School of Computer Science and Software Engineering at The University of Western Australia and guided Stolarski in this project.
In the final result, Stolarski’s video feed showed activities in the office being augmented with information and data overlaid on the positions of the devices. In the field, this technology is able to deliver enhanced information using the increasing number of cameras already monitoring industrial automation processes, such as ship loading.
The augmented reality views make use of the same server infrastructure and integrate into the same MVX visualisation engine, allowing seamless switching between ‘virtual’ and ‘augmented reality’ views. The MVX object server is the existing back-end system for MVX.
Traditionally, SCADA systems organise their data via a tag-based structure in which every discrete real-time value point is identified by a unique name. The MVX object server instead groups and organises all of its data and exposes it to connected clients as a tree of objects that can be explored via the relationships between the objects.
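The contrast between the two data models can be sketched as follows. The names and values here are illustrative, not actual MVX identifiers.

```python
# Flat tag-based SCADA model: every value point has one unique tag name.
tags = {
    "SHIPLOADER1.BOOM.ANGLE": 12.5,
    "SHIPLOADER1.CONVEYOR.RATE": 4200.0,
}

class Node:
    """A node in an object tree: a value plus relationships to child objects."""
    def __init__(self, name, value=None):
        self.name = name
        self.value = value
        self.children = {}

    def add(self, child):
        self.children[child.name] = child
        return child

    def find(self, path):
        """Resolve a path like 'boom/angle' by walking the relationships."""
        node = self
        for part in path.split("/"):
            node = node.children[part]
        return node

# Object-tree model: clients explore relationships between objects
# rather than needing to know a flat tag name in advance.
plant = Node("plant")
loader = plant.add(Node("shiploader1"))
boom = loader.add(Node("boom"))
boom.add(Node("angle", 12.5))

print(plant.find("shiploader1/boom/angle").value)  # 12.5
```

A client connected to such a server can discover what data exists by browsing the tree, whereas a tag-based client must be configured with each tag name up front.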
The data requirements for an augmented reality server consist of the following: video feeds, the location and field of view of each camera, the relevant data and the location of the data points. The MVX augmented reality client handles this information and calculates the coordinate transformation from server-defined coordinates to 3D scene coordinates.
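The actual MVX transformation is not described in detail, but a change of coordinates of this kind is typically a rigid-body transform. A minimal sketch, assuming a rotation about the vertical axis plus a translation and purely hypothetical frame parameters:

```python
import math

def server_to_scene(point, yaw_deg, offset):
    """Map a server-defined (x, y, z) point into 3D scene coordinates
    by rotating about the vertical (z) axis and then translating."""
    x, y, z = point
    ox, oy, oz = offset
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    return (c * x - s * y + ox, s * x + c * y + oy, z + oz)

# A data point 10 m along the server's x axis, with the scene frame
# rotated 90 degrees relative to the server frame.
print(server_to_scene((10.0, 0.0, 0.0), 90.0, (0.0, 0.0, 0.0)))
```

In practice such transforms are composed per camera and per scene, so that every data point lands at the correct position regardless of which view the client is rendering.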
The client also synchronises the 3D engine’s camera to the same characteristics as the real-world camera used for the video feed and places the computer-generated imagery correctly within the 3D scene.
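This is where the camera calibration Dr Huynh mentions comes in: overlays only line up with the video if the virtual camera shares the real camera’s parameters. A simplified sketch using the standard pinhole projection model, with an assumed focal length and image size rather than real MVX values:

```python
def project(point, focal_px, cx, cy):
    """Project a camera-frame (x, y, z) point, in metres, to pixel
    coordinates using a pinhole model with focal length in pixels
    and principal point (cx, cy)."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point is behind the camera")
    u = focal_px * x / z + cx  # horizontal pixel position
    v = focal_px * y / z + cy  # vertical pixel position
    return (u, v)

# A device 4 m in front of the camera and 1 m to the right,
# rendered into an assumed 640x480 image.
print(project((1.0, 0.0, 4.0), 800.0, 320.0, 240.0))  # (520.0, 240.0)
```

If the virtual camera used a different focal length or principal point from the real one, the overlaid data would drift away from the devices it annotates, which is why calibration was the most challenging step.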
The team is currently developing support for smartphones and other handheld devices with camera, GPS and compass for mobile augmented reality around the plant.
“Commercial applications for this project could be control engineering, augmented reality visualisation and processes that one would want to visualise,” says Dr Huynh.