
NEWS & INSIGHTS | Opinion

Your next maintenance mission is ready to play

17 February 2020

Agent A1-13, your mission is to repair a faulty valve at Area 51. Along the way you will encounter many other items of equipment that may or may not need your attention. Once you have successfully completed your task, log your report and make your way back to base.

Now that’s probably not going to unseat Fortnite as the teenagers’ favourite, or have hardcore gamers dropping Call of Duty to play late into the night, but it could easily be the plotline of the next generation of digitally-enabled workers’ daily routine.

We were lucky to have Louis Deane, CEO of mixed reality pioneer VISR, at the Oil & Gas Technology Centre for his Tech20 talk “Gaming the Systems”, where he related how his company’s computer games development heritage has set it up perfectly to tackle some of the challenges of digital working.

Those of us with any experience of computer gaming will understand the concept of setting players a mission and quickly grasp the analogy in an industrial environment.

Louis explained the challenges of creating fully integrated, digitally enhanced working environments, and how his company’s experience of developing computer games is proving to be the perfect platform for developing and understanding them.

Equipping and immersing a human in a digitally-enhanced experience where they can carry out meaningful tasks safely and efficiently requires some hefty computing power, which is only now becoming available to the masses.

Three key technologies are required. A digital worker of the future, carrying out a guided maintenance task for example, will be equipped with a headset that understands where its user is and what they are looking at, and that can communicate with other workers and a central command centre.
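
As a sketch of what that might look like in data terms, here is a minimal, hypothetical example of the kind of state a headset could report back to a command centre. The field names and values are illustrative assumptions on our part, not drawn from any real VISR or HoloLens interface.

```python
# Hypothetical sketch of the state a digital worker's headset might report
# to a command centre: position, gaze and task status. All field names are
# illustrative assumptions, not taken from any real VISR or HoloLens API.
import json
from dataclasses import dataclass, asdict

@dataclass
class WorkerState:
    worker_id: str
    position: tuple          # (x, y, z) in site coordinates, metres
    gaze_direction: tuple    # unit vector of where the user is looking
    current_task: str
    task_complete: bool

state = WorkerState(
    worker_id="A1-13",
    position=(12.4, 3.1, 0.0),
    gaze_direction=(0.0, 1.0, 0.0),
    current_task="repair faulty valve",
    task_complete=False,
)

# Serialise for transmission to other workers or the central command centre.
print(json.dumps(asdict(state), indent=2))
```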

Localisation and mapping of the environment in which an operator is working requires complex calculations on continuous streams of 2D and 3D data to build a three-dimensional map of the world the computer finds itself in. Laser scanners provide surface detail and depth cues, while cameras can collect additional visual attributes such as colour. Robots are the current experts at utilising this technology.
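
To make the mapping idea concrete, here is a toy sketch, under heavily simplified assumptions, of how repeated range scans might be accumulated into an occupancy map. It illustrates the principle only; production localisation-and-mapping systems are far more sophisticated.

```python
# Toy illustration (not any production SLAM system) of accumulating laser
# returns into a 2D occupancy grid: each hit marks a cell as containing a
# surface, and repeat observations raise confidence in that cell.
import numpy as np

GRID = 20                            # 20 x 20 cell map, 1 m per cell
occupancy = np.zeros((GRID, GRID), dtype=int)

def integrate_scan(sensor_xy, hits):
    """Fold one scan's laser returns (world coordinates) into the grid.

    A real system would also ray-trace the free space between sensor_xy
    and each hit; that step is omitted here for brevity.
    """
    for (x, y) in hits:
        occupancy[int(y), int(x)] += 1   # more hits = more confidence

# Two scans of the same wall from slightly different operator positions.
integrate_scan((0, 0), [(5, 3), (5, 4), (5, 5)])
integrate_scan((1, 0), [(5, 4), (5, 5), (5, 6)])

# Cells seen in multiple scans are the most confidently mapped surfaces.
print(np.argwhere(occupancy >= 2))       # -> [[4 5] [5 5]]
```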

Spatial perception, or understanding and identifying what the computer is looking at, is the next step. This is the sort of activity that self-driving cars need to do very rapidly. Artificial intelligence endows the computer with a toolset that learns and refines as it goes. Just like a human, it starts with a gut feeling, later backed up by a higher probability of identification: that is how a car spots what may be a traffic cone in the distance, but then confirms it is a pedestrian as it gets closer, when it really matters.
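
That “gut feeling first, confirm later” behaviour can be sketched as a simple Bayesian update: each new frame of evidence nudges the belief, and the system only commits to a label once confidence passes a threshold. The probabilities below are invented purely for illustration.

```python
# Toy sketch of refining an identification over successive frames.
# Evidence values are invented; real perception stacks are far richer.

def update(prior, likelihood_pedestrian, likelihood_cone):
    """One Bayesian update of P(pedestrian) given a new frame's evidence."""
    p = prior * likelihood_pedestrian
    q = (1 - prior) * likelihood_cone
    return p / (p + q)

belief = 0.5                         # at distance: could be cone or pedestrian
# As the object gets closer, each frame's evidence favours "pedestrian" more.
for frame_evidence in [(0.55, 0.45), (0.7, 0.3), (0.9, 0.1)]:
    belief = update(belief, *frame_evidence)
    label = "pedestrian" if belief > 0.95 else "undecided"
    print(f"P(pedestrian) = {belief:.3f} -> {label}")
# Final frame: P(pedestrian) ~ 0.963 -> pedestrian, confirmed close-in,
# when it really matters.
```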

These two intelligences can subsequently be combined, either to provide confirmation against previously confirmed visual reference points or 3D models, or to act as offset reference points in new, unfamiliar environments that can be assimilated into the system’s developing understanding of the whole space.
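
As an illustration of that second case, and assuming for simplicity a translation-only offset, recognising a few features that also exist in a prior 3D model is enough to anchor the live map to that model and place newly observed features within it. The feature names and coordinates here are hypothetical.

```python
# Illustrative sketch (assumed names and numbers, not a shipped algorithm):
# the average offset between observed features and their known positions in
# a reference 3D model anchors the headset's local map to that model.
import numpy as np

# Known reference points from a previously confirmed 3D model (metres).
reference = {"valve_A": np.array([10.0, 2.0, 1.5]),
             "pump_B":  np.array([14.0, 2.0, 0.8])}

# The same features as observed in the headset's own local coordinates.
observed  = {"valve_A": np.array([ 1.2, 0.5, 1.5]),
             "pump_B":  np.array([ 5.2, 0.5, 0.8])}

# Estimate the translation that aligns local coordinates to the model.
offsets = [reference[k] - observed[k] for k in reference]
translation = np.mean(offsets, axis=0)
print(translation)                     # -> [8.8 1.5 0. ]

# A new, previously unseen feature can now be assimilated into the model.
new_local = np.array([3.0, 1.0, 2.0])
print(new_local + translation)         # -> [11.8  2.5  2. ]
```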

All the actors within a game, whether human or computer-controlled, interact with the environment, with each other and with other connected objects in many varied and complex ways. The game engine co-ordinates the effects of these interactions and updates the overall system to reflect the changes that have been made and any subsequent, linked cause-and-effect relationships. Then, communicating all of this back to the user quickly and intuitively is vital to close the loop on the whole system.
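
A stripped-down sketch of that engine pattern, with invented actor and event names, might look like the following: each tick applies incoming events to every actor and returns the resulting changes so they can be communicated back to all users.

```python
# Minimal sketch of a game-engine update loop (illustrative names only):
# actors react to events, and resulting changes are broadcast back so every
# participant sees a consistent world.

class Valve:
    def __init__(self):
        self.state = "faulty"

    def handle(self, event, log):
        if event == "repair":
            self.state = "repaired"
            # A linked cause-and-effect relationship, updated by the engine.
            log.append("valve repaired; pressure alarm cleared")

def tick(actors, events):
    """One update cycle: apply events, collect the knock-on effects."""
    log = []
    for event in events:
        for actor in actors:
            actor.handle(event, log)
    return log          # communicated back to every user to close the loop

valve = Valve()
for message in tick([valve], ["repair"]):
    print(message)      # -> valve repaired; pressure alarm cleared
```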

Whilst much of this has been baked into multi-player co-operative computer gaming for years, the technology to accomplish this kind of environmental awareness, and this interplay of physical and digital infrastructure, in a form that transfers usefully to an industrial setting has been longer coming.

Microsoft’s HoloLens 2 has built on the promise of the first-generation device to deliver industry an affordable and capable platform on which to build these types of digital worker solutions.

On wearing the redesigned headset, its most striking feature is a vivid display of 3D digital assets and information right there in the field of view, which seems the most natural way of communicating with the user: don’t tell, show! But there is more to this device. Multiple cameras and scanners collect data from the wearer’s environment and track their physical movements, with hand, finger and even eye tracking. A powerful onboard Windows computing platform is contained within the headset, providing all the processing power without being tethered by wires.

Louis’s company, VISR, has built a suite of cloud-based platforms which use the HoloLens as their core communication component to achieve their vision of the digitally-enabled worker using mixed reality.

VISR will be joining forces with Centrica, the Oil & Gas Technology Centre and other major technology companies to develop and trial their technologies at Centrica’s Easington Gas plant on Humberside in the largest mixed reality field trial project of its type in Europe. Following this, and with more industry support, it is hoped that trials can move offshore.

Whilst there are still many technical hurdles to overcome, and indeed many other business considerations, mixed reality could be the next step-change we need to make the digitally-enabled worker a reality in our industry.

At the Oil & Gas Technology Centre, as part of this project and other similar advances coming through our technology developer network, we are establishing a mixed reality industry working group, set to launch in March. Watch out for Jeff Hailey’s blog next week, where he discusses the background of this group and what’s planned for its launch. Or, if you would like to learn more now and possibly be a part of the group, send your details to [email protected].