Being there: VR and 3D open up new worlds in terminal safety design
Digital environments such as virtual reality (VR) and augmented reality (AR) enable immersive, realistic experiences of safety solutions as early as the design stage.
At Kalmar, 'safety first' is a watchword we take very seriously. Over the last few years, we have been working hard to augment the traditional safety design process with new digital environments. By using VR and 3D visualisation, we can proactively design, test and evaluate safety solutions more accurately, and cover a dramatically increased number of scenarios in a given time.
With these virtual experiences, it is possible to model and visualise safety-related factors both in the processes of the entire terminal and in human-machine interaction. This enables safety risks to be identified and addressed long before a single physical machine is built and delivered to the site.
Traditionally, safety design and verification have progressed in a two-step pattern: first you plan something on paper and set down your conclusions in a written report; then you verify those conclusions on the real machine, either at a test facility or at the actual production site. The problem is that this can be an extremely slow and labour-intensive process, with no guarantee that the end result is even the optimum solution. New simulation and visualisation tools can profoundly change this situation.
For example, in a recent development project for a terminal customer, we used the virtual environments lab at our Technology and Competence Centre in Tampere, Finland, to tweak the camera positions – and thus improve operator visibility – on a remote-controlled RTG installation. Over the course of two days, we tested over 120 different camera placements before settling on the perfect configuration for the client's needs. Yes, we have the luxury of a fully equipped RTG test site in our back yard in Tampere, but how long would it take to test 120 camera positions on a real crane? Maybe two months, with a crew of two technicians working full time, at height on a 20-metre lift, in short daylight hours in the freezing winter weather?
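To illustrate the kind of sweep a virtual environment makes cheap, here is a minimal sketch that scores candidate camera placements by how many target points fall within the camera's horizontal field of view. Every position, angle and name here is hypothetical, and a real evaluation would also model occlusion, lens characteristics and the vertical field of view; this is only a sketch of the idea.

```python
import math
from dataclasses import dataclass

@dataclass
class Camera:
    x: float            # mounting position along the crane beam (m)
    y: float
    z: float            # mounting height (m); unused in this horizontal-only sketch
    yaw: float          # viewing direction (deg), 0 = +x axis
    fov: float = 100.0  # horizontal field of view (deg)

def visibility_score(cam, targets):
    """Fraction of target points inside the camera's horizontal field of view."""
    seen = 0
    for tx, ty, _tz in targets:
        angle = math.degrees(math.atan2(ty - cam.y, tx - cam.x))
        # smallest signed angular difference to the viewing direction
        diff = (angle - cam.yaw + 180.0) % 360.0 - 180.0
        if abs(diff) <= cam.fov / 2:
            seen += 1
    return seen / len(targets)

# Hypothetical targets: points along a container row the operator must see.
targets = [(x, 2.0, 0.0) for x in range(0, 12, 2)]

# Sweep candidate mounting positions along the beam and keep the best scorer.
candidates = [Camera(x=cx, y=8.0, z=15.0, yaw=-90.0) for cx in range(0, 12, 2)]
best = max(candidates, key=lambda c: visibility_score(c, targets))
```

In the virtual lab, the same loop in effect becomes "move the virtual camera, render the scene, let the operator judge" -- but an automated pre-screen along these lines can narrow 120 candidates down to a handful worth experiencing in VR.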
So, even with this simple visualisation application, we can accomplish something like a twentyfold improvement in safety testing productivity, while simultaneously improving the occupational safety of our own staff and contractors – and all this before the new cameras are even installed on the client's RTG, where the real benefits are actually realised in the day-to-day work of the terminal.
In the middle of it all
Perhaps the most revolutionary aspect of these digital environments is that they enable a proactive mode of safety design in which we no longer need to draw conclusions only on the basis of estimates and statistical data. Instead, we can virtually put ourselves in the midst of the situation and experience it ourselves. And it really does feel like being there.
Once you've used the VR truck driving simulator at our Tampere lab to park your 18-wheeler semi-trailer in the correct position in the RTG truck lane, there really is no debate about whether the positioning indicators on the crane have been mounted in the right place or not. You either see them clearly from where you are sitting in the truck cabin, or you don't.
The great thing is that this kind of modelling is not science fiction. Thanks to the rapid development of both 360° VR display hardware and widely available 3D game engines, the technology is not only accessible but also developing at an impressive pace, both in graphical quality and in the cost/performance equation.
Automation inherently makes container terminals safer places to work. But safety has to be designed in from the initial specifications; it is not something that can simply be added onto a product or system later.
In our ongoing development project for AutoRTG cranes, we have used 3D visualisation to model the entire container terminal. This enables not only simulation of dramatic incident scenarios that would be impossible to test on real equipment, but also more grassroots-level safety design. Is the terminal safe from the perspective of a truck driver driving through it? If someone needs to access a stack of reefer containers in the middle of an automated area, what does this mean from the point of view of safety? For the first time, we can actually test all of this with enough realism to draw meaningful conclusions from it.
One of the most exciting and profound things about VR and 3D visualisation technologies is that they provide an immediate, immersive experience of the situation being modelled. This allows direct access to the tacit experiential knowledge of people who are familiar with the operating environment. When you use VR to put yourself in that RTG truck lane, you feel the dynamics of the developing situation based on your own actions in it. You experience it.
For the experts among us, this allows us to identify and minimise the risks inherent in the system much earlier in the design process. And for the non-expert decision maker, VR/AR and 3D enable an immediate appreciation of where the risks are, and what steps need to be taken to address them.
Visualising the future
The potential benefits of 3D/VR/AR are already huge today, but the story doesn't end there. For example, once we have accurate 3D models of the entire terminal and real-time position and usage data from the container handling equipment, why not use the same technology to analyse and visualise any safety incidents or near misses that have occurred at the terminal? By viewing the incident as a highly accurate 3D playback visualisation, we can gain a new perspective on exactly what happened, why, and – most importantly – what steps need to be taken to prevent it from happening again.
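The core of such a playback is unglamorous: reconstructing a machine's position at any moment from timestamped log samples. As a minimal sketch, with an entirely hypothetical log format, the interpolation step might look like this:

```python
from bisect import bisect_right

def position_at(log, t):
    """Linearly interpolate an (x, y) position from a timestamped log.

    log: list of (timestamp, x, y) tuples, sorted by timestamp.
    Before the first or after the last sample, the nearest sample is held.
    """
    times = [entry[0] for entry in log]
    if t <= times[0]:
        return log[0][1:]
    if t >= times[-1]:
        return log[-1][1:]
    i = bisect_right(times, t)
    t0, x0, y0 = log[i - 1]
    t1, x1, y1 = log[i]
    a = (t - t0) / (t1 - t0)
    return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))

# Hypothetical trace of an RTG trolley around a near miss, sampled at 1 Hz.
log = [(0.0, 0.0, 0.0), (1.0, 4.0, 0.0), (2.0, 4.0, 3.0)]
position_at(log, 1.5)  # -> (4.0, 1.5)
```

Feed the interpolated positions of every machine and person into the terminal's 3D model frame by frame, and the incident can be replayed, paused and viewed from any angle.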
Likewise, from current VR and AR applications, it's only a small step to see-through technology and sensor fusion solutions that "dissolve" the entire cockpit around the driver/operator, enabling an unobstructed view of the crane's surroundings. This can be accomplished either with VR/AR headsets or pillar-mounted displays in the physical cabin, and the technology is already available.
In addition, once we know the sensor concept in use on a given type of container handling equipment, we can use the same ideas to visualise how the various sensors on the machine "perceive" the working environment with its various obstacles. This enables us to optimise the entire system to a remarkable degree very early in the design process.
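The simplest form of such a "machine's-eye view" is a coverage map: which parts of the work area does at least one sensor reach? The sketch below uses hypothetical sensor positions and ranges on a flat grid; a real model would add occlusion and sensor-specific fields of view.

```python
import math

def coverage_map(sensors, width, height, cell=1.0):
    """Mark each grid cell covered if at least one sensor reaches its centre.

    sensors: list of (x, y, range_m) tuples -- all values hypothetical here.
    Returns a row-major grid of booleans over a width x height area (metres).
    """
    cols = int(width / cell)
    rows = int(height / cell)
    grid = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            cx = (c + 0.5) * cell
            cy = (r + 0.5) * cell
            grid[r][c] = any(
                math.hypot(cx - sx, cy - sy) <= rng for sx, sy, rng in sensors
            )
    return grid

# Two hypothetical proximity sensors on a 10 m x 10 m work area.
sensors = [(2.0, 2.0, 4.0), (8.0, 8.0, 4.0)]
grid = coverage_map(sensors, 10.0, 10.0)
covered = sum(sum(row) for row in grid)  # covered cells out of 100
```

Rendered over the terminal's 3D model, the uncovered cells are exactly the blind spots a designer needs to see, long before any hardware is mounted.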
Collaborative robotics, in which people and industrial robots safely work together on shared tasks, is already a reality. The current paradigm for terminal automation almost always involves a strict segregation of people from automated equipment, and this division is likely to hold for the foreseeable future. However, as sensing and automation technologies develop, it is only a matter of time before this line begins to blur and collaborative human-machine interaction makes its way into the world of container handling. Virtual design tools will make it much easier to prepare for and adapt to this future, which really is just around the corner.
If you can imagine it, VR and 3D enable others to share the experience with you. So, what's your vision? What does the future look like at your terminal?
Jari Hämäläinen is Director, Terminal Automation, and Hannu Santahuhta is Senior Manager, Simulation & Virtual Environments, Terminal Automation at Kalmar.