Digital twins for the maritime sector

- Story by: Mikael Lind, Hanane Becha, Richard T. Watson, Norbert Kouwenhoven, Phanthian Zuesongdham and Ulrich Baldauf
- July 15, 2020
- Features
Decision making is the central activity of all organisations, and decision makers use explanatory or causal models either implicitly or explicitly. They decide based on the anticipated effects of their intervention, write Mikael Lind, Hanane Becha, Richard T. Watson, Norbert Kouwenhoven, Phanthian Zuesongdham and Ulrich Baldauf.
Decision making is typically improved by open sharing of decision models with colleagues and calibrating them with data. The value of a decision model is often determined by the quality and breadth of data used for creation and calibration. The vast and growing Internet of Things (IoT) will be a key source of real-time data for model building and reality assessment.
The simplest and most common decision model is based on a measured association among variables in a data set. Methods such as regression and machine learning fit this mould. A more advanced approach is to test interventions prior to implementation, such as with pilot studies and experiments, aimed at validating a causal model before scaling to a larger population.
The problem with interventions is that some don’t work and might harm the subjects, such as when testing new drugs, or jeopardise financial sustainability, such as infrastructure investments in a port that fall short of the intended return.
The most rigorous approach to decision making is to build a high-fidelity mathematical or biochemical model, or digital twin, of the environment of concern and simulate a possible range of interventions. This enables the exploration of counterfactuals, such as ‘what if we did x instead of y’.
Such models do not physically harm humans or nature and provide a conceptual foundation for decision-making for future sustainable business operations. You can distinguish these three approaches, respectively, as building a theory from data, testing a theory in one or more real settings through interventions, or testing a theory many times using a digital twin to simulate many possible settings. The last is the least risky and likely to be the most successful.
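As a simple illustration of such counterfactual exploration, the sketch below simulates two hypothetical arrival policies against the same toy model of a small set of berths and compares the resulting waiting times. It is a minimal sketch: the number of berths, service times and arrival windows are assumptions chosen for illustration, not figures from any real port or from the authors’ longer paper.

```python
import random

def simulate_port_calls(arrival_jitter_hours, n_ships=1000, service_hours=20,
                        n_berths=3, headway_hours=8, seed=42):
    """Toy simulation of ship waiting time at a set of berths.

    All parameters are illustrative assumptions, not calibrated values."""
    rng = random.Random(seed)
    berth_free_at = [0.0] * n_berths        # time at which each berth next becomes free
    total_wait = 0.0
    for i in range(n_ships):
        scheduled = i * headway_hours
        actual = scheduled + rng.uniform(-arrival_jitter_hours, arrival_jitter_hours)
        b = min(range(n_berths), key=lambda j: berth_free_at[j])  # first berth to free up
        start = max(actual, berth_free_at[b])
        total_wait += start - actual
        berth_free_at[b] = start + service_hours
    return total_wait / n_ships

# Counterfactual: 'what if ships arrived just in time (x) instead of loosely scheduled (y)?'
baseline = simulate_port_calls(arrival_jitter_hours=12)
just_in_time = simulate_port_calls(arrival_jitter_hours=2)
print(f"Average wait: {baseline:.1f} h (baseline) vs {just_in_time:.1f} h (just-in-time)")
```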
Digital twins require the construction of a precise set of equations for each component in the model and the interaction among the components. They also need data for their calibration and operation.
As the digital transformation of the maritime sector proceeds, it can also create the data required to calibrate digital twins of the various components of a ship, a port, other elements of the transport infrastructure, and the goods being transported (e.g. in dry and reefer containers).
As more devices become connected, such as smart containers generating data for diverse use cases, digital data streams built upon standardised data sharing provide opportunities for real-time representation and simulation of authentic situations. Digital twins will displace conventional simulation models because of an order-of-magnitude increase in the fidelity with which they represent the physical world and their continual recalibration via digital data streams to local conditions and changed circumstances.
In this article, we elaborate on the key fundamentals of digital twinning and then on how it may improve the decision-making of shipping companies, port operators and others in the transport and shipping ecosystem.
Digital twins
A digital twin is a dynamic digital representation of an object or a system describing its characteristics and properties as a set of equations. Complex processes involving a multitude of actors are often difficult decision-making environments that are best modelled digitally prior to action.
A digital twin includes both the hardware to acquire and process data and the software to represent and manipulate these data, making them more powerful than conventional models and simulations because they leverage digital data streams to bridge the gap between the physical entity and its representation.
This means that digital twin analytics relies on both historical data (e.g. a data lake) and real-time digital data streams (e.g. IoT-generated data) to analyse possible outcomes (Figure 1).
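To make this combination concrete, the sketch below shows a toy twin of a single voyage that blends historical leg times (as might sit in a data lake) with a live stream of speed reports (as might arrive from AIS or onboard IoT devices) to estimate time to port. The class, field names and figures are illustrative assumptions rather than a reference implementation.

```python
from statistics import mean

class VoyageTwin:
    """Toy digital twin of one voyage: historical data plus a live data stream."""

    def __init__(self, historical_leg_hours, distance_nm):
        self.historical_leg_hours = historical_leg_hours  # e.g. drawn from a data lake
        self.distance_nm = distance_nm
        self.speed_reports = []                           # real-time stream, e.g. AIS/IoT

    def ingest(self, speed_knots):
        """Consume one reading from the real-time digital data stream."""
        self.speed_reports.append(speed_knots)

    def estimated_hours_to_port(self, nm_remaining):
        """Blend historical experience with current observations to analyse outcomes."""
        historical_speed = self.distance_nm / mean(self.historical_leg_hours)
        live_speed = mean(self.speed_reports[-10:]) if self.speed_reports else historical_speed
        blended = 0.3 * historical_speed + 0.7 * live_speed   # illustrative weighting
        return nm_remaining / blended

twin = VoyageTwin(historical_leg_hours=[70, 74, 68], distance_nm=1000)
for speed in [13.5, 14.2, 13.8]:
    twin.ingest(speed)
print(f"Estimated time to port: about {twin.estimated_hours_to_port(400):.0f} hours")
```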

Traditionally, we have used data modelling to surface the core components within a standard and to ensure compatibility across standards. This has been followed by efforts to define standardised interfaces for communication, so-called APIs (Application Programming Interfaces).
However, data have a dual role: transaction processing and data analytics, such as that facilitated by a digital twin. Thus, a digital twin is another use case that needs to be supported by standardised digital data streams using standardised APIs.
We need to redesign business processes to support the generation of the IoT-derived data necessary for digital twin creation and operation. To prepare for the era of digital twins, standardisation bodies such as UN/CEFACT, GS1, WCO, and DCSA have developed various building blocks in support of the digital twin concept, namely the UN/CEFACT Smart Container data model and the DCSA IoT connectivity infrastructure. But further standards are still needed to fully build and deploy digital twins.
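As a hypothetical illustration of what consuming such a standardised data stream could look like, the fragment below parses one smart-container event and derives values for both roles of data. The field names are invented for this example and do not reproduce the UN/CEFACT Smart Container data model or any DCSA specification.

```python
import json

# Hypothetical smart-container event; the field names are illustrative only and
# do not reproduce the UN/CEFACT Smart Container data model or any DCSA schema.
event_json = """
{
  "containerId": "MSKU1234567",
  "eventTime": "2020-07-15T08:30:00Z",
  "position": {"lat": 53.54, "lon": 9.98},
  "reefer": {"setPointC": -18.0, "measuredC": -17.6},
  "doorOpen": false
}
"""

def consume_event(raw: str) -> dict:
    """Parse one event and derive values for both roles of data:
    transaction processing (where is the box?) and analytics (how is it doing?)."""
    event = json.loads(raw)
    return {
        "containerId": event["containerId"],
        "position": (event["position"]["lat"], event["position"]["lon"]),
        "temperatureDeviationC": round(
            event["reefer"]["measuredC"] - event["reefer"]["setPointC"], 2),
        "doorOpen": event["doorOpen"],
    }

print(consume_event(event_json))
```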
Maritime use cases
Three areas of the maritime sector that will likely benefit from digital twins are fleet optimisation, port and terminal optimisation, and increasing key stakeholders’ situational awareness across the end-to-end supply chain.
Fleet optimisation – Typically, a shipping company serves multiple clients at the same time, and clients may use different shipping companies simultaneously. Thus, a shipping company needs to maintain and gain competitiveness by optimising its fleet in terms of ships and their cargo-carrying capacity.
This need for sensitivity analysis could be served by a digital twin based on historical, ongoing, and predicted future business transactions. This digital twin could form the basis of strategic decision-making by testing a variety of scenarios for trade patterns and shipping fleets.
Furthermore, a digital twin for fleet optimisation could also enhance operational decision-making under diverse contextual factors, such as weather conditions that create atypical situations in which various options need to be rapidly reviewed.
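A strategic version of such scenario testing can be as simple as scoring candidate fleet configurations against a demand forecast, as in the sketch below. The trade lanes, capacities and demand figures are invented for illustration; a real twin would draw them from historical and predicted transactions.

```python
# Minimal fleet-scenario comparison (illustrative numbers, not real trade data).
weekly_demand_teu = {"Asia-Europe": 30_000, "Transatlantic": 12_000}

scenarios = {
    "current fleet":  {"Asia-Europe": 2 * 15_000, "Transatlantic": 1 * 10_000},
    "add one vessel": {"Asia-Europe": 2 * 15_000, "Transatlantic": 2 * 10_000},
}

def evaluate(capacity_teu):
    """Return unmet weekly demand and average utilisation across trade lanes."""
    unmet = sum(max(0, weekly_demand_teu[lane] - capacity_teu[lane])
                for lane in weekly_demand_teu)
    utilisation = sum(min(1.0, weekly_demand_teu[lane] / capacity_teu[lane])
                      for lane in weekly_demand_teu) / len(weekly_demand_teu)
    return unmet, utilisation

for name, capacity in scenarios.items():
    unmet, utilisation = evaluate(capacity)
    print(f"{name}: unmet demand {unmet:,} TEU/week, average utilisation {utilisation:.0%}")
```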
Port and terminal optimisation – Port efficiency relies on balancing demand and supply in a flexible way, integrating the entire transport system. A port is dependent on a continuous inbound and outbound flow of cargo and passengers arriving at and departing from the port by different means of transport. For strategic planning, a port and its partners need to capture historical, ongoing, and predicted future trade in a digital twin.
Such a model should incorporate the different parameters and relationships that port decision-makers should include in their strategic decisions, such as investment in infrastructure, port design, and terminal capacity. Typical questions that such a model should address include how many berths the port needs to meet its punctuality goals, or how much yard space is needed to allow different customers to store their cargo as it moves between transport services, whether shipping or other modes.
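To illustrate the kind of question such a model answers, the back-of-the-envelope sketch below estimates yard space from throughput and dwell time using Little’s law. The throughput, dwell-time and stacking-height figures are assumptions for illustration; a full digital twin would replace them with calibrated, data-stream-fed parameters.

```python
# Back-of-the-envelope yard-space check using Little's law (illustrative figures only).
weekly_throughput_teu = 40_000          # containers moving through the terminal per week
average_dwell_days = 5                  # time a container sits in the yard
teu_per_ground_slot = 4                 # assumed average stacking height

# Little's law: average inventory = arrival rate x time in system
teu_in_yard = weekly_throughput_teu / 7 * average_dwell_days
ground_slots_needed = teu_in_yard / teu_per_ground_slot

print(f"~{teu_in_yard:,.0f} TEU in the yard on average, "
      f"~{ground_slots_needed:,.0f} ground slots at the assumed stacking height")
```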
A digital twin, fed by multiple data streams of real-time data and historical databases, is also an operational planning tool for the coordination and synchronisation of port operations. It could be an essential foundation for virtual arrival processes and green steaming and for the hinterland window to support efficient use of trucks, trains, and infrastructure for diverse needs.
Situational awareness – Cargo owners, transport buyers, and end-customers seek enhanced visibility and predictability regarding the state of the transport of their goods. To enhance situational awareness for these groups, relevant digital twins can be linked in parallel so that the repercussions of a delay at one stage can be thoroughly analysed, adjustments made, and situational awareness updated.
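The sketch below illustrates the idea in miniature: a delay is handed down a chain of linked stage twins, with each stage’s slack absorbing part of it before the rest is passed on. The stage names, delay and buffer values are illustrative assumptions.

```python
def propagate(initial_delay_hours, downstream_stages, buffer_hours=4.0):
    """Hand a delay down a chain of linked stage twins; each stage's slack
    absorbs part of it before the remainder is passed on (illustrative numbers)."""
    remaining = initial_delay_hours
    offsets = {}
    for stage in downstream_stages:
        offsets[stage] = remaining                      # delay seen when this stage starts
        remaining = max(0.0, remaining - buffer_hours)  # slack absorbs some of the delay
    return offsets

print(propagate(10.0, ["port call", "rail to hinterland", "final delivery"]))
# {'port call': 10.0, 'rail to hinterland': 6.0, 'final delivery': 2.0}
```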
In addition, connected digital twins are a tool for investigating the coordinated development of infrastructure investments across a web of ports that frequently interact, so that key stakeholders also gain long-term situational awareness. This allows them to collaboratively make decisions that serve the common goals of the ecosystem, such as minimising emissions in ports.
Conclusions
A digital twin is constructed from generic mathematical representations of many components and their relationships with other components. These generic representations are parameterised so they can be tailored to specific circumstances, such as the unloading speed of a crane given its position, the position of a container on a ship, and the prediction of berth slots occupied by visiting ships.
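As a minimal example of such a parameterised component, the sketch below represents a quay crane’s productivity as a generic function of its position and then tailors it to two specific cranes. The formula, penalty term and numbers are illustrative assumptions rather than a manufacturer’s model.

```python
from dataclasses import dataclass

@dataclass
class CraneModel:
    """Generic, parameterised representation of a quay crane (illustrative only)."""
    base_moves_per_hour: float      # nominal productivity
    position_penalty: float         # assumed slowdown per bay of horizontal travel

    def moves_per_hour(self, bays_from_home: int) -> float:
        """Productivity tailored to a specific circumstance: the crane's position."""
        return max(5.0, self.base_moves_per_hour - self.position_penalty * bays_from_home)

# The same generic model, parameterised for two specific cranes
fast_crane = CraneModel(base_moves_per_hour=30.0, position_penalty=0.5)
old_crane = CraneModel(base_moves_per_hour=22.0, position_penalty=0.8)

for bay in (0, 10, 20):
    print(f"bay {bay}: {fast_crane.moves_per_hour(bay):.1f} vs "
          f"{old_crane.moves_per_hour(bay):.1f} moves/h")
```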
Those groups with deep knowledge of each component, such as crane manufacturers, port infrastructure designers, and ship designers, need to develop, or advise on the development of, a standard model of their component. Standardised digital models of all components in the shipping industry are the next wave of standardisation if the industry is to achieve higher levels of capital productivity through analytics-based operational and strategic decision-making.
The physical instances of all components need to have embedded sensors that generate standardised data streams to calibrate their associated digital model. Both current operations and future needs can be guided by digital twins, provided the maritime industry cooperates to standardise digital data streams and digital models of components.
Editor’s note: This article is an abridged version of a longer paper by the authors, including further details and a full list of references, which can be downloaded here.
About the Authors

Mikael Lind is Associate Professor and Senior Strategic Research Advisor at RISE, and co-founder of Maritime Informatics. He also works at Chalmers University of Technology, for the World Economic Forum, Europe’s Digital Transport and Logistics Forum (DTLF), and UN/CEFACT.
Hanane Becha is the IoT programme project lead at DCSA, and Leader of the UN/CEFACT Smart Container Project, as well as the Cross Industry Supply Chain Track and Trace Project.
Dr Richard T. Watson is a Regents Professor and the J. Rex Fuqua Distinguished Chair for Internet Strategy at the University of Georgia.
Norbert Kouwenhoven is an employee of IBM NL, Authorities Leader in the (IBM/Maersk) TradeLens organisation, as well as Solutions Leader for IBM EU in the area of Customs, Immigration and Border Management. He is also a member of the EU's Digital Transport and Logistics Forum (DTLF).
Phanthian Zuesongdham is Head of Digital and Business Transformation and Head of the smartPORT Programme at the Hamburg Port Authority (HPA), which she represents in bodies such as Europe’s Digital Transport and Logistics Forum (DTLF), the IMO, ESPO and IPCSA.
Ulrich Baldauf is Head of IT Innovation at the Hamburg Port Authority (HPA), responsible for IT architecture and the integration of new technologies in the smartPORT initiative since its beginning in 2013.