Within the maritime and shipping industry, the availability and types of data from a multitude of sources are growing exponentially. As in many industries today, data is a gold mine for smart decision-making. However, for most enterprises, strategic, tactical, and operational decisions need to be based on data from internal and external sources within a multitude of locations.
The maritime sector is arranged as a self-organising ecosystem of autonomous actors in co-opetition, often episodically and tightly coupled to each other. Decisions in this emerging landscape of co-opetition need to rest on solid foundations: high accuracy and fine-grained situational awareness, which in turn require robust data management. In this short article, we consider what the maritime sector can learn from the data management practices of other industries.
Coordination and synchronisation are resolved within the self-organising maritime ecosystem, where each co-producing actor adds value by adjusting to others' plans and progress in order to optimise its capital productivity.
Being a co-producer of value in the maritime ecosystem is, however, highly challenging. Operations along the maritime transport chain are unpredictable and often fail to match settled expectations, and the industry suffers from many sub-optimisations that leave resources idle.
This may, for example, concern a ship that has to wait to be served when visiting a port, or a port actor setting aside resources for a ship that does not arrive according to plan. The consequences are investment in surplus resources, such as ships and terminals, as well as unnecessary energy consumption during operations. Many ports operate on a first-come, first-served basis, so shipping companies commonly wait hours, days or even weeks before being accommodated. Much attention is now being paid to enabling just-in-time shipping.
In the maritime ecosystem, each player does not exist in isolation, but still needs to cater for its own existence. Business intelligence needs to be derived to enable the enterprise to make smarter decisions, requiring robust data management, including best practices relating to data quality, data governance and stewardship.
Data management in other industries
With this in mind, much is to be learned from other industries which have embraced data management and governance strategies. As a consequence of the 2008 financial crisis, financial services institutions have been heavily tasked with complying with regulatory reporting requirements to increase transparency and reduce systemic risk.
The key to being successful here is the creation of robust data management functions, enabling banks, asset managers, hedge funds and pension funds alike, to be able to report in an accurate, complete, and timely manner. Participants must also understand the full audit and lineage around data for compliance purposes.
A further example from the financial services sector is how data management and workflow automation can benefit investment decisions. Multiple data inputs are considered in this scenario, and the speed and accuracy of decisions are paramount: automated processing and the ability to make sense of data and act on it are what drive wins.
Data management and data strategies also play a crucial role in organisational resilience. Those that have invested in these areas are more advanced in their digital transformation journeys and, as a result, have been able to adapt more quickly to the impact of COVID-19. This includes pivoting their approach to working environments, decision-making and data and technology utilisation.
Upstream oil and gas exploration and production is another industry that has benefited from data management. The cost of drilling is significant, and operators (including majors, national oil companies and independents) have invested in data management functions around well and sub-surface data points. This has enabled in-depth analysis and interpretation, resulting in greater success rates and confidence in drilling and upstream processes.
Data management in the maritime ecosystem
The self-organising characteristics of the maritime ecosystem require industry participants to look for collaboration opportunities in information sharing. Information sharing is complicated by a host of factors, the most basic being the ability of industry participants and technologies to speak a common language.
For instance, the UN/LOCODE for the Port of Long Beach is USLGB, while US CBP requires the use of port code 2709 or 2704 when communicating with it, depending on the situation. Carriers also run different technology stacks, each using its own coding system for ports.
On a similar note, there is little consistency in date and time formats between systems, nor is there a commonly used standard for city names across industry actors. From a data management perspective, participants should strive for standardisation and consistency of data across the organisation – doing so creates trust and enables quicker, more effective decisions instead of manual wrangling with disparate data types.
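The kind of normalisation described above can be sketched as a thin translation layer that maps local codes onto a shared reference and source-specific timestamps onto ISO 8601. This is a minimal illustration, not any carrier's actual system: the mapping table entries and the sample date format are assumptions, with only the UN/LOCODE and the CBP Schedule D code taken from the example above.

```python
from datetime import datetime, timezone

# Hypothetical local codes for the Port of Long Beach, all mapped to the
# shared reference, its UN/LOCODE ("USLGB"). The "2709" entry is the US CBP
# port code mentioned above; the others are illustrative.
LOCAL_TO_UNLOCODE = {
    "2709": "USLGB",   # US CBP Schedule D port code
    "LGB": "USLGB",    # assumed carrier-internal code
}

def normalise_port(code: str) -> str:
    """Map a known local port code to its UN/LOCODE; pass unknowns through."""
    return LOCAL_TO_UNLOCODE.get(code, code)

def normalise_timestamp(raw: str, fmt: str) -> str:
    """Parse a source-specific date string and emit ISO 8601 in UTC."""
    return datetime.strptime(raw, fmt).replace(tzinfo=timezone.utc).isoformat()
```

For example, `normalise_port("2709")` yields `"USLGB"`, and `normalise_timestamp("03/11/2020 14:30", "%d/%m/%Y %H:%M")` yields `"2020-11-03T14:30:00+00:00"` – every downstream consumer then sees one code system and one time format.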
The motivations vary: reducing idle time for infrastructure (vessels, ports) through more accurate predictions of movements and port visits, identifying capacity openings, reducing energy consumption, improving infrastructure efficiency, or achieving greater supply chain and end-to-end logistics visibility. Achieving further efficiency – that is, increasing asset utilisation and productivity – is critical to making the transition to low/zero-emission energy assets affordable, as highlighted in the ALICE Physical Internet roadmap.
This means that short- and long-term planning of available infrastructure and resources must be informed by what is happening outside the enterprise itself. The enterprise needs to ensure connectivity beyond its system boundaries so that data can be both provided to and consumed from external actors, and combined with existing data within the enterprise. This must be paired with interconnected services and processes and shared protocols across industries.
Furthermore, the volume and varieties of information are growing rapidly. Tomorrow’s decisions will require multiple data inputs, including both historical and real-time information and the ability to combine data sources.
Whilst this creates opportunity, the disparity of information is challenging, namely the inefficient and laborious process of bringing these multiple datasets together, especially as these datasets come in varying formats. The downside is often missed opportunities because users are unable to capitalise on complete and accurate data.
Manual processes attempting to solve these problems are troublesome, leaving industry participants prone to human error, poor scalability and an inability to act dynamically – as the ongoing COVID-19 pandemic has highlighted. The organisation thus needs to establish an internal data environment that allows different data sets to be connected, whether they are real-time or historical feeds, and whether they come from internal or external sources.
There must also be a focus on data quality. Most actors lack robust data quality functions to meet in-house data standards, commonly relying on the tactical workaround of identifying issues in end-user reports and downstream systems. This is a cause for concern as industry participants strive for more data collaboration, and two questions must be asked: if my data is not quality-controlled, should I be sharing it? If I do not trust my own data, should I trust others'?
Participants must take proactive responsibility for the quality and completeness of their data, and must understand its full audit trail and lineage – where the data came from, who changed it, when and why. As additional digital data streams become available, consumers have more opportunity to improve the quality of the data used in decision-making.
They will also have a stronger foundation to question the data that is provided by others. Enterprises must establish such robust data quality capabilities and governance to ensure decisions across the enterprise are calculated, confident and accurate.
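A data quality function of the kind argued for here can start very simply: a set of named rules that a record must pass before it is trusted or shared. The following sketch is illustrative only – the field names and rules are assumptions, not an industry standard – but it shows the shape of a quality gate that answers "is this record fit to share?" with a concrete, auditable list of failures.

```python
# Each rule is a named predicate over a record (a plain dict here).
# Field names such as "port_unlocode", "eta" and "source" are assumed
# for illustration; a real deployment would define its own schema.
RULES = {
    "has_unlocode": lambda r: bool(r.get("port_unlocode")),
    "eta_present":  lambda r: r.get("eta") is not None,
    "source_known": lambda r: r.get("source") in {"AIS", "carrier", "terminal"},
}

def quality_check(record: dict) -> list:
    """Return the names of all rules the record fails (empty list = fit to share)."""
    return [name for name, rule in RULES.items() if not rule(record)]
```

A record that passes every rule returns an empty list; one with a missing port code or an unrecognised source returns the specific rule names that failed, which doubles as a simple audit trail of why a record was held back.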
With data their most valuable asset, industry operators must act on their strategies for this information. To drive true growth, efficiency and collaboration, core data management must be established at the organisational level for long-term success.
Further, in conjunction with investment in initiatives such as data lakes and business intelligence platforms, the adage "garbage in, garbage out" rings true: industry participants must implement best practices – organisational data standards, technology, culture and accountability – to ensure success both individually and for the greater good of the larger ecosystem.
Adapting workflows in maritime through clusters
Transformation of the data infrastructure in other industries was preceded by a transformation in industry thinking. Indeed, change of any magnitude requires a shift in mindset by leaders who subsequently set in motion company activities.
In recent months, maritime has begun to experience such a shift, evidenced by the adoption of several industry-wide decarbonisation and sustainability conventions, such as the Poseidon Principles and the Sea Cargo Charter, which entail the transfer of business-sensitive information. Firms are implementing many of the best practices described above.
Indeed, the momentum towards the creation of a broader maritime ecosystem accelerated in 2020, due in large part to the pandemic-driven dispersion of staff to individual countries and home offices.
Teams adapted to virtual work which, in turn, made evident the need to collaborate with both colleagues and customers using connected technologies and platforms. Individuals isolated from their co-workers could no longer rely on culturally ingrained, value-driving workflows.
Yet structural barriers prevent the maritime ecosystem from flourishing more organically. To wit, there are nearly 1,000 ship owners in liquid cargo and 2,200 in dry bulk, not to mention the myriad technology companies, financial entities and service providers in the broader ecosystem. This fragmentation acts as a breakwater against the tide of widespread collaboration, so self-organisation within the larger ecosystem happens on a smaller scale.
To augment (or even substitute for) those workflows, micro-ecosystems emerged in which one or more technology service providers, working in concert with an end-customer, developed business-specific and mutually beneficial solutions. Each solution lays down the necessary APIs and integrations to move data among systems, and solves for the challenges of data security, sharing and accuracy. We call these collaborations, “clusters”.
Through the creation of these clusters, the architectural components for a broader maritime ecosystem are being built, while also responding to planetary and humanitarian concerns. Most importantly, each new cluster drives measurable business value for its participants, generating evidence and case studies that demonstrate the value of micro-ecosystems becoming part of the global, integrated maritime transport ecosystem.
Towards optimised resource utilisation
Maritime transport is seeking ways to operate and serve its customers more efficiently at the same time as environmental concerns rise up the agenda. Within the maritime sector, a collaborative responsibility to balance capital productivity and energy efficiency is now emerging.
This act of balancing as well as data management are core themes of Maritime Informatics, an emerging discipline that is uniting practitioners and researchers in their efforts to improve the efficiency, sustainability, safety, and resiliency of shipping.
For all participants, critical decisions need to be made on data coming from multiple internal and external data sources to derive meaningful and actionable insights and analytics. To do this, stakeholders must also focus on their internal data management and governance practices as a foundation for the wider ecosystem’s success.
To facilitate scalability, it is key to adopt concepts such as "connect once and be able to connect to all". It is also important to be able to define resources, processes and procedures following universal protocols, standards and open-source connectivity capabilities, which simplify access to assets, services and resources.
There is a strong need to implement data management at the (sub)ecosystem level as well. Such an approach will achieve gains for multiple stakeholders (for example empowering local ecosystems involving a multitude of players to acquire and better serve existing customers, to drive accuracy in forecasting and planning strategies, and to lighten the burden on technology and operations teams in managing data).
The performance of the self-organising maritime ecosystem will never be better than its weakest link. This means each enterprise involved must perceive data as a competitive asset and manage it accordingly, both to stay ahead and to drive wider ecosystem success.
For the self-organising ecosystem of maritime transport, participants must acknowledge that collaboration is needed to develop its competitive edge; the enterprises of tomorrow must act in co-opetition, empowered by data management that builds upon internal and external resources.
Editor’s note: This article is an abridged version of the original paper by the authors, which includes a full list of references. The full paper can be downloaded here.