Optimising supply chains in the new era of Big Data


Russ Kennedy at Nasuni outlines the steps that today’s global organisations need to take to harness the new opportunities from Big Data analytics


Global firms have responded pragmatically to the geopolitical threats of the last 18 months by rethinking their processes, suppliers and supply chains. But they still face ongoing data management issues as they seek further efficiencies or try to optimise what they have.

And if that wasn’t enough, there is the imminent prospect of having to re-engineer the way they manage information to ensure the successful implementation of ground-breaking artificial intelligence (AI) and machine learning (ML) tools.


Data challenges

Take manufacturing supply chains: they face data management and related challenges as companies strive to understand the tide of data generated by newly federated operations. Analysts broadly agree that the proportion of data being generated and processed outside central data centres or clouds will soon reach 80% or more.

This year’s arrival of AI as a practical option for improving business processes has increased the pressure on CIOs to better control and understand their existing data, to pave the way for its adoption. Analyst firm IDC forecasts that more than half (55%) of the world’s top 2,000 manufacturers will redesign their supply chains to include AI elements by 2026, cutting plant downtime and improving materials use.

And as manufacturing processes and safety equipment migrate to the network edge, new potential attack surfaces will open up for criminals and rogue elements. A recent AT&T survey of 1,500 global firms estimates they will spend between 11 and 20% of their edge computing budgets on security, adding to the pressure on IT and security budgets to achieve more with less.

While the complexity of enterprise file data, its sources and origins poses tough questions for CIOs, the encouraging news is that these data, efficiency and security challenges can now be met, in part because unstructured data can be made available to a range of new analytics services.

As a result, companies with global operations can rethink their data use to improve processes, drive efficiencies and spur innovation in a new era of Big Data.

Towards the new Big Data era

To progress along this path, towards a new understanding of their data and the ability to harness different data sets, global enterprises need to take five main steps:

1) Make data referenceable, trackable and more valuable

Enterprises will need to organise both their unstructured and their existing structured data using intelligent, automated tools that tag these different data sets so they can be categorised according to type, content or format, making them referenceable. This stage will make it easier for different departmental teams or supply chain applications to leverage relevant information.

A global architecture company has extracted greater value from its historical images by adopting a cloud footprint. Its designers can now search more than 500,000 drawings and photos to inspire their work or use relevant examples of previous work in their client pitches.


Aside from referenceability, many companies might be surprised to learn that AI and ML tools from hyperscale cloud providers can do this vital legwork automatically.
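
As an illustration of how little code this can take, here is a minimal Python sketch of automated image tagging, assuming the files sit in an AWS S3 bucket and using the Rekognition labelling service; the bucket and key names are hypothetical, and other hyperscalers offer equivalent services.

```python
# Minimal sketch: auto-tag images stored in S3 using Rekognition labels.
# Bucket and key names are hypothetical; assumes AWS credentials are configured.
import boto3

s3 = boto3.client("s3")
rekognition = boto3.client("rekognition")

def auto_tag_image(bucket: str, key: str, max_labels: int = 5) -> list[str]:
    """Ask Rekognition for labels, then store them as S3 object tags."""
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MaxLabels=max_labels,
        MinConfidence=80.0,  # keep only reasonably confident labels
    )
    labels = [label["Name"] for label in response["Labels"]]

    # Attach the labels to the object itself so downstream applications
    # (search, analytics, supply chain tools) can reference them.
    s3.put_object_tagging(
        Bucket=bucket,
        Key=key,
        Tagging={"TagSet": [{"Key": f"label-{i}", "Value": name}
                            for i, name in enumerate(labels)]},
    )
    return labels

print(auto_tag_image("design-archive", "drawings/site-photo-0001.jpg"))
```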


2) Know your data, size up new opportunities

Identify a platform that helps organisations better understand the supply chain data they have. For most global companies, data assets could be anything from simple text and images to complex 3D drawings and diagrams, as well as the data from ever-increasing numbers of edge and IoT sensors and employee devices.


An energy company that came to us had 2.2 million pages of safety, maintenance and compliance documentation and drawings that needed to be stored, labelled and made easily discoverable. Such a system would once have filled basements full of filing cabinets, but now this crucial information has been digitised and is maintained in the cloud.


From this agile foundation, the energy company was able to implement machine learning tools to classify and label the digitised records and, in a subsequent phase, use separate AI tools to convert handwritten letters into machine-readable text.
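
A hedged sketch of that second phase: the snippet below uses AWS Textract, one cloud OCR service that can read handwriting, to turn a scanned letter into machine-readable text. The bucket and file names are hypothetical, and this is an illustration rather than the energy company’s actual toolchain.

```python
# Sketch: convert a scanned, handwritten document held in S3 to plain text.
# Textract's detect_document_text handles both printed and handwritten text.
import boto3

textract = boto3.client("textract")

def scanned_letter_to_text(bucket: str, key: str) -> str:
    response = textract.detect_document_text(
        Document={"S3Object": {"Bucket": bucket, "Name": key}}
    )
    # Textract returns a list of blocks; LINE blocks carry the recognised text.
    lines = [block["Text"] for block in response["Blocks"]
             if block["BlockType"] == "LINE"]
    return "\n".join(lines)

print(scanned_letter_to_text("compliance-records", "letters/1987-03-12.png"))
```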


3) Accessibility transforms team collaborations

Better innovation — including streamlined supply chain operations — demands that business-critical data sets are available to all relevant departments and geographies. If some of the newly-labelled data is allowed to reside in a traditional on-premises storage infrastructure, this could limit the information’s accessibility.


For example, an engineering team’s ability to collaborate efficiently on a design from multiple locations will be constrained if data sets are tied to file systems and data centres thousands of miles away.


Where data is made accessible, enterprises enjoy a range of benefits. A European manufacturer moved its file data off on-premises file servers located around the world and onto a central cloud platform: company data is now cached locally at 14 different locations, while the file system itself is housed in the cloud and controlled from a central console.
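
The underlying pattern can be sketched in a few lines of Python: files live in one central cloud bucket, and each site keeps a local read cache so frequently used files avoid a WAN round trip. This is a simplified illustration against an S3-style store; the cache path is hypothetical, and production platforms add the locking, synchronisation and cache invalidation this sketch omits.

```python
# Simplified illustration of edge caching over a central cloud file store.
# Real platforms add locking, versioning and cache invalidation.
import boto3
from pathlib import Path

s3 = boto3.client("s3")
CACHE_DIR = Path("/var/cache/filestore")  # hypothetical local cache path

def read_file(bucket: str, key: str) -> bytes:
    """Serve from the local cache; fall back to the cloud on a miss."""
    cached = CACHE_DIR / bucket / key
    if cached.exists():
        return cached.read_bytes()          # cache hit: no WAN round trip

    obj = s3.get_object(Bucket=bucket, Key=key)
    data = obj["Body"].read()               # cache miss: fetch from the cloud

    cached.parent.mkdir(parents=True, exist_ok=True)
    cached.write_bytes(data)                # populate the cache for next time
    return data
```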


With end users now sharing files across locations, existing backup storage infrastructure greatly reduced and local backups done away with, the firm has made its business-critical information more accessible while cutting overall costs.

4) Secure critical data to future-proof your business

To underpin these key steps to the new Big Data age, enterprises need to better detect, mitigate and recover from cyber-attacks, and in doing so, help rebalance their security budgets.


The latest cloud file services platforms not only reduce administrative effort and eliminate costly backup and disaster recovery processes, but also incorporate edge-level ransomware detection and file recovery capabilities.

Where IT teams detect a ransomware attack, they can now restore the relevant file systems to any point in time, with the capacity to recover millions of corrupted or encrypted files within minutes. These new tools enable rapid recovery from incidents that might otherwise disrupt company operations for months, even years.
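
To make the idea concrete, here is a deliberately simplified Python sketch of point-in-time recovery: choose the newest immutable snapshot taken before the attack and restore from it. The snapshot records are invented for illustration; real platforms apply the same principle across millions of files.

```python
# Hypothetical sketch of point-in-time recovery from immutable snapshots.
from datetime import datetime, timezone

# Immutable snapshots of the file system, oldest first (illustrative data).
SNAPSHOTS = [
    {"id": "snap-0441", "taken": datetime(2024, 3, 1, 2, 0, tzinfo=timezone.utc)},
    {"id": "snap-0442", "taken": datetime(2024, 3, 1, 14, 0, tzinfo=timezone.utc)},
    {"id": "snap-0443", "taken": datetime(2024, 3, 2, 2, 0, tzinfo=timezone.utc)},
]

def snapshot_before(attack_time: datetime) -> dict:
    """Choose the newest snapshot taken strictly before the attack."""
    clean = [s for s in SNAPSHOTS if s["taken"] < attack_time]
    if not clean:
        raise RuntimeError("no clean snapshot available")
    return max(clean, key=lambda s: s["taken"])

# Ransomware detected at 03:17 UTC on 2 March: roll back to the 02:00 snapshot.
attack = datetime(2024, 3, 2, 3, 17, tzinfo=timezone.utc)
print(snapshot_before(attack)["id"])  # -> snap-0443
```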


5) Capitalise

In a world of rapidly advancing AI and ML tools, global organisations need to find ways to extract value from newly tagged, accessible and discoverable data. The leading clouds already offer ML, AI and deep learning services that can carry out content search, image recognition, pattern matching and compliance discovery, among many other functions.
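
As one hedged example of the compliance discovery function, the sketch below uses AWS Comprehend’s PII detection to flag personal data in free text; the sample text is invented, and the other leading clouds offer comparable services.

```python
# Sketch: flag personally identifiable information (PII) in free text,
# one building block for compliance discovery over newly tagged documents.
import boto3

comprehend = boto3.client("comprehend")

def find_pii(text: str) -> list[tuple[str, str]]:
    """Return (entity_type, matched_text) pairs that Comprehend detects."""
    response = comprehend.detect_pii_entities(Text=text, LanguageCode="en")
    return [(e["Type"], text[e["BeginOffset"]:e["EndOffset"]])
            for e in response["Entities"]]

sample = "Contact Jane Smith at jane.smith@example.com about invoice 4417."
for entity_type, match in find_pii(sample):
    print(entity_type, "->", match)   # e.g. NAME -> Jane Smith
```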


New levels of collaboration and innovation

Data has evolved into a globally accessible asset that can be used in new and more creative ways. The era of Big Data may be more than a decade old, but this new age of instant insights and intelligence from data could prove to be even more transformative.


Russ Kennedy is Chief Product Officer at Nasuni


