
Why bigger isn’t always better where data is concerned

Jon Payne at InterSystems argues that, for data, quality always wins over quantity

 

When it comes to data, there is a common misconception that the more you have, the better. In fact, the quality of the data is arguably more important than the quantity, and many organisations today collect far more data than they truly need to uncover the core insights they require.

 

The emphasis on collecting as much data as possible has a number of repercussions for businesses. Firstly, storing and processing such large volumes of data can be extremely costly, and the data itself is often difficult to source, ending up scattered across significant silos. Secondly, many organisations find they can’t actually use a great deal of the data they collect, or at least not effectively, owing to a lack of context or to biases in the data. Finally, having to sift through so much data to get the answers they require is time-intensive and distracting, limiting a business’s ability to be agile.

 

To avoid becoming overwhelmed by too much, often poor quality, data, organisations need to initially understand exactly what they want to achieve. From here, they must work out the questions they need to ask of their data to get there and find a way to answer them with the smallest possible dataset.

 

What is small data?

With large volumes of data bringing their own complications, the power of small data, the minimum viable dataset required for the task in hand, shouldn’t be underestimated. The minimum viable dataset is the smallest amount of data an enterprise needs to act effectively, for instance to power the models it has designed.

 

To define just how much data that actually is, the focus should be on what the business needs and on obtaining clean, high-quality data. That quality is essential: it most definitely trumps quantity.
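As a rough illustration of what a question-first, minimum-viable-dataset approach can look like in practice, the sketch below uses Python and pandas with an invented orders table; the column names and quality rule are assumptions made for the example, not a prescription.

```python
# A minimal sketch of a question-first, minimum-viable-dataset approach.
# The "orders" table, its columns and the quality rule are invented for
# illustration; they are not taken from the article.
import pandas as pd

# Imagine a wide export containing many columns the question does not need.
orders = pd.DataFrame({
    "customer_id":     [1, 1, 2, 3, 3, 3],
    "service":         ["broadband", "tv", "broadband", "broadband", None, "mobile"],
    "monthly_fee":     [30.0, 15.0, 30.0, 30.0, 15.0, 12.0],
    "marketing_notes": ["", "call back", "", "vip", "", ""],  # irrelevant here
})

# Question: which services does each customer already take?
# The minimum viable dataset is two clean columns, not the whole table.
mvd = (
    orders[["customer_id", "service"]]  # keep only what the question needs
    .dropna()                           # quality over quantity: drop incomplete rows
    .drop_duplicates()
)
services_per_customer = mvd.groupby("customer_id")["service"].apply(list)
print(services_per_customer)
```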

 

Take, for example, a company that decides it wants to diversify its offering. It would first need to understand whether there is appetite for new services and what those services should be. Obtaining these insights requires access to a small amount of high-quality data to determine the gaps in its current offering and which customers would likely be interested in taking on additional services.

 

Obtaining the right data

Another important consideration for businesses is whether they have the data they need and, if not, whether they would benefit from integrating data from third parties. Historically, many organisations have found integrating external data sources a challenge owing to a lack of data integration capabilities, which has resulted in the creation of additional data lakes and swamps. However, it is now possible to overcome these issues through the use of modern data architectures.

 

A smart data fabric speeds and simplifies access to data assets across the entire business. It accesses, transforms, and harmonises data from multiple sources, on demand, to make it usable and actionable for a wide variety of business applications. Additionally, the incorporation of embedded analytics capabilities such as machine learning and artificial intelligence helps businesses derive greater insights from small data in real time.
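To make the idea of harmonisation concrete, the toy sketch below pulls two invented, differently shaped sources into one consistent view using Python and pandas. A real smart data fabric does this on demand across live systems and at far greater scale; the source names, fields and types here are purely illustrative assumptions.

```python
# Toy illustration of the harmonisation step: two differently shaped sources
# are given common keys, consistent types and a single joined view. Source
# names and fields are invented for the sketch.
import pandas as pd

crm = pd.DataFrame({
    "CustomerID": [1, 2],
    "FullName":   ["Ada Lovelace", "Alan Turing"],
})

billing = pd.DataFrame({
    "cust_id":    [1, 2, 2],
    "amount_gbp": ["30.00", "15.50", "12.00"],           # strings in this source
    "billed_on":  ["2024-01-31", "2024-02-29", "2024-03-31"],
})

# Harmonise: rename keys, fix types, then join into one usable view.
billing_clean = billing.rename(columns={"cust_id": "customer_id"}).assign(
    amount_gbp=lambda d: d["amount_gbp"].astype(float),
    billed_on=lambda d: pd.to_datetime(d["billed_on"]),
)
customers = crm.rename(columns={"CustomerID": "customer_id", "FullName": "name"})

unified_view = customers.merge(billing_clean, on="customer_id", how="left")
print(unified_view)
```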

 

Unleashing the power of small data

Taking a small data approach allows organisations to realise many benefits, from increased efficiency to risk reduction. Working with a smaller amount of far more relevant, real-time data empowers them to gather critical insights much more quickly than they could with large volumes of data, and to identify potential risks sooner.

 

For example, a salesperson trying to understand which customers to upsell to will make more informed decisions by quickly accessing only the most relevant customer transactions and data, rather than every single transaction. Handling less data also reduces the business’s exposure around GDPR and data governance, as well as the risk of data theft.
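A hedged sketch of that upsell scenario might look like the following, again in Python with pandas and with hypothetical table and column names: only the recent transactions of customers not already on the higher-tier product are pulled and ranked, rather than the full transaction history.

```python
# Sketch: query only the slice of data relevant to the upsell decision.
# Table, columns, product tiers and the cutoff date are all hypothetical.
import pandas as pd

transactions = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 3],
    "product":     ["basic", "basic", "basic", "premium", "basic"],
    "spend":       [100, 140, 80, 300, 40],
    "date": pd.to_datetime(
        ["2024-03-01", "2024-05-20", "2023-01-15", "2024-06-02", "2022-08-09"]),
})

cutoff = pd.Timestamp("2024-01-01")

# Minimum viable slice: recent activity for customers not yet on "premium".
recent = transactions[transactions["date"] >= cutoff]
on_premium = set(recent.loc[recent["product"] == "premium", "customer_id"])
candidates = (
    recent[~recent["customer_id"].isin(on_premium)]
    .groupby("customer_id")["spend"].sum()
    .sort_values(ascending=False)
)
print(candidates)   # customers ranked by recent spend, as upsell prospects
```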

 

Additionally, this approach enables businesses to gain access to insights more quickly, reduce costs by limiting the amount of data that needs to be stored and analysed, and, importantly in today’s changing landscape, increase agility.

 

Bigger isn’t always better

In the context of data, bigger isn’t always better. Rather, the focus should be on the quality, not the quantity, of the data at a company’s disposal. What businesses need is access to high-quality, relevant data to undertake the initiatives that will make a real difference to the organisation and its customers.

 

This small data approach, powered by smart data fabric technology, will allow businesses to do more with less and obtain the intelligent, relevant insights many have previously struggled to achieve. Armed with these insights, both those on the frontline and business leaders will be able to make more informed, accurate decisions at a glance to keep pace with disruption, market volatility, and changing customer needs.

 


 

Jon Payne is manager - sales engineering at InterSystems

 

Main image courtesy of iStockPhoto.com
