Alex Meehan addresses the challenge of distilling large swathes of information into something actionable. Interview with Steve Fleming, CEO of Voxxify.
When it comes to making the most of machine learning and artificial intelligence, getting your data into the right format to be properly analysed is crucial. That’s the lesson from several industry experts who all agree on one thing – when it comes to doing business analytics the right way, data compatibility is everything.
“When I first heard about machine learning years ago, I thought it was the kind of thing where you could get data from lots of different sources, throw it into a computer and the machine would be smart enough to work out any problems to do with formatting and would spit out an answer,” Steve Fleming, founder and chief executive of Voxxify, said.
But Fleming quickly found out that things are rarely that simple, and essentially ‘you can’t mix apples and oranges’. Before data from different sources can be merged and used to generate insight, it first has to be ‘normalised’ so that the computer can measure it against a common scale.
“When you talk to statisticians and big data experts, they’ll tell you there are tools, techniques and processes to manipulate and ‘wrangle’ data, in a process known as data ingestion, to prepare it for the machine, and there are people who specialise in just that,” he said.
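In practice, ‘normalising’ data onto a common scale is often a small amount of code once the units have been reconciled. The sketch below is a minimal, hypothetical illustration in Python using pandas; the sources, column names and figures are invented for the example rather than taken from Voxxify’s work. Two sources record the same measurement in different units, so they are converted to one unit, merged, and rescaled to a common 0–1 range.

```python
import pandas as pd

# Hypothetical data from two systems that record the same measurement
# in different units: one in milliseconds, the other in seconds.
source_a = pd.DataFrame({"customer_id": [1, 2, 3], "response_ms": [120.0, 450.0, 300.0]})
source_b = pd.DataFrame({"customer_id": [4, 5, 6], "response_s": [0.2, 1.1, 0.6]})

# Step 1: convert to a common unit before merging ("apples with apples").
source_b["response_ms"] = source_b["response_s"] * 1000.0
combined = pd.concat(
    [source_a[["customer_id", "response_ms"]], source_b[["customer_id", "response_ms"]]],
    ignore_index=True,
)

# Step 2: rescale onto a common 0-1 range so a model treats the values comparably.
col = combined["response_ms"]
combined["response_scaled"] = (col - col.min()) / (col.max() - col.min())
print(combined)
```

The same idea applies whatever the measurement is: agree a common unit first, then put everything on a shared scale before it goes into the model.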
According to Fleming, it’s important for business leaders not to have an expectation that they’re going to be able to take every bit of data they have, put it into a machine learning system and have it all work seamlessly.
“There’s an expression that says ‘the best time to plant a tree is 20 years ago, the second best time is now’, and something similar can be said about data. To get the most out of machine learning and business analytics, you need to have your data in a format that can be queried, but not many organisations do,” he said.
“So if that’s the case with you, then the first thing to do is to try to make sure that going forward, any new data you generate is in a format that lends itself to your projected future business needs.”
In reality, when it comes to making old data and data from lots of different sources work together, compromises will have to be made.
“Not all data is suitable for this kind of analysis. To get the most out of the machine learning techniques that are out there, you need a lot of variation in the data. Let’s say you’ve got a data set made up of records of individuals in which everybody is five foot seven inches tall, everyone’s hair is brown, their eyes are green and they’re all the same age,” he said.
“The machine isn’t going to be able to separate that data out in any useful way. But if there is more variation in the data set, such as lots of different hair colours, lots of different heights and a wide variation in ages, then the machine can pick out differences.”
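A quick way to see this point in code: if every record has identical attributes, each column holds only one unique value and gives a model nothing to learn from. The short sketch below uses invented example values to compare a uniform data set with a varied one, a sanity check practitioners sometimes formalise as dropping ‘zero-variance’ features.

```python
import pandas as pd

# Hypothetical records where every attribute is identical: no column carries
# any information that could be used to tell the individuals apart.
uniform = pd.DataFrame({
    "height_cm": [170, 170, 170, 170],
    "hair": ["brown"] * 4,
    "eyes": ["green"] * 4,
    "age": [35, 35, 35, 35],
})

# Hypothetical records with real variation across every attribute.
varied = pd.DataFrame({
    "height_cm": [155, 170, 182, 191],
    "hair": ["brown", "black", "blonde", "red"],
    "eyes": ["green", "blue", "brown", "grey"],
    "age": [22, 35, 48, 61],
})

# Columns with a single unique value add nothing, so count the informative ones.
for name, df in [("uniform", uniform), ("varied", varied)]:
    informative = [c for c in df.columns if df[c].nunique() > 1]
    print(name, "-> informative columns:", informative)
```

Run on the uniform data set, the check finds no informative columns at all; on the varied one, every column offers something for the machine to work with.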
To try to make sure that the companies he works with get the most from any data analytics process they engage with, Fleming insists it’s important to start from the right perspective.
“There are people out there with vast amounts of data streaming off multiple systems and they need to make sense of it. For them, AI and machine learning could be perfect, but there are a lot of other people out there who start things the other way around. They start with being excited about the possibilities rather than looking at the business case,” he said.
“They feel that if they’re not using whatever the latest tech is, then they might be missing a trick. Instead, we ask the customer what end result they’re looking for, and we work backwards from that. AI and machine learning might be a tool to help bring that about, but it depends on what the company is trying to do and what data they have, how it was captured and so on.”
Read full article on Business Post