Organisations are collecting ever-increasing amounts of data, not only in terms of volume from existing sources but also in the range of sources they glean it from. Whether it’s IoT sensors on the factory floor or information about how customers interact with a website, the challenge for companies is in integrating these disparate sources to create a complete picture of their organisations.
Making the most of data requires both technology and strategy, as John Spooner, head of Artificial Intelligence, EMEA, at H2O.ai, explains: “To make the most of organisational data, build a cross-functional team, and make sure everyone is involved, be they experts in data, technology, analytics, industry knowledge or line of business specialists.” It’s also vital to have an idea of what data analysis can and can’t do. “Companies should start by having a clear idea of business problems they could solve and processes they could improve with their data sources, before spending too much effort on operationalising a wider variety of data sources,” says André Balleyguier, DataRobot’s Chief Data Scientist for EMEA. “Once business ideas are clearly scoped and road mapped according to their value and feasibility, the next step is to prioritise which data sources to focus on. This prioritisation goes hand-in-hand with the development of a data strategy that scales to an increasingly large variety and volume of data sources needed to solve the business problems.”
While such unified analytics has revolutionary potential, it’s not a fix-all, as Spooner explains: “Raising expectations too high at the outset can lead to overhype. Similarly, failing to explain what is possible can lead to indifference and a poorly supported project. Select the right use cases to start and then expand. The initial use cases should be of high value, achievable, near term and data ready.” The real benefits come through standardisation. “Scaling often means adopting standard processes and platforms for data cataloguing and management that are both flexible and replicable, to avoid unnecessary manual customisation for each data source,” says Balleyguier.
Reasons for the democratisation of access to data analytics
According to Spooner:
“1. Technology advancements such as cloud, big data platforms and GPU processing now allow employees to process a large amount of data quickly and at an affordable initial investment.
2. The growth of open-source analytics frameworks and interactive analytics dashboards has enabled employees to have access to affordable software that enables them to quickly gain insight from their data.
3. It has been proven through many use cases that data analytics delivers massive returns on investment for organisations, allowing them to increase revenues and reduce costs.”
It’s also a question of ensuring that the correct stakeholders have access to the relevant data. “The other challenges that need to be addressed relate to how these data sources are made available for consumption by the business: are the relevant business users able to easily access and analyse the data? Are they aware of what is available?” says Balleyguier. Of course, the definition of who counts as a stakeholder is also expanding. “An increasing number of tools are available to make the use of data more accessible to non-experts. In the same way that, with the appropriate tools, anyone is now able to create a website without being an expert web developer, or analysts can make rather complex calculations on data using Excel without coding, there has been a surge in the number of platforms that allow business users to create rather complex data visualisations or dashboards. These platforms can be directly connected to the data sources to allow more informed and agile business decisions.”
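To illustrate the kind of Excel-style calculation Balleyguier describes, here is a minimal sketch (with entirely hypothetical customer-interaction data and column names) of how a few lines of Python with the pandas library can turn raw records from a connected source into a per-channel summary ready for a dashboard, without any specialist coding:

```python
import pandas as pd

# Hypothetical customer-interaction records, as might arrive from a
# connected data source (column names are illustrative assumptions)
interactions = pd.DataFrame({
    "channel": ["web", "web", "store", "store", "web"],
    "revenue": [120.0, 80.0, 200.0, 150.0, 50.0],
})

# A single aggregation step: count interactions and total revenue
# per channel, the sort of summary a dashboard tool would chart
summary = interactions.groupby("channel")["revenue"].agg(["count", "sum"])
print(summary)
```

The point is not the specific library: self-service analytics platforms perform this same aggregate-and-visualise step behind a point-and-click interface, which is what puts it within reach of non-expert business users.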
The benefits of that democratisation are manifold. “Democratising access to analytics helps resolve the disconnects that may happen between the business and the analytics departments and mitigates the shortage of expert skills available,” says Balleyguier. “This in turn helps companies identify and solve more business problems, with the same resources, at faster speed, without relying on a siloed department of experts.”
The future of connected data will of course involve expanding to hitherto untapped sources, but it will also require technological advances. “Advancements in hardware such as chip design, automation in many parts of the analytics process and analytics embedded directly into the applications that business users utilise will decrease the time to insight, enabling employees to get answers to their business questions in seconds, rather than having to wait hours or, in some cases, days,” says Spooner. That process is already underway, with the ongoing COVID-19 pandemic accelerating existing trends. “Geo-location or mobility data has been instrumental in managing the disruption of COVID-19, and this trend will continue: augmenting analytics with a wider variety of data like images, text or location will continue to be a key enabler in the years to come,” says Balleyguier.
Perhaps the most important development will be the increased integration of artificial intelligence into analysis. “Moreover, better and faster access to more digital data is encouraging a wider use of AI solutions, heavily consuming this data,” says Balleyguier. “This will enable companies to move away from decisions mainly based on dashboards and business intuition to more informed decisions augmented with AI. In the next few years, we will see companies move from AI experimentation to AI industrialisation, by adopting more repeatable and automated operational processes for AI creation, deployment and management.”