Solving the data paradox: Think big, start small, scale fast

By Zhiwei Jiang, CEO of the Insights & Data Global Business Line at Capgemini.


If data is the new key enterprise asset, then many companies are sitting on riches that remain remarkably unexplored. When sourced, used, governed and stored accurately, fairly and ethically, data is the single most valuable tool businesses have to mitigate the risks they currently face and re-model plans for the long term.

However, organizations must first make sense of the data they have or they risk drowning in it. Critical insight is often lost in vast, inaccessible data stores or hidden from view in business silos. To use this lucrative “commodity” correctly, organizations must re-think both their enterprise culture and data architecture. Below, I outline three essential success factors for unlocking the value of enterprise data.

Establishing data as a cultural pillar

Data can, and indeed should, be an asset to all areas of a business, not just IT. To truly harness the value of data is to recognize that everyone in the organization, from marketing to operations, can gain insight from it. Enabling this mindset requires a shift in data practices within the business: developing a highly automated data ‘pipeline’ that gives employees across functions access to the right data through an intuitive, self-service interface that welcomes even those completely unfamiliar with analytics, AI and data science.

The case for a federated data landscape

Data is rarely static; it is constantly moving, evolving and expanding, reflecting the world around us. For organizations, this means constantly adapting the data-dependent aspects of the business and managing the complexity of this multi-directional, agile process. To address this, they should take a federated approach to data, acknowledging that data is stored in many different ways and can be owned and used by different entities.

But for that, data must be united through a common platform architecture that enables connection and collaboration across the business, increasingly supported by AI and intelligent automation. Using this approach, organizations can fetch data from multiple sources, then analyze and activate it for any category of use. One of the most common examples of this concept is the creation and management of data lakes. Unlike their well-established sibling, the data warehouse – which often lacks the agility and diversity to be fully useful – data lakes have flexible boundaries regarding what is collected, who has access to it and how it is analyzed. The lake is self-sustaining, but it does not function in isolation; it is connected to other lakes, as well as to alternative data stores.

While this is the promise of data lakes, some organizations have found that the real-world application falls short. Improper management and unclear parameters have turned some data lakes into virtual dumping grounds, even less accessible and manageable than their data warehouse predecessors. For this reason, it is more important than ever for organizations to remain vigilant as they collect and store data, activating only data that is accurate, high-quality, consistent, unbiased and meaningful. These foundational requirements must be met before the data is leveraged for analytics and activation.

Think big, start small, scale fast

Cultural change has to be anticipated and embraced. It is important to adopt an agile enterprise mindset that encourages experimentation, in which a business identifies goals at a program’s outset but then adapts its approach in near real time based on outcomes. In most organizations, the key to embracing this mentality, among leadership and employees alike, is to establish momentum through results and create an appetite for more: think big, start small, scale fast.

With any program, it is important to build scalability into the model and keep it top of mind throughout. In some cases, we see a fail-fast mentality work against organizations: they become so fixated on proving a concept that they fail to consider its utility for the organization at large. This is apparent, for example, in the years many companies have spent trying to scale up promising AI developments. To avoid this pitfall, every proof of concept should be underpinned by a business need and its relevance to the wider organization. Moving the data landscape to the cloud, for example, can be done effectively with small initial steps that deliver immediate business value – in effect, a ‘proof of value’. This creates a productive, value-oriented foundation for rapid scaling across the entire enterprise.

Today’s business landscape prizes speed and efficiency. An understanding of one’s data and, more importantly, the ability to accrue value from it, is critical to achieving this. Organizations must take a holistic approach to their data and weave it into the lifeblood of the enterprise, both from a cultural and architectural standpoint.

 
