Data Lakes are Only Good for Those Who Can Swim

March 24, 2020

We jump into data lakes and their barriers to entry for both the big fish and the smaller fry with Talend Head of Cloud Alliances Robert Cornell.

By Joanne Leila Smith

Companies are adopting the cloud for its low-cost, scalable storage, flexibility of infrastructure on demand and for increased speed of new application deployment.

The major cloud providers, such as Amazon Web Services (AWS) and Microsoft Azure, are racing to avoid commoditisation by differentiating their offerings with unique capabilities to provide value to their customers and to drive growth. Amazon announced more than 20 updates to AWS in February alone, and Microsoft has posted even more updates and enhancements, many of which drive flexibility of deployment, scalable performance and storage and security.

This activity is pushing the global services market to grow about 18 percent this year compared to 2018, according to Gartner estimates.

North America is the strongest market for cloud services, but the Asia Pacific region is slowly gaining ground – and Singapore is one of a few rapidly growing markets in the region outranking all other Asian countries in cloud readiness.

The Infocomm Media Development Authority (IMDA) says it is also committed to encouraging the adoption of cloud services and solutions. As the adoption increases across the region, business models are also changing in favour of cloud-enabled models bolstered by the services economy.

Until recently, data lakes were the domain of large organisations alone. Big data platforms, and the insights derived from them, were out of reach for everyone else because the cost of setting them up was prohibitive. Now, with cloud technology, these platforms and services are available at a fraction of the cost and accessible to far smaller organisations that can finally afford to experiment.

We asked Talend Head of Cloud Alliances Robert Cornell just how big an organisation needs to be to, one, afford to use data lakes and, two, get meaningful insights out of data in the cloud.

“The barrier to entry has been reduced significantly, so almost any organisation can afford to do some level of insight analysis on their data. The barrier is no longer the technology, the systems or the cost; it comes down to the people, who remain the bottleneck in doing meaningful analysis of data. The thing that limits what the world does with data is access to the data engineers, data scientists and data analysts, many of whom spend a large percentage of their time on low-value tasks trying to bring together data from different data sources. With platforms like Talend Data Fabric making data self-service a reality, these scarce (human) resources can be maximised to deliver real results for any business in a controlled, governed and managed way,” says Cornell.

On the subject of gaining meaningful insights, we asked Cornell whether there was a tier of organisations or groups who may not have thought about how to drive data insights into their business from internal and external perspectives.

Cornell argues that many companies think about what they want to do with data and the related insights, but they often limit themselves to the initial use cases.

“Leaders in every sector will be the ones that develop the insights for an initial use case, then reuse the same insights in a plethora of different ways. Once insights are developed, every business should test how that insight can be used to deliver new and interesting customer, partner and employee experiences, often powered by APIs to allow programmatic access. Once the same insights are used in different ways, the organisation will reap exponentially more value from their data, but they must also ensure they maintain governance and compliance over the data when its usage is expanded,” says Cornell.

Now, with access to immense global data sets and years of accumulated internal data, Cornell says the kinds of problems organisations can start tackling, problems that were not previously solvable for most businesses, come down to proactiveness.

“It is no longer enough to allow a customer to control the buying process, a patient to control their healthcare or a student to control their tertiary education. With huge amounts of historical data, the leading organisations in every field will take back control and tell their clients what will happen next before they know it themselves. By using massive amounts of historical data, organisations can predict what will happen and become more proactive in their relationships with customers. Being able to make meaningful and timely predictions from historical data and trends may help minimise potential negative impact to the business, enable us to react faster with access to real-time data, and develop the right products and services to meet the needs of the consumer,” says Cornell.
