Cloud as enabler for data-driven organizations

Expert Views

Daniel Klemm

Becoming more data-driven is a goal for many companies of all sizes. When it comes to realizing the benefits, most still have a lot of work to do and plenty of untapped potential. 

Big data and analytics are a top priority for many enterprises, yet few have widely integrated Machine Learning and Artificial Intelligence solutions so far. Most are yet to set up scalable and flexible data processing systems that can handle the anticipated amount of data and data sources, thus enabling future use cases. For many companies, Cloud Computing is the cornerstone of their target system. Today, building a data platform often requires a multi-cloud approach spanning proprietary systems on Private and Public Clouds as well as multiple Software-as-a-Service (SaaS) products. 

This article presents some current data-related trends and how the Cloud can help. 


Strong demand for real-time insights 

Real-time insights are becoming essential. Companies increasingly turn to the Cloud to build data platforms that enable real-time use cases, for example in marketing, customer experience, and business operations. Common building blocks of a data platform that enables live insights are streaming analytics to process data as it arrives, a data catalogue to support self-service, and business intelligence tools for visualization and report creation. 
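To illustrate the "process data as it arrives" idea behind streaming analytics, here is a minimal sketch in plain Python. The class and variable names are invented for this example and are not tied to any particular cloud service; a production system would use a managed streaming service rather than an in-memory window.

```python
from collections import deque
from statistics import mean

class StreamingAverage:
    """Rolling average over the most recent events, updated as data arrives."""

    def __init__(self, window_size: int):
        # Fixed-size window: old events fall out automatically.
        self.window = deque(maxlen=window_size)

    def ingest(self, value: float) -> float:
        """Add one event and return the current windowed average."""
        self.window.append(value)
        return mean(self.window)

# Simulated event stream: each value is processed the moment it arrives,
# instead of waiting for a nightly batch job.
stream = [10.0, 12.0, 11.0, 50.0, 13.0]
agg = StreamingAverage(window_size=3)
averages = [agg.ingest(v) for v in stream]
```

The same pattern scales up conceptually: a streaming engine keeps bounded state (here, the window) and emits an updated result per incoming event.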


Building the foundation for Machine Learning & Artificial Intelligence 

Most companies still have some work to do to create the preconditions for Machine Learning & AI use cases. Many of these, especially in predictive analytics, require access to historical as well as current data. On the technical side, this calls for reliable data pipelines, scalable and durable data storage, and scalable computing power for the necessary processing. A governance model that enables access for those who need it while ensuring compliance is also required. 
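The data pipeline idea mentioned above can be sketched as a minimal extract-transform-load flow. This is a conceptual Python sketch: the in-memory source and sink stand in for real storage systems, and all names are illustrative.

```python
def extract(source: list) -> list:
    """Pull raw records from a source (here: an in-memory list)."""
    return list(source)

def transform(records: list) -> list:
    """Clean and normalize: drop incomplete records, coerce types."""
    return [
        {"id": r["id"], "value": float(r["value"])}
        for r in records
        if "id" in r and "value" in r
    ]

def load(records: list, sink: list) -> int:
    """Write transformed records to durable storage (here: a list)."""
    sink.extend(records)
    return len(records)

# One incomplete record (missing "id") is filtered out by the pipeline.
raw = [{"id": 1, "value": "3.5"}, {"value": "9"}, {"id": 2, "value": "7"}]
warehouse: list = []
loaded = load(transform(extract(raw)), warehouse)
```

Reliability in a real pipeline then comes from adding retries, validation, and monitoring around exactly these three stages.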


Amount of data is growing 

The amount of data in databases (structured data) and data lakes (unstructured and semi-structured data) is growing. Especially when collecting data from (industrial) Internet of Things devices, the number of data points quickly rises to levels that were unimaginable only a few years ago. 

Customer data used for personalization and tailored services or products, as well as other data, is growing too. 


Heterogeneous data landscape 

Data is collected and processed in numerous different systems and formats that form a complex landscape. Data can be found in legacy systems hosted in traditional data centers, Software-as-a-Service solutions, modern Cloud applications, and on the edge. For use cases with (near) real-time requirements, computation on the edge is often mandatory. 

More complex Cloud setups such as Multi Cloud, Dual Cloud, and Hybrid Cloud are increasingly becoming a reality for many companies. 


Data gets treated as a product 

Some forward-thinking companies treat data as a product. That means applying a product mindset that puts the customer at the centre of all activities. The customers of a team providing data products can be internal, partners, or external parties. Data can, for example, be provided internally or to the ecosystem via APIs (API economy) or as data sets. Customers typically discover such data via API directories, data catalogues, and/or marketplaces. 
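The discovery side of this can be sketched with a tiny in-memory data catalogue. Everything here is illustrative: the class names, the example product, and its tags are invented, and a real catalogue would of course be a managed service or API rather than a Python dictionary.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """A data set offered as a product, described by a catalogue entry."""
    name: str
    owner_team: str
    description: str
    tags: list = field(default_factory=list)

class DataCatalogue:
    """In-memory catalogue that lets consumers discover data products."""

    def __init__(self):
        self._products = {}

    def register(self, product: DataProduct) -> None:
        self._products[product.name] = product

    def search(self, tag: str) -> list:
        """Self-service discovery: find products carrying a given tag."""
        return [p.name for p in self._products.values() if tag in p.tags]

catalogue = DataCatalogue()
catalogue.register(DataProduct(
    name="customer-orders",
    owner_team="sales-analytics",
    description="Daily order facts, cleaned and deduplicated",
    tags=["orders", "daily"],
))
hits = catalogue.search("orders")
```

The key design point is that each product carries ownership and description metadata, so consumers can find and assess it without contacting the producing team first.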


Areas where the cloud can help even more 

The big Cloud providers have many helpful services in their portfolios that can accelerate and improve your company's journey to becoming data-driven. Not only do they provide the relevant building blocks for storing and processing large quantities of data; lab environments also speed up exploratory tasks and prototyping. The Cloud providers offer modern special-purpose hardware that is easy to use, which is beneficial especially for Machine Learning and Artificial Intelligence. Furthermore, the Public Cloud portfolio includes – among other services – sophisticated security solutions and the hosting of huge public or commercial data sets ready for processing inside the Cloud. 

The following table shows a selection of relevant services from the popular hyperscalers Amazon Web Services (AWS), Google Cloud Platform (GCP) and Microsoft Azure (Azure). 



When looking at the Cloud providers' portfolios, we see that the essential building blocks required for capturing, processing, and analysing data are available on all platforms. The maturity of the respective offerings has grown considerably in recent years. 

Amazon Web Services is still the market leader with the broadest and deepest portfolio. 

Microsoft Azure was rather late to the cloud game but has had several years to catch up and has become the fastest-growing provider, with a portfolio that attracts customers who have a long history with Microsoft products. 

Google Cloud Platform is very strong when it comes to Kubernetes, Big Data as well as Machine Learning and AI. 

Nevertheless, for most use cases the compute resources of the cloud where the data is located are used, as there is a data gravity effect: transferring data out of one Cloud costs money and takes time, so the processing of larger amounts of data typically happens close to where the data is stored. 

Our 2022 cloud native study showed that being quick and able to adapt is the most important driver for companies' future cloud-native approach. Databases, Big Data Analytics, and Business Intelligence solutions are the platform services most widely used by the companies interviewed. Big Data / Analytics is the most important use case already implemented and the most common cloud use case currently being worked on. Our research clearly shows that many companies turn to the Cloud when building the tools and platform(s) to become more data-driven. 



Success Factors & Recommendations 

When starting to leverage the Cloud for your data-related use cases, it is important to find the right balance: do not get overwhelmed by the complexity involved, but also do not jump in without a vision. Take some time to clarify your vision and strategy, then focus on implementing your first high-impact use cases. Writing down some guiding principles is a good starting point, followed by an evolutionary approach to architecture and vision. This allows you to start quickly and to get more specific and incorporate learnings as you go. 

Especially when it comes to cloud and data architecture, high-impact decisions – mainly those characterized by a high cost of change – should ideally be identified early on and made thoroughly and in time. 

Make use of the special capabilities of the Cloud: the flexibility and scalability it provides allow for different solutions than a fixed-capacity on-premises data center environment. Infrastructure as Code enables the reuse of infrastructure components that can be created quickly and torn down when they are no longer needed. Platform services provide ready-to-use building blocks that can drastically reduce the required setup and implementation effort. This contrasts with the traditional data center approach, where huge upfront investments in hardware are necessary, and its procurement and installation add complexity and delays. 

Companies should focus on creating business value sooner rather than later. It is recommended to target high-impact use cases directly. This allows for an early return on investment and ensures that the business profits from these tech-heavy activities in the best possible way. Alignment and involvement of relevant parties – e.g., through cross-functional teams – is necessary as the platform gets built iteratively. To capture value early on, it is recommended to start working with existing data. Use this data to augment and improve what you are already doing. Focus on people and processes and let them guide you to the right solutions. That means data experts, Cloud experts, and business experts should work closely together. End users should also be involved as early as possible. 

It is not only about technology: organizational topics such as skills, talent, culture, operating model, and processes are very important if you want to capture value with the technology. 

As you proceed, keep an eye on the big picture. Regular reviews and updates of your overall vision and assumptions can be of great help to reach the target state that is ideal for your business.