February 8, 2021


We rang in 2020 with all the expectations that cloud computing would continue its progression as a massive catalyst for digital transformation throughout the enterprise. What we didn’t expect was a worldwide health crisis that led to a huge jump in cloud usage.

Cloud megadeals have heralded a new era in which cloud is a key driver of how organisations deploy their operating models and platforms. In just the past six months, we saw six years' worth of change in the way businesses are run.

There has been a drastic shift to remote work – with the percentage of workers using desktop services in the cloud skyrocketing. Gallup estimates that as of September, nearly 40 percent of full-time employees were working entirely from home, compared to 4 percent before the crisis.

We are seeing renewed interest in workload migration to public cloud or hybrid environments. Gartner forecasts that annual spending on cloud system infrastructure services will grow from $44 billion in 2019 to $81 billion by 2022.

In light of these underlying changes in the business landscape, here are some key cloud trends that will reshape IT in 2021:

Remote working continues to drive cloud and security services

The year 2020 saw a huge expansion of services available in the public cloud to support remote workers. Big improvements in available CPUs mean remote workers can access high-end graphics capabilities to perform processing-intensive tasks such as visual renderings or complex engineering work. And as remote access to corporate networks increases, the cloud serves as an optimal platform for a Zero Trust approach to security, which requires verification of every identity trying to connect to systems before access is granted.
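The core Zero Trust idea can be illustrated in a few lines: no request is trusted implicitly, and every caller must present a verifiable identity before any access is granted. The token store and checks below are purely hypothetical, a minimal sketch of the principle rather than any real product's implementation:

```python
# Hypothetical identity store -- in practice this would be an identity
# provider, not an in-memory dict.
VALID_TOKENS = {"token-abc": "alice", "token-xyz": "bob"}

def authorise(request: dict) -> bool:
    """Verify identity on every call, regardless of network location."""
    identity = VALID_TOKENS.get(request.get("token"))
    if identity is None:
        return False  # unverified identities are denied by default
    return True

print(authorise({"token": "token-abc"}))  # True
print(authorise({"token": "forged"}))     # False
```

The design point is the default: being "inside the network" grants nothing, and an absent or unrecognised credential fails closed.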

Latest system-on-a-chip technology will support cloud adoption

The introduction of the Apple M1 in November marked the beginning of an era of computing in which essentially the whole computer system resides on a single chip, delivering substantial cost savings. Apple's Arm-based system on a chip represents a radical shift away from traditional Intel x86 architectures. In 2020, Amazon Web Services also launched Graviton2, a processor based on 64-bit Arm architecture that Amazon says provides up to a 40 percent improvement in price performance for a variety of cloud workloads, compared to current x86-based instances.

These advancements will help enterprises migrate to the cloud more easily with a compelling price-to-performance story to justify the move.

Move to serverless computing will expand cloud skills

More than ever, companies are retooling their operating models to adopt serverless computing, letting cloud providers run the servers. This shift is prompting enormous changes in the way they operate, provide security, develop, test and deploy.

As serverless computing grows, IT personnel can quickly learn and apply serverless development techniques, which will help expand cloud skills across the industry. Today's developers often prefer a serverless environment so they can spend less time provisioning hardware and more time coding. While many enterprises find it challenging to predict consumption and manage costs in a serverless environment, cost optimisation services can play a key role in overcoming these risks.
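The appeal described above can be seen in how little code a serverless function needs: the developer writes only the business logic, and the provider handles servers, scaling and invocation. The sketch below follows the AWS Lambda-style handler convention; the event shape is a hypothetical example, and the same function can be invoked locally for testing:

```python
import json

# A minimal serverless-style handler: the platform calls this function
# on demand; there are no servers for the developer to provision.
def handler(event, context=None):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local invocation -- nothing to deploy or manage during development.
response = handler({"name": "cloud"})
print(response["statusCode"])  # 200
```

Because the unit of deployment is a plain function, testing and iteration stay fast, which is a large part of why developers gravitate to this model.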

Machine learning and AI workloads will expand in the public cloud

As a broad range of useful tools becomes available, the ability to operate machine learning and AI in a public cloud environment – and integrate them into application development at a low cost – is moving forward very quickly.

For example, highly specialised processors, such as Google’s TPU and AWS Trainium, can manage the unique characteristics of AI and machine learning workloads in the cloud. These chips can dramatically decrease the cost of computing while delivering better performance. Adoption will grow as organisations figure out how to leverage them effectively.

More data will move to the cloud

Data gravity is the idea that large masses of data exert a form of gravitational pull within IT systems and attract applications, services and even other data. Public cloud providers invite free data import, but data export carries a charge.

This is prompting enterprises to build architectures that minimise egress charges, which often means keeping workloads and their data together in a single cloud rather than spreading them across multicloud environments. Data usage in the cloud can eventually amass enough gravity to increase the cost and consumption of cloud services.

However, as cloud technology continues to mature, organisations should not be afraid to duplicate data in the cloud – that is, it is perfectly fine to have data in different formats in the cloud. A key goal is to have your data optimised for the way you access it, and the cloud allows you to do that.

The journey continues

As the world around us changed in unprecedented ways in 2020, cloud computing continued its march toward enterprise ubiquity and is now a key layer of the enterprise technology stack. Yet, by most measures, cloud still accounts for only around 5 to 10 percent of IT, so there is plenty of room to grow. Getting the most out of cloud technology is not a 5-year journey; it may take many years to truly tap its full potential.

The shift in operational models and the organisational change needed to fully embrace cloud computing are significant; the transformation is no small undertaking. In early 2022, we will most certainly look back and be amazed again at how far cloud has progressed in one year.

About the author

James Miller is DXC Technology’s chief technology officer and vice president for Cloud and Platform Services. He builds key client relationships, advises senior leadership on technology trends and initiatives, and provides oversight and thought leadership to grow both DXC and our customers’ businesses. Previously, Jim was a Fellow and industry chief technologist for manufacturing, automotive, aerospace and defense, and strategic accounts at Hewlett Packard Enterprise.