Cloud predictions for 2020

Multi-cloud environments have been a hot topic for the last year. Businesses are already realizing the benefits of a vendor-agnostic approach, which not only minimizes costs but gives them the freedom to innovate. However, a couple of operational challenges will be key to ensuring multi-cloud remains viable for enterprises over the long term.

Despite the freedom that comes with a vendor-neutral ecosystem, orchestrators haven't yet overcome the headache of migrating workloads between different cloud infrastructures. The past year saw major cloud players like IBM making acquisitions to address this, but as yet, no one has found a successful solution. Over the next year, this will be a priority for enterprises looking to remove the bottlenecks in their CI/CD pipelines. Organizations will invest in services that help them harness a multi-cloud ecosystem by supporting fast deployment, scalability, integration and operational tasks across public and private clouds.
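
To make that concrete, here is a minimal, hypothetical sketch of the abstraction such services aim to provide: a single deploy call fanned out across public and private clouds. None of the class or function names below correspond to a real vendor SDK; they simply illustrate the vendor-agnostic interface idea.

```python
# Hypothetical sketch of a vendor-agnostic deployment layer. None of these
# classes map to a real vendor SDK; they illustrate hiding "which cloud?"
# behind one interface so workloads can move without pipeline changes.
from abc import ABC, abstractmethod
from typing import List


class CloudTarget(ABC):
    """One public or private cloud behind a common interface."""

    @abstractmethod
    def deploy(self, image: str, replicas: int) -> str:
        """Deploy a container image and return a deployment identifier."""


class PublicCloudTarget(CloudTarget):
    def __init__(self, region: str):
        self.region = region

    def deploy(self, image: str, replicas: int) -> str:
        # A real implementation would call the provider's API here.
        return f"public/{self.region}: {image} x{replicas}"


class PrivateCloudTarget(CloudTarget):
    def deploy(self, image: str, replicas: int) -> str:
        # ...and here, e.g., an on-premises OpenStack or Kubernetes API.
        return f"private: {image} x{replicas}"


def deploy_everywhere(targets: List[CloudTarget], image: str, replicas: int) -> List[str]:
    """Fan the same workload out to every cloud in the estate."""
    return [target.deploy(image, replicas) for target in targets]


if __name__ == "__main__":
    estate = [PublicCloudTarget("eu-west-1"), PrivateCloudTarget()]
    for result in deploy_everywhere(estate, "shop:1.4", replicas=3):
        print(result)
```

The design point is that the CI/CD pipeline only ever talks to the common interface, so adding or dropping a provider doesn't touch the deployment logic.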

Another piece of the puzzle will be observability and monitoring across clouds. To ensure operations are maintained across the entire ecosystem and that workloads are being fulfilled, the components for observability must be in place. This becomes complex in a multi-cloud infrastructure, where the same level of visibility and governance must be applied across instances. 2020 will be the year public cloud providers start to put these projects together, and we are already seeing the first instances of this with the likes of Google Anthos.
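
As an illustration of the shape of the problem, the sketch below pulls the same health metric from several clouds into a single view so one visibility rule can be applied everywhere. Everything here is hypothetical: fetch_from_cloud is a stand-in for a real monitoring API, such as a provider's metrics endpoint or a federated Prometheus scrape.

```python
# Hypothetical sketch: one view over health metrics from several clouds.
# fetch_from_cloud is a placeholder for a real monitoring API.
from dataclasses import dataclass
from typing import List


@dataclass
class MetricSample:
    cloud: str
    service: str
    cpu_percent: float


def fetch_from_cloud(cloud: str) -> List[MetricSample]:
    # Placeholder data; a real collector would query the provider here.
    return [
        MetricSample(cloud, "checkout", 61.0),
        MetricSample(cloud, "search", 12.5),
    ]


def unified_hotspots(clouds: List[str], threshold: float = 50.0) -> List[MetricSample]:
    """Apply the same visibility rule across every instance, whatever the cloud."""
    samples = [s for cloud in clouds for s in fetch_from_cloud(cloud)]
    hot = [s for s in samples if s.cpu_percent > threshold]
    return sorted(hot, key=lambda s: s.cpu_percent, reverse=True)


if __name__ == "__main__":
    for s in unified_hotspots(["aws", "gcp", "on-prem"]):
        print(f"{s.cloud}/{s.service}: {s.cpu_percent}% CPU")
```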

Unicorn start-ups will begin repatriating workloads from the cloud

There has been a lot said about cloud repatriation of late. While this won't be a mass exodus from the cloud -- in fact quite the opposite, with public cloud growth expected to increase -- 2020 will see cloud-native organizations leveraging a hybrid environment to enjoy greater cost savings.

For businesses starting out or working with limited budgets that need an environment for experimenting with the latest technology, public cloud is the perfect place to start. In the public cloud, you are your own limit and innovation is rewarded immediately. But as costs begin mounting, it's prudent to consider how to regain control of cloud economics.

Repatriating workloads to on-premises infrastructure is certainly a viable option, but that doesn't mean we will start to see the decline of cloud. As organizations pass each new milestone in the development process, repatriation becomes more and more of a challenge. What we will likely see is public cloud providers reaching into the data center to support this hybrid demand, so that they can capitalize on the trend.

Public cloud providers will be subject to increased security standards

The US Department of Defense's decision to award the 10-year contract for its JEDI project to Microsoft will prove to be a watershed moment, serving as a trigger for more government agencies to move applications and unify information in the public cloud. The lure of major federal spending will drive other cloud providers to compete in this multi-billion dollar space.

One of the biggest impacts will be the need to raise security and compliance standards in the public cloud. Government bodies work to extremely stringent requirements, which will now be placed on cloud providers and will have a spillover effect on the sector as a whole. This will include higher standards for how hybrid environments are architected and the need for complete data separation between public cloud and on-premises environments. It will also encourage a move away from the outsourcing model, as organizations seek to build up their in-house cloud skills to meet requirements.

While this will primarily impact the US cloud market, it will also have ripple effects for other markets. The hyperscale providers are global in nature and so will be required to adjust their policies and practices for jurisdictions such as the post-Brexit United Kingdom, where there will be new standards around data protection and data separation from non-UK entities.

Greater network automation through AI/machine learning

The state of artificial intelligence and machine learning (AI/ML) in business has matured from a nebulous vision into tangible deployments. Companies are now placing a much heavier focus on AI/ML and are reorganizing their IT and business operations to cater for the trend. We're observing this first hand through Kubeflow, where we see scores of startups and established enterprises joining every day to explore what they can do with AI/ML and how they can make deployments easier.
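
For readers unfamiliar with Kubeflow, a minimal pipeline built with its Python SDK (kfp, v1-era) looks something like the sketch below. The container images and commands are placeholders; the point is how a few lines of Python describe ML steps that Kubeflow then schedules on Kubernetes.

```python
# A minimal sketch of a Kubeflow pipeline using the kfp v1 SDK
# (pip install kfp). Images and commands below are placeholders.
import kfp
from kfp import dsl


@dsl.pipeline(name="train-demo", description="Toy two-step training pipeline")
def train_pipeline():
    preprocess = dsl.ContainerOp(
        name="preprocess",
        image="python:3.8",  # placeholder image
        command=["python", "-c", "print('preprocessing data')"],
    )
    train = dsl.ContainerOp(
        name="train",
        image="python:3.8",  # placeholder image
        command=["python", "-c", "print('training model')"],
    )
    train.after(preprocess)  # enforce step ordering


if __name__ == "__main__":
    # Compile to a workflow definition a Kubeflow cluster can execute.
    kfp.compiler.Compiler().compile(train_pipeline, "train_pipeline.yaml")
```

Because the compiled pipeline runs anywhere Kubeflow does, the same description is portable across the public and private clouds discussed above.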

One specific area already being enhanced by AI is networking. We're working with several IT and telecoms companies that want to build better networks and gain far deeper insight into how those networks are being used -- across everything from optimizing power consumption to automating maintenance tasks. In 2020, the focus on AI/ML in the networking space will grow bigger than ever as more and more case studies emerge.

Kubernetes will no longer be seen as the silver bullet

Kubernetes has become an integral part of modern cloud infrastructure and serves as a gateway to building and experimenting with new technology. It's little surprise that many companies we observe are doubling down on it and reorienting their DevOps teams around it to explore new capabilities such as enabling serverless applications and automating data orchestration. We think this trend will continue in strength in 2020.
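
As a small example of what that day-to-day automation looks like, here is a sketch using the official Kubernetes Python client to scale a Deployment. The deployment name "myapp" and the "default" namespace are placeholders, and the sketch assumes a kubeconfig is available locally.

```python
# Sketch of routine Kubernetes automation with the official Python client
# (pip install kubernetes). "myapp" and "default" are placeholder names.
from kubernetes import client, config


def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    """Patch a Deployment's replica count -- a tiny orchestration task."""
    config.load_kube_config()  # reads the local kubeconfig, e.g. ~/.kube/config
    apps = client.AppsV1Api()
    body = {"spec": {"replicas": replicas}}
    apps.patch_namespaced_deployment_scale(name, namespace, body)


if __name__ == "__main__":
    scale_deployment("myapp", "default", replicas=5)
```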

On a more cautious note, we may also see some companies questioning whether Kubernetes is really the correct tool for their purposes. While the technology can provide tremendous value, it can be complex to manage and requires specialist skills. As Kubernetes is now commonly used in production at scale, users are increasingly likely to encounter issues around security and downtime. As a result of these challenges, we can expect the community to mature and -- in some cases -- conclude that Kubernetes is not right for every application, or that specialized expertise needs to be brought in from outside vendors.

5G will enable 'Netflix for gaming'

Mobile gaming has grown at a ferocious pace over the past decade to become a mainstream phenomenon. The global video gaming industry was worth nearly $138 billion last year, of which more than $70 billion came from mobile gaming. In 2020 we will see this trend accelerate further with the expansion of 5G, which offers the robust connectivity, low latency and high bandwidth required to deliver graphics-heavy content. This will enable the 'streaming as a service' model to flourish in gaming, giving users access to a curated set of gaming applications on a subscription basis, all at the touch of a button.

Where 5G really makes a difference here is in freeing up the compute power that would normally be required to download gaming applications. The popularity of cloud gaming (or gaming on demand) services such as PlayStation Now and Shadow was accelerated by rapid developments in GPUs. In mobile, this same phenomenon will be made possible by 5G as an edge use case. An early example is Google's Stadia, due to go live in November of this year, which allows browser-based streaming.

We expect mobile gaming will become a far bigger trend in 2020 as the large telco providers roll out their 5G networks. This in turn will trigger a greater focus on open source as a means for building resilience and scalability into their cloud-based 5G core.


Stephan Fabel is Director of Product at Canonical -- the publisher of Ubuntu. He is responsible for cloud products, including OpenStack, Kubernetes and MAAS. His main interests lie in creating attractive and reliable infrastructure solutions that enable higher productivity for developers and businesses alike.

