Generative AI Storms into Cloud Computing
Generative Artificial Intelligence is disrupting nearly every industry, from healthcare to entertainment (where it helped spark the writers' and actors' strikes), but regardless of the application or industry, its impact may be felt most intensely in the cloud computing landscape. This article explores how generative AI is changing cloud computing and driving the need for more powerful edge solutions.
Generative AI refers to algorithms (e.g., ChatGPT, DALL-E, Midjourney) that can create new content such as text, images, audio, and software code. But AI’s power is limited by the quality of the underlying datasets used to train it and the processing power available to run it. The more massive the training dataset, the more capable the algorithm can be, generating demand for more computing power and storage.
Cloud Computing
Since it emerged in the early 2000s, modern cloud computing has become the leading solution for managing large-scale computational tasks and data storage, both of which are vital for generative AI. According to KPMG, the cloud is expected to surpass on-premises infrastructure by 2024. However, the often enormous datasets used to train an AI algorithm and the intense computational power required can lead to latency issues with cloud services, hence the need for hybrid cloud/edge architectures.
Cloud computing is well suited to running an algorithm through its training period, when data and computational demands are typically at their highest. It lets a company scale capacity up or down dynamically rather than incur the cost of building out infrastructure sized for the peak load needed during training. For AI, the cloud is scalable and cost-effective, broadens access, and allows companies to start with smaller, trial generative AI projects.
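To make that scaling argument concrete, here is a minimal back-of-the-envelope sketch in Python. The hourly rate, GPU count, training duration, and hardware cost are hypothetical placeholders rather than any provider's actual pricing; the point is simply that paying for peak capacity only while you need it can be far cheaper than owning it.

```python
# Hypothetical comparison: renting GPU capacity in the cloud only for the
# duration of a training run vs. buying hardware sized for that same peak
# load. All figures are illustrative assumptions, not vendor prices.

CLOUD_RATE_PER_GPU_HOUR = 3.00      # assumed on-demand price, USD
TRAINING_GPUS = 64                  # peak GPUs needed only during training
TRAINING_HOURS = 30 * 24            # assume a 30-day training run

ON_PREM_COST_PER_GPU = 30_000       # assumed purchase price per GPU, USD
ON_PREM_ANNUAL_OVERHEAD = 0.20      # power, cooling, ops as a fraction of capex

cloud_cost = CLOUD_RATE_PER_GPU_HOUR * TRAINING_GPUS * TRAINING_HOURS
on_prem_cost = (ON_PREM_COST_PER_GPU * TRAINING_GPUS
                * (1 + ON_PREM_ANNUAL_OVERHEAD))

print(f"Cloud, paid only while training: ${cloud_cost:,.0f}")
print(f"On-prem, sized for peak load:    ${on_prem_cost:,.0f}")
```

Under these assumed numbers the rented capacity costs a small fraction of the owned hardware, which is why trial generative AI projects tend to start in the cloud.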
According to Grand View Research, the global edge computing market grew from $7.43 billion in 2021 to $11.24 billion in 2022, a 51% increase, and is expected to expand at a compound annual growth rate of 37.9% from 2023 to 2030. Cloud platforms are innately well suited to collaboration across geographies and organizations, which can help accelerate development and deployment. Cloud services can also be combined with edge computing, which provides multiple benefits (see the sketch after this list):
- Edge computing puts processing power closer to the user, on edge devices or local servers, rather than relying solely on centralized cloud infrastructure. This significantly reduces latency and bandwidth requirements, which is critical for applications such as self-driving vehicles.
- Some AI use cases involve sensitive or personal data, such as medical records or financial information. Edge computing can improve data privacy and security by keeping the processing and storage more localized, reducing the risk of data breaches, and allowing for compliance with privacy regulations.
- The increase in local processing reduces the amount of data that needs to be transmitted, lowering bandwidth needs, which can also reduce costs and improve performance.
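As a rough illustration of this hybrid pattern, the Python snippet below routes an inference request either to a local edge node or to the central cloud based on latency, data sensitivity, and payload size. The thresholds, endpoints, and request fields are hypothetical placeholders, not any provider's API.

```python
from dataclasses import dataclass

# Hypothetical request attributes used to pick an execution target in a
# hybrid cloud/edge deployment. Thresholds and endpoints are illustrative.

@dataclass
class InferenceRequest:
    max_latency_ms: int      # hard latency budget (e.g., self-driving needs tens of ms)
    contains_pii: bool       # medical/financial data that should stay local
    payload_mb: float        # size of the data that would travel to the cloud

EDGE_ENDPOINT = "https://edge.local/infer"       # placeholder edge node
CLOUD_ENDPOINT = "https://cloud.example/infer"   # placeholder central cloud

def choose_target(req: InferenceRequest) -> str:
    """Route to the edge when latency, privacy, or bandwidth demand it."""
    if req.max_latency_ms < 100:        # round trip to a distant region is too slow
        return EDGE_ENDPOINT
    if req.contains_pii:                # keep sensitive data on local infrastructure
        return EDGE_ENDPOINT
    if req.payload_mb > 50:             # avoid shipping large payloads over the WAN
        return EDGE_ENDPOINT
    return CLOUD_ENDPOINT               # otherwise use elastic central capacity

# Example: a latency-critical request from a vehicle stays at the edge.
print(choose_target(InferenceRequest(max_latency_ms=30, contains_pii=False, payload_mb=2.0)))
```

In practice, the cloud providers discussed below package this kind of routing into managed hybrid offerings rather than leaving it to application code.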
Cloud Leaders
The major players in cloud computing, many of which are also leaders in edge computing, are adding generative AI to their cloud services, and we predict that in the coming years, AI will be to cloud services what influencers are to vacation destinations - you can’t imagine one without the other. According to a Jefferies survey of CIOs, 48% reported that Amazon (AMZN) Web Services was their primary cloud provider, 43% reported Microsoft’s (MSFT) Azure, 8% reported Google, and 3% reported Oracle.
- In April, Alibaba (BABA) unveiled its latest large language model, Tongyi Qianwen, which will be integrated across various businesses. It gives customers and developers access to the model to create customized AI features cost-effectively. According to CEO Daniel Zhang, “As a leading global cloud computing service provider, Alibaba Cloud is committed to making computing and AI services more accessible and inclusive for enterprises and developers, enabling them to uncover more insights, explore new business models for growth, and create more cutting-edge products and services for society.” You’ll read similar comments from the other cloud leaders.
- Alphabet’s (GOOG) Google Cloud generative AI offering includes a suite of AI and machine learning products such as PaLM, Vertex AI, Model Garden, Duet AI, Generative AI Studio, and Generative AI App Builder. In March, Google announced Workspace features for the rest of us just trying to get through the day. For example, in Gmail and Google Docs, you can type in a topic you’d like to write about (such as a job description), and a draft will be generated for you. The company also offers Google Distributed Cloud, which extends its cloud service to localized data processing centers, helping customers meet regulatory requirements and build applications faster.
- In April, Amazon released a new set of AI technologies integrated into AWS, such as Titan, a family of foundation models, and Bedrock, which allows customers to build customized “foundation models” using their own data. The company also announced a preview version of its AI-powered coding assistant, CodeWhisperer, which is similar to Replit Ghostwriter and GitHub Copilot. At the Reuters MOMENTUM conference in Austin on July 11, the VP of Amazon’s Applications Group, Dilip Kumar, told the audience, "These models are expensive. We're taking on a lot of that undifferentiated heavy lifting so as to be able to lower the cost for our customers." Amazon’s strategy for AI looks to be similar to its retail strategy - go big and price low. Like many of its competitors, Amazon’s cloud solution includes edge offerings, with equipment located outside of AWS data center clusters.
- In February, IBM (IBM) announced that it had developed an AI-focused supercomputer, dubbed Vela, that is natively integrated into IBM Cloud’s Virtual Private Cloud environment. It has been online since May 2022 and is currently available only to the IBM Research community. The company has hinted that Vela is a proof of concept for a larger deployment plan, suggesting AI-supercomputing-as-a-service is on its way.
- On July 13, Microsoft and KPMG announced an expanded global partnership that involves AI solutions. KPMG has pledged to invest $2 billion in Microsoft Cloud and AI services over the next five years. This is expected to create a growth opportunity of over $12 billion for KPMG as it incorporates AI into its core audit, tax, and advisory services. Microsoft has already invested billions in OpenAI, the maker of ChatGPT, and offers Azure Cognitive Services, which brings AI solutions to its cloud offering. Acceleration Economy reported that Microsoft is expected to report quarterly cloud revenue of over $30 billion for the first time when it releases Q4 numbers later in July. Last month, Amy Hood (EVP and CFO) and Kevin Scott (EVP of AI & CTO) said during an AI discussion that they believe AI built into Microsoft Cloud solutions “will be the fastest growing $10 billion business in our history.” The company also has an edge computing offering, Azure Private, that allows customers to choose their edge platform and create secure connections between the endpoint and the Azure hybrid cloud.
- Semiconductor giant Nvidia (NVDA) targets edge computing with its EGX Edge Computing Platform. Its products are used by some of the world’s leading cloud computing providers, including Alibaba, Amazon Web Services, Baidu (BIDU) Cloud, Google Cloud, IBM Cloud, Microsoft Azure, Oracle (ORCL) Cloud, and Tencent (TCTZF) Cloud, demand that has helped push its share price up over 200% in 2023. Advanced Micro Devices (AMD) and Intel (INTC) should also benefit as cloud service providers build out data centers to support massive AI workloads.
The bottom line is that AI’s ability to massively improve what we do, how we do it, how we live, and even how long we live makes it a powerful and unstoppable innovative force. With AI being incorporated into so many cloud services, access to AI will become increasingly democratized, making even greater levels of innovation possible. With greater innovation will come even greater demand for cloud services in a virtuous cycle wherein innovation drives ever more demand for the tools of innovation. Those semiconductor multiples may not be so insane after all.
The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.