
Can AI Predict the Future? AI Networking Transformation Through the Lens of Advanced Language Models

Image Credit: sakkmesterke/BigStockPhoto.com

When discussing predictions, I like to reference Peter Drucker, the renowned management consultant, who said: “Predicting the future is like trying to drive down a country road at night with no lights while looking out the back window.”

While predictions are a risky business, consider the following twist on Drucker's scenario: navigating that country road at night with no lights, looking out the back window, but this time letting an artificial intelligence handle the navigation. This may indeed work. And it raises a compelling question: if advanced AI tools can succeed at driving a car at night, can they apply the same exceptional data-analysis capabilities to predicting the future?

A 12-Month AI Networking Transformation Prediction

The thought made me approach this yearly prediction exercise differently and enlist AI agents (ChatGPT and Bard) to peer into the crystal ball. They will offer their educated predictions about the trends that will shape AI networking over the next 12 months, and from their insights I will craft a collective vision. This way, perhaps I might even succeed in predicting the future!

The prompt used.

Although the two AI agents use different large language models (LLMs), it was easy to distill a collective vision from their outputs. Both highlighted the following five trends that are likely to influence the AI networking space in 2024:

  1. Exponential growth in AI workloads
  2. Open networking
  3. Edge computing will need a distributed architecture
  4. Sustainable and energy-efficient networking
  5. AI for IT Operations (AIOps)

These trends are set to shape the evolution of AI networking, making it more intelligent, efficient, and adaptable. Let’s examine how each of these trends impacts future networking solutions.

#1: Exponential Growth in AI Workloads

The recent exponential growth in compute power has opened the door to large-scale AI model training.

Compute power over the years - source: “Compute Trends Across Three Eras of Machine Learning” by Sevilla et al., 2022.

Applications such as ChatGPT, Bard, and X.AI's Grok use large-scale LLMs. But the demand for larger, better models continues to grow, requiring hyperscalers to support even larger workloads with clusters of thousands of GPUs. We can expect this trend to continue, driven by new AI algorithms and the proliferation of AI applications across various sectors. It will drive a significant increase in the size of AI workloads and of the GPU clusters supporting them. The underlying architecture and network connectivity play crucial roles in ensuring the efficient use of these clusters and the successful training of the models. Workload architects will seek networking solutions that can ensure optimal Job Completion Time (JCT) performance at any scale, even with large-scale clusters of 16,000 and 32,000 GPUs.
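To see why the network becomes a bottleneck for JCT at these scales, a back-of-envelope sketch helps. The numbers below are illustrative assumptions (a hypothetical ~140 GB gradient exchange per training step and 400 Gb/s links), not a vendor model, using the standard ideal-ring all-reduce cost formula:

```python
# Illustrative sketch: ideal ring all-reduce time for one gradient exchange.
# The model size, link speed, and cluster sizes below are hypothetical.

def ring_allreduce_seconds(num_gpus: int, gradient_bytes: float, link_gbps: float) -> float:
    """Ideal ring all-reduce: each GPU moves 2*(N-1)/N of the gradient volume."""
    bytes_on_wire = 2 * (num_gpus - 1) / num_gpus * gradient_bytes
    link_bytes_per_sec = link_gbps * 1e9 / 8  # Gb/s -> bytes/s
    return bytes_on_wire / link_bytes_per_sec

# Hypothetical fp16 gradients (~140 GB) over 400 Gb/s links.
for n in (1024, 16_384, 32_768):
    print(f"{n:>6} GPUs: {ring_allreduce_seconds(n, 140e9, 400):.2f} s per exchange")
```

The per-step communication time stays bandwidth-bound and nearly constant as the cluster grows, which is exactly why congestion, packet loss, or uneven link utilization in the fabric translates directly into longer JCT, regardless of how many GPUs are added.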

#2: Open Networking

Hyperscalers have already embraced open and disaggregated networking solutions in their data centers, recognizing that monolithic, proprietary solutions cannot deliver the scalability, flexibility, and cost-effectiveness their large-scale compute resources require. It is therefore clear that proprietary networking solutions, previously well suited to High-Performance Computing (HPC), will gradually be displaced by more open alternatives. Proprietary networking solutions lock the infrastructure into a single end-to-end stack, slowing innovation and keeping costs high due to lack of competition. Open, standards-based solutions are therefore critical to the growth of the AI ecosystem: they enable cost-effective infrastructure for high-scale workloads, which in turn enables the proliferation of LLMs and drives new applications. This is the basis for the formation of the Ultra Ethernet Consortium (UEC), which aims to open AI networking to a standard, Ethernet-based model adapted to the needs of AI. My prediction, therefore, is that Ethernet will be adopted for AI backend networking.

#3: Edge computing and distributed architecture

Large backend workloads are great for handling highly complicated tasks or training extensive AI models. However, for AI inference there is a growing trend to shift computing power closer to the application to enhance user experience, particularly in scenarios requiring rapid decision-making. While a fully distributed AI workload may not be a reality in 2024, the movement toward edge computing will continue and will lead to more frequent interconnections between front-end and back-end networks. This trend highlights a current networking issue: the inconsistency of connectivity protocols between backend and frontend networks. A unified networking solution could significantly streamline network management and potentially boost overall performance; we have already seen initial steps in this direction with the introduction of the UEC.

#4: Sustainable and Energy-Efficient Networking

As AI workloads intensify, especially those involving thousands of GPUs, their substantial power consumption becomes a major concern. Although the energy impact of the network is much lower than that of the compute (roughly 10x lower), it should also be addressed. Additionally, the carbon footprint remains a key concern, irrespective of scale. I predict that new AI networking solutions will put more focus on energy efficiency. This includes adopting energy-efficient hardware and embracing a sustainable vision aligned with the principles of the circular economy. Furthermore, we anticipate a growing emphasis on advanced software designed to enhance resource utilization.

#5: AI for IT Operations (AIOps)

AIOps is already being implemented by several networking vendors to improve network operations. I expect further investment in AIOps tools to make a greater impact on operational efficiency and to revolutionize networking. Predictive analytics and real-time anomaly detection, in particular, can resolve potential network issues before they escalate and improve reliability. As AI networking evolves, it will substantially improve high-performance connectivity.
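The real-time anomaly detection mentioned above can be sketched in a few lines. The snippet below is a minimal illustration (not any vendor's AIOps product): it flags telemetry samples, here hypothetical link-latency readings, whose z-score against a trailing window exceeds a threshold; the window size, threshold, and data are all assumptions for the example.

```python
# Minimal sketch of rolling z-score anomaly detection on network telemetry,
# the kind of real-time check an AIOps pipeline might run.
# Window size, threshold, and the sample data below are hypothetical.

from statistics import mean, stdev

def detect_anomalies(samples, window=5, threshold=3.0):
    """Return indices of samples whose z-score vs. the trailing window exceeds threshold."""
    anomalies = []
    for i in range(window, len(samples)):
        base = samples[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma > 0 and abs(samples[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Hypothetical link-latency readings (ms); the spike at index 7 gets flagged.
latency_ms = [1.1, 1.0, 1.2, 1.1, 1.0, 1.1, 1.2, 9.8, 1.1, 1.0]
print(detect_anomalies(latency_ms))  # -> [7]
```

A production AIOps system would of course use richer models and correlate many telemetry streams, but the principle is the same: learn a baseline from recent history and alert on deviations before they become outages.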

Can LLMs predict the future? We'll have a clearer picture in 12 months.

What is clear, however, is that after AI's 'eureka' moment with the introduction of ChatGPT, all the pieces of the AI puzzle will continue to evolve and improve. This 2024 forecast anticipates a transformation toward smarter, more efficient, and more adaptable AI networking, ready to meet the expanding demands of AI's evolution.

Author

Shai is a Director of Product Marketing at DriveNets. Previously he served as a Product Marketing Manager in Radware’s network security group. His domain of expertise is service provider/carrier solutions, where he leads positioning, messaging, and product launches. Prior to Radware, Shai had a tech marketing role at Orange IL, Sony Mobile and startup companies. Shai holds a B.Sc. degree in Electronic Engineering from Ariel University.
