Seeing a bright future in the development of AI technologies, Microsoft has invested $10 billion in OpenAI, the well-known research laboratory behind ChatGPT. On the back of this announcement, Microsoft launched an improved version of its search engine Bing. The new Bing incorporates Prometheus, a large language model developed in collaboration between Microsoft and OpenAI, as well as the technology that underlies ChatGPT.
Not to be left out, Baidu launched ERNIE Bot. Initially operating as standalone software, ERNIE Bot will be integrated into Baidu's own search engine at a later date.
Regarding the models and specifications of the computing chips used in these AI projects, ChatGPT has mainly adopted NVIDIA's A100 and exclusively utilises the cloud-based resources and services of Microsoft Azure. If the demand from ChatGPT is combined with that from Microsoft's other applications, then Microsoft's demand for AI servers is projected to total around 25,000 units for 2023.
Turning to Baidu's ERNIE Bot, it originally adopted NVIDIA's A100. However, due to the export control restrictions imposed by the US Commerce Department, ERNIE Bot has now switched to the A800. If the demand from ERNIE Bot is combined with that from Baidu's other applications, then Baidu's demand for AI servers is projected to total around 2,000 units for 2023. A survey by TrendForce has revealed that in the market for server GPUs used in AI-related computing, the mainstream products include the H100, A100 and A800 from NVIDIA, and the MI250 and MI250X series from AMD. It should be noted that the A800 is designed specifically for the Chinese market in the context of the latest export restrictions. In terms of market share for server GPUs, NVIDIA now controls about 80%, whereas AMD controls about 20%.
Focusing on GPU specifications, models that handle high-bandwidth computing, and therefore require high-bandwidth memory (HBM), have attracted even more attention in the market. HBM currently represents about 1.5% of the entire DRAM market. The main suppliers of HBM solutions are Samsung, SK Hynix and Micron. Among them, SK Hynix is expected to become the dominant supplier of HBM3 solutions, as it is the only one capable of mass producing the HBM3 solution that has been adopted by NVIDIA.