Elon Musk unveils ‘Colossus’ supercomputer to challenge GPT-4

Elon Musk’s latest venture, xAI, has swiftly brought its new supercomputer, “Colossus,” online over the Labor Day weekend. This powerful system is designed to train xAI’s large language model (LLM), Grok, which aims to compete with OpenAI’s GPT-4. Musk claims that Colossus is “the most powerful AI training system in the world” and has ambitious plans to double its capacity by acquiring 50,000 of Nvidia’s advanced H200 chips, which are twice as powerful as the current H100 models.

The supercomputer, located in Memphis, was completed rapidly despite fierce competition for Nvidia’s Hopper-series AI chips among tech giants like Microsoft, Google, and Amazon during the current AI boom. Musk has committed to spending $3–4 billion on Nvidia hardware this year across xAI and Tesla, with some of the chips reportedly prioritized for xAI.

Colossus will be used to train Grok-3, xAI’s third-generation AI model, which Musk expects to be the most powerful AI model upon its release in December. An early beta version of Grok-2, trained on 15,000 Nvidia H100 chips, is already regarded as one of the most capable AI models available.

The project’s scale has raised concerns in Memphis due to Colossus’s high resource demands, including up to 1 million gallons of water per day for cooling and the potential to consume 150 megawatts of power. However, Musk’s focus on speed and scale has driven xAI’s rapid progress. In May, the company raised $6 billion in a Series B funding round with backing from major investors like Andreessen Horowitz, Sequoia Capital, Fidelity, and Saudi Arabia’s Kingdom Holding. Musk has also hinted at a $5 billion investment from Tesla into xAI, which could further accelerate the company’s growth.