AI chip company Groq wants everyone to forget about Elon Musk's Grok, the snarky chatbot with almost the same name. Groq's lightning-fast demos went viral this weekend, making the current versions of ChatGPT, Gemini, and even Grok look sluggish by comparison. Groq claims to provide "the world's fastest large language models," and third-party testing suggests that claim may hold up.
In a demo posted online, Groq produced a factual, several-hundred-word reply, complete with cited sources, in under a second. In another demo, founder and CEO Jonathan Ross let a CNN host hold a real-time spoken conversation with an AI chatbot halfway around the world on live television. ChatGPT, Gemini, and other chatbots are impressive, but Groq can make them lightning fast, fast enough for practical, real-world use cases.
Groq makes AI chips called language processing units (LPUs), which it claims are faster than Nvidia's graphics processing units (GPUs). Nvidia's GPUs are generally seen as the industry standard for running AI models, but early results suggest LPUs may outpace them.
Groq is an "inference engine," not a chatbot like ChatGPT, Gemini, or Grok. It helps those chatbots run incredibly fast but doesn't replace them entirely. On Groq's website, you can test different chatbots and see how fast they run on Groq's LPUs.
Groq produces 247 tokens per second compared to Microsoft's 18 tokens per second, according to third-party testing from Artificial Analysis published last week. That means ChatGPT could run more than 13 times faster if it ran on Groq's chips.
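To see where the "more than 13 times" figure comes from, here is a minimal back-of-the-envelope sketch in Python; the two throughput numbers are the ones cited above, and the 500-token reply length is purely illustrative.

```python
# Throughput figures cited above, from the Artificial Analysis benchmark.
groq_tokens_per_sec = 247
microsoft_tokens_per_sec = 18

# Relative speedup: 247 / 18 is roughly 13.7, i.e. "more than 13 times faster".
speedup = groq_tokens_per_sec / microsoft_tokens_per_sec
print(f"Speedup: {speedup:.1f}x")

# Wall-clock time to generate a 500-token answer (illustrative reply length).
for name, rate in [("Groq", groq_tokens_per_sec), ("Microsoft", microsoft_tokens_per_sec)]:
    print(f"{name}: {500 / rate:.1f} s for a 500-token reply")
```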
AI chatbots like ChatGPT, Gemini, and Grok could be far more useful if they were faster. One current limitation is that these models can't keep pace with real-time human speech; the lag makes conversations feel robotic. Google recently faked its Gemini demo to make it seem like Gemini could hold real-time, multimodal conversations, even though it can't. But with Groq's increased speed, that video could become reality.
Before Groq, Ross co-founded Google's AI chip division, which produced the cutting-edge chips used to train AI models. With LPUs, Ross says Groq bypasses the two LLM bottlenecks that GPUs and CPUs get stuck on: compute density and memory bandwidth.
Where does the name Grok come from? Stranger in a Strange Land, a 1961 science fiction novel by Robert Heinlein. The word means "to understand deeply and intuitively." That's why so many AI companies use it to describe their AI products.
Not only are there Ross's Groq and Elon Musk's Grok, but there is also another AI-enabled IT company named Grok. Grimes also has an AI-powered toy called Grok, reportedly named after the way she and Musk's children say "Grocket." However, Ross says his Groq came first, in 2016.
"Welcome to Groq's Galaxy, Elon," Ross wrote in a November blog post, three days after Elon Musk released xAI's version of Grok. "You see, I'm the founder and CEO of a company called Groq™," Ross said, pointing out that Groq is a trademarked name.
While Groq is generating a lot of buzz, it remains to be seen whether its AI chips can scale the way Nvidia's GPUs or Google's TPUs do. AI chips are a major focus these days for OpenAI CEO Sam Altman, who is even considering building them himself. Groq's chip speeds could accelerate the AI world, creating new possibilities for real-time communication with AI chatbots.