Elon Musk-led artificial intelligence (AI) startup xAI is making its first generative AI-powered chatbot, Grok, available to users in India.
Aside from India, xAI announced that Grok AI will be available in 46 other countries, including Singapore, New Zealand, Malaysia, Canada and Australia, less than a week after it arrived in the US.
Currently, Grok AI is available to X Premium+ subscribers. For those unaware, Premium+ is the top subscription tier of X (formerly Twitter).
Earlier this month, X said access to the AI chatbot was being rolled out in a phased manner. In other words, the company granted early access to longstanding subscribers across the web, iOS and Android.
"Grok is now gracing the world in even more countries, spreading knowledge and laughter far and wide. The future is looking brighter already," X CEO Linda Yaccarino announced in a post.
How is Grok AI different from other AI-powered chatbots?
In an announcement post in November, xAI said Grok is designed to "answer questions with a bit of wit" and has a "rebellious streak".
xAI said Grok will answer questions that are currently rejected by other AI chatbots. Also, the company noted that, unlike its rivals, Grok will have access to real-time information thanks to data from X.
However, xAI was recently accused of using OpenAI's codebase to train Grok AI. Musk's AI company, for its part, claims Grok is based on its proprietary large language model (LLM) known as Grok-0.
Furthermore, xAI says Grok-0 was trained with 33 billion parameters and can outperform the free version of ChatGPT, which runs on the GPT-3.5 language model. Soon after rolling out to users in the US, Grok found itself at the centre of several controversies.
For instance, Grok didn't disappoint when it was asked to roast Musk. According to Musk's own chatbot, he is a "delicate flower".
Grok also questioned many of Musk's actions, including his obsession with X. Moreover, Grok does not align with Musk's political views.
While the AI bot currently answers most political questions in the same tone as other AI bots, Musk has previously stated that xAI was "taking immediate action to shift Grok closer to politically neutral".
Running Grok AI requires a lot of power
Meanwhile, a report by Business Insider has shed more light on Musk's original plan for Grok AI. According to Oracle's chief technology officer, Larry Ellison, Musk's xAI wanted more Nvidia chips for its AI chatbot than Oracle could provide.
Nvidia stock has more than tripled in 2023, as investors believe the graphics chip specialist will play a key role in the artificial intelligence revolution.
Ellison recently said Musk is one of those customers who can't get enough of Nvidia's semiconductors. Notably, Grok currently runs on Oracle's servers.
While the cloud-infrastructure giant was able to provide enough Nvidia chips to power the first version of Grok, it later fell short of Musk's demands, Ellison said during Oracle's latest earnings call, according to a transcript provided by AlphaSense/Sentieo.
"Boy, did they want a lot more GPUs than we gave them. We gave them quite a few but they wanted more, and we are in the process of getting them more," Oracle's billionaire cofounder said.
"They were able to use it, but they want dramatically more as there's this gold rush towards building the world's greatest large language model," he added.
Ellison, who is one of Tesla's biggest shareholders, went on to say that Oracle is sparing no effort to give its customers what it can this quarter.
He also noted that Oracle is gearing up to dramatically increase its ability to give its customers "more and more capacity each succeeding quarter".
Responding to a Wall Street analyst's question about Oracle's plans to build 100 new cloud data centres, up from around 66 today, Ellison revealed that Microsoft alone has ordered 20 data centres to support its Azure cloud-computing platform.
Moreover, Oracle wants to build more cloud data centres to meet the demands of its direct customers, the top executive added.
In the meantime, Musk has purchased 10,000 graphics processors for xAI, according to an earlier report by Business Insider's Kali Hays.
"We're using a lot of Nvidia hardware. We'll actually take it as fast as they'll deliver it to us. Frankly, if they could deliver us enough GPUs, we might not need Dojo. But they can't. They've got so many customers," Musk said on an earnings call in July.