Global technology groups and local start-ups are racing to open lucrative new markets in India with artificial intelligence platforms tailored to the wide variety of languages and industries in the world's most populous nation.
Microsoft and Google, alongside start-ups such as Silicon Valley-backed Sarvam AI and Krutrim, founded by Bhavish Aggarwal of the Indian mobility company Ola, are developing AI voice assistants and chatbots that can converse in languages such as Hindi and Tamil. The tools are aimed at India's fast-growing sectors, including its vast call center and customer service industry.
India has 22 official languages, of which Hindi is the most widely spoken, though academics believe the country's 1.4 billion inhabitants speak thousands of languages and dialects. Google on Tuesday released its Gemini AI assistant in nine Indian languages.
Microsoft's Copilot AI assistant supports 12 Indian languages. The company is also working on India-specific projects, including "tiny" language models developed at its research center in Bengaluru. These cheaper alternatives to the costly large language models (LLMs) that underpin generative AI can run on smartphones rather than in the cloud, making them more affordable and potentially better suited to countries with patchy internet access, such as India.
Puneet Chandok, Microsoft's president for India and south Asia, told the Financial Times the company intends to "make [AI] simple and easy to use and get it into the hands of all these customers and partners." He added that Microsoft is "adapting it to the Indian context, making it more relevant and precise."
Microsoft has also partnered with Sarvam AI. The Bengaluru-based start-up, founded only last year, is building a "full stack" of generative AI products for Indian businesses and has raised $41 million from investors including Menlo Park-based Lightspeed Venture Partners and Peak XV, formerly the India arm of Sequoia.
Hemant Mohapatra, a partner at Lightspeed, said investing in local AI start-ups is becoming increasingly important as governments push to build "sovereign AI", meaning AI that is trained and hosted domestically.
"The AI supply chain is starting to fragment," Mohapatra said. "If you're training a foundation model in India on Indian citizen data, audio, video, text and different languages, it has to be an Indian company, focused on Indian use cases, Indian-domiciled, Indian founders, and so on."
India's AI race is not about building LLMs from scratch to compete with industry leaders such as OpenAI; investors say the resources and capital required would be too great to justify.
Instead, companies such as Sarvam AI are focusing on adapting existing LLMs to work in Indian languages, and on voice rather than text, making them more useful in a country where many people prefer to speak rather than write.
Bejul Somaia, a partner at Lightspeed, said: "There's still a huge gap between these underlying models and real-world use cases in countries as complex as India. You're going to need to have a little ecosystem that emerges in a market like India to enable companies to use the underlying model capabilities."
Tanuja Ganu, a manager at Microsoft Research in Bengaluru, said another advantage of testing new tools and technologies in a country as large and diverse as India is that they can later be rolled out to other countries.
"It's about validating some of the technology in India and seeing how we can expand it to other parts of the world," she said, describing India as a test bed.