When Meta released the raw computer code needed to build a chatbot to the public last year, rival companies said it was unleashing poorly understood and potentially dangerous technology.
Now Google is making a similar move, suggesting that those who oppose sharing A.I. technology are falling behind their rivals in the field. After keeping this kind of technology closely held for months, Google on Wednesday made the computer code that powers its online chatbot publicly available.
Like Meta, Google argued that the benefits of openly sharing the technology, known as a large language model, outweighed the potential risks.
In a blog post, the company announced the release of two A.I. language models that outside businesses and independent software developers can use to build chatbots resembling Google's own. Although the technologies, called Gemma 2B and Gemma 7B, are not Google's most powerful, the company said they were on par with many of the leading systems on the market.
"We're hoping to re-engage the third-party developer community and make sure that" Google-based models become the norm for how modern A.I. is built, said Tris Warkentin, a director of product management at Google DeepMind.
Google said it had no current plans to make its flagship A.I. model, Gemini, freely available. Because Gemini is more powerful, it could also do more damage.
This month, Google began charging for access to the most powerful version of Gemini. By offering the model as an online service, the company can keep tighter control over the technology.
Some companies, such as OpenAI, the maker of the online chatbot ChatGPT, have grown increasingly secretive about the methods and software behind their products, out of fear that A.I. technologies could be used to spread hate speech, misinformation and other harmful content.
Others, such as Meta and the French start-up Mistral, argue that freely sharing code, a practice known as open sourcing, is the safer approach because it lets outsiders identify problems with the technology and suggest fixes.
Yann LeCun, Meta's chief A.I. scientist, has argued that consumers and governments will refuse to embrace A.I. unless it is outside the control of companies like Microsoft, Google and Meta.
He asked The New York Times last year, “Do you want every A.I. system to be under the control of a couple of powerful American companies?”
Google once open sourced many of its leading A.I. technologies, including the core technology behind its chatbots. But in response to criticism from its rival OpenAI, it grew more opaque about how they were built.
The company decided to open source its A.I. again because of interest from developers, Jeanine Banks, a Google vice president of developer relations, said in an interview.
As it prepared to release its Gemma technologies, the company said it had taken precautions to ensure they were safe, and that using them to spread misinformation and other harmful content violated its software license.
"We make sure we're releasing completely safe approaches as much as possible, both within the proprietary sphere and within the open sphere," Mr. Warkentin said. "With the releases of these 2B and 7B models, we're relatively confident that we've taken an extremely safe and responsible approach in making sure that these can land well in the industry."
But bad actors could still use these technologies to cause problems.
Google is now allowing people to download algorithms that have been trained on enormous amounts of digital text culled from the internet. Researchers call this "releasing the weights," a reference to the particular mathematical values the system learns as it analyzes data.
Analyzing all that data typically requires hundreds of specialized computer processors and tens of millions of dollars. Most companies, let alone individuals, lack those kinds of resources.