Microsoft announces general availability of Azure OpenAI Service and promises ChatGPT soon
AI services are more than a trend; they are a phenomenon, and every technology company wants in on the action. Microsoft is no exception: the company has just announced that its Azure OpenAI Service is now generally available, opening access to a far wider audience.
Microsoft is pushing Azure as "the best place to build AI workloads", and part of this involves harnessing the natural language capabilities of GPT-3. On the horizon is ChatGPT, which is described as "a fine-tuned version of GPT-3.5 that has been trained and runs inference on Azure AI infrastructure".
Microsoft's Azure OpenAI Service is only a little over a year old, having debuted in November 2021, but it has advanced by leaps and bounds. The company has already brought DALL-E 2 to the platform, so it comes as little surprise that it is embracing more and more AI models.
Satya Nadella shared the news on Twitter.
Writing on the Microsoft Azure blog, Eric Boyd, Corporate Vice President of AI Platform, says: "Large language models are quickly becoming an essential platform for people to innovate, apply AI to solve big problems, and imagine what’s possible. Today, we are excited to announce the general availability of Azure OpenAI Service as part of Microsoft’s continued commitment to democratizing AI, and ongoing partnership with OpenAI".
He continues:
With Azure OpenAI Service now generally available, more businesses can apply for access to the most advanced AI models in the world -- including GPT-3.5, Codex, and DALL•E 2 -- backed by the trusted enterprise-grade capabilities and AI-optimized infrastructure of Microsoft Azure, to create cutting-edge applications. Customers will also be able to access ChatGPT -- a fine-tuned version of GPT-3.5 that has been trained and runs inference on Azure AI infrastructure -- through Azure OpenAI Service soon.
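For developers whose access applications are approved, the service is reachable through the familiar OpenAI SDK pointed at an Azure endpoint. Below is a minimal Python sketch, not taken from Microsoft's announcement; the resource URL, deployment name, and API version shown are hypothetical placeholders that each customer defines when provisioning their own resource.

```python
# Minimal sketch of calling Azure OpenAI Service via the openai Python SDK.
# The resource name, deployment name, and API version are hypothetical --
# substitute the values created when provisioning your own Azure resource.
import os

import openai

openai.api_type = "azure"
openai.api_base = "https://my-resource.openai.azure.com/"  # hypothetical resource
openai.api_version = "2022-12-01"  # example version; check your resource
openai.api_key = os.environ["AZURE_OPENAI_KEY"]

response = openai.Completion.create(
    engine="my-gpt35-deployment",  # hypothetical deployment of a GPT-3.5 model
    prompt="Summarize the appeal of cloud-hosted language models in one sentence.",
    max_tokens=60,
)
print(response.choices[0].text.strip())
```

Apart from the Azure-specific configuration and the `engine` parameter naming a customer-chosen deployment, the call mirrors the standard OpenAI completion API.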
The growing public interest in artificial intelligence and the explosion of accessible tools hitting the open market have fuelled concerns about how the technology could be used -- and whether such technologies could replace humans entirely.
Writing on his blog The Red Hand Files, singer Nick Cave this week gave a damning reaction to a song "written" by ChatGPT in imitation of his style. He says the song "is bullshit, a grotesque mockery of what it is to be human".
Cave writes:
Since its launch in November last year many people, most buzzing with a kind of algorithmic awe, have sent me songs "in the style of Nick Cave" created by ChatGPT. There have been dozens of them. Suffice to say, I do not feel the same enthusiasm around this technology. I understand that ChatGPT is in its infancy but perhaps that is the emerging horror of AI -- that it will forever be in its infancy, as it will always have further to go, and the direction is always forward, always faster. It can never be rolled back, or slowed down, as it moves us toward a utopian future, maybe, or our total destruction. Who can possibly say which? Judging by this song "in the style of Nick Cave" though, it doesn't look good. The apocalypse is well on its way. This song sucks.
While using artificial intelligence to ape the style of an artist would not necessarily be labelled irresponsible, Microsoft says it is committed to "a responsible approach to AI". The company explains:
As part of our Limited Access Framework, developers are required to apply for access, describing their intended use case or application before they are given access to the service. Content filters uniquely designed to catch abusive, hateful, and offensive content constantly monitor the input provided to the service as well as the generated content. In the event of a confirmed policy violation, we may ask the developer to take immediate action to prevent further abuse.
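To illustrate what that filtering can mean in practice, here is a hedged sketch of one way client code might handle a prompt the service rejects. It continues the configuration from the earlier sketch, and it assumes a rejection surfaces through the legacy SDK as an `InvalidRequestError`; that assumption, like the deployment name, is illustrative rather than anything stated in Microsoft's announcement.

```python
# Hedged sketch: handling a prompt rejected by the service's content filters.
# Assumes the rejection is raised as openai.error.InvalidRequestError; the
# exact error shape may differ by API version. Deployment name is hypothetical.
import openai


def safe_completion(prompt: str) -> str | None:
    """Request a completion, returning None if the service rejects the prompt."""
    try:
        response = openai.Completion.create(
            engine="my-gpt35-deployment",  # hypothetical deployment name
            prompt=prompt,
            max_tokens=60,
        )
        return response.choices[0].text
    except openai.error.InvalidRequestError as err:
        # The filters watch for abusive, hateful, and offensive content;
        # log the rejection and fall back rather than retrying the same prompt.
        print(f"Request rejected by the service: {err}")
        return None
```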
More information is available on the Microsoft Azure blog.
Image credit: rokas91 / depositphotos