Who will win the battle of open vs closed AI? [Q&A]


Closed AI products like Bard and ChatGPT (the latter, ironically, from OpenAI) have already delivered a practical, powerful chatbot experience and are being employed by many businesses.

Open AI, by contrast, is still in its early stages and has not seen wide adoption. We spoke to Mike Finley, CTO and co-founder of AnswerRocket, to find out how the two approaches differ and how they're set to develop.

BN: What are the differences between open and closed AI?

MF: There isn't an industry-wide standard definition for 'open' and 'closed' AI yet. The question is further muddied because one specific company, OpenAI, which sits at the heart of the AI explosion, actually falls under the umbrella of 'closed AI.' But generally speaking, the difference between open and closed AI is the same as the difference between any open and closed software.

Open AI refers to AI technologies that are developed and distributed under open-source licenses. Open-source AI encourages collaboration: anyone can use, modify, and contribute to the technology's development. With open AI, there's a commitment to transparency and accessibility in research and development. Like any other open source software, these technologies prioritize sharing research findings, code, and data openly with the broader AI community and the public.

Closed AI technologies are developed by large tech companies and kept as proprietary assets. Their source code is not openly shared, and access is typically restricted. Any given closed AI technology is controlled by a single organization. These technologies are created primarily for commercial purposes, with a focus on generating profit for the company that built them.

In terms of characteristics, closed AI technologies provide a premium, highly polished, easy-to-use product, while open AI technologies aim to provide cost efficiency, flexibility, and support for niche applications.
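As a rough illustration of that trade-off (not something Finley describes), the Python sketch below contrasts calling a hosted closed model through a vendor API with running an open-weight model locally. It assumes the openai SDK and Hugging Face transformers are installed, that an OPENAI_API_KEY is set in the environment, and that the model names are placeholders rather than recommendations.

```python
# A minimal sketch, assuming the openai SDK (v1+) and transformers are installed.
# Model names and the prompt are illustrative placeholders, not recommendations.

from openai import OpenAI
from transformers import pipeline

PROMPT = "Summarize last quarter's sales trends in two sentences."

# Closed AI: a polished, hosted service -- no infrastructure to manage,
# but the weights, pricing, and roadmap are controlled by the vendor.
client = OpenAI()  # reads OPENAI_API_KEY from the environment
closed_reply = client.chat.completions.create(
    model="gpt-4o-mini",  # example closed model name
    messages=[{"role": "user", "content": PROMPT}],
)
print(closed_reply.choices[0].message.content)

# Open AI: weights you download and run yourself -- more flexibility and
# control, but you own the hardware, tuning, and rough edges.
generator = pipeline("text-generation", model="meta-llama/Llama-3.1-8B-Instruct")
open_reply = generator(PROMPT, max_new_tokens=128)
print(open_reply[0]["generated_text"])
```

The practical difference is where the work lives: the closed path is one authenticated API call against infrastructure someone else runs, while the open path downloads weights and runs inference on hardware you manage yourself.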

BN: How mature are each of the approaches today?

MF: Closed AI has taken a considerable lead right out of the gate. Closed AI products like Bard and ChatGPT (the latter ironically from OpenAI) have already delivered a practical, powerful chatbot experience and are being leveraged by many businesses. Meanwhile, open AI offerings like Llama are rough around the edges, suffer from inferior performance, are hindered by disputes about ownership, and have been sparsely adopted within the enterprise.

BN: How will the competition between the two play out in the future?

MF: Open AI will catch up in a couple of years as community and vendor support grows. Ultimately, the race between the two will bear similarities to the iPhone vs. Android battle: Apple moves its technology forward aggressively and ensures that every aspect of the ecosystem is easy, polished, and as consumer-ready as it is expensive. Android, on the other hand, is far more open: it is moved forward by many vendors for many different uses and has rough edges that come from all that variety, but it is still considered optimal by the value-conscious, by anyone who wants to support niche applications, and by those who philosophically object to closed tech.

In a two-year timeframe, all of these differences will shake out because the underlying technology that is powering all of these models is fundamentally similar. But that's a commercially important timeframe and many enterprises will be forced to move forward with closed AI before the dust settles on open AI and those models are able to solve their challenges and commoditize.

BN: Once open AI has caught up a bit with closed AI, how should an enterprise decide between the two?

MF: Organizations should leverage closed AI if they need to support standard use cases and want a solution that they can quickly roll out and begin getting value from immediately. Adoption is easy and you don't need a team of experts to use the tech -- the companies that own closed AI technologies will work with customers to ensure they’re wielding them correctly and getting maximum business value.

Enterprises may want to use open AI if they require something that’s cheaper, allows experimentation, and prioritizes customizability. If you’re looking to support a lot of unique, uncommon use cases, then open AI might be for you. It's important to note, though, that even with more commoditization, open AI (like any open software) requires some in-house expertise to use effectively, especially for niche purposes.

BN: How can businesses future-proof their investment?

MF: In the near term, most businesses are probably going to want to use closed AI systems, which are stable, productive, can swiftly get enterprise use cases off the ground, are highly scalable (thousands of users or more), and already have a track record of delivering value. Many closed AI solutions offer attractive returns on modest investment. Moreover, adopting closed AI now does not preclude you from switching to open models later, if and when those models prove advantageous. Organizations should be launching and scaling pilot projects to discover the emergent value of AI now.
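One way to keep that later switch cheap, offered here as a sketch rather than anything from the interview, is to route model calls through a thin internal interface so that a closed vendor API used today can be replaced by a self-hosted open model later without rewriting application code. The ChatProvider interface and the model names below are hypothetical.

```python
# A minimal sketch of a provider-agnostic wrapper; class and model names are
# hypothetical, and real deployments would add retries, auth, and logging.

from abc import ABC, abstractmethod
from openai import OpenAI
from transformers import pipeline


class ChatProvider(ABC):
    """Narrow interface the rest of the application codes against."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class ClosedAPIProvider(ChatProvider):
    """Today: a hosted, closed model behind a vendor API."""

    def __init__(self, model: str = "gpt-4o-mini"):  # example model name
        self._client = OpenAI()  # reads OPENAI_API_KEY from the environment
        self._model = model

    def complete(self, prompt: str) -> str:
        resp = self._client.chat.completions.create(
            model=self._model,
            messages=[{"role": "user", "content": prompt}],
        )
        return resp.choices[0].message.content


class OpenWeightsProvider(ChatProvider):
    """Later: an open-weight model hosted in-house."""

    def __init__(self, model: str = "meta-llama/Llama-3.1-8B-Instruct"):
        self._pipe = pipeline("text-generation", model=model)

    def complete(self, prompt: str) -> str:
        out = self._pipe(prompt, max_new_tokens=256)
        return out[0]["generated_text"]


def answer(provider: ChatProvider, question: str) -> str:
    # Application code only sees ChatProvider, so swapping closed for open
    # AI is a configuration change rather than a rewrite.
    return provider.complete(question)
```

With this shape, moving from ClosedAPIProvider to OpenWeightsProvider once open models mature is a configuration change, which is what makes the 'adopt closed now, revisit open later' path low-risk.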

In parallel, organizations should stay engaged with the AI space because powerful open models will give closed models a run for their money. Expect something like the Linux evolution, where an open solution delivers compelling proof of what's possible, and then value-add service providers wrap that open tech in enterprise-friendly support and maintenance for a winning combination.

