Blockchain, the tech behind bitcoin, may have found its 'killer use case' by keeping AI in check
by Arjun Kharpal · CNBC

Key Points
- Blockchain could be used to keep a check on the data artificial intelligence models are being trained on to avoid issues like bias, executives told CNBC at the World Economic Forum in Davos.
- One of the concerns around the AI models — the kind that underpin applications like ChatGPT — is that the data they are trained on could contain biases or misinformation.
- AI training data can be put on the blockchain, allowing the developers of an AI system to keep track of the data the model has been trained on and mitigate those risks, the executives added.
DAVOS, Switzerland — Blockchain could be used to prevent bias in the data that artificial intelligence models are being trained on — and that could be a "killer use case" for the technology, executives told CNBC.
One of the concerns about the AI models — the kind that underpin applications like ChatGPT — is that the data they are trained on could contain biases or misinformation. That means the answers an AI system gives could reflect those biases and false information.
Blockchain hit the market in 2009 with the launch of the cryptocurrency bitcoin. In the context of bitcoin, the technology is an immutable, tamper-proof public ledger of transactions. Businesses have been looking to put those principles to use in other applications of blockchain, which is sometimes referred to as distributed ledger technology.
In the case of AI, training data can be put on the blockchain. That would allow the developers of an AI system to keep track of the data the model has been trained on.
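To make the idea concrete, here is a minimal sketch of what such a record-keeping system could look like: an append-only log in which each entry fingerprints a training dataset and links back to the previous entry, so later tampering would be evident. The class and field names are assumptions made for illustration; this is not the Casper Labs or IBM product.

```python
# Illustrative sketch only: an in-memory stand-in for an append-only ledger
# that ties each model version to a fingerprint of its training data.
import hashlib
import json
import time
from dataclasses import dataclass


@dataclass
class ProvenanceRecord:
    """One ledger entry linking a model version to the data it was trained on."""
    model_version: str
    dataset_hash: str      # fingerprint of the training data
    timestamp: float
    prev_record_hash: str  # links records into a tamper-evident chain


class TrainingLedger:
    """Append-only log of training runs; each entry hashes the one before it."""

    def __init__(self):
        self.records: list[ProvenanceRecord] = []

    def _hash_record(self, record: ProvenanceRecord) -> str:
        payload = json.dumps(record.__dict__, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

    def log_training_run(self, model_version: str, dataset_bytes: bytes) -> ProvenanceRecord:
        prev_hash = self._hash_record(self.records[-1]) if self.records else "genesis"
        record = ProvenanceRecord(
            model_version=model_version,
            dataset_hash=hashlib.sha256(dataset_bytes).hexdigest(),
            timestamp=time.time(),
            prev_record_hash=prev_hash,
        )
        self.records.append(record)
        return record


# Example: fingerprint a training corpus and record which model version used it.
ledger = TrainingLedger()
ledger.log_training_run("model-v1", b"training corpus snapshot ...")
ledger.log_training_run("model-v2", b"training corpus snapshot, expanded ...")
```

On a real blockchain, only the hashes and metadata would be written on-chain; the datasets themselves would sit in ordinary storage, with the chain serving as the proof of what was used and when.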
Casper Labs, a business-focused blockchain firm, partnered with IBM this month to create such a system.
"The product that we are developing, the datasets are actually checkpointed and stored on the blockchain so you have a proof of how the AI is trained," Medha Parlikar, chief technology officer and co-founder of Casper Labs, told CNBC during a panel discussion at the World Economic Forum in Davos this week.
"And so as you use the AI, if it's learning and you find that the AI is starting to hallucinate, you can actually roll back the AI. And so you can undo some of the learning and go back to a previous version of the AI."
Hallucinations broadly refer to when an AI system gives out false information.
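The rollback Parlikar describes could, in principle, work like the sketch below: model checkpoints are saved alongside the versions recorded on the ledger, and a problematic model is replaced by an earlier, verified one. The names here are hypothetical and the storage is a plain dictionary; a production system would retrieve artifacts referenced by the on-chain records.

```python
# Illustrative sketch only, continuing the hypothetical ledger example above.

class CheckpointStore:
    """Maps a model version recorded on the ledger to its saved weights."""

    def __init__(self):
        self._checkpoints: dict[str, bytes] = {}

    def save(self, model_version: str, weights: bytes) -> None:
        self._checkpoints[model_version] = weights

    def roll_back_to(self, model_version: str) -> bytes:
        # "Undo some of the learning": redeploy the weights that correspond
        # to an earlier entry on the training ledger.
        return self._checkpoints[model_version]


store = CheckpointStore()
store.save("model-v1", b"weights for v1")
store.save("model-v2", b"weights for v2")

# Suppose model-v2 starts to hallucinate; restore the earlier recorded version.
restored_weights = store.roll_back_to("model-v1")
```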
Blockchain is a technology that has been spoken about for many years, and a host of industries ranging from finance to health care have been looking at ways to use it.
Sheila Warren, the CEO of the Crypto Council for Innovation, said, however, that a blockchain-based AI training data ledger could be the "killer use case" for the technology.
"I actually do think that the verification of an AI and sort of the checks and balances ... within an AI system, are going to be blockchain driven and blockchain backed," Warren told CNBC during the panel.