SubQuery Network is Pioneering Decentralized AI Deployments
Singapore, Singapore, August 21st, 2024, Chainwire
At the Web3 Summit in Berlin today, SubQuery introduced the next evolution of its decentralized network: decentralized Artificial Intelligence (AI) inference hosting. In a live demonstration, SubQuery’s COO, James Bayly, showcased the latest open-source Llama model operating seamlessly across a network of decentralized Node Operators on SubQuery’s internal test network.
SubQuery’s mission is to empower developers to shape the future through decentralization. The company is driving a movement to build the next generation of Web3 applications for millions of users, with decentralization as the guiding principle.
The SubQuery Network serves as a groundbreaking infrastructure layer, already supporting decentralized data indexers and decentralized RPCs—essential components for any developer building a decentralized application (dApp). SubQuery has demonstrated that it can serve as a viable alternative to centralized service providers, offering an open network where anyone can participate as a node operator or delegator.
Over the past year, the potential of AI to revolutionize industries, including Web3, has become increasingly apparent. SubQuery has been closely monitoring these developments and working behind the scenes to integrate AI into its network. “The web3 summit in Berlin, with its focus on decentralised technologies, is the perfect place for us to announce this next step, and demonstrate it live,” said James Bayly.
SubQuery’s focus within its network is on AI inference rather than model training. Inference involves using a pre-trained model to make predictions on new data, such as running AI models to respond to user prompts. “There are already some commercial services providing inference hosting for custom models, but not many yet in web3,” James noted. “It’s much more suitable for long-term hosting on our decentralised network.”
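For readers unfamiliar with the distinction, the sketch below shows what inference looks like in practice: a new prompt is sent to a pre-trained, open-source Llama model served behind an OpenAI-compatible endpoint, and the completion is read back. The endpoint URL, access token, and model name are illustrative placeholders, not SubQuery’s published API.

```python
# Minimal sketch: sending a prompt to a pre-trained open-source Llama model
# hosted behind an OpenAI-compatible chat-completions endpoint.
# The URL, token, and model name below are hypothetical placeholders.
import requests

ENDPOINT = "https://node-operator.example.com/v1/chat/completions"  # hypothetical
API_KEY = "your-access-token"                                       # hypothetical

payload = {
    "model": "llama-3-8b-instruct",  # an open-source pre-trained model
    "messages": [
        {"role": "user", "content": "Explain what AI inference means in one sentence."}
    ],
    "max_tokens": 128,
}

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()

# The model makes a prediction (the completion) for the new prompt --
# that request/response round trip is the inference step.
print(response.json()["choices"][0]["message"]["content"])
```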
The current market for AI inference services is dominated by large, centralized cloud providers, which charge exorbitant fees. “Solutions like OpenAI and Google Cloud AI are not only pricey, but are also using your private prompts and data to further train and enhance their closed-source proprietary models,” James explained. SubQuery aims to offer an accessible, affordable, and open-source alternative for hosting production AI models. “We hope that eventually, it takes no more than 10 minutes to get access to your own production-ready LLM model through our network,” he added.
“We’re concerned that if the world is forced to use closed source AI models, we’re giving those big corporations a bigger moat in which to train and enhance newer models, a vicious cycle giving power to the wrong people,” James stated. “By running on a decentralised network and distributing prompts across hundreds of node operators, no single party can track you or front-run you via your plaintext prompts or use them to build a moat against open source.”
The SubQuery Network will provide industry-leading hosting for the latest open-source models, enabling scalable AI services within Web3. By leveraging a community-driven approach, SubQuery will facilitate cost-effective and decentralized AI inference at scale across a network of independent Node Operators.
About SubQuery
SubQuery Network is innovating web3 infrastructure with tools that empower builders to decentralise the future – without compromise. Our flexible DePIN infrastructure network powers the fastest data indexers, the most scalable RPCs, innovative Data Nodes, and leading open source AI models. We are the roots of the web3 landscape, helping blockchain developers and their cutting-edge applications to flourish. We’re not just a company – we’re a movement driving an inclusive and decentralised web3 era. Let’s shape the future of web3, together.
Linktree | Website | Discord | Telegram | Twitter | Blog | Medium | LinkedIn | YouTube