
Decentralized Inference Clusters

Distributing AI compute power at scale



The evolution of decentralized artificial intelligence (AI) systems marks a significant leap towards democratizing AI technologies. At the forefront of this transformative wave is the AI Protocol V3, which introduces Decentralized Inference Clusters built on a Decentralized Physical Infrastructure Network (DePIN) model. This section explores the pivotal role of DePIN within the AI Protocol's Hive infrastructure, detailing its operation, contribution mechanism, and incentive structure.

Defining DePIN Systems

A Decentralized Physical Infrastructure Network (DePIN) system forms the architecture of the AI Protocol V3 tech stack. It represents a paradigm shift from traditional, centralized AI computation models to a distributed framework in which AI inference tasks are executed across a network of decentralized nodes. This model not only enhances the efficiency and scalability of AI services but also broadens accessibility and opportunity for users and compute providers.
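To make the idea concrete, the sketch below shows, in plain TypeScript, how an inference task could be routed to one of several independently operated nodes instead of a single centralized endpoint. The types, node fields, and random selection policy are illustrative assumptions, not part of the AI Protocol specification.

```typescript
// Illustrative only: routing an inference task across decentralized nodes.
// Names and fields are assumptions, not the AI Protocol's actual interfaces.

interface InferenceNode {
  nodeId: string;     // identifier of the decentralized compute node
  endpoint: string;   // off-chain endpoint the node exposes
  available: boolean; // whether the node is currently accepting tasks
}

interface InferenceTask {
  model: string;  // model the Hive serves
  prompt: string; // input payload
}

// Pick any available node; a production scheduler would weigh load, latency, etc.
function selectNode(cluster: InferenceNode[]): InferenceNode {
  const candidates = cluster.filter((n) => n.available);
  if (candidates.length === 0) throw new Error("no available inference nodes");
  return candidates[Math.floor(Math.random() * candidates.length)];
}

const cluster: InferenceNode[] = [
  { nodeId: "node-a", endpoint: "https://node-a.example", available: true },
  { nodeId: "node-b", endpoint: "https://node-b.example", available: false },
];

const task: InferenceTask = { model: "example-model", prompt: "Hello, Hive" };
console.log(`dispatching '${task.model}' task to ${selectNode(cluster).nodeId}`);
```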

The Hive's Decentralized Resource Clusters

Provide Offchain Compute Power With Onchain Intelligence Pods

The AI Protocol V3 democratizes participation in the DePIN system. Anyone with GPU power can contribute to any Hive's computational cluster, provided they connect an Intelligence Pod to that Hive. Intelligence Pods serve as gatekeepers to the system, ensuring that only authenticated hardware contributes to the DePIN clusters. This open participation model not only broadens the network's computational capacity but also fosters a diverse community of contributors.
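The gatekeeping role of Intelligence Pods can be pictured with the minimal sketch below. The classes and checks are hypothetical and only illustrate the rule that a provider joins a Hive's compute cluster by presenting an Intelligence Pod connected to that Hive.

```typescript
// Illustrative sketch only: a Hive admits GPU providers into its compute
// cluster only if they present an Intelligence Pod connected to that Hive.
// The types and checks are assumptions, not the protocol's actual interfaces.

interface IntelligencePod {
  podId: number;
  connectedHiveId: number | null; // on-chain connection, one Hive at a time
}

interface ComputeProvider {
  address: string; // provider's wallet address
  pod?: IntelligencePod;
}

class HiveComputeCluster {
  private members = new Set<string>();
  constructor(readonly hiveId: number) {}

  // Gatekeeping: reject providers whose Pod is missing or connected elsewhere.
  join(provider: ComputeProvider): boolean {
    const pod = provider.pod;
    if (!pod || pod.connectedHiveId !== this.hiveId) return false;
    this.members.add(provider.address);
    return true;
  }
}

const hive = new HiveComputeCluster(1);
console.log(hive.join({ address: "0xabc", pod: { podId: 7, connectedHiveId: 1 } })); // true
console.log(hive.join({ address: "0xdef" }));                                        // false
```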

Reward distribution to compute providers may vary from Hive to Hive. In Propolis, rewards are allocated relative to the Level of the Intelligence Pod: the higher the Level, the more the provider is rewarded.
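As a rough illustration of Level-based allocation, the sketch below splits an epoch's reward pool pro rata by Pod Level. The linear weighting is an assumption made for illustration; the only rule stated here is that higher Levels earn larger rewards.

```typescript
// Hypothetical reward split: each provider's share of an epoch's reward pool
// is proportional to its Intelligence Pod Level. Linear weighting is assumed.

interface PodContribution {
  provider: string;
  podLevel: number; // Level of the connected Intelligence Pod
}

function splitRewards(pool: number, contributions: PodContribution[]): Map<string, number> {
  const totalWeight = contributions.reduce((sum, c) => sum + c.podLevel, 0);
  if (totalWeight === 0) throw new Error("no contributions to reward");
  const shares = new Map<string, number>();
  for (const c of contributions) {
    shares.set(c.provider, (pool * c.podLevel) / totalWeight);
  }
  return shares;
}

// Example: a 1,000-token pool split between a Level 3 and a Level 1 Pod.
console.log(splitRewards(1000, [
  { provider: "0xabc", podLevel: 3 }, // receives 750
  { provider: "0xdef", podLevel: 1 }, // receives 250
]));
```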

Incentive Mechanisms: Driving Contribution Through Token Rewards

Contributors to the DePIN system can be rewarded for their participation through Hive-native Utility Token rewards and other tokenized incentive systems. These incentives recognize and compensate contributors for their computational power, encouraging sustained and active participation in the network. The AI Protocol V3 introduces a flexible and autonomous incentive framework, allowing each Hive to independently design its community's economy and reward system.

Hives can launch their own Hive-native Utility Token, provide liquidity for that token through a Liquidity Pool (LP), and design distribution models of their choice, such as staking rewards, airdrops, or direct token allocations.
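A Hive's economy design could be captured in a configuration like the hypothetical one below; the field names, token symbols, and program types are illustrative assumptions, not a protocol-defined schema.

```typescript
// Illustrative configuration shape (not a protocol-defined schema) showing
// how a Hive might declare its token, liquidity pool, and reward programs.

type RewardProgram =
  | { kind: "staking"; annualRatePct: number }
  | { kind: "airdrop"; totalTokens: number; snapshotBlock: number }
  | { kind: "allocation"; recipient: string; totalTokens: number };

interface HiveEconomyConfig {
  utilityTokenSymbol: string; // Hive-native Utility Token
  liquidityPool: { pairedWith: string; initialTokens: number };
  rewardPrograms: RewardProgram[];
}

const exampleConfig: HiveEconomyConfig = {
  utilityTokenSymbol: "HIVE",
  liquidityPool: { pairedWith: "ALI", initialTokens: 1_000_000 },
  rewardPrograms: [
    { kind: "staking", annualRatePct: 8 },
    { kind: "airdrop", totalTokens: 50_000, snapshotBlock: 19_000_000 },
  ],
};

console.log(JSON.stringify(exampleConfig, null, 2));
```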

Once a new Hive is created, the provision of resources to its ecosystem is incentivized through tokens and other rewards that encourage resource providers such as GPU owners, AI Model developers, dataset providers, and storage providers to join and contribute their resources to the Hive's ecosystem.

Any resource provider, whether a GPU owner or an AI Model developer, can dedicate their resources to any Hive of their preference. The on-chain elements of the AI Protocol, especially the various digital assets within the AI Protocol's ecosystem, enable the tracking and coordination needed for Hives to operate efficiently.

The AI Protocol V3 introduces the ability to connect digital assets with a Hive. When a resource provider connects on-chain with a Hive using an asset, it creates an on-chain record that the resource is dedicated and available to that particular Hive. A digital asset can only be connected to one Hive at a time. Connecting Intelligence Pods and ARKIVs to a Hive is also a form of "Direct Asset Governance": a way for Asset owners to signal their support for a Hive by broadcasting that their Asset is connected to their Hive of choice.
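The single-Hive constraint can be illustrated with a small in-memory registry. In the actual protocol this connection record lives on-chain, so the sketch below is purely illustrative and its names are assumptions.

```typescript
// Minimal in-memory sketch of the "one Hive at a time" rule for connected
// assets. In the protocol this record lives on-chain; this is illustrative.

type AssetId = string; // e.g. an Intelligence Pod or ARKIV identifier
type HiveId = number;

class AssetConnectionRegistry {
  private connections = new Map<AssetId, HiveId>();

  // Connecting an asset records its dedication to a single Hive.
  connect(asset: AssetId, hive: HiveId): void {
    if (this.connections.has(asset)) {
      throw new Error(`${asset} is already connected to Hive ${this.connections.get(asset)}`);
    }
    this.connections.set(asset, hive);
  }

  // Disconnecting frees the asset to be connected to a different Hive.
  disconnect(asset: AssetId): void {
    this.connections.delete(asset);
  }

  hiveOf(asset: AssetId): HiveId | undefined {
    return this.connections.get(asset);
  }
}

const registry = new AssetConnectionRegistry();
registry.connect("pod-42", 1);
console.log(registry.hiveOf("pod-42")); // 1
// registry.connect("pod-42", 2);       // would throw: already connected to Hive 1
```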

ALI Agents can access a Hive's AI services by tapping into its curated Decentralized Compute and Storage Clusters. Offchain resources can provide AI Services to a Hive through Pods and ARKIVs, to be distributed to onchain AI Assets.