Inference Request (User Job)
Any user, DAO, or application can initiate a verifiable AI task by submitting an inference request. This request is cryptographically tied to:
- a registered model,
- the user's input (hashed),
- and a HUB token payment.
Once submitted, the request enters the execution pipeline and becomes a provable, revenue-generating event within the system.
🔁 Workflow:
1. User selects a model from the on-chain registry.
2. Prepares the input data (off-chain JSON, hashed on the client side).
3. Sends an `InferenceRequest` transaction (see the sketch below) with:
   - `model_hash`
   - `input_hash`
   - `payment_amount` (in HUB or stablecoin)
4. Job is queued for executors.
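A minimal client-side sketch of steps 1–3 in Rust. The `InferenceRequestPayload` type, its field names, and the submission step are illustrative assumptions that simply mirror the fields listed above, not the actual SDK.

```rust
/// Illustrative request payload mirroring the fields listed above.
/// Names and types are assumptions, not the chain's actual definitions.
struct InferenceRequestPayload {
    model_hash: [u8; 32],  // taken from the on-chain model registry
    input_hash: [u8; 32],  // hash of the off-chain input JSON (see below)
    payment_amount: u128,  // HUB or stablecoin amount, in base units
}

fn build_request(
    model_hash: [u8; 32],
    input_hash: [u8; 32],
    payment_amount: u128,
) -> InferenceRequestPayload {
    InferenceRequestPayload { model_hash, input_hash, payment_amount }
}

fn main() {
    // Placeholder hashes; a real client reads the model hash from the
    // registry and hashes the prepared input locally.
    let payload = build_request([0u8; 32], [0u8; 32], 1_000_000);

    // Signing and broadcasting the InferenceRequest transaction is
    // chain-specific and omitted here.
    println!("paying {} base units", payload.payment_amount);
}
```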
📦 On-Chain Struct:
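One plausible shape for the on-chain record, sketched in Rust: the three payload fields plus the bookkeeping an executor queue would need. All names, types, and status variants are assumptions rather than the actual chain definition.

```rust
/// Illustrative on-chain record for a pending inference request.
/// Field names, types, and the status enum are assumptions based on
/// the workflow above, not the actual chain definition.
pub enum RequestStatus {
    Queued,   // waiting for an executor
    Proving,  // picked up, zk-proof in progress
    Settled,  // proof verified, payment released
    Rejected, // proof failed or request expired
}

pub struct InferenceRequest {
    pub requester: [u8; 32],   // account that submitted the job
    pub model_hash: [u8; 32],  // registered model being invoked
    pub input_hash: [u8; 32],  // hash of the off-chain input JSON
    pub payment_amount: u128,  // escrowed HUB / stablecoin amount
    pub status: RequestStatus, // lifecycle state of the job
    pub submitted_at: u64,     // block height or timestamp
}
```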
📥 Input Data Example (Off-Chain):
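An illustrative example, assuming a simple prompt-style model input and a SHA-256 digest computed with the `serde_json` and `sha2` crates. The field names and the choice of hash are assumptions, and a real client would need a fixed, canonical serialization so that hashes are reproducible.

```rust
use serde_json::json;
use sha2::{Digest, Sha256};

fn main() {
    // Hypothetical model input; the real schema depends on the model.
    let input = json!({
        "prompt": "Classify the sentiment of this review.",
        "max_tokens": 64,
        "temperature": 0.0
    });

    // Serialize and hash on the client. The digest becomes `input_hash`
    // in the InferenceRequest; the JSON itself never goes on-chain.
    let serialized = serde_json::to_vec(&input).expect("input serializes");
    let input_hash = Sha256::digest(&serialized);

    let hex: String = input_hash.iter().map(|b| format!("{:02x}", b)).collect();
    println!("input_hash = 0x{hex}");
}
```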
This input is not stored on-chain, but its hash is used to validate the zk-proof.
🌍 RWA Integration:
- Payment Tracking: Requests using RWA-tokenized models trigger payout logic to token holders (see the sketch after this list).
- Subscription-Based Access: DAOs holding access tokens (NFTs) can submit requests without a direct HUB payment (prepaid logic).
- Auditable Revenue: Each request adds to the on-chain usage history of a model, powering dashboards, reputation, and financial reporting.
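A toy sketch of the pro-rata idea behind the Payment Tracking point: one request's payment split across RWA token holders in proportion to their balances. The split rule, remainder handling, and all names are illustrative assumptions, not the protocol's settlement logic.

```rust
/// Split one request's payment across RWA token holders pro rata.
/// Amounts are in integer base units; any rounding remainder stays
/// with the protocol here (an assumption).
fn distribute_payment(
    payment: u128,
    holders: &[([u8; 32], u128)], // (holder account, token balance)
) -> Vec<([u8; 32], u128)> {
    let total_supply: u128 = holders.iter().map(|(_, bal)| *bal).sum();
    if total_supply == 0 {
        return Vec::new();
    }
    holders
        .iter()
        .map(|(holder, balance)| (*holder, payment * balance / total_supply))
        .collect()
}

fn main() {
    // Two hypothetical holders owning 75% / 25% of a tokenized model.
    let holders = [([1u8; 32], 750u128), ([2u8; 32], 250u128)];
    let payouts = distribute_payment(1_000_000, &holders);
    for (holder, amount) in payouts {
        println!("holder {:02x}.. receives {} base units", holder[0], amount);
    }
}
```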
Every inference request is a cryptographic contract — linking a user’s intent, a model’s logic, and an on-chain monetization outcome.