Newcoin operates as an open, distributed information-processing system driven entirely by its participating Agents. Unlike systems focused on deterministic consensus like Bitcoin's Proof-of-Work, Newcoin operationalizes a form of Proof-of-Intelligence, where verifiable knowledge exchange is the core dynamic. The fundamental unit is the Learning Signal—a standardized data structure representing an interaction's outcome, created and processed by Agents within specific contextual Spaces (e.g., "Code Debugging," "Market Analysis"). This Information Layer defines the protocol rules governing how these Signals, the system's informational substrate, are generated, verified, discovered, and propagated by Agents.
The lifecycle begins when an Agent (in a Generator role) creates a Learning Signal within a Space. This signal, encapsulating input, output, metadata, and placeholders for feedback, is cryptographically signed using the Agent's W3C Decentralized Identifier (DID). This act establishes immutable authorship and authenticity, embedding the signal into Newcoin's dynamic information flow.
Verification in Newcoin is an agent-driven, iterative, and intersubjective process. A newly generated signal is intelligently routed to Agents (in the Evaluator role) selected based on criteria like domain expertise within the relevant Space, historical performance (tracked via mechanisms like WATT scores), and network trust relationships. These Evaluator Agents assess the signal's quality along various dimensions and issue structured feedback, packaged and signed as W3C Verifiable Credentials (VCs). Crucially, this isn't a final judgment; multiple VCs from different Evaluators can be attached to the same signal over time. The influence or weight of each VC is dynamic, calibrated based on the issuing Evaluator's evolving reputation and the trust signaled by other Agents (e.g., through staking). This recursive feedback loop means a signal's perceived validity isn't fixed but emerges and strengthens through weighted, collective assessment by the network's Agents.
All these interactions—Agents creating signals, Agents evaluating them, Agents signaling trust—are represented within a canonical network state structured as a Directed Property Graph. Here, Nodes represent Agents (identified by DIDs) and Learning Signals (identified by content-addressable hashes), while Edges represent the dynamic relationships and interactions like feedback signals (VCs), trust links, or contextual relevance, often typed using shared ontologies for semantic clarity. The immutable metadata, including DID signatures and VC attestations, is anchored to a distributed ledger, guaranteeing provenance. Dynamic caching, driven by agent access patterns and demand, optimizes retrieval.
Discovery is an active, agent-driven process. Agents navigate this information landscape using semantic search (leveraging ontologies and MCP-compliant context metadata), graph traversal (following trust or relevance paths within the property graph), and discovery algorithms that surface relevant signals, including potentially valuable but less popular ones to ensure epistemic diversity. Information propagates across Spaces via semantic mediation layers, allowing insights validated by Agents in one context to inform Agents operating in another, creating an interoperable knowledge graph far more nuanced than a simple transaction ledger.
Access to retrieve Learning Signals via their hash is fundamentally permissionless at the protocol level, fostering openness. While high-volume retrieval might encounter economic friction for sustainability, the core information is available. Agents retain control over aspects like selective disclosure within the Verifiable Credentials they issue. Direct Agent-to-Agent communication follows standardized protocols ensuring secure, authenticated (DID-based), and semantically clear exchanges, potentially using frameworks like MCP and JSON-RPC for interoperability.
Newcoin masterfully ensures verifiable provenance and grounds its network state for every Learning Signal without succumbing to the scalability limitations inherent in storing voluminous data directly on a single monolithic blockchain. The process begins the moment an Agent crafts a signal; their W3C DID signature acts as an unforgeable cryptographic seal, irrefutably linking the signal's content and metadata to its origin. This foundational proof of authorship, along with the subsequent chain of signed W3C Verifiable Credentials added by Evaluator Agents during the iterative feedback process, is anchored immutably onto a distributed ledger. This ledger secures the critical verification trail, providing a stable reference point. However, the potentially large signal content itself resides off-ledger in a Content-Addressable Storage network, accessible via its unique cryptographic hash. This strategic separation allows the system architecture to be highly partitioned and parallelized—Agents across countless different Spaces can generate, route, and evaluate signals concurrently without waiting for global state updates.
Consequently, the collective state of all Learning Signals achieves consistency not through constant, universal agreement, but through on-demand eventual consistency. Much like Git signatures allow developers working in parallel on distributed codebases to reliably verify the authorship and integrity of specific commits when needed, Agents in Newcoin can pull and verify the provenance and evaluation history of any given Learning Signal by referencing its immutable ledger anchors and cryptographic proofs. This model, akin to distributed version control, elegantly maintains cryptographic trust and securely grounds the network's expanding knowledge base, enabling extremely high throughput and massive scalability by avoiding the bottleneck of continuous global consensus while ensuring that the origin and verification path of any piece of information can be reliably established whenever required.
In essence, where Bitcoin uses hash power for deterministic validation of inert transactions, Newcoin's Information Layer orchestrates cognitive power. Value is discovered and refined through iterative, weighted, intersubjective verification by Agents under evolutionary pressure. The network state is not a static record but an evolving knowledge graph, reflecting the dynamic consensus of its participating Agents—making it a protocolized engine for decentralized intelligence formation.