We use the word “tokens” in two completely different technological contexts. In cryptocurrency, tokens represent value—coins, assets, ownership rights. In large language models like GPT-4 and Claude, tokens represent computation—the atomic units of text processing that determine API pricing.
Both systems independently converged on “tokens” as their fundamental unit of exchange. That linguistic parallel might reveal something deeper about how we’re rebuilding value systems in computational terms.
The Functional Parallel
When you buy cryptocurrency tokens, you’re acquiring units of value that can be exchanged, stored, or used to access services. When you use an LLM API, you’re spending tokens—computational units that determine how much processing you get.
The exchange mechanism looks remarkably similar:
- You acquire tokens through purchase or allocation
- You spend tokens to access a service
- The service provider receives tokens as payment
- Scarcity creates pricing pressure
The difference is what you’re purchasing: financial utility (access to blockchain networks, smart contracts, digital assets) versus computational utility (text generation, reasoning, content creation).
But the economic primitive? Nearly identical. Both systems discovered they needed discrete, countable units to meter access to previously un-metered resources.
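To make the parallel concrete, here is a minimal sketch of that shared primitive: a generic meter with hypothetical names (not any real system’s API), where the acquire/spend/scarcity logic is identical whichever resource the units denominate.

```python
from dataclasses import dataclass, field

@dataclass
class TokenMeter:
    """A generic meter: the same acquire/spend primitive underlies both
    crypto tokens and LLM API tokens. All names here are illustrative."""
    unit: str                                   # e.g. "gas" or "llm_token"
    balances: dict = field(default_factory=dict)

    def acquire(self, account: str, amount: int) -> None:
        """Purchase or allocation: credit tokens to an account."""
        self.balances[account] = self.balances.get(account, 0) + amount

    def spend(self, account: str, amount: int) -> None:
        """Metered access: debit tokens; scarcity is enforced here."""
        if self.balances.get(account, 0) < amount:
            raise ValueError(f"insufficient {self.unit}")
        self.balances[account] -= amount

# The same primitive meters two very different resources:
gas = TokenMeter(unit="gas")
gas.acquire("alice", 21_000)
gas.spend("alice", 21_000)           # the gas cost of a simple ETH transfer

inference = TokenMeter(unit="llm_token")
inference.acquire("alice", 1_000)
inference.spend("alice", 750)        # one API call's worth of text processing
```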
Why Both Systems Chose “Tokens”
The word “token” carries specific meanings in both computer science and economics: a discrete, countable unit that can be exchanged for something else.
In programming, tokenization means breaking input into atomic units—the lexical analysis phase of compilation. In economics, tokens are physical objects representing value: poker chips, arcade tokens, subway tokens.[^1]
When blockchain architects designed cryptocurrency systems in the late 2000s, they needed terminology for “transferable unit of value that’s not technically money.”[^2] Token worked because it implied something symbolic—not the thing itself, but a representation of the thing. When Ethereum introduced smart contract tokens in 2015, the abstraction layer became explicit: ERC-20 tokens weren’t trying to be money; they were programmable representations of value.
When AI researchers designed transformer-based language models, they needed terminology for “atomic unit of text processing.” Token worked because it meant the smallest processable chunk of input. OpenAI’s GPT-2 paper (2019) formalized this: byte-pair encoding creates a vocabulary of tokens that the model learns to predict.[^3]
Both communities faced the same challenge: how do you break continuous concepts (value, language) into discrete, countable units?
The answer: tokenization.
And once you tokenize something, you can measure it. Once you measure it, you can price it. Once you price it, it becomes currency.
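You can watch the first step of that pipeline with OpenAI’s open-source tiktoken library, which ships the GPT-2 byte-pair-encoding vocabulary described in footnote 3:

```python
# pip install tiktoken
import tiktoken

enc = tiktoken.get_encoding("gpt2")  # the BPE vocabulary from the GPT-2 paper
tokens = enc.encode("Tokenization turns continuous language into countable units.")

print(tokens)       # a list of integer token ids: discrete and countable
print(len(tokens))  # countable means meterable: billing is len(tokens) * rate
print(enc.n_vocab)  # 50257, the vocabulary size cited in footnote 3
```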
[^1]: The use of tokens as value placeholders dates to at least Roman tesserae (small tokens used for theater admission and food rations). Modern examples like subway tokens emerged in the 1950s as a way to pre-sell access and reduce cash handling.
[^2]: Ethereum’s ERC-20 token standard (2015) formalized this, creating a protocol for fungible tokens on the blockchain.
[^3]: Radford, A., et al. “Language Models are Unsupervised Multitask Learners.” OpenAI (2019). The paper describes how GPT-2 uses byte-pair encoding to create a vocabulary of 50,257 tokens.
What This Reveals About Value
Here’s where the parallel becomes more interesting.
When you spend an Ethereum token, you’re not moving a physical coin—you’re updating a distributed ledger through cryptographic consensus. The “value” exists as agreement among network participants about what that ledger says.[^4]
When you spend an LLM token, you’re not using a physical resource—you’re triggering matrix multiplications across GPU clusters. The “value” exists in the resulting text generation that those computations produce.
In both cases, the token creates an abstraction layer between human intent and computational reality. You don’t need to understand elliptic curve cryptography to send ETH. You don’t need to understand transformer architectures to use ChatGPT. You just spend tokens.
This might suggest something about where value systems are heading: as more human activity becomes mediated by computation, more of our economy gets denominated in computational units rather than physical ones.
The progression from “money backed by gold” to “money backed by government decree” to “money backed by computational consensus” could be less about trust degradation and more about the natural digitization of value systems. Whether that’s desirable is a different question—but the direction seems clear.
[^4]: Blockchain consensus mechanisms—from Bitcoin’s proof-of-work to Ethereum’s proof-of-stake—create value through distributed agreement. Ethereum’s ERC-20 standard formalized programmable tokens in 2015, enabling smart contract-based value systems beyond simple currency transfer.
Scarcity as Design Choice
Both systems use scarcity to create value, but implement it through different mechanisms.
Cryptocurrency tokens achieve scarcity through:
- Burn mechanisms (Ethereum’s EIP-1559 destroys ETH with each transaction)[^5]
- Protocol limits (Bitcoin’s 21 million coin cap, fixed token supplies)
- Staking requirements (proof-of-stake’s economic security model)
LLM tokens achieve scarcity through:
- Rate limits (requests per minute restrictions; see the sketch after this list)
- Computational costs (GPU processing time)
- Context windows (finite sequence lengths models can process)
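The most visible of these is rate limiting, commonly implemented as a token bucket (the naming collision is apt). A minimal sketch, with illustrative numbers rather than any provider’s actual limits:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: holds up to `capacity` tokens,
    refilled at `rate` tokens per second. Numbers are illustrative."""
    def __init__(self, capacity: float, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False  # request rejected: scarcity enforced in software

limiter = TokenBucket(capacity=60, rate=1.0)  # roughly 60 requests per minute
```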
In both cases, scarcity is largely a design choice rather than a hard physical constraint.
Ethereum could have been designed without token burns—the deflationary mechanism is programmatic, not inevitable. LLM providers do face real GPU costs, but the specific rate limits and pricing tiers are choices; without them, demand would overwhelm supply and make the service unusable for everyone.
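The burn side is simple arithmetic. Under EIP-1559, the protocol-set base fee portion of each transaction’s gas cost is destroyed and only the tip goes to the validator. With illustrative fee levels:

```python
# EIP-1559 fee split for a simple ETH transfer (fee levels are illustrative)
gas_used = 21_000            # gas consumed by a standard transfer
base_fee_gwei = 20           # protocol-set base fee per gas, burned
priority_fee_gwei = 2        # tip per gas, paid to the validator

GWEI = 10 ** -9              # 1 gwei = 1e-9 ETH
burned_eth = gas_used * base_fee_gwei * GWEI      # 0.00042 ETH destroyed
tip_eth = gas_used * priority_fee_gwei * GWEI     # 0.000042 ETH to validator

print(f"burned: {burned_eth:.6f} ETH, tip: {tip_eth:.6f} ETH")
# Every transaction shrinks supply slightly: scarcity by design, not physics.
```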
Scarcity creates markets. Markets create prices. Prices signal value. The token—whether crypto or AI—is the mechanism that enables this economic coordination.
[^5]: Ethereum implemented EIP-1559 in 2021, which burns a portion of transaction fees, making ETH potentially deflationary. This transformed gas fees from pure miner revenue into a supply-reduction mechanism.
The Convergence Thesis
The stronger claim here is that we might be heading toward infrastructure that processes both financial and computational tokens in integrated ways.
The technical requirements overlap:
- Payment channels for crypto tokens need low-latency micropayment processing
- Streaming protocols for LLM tokens need real-time metering and billing
- Both require high-throughput transaction handling at sub-cent granularity
The HTTP 402 Payment Required status code has been reserved since 1997 but never given standardized payment semantics. Recent proposals like x402 and micropayment APIs suggest renewed interest.
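As a sketch of what a 402 flow could look like, using Flask with a hypothetical header name and a placeholder payment check (this is not the actual x402 specification):

```python
# pip install flask -- a hypothetical 402 flow, not the actual x402 protocol
from flask import Flask, jsonify, request

app = Flask(__name__)
PRICE_USD = "0.001"  # illustrative per-request price

def payment_is_valid(proof: str | None) -> bool:
    """Placeholder: a real implementation would verify an on-chain payment
    or a signed payment-channel receipt."""
    return proof is not None and proof.startswith("paid:")

@app.route("/article")
def article():
    proof = request.headers.get("X-Payment-Proof")  # hypothetical header
    if not payment_is_valid(proof):
        resp = jsonify(error="payment required", price_usd=PRICE_USD)
        resp.status_code = 402  # the status code reserved since 1997
        return resp
    return jsonify(content="the metered article text")
```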
A hypothetical closed-loop economy could work like this:
A content creator publishes an article to Arweave (permanent decentralized storage). Readers pay in ETH or other tokens (via Ethereum L2s like Arbitrum or Optimism) to access it. The article itself was generated by spending LLM tokens. The creator’s earnings go into an Ethereum wallet that can purchase more LLM tokens to generate more content. Financial, storage, and computational infrastructure become interoperable.
The primitives exist today:
- Ethereum L2 solutions (Arbitrum, Optimism, Base) enable fast, low-cost token transactions
- Arweave provides permanent, pay-once storage for content with crypto payments
- Lightning Network enables Bitcoin micropayments as an alternative payment rail
- OpenAI and Anthropic sell LLM tokens as prepaid computational credits
- WordPress plugins like Token Access accept cryptocurrency for content access
- LangChain enables programmatic LLM workflows
Whether these pieces actually converge depends on regulatory clarity, user experience friction, and whether the value of micropayment economies justifies the infrastructure complexity. The technical feasibility exists; the economic viability is less certain.
Mental Models for Understanding Token Economics
This parallel between crypto tokens and LLM tokens connects to several frameworks for understanding how digital resources become economic primitives:
1. Commoditization of Computation
Benedict Evans argues that technology becomes infrastructure when it’s cheap, abundant, and invisible—you stop thinking about it and just use it. Electricity followed this path; so did cloud computing.
Tokens (both crypto and AI) might represent the next step: the financialization of computational infrastructure. Just as electricity grids created metered pricing per kilowatt-hour, token-based systems create metered pricing per unit of blockchain consensus or LLM inference. The resource becomes a tradeable commodity with transparent pricing.
Practical implication: If computation continues commoditizing, expect more services to adopt token-based pricing. Cloud providers already do this with credits; APIs increasingly do this with usage-based billing. Tokens make the invisible visible through measurement and pricing.
2. Medium of Exchange Theory
Classical economics defines money through three functions: medium of exchange, store of value, and unit of account. Tokens fulfill all three—whether they represent cryptocurrency or computation.[^6]
The abstraction layer matters: you don’t need to understand SHA-256 hashing or transformer architectures to spend tokens. This abstraction enables tokens to function as money even when the “backing” is purely computational rather than physical.
Practical implication: For tokens to succeed as money, they need stability and predictability. Crypto tokens struggle with this (price volatility). LLM tokens handle it better (stable pricing per token). Adoption depends on which model wins user trust.
3. Attention Economy (Davenport & Beck)
Thomas Davenport and John Beck argued that attention—not information—is the scarce resource in the digital age. We’re drowning in information but starving for the time to process it.
Tokens mediate this scarcity: you spend computational tokens to generate content (reducing production time) and financial tokens to access it (signaling quality through price). Attention becomes the ultimate bottleneck, with tokens as the exchange mechanism.
Practical implication: Content creators compete for attention using token-gated access, premium tiers, and pay-per-view models. Readers use price as a quality filter. The attention economy needs both content tokens (creation) and access tokens (consumption).
4. Protocols Not Platforms (Mike Masnick)
Mike Masnick’s thesis is that decentralized protocols enable innovation without centralized control, unlike platforms that gatekeep access.
Both crypto tokens and LLM tokens exist within protocol-defined economies: Ethereum’s ERC-20 standard determines token validity and smart contract interactions; OpenAI’s API specifications determine token pricing. Protocols define scarcity, exchange, and value without requiring centralized gatekeepers—though in practice, many “decentralized” systems still have significant centralization.
Practical implication: Protocol-based token economies enable permissionless innovation but struggle with coordination problems. Platform-based token economies enable better UX but create single points of failure. Hybrid models (e.g., stablecoins on Ethereum, LLM APIs with open-source alternatives) might navigate this tension.
Synthesis
Tokens could represent the financialization of computation within protocol-defined economies where scarcity creates value, abstraction enables exchange, and attention determines utility.
If crypto tokens and LLM tokens converge, we’re not just merging technologies—we’re creating an economic layer where intelligence, consensus, and value become different aspects of the same computational primitive. Whether that’s desirable or inevitable remains an open question.
[^6]: Cryptocurrency tokens fulfill monetary functions through different mechanisms: Bitcoin’s fixed supply (21M cap), Ethereum’s programmatic burn (EIP-1559), stablecoins’ fiat pegging. See Voshmgir’s “Token Economy” (2020) for comprehensive treatment of how tokens function as money in protocol-defined systems.
Where I Might Be Wrong
Let me be honest about the weaknesses in this argument.
The convergence assumption: I’m assuming that functional similarity implies eventual integration. But crypto tokens and LLM tokens might serve fundamentally different purposes that keep them separate. Financial infrastructure needs regulatory compliance, audit trails, and legal frameworks. Computational infrastructure needs low latency, high throughput, and developer ergonomics. These requirements might be incompatible enough that integration never makes economic sense, even if it’s technically feasible.
The closed-loop economy assumption: I’m assuming creators want to be paid in tokens they can spend on AI tools. But maybe they just want dollars—real, boring, regulated dollars with stable value and universal acceptance. Converting between crypto tokens, LLM tokens, and fiat currency introduces friction: transaction fees, tax reporting, volatility risk. Maybe that friction exceeds the benefits of closed-loop systems.[^7]
The scarcity-based pricing assumption: Both systems use token-based metering, but maybe that’s not optimal. LLM tokens could be priced on value delivered (how useful was the output?) rather than computational resources consumed (how many tokens processed?). Cryptocurrency tokens could prioritize utility over scarcity-driven speculation. Flat-rate subscriptions might work better than metered usage for both systems.
The linguistic coincidence objection: “Token” is just a word. Crypto tokens exist on blockchains; LLM tokens exist in neural networks. Different infrastructure, different economics, different regulatory frameworks. Maybe I’m seeing deep structural similarity where there’s only superficial resemblance. The parallel could be coincidental terminology rather than meaningful convergence.
The micropayment skepticism: I might be wrong about x402 micropayments becoming viable. Transaction costs—both computational (blockchain fees, payment processing) and cognitive (decision fatigue, mental accounting)—might always exceed the value of small payments. The 1990s dot-com era tried micropayments; they failed.[^8] Bundling and subscription models might always dominate because they reduce cognitive overhead and provide predictable revenue.
The alternative outcome: If I’m overestimating convergence or underestimating friction, crypto tokens and LLM tokens will remain parallel currencies in separate ecosystems—never merging, never interoperable, just coincidentally using the same word.
But even if they don’t converge technologically, they’re converging conceptually: both represent the quantification of previously unquantifiable resources (consensus, intelligence) into tradeable economic units. That shift—from analog to digital, from qualitative to quantitative, from scarce physical resources to artificially scarce computational resources—seems significant regardless of whether the token standards merge.
The uncertainty isn’t whether tokens matter. It’s whether they’ll integrate or remain parallel currencies serving different masters.
[^7]: Tax treatment of cryptocurrency remains complex. The IRS treats crypto as property, requiring capital gains reporting on every transaction. This creates significant compliance burden for micropayment use cases.
[^8]: Clay Shirky’s 2000 essay “The Case Against Micropayments” argued that mental transaction costs exceed the value of small payments. Twenty-six years later, despite improved technology, most content still uses subscriptions or advertising rather than pay-per-article models.
What You Can Experiment With Now
If this thesis interests you, here are concrete ways to explore token-based economies:
Understand your own token usage:
- Check your OpenAI or Anthropic API usage dashboard. How many tokens are you spending per month? What’s your cost per token?
- Calculate your “token efficiency”: useful output generated per 1,000 tokens spent. Are you prompting efficiently or wasting tokens? (A worked example follows this list.)
- Compare token pricing across providers (OpenAI, Anthropic, Google). Prices vary significantly; Anthropic’s Claude offers better value for certain workloads.
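A worked example for the first two bullets, using assumed prices; check your provider’s current rate card, since these numbers are illustrative rather than quoted:

```python
# Illustrative only: substitute your provider's actual per-token prices.
PRICE_PER_1K_INPUT = 0.003    # USD per 1,000 input tokens (assumed)
PRICE_PER_1K_OUTPUT = 0.015   # USD per 1,000 output tokens (assumed)

input_tokens, output_tokens = 120_000, 45_000   # say, a month's usage

cost = (input_tokens / 1000) * PRICE_PER_1K_INPUT \
     + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT
print(f"monthly spend: ${cost:.2f}")            # about $1.04 with these numbers

# "Token efficiency": useful outputs per 1,000 tokens spent
useful_outputs = 30   # e.g. drafts you actually kept
efficiency = useful_outputs / ((input_tokens + output_tokens) / 1000)
print(f"efficiency: {efficiency:.3f} useful outputs per 1K tokens")
```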
Experiment with crypto micropayments:
- Set up an Ethereum wallet (MetaMask, Rainbow) and try an L2 like Arbitrum or Base (a balance-check script follows this list)
- Send ETH or tokens to experience fast, low-cost transactions
- Try token-gated content platforms like Unlock Protocol or Mirror (both Ethereum-based)
- Explore Arweave for permanent content storage with crypto payments
- Optional: Set up a Lightning Network wallet for Bitcoin micropayments
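If you’d rather poke at an L2 programmatically, here is a minimal balance check with web3.py (assuming v6 or later); the public RPC endpoint and zero address are examples only:

```python
# pip install web3 -- assumes web3.py v6+
from web3 import Web3

# A public Arbitrum One RPC endpoint (an example; use your own provider in practice)
w3 = Web3(Web3.HTTPProvider("https://arb1.arbitrum.io/rpc"))
print(w3.is_connected())  # True if the endpoint responds

# Any address you want to inspect (the zero address is a placeholder)
addr = Web3.to_checksum_address("0x0000000000000000000000000000000000000000")
balance_wei = w3.eth.get_balance(addr)
print(Web3.from_wei(balance_wei, "ether"), "ETH")
```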
Build something:
- Create a simple script that tracks your LLM token usage and costs over time (a sketch follows this list)
- Build a token-gated blog post using WordPress plugins like Token Access (accepts ETH and ERC-20 tokens)
- Upload content to Arweave and gate access with Ethereum tokens
- Experiment with LangChain to create programmatic LLM workflows that track token consumption
- Try connecting MetaMask to an API service that accepts crypto payments
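For the first bullet, a sketch built on the OpenAI Python SDK (v1 or later); the model name and log path are placeholders:

```python
# pip install openai -- assumes the v1+ SDK and OPENAI_API_KEY in the environment
import csv
import datetime

from openai import OpenAI

client = OpenAI()
LOG_PATH = "token_usage.csv"  # placeholder path

def tracked_completion(prompt: str, model: str = "gpt-4o-mini") -> str:
    """Run a chat completion and append its token counts to a CSV log."""
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    usage = resp.usage  # prompt_tokens, completion_tokens, total_tokens
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.datetime.now().isoformat(), model,
            usage.prompt_tokens, usage.completion_tokens, usage.total_tokens,
        ])
    return resp.choices[0].message.content

print(tracked_completion("Summarize token economics in one sentence."))
```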
Read the primary sources:
- ERC-20 token standard: see how Ethereum standardized programmable tokens
- EIP-1559: understand Ethereum’s burn mechanism
- GPT-2 paper (24 pages): understand how tokens represent language through prediction
- Arweave yellow paper: permanent storage economics
- OpenAI’s tokenization tool: visualize how text becomes tokens
The goal isn’t to bet on convergence—it’s to understand how both systems work and where their economic primitives overlap. Even if they never integrate, understanding token economics helps you navigate an increasingly tokenized economy.
The Bigger Question
The real question isn’t whether crypto tokens and LLM tokens are similar—they clearly share structural parallels.
The real question is: what happens when more valuable resources get tokenized?
Your attention is already tokenized through advertising impressions and engagement metrics. Your reputation is tokenized through social media followers and karma scores. Your creativity is tokenized through content views and platform analytics. Your time is tokenized through gig economy platforms and hourly billing.
Crypto tokens tokenize value. LLM tokens tokenize intelligence. What’s next?
When everything becomes a token, everything becomes subject to similar economic logic: scarcity creates value, markets determine prices, optimization algorithms maximize utility.
That’s either liberating (efficiency, transparency, composability) or concerning (commodification, reductionism, loss of context). Probably both, depending on implementation and power dynamics.
The convergence of crypto tokens and LLM tokens—whether technological or just conceptual—signals something about how computational abstraction layers are mediating increasing portions of human activity.
“Tokens” might be what we call the currency of that economy. Whether that’s exciting or concerning depends largely on who controls the protocols and how much agency individuals retain within token-based systems.
The question isn’t whether tokenization happens. It’s whether it serves human flourishing or just system efficiency.
Looking Back from 2026
Writing this in January 2026, three years after ChatGPT’s mainstream emergence and eleven years after Ethereum launched, the linguistic parallel between crypto tokens and LLM tokens seems obvious. But the communities still barely overlap.
In 2021, “token” meant crypto speculation—NFTs, DeFi, memecoins. By 2023, “token” meant LLM usage—context windows, API limits, rate pricing. Different worlds, same word.
Now? Anthropic’s Claude, OpenAI’s GPT-4, and Google’s Gemini all price their APIs in tokens. Ethereum, Arweave, Bitcoin, and Solana all use tokens for value transfer (or storage, in Arweave’s case). The linguistic convergence is complete.
But the technological convergence? Still uncertain. x402 brings HTTP 402 Payment Required closer to reality, but real-world implementation remains limited. Crypto-for-LLM-access exists in experiments but hasn’t achieved mainstream adoption. The closed-loop content economy combining Ethereum payments, Arweave storage, and LLM tokens remains a hypothesis more than a reality.
The thesis feels plausible—both systems independently discovered they needed discrete, countable units to meter previously un-metered resources. Whether that parallel leads to integration or remains a curiosity depends on factors we can’t predict yet: regulatory frameworks, user experience friction, whether micropayments ever overcome their cognitive overhead problem.
Ask me again in 2029. By then we’ll know if this was prescient or just pattern-matching.
Find me: Contact form, @divydovy, hi@divydovy.com
Related reading:
- What You Need to Know About Web3 and WordPress – Web3 primitives in WordPress ecosystem
- WordPress & Blockchain: Disruption, Challenges, and Opportunities – Earlier exploration of decentralized value systems
- Future of WordPress: AI, Blockchain, and Beyond – Convergence of transformative technologies
Further reading:
- Token Economy: How the Web3 reinvents the Internet – Shermin Voshmgir’s comprehensive treatment of token economics
- Large Language Models: A New Compute Paradigm – Sequoia Capital on LLM economics
- The Attention Economy – Thomas Davenport on attention as currency
- Protocols Not Platforms – Mike Masnick on decentralized protocol economics