The next step is spreadsheets, obviously. If you've played civ management games, you know they always devolve into spreadsheets. Europa Universalis is an excellent example, as are several mid-2000s spacefaring games. They always come back to spreadsheets for resource management, throughput analysis, etc.
I spent an afternoon or two whipping this up to scratch an itch. I'll admit it isn't my itch, but one I keep seeing pop up: demonetization, blurry or non-existent appeals processes, high bars for earnings, etc. What I've come up with is probably not revolutionary and has probably been tried before, but I figured it was at least worth sharing.
Naming things is hard, and this particular name is probably not the final name.
What lives behind the link is a vibe-coded example of the end application. It has all the basics, but pulls caching (CDN) and moderation into the main app. In the end product, these too will be services.
---
A quick overview of my idea:
# Agents
An agent is a discrete actor in the economic system whose roles and responsibilities are generally orthogonal to those of any other agent.
## Producer
The producer agent is the primary source of content in the system. Producers spend cash and time to create content.
## Consumer
The consumer agent is the primary beneficiary of a producer's efforts. Consumers spend time (and attention) watching content made by producers.
## Service Provider
Every other part of this micro-economy is some kind of service provider: content-delivery caching, transcription, moderation, legal compliance, and so on. All of them exist to supply the consumer with the producer's content.
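To make the taxonomy concrete, here's a minimal sketch of how these three roles might be modeled. The class names and fields are my own assumptions for illustration, not anything from the actual app.

```python
# Minimal sketch of the agent taxonomy above; names/fields are assumptions.
from dataclasses import dataclass
from enum import Enum, auto


class Role(Enum):
    PRODUCER = auto()          # creates content (spends cash and time)
    CONSUMER = auto()          # watches content (spends time and attention)
    SERVICE_PROVIDER = auto()  # caching, transcription, moderation, payments, ...


@dataclass
class Agent:
    agent_id: str
    role: Role
    balance: float = 0.0  # cash this agent currently holds inside the network


# Example agents reused in the sketches below:
producer = Agent("prod-1", Role.PRODUCER)
consumer = Agent("cons-1", Role.CONSUMER)
cdn = Agent("svc-cdn", Role.SERVICE_PROVIDER)
```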
# Economics
This application is designed to be a cash-for-services engine rather than a store of value like its crypto forebears. This means the health of the system can be measured directly by the flow of cash instead of by the amount of liquidity held.
Cash enters the network in one of two ways: advertising dollars paid to a producer in exchange for consumer view time, or consumer dollars paid to the producer for ad-free content.
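As a rough sketch of what "measure health by flow" could look like in practice (the `Ledger` type and the transaction kinds are assumptions on my part):

```python
# Sketch: system health is the total cash that moved, not the cash sitting still.
from dataclasses import dataclass, field


@dataclass
class Transaction:
    payer: str
    payee: str
    amount: float
    kind: str  # e.g. "ad_revenue", "subscription", "purchase", "service_fee"


@dataclass
class Ledger:
    transactions: list[Transaction] = field(default_factory=list)

    def record(self, payer: str, payee: str, amount: float, kind: str) -> None:
        self.transactions.append(Transaction(payer, payee, amount, kind))

    def flow(self) -> float:
        """Total cash that moved this period -- the health metric."""
        return sum(t.amount for t in self.transactions)


ledger = Ledger()
# The two ways cash enters the network:
ledger.record("advertiser-1", "prod-1", 120.00, "ad_revenue")  # ads paid for view time
ledger.record("cons-1", "prod-1", 4.99, "subscription")        # ad-free content
print(f"cash flow this period: ${ledger.flow():.2f}")
```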
## Producer -> Consumer Axis
The primary axis of this micro-economy is the exchange of goods and money between producers and consumers. Generally this is a one-way relationship. Consumers may 'subscribe' at the network level (covered later), subscribe at the producer level (a channel subscription), buy content one piece at a time, OR suffer advertisements and receive content for free.
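Here's how I'd sketch those four options; the enum and the helper are illustrative assumptions rather than the app's actual API:

```python
# The four ways a consumer can obtain content, as described above.
from enum import Enum, auto


class AccessMode(Enum):
    NETWORK_SUBSCRIPTION = auto()  # subscribe at the network level
    CHANNEL_SUBSCRIPTION = auto()  # subscribe to a single producer
    ONE_TIME_PURCHASE = auto()     # buy a piece of content outright
    AD_SUPPORTED = auto()          # watch free, with ads injected


def consumer_pays_cash(mode: AccessMode) -> bool:
    """Only the ad-supported path is cash-free for the consumer; there the
    advertiser pays instead (next section)."""
    return mode is not AccessMode.AD_SUPPORTED
```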
## Producer/Consumer <-> Advertiser Axis
That last option expands the producer-consumer economic axis into a producer/consumer-advertiser relationship, where the producer sells their viewers' time and attention to an advertiser in exchange for the advertiser injecting ads into the producer's content.
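A back-of-the-envelope version of that advertiser leg; the $5 CPM is a made-up rate purely for illustration:

```python
# Sketch: the producer sells viewer attention, priced per thousand ad impressions.
def ad_revenue(impressions: int, cpm_dollars: float = 5.0) -> float:
    """Cash paid by the advertiser to the producer for ad impressions."""
    return impressions / 1000 * cpm_dollars


# 40,000 ad-supported views with one ad each:
print(f"${ad_revenue(40_000):.2f}")  # -> $200.00
```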
## Producer <-> Service Providers
In the course of supplying content to consumers, producers require additional services; among these are UI, discovery (curation), moderation and legal compliance (where necessary), hosting (caches), and payment processing (getting cash into and out of the network). All of these draw on the producer's income, and only the producer's income.
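A tiny sketch of that settlement rule, with made-up fee amounts; the only point is that service fees come out of the producer's income and nowhere else:

```python
# Sketch: every service provider is paid out of the producer's gross income.
def settle_producer_income(gross_income: float, service_fees: dict[str, float]) -> float:
    """Deduct each provider's fee from the producer's gross income and
    return what the producer keeps."""
    total_fees = sum(service_fees.values())
    if total_fees > gross_income:
        raise ValueError("service fees exceed producer income")
    return gross_income - total_fees


net = settle_producer_income(
    gross_income=204.99,  # e.g. ad revenue plus a subscription
    service_fees={"cdn": 12.00, "moderation": 5.00, "payments": 6.15},
)
print(f"producer keeps ${net:.2f}")
```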
## Position Value Tax, or PVT
As service providers become more central, they are taxed for the privilege of benefiting from the network in a major way. The more central and heavily used a service provider is in the network, the higher its taxes are.
These taxes are held in a central treasury and split amongst all agents (producers and service providers) who actively contributed to the network during the previous period. This "Universal Participation Dividend" (UPD) is distributed at a set interval (TBD) and may be pro-rated within a given period.
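Here's a sketch of how PVT and the UPD could work together. The centrality measure (share of network usage), the tax curve, and the pro-rating weights are all assumptions on my part; the real mechanism is TBD:

```python
# Sketch: tax central service providers, pool it, redistribute pro-rated.
def position_value_tax(provider_revenue: float, usage_share: float) -> float:
    """Tax a service provider more heavily the more central it is.
    usage_share is the fraction of network traffic it handled (0..1)."""
    tax_rate = 0.05 + 0.25 * usage_share  # assumed curve: 5% floor, 30% ceiling
    return provider_revenue * tax_rate


def distribute_upd(treasury: float, activity: dict[str, float]) -> dict[str, float]:
    """Split the treasury among contributors (producers and service providers),
    pro-rated by how much of the period each was active."""
    total = sum(activity.values())
    return {agent: treasury * weight / total for agent, weight in activity.items()}


treasury = position_value_tax(10_000.0, usage_share=0.8)  # a very central CDN
payouts = distribute_upd(treasury, {"prod-1": 1.0, "prod-2": 0.5, "svc-mod": 1.0})
print(payouts)  # {'prod-1': 1000.0, 'prod-2': 500.0, 'svc-mod': 1000.0}
```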
I like doing goofy things with code. I wrote an s-expression parser in TeraTerm's macro language (a BASIC-like language). I came up with this generator-only recursive descent thing in Python. I never did anything with these except fiddle around and see what was possible. Goofy stuff in code makes me happy.
There are ways around this. You can push the success rate close to 100% if you use chain of thought and quorum selection. It isn't great, and it slows response times, but if 85% isn't good enough, you just need to flip the coin about five times to get nearly(!) guaranteed results.
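For the curious, the arithmetic behind "flip the coin about five times", assuming independent samples at 85% accuracy and a simple majority-vote quorum:

```python
# Probability that a majority of n independent 85%-accurate samples is correct.
from math import comb


def majority_vote_accuracy(p: float, n: int) -> float:
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n // 2 + 1, n + 1))


for n in (1, 3, 5, 7):
    print(n, f"{majority_vote_accuracy(0.85, n):.4f}")
# 1 -> 0.8500, 3 -> 0.9393, 5 -> 0.9734, 7 -> 0.9879
```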
Good insight here. We actually did not include thinking in this model, partly because we saw how incredibly fast it is to output an answer with the minimum number of tokens.
Thinking helps performance scores, but we'll leave it up to users to add additional tokens if they want. Our goal here was the leanest weights and token budget for blazing-fast performance for you all.
I would consider feature complete with robust testing to be a great proxy for code quality. Specifically: if a chunk of code is feature complete, well tested, and now changing slowly, it means (as far as I can tell) that the abstractions it contains are at least okay at modeling the problem domain.
I would expect that code which continually changes, deprecating old features and creating new ones, is still looking for a good fit with its problem domain.
Most of our customers are enterprises, so I feel relatively comfortable assuming they have some decent testing and QA in place. Perhaps I am too optimistic?
That sounds like an opportunity for some inspection: coverage, linting (type checking??), and a by-hand spot check to assess the quality of the testing. You might also inspect the QA process (ride along with folks from QA).
This has been my major concern, so much so that I'm going to be launching a tool to handle this specific task: agent conception and testing. There is so little visibility in the tools I've used that debugging is just a game of whack-a-mole.