Past/present/future timeline made me go to... Conway's Law.
Thinking (not a novel thought) aloud... it's a new angle for me.
70s style data processing was place-oriented, as in, literally a place (/the/ central computer /building/) gated by mainframe mages.
The question of /sharing data multilaterally between cooperating data systems/ did not exist. There was only the central computer. People put data (punch cards, tapes, print-outs, paper receipts, phone calls) in the building and carried answers out of it (also as punch cards, tapes, print-outs, paper receipts, phone calls). But they would only ever talk to the same computer.
As organisations grew more interconnected, organisational silos developed around the same central-store information structure, now embedded inside the organisation itself; it had been a very successful model of information management in 70s-style organisations, and remained so through the 90s (and, in several places, remains so today).
Simulacra <> Simulation.
The property-ownership assumption baked into database engines, therefore, would have been /non-sharing of schema/, not really non-sharing of data (data was being schlepped around using sneakernets).
And for this very reason, trying to build "local first" stuff around central-store data systems is fated to break down: that is a "shared-everything" world by default, so aspects of attribution, ownership, control, interpreting/lensing/parsing/slicing, etc. need to (somehow) flow along with the information. Turning the database inside-out alone is not enough.
I guess I'm saying... Ted Nelson will eventually have the last laugh.
In fact, that is my line of thinking, except "using whatever already exists on my computer(s)", which is: bash, sed, grep, jq, awk, pandoc, inotifywait, and xdotool.
The point being exactly to avoid whatever a third party may or may not deign to let me use, without hassle.
Its job is to help me make my website. Thus, its scope, (mis)feature set, and polish will always be production-grade, where production is "works on my machine(s)" :)
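For a flavour of what that looks like in practice, here's a minimal sketch of the kind of watch-and-rebuild loop those tools make possible. The directory names, layout, and pandoc flags are illustrative guesses, not my actual setup:

```bash
#!/usr/bin/env bash
# Watch a source tree and rebuild any markdown page that changes.
# (Hypothetical paths/flags; the point is just "inotifywait feeds pandoc".)
set -euo pipefail

SRC_DIR="src"       # assumed: markdown sources live here
OUT_DIR="public"    # assumed: built HTML goes here
mkdir -p "$OUT_DIR"

inotifywait -m -r -e close_write --format '%w%f' "$SRC_DIR" |
while read -r changed; do
  case "$changed" in
    *.md)
      out="$OUT_DIR/$(basename "${changed%.md}").html"
      pandoc --standalone "$changed" -o "$out"
      echo "rebuilt: $out"
      ;;
  esac
done
```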
- 1,600 read-after-write transactions/second, sequential (in a single process), append-only, no batching.
- With a separate (sequential) writer process and, concurrently, two reader processes, I'm seeing 400+ append transactions/second into the append-only table (no batching), and a combined 41,000 reads per second doing `select *` on the trigger-updated table. (A rough sketch of that setup is below.)
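Roughly, the shape of that test, assuming WAL mode, an append-only `log` table, and a trigger maintaining a `latest` table. Table names, pragmas, and loop counts here are stand-ins, not the actual harness, and the sketch ignores timing and process-spawn overhead:

```bash
#!/usr/bin/env bash
# One sequential appender, two concurrent readers, no batching:
# every INSERT is its own transaction. Names/counts are illustrative.
set -euo pipefail
DB="bench.db"

sqlite3 "$DB" <<'SQL'
PRAGMA journal_mode = WAL;   -- assumed: lets readers run alongside the writer
CREATE TABLE IF NOT EXISTS log    (id INTEGER PRIMARY KEY, body TEXT);
CREATE TABLE IF NOT EXISTS latest (k TEXT PRIMARY KEY, body TEXT);
CREATE TRIGGER IF NOT EXISTS log_to_latest AFTER INSERT ON log
BEGIN
  INSERT OR REPLACE INTO latest (k, body) VALUES ('last', NEW.body);
END;
SQL

writer() {                       # sequential appends, one transaction each
  for i in $(seq 1 1600); do
    sqlite3 "$DB" "PRAGMA busy_timeout=5000; INSERT INTO log (body) VALUES ('row $i');"
  done
}

reader() {                       # hammer the trigger-updated table
  for _ in $(seq 1 20000); do
    sqlite3 "$DB" "PRAGMA busy_timeout=5000; SELECT * FROM latest;" >/dev/null
  done
}

writer & reader & reader &
wait
```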
My schema is (deliberately) poor --- most of it is TEXT.
It employs "flexible typing", which does not mean "everything is text". What I am doing is writing fully denormalised text (strings) in most fields, with column type declared as TEXT.
This is deliberate, to emulate "whoops, if I screw up my types, how bad does it get?".
However, when written into the DB with some care, each value is stored per the following storage classes:
```
Each value stored in an SQLite database (or manipulated by the database engine) has one of the following storage classes:
NULL. The value is a NULL value.
INTEGER. The value is a signed integer, stored in 0, 1, 2, 3, 4, 6, or 8 bytes depending on the magnitude of the value.
REAL. The value is a floating point value, stored as an 8-byte IEEE floating point number.
TEXT. The value is a text string, stored using the database encoding (UTF-8, UTF-16BE or UTF-16LE).
BLOB. The value is a blob of data, stored exactly as it was input.
A storage class is more general than a datatype. The INTEGER storage class, for example, includes 7 different integer datatypes of different lengths. This makes a difference on disk. But as soon as INTEGER values are read off of disk and into memory for processing, they are converted to the most general datatype (8-byte signed integer). And so for the most part, "storage class" is indistinguishable from "datatype" and the two terms can be used interchangeably.
Any column in an SQLite version 3 database, except an INTEGER PRIMARY KEY column, may be used to store a value of any storage class.
All values in SQL statements, whether they are literals embedded in SQL statement text or parameters bound to precompiled SQL statements have an implicit storage class. Under circumstances described below, the database engine may convert values between numeric storage classes (INTEGER and REAL) and TEXT during query execution.
```
(edits: formatting, clarify what I'm doing v/s what SQLite does)
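To make the flexible-typing point concrete, here's a tiny check of where values land depending on the declared column affinity (table/column names are illustrative, not my actual schema):

```bash
sqlite3 :memory: <<'SQL'
CREATE TABLE t (txt TEXT, anything);  -- 'anything' has no declared type (BLOB affinity)
INSERT INTO t VALUES (42, 42);
INSERT INTO t VALUES ('42', '42');
INSERT INTO t VALUES (4.2, 4.2);
SELECT typeof(txt), typeof(anything) FROM t;
-- text|integer   <- TEXT affinity coerces the number to text; no affinity keeps INTEGER
-- text|text
-- text|real
SQL
```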
Apparently, there's actually a whole hit "reality" show that does it without fatalities: "Naked and Afraid". But the participants get training, and have an "out" back into civilisation.
So I can completely imagine them---the poor hapless tiktok influencers---meeting the unfortunate captive parrot's fate, if suddenly sent out into the maw of the wild, without any warning, preparation, or a way back for second dibs at a home.
Why I think this sort of "High-tech Computer Hardware Cottage Industry" stuff is significant (ignoring the fact that it's One Internet Rando versus One Trillion Dollars).
IMO, we --- as in someone somewhere who's seeing it coming --- stand to gain far greater indirect benefits, as and when GPU datacenter over-investments transmute into serving today's severely under-served but world-reforming science/industry application areas…
Think Massive GPU Infrastructure -> Industry application transmutations... "on-campus GPU supercomputers too cheap to meter".
My optimistic LLM-AI scenario is a hope that we get a version of what happened after the boom years of railroads, telecoms, and/or cloud computing (the last still in progress)… the decades after massive capital investments, whose implosion fuelled unprecedented large-scale industrial, economic, and socio-political phenomena, by way of infrastructure ownership re-allocations through write-offs, fire sales, and bankruptcy-style M&A.
A hope that we get a disintermediation of datacenters. Back to the neighbourhood VPS provider. People shipping out containers to private industry and universities and so forth — stacks of supercomputers in your backyard... A whole new breed of Oxide Computer Company companies.
But this dream-like phenomenon is not going to happen in places in poverty of hands-on, local-neighbourhood "Computer/GPU hardware mechanic" expertise. (A poverty that is tied to zoning laws, tariffs, import duties, and public policy --- are you pouring gobs of cash into making large datacenters, at the cost of all the other sides of the equation: education, training, small and medium businesses, precision manufacturing capacity, long-range sponsorship of the various sciences, R&D, arts, etc.?)
The revolutionary proliferation of mobile telephony in India (where I live), for example, rode---and continues to ride---largely on the back of the mobile phone cottage industry that sprang up around it.
Mom-and-pop shops that can do pretty much everything you need: repair, update, and un-bork your cell phone, your phone plans, your prepaid SIMs; print your documents and photos; fix your broken screens; replace bloated batteries; do "whatsapp agent" stuff (government paperwork). This has been an unbroken trend from the early days of the Nokia 3310 to the now-a-days of cheap, ubiquitous Android devices, with even "feature phones" participating in money flows via zero cost-to-consumer UPI payments.
A similarly revolutionary thing did not happen for computers in India, because of decades-long protectionist policies: high import duties ("luxury goods"), and regulatory capture by computer hardware distributors who still maintain a choke-hold on imports and supply. We do have an equivalent cottage industry of computer repair people, but it's nowhere close to the ubiquity it could have had, because it's just so damned hard to sell computer hardware in India.
I'm having a bit of an "HN moment"... I found out because I'd submitted an essay for the 2025 Berggruen Prize Essay competition too (aiming for last place --- no delusions of grandeur here, no siree). They just announced the results, and I'd noticed "Reppel" last night on hnpwd. I'd also submitted my site for hnpwd. And here we are.
> The English-language jury also awarded Honorable Mentions to Ian Reppel and Helen Yetter-Chappell, recognizing their essays for originality, clarity, and thoughtful engagement with the year’s theme.