I've been following a story where Elon Musk's xAI is building an 88-acre solar farm next to its Colossus data center near Memphis, TN, after public outrage over running 35 methane gas turbines without a permit, which allegedly increased NOx emissions enough to impact health:
88 acres = 356,124 m2
4.56 kWh/m2 per day solar insolation (4.5 is typical for much of the US)
4.56 kWh/m2 per day * 356,124 m2 ≈ 1,623,925 kWh/day ≈ 67,664 kW ≈ 67.66 MW average
1000 W/m2 * 356,124 m2 = 356 MW peak (incident sunlight, before panel efficiency)
They're estimating that they'll get 30 MW on average from that, but I'd estimate more like 15 MW at a solar panel efficiency just over 20%. Still, the total cost for that power should be less than for turbines, since solar is now the cheapest electricity other than hypothetical nuclear (assuming an ideal breeder or waste-consuming reactor and excluding mining/waste externalities/insurance).
30 MW is still only 10% of the 300 MW used by the data center. But there's lots of land out there, so roughly 1000 acres per data center doesn't seem that extreme to me. That's a 4 km2 or 1.5 mile2 lot, about 2 km or 1.25 miles on a side.
Basically every GPU server uses about 1 kW (roughly one space heater), which puts into perspective just how much computing power is packed into these data centers. Running one continuously at home would need 24 kWh/day; with >20% efficient panels that's 4.5 * 0.2 = 0.9 kWh/m2 per day, so 24 / 0.9 ≈ 26.7 m2, and at 2 m2 per commercial solar panel, assuming my math is right, that's about 13 to 14 panels to cover nights and seasons.
It's interesting to think just how many panels it takes to run a GPU or space heater continuously, even when each panel puts out 500 W (250 W/m2) at peak. And how cheap that electricity really is when it sells for on the order of $0.15 per kWh, or about $3.60 per day.
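If it helps, here's the whole back-of-the-envelope as a few lines of Julia, using the same assumptions as above (4.56 kWh/m2/day insolation, ~20% panel efficiency, 2 m2 panels, $0.15/kWh):

# Back-of-the-envelope check of the numbers above.
acres_to_m2 = 4046.86
farm_m2     = 88 * acres_to_m2                                  # ≈ 356,124 m2
insolation  = 4.56                                              # kWh/m2 per day
efficiency  = 0.20

farm_sun_kwh   = insolation * farm_m2                           # ≈ 1.62 million kWh/day of sunlight
farm_sun_mw    = farm_sun_kwh / 24 / 1000                       # ≈ 67.7 MW of average insolation
farm_output_mw = farm_sun_mw * efficiency                       # ≈ 13.5 MW average output at 20%

gpu_kwh_per_day  = 24.0                                         # one 1 kW server running continuously
m2_per_gpu       = gpu_kwh_per_day / (insolation * efficiency)  # ≈ 26 m2 of panels
panels_per_gpu   = m2_per_gpu / 2                               # ≈ 13 panels at 2 m2 each
gpu_cost_per_day = gpu_kwh_per_day * 0.15                       # ≈ $3.60/day at $0.15/kWh

println((farm_output_mw, panels_per_gpu, gpu_cost_per_day))

Which is where my ~15 MW estimate comes from: roughly 67.7 MW of average sunlight times ~20% panel efficiency, before packing and other losses.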
I've found that the very best way to save on your electric bill is to have a few south-facing slider doors and windows, which is like running a space heater for every square meter of window. There's just no way that any other form of power generation can compete with that. Also, I feel that we're doing it wrong with solar. This kind of analysis shows just how much cheaper alternatives like trough solar and concentrated solar (mirrors focused onto a receiver or onto panels) might be. On an ironic note, solar panels now cost less per area than windows, and probably less than mirrors too.
I'll never forget the feeling of the whoosh when I was working as a furniture mover in the early 2000s and felt the implosion when a cardboard box collapsed and dumped a large CRT TV face-down on the driveway, blowing our hair back. When the boss asked what happened to the TV, I said it fell, and our lead man (who had set it on the box) later thanked me for putting it so diplomatically.
That was nothing compared to the time the CAT scan machine fell face down off the lift gate on the back of the delivery truck because our driver pushed the wrong button and tipped it instead of lowering it, but I missed the flak from that because thankfully I was on a move somewhere else. Afterwards he was forever known as the quarter-million-dollar man.
Oops, just saw this! I never heard; the last time I saw it, it was sitting busted up against a wall with blankets thrown over it. The warehouse usually ate breakages because that was cheaper than insurance overhead in the long run.
The solution to authoritarian problems is to organize.
In this case, we're overdue for a service that we all pay into, like a collective credit card, that only continues making payments to companies like Amazon if all of the members are happy. When you get banned without due process, payments stop until the matter is resolved.
Also, the collective can bargain down rates. If it senses price increases beyond inflation, it just sends the adjusted amount, say 95%, until the matter is resolved.
We need this collective bargaining for housing (like tenant unions), the workplace, politics, pharmaceuticals, etc. The scale of this is so large that the collective could exist beyond any specific industry, operating as a meta-economy beside the so-called free-market economy (late-stage capitalism) that we live under today due to the lack of antitrust enforcement.
Groups like the Wellbeing Economy Alliance (WEAll) are working towards these sorts of goals on a number of fronts:
This is very hacker-like thinking, using tech's biases against it!
I can't help but feel like we're all doing it wrong against scraping. Cloudflare is not the answer, in fact, I think that they lost their geek cred when they added their "verify you are human" challenge screen to become the new gatekeeper of the internet. That must remain a permanent stain on their reputation until they make amends.
Are there any open source tools we could install that detect a high number of requests and send those IP addresses to a common pool somewhere? So that individuals wouldn't get tracked, but bots would? Then we could query the pool for the current request's IP address and throttle it down based on volume (not block it completely). Possibly at the server level with nginx or at whatever edge caching layer we use.
I know there may be scaling and privacy issues with this. Maybe it could use hashing or zero knowledge proofs somehow? I realize this is hopelessly naive. And no, I haven't looked up whether someone has done this. I just feel like there must be a bulletproof solution to this problem, with a very simple explanation as to how it works, or else we've missed something fundamental. Why all the hand waving?
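To make the idea concrete, here's a toy sketch in Julia of the shape I'm imagining. The Dict stands in for the shared pool, and names like report_ip! and throttle_factor are made up purely for illustration:

# Toy sketch of the shared-pool throttling idea (not a real service).
# A host reports IPs that exceed its local request threshold; every
# participating host can then query the pool and add delay in proportion
# to the report count, rather than blocking outright.
using SHA                          # hash IPs so the pool never stores raw addresses

const POOL = Dict{String,Int}()    # hashed IP => report count (stand-in for a shared DB)

hashed(ip::AbstractString) = bytes2hex(sha256(ip))

function report_ip!(ip)
    key = hashed(ip)
    POOL[key] = get(POOL, key, 0) + 1
end

# Map report counts to added delay instead of an outright block.
function throttle_factor(ip)
    n = get(POOL, hashed(ip), 0)
    n == 0 ? 0.0 : min(5.0, log2(1 + n))   # seconds of delay, capped at 5
end

report_ip!("203.0.113.7"); report_ip!("203.0.113.7")
println(throttle_factor("203.0.113.7"))    # ≈ 1.58 seconds of added delay

In practice something like this would sit behind nginx or the edge cache, and the hard parts are exactly the ones you'd expect: who runs the pool, how reports decay, and how you keep it from being poisoned.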
Your approach to GenAI scrapers is similar to our fight with email spam. The reason email spam got solved is that the industry was interested in solving it. But this issue has the industry split: without scraping, GenAI tools are less functional. And there is some serious money involved, so they will use whatever means necessary, technical and legal, to fight such initiatives.
I've been exploring decentralized trust algorithms lately, so reading this was nice. I have a similar intuition - for every advance in scraping detection, scrapers will adapt too, so it's an ongoing war of mutations with no real victor.
The internet has seen success with social media content moderation, so it seems natural enough that an analogous application could exist for web traffic itself: hosts being able to "downvote" malicious traffic, plus some sort of decay mechanism given IP recycling. This exists in a basic sense with known Tor exit nodes, known AWS and GCP IPs, etc.
That said, we probably don't have the right building blocks yet: IPs are too ephemeral, yet anything more identity-bound is a little too authoritarian IMO. Further, querying an external service on every request is probably too heavy.
Piggybacking also to say that I hope you succeed, as your work aligns closely with the type of runtime that I had hoped to write someday when I first used MATLAB in the early 2000s (now mostly GNU Octave for small hobby projects).
The loop fusion idea sounds amazing. Another point of friction I ran into is that MATLAB uses 1-based indexing instead of 0-based indexing for matrices/arrays, which can make porting code examples from other languages tricky. I wish there were a way to specify the index base with something like a C #define or compiler directive. Or a way to rewrite code in place to use the other base, a bit like running Go's gofmt to format code. Apologies if something like this exists and I'm just too out of the loop.
I'd like to point out one last thing, which is that working at the fringe outside of corporate sponsorship causes good ideas to take 10 or 20 years to mature. We all suffer poor tooling because the people that win the internet lottery pull up the ladder behind them.
Julia allows this via the OffsetArrays package, and the experience has been quite mixed, creating a new surface for bugs to appear. Used well, it can be very convenient for the reasons you state.
julia> using OffsetArrays
julia> A = collect(1:5)
5-element Vector{Int64}:
1
2
3
4
5
julia> B = OffsetArray(A, -1)
5-element OffsetArray(::Vector{Int64}, 0:4) with eltype Int64 with indices 0:4:
1
2
3
4
5
julia> A[1]
1
julia> B[0]
1
“So far we have not seen any evidence for a preferred chirality,” (Dan) Glavin says (important for understanding why amino acids on Earth seem to all be left-handed):
Life is probably abundant everywhere in the universe. Also, evolution seems to spring up everywhere, in any system of sufficiently advanced complexity, regardless of what substrate it operates on. So I think that we'll start seeing life-like emergent behavior in computing, especially quantum computing, in the next 5-10 years.
So the question becomes: what great filter (in the sense of the Drake equation and Fermi's paradox) causes life as we know it to go dark or wipe itself out just after it achieves sentience?
Well, we're finding out the answer right now. Life probably merges with AI and moves into what could be thought of as another dimension. Where time moves, say, a million times faster than our wall clock time, so that it lives out lifetimes in a matter of seconds. Life everywhere that managed to survive probably ascended when it entered the matrix. So that by now, after billions of years since the first life did this and learned all of the answers, we're considered so primitive that Earth is just a zoo for aliens.
Or to rephrase, omnipotent consciousness probably gets bored and drops out of the matrix periodically to experience mortal life in places like Earth. So simulation theory probably isn't real, but divine intervention might be.
> Life is probably abundant everywhere in the universe.
I'm not convinced of that. Yes, it seems like the building blocks are abundant, but there are so many steps beyond that to get to abundant life.
The first life we had in the Archaean era was dependent on sulfur, which was concentrated around volcanic vents, so this already presumes a lot, namely oceans and a geologically active planet. Those vents and weathering had also leached a bunch of minerals, like iron, into the water.
And then came cyanobacteria, which no longer needed volcanoes but had this annoying habit of producing a new waste product: oxygen. This both absolutely killed off the Archaean life and cleansed the oceans, as ions like iron precipitated into ferric oxide; we can see the layers of these cycles in the rock.
So the Earth needed all these elements and the Sun and Solar System needed to be sufficiently stable for billions of years just to get to this point and there are so many steps beyond this.
I personally believe it's more likely than not that we are the only potentially spacefaring civilization in our entire galaxy.
This all hinges on the presupposition that our solar system is unique in its configuration and location in the galaxy.
We haven't surveyed nearly enough other planetary systems to have any real idea if our system is unique. We barely have the ability to even see systems like ours in the first place. There's so little data available that it's not reasonable to draw a conclusion either way.
I've only been working with AI for a couple of months, but IMHO it's over. The Internet Age which ran 30 years from roughly 1995-2025 has ended and we've entered the AI Age (maybe the last age).
I know people with little programming experience who have already passed me in productivity, and I've been doing this since the 80s. And that trend is only going to accelerate and intensify.
The main point that people are having a hard time seeing, probably due to denial, is that once problem solving is solved at any level with AI, then it's solved at all levels. We're lost in the details of LLMs, NNs, etc, but not seeing the big picture. That if AI can work through a todo list, then it can write a todo list. It can check if a todo list is done. It can work recursively at any level of the problem solving hierarchy and in parallel. It can come up with new ideas creatively with stable diffusion. It can learn and it can teach. And most importantly, it can evolve.
Based on the context I have before me, I predict that at the end of 2026 (coinciding with the election) America and probably the world will enter a massive recession, likely bigger than the Housing Bubble popping. Definitely bigger than the Dot Bomb. Where too many bad decisions compounded for too many decades converge to throw away most of the quality of life gains that humanity has made since WWII, forcing us to start over. I'll just call it the Great Dumbpression.
If something like UBI is the eventual goal for humankind, or soft versions of that such as democratic socialism, it's on the other side of a bottleneck. One where 1000 billionaires and a few trillionaires effectively own the world, while everyone else scratches out a subsistence income under neofeudalism. One where as much food gets thrown away as what the world consumes, and a billion people go hungry. One where some people have more than they could use in countless lifetimes, including the option to cheat death, while everyone else faces their own mortality.
"AI was the answer to Earth's problems" could be the opening line of a novel. But I've heard this story too many times. In those stories, the next 10 years don't go as planned. Once we enter the Singularity and the rate of technological progress goes exponential, it becomes impossible to predict the future. Meaning that a lot of fringe and unthinkable timelines become highly likely. It's basically the Great Filter in the Drake equation and Fermi paradox.
This is a little hard for me to come to terms with after a lifetime of little or no progress in the areas of tech that I care about. I remember in the late 90s when people were talking about AI and couldn't find a use for it, so it had no funding. The best they could come up with was predicting the stock market, auditing, genetics, stuff like that. Who knew that AI would take off because of self-help, adult material and parody? But I guess we should have known. Every other form of information technology followed those trends.
Because of that lack of real tech as labor-saving devices to help us get real work done, there's been an explosion of phantom tech that increases our burden through distraction and makes our work/life balance even less healthy, alongside underemployment. This is why AI will inevitably be recruited to demand an increase in productivity from us for the same income, not to decrease our share of the workload.
What keeps me going is that I've always been wrong about the future. Maybe one of those timelines sees a great democratization of tech, where even the poorest people have access to free problem solving tech that allows them to build assistants that increase their leverage enough to escape poverty without money. In effect making (late-stage) capitalism irrelevant.
If the rate of increasing equity is faster than the rate of increasing excess, then we have a small window of time to catch up before we enter a Long Now of suffering, where wealth inequality approaches an asymptote making life performative, pageantry for the masses who must please an emperor with no clothes.
In a recent interview with Mel Robbins in episode 715 of Real Time, Bill Maher said "my book would be called: It's Not Gonna Be That" about the future not being what we think it is. I can't find a video, but he describes it starting around the 19:00 mark:
I started programming on an 8 MHz Mac Plus in the late 1980s and got a bachelors degree in computer engineering in the late 1990s. From my perspective, a kind of inverse Moore's Law happened, where single-threaded performance stays approximately constant as the number of transistors doubles every 18 months.
Wondering why that happened is a bit like asking how high the national debt would have to get before we tax rich people, or how many millions of people have to die in a holocaust before the world's economic superpowers stop it. In other words, it just did.
But I think that we've reached such an astounding number of transistors per chip (100 billion or more) that we finally have a chance to try alternative approaches that are competitive. Because so few transistors are in use per-instruction that it wouldn't take much to beat status quo performance. Note that I'm talking about multicore desktop computing here, not GPUs (their SIMD performance actually has increased).
I had hoped that FPGAs would allow us to do this, but their evolution seems to have been halted by the powers that be. I also have some ideas for MIMD on SIMD, which is the only other way that I can see this happening. I think if the author can reach the CMOS compatibility they spoke of, and home lithography could be provided by an open source device the way that 3D printing happened, and if we could get above 1 million transistors running over 100 MHz, then we could play around with cores having the performance of a MIPS, PowerPC or Pentium.
In the meantime, it might be fun to prototype with AI and build a transputer at home with local memories. Looks like a $1 Raspberry Pi RP2040 (266 MIPS, 2 core, 32 bit, 264 kB on-chip RAM) could be a contender. It has about 5 times the MIPS of an early 32 bit PowerPC or Pentium processor.
For comparison, the early Intel i7-920 had 12,000 MIPS (at 64 bits), so the RP2040 is about 50 times slower (not too shabby for a $1 chip). But where the i7 had 731 million transistors, the RP2040 has only 134,000 (not a typo). So 50 times the performance for over 5000 times the number of transistors means that the i7 is only about 1% as performant as it should be per transistor.
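Working that out explicitly in Julia, with the same rough figures:

# Per-transistor performance comparison using the figures quoted above.
rp2040_mips        = 266
rp2040_transistors = 134_000
i7_mips            = 12_000                               # i7-920 figure above
i7_transistors     = 731_000_000

speed_ratio      = i7_mips / rp2040_mips                  # ≈ 45x faster
transistor_ratio = i7_transistors / rp2040_transistors    # ≈ 5,455x more transistors
per_transistor   = speed_ratio / transistor_ratio         # ≈ 0.008, i.e. about 1%

println((speed_ratio, transistor_ratio, per_transistor))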
I'm picturing an array of at least 256 of these low-cost cores and designing an infinite-thread programming language that auto-parallelizes code without having to manually use intrinsics. Then we could really start exploring stuff like genetic algorithms, large agent simulations and even artificial life without having to manually transpile our code to whatever non-symmetric multiprocessing runtime we're forced to use currently.
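For a flavor of the ergonomics I'm after, here's what "write the loop, let the runtime spread it over the cores" looks like in Julia today with Threads.@threads. A real infinite-thread language would go much further than this sketch, but it shows the absence of manual intrinsics:

# Sketch: a loop parallelized without manual intrinsics.
# Run with `julia -t 8` (or however many cores you have) so the threads exist.
function step_agents!(agents::Vector{Float64})
    Threads.@threads for i in eachindex(agents)
        # stand-in for the per-agent work in a large agent simulation
        agents[i] = 0.99 * agents[i] + 0.01 * sin(i)
    end
    return agents
end

agents = rand(100_000)
step_agents!(agents)
println(Threads.nthreads(), " threads, first agent = ", agents[1])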
A mechanism for harm could be that glyphosate disrupts the gut lining barrier and flora, which can cause or contribute to leaky gut, a loose term for digestive waste and foreign bodies entering the bloodstream.
Those bodies can cause chronic inflammation and the strange autoimmune disorders we see rising over time. Note that some brands like Cheerios (which don't sell an organic equivalent) can contain 700-800 ppb of glyphosate, well over the 160 ppb limit recommended for children by the Environmental Working Group (EWG).
US wheat and other crops seem to have become harder for some people to digest due to genetic tampering. They contain substances borrowed from other species to reduce pest damage, which the body has little or no experience with and which may trigger various reactions (this has not been studied enough to be proven yet).
All of these effects from gut toxicity could lead to ailments like obesity, malnourishment, cardiovascular disease, maybe even cancer. This is why I worry that GLP-1 agonists may be masking symptoms, rather than healing the underlying causes of metabolic syndrome that have been increasing over time.
Many people have chosen to buy organic non-GMO wheat from other countries for this reason. I believe this is partially why the Trump administration imposed a 107% tariff on Italian wheat for example, to protect US agribusiness.
Before you jump on me for this being a conspiracy theory, note that I got these answers from AI and so will you.
My personal, anecdotal experience with this was living with leaky gut symptoms for 5 years after a severe burnout in 2019 from (work) stress, which may have been triggered by food poisoning. I also had extremely high cortisol which disrupted everything else. So I got to the point where my meals were reduced to stuff like green bananas, trying everything I could to heal my gut but failing, until I finally snapped out of my denial and sought medical attention.
For anyone reading this: if holistic approaches don't fix it within, say, 6 weeks to 6 months, they aren't going to, and you may need medication for a time to get your body out of dysbiosis. But you can definitely recover and return to a normal life like I did, by the grace of God, the universe, and everything.
https://techcrunch.com/2026/01/12/trumps-epa-plans-to-ignore...