I think the cost of running and training models is going to fall thanks to Moore's law and friends.
That also means that the value of today's hardware declines rapidly, and with it all the money invested in compute.
Compute is basically what the AI VC money is spent on. That money will be gone in a few years once the hardware is worthless.
On the other hand, running a model (locally) will become cheaper and cheaper, to the point that AI becomes an everyday commodity running on cheap devices everywhere.
Then there are software optimizations too, which lower the cost further.
So it's not going away, and it's not going to be expensive for the consumer in the long run.
My five-year-old RTX 2070 runs models whose output would have been state of the art a couple of years ago.
In a few years, something on the level of today's top models might run under your desk, if progress continues at this pace.
Ah yes, the good old “everything good about the Soviet Union was because of the non-Russian Soviets, and everything bad about it was because of the Russian Soviets”. It's astounding that open bigotry is tolerated here.
I recall looking at a car to buy, and the salesman touted the gas cap on the right as the "safe side".
The logic was, if you run out of gas, you can refill on the side away from traffic.
Dumbest design reasoning. Plan the side for an event most people never experience?! Or, if they do, only once... and maybe on a rural dirt road, not necessarily a freeway.
For your V2: a CAN bus connector would be great; it should really be standard for toolheads nowadays. It makes cable management so much easier, and the main board then no longer needs the driver for the extruder, the heater control, or the sensors.
So maybe a version optimized for CAN bus toolheads?
And more driver slots: four are not enough if you want a self-leveling bed.
Actually, if you're standing next to people, the air you breathe in also contains some of their exhaled gases, in this case slightly elevated CO2. If there are a dozen people in a small meeting room with the windows closed and no AC, the air quality in that room is significantly worse than it would be, say, standing on the roof. Unless, that is, you're in the middle of a major city, where the air on the roof may be full of exhaust from motor vehicles, hence the legislation restricting vehicle emissions.
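To put rough numbers on the meeting-room scenario (a back-of-envelope sketch; the exhalation rate and room size are my assumptions, not figures from the comment):

```python
# Back-of-envelope: CO2 rise in a sealed, unventilated meeting room.
# Assumptions (mine, not from the comment): a resting adult exhales
# roughly 0.25 L of CO2 per minute; the room is 30 m^3 with zero air exchange.
PEOPLE = 12
CO2_PER_PERSON_L_PER_MIN = 0.25    # assumed resting exhalation rate
ROOM_VOLUME_L = 30 * 1000          # 30 m^3 room, expressed in liters
MINUTES = 60

added_co2_l = PEOPLE * CO2_PER_PERSON_L_PER_MIN * MINUTES
added_ppm = added_co2_l / ROOM_VOLUME_L * 1_000_000
print(f"CO2 added after {MINUTES} min: ~{added_ppm:.0f} ppm")
```

Real rooms leak air, so the actual rise is much lower than this idealized figure, but it shows why CO2 in a crowded, closed meeting room climbs well past outdoor levels within an hour.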
This is the downward spiral for a lot of brands. They sell out to an investor, who coasts on the brand's reputation while cutting cost and quality, etc. There are barely any good brands left. IIRC Miele is still one of the few good brands for home appliances, but they're also significantly more expensive. At least for the initial purchase; I'm sure it evens out long term.
Also, TrackIR is just an IR webcam, IR LEDs, and a hat with reflectors. You can DIY the exact same setup easily with OpenTrack, but OpenTrack also has a neural-net webcam-only tracker which is, AFAIK, pretty much state of the art. At any rate, it works incredibly robustly.
Actually, I have already used it to implement the same idea as the post, with the added feature of anaglyph (red/blue) glasses 3D. The way I did it, I put an entire lightfield into a texture and rendered it with a shader. Then I just piped the output of OpenTrack directly into the shader, and Bob's your proverbial uncle. The latency isn't quite up to VR standard (the old term for this is "fishtank VR"), but it's still quite convincing if you don't move your head too fast.
There's already a wide variety of Opentrack plugins that use everything from off the shelf webcams to DIY infrared trackers to an iPhone app and FaceID/AirPods.
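For anyone wanting to pipe OpenTrack into their own renderer like this: OpenTrack's "UDP over network" output sends each pose as six little-endian doubles (x, y, z, then yaw, pitch, roll). A minimal receiving sketch in Python follows; the port 4242 and the ordering above reflect my understanding of OpenTrack's defaults, so verify against your install:

```python
import socket
import struct

# OpenTrack's "UDP over network" output sends 48-byte packets:
# six little-endian doubles: x, y, z, then yaw, pitch, roll.
POSE_FORMAT = "<6d"

def parse_pose(packet: bytes) -> dict:
    """Unpack one OpenTrack UDP packet into a pose dict."""
    x, y, z, yaw, pitch, roll = struct.unpack(POSE_FORMAT, packet)
    return {"x": x, "y": y, "z": z, "yaw": yaw, "pitch": pitch, "roll": roll}

def listen(port: int = 4242) -> None:
    """Receive poses from OpenTrack (4242 is its usual default port)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    while True:
        packet, _addr = sock.recvfrom(64)
        pose = parse_pose(packet)
        # Here you would update the eye-position uniform in your shader.
        print(pose)

if __name__ == "__main__":
    # Offline demo: round-trip a synthetic packet instead of opening a socket.
    sample = struct.pack(POSE_FORMAT, 1.0, 2.0, 3.0, 10.0, 20.0, 30.0)
    print(parse_pose(sample))
```

From there it's just a matter of smoothing the pose and feeding it into whatever uniform drives your head-dependent projection.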
I definitely don't want to be randomly interrupted by AI garbage.
Those horrible automatic translations are bad enough.
And it seems the slop can't be completely disabled. I guess sooner or later it will spew out "useful recommendations" and end up being just another vehicle for ads.
It will be shit like "did you know that the singer of band {xyz} likes this brand of {snake oil}?" or
"the song you are listening to reminds me of {insert crypto scam}".
It seems that soon another browser plug-in will be required to get rid of yet another annoying anti-feature.