Hacker News | AgentME's comments

I'm tired of every other discussion about EA online assuming that SBF is representative of the average EA member rather than an infamous outlier.


What reasons do you have at all?


What reasons do you have to assume EA = SBF?


The reported case of water wells running dry stemmed from construction issues rather than anything about the data center's regular operation:

> But the reason their taps ran dry (which the article itself says) was entirely because of sediment buildup in groundwater from construction. It had nothing to do with the data center’s normal operations (it hadn’t begun operating yet, and doesn’t even draw from local groundwater). The residents were wronged by Meta here and deserve compensation, but this is not an example of a data center’s water demand harming a local population.

https://andymasley.substack.com/p/the-ai-water-issue-is-fake...


>Almonds are pretty cherry picked here as notorious for their high water use.

If water use were such a dire issue that we needed to start cutting down on it, then we should absolutely cherry-pick the highest usages and start there. (Or we should just apply a Pigouvian tax across all water use, which would naturally affect its biggest consumers.)


Yes, that's roughly what I said in my post. If we're doing a controlled economy and triaging for the health of the ecosystem, we'd start with feed for cattle, and almonds wouldn't be much further down on the list.

The contention with AI water use is that something like this is currently happening, as local water supplies are being diverted for data centers.


I have >1 Gbps service from them.


The future isn't evenly distributed. I recently discovered an actively developed software project that had a ton of helper functions based on the design of `gets`, with the same vulnerability. Surprisingly, not all C/C++ developers have learned yet to recoil in horror at seeing a buffer pointer passed around without a length. (C++'s std::span was very convenient for fixing the issue by keeping the buffer pointer and length together, exactly like Go and Rust slices.)


> Surprisingly not all C/C++ developers have learned yet to recoil in horror at seeing a buffer pointer being passed around without a length.

As someone who wasn't taught better (partly due to not picking CS as a career stream), are there any languages which avoid such vulnerability issues? Does something like rust help with this?


Almost everything else, aside from any language that is copy-paste compatible with C. That includes systems languages that predate C by a decade, like JOVIAL, ESPOL, NEWP, PL/I, and other ALGOL-inspired systems languages.

Xerox PARC started with BCPL for its systems, but eventually created Mesa specifically for safe systems programming.

https://en.wikipedia.org/wiki/Mesa_(programming_language)

http://toastytech.com/guis/star.html

"The Mesa Programming Environment" - very first IDE for a systems language

https://www.digibarn.com/friends/curbow/star/XDEPaper.pdf

While Pascal as originally designed wasn't suitable for systems programming, various dialects sprang out of it, with Object Pascal from Apple/Borland being the most famous. By 1978 the first Modula-2 report was released, a language inspired by Mesa after Niklaus Wirth spent a sabbatical year at Xerox PARC. Years later, after a similar stay, Mesa's evolution (Cedar) would influence him to come up with Oberon.

https://en.wikipedia.org/wiki/Modula-2

https://www.modula2.org/modula2-history.php

Then there was Ada, although compilers were too expensive and its hardware requirements too high for 1980s computers.

Also, the BASIC compilers on 8- and 16-bit home computers all supported low-level systems programming.

Among recent programming languages, something like Zig is the closest to what those languages offered: safety without requiring a GC of some form.

Naturally this takes care of most C flaws, minus use-after-free; however, thanks to their type systems, one tends to use heap allocation less than in C, although it remains an issue.


Yes, Rust protects against this, and so does almost every language with garbage collection (Java, C#, Python, JS/TS, etc.). C and C++ are nearly unique among remaining popular languages in not protecting you from the memory-safety issues that often cause exploitable vulnerabilities.


> Grok is without a doubt the single most important contributor to convincing believers of right-wing conspiracy theories that maybe the theories aren't as sound as they thought. I have seen this play out hundreds of times. Grok often serves as a kind of referee or tiebreaker in threads between right-wing conspiracy theorists and debunkers, and it typically sides overwhelmingly with the debunkers. (Or at least used to.) And it does it in a way that validates the conspiracy theorist's feelings, so it's less likely to trigger a psychological immune system response.

I've seen this too and agree. It's surprising how well it accomplishes that referee role today, though I wonder how much of that is just because many right-wingers genuinely expect Grok to share their right-wing views, as Elon appears to intend it to. It's going to be sad when Elon eventually gets more successful at beating it into following his ideology.


If you consider the facts of the matter, sure, it's as you said. But if you ignore every detail, then everyone looks exactly as bad as everyone else, and it's impossible to call anything good or bad.


This seems pretty similar to Cloudflare Workflows (https://developers.cloudflare.com/workflows/), but with code-rewriting to use syntax magic (functions annotated with "use workflow" and "use step") instead of Cloudflare's `step.do` and `step.sleep` function calls. (I think I lightly prefer Cloudflare's model for not relying on a code-rewriting step, partly because I think it's easier for programmers to see what's going on in the system.) Workflow's Hooks are similar to Cloudflare's `step.waitForEvent` + `instance.sendEvent`. It's kind of exciting to see this programming model get more popular. I wonder if the ecosystem can evolve into a standardized model.


Actually, both Vercel's and Cloudflare's are based on the API that we built at https://inngest.com (disclosure: I'm a founder).

I strongly believe that being obvious about steps with `step.run` is important: it improves observability, makes things explicit, and lets you see transactional boundaries.


Hasn't he been saying, every year for the last few years now, that OpenAI is about to shut down? And that models are never going to get better than they were in 2022? I think he's pretty clearly a grifter chasing an audience that wants to be reassured that big tech is going away soon.


Ed Zitron may be many things, but he is no grifter. He writes what he believes and believes what he says, and I basically agree with all of it. The chattering class in SV has been wildly wrong for years, and they'll really look foolish when the market crashes horribly and these companies collapse.


He's saying more than that the companies are going to collapse; he's making pronouncements about the underlying technology, which are claims that are much harder to defend. I'm not entirely sure he understands the distinction between the companies and the technology, though.


Respectfully... what?? Ed is at this point one of the most well-read people on Earth on this topic. Of course he knows the difference between the companies and the technology. He goes in depth all the time on both why he thinks the companies are financially unviable AND why he's unimpressed by LLMs technologically.


Even as someone who is generally inclined to agree with his thesis, I find Ed Zitron's discussions as to why AI does not and will never work deeply unconvincing.


I don't think he fundamentally gets what's going on with AI at the tech level, and how Moore's-law-style improvements in compute have driven this and will keep doing so. He just sees that LLM chatbots aren't much good and assumes things will stay that way. If that were so, investing $1tn would make no sense. But it isn't so.


Having a large audience does not imply being the most well informed or correct.


"Saying what a lot of people want to hear" is not a good proxy for truthfulness or correctness.


LeCun says pretty much the same things, as do most experts, actually. Only the execs and marketing teams keep yapping about AGI.


I didn't say anything about AGI. I think AGI is very silly.


The original meaning of AI is what some now call AGI. Some of us choose not to follow meaning shifts forced by large companies for advertising purposes. The same goes for Full* Self** Driving***.


How do you want to define grifter? He shows up, makes a lot of big promises, talks a lot of shit, doesn't actually engage with any real criticism, gets paid for it, and then exits, stage left. He could be right, he could be wrong, but he leaves no room for debate. If all you want is someone to yell at you about how right your feelings on something are, I mean, hey, I have a therapist too. I don't ask her for financial advice though.


Does Ed produce anything other than a newsletter?


He didn't even necessarily say they were wrong about it. He just emphasized that their top priority was making sure they could score points from it:

> We hit some new lows over the weekend with the MAGA gang desperately trying to characterize this kid who murdered Charlie Kirk as anything other than one of them and doing everything they can to score political points from it. In between the finger-pointing, there was grieving.

