Hacker News | past | comments | ask | show | jobs | submit | orionblastar's comments

Seti@Home was fun to run on your computer.

> SQLite is for a small number of users.

This is not true.

The limitation for SQLite is writes: it doesn't support concurrent writes. It allows many concurrent readers, but only one writer at a time.
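A minimal sketch of the single-writer behavior using Python's built-in sqlite3 module (the file path, table name, and variable names are all illustrative): a second connection that tries to start a write transaction while the first holds one gets "database is locked".

```python
import os
import sqlite3
import tempfile

# Illustrative throwaway database file.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

w1 = sqlite3.connect(path)
w1.execute("CREATE TABLE t (x INTEGER)")
w1.execute("BEGIN IMMEDIATE")            # first writer takes the write lock
w1.execute("INSERT INTO t VALUES (1)")

w2 = sqlite3.connect(path, timeout=0.1)  # short timeout so the failure is quick
try:
    w2.execute("BEGIN IMMEDIATE")        # second writer: refused while w1 holds the lock
    blocked = False
except sqlite3.OperationalError:
    blocked = True

w1.commit()                              # release the write lock
w2.execute("BEGIN IMMEDIATE")            # now the second writer can proceed
w2.execute("INSERT INTO t VALUES (2)")
w2.commit()

print(blocked)  # True
```

WAL mode (`PRAGMA journal_mode=WAL`) lets readers run alongside the single writer, but writes are still serialized.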



Was this bug on purpose, or just some random bug? In either case, it eats up your tokens.

Interest in learning Python is growing among a lot of new developers. It is replacing Ruby and other languages. Python is easy to learn when you have the right video course or ebook.


Google removed all Neocities pages from its search. I wonder what else it removed/censored?


I always thought that C was a stepping stone to learn other languages. Like Pascal, it was educational to learn. My Comp Sci courses in 1986-1990 used Turbo Pascal and Turbo C.


I think so too. For most devs, C is like Latin or Roman Law: not something we develop and use, but rather something we learn for context and to understand later developments.

There are some people who still develop in C, for sure, but it's limited to FOSS and embedded at this point; low-level proprietary systems have mostly migrated to C++ or Rust.

I agree with the main thesis that C isn't a language like the others, something that we practice; it's mostly an ancient but highly influential language, and it's an API/ABI.

What I disagree with is that 'critiquing' C is productive in the same way that critiquing Roman Law, Latin, or Plato is productive. The horse is dead. One might think one is being clever or novel for finding flaws in the dead horse, but it's more often a defense mechanism to justify having a hard time learning the decades of backwards compatibility, edge cases, and warts that have accumulated.

It's easier to think of the previous generation as dumb, having made mistakes that could have been avoided, and to imagine it all could be simpler, than to recognize that engineering is super complex and that we could dedicate our full lives to learning this craft and still not make a dent.

I applaud the new generation for taking on this challenge and giving the revolution their best shot, but I'm personally thinking of bridging the next-next generation and the previous generation of devs. The historical complexity of the field will increase linearly with time, and I think if we pace ourselves we can keep the complexity down; the more often we jump onto a revolution that dismisses the previous generation as dumb, the bigger the complexity is going to be.


There is also still a lot of low-level proprietary code developed in C. I would guess far more than what is developed in Rust.

I fully agree with your last point. The proposed solutions to some of the deficiencies of C are sometimes worse than the disease, while their benefits are often exaggerated, adding unnecessary layers of complexity that will haunt us for decades. In contrast, my hope would be to carefully revise what we have, but that takes time and patience.


>There is also still a lot of low-level proprietary code developed in C. I would guess far more than what is developed in Rust.

Do you mean that new code is still being developed, or that the code still exists? I meant the former; the latter is true of a lot of dead languages like Fortran and COBOL, and would only cement the idea that it's a dead language.


Fortran has its problems with its standard and portability, but it is hardly a dead language yet.


I meant there is a lot more new low-level C code being developed than Rust code.


C was never a gateway to any flavor of Pascal, a "police state language".


It's not a coincidence that Rust was invented in 1984 by some animals on a farm! Stay in thy lane, programmer!


My resume was long with Commodore Amiga, COBOL, FORTRAN, IBM 360/370 DOS/VSE, OS/2, Turbo Pascal, etc., and I had to remove those skills to shorten it. I used many different platforms.


> In the past, he said, poor countries were failing to outgrow rich ones because of unfortunate circumstances (“the war, bad policies, and dysfunctional institutions that afflicted developing nations in the mid-20th century”)

Or is it the wealthy exploiting the poor through low wages?


They’re not exploiting them; it’s a function of not having really strong safety nets, or even a UBI.

So a lot of people are desperate to survive and keep a roof over their head.

And technology makes money flow upwards to the rich and corporations.

Soon, with AI, employment is going to become pity-employment: make-work jobs. People just can’t seem to trust other people to be in charge of their own time and have free money, and the Overton window in the USA is not there yet. So capital will find ridiculous ways to exploit labor via the desperation of the masses. Maybe gig economies and a race to the bottom for service providers as out-of-work people flood the market with useless crap, who knows.


So they need a social safety net? I agree. They also need a minimum wage to earn enough to live on.


Once people have a UBI, you can eliminate minimum wage laws. People can take unpaid internships, pursue hobbies, etc.


I agree that a UBI would have to be used when AI and robots take over jobs.


When I was young, for my birthday, I was given a Casio LED calculator. My father said it was a crutch, because then I wouldn't be able to do math in my head, and I wouldn't always have a calculator with me. Now they have iPhones with built-in calculators and even ChatGPT. I saw a video of a 3-year-old using an iPad and ChatGPT for everything; he couldn't even read, so ChatGPT read for him. I read that 40% of kids can't read at their grade level because they use ChatGPT for homework and in school.

Steve Jobs kept his children away from iPhones, iPads, the Internet, and TV. He said something about the way it affects their brains. https://www.uniladtech.com/apple/controversial-reason-steve-...

