Hacker News | dan-robertson's comments

I assume someone in the 5th percentile of wealth is going to have very negative wealth, which is only really possible in developed countries: eg an American medical student, a doctor who is part-way through paying off their loans, or someone suffering from massive credit card debt / car loans. (I think this isn't really what you were thinking of, though. I think the poorest people in the world still live, in many ways, like medieval peasants, except with much lower infant mortality and somewhat better food security.)

https://www.who.int/news/item/24-07-2024-hunger-numbers-stub...

Hunger worldwide has been getting worse for the last quarter century or so.

733 million people don’t have food security. I think about 5-10 million die every year from starvation.

In medieval times there were famines, but they were caused by there not being enough food to go around due to disease or bad harvests.

Today millions of people starve even if there is no bad harvest or animal pandemics.


> Hunger worldwide has been getting worse for the last quarter century or so.

That doesn't appear to be true. E.g., following links from the WHO page you cite gets me to https://openknowledge.fao.org/server/api/core/bitstreams/39d... ...

where Figure 1 shows hunger consistently decreasing from the start of the graph in 2005 to somewhere around 2014, at which point it plateaus for a while and then starts increasing somewhere around 2019-2020.

My recollection is that by 2005 where that graph begins, hunger had been consistently decreasing for quite some time, but a bit of googling hasn't found anything that quite answers that question. I did find https://www.jstor.org/stable/40572886 (looking at data from 1930 to 1990) whose publicly-accessible abstract says that "the proportion undernourished has been in decline since the 1960s and that the absolute number has also declined in recent years".

So I think the truth is not "hunger has been getting worse for the last quarter century or so" but something more like "10 years ago, hunger had been improving for about half a century; the improvement stalled for about 5 years and over about the last five years it has been getting worse".

(Which is still bad news, as far as the present state of things is concerned, but a rather different sort of bad news.)


And malnutrition isn't only about lack of food, it's also about mediocre quality of food:

> Similarly, new estimates of adult obesity show a steady increase over the last decade, from 12.1 percent (2012) to 15.8 percent (2022). Projections indicate that by 2030, the world will have more than 1.2 billion obese adults. The double burden of malnutrition – the co-existence of undernutrition together with overweight and obesity – has also surged globally across all age groups.

Obesity will soon, if not already, become a major public health disaster in poor countries.


Well giving people food doesn't solve hunger problems because those people just breed.

Why doesn't Africa have more farms and infrastructure?


Short answer: Climate (even without climate change) and war.

Long answer: Western colonialism.


> someone in the 5th percentile of wealth

I find the percentile measure terrible: the '5th percentile' technically means 95% of the population is above you, but it is often colloquially understood the other way around. It's like German numbers, where people say 'five and forty' to mean 45. The general population rejects needless complexity.
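To make the ambiguity concrete, here's a minimal Python sketch (the wealth figures are made up for illustration): the 5th percentile is the value with roughly 5% of the data at or below it, i.e. almost everyone else is wealthier, not the other way around.

```python
# Illustrative, made-up wealth values (negative = net debt).
wealth = [-120_000, -40_000, -5_000, 0, 2_000, 10_000,
          30_000, 80_000, 150_000, 400_000]

def percentile(values, p):
    """Nearest-rank percentile: smallest value with at least p%
    of the data at or below it."""
    values = sorted(values)
    # ceil(p * n / 100) - 1, clamped to a valid index
    k = max(0, -(-p * len(values) // 100) - 1)
    return values[int(k)]

# The 5th percentile sits near the bottom of the distribution,
# the 95th near the top:
print(percentile(wealth, 5))    # -120000: deep in debt
print(percentile(wealth, 95))   # 400000: near the top
```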


The general population cultivates disagreement on every matter it can afford to.

The correct metric is probably something like Actual Individual Consumption.

Is it always the case that these white label products are all equivalent? That is, is there still some input from the purchasing company on choice of components, quality control, etc, and does that make a difference to the product?

I suppose one should expect ‘time to agi’ to be a function of:

1. Researchers

2. Productivity-enhancing things in the org (either technical stuff like frameworks/optimisations or nebulous social things like better collaboration)

3. Amount of available compute.

4. Luck

And then the past experience with OpenAI should help a lot with doing a good job of #1 and #2, even if it is just being a competent executive.


Surely availability of data is here as well?

You say ‘modern relationships’ but I feel like you’re describing a stereotypical 1950s relationship in that paragraph. The lack of contrast surprises me.

In the 1950s, your choice of life partner was the 50 kids in your high school class. Women got married before the age of 25 and didn't have careers.

Today, Tinder and Instagram give you access to literally the entire planet of single people, and the illusion that you have a chance to be with one.


I think I agree with you that though women could work in the ’50s, there weren’t really careers available to them in the same way as for men. Maybe it is just women having ‘real’ careers and therefore higher opportunity cost/more practical liberty/fulfilling alternatives to children making a big difference.

I guess what I’m getting at is that, even if you describe men’s desires accurately, I don’t think it describes their behaviour in my parents’ generation let alone mine. But maybe this just varies a lot by country/income/education/social class and I see some weird sample. I know divorce rates have become super divergent by education in the US for example so presumably relationships are quite different too.


The big difference is that mom is working now.

The problem is not who does the most household work, the problem is that the one who does (usually the mom) can't compensate by not working. A single income is rarely sufficient for a family.


Wasn’t the non-working housewife mostly a middle class thing and a weird blip in history? Women worked long before the baby boom either with less heavy tasks related to subsistence agriculture or cottage industries, and many worked during the baby boom. (I think the difference from today is partly different rates and partly women having actual careers in ways they didn’t before)

Maybe that is a good explanation for the baby boom though.


These seem worse IMO. Not sure if it’s the medium (eg more saturated colours, the particular website) or if I just like the compositions less.

They have more color but way less resolution, thus less detail. Pretty much what you would expect to see, given that the original Mac and Amiga came out around the same time.

Both Motorola 68000 machines, typically 512K-1024K of RAM. So similar underlying constraints, under which they made very different choices for how to prioritize graphics.

Weird that there were no hi-res images. The Amiga's horizontal hi-res resolution was >720 pixels.

Of course, in order to get square pixels, you needed to enable interlace as well.


The usual case of looking at pictures that were made on, and for, a CRT monitor (or even a TV).

You can try Screenitron to imitate something like this.

https://littlebattlebits.xyz/screenitron


I’m a bit conflicted: when I used to care more about this freedom stuff say 10 years ago, I would have been more in favour of these regulations. Today I care less about that and more about security and I mostly think that Apple’s preferred approach is better for security than what the EU proposes. That said, I am not super happy about the rate of scams or junk in the App Store.

I think even for Americans who like the anti-gatekeeper regulations, you might worry about the precedent for the powers European governments get over these tech companies as the other thing they want is removing as much encryption as reasonably possible, which you may not want. Those changes seem quite unavoidable though so maybe it’s not worth thinking about them together.

The more damning thing IMO is the whole ‘America innovates Europe regulates’ trend. I think it seems pretty important that the EU (and U.K.) work out how to escape the anti-innovation troughs they have found themselves in. Or perhaps by 2050 the EU will largely be a tourist destination where citizens watch ads for the American tech companies to make profits to be highly taxed by the EU to fund subsidies for the German auto industry to sell cars to Americans and Chinese.


> Today I care less about that and more about security and I mostly think that Apple’s preferred approach is better for security than what the EU proposes.

This is mostly a false dichotomy that Apple likes to push. macOS has strong security with sandboxing, code signing, malware scanning, etc. I have never encountered someone among my direct acquaintances who had their Mac compromised. Yet, it's perfectly possible to make an alternative app store, circumvent code signing, etc. on a Mac.

Even with the freedom of an EU iPhone, you can still choose to completely stay in the Apple ecosystem and pretend that the extra freedoms you have gained are not there.

The thing is that Apple knows people will purchase from an alternative reputable store if prices are lower because that store's margins are lower. Or developers will move there because they can increase their margins. And then Apple will actually have to compete on price (App Store fee) and features.

It has very little to do with security and mostly with Apple wanting to keep their 15%/30% because it's hugely profitable.

> the precedent for the powers European governments get over these tech companies as the other thing they want is removing as much encryption as reasonably possible

This does not make any sense at all. Why would you remove encryption? You could just accept an additional root certificate as a user and be protected by the same encryption.

> I think it seems pretty important that the EU (and U.K.) work out how to escape the anti-innovation troughs they have found themselves in.

We are doing fine, we just don't believe in profit over everything. Moreover, the current US tech feudalism makes it harder to innovate and develop competitors, because you only get to do what the feudalist overlord permits you to do. Regulation is necessary to make it a fair marketplace again.


iPhones are a much more juicy target compared to macs.

No they aren't. They sync almost all of your data across both of them; that's touted as a benefit of the Apple ecosystem. Your juicy iPhone data is directly accessible via the linked Mac.

A better argument is that Macs are less popular and thus less targeted.


I dispute the data syncing somewhat: I think my phone somehow ‘knows’ more, eg online banking credentials, than my Mac. But the difference in population sizes is the main reason that iPhones are juicier targets. I don’t think you’re really disagreeing with me. Perhaps I should have said that iPhone exploits are more valuable than Mac exploits.

I think you are seeing it slightly skewed: in the past, for a variety of reasons, the US got to the forefront of tech, and some pockets of the country got even richer.

The EU and other countries had some pretty compelling competitors which got more or less slowly crushed by the US.

After over 30 years of this, a handful of the remaining US megacorps turned around and started fencing off their own little profitable field, not allowing anyone else to even try to get in.

The EU is the only not-purely-adversarial entity upholding laws against these seemingly untouchable megacorps.

What I find weird is the selective memory of people who are either from the US or pro-big-business: on one side they are openly against these claims the EU makes (calling them anti-innovation), while also being fervent supporters of "liberal" policies like Medicaid, right to repair, warranties and such. As if they do not realise that these stem from the exact same place, and often come directly from Europe.

I'm at a point where I believe that someone who is against what the EU is doing to these megacorps (not saying everything the EU does is gold, btw) either A) has a vested interest in such companies, B) hates the concept of the EU and anything it touches, C) is rich and doesn't really care about anything, or D) is not very bright.


The whole tech house of cards would fall apart if tracking a user is made illegal - or serving ads based on any sort of tracking.

> as the other thing they want is removing as much encryption as reasonably possible, which you may not want

This is the wish of a vocal, but powerless minority, not an actual law. It often gets misused as anti-EU FUD.


Another bias can be who leaves reviews.


People begin from the premise that there should be lots of primates living away from earth, and then work backwards from there. I don’t think governments should be paying so much for such activities.

In my opinion it is not a good use of public money relative to other space projects, and crude missions to mars would likely destroy any hope of finding life there should it have existed, making them actively harmful.

It seems sometimes that for NASA, the big goal is to have a large source of steady funding, and a space station is the best way to get that (hence the insane space station in the lunar mission plan). I get some vague impression that politicians like talking about missions to the moon or mars more than about space stations, telescopes or probes.


> People begin from the premise that there should be lots of primates living away from earth, and then work backwards from there.

No, people begin with interest in all things around them and from that conclude that being closer to them would be more convenient. This idea is perhaps as old as the human race itself.

Modern robots aren't as capable as humans, so if the project is big enough, human involvement is reasonable.


I don’t think ‘being closer to things’ is a legitimate goal to spend enormous amounts of public money on. (I’m more ambivalent about private funding, except for the contamination problem.)

I’m not aware of any serious research questions about the solar system which could be answered sooner or more cost effectively by sending people to investigate (except for questions about eg how well primates survive in various extraterrestrial situations, which are only interesting insofar as there are legitimate reasons to put primates in such situations).


Lunar research was significantly advanced by the data collected from the rocks Apollo astronauts brought back from the Moon.

Your skepticism is widely shared and also widely debunked. I don't think we'll move the debate forward with short messages; it needs more detailed and prolonged discussion. There are arguments for space exploration, and there are arguments for human spaceflight. You can personally disagree, but your arguments are based only on opinions, which are no better than the opinions of the other side.


In the 1960s it would be hard to retrieve samples from the moon without humans. I don’t think that’s true today. For mars, isn’t it obviously much more difficult to try to get a big enough rocket to return a load of heavy meatbags than to just return the rocks and leave the robots behind?

This is just a fact of lossy compression: you want to throw away information that contributes less to the perception of the video so that you can describe it with fewer bits of information.

There are two possible advantages for this kind of grain synthesis. For Netflix, they could produce the same perceived quality at lower bitrates, which reduces costs per view and allows customers with marginally slow connections to get a higher quality version. For a consumer, the advantage would be getting more non-grain detail for a fixed bitrate.

You are right that if you subtract the denoised frame from the raw one, showing only the estimated noise, you would get some impression of the scene. I think there are two reasons for this. Firstly, the places where the denoiser produced a blurry line that should be sharp may show up as faint lines. I don’t think this is ‘hidden information’ so much as information lost to lossy compression. In the same way, if you look at the difference between a raw image and a compressed one, you may see some emphasized edges due to compression artefacts. Secondly, the less exposed regions of the film will have more noise, so noisiness becomes a proxy for darkness, allowing some reproduction of the scene. I would expect this detail to be lost after adjusting for the piecewise linear function for grain intensity at different brightness levels.

Perhaps a third thing is the level of noise in the blacks and the ‘grain size’ or other statistical properties tell you about the kind of film being used, but I think those things are captured in the film grain simulation model.

Possibly there are some other artefacts like evidence of special effects, post processing, etc.
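As a rough sketch of that second point (using a simple box blur as a stand-in for a real denoiser, and a synthetic frame rather than actual video), subtracting the denoised frame from the original leaves mostly grain, and the grain's strength tracks the brightness of the underlying region:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "frame": a dark half and a bright half, plus grain whose
# strength scales with brightness (a toy stand-in for the piecewise
# linear intensity curve real grain models fit).
clean = np.zeros((64, 64))
clean[:, 32:] = 200.0
grain = rng.normal(0.0, 1.0, clean.shape) * (2.0 + clean / 50.0)
frame = clean + grain

def box_denoise(img, k=3):
    """Crude k x k box-blur 'denoiser' (same-size output)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

# The residual is the estimated noise; its local variance is a proxy
# for scene brightness, which is why a "noise-only" image still
# vaguely shows the picture. (Columns near the edge at x=32 are
# excluded to avoid the denoiser's blurred-edge residue.)
residual = frame - box_denoise(frame)
dark_std = residual[:, :30].std()
bright_std = residual[:, 34:].std()
print(dark_std < bright_std)
```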


Excel is one example of a product that is well liked and considered extremely valuable by some people. Azure (if that’s still the name for their AWS competitor) is also reasonably well liked from what I can tell. Certainly it seemed they were beating GCP by having a product people liked more.


As someone who works with large AWS, GCP, AliCloud, and Azure footprints, I can assure you that Azure is god-awful in every single aspect.

Especially, but not limited to, support.


Excel was feature-complete in the '90s. Everything added since has made it worse. I have 12 cores and it takes as long to start up as it did in 1995.


Launching Excel 2010 is bliss compared to all newer versions. But lambda and let can be magical.

And Alt-Q to find Excel features is kinda good. There's usually a short annoying delay while Excel activates the feature. And once it returns results, it never places the keyboard focus on the first result, so you always have to down-arrow instead of just pressing enter to accept the first result.


The improved table naming and array functions seem pretty important. The Flash Fill feature also seems quite useful.


You never learned to use the features added since 1995. That’s the problem.

