Hacker News | Laforet's comments

This paper you linked does not even involve aspartame. The only sweetener they experimented with is saccharin. You can check out the main figures from the link below:

https://www.researchgate.net/publication/265791239_Artificia...

I would be very reluctant to read too deeply into this given that saccharin is known to behave very differently in animal models - for a long time it was thought to cause bladder cancer, but follow-up studies proved that it’s an idiosyncratic reaction found only in female lab rats and no other sex/species combination. Not to mention the dose used was unrealistic to begin with.

It’s entirely plausible that sugar analogs like sucralose and low-calorie sugar alcohols such as erythritol and maltitol can cause long-term changes in the gut microbiome, but high-quality evidence is still lacking.


Scenario C is more likely the culprit. I have seen multiple examples of prebuilt PCs and laptops defaulting to software RAID mode for reasons unknown, and, just like in OP’s case, they did not always have a toggle to turn it off.

The only time I have come across scenario B was with a VAIO laptop from around 2011. The machine was advertised as coming with the “fastest SSD on the market”, which turned out to be four (!) off-the-shelf eMMC modules in RAID 0 through a hardware controller. As janky as it sounds, OS compatibility was never an issue because the controller was a fairly common model with well-established driver support rather than some bespoke mystery.


IMO this is still a passive type of security through obfuscation. Active defence would be more like returning zip bombs to known intruders in order to crash the process.



Endlessh seems to be abandonware. linuxserver.io used to maintain a docker image but deprecated it (https://github.com/linuxserver/docker-endlessh/pull/16) after endlessh didn’t get any new updates in over 3 years. I’ve started using endlessh-go instead https://github.com/shizunge/endlessh-go
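For anyone who hasn’t looked at how these tarpits actually work: RFC 4253 lets an SSH server send arbitrary lines of text before its version banner, so you can keep a scanner’s client stuck by trickling out junk forever. A toy sketch of the idea in Python (my own illustration, not code from either project; port 2222 and the 10-second delay are arbitrary choices):

    import asyncio
    import random

    async def tarpit(reader, writer):
        # Send an endless stream of junk "pre-banner" lines, very slowly.
        # A compliant SSH client keeps waiting for the real version string.
        try:
            while True:
                writer.write(b"%x\r\n" % random.getrandbits(32))
                await writer.drain()
                await asyncio.sleep(10)
        except (ConnectionResetError, BrokenPipeError):
            pass
        finally:
            writer.close()

    async def main():
        server = await asyncio.start_server(tarpit, "0.0.0.0", 2222)
        async with server:
            await server.serve_forever()

    asyncio.run(main())

The real implementations add the important bits on top of this, like connection limits and logging of how long each bot stayed stuck.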


It appears it can be configured to actively return attacks:

> Portspoof can be used as an 'Exploitation Framework Frontend', that turns your system into responsive and aggressive machine. In practice this usually means exploiting your attackers' tools and exploits


I can't seem to figure out how this would work or what this means. Most of the links to the documentation seem to be missing.

I'd actually be curious to know if this seemingly ~10 year old software still works. Also how much bandwidth it uses, CPU/RAM etc.


There's tons of client software that can be exploited if you send a dangerous payload to it. Think of an exploitable version of curl that will fail if it receives a bad HTTP header.


I would guess that it fingerprints the scanning software (e.g. metasploit), then feeds a payload back to it that has a known exploit in the scanning script.


The other parties you mentioned would probably have less motivation to preserve it, let alone restore it to a fully functional state. I find it rather bizarre that many posters here seem to think that it’s morally preferable for the TV set to rot in Japan rather than getting the proper care in the hands of an American collector, all because of some imaginary cultural baggage.


Heh it strikes me that while the stakes of this "relic" are kinda low, it echoes the conversations about institutions like the British Museum possessing historic artefacts :) some claim there is a moral argument for it keeping its artefacts, because Britain can best preserve them and protect them from damage.

Responsibility and autonomy to preserve one's own heritage (with the associated risk of failing to do so) is a longstanding ethical dilemma between cultures, and the answers aren't so clear imho! (This argument is much more compelling for museums, rather than Sony)


Yes, I am aware of those arguments and I am inclined to agree with you. Compared to cultural artifacts which are mostly neutral in terms of externalities, relics of the industrial era suffer more from the cobra effect.

Others in this thread have brought up the future of ICEs and classic car preservation. Back in the late 2000s the US government offered people cash incentives to dispose of their fuel inefficient cars, and by disposal they meant running the engine with an abrasive liquid instead of oil until it was totally ruined beyond repair. Mechanics will tell you horror stories of rare car models being destroyed this way so the owners could claim a few hundred bucks from the DOT. I'm sure car collectors had a field day back then but with such a glut in the market they could not save everything that was worth saving.

Shank Mods was able to obtain a copy of the service manual in English from somebody in the US. This fact probably means that the TV was sold on (or imported to) the domestic US market for a while. (Sony have always allowed individuals to order parts through an authorised service centre, and the latter often insist on requesting a repair manual first even if you are 100% sure of the part number) It's very likely that a number of them existed in the US only to be unceremoniously thrown out by their owners when LCD TVs became more popular. I bet nobody batted an eyelid when that happened.


> Others in this thread have brought up the future of ICEs and classic car preservation. Back in the late 2000s the US government offered people cash incentives to dispose of their fuel inefficient cars, and by disposal they meant running the engine with an abrasive liquid instead of oil until it was totally ruined beyond repair. Mechanics will tell you horror stories of rare car models being destroyed this way so the owners could claim a few hundred bucks from the DOT. I'm sure car collectors had a field day back then but with such a glut in the market they could not save everything that was worth saving.

But what else happened with that?

The glut ended. Used cars got more expensive relative to quality.

And now the cost of a 'reliable used car' has risen by far more than inflation over the time that's passed.

getting back on topic...

> unceremoniously thrown out by their owners when LCD TVs became more popular. I bet nobody batted an eyelid when that happened.

IDK about all that; during the 'LCD phase-in' everyone I knew either donated theirs and/or moved CRTs into smaller rooms when they replaced a working one.

Especially if it was a 'decent' TV, i.e. progressive scan and component input...

Let alone if the thing cost as much new as a very nice car of the day. The sheer responsibility of it (thinking more, you really can't throw this thing out unceremoniously, at minimum it's part of a house or business space eviction proceeding...) has some weight, ironically.


> everyone I knew either donated theirs and/or moved CRTs into smaller rooms when they replaced a working one.

But you can’t do that with a 400 lb behemoth of a TV; it would fill the entire room.

This beast is highly impractical and still only 480p.

Even those smaller CRTs got disposed of quickly as soon as the 2nd generation of flat screens arrived as they already took up way too much space.


> everyone I knew either donated theirs and/or moved CRTs into smaller rooms when they replaced a working one.

That might have happened for a while but by 2008-ish CRTs were being dumped left right and center. My city runs an annual kerbside collection program for large appliances and furniture, and I distinctly remember metal scavengers cruising the street, gutting old CRTs people had left out for the copper coils and leaving whatever remained to be collected as hazardous e-waste. Around the same time, my parents got rid of a 16:10 CRT IDTV they bought in the 90s and semi-forced me to throw out a 21 inch IBM P275 I had because "it's using too much power".

In any case I doubt any corporate (or rich household) owner of a 47 inch CRT back then would think too much about replacing it with a larger screen that took up less space. After all it's just another asset that has depreciated to zero value on their books.


> That might have happened for a while but by 2008-ish CRTs were being dumped left right and center.

Maybe I just grew up poorer than you but it took longer than that in my world.

> my parents got rid of a 16:10 CRT IDTV they bought in the 90s

Yeah meanwhile some of us had to deal with a Zenith TV that would 'jump' with a PS1 and other consoles on the RF/AV output because 'lord knows why'.

> and semi-forced me to throw out a 21 inch IBM P275 I had because "it's using too much power".

Given the other context of your comments I doubt this is a confession that hubristic affluence contributed to our modern disposable society, but I feel like this underscores the point I'm trying to make in my reply.

Resourceful not-well-off people used to really appreciate repairable things, and the worst thing C4C did was get rid of a lot of not-fuel-efficient vehicles that were at least cheap to repair.

The video of that TV and the pair further underscores it. Everything is on decently laid-out boards. Nowadays with an LCD TV, a part can go bad and it's so integrated that what even 15 years ago could have been a 30 min solder job now means chucking the whole shebang.

> In any case I doubt any corporate (or rich household) owner of a 47 inch CRT back then would think too much about replacing it with a larger screen that took up less space. After all it's just another asset that has depreciated to zero value on their books.

Corporate maybe but I'd guess any smart corporation would try to load the 'disposal' costs of a 440 pound object onto the taker somehow. Similar for any rich household that wanted to keep wealth for more than a generation or two.


> Given the other context of your comments I doubt this is a confession that hubristic affluence contributed to our modern disposable society, but I feel like this underscores the point I'm trying to make in my reply.

Let me assure you that none of what I said was meant to diminish your point of view which I agree with mostly.

What I was trying to convey was that people’s mindsets were rather different during the last decade of the CRT. CRTs had been around since the end of WWII; they may have gotten bigger over the years, but the form they took largely remained the same, so there was a sense of continuity as people handed down old TVs when they got something nicer.

When cheap LCD TVs came to the market it represented something more akin to a paradigm shift, as people with limited space at home could now easily own screens 30 inches and up. My parents are actually rather frugal, with my dad bordering on being a tech hoarder who insists on keeping every single cell phone and laptop he ever owned somewhere in his garage. However, even he was unable to justify the sheer bulk and running cost of CRT TVs back in that period. Even if he had tried to give them away there would have been very few takers.

Therefore it’s not inconceivable that this model could have been sold in the US or even a few more places outside Japan. Most of them simply disappeared without a trace because at some point they were probably worth less than the space they occupied, and people were overly eager to embrace flat panels without realising how much utility they were giving up.


I keep all my old cell phones too, but I had to get rid of a run of them from around 1998 - 2008 because the plastic started turning sticky a while back.


> not-fuel-efficient vehicles that were at least cheap to repair.

You don’t need to drive that much for fuel inefficiency to get really expensive. Even 10k miles/year, which is well below average, at 10 MPG vs 30 MPG @ $3/gallon is an extra $2,000/year, and adjusted for inflation gas is currently fairly cheap. Inflation adjusted, in 2011 and 2012 gas was over $5/gallon.
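Back-of-the-envelope version of that, with the same assumed numbers:

    # 10,000 miles/year, 10 MPG clunker vs 30 MPG replacement, $3/gallon
    miles, price = 10_000, 3.0
    extra = (miles / 10 - miles / 30) * price
    print(round(extra))  # -> 2000 (dollars per year)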

We might see consistently low gas prices intended to delay the EV transition (or they could spike), but these cars were already old 15 years ago when the program happened.


> Others in this thread have brought up the future of ICEs and classic car preservation. Back in the late 2000s the US government offered people cash incentives to dispose of their fuel inefficient cars, and by disposal they meant running the engine with an abrasive liquid instead of oil until it was totally ruined beyond repair.

Could you elaborate?


https://en.wikipedia.org/wiki/Scrappage_program

Cash for Clunkers - 700,000 cars SCRAPPED by the USA Government https://www.youtube.com/watch?v=2ZMJ_oNtzzE

UK had its own program in 2009 https://www.banpei.net/2010/04/07/wtf-mr2-sw20-in-british-ca...

All the cars lost to the 2009 Scrappage Scheme - The UK SCRAPPED all these rare cars?! https://www.youtube.com/watch?v=NLLNOUUqCUc



Thanks!

Those were wild times. I remember they also had a similar scheme in Germany. Absolute madness (and that's even if you ignore the useless damage to old cars.)

They should have just printed more money to juice the economy, instead of these wild schemes to give subsidies to specific industries.


> Back in the late 2000s the US government offered people cash incentives to dispose of their fuel inefficient cars, and by disposal they meant running the engine with an abrasive liquid instead of oil until it was totally ruined beyond repair.

So. Fucking. Stupid. As though Joe Consumer with a V8 Mustang he puts a few thousand miles on per year is the boogeyman of climate change, and not, hell just off the dome:

- Every standing military on planet Earth

- The global shipping industry

- The fossil fuel industry


> "As though Joe Consumer with a V8 Mustang he puts a few thousand miles on per year is the boogeyman of climate change"

Scrappage schemes target the smokey, rusty shit-boxes that are worth next to nothing. Not Joe Mustang's prized V8, which would be worth far more than the value of the incentive anyway.

And when it comes to old cars, reducing local air pollution is often the major concern. Not just climate change.


Cash for Clunkers did exactly what it was intended to do: It screwed up the used car market for a very long time, simply by decreasing supply while demand remained.

People still needed cars, and everything is relative. When used car prices go up relative to that of new cars, then new cars become relatively inexpensive.

This helps sell more new cars. And back in the time of "too big to fail" auto industry bailouts, selling more new cars was kind of important.

edit: And remember, there were restrictions for Cash for Clunkers. The car had to be less than 25 years old, it had to run, and it had to have been registered and insured for the last 12 months. It was deliberately designed to thin the pool of functional used vehicles.

This program claimed revered cars like Audi Quattros and BMW E30s...along with V8 Mustangs. And once turned in, they were all quite purposefully destroyed: Sodium silicate replaced the engine oil and they were run at WOT until they seized, and then they were crushed just to be sure.


According to Wikipedia only about 677k vehicles were taken out of the market. In 2009 there were about 254 million registered cars in the US, so did it really put a dent in the market?

[0]:https://en.wikipedia.org/wiki/Car_Allowance_Rebate_System

[1]:https://www.statista.com/statistics/183505/number-of-vehicle...


That's the number of "registered vehicles" in the US, which is going to include everything from Joe Everyman's Mazda to every single truck AT&T uses to maintain what they assert strongly is a data network (sorry little snark there). A better thing to compare to would be the number of used cars sold. A quick google says about 35 million sales are known for 2008, comprising dealer, private, and independent sales. Taking the 677k figure at face value, that would amount to roughly 2% of the "moving" supply of vehicles being removed from the market, and worth noting, the taxpayer paid for that. Also worth noting that figure is going to be inherently conservative, because that's "all used vehicle sales" which includes things like rental companies unloading older inventory, logistics companies selling trucks, that sort of thing.
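Showing my work on that 2%, taking both numbers at face value:

    scrapped = 677_000            # CARS total per the Wikipedia figure upthread
    used_sales_2008 = 35_000_000  # rough used-car sales figure from the quick google
    print(f"{scrapped / used_sales_2008:.1%}")  # -> 1.9%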

That isn't a ton but it also isn't nothing, and however you feel about it, that's 677,000 vehicles that were, according to the requirements laid out by the program, perfectly serviceable daily-driver vehicles that were in active use, that taxpayers paid to buy from consumers, strictly to destroy them. Irrespective of if it ruined the used market as the GP says, that's still a shit ton of perfectly usable machines that our government apportioned tax money to buy, and then paid contractors to destroy, on purpose.


Maybe the used-market salesmen just used it as an excuse to sell used cars at higher prices.

How much did the price go up because of that 2%? And that's with all other factors excluded. Even inflation is 2% a year, so that's just one year's worth. Sorry, but I don't buy that it made a dent in the used car market. Prove it to me with numbers etc. please, otherwise it sounds like the usual useless rant that "everything is getting worse", without proof.


On aggregate, in a ridiculously-competitive market like commodity used cars that is generally free of collusion, salesmen do not get to determine sale prices.

That's simply not a thing when the other guy down the block will sell a similar car for $300 less and have his money today.

Any salesman can ask for whatever price they wish to ask for, and if it doesn't sell then there is simply no sale. (The annals of eBay, to name one dataset that can be poked at, are rife with asking prices for things that simply did not sell.)

(This is one of the very few things that the "invisible hand of the free market" actually assures us of: Sure! A salesperson can ask $16000 for a car that is worth $6000. But if they sit on that car for years and years hoping for a bite that never comes, then maybe they can eventually sell it for $3000 -- losing money the entire time, for every step of the process.

And while that's an example of how sales can happen, it is not an example of how sales both work and make money.

Used car salesmen do not butter their bread by losing money on sales. That's not a thing at all.)


I don't know the details of the US programme, but London's recent successful scrappage scheme had a limit of £2000 for cars/vans and £1000 for motorcycles [1].

Only "dirty" vehicles that do not meet modern emissions standards qualified. And it only makes sense to scrap your vehicle if it's worth less than the £2000 payment you'd get by scrapping it. So nobody is scrapping modern, good quality vehicles: you'd just sell it instead and get more money.

[1] https://tfl.gov.uk/modes/driving/ultra-low-emission-zone/scr...


Or the manufacture of new vehicles to replace perfectly serviceable old ones.


And agriculture


I just don't think ancient artifacts are comparable to an old TV.


Hmmm, I don't know. Ancient artifacts sometimes highlight the technical and artistic possibilities of their time. In my opinion this TV represents the consumer culture of the 80s very well, just as the amphitheaters in Rome and Greece represent theirs.


Though I don't think anyone would have wanted it, I think there's a bit of a false dichotomy there. Maybe in theory there would have been a place for this in a curated space in Japan... if not for it being so massive at least.

Ultimately if it was a TV designed in Japan, having it on display at a local tech museum would be nice. I just don't know where it would go that could deal with the space and the weight.

Closest thing I could think of is the NTT museum, which is ginormous... but it's mostly about NTT's stuff. "Some other company in Japan made big TVs" is a bit less interesting than, say, some older tabulation machines they have there.


> I love gaming, and I destress by playing games, but it's not worth the now much higher opportunity cost to play the newer (usually worse) stuff.

This appears to be the reason behind all the recent remakes of “not so old” games. People like us are much more likely to pay to relive our past joys in 4K resolution.

However a part of me often wonders if video games have got to the point where every viable idea has been attempted and it’s only downhill from here. When I was a teen, I definitely did NOT envision myself still playing Age of Empires 2 PvP with strangers online, but the scene is still here.


There is a system-wide setting that changes the encoding for all non-Unicode text to another code page, e.g. CP932 for Shift-JIS. Third-party tools are available to do the same conversion on a per-application basis.

It’s not as bad as trying to load some really old CJK web pages on mobile devices: few mobile browsers have an accessible option to select the character encoding, and there appears to be none on iOS. The only option is to change the system language, and that didn’t always work for the more obscure encodings.
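To make the failure mode concrete, here is a quick Python illustration (just a sketch with a made-up string) of what happens when a browser guesses the wrong legacy encoding for a CP932/Shift-JIS page:

    # Bytes as an old Japanese page might serve them (Shift-JIS / CP932)
    raw = "こんにちは世界".encode("cp932")
    print(raw.decode("cp932"))             # correct guess: readable Japanese
    print(raw.decode("latin-1"))           # wrong guess: classic mojibake
    print(raw.decode("utf-8", "replace"))  # wrong guess: replacement characters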


Not surprising because only the edges of the back glass are glued for iPhone 15 so most of the surface is just floating there without support. On prior generations the entire panel is glued.

Not sure what the justification for the change would be; if anything it does make replacing the back glass much faster, as there is no need to scrape or laser-blast the entire surface to remove all the adhesive. It would be interesting to see if this is also the case for the 16th gen.


A classic case of "easier to fix vs less likely to break".

Given how much Apple is pushing AppleCare (and how much they charge for back glass), it's almost like the incentives have turned around.


I had to read the article twice to be sure that it was a utilitarian move (however questionable it might be) rather than a grand ideological stand that the article seems to spend much time portraying.

FWIW, data center IP addresses are already being treated as second class citizens by major content/service providers, and this has become an escalating barrier to self hosting. I am honestly not sure what the author is trying to accomplish.


> "data center IP addresses are already being treated as second class citizens by major content/service providers, and this has become an escalating barrier to self hosting"

Could you please expand on this a bit?


DC IPs are often:

1. totally blocked by some services (especially those related to copyright, like almost all the streaming services);

2. treated as suspicious by lots of CDNs (so you would get captchas more frequently, have stricter rate control, etc.)


Thanks. Still interested in OP's response too.

Also, what qualifies as a data center?


Hi, OP here. I did not respond since another poster had beaten me to it but here we go.

The reply above yours is mostly correct though I have to admit that “data center IP” could be a bit of a misnomer when it comes to IP reputation. There are essentially 4 categories:

- Residential landline connections are the most mundane but are also the least restricted, because this is where your average users are found. The odds of bad actors on the same network are fairly low, and most ISPs will overlook minor transgressions to avoid incurring additional customer support costs.

- Mobile data connections are often behind CG-NAT. Blocking an entire IP range tends to generate a lot of false positives, so it doesn’t happen very often.

- Institutional IP ranges (such as 17.0.0.0/8 or any org that maintains their own ASN) tend to get a pass as well, because they tend to have their own IT and networking department to take collective responsibility if something untoward were to happen.

- This leaves public cloud and hosting services on the lowest tier, because these networks have a very low barrier to entry for bad actors. Connections from these IP addresses are also far more likely to be bots and scrapers than a human user, so most TDS systems are all too happy to block them.


It is ideological. No utilitarian explanation was provided.


Cyber- is pretty much a code prefix for anything targeted at the public sector. I too see it as a kind of dirty word TBH.


"Cyber," used on its own, is the worst of them all.


Intel actually intended for LGA1151 to remain unchanged for Coffee Lake but found out late in the testing process that many existing motherboards did not have enough power delivery capability to support the planned 6 and 8 core parts. Hence the decision to lock them out in software only. They are probably aware of the bad optics but decided that it’s better than trying to deal with the RMAs later.

It’s very similar to what happened in 2006 when the 65nm Core 2 series was released in the same LGA775 package used by 90nm Pentium 4s; however, the former mandated a specific VRM standard that not all contemporary motherboards supported. Later 45nm parts pretty much required a new motherboard despite having the same socket, again due to power supply issues.

AMD went the other route when they first introduced their 12 and 16 core parts to the AM4 socket. A lot of older motherboards were clearly struggling to cope with the power draw but AMD got to keep their implicit promise of all-round compatibility. Later on AMD tried to silently drop support for older motherboards when the Ryzen 5000 series were introduced but had to back down after some backlash. Unlike the blue brand they could not afford to offend the fanboys.

P.S. Despite the usual complaints, most previous Intel socket changes actually had valid technical reasons for them:

- LGA1155: Major change to integrated GPU, also fixed the weird pin assignment of LGA1156 which made board layout a major pain.

- LGA1150: Introduction of on-die voltage regulation (FIVR)

- LGA1151: Initial support for DDR4 and separate clock domains

This leaves the LGA1200 as the only example where there really isn’t any justification for its existence.


Thank you for providing valuable insight. I wish these kinds of comments would end up at the top instead of the usual low quality "hurr-$COMPANY evil, it's all because greedy obsolescence-durr" from people who have no idea how CPUs and motherboards work together, or the compatibility challenges that come when spinning new CPU designs with big differences that aren't visible to the layman who just counts the number of cores and thinks there can't possibly be more under-the-hood changes beyond their $DAYJOB comprehension.

Here's a video from Gamers Nexus on AMD's HW testing lab, just to understand the depth and breadth of how much HW and compatibility testing goes into a new CPU, and that's only what they can talk about in public: https://www.youtube.com/watch?v=7H4eg2jOvVw


> LGA1155: Major change to integrated GPU, also fixed the weird pin assignment of LGA1156 which made board layout a major pain.

Of course, the P67 chipset was trivially electrically compatible with LGA1156 CPUs; Asrock's P67 Transformer motherboard proved that conclusively.

That said, the main problem with 1155 was their locking down the clock dividers, so the BCLK overclocking you could do with 1156 platforms was completely removed (even though every chip in the Sandy Bridge lineup could do 4.4GHz without any problem). This was the beginning of the "we're intentionally limiting our processor performance due to zero competition" days.

> LGA1150: Introduction of on-die voltage regulation (FIVR)

Which they would proceed to remove from the die in later generations, if I recall correctly. (And yes, Haswell was a generation with ~0% IPC uplift so no big loss there, but still.)


> P67 chipset was trivially electrically compatible with LGA1156 CPUs

Well it’s possible to shoehorn in support for the determined, but iGPU support is definitely out of reach and I am not sure what segment of the market that would be targeted at. Seems like an excuse for AsRock to get rid of their excess stock. The socket change was actually very well received by everybody in the industry.

> Haswell was a generation with ~0% IPC uplift so no big loss there

You are right that FIVR did not last long in that particular iteration. However Haswell does have a 10% to 30% IPC advantage over the previous gen depending on the test[1].

Haswell also added AVX2 instructions, which means that it will still run the latest games, whereas anything older is up to the whims of the developer (and sometimes Denuvo, sadly).

[1]: https://www.anandtech.com/show/9483/intel-skylake-review-670...


Why does a socket that can support DDR3 or DDR4 need to be different from a socket that only supports DDR3?

And with the current socket being 1700, they're going to change it again for the next generation to 1851, and with a quick look I don't see any feature changes that are motivating the change. (Upgrading 4 of the PCIe lanes to match the speed of the other 16 definitely does not count as something that motivates a socket change.)

So by my reckoning, half their desktop socket changes in the last decade have been unnecessary.


Because DDR4 is electrically different and memory controllers are all on-die.

Intel could get away with doing that pre-Nehalem because the memory was connected via the northbridge and not directly (which is what AMD was doing at the time; their CPUs outperformed Intel's partially due to that), so the CPU could be memory-agnostic.

AMD would later need to switch to a new socket to run DDR3 RAM, but that socket was physically compatible with AM2 (AM3 CPUs would have both DDR2 and DDR3 memory controllers and switch depending on which memory they were paired with; AM3+ CPUs would do away with that though).

There were some benefits to doing that; the last time Intel realized them was in 2001 when RD-RAM turned out to be a dead-end. Socket 423 processors would ultimately prove compatible with RDRAM, SDRAM, and DDR SDRAM.


> Because DDR4 is electrically different and memory controllers are all on-die.

Them being on-die is exactly why you don't need a socket change to take full advantage of DDR4, since they directly showed a socket can support both at once. Unless you're particularly worried about people trying to buy a new motherboard for their existing CPU, but who does that? You can tell them no without blocking CPU upgrades for everyone else.

> pre-Nehalem

> AM3 CPUs would have both DDR2 and DDR3 memory controllers

LGA1151 supported both DDR3 and DDR4.


>many existing motherboards did not have enough power delivery capability to support the planned 6 and 8 core part

Coffee Lake's 8700K had 6 cores, not 8; the Coffee Lake-S refresh did feature 8 cores. The release TDP of the 8700K was the same as the 6700K - 95W; of course it consumed more at peak and when overclocked.

The support would still depend on the motherboard manufacturers adding the CPU support (along w/ the microcode) to their BIOS. Many high-end boards, e.g asrock z170 extreme6/7, would have supported 8700k.

The situation is no different today in that many (most) boards do not support top-end processors anyway, due to poor VRMs, even when the socket is the same (or, if they do support them, the CPUs are effectively power throttled).

