Hacker News | naikrovek's comments

5G isn’t inherently any more precise, but because of the higher frequency used in 5G, the radio signals are blocked by obstructions much more easily, so there must be many more 5G radios per unit area to provide coverage. And one side effect of having many more base stations around is that triangulation of a specific phone is much more accurate and precise, because every 5G phone is close to several 5G base stations.

5G infrastructure isn’t limited to tall easily visible radio towers like 4G and before; 5G transmitters are small and relatively inexpensive, making them very common. My employer has a private 5G infrastructure, and we are not related to telecommunications in any way.


> but because of the higher frequency used in 5G

For the most part they use the same or lower frequencies. n71 (600 MHz) is lower than any of the 2G/3G bands and requires less cell density than 3G (UMTS/WCDMA) did.

> 5G infrastructure isn’t limited to tall easily visible radio towers like 4G and before;

Nor were earlier technologies. Distributed antenna systems (DAS) are used in large buildings and cities, and were deployed with 4G as well. Small cells and femtocells have been a thing since at least the 3G era.

> 5G transmitters are small and relatively inexpensive, making them very common.

Transmitter cost wasn't the primary limitation before; the options for unlicensed/lightly-licensed spectrum were limited, and the standards weren't really designed to use them as a primary carrier until NR. Earlier technologies also required far more components to run; the stack for an NR deployment is just smaller.


oh dang i was way off. thank you

This is not true; 5G has multiple positioning improvements that are not related to higher frequencies. 5G has something called the LMF (Location Management Function), which handles positioning of user devices through multiple means: round-trip time, angle of arrival, and dedicated 5G positioning reference signals.

You can read more about 5G positioning here:

https://www.ericsson.com/en/blog/2020/12/5g-positioning--wha...

https://www.ericsson.com/en/blog/2024/11/5g-advanced-positio...

https://arxiv.org/abs/2102.03361

https://research.chalmers.se/publication/542739/file/542739_...
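The geometry behind one of those measurement types can be sketched in miniature. The following toy illustration (not the actual LMF protocol; all station positions and numbers are invented) shows how ranges derived from round-trip time combine via multilateration:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def rtt_to_distance(rtt_seconds):
    # The signal travels to the phone and back, so the one-way range is half.
    return C * rtt_seconds / 2.0

def trilaterate(stations, distances):
    # Solve for (x, y) from three stations and ranges by linearizing:
    # subtracting the first circle equation from the other two leaves
    # a 2x2 linear system.
    (x1, y1), (x2, y2), (x3, y3) = stations
    d1, d2, d3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Invented geometry: three base stations and a phone at (120, 90) metres.
stations = [(0.0, 0.0), (400.0, 0.0), (0.0, 300.0)]
true_pos = (120.0, 90.0)
rtts = [2 * math.dist(s, true_pos) / C for s in stations]
est = trilaterate(stations, [rtt_to_distance(t) for t in rtts])
print(est)  # ≈ (120.0, 90.0)
```

A real network uses many noisy measurements and a least-squares solve; the point here is only the geometry, and why denser base stations (shorter, better-conditioned ranges) help accuracy.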


i swore 5G used much higher frequencies (and was therefore blocked by many more things that don't affect 4G and below). I'm glad I'm wrong, thank you.

what a bunch of enormous pussies ICE are, to have tiktok do this... lol, they're children, man. children with guns.

I think you're making that up. It is widely known that tools predate humans.

This is the kind of thing that the browser should not need to do. This is the kind of thing that the operating system should be doing. The operating system (the thing you use to run programs securely) should be securing you from bad anything, not just bad native applications.

A large part of the web is awful because of all the things browsers must do that the operating system should already be doing.

We have all tolerated stagnant operating systems for too long.

Plan 9's inherent per-process namespacing has made me angry at the people behind Windows, macOS, and Linux. If a security feature is not an inherent part of how applications run, then you have to opt in, and that's not really good enough anymore. Security should be the default. It should be inherent, difficult for a layman to turn off, and provided by the operating system. That's what the operating system is for: to run your programs securely.


two years of vibecoding experience already?

his points about why he stopped using AI: these are the things we reluctant AI adopters have been saying since this all started.


The practice is older than the name, which is usually the way: first you start doing something frequently enough that it needs a name, then you come up with the name.

the url "nova-is-here-to-stay" says to me that Nova will be discontinued in about 90 seconds.

"Nova Launcher's Incredible Journey"

Remember this article when you get upset that your own customers have come to rely on behavior that you told them explicitly not to rely on.

If it is possible to figure something out, your customers will eventually figure it out and rely on it.
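A minimal sketch of that dynamic (every name here is hypothetical): a caller string-matches an error message that the library never promised, and it works right up until the maintainer rewords the message:

```python
def parse_port(s):
    # Documented contract: returns an int, or raises ValueError.
    # The exact message text is an observable but *unpromised* detail.
    n = int(s)
    if not 0 < n < 65536:
        raise ValueError(f"port out of range: {n}")
    return n

def caller(s):
    try:
        return parse_port(s)
    except ValueError as e:
        # The customer "figured out" the message format and now relies on it.
        # This works today and breaks the day the message is reworded.
        if "out of range" in str(e):
            return 80  # silently fall back to a default port
        raise

print(caller("8080"))   # 8080
print(caller("99999"))  # 80, via the string-matched fallback
```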


Once a system has a sufficient number of users, it no longer matters what you "explicitly" promised in your documentation or contract.

Hyrum’s Law: all observable behaviors of your system will eventually be depended on by someone.

Even if you tell users not to rely on a specific side effect, once they discover it exists and find it useful, that behavior becomes an implicit part of your system's interface. As a result, engineers often find that "every change breaks someone’s workflow," even when that change is technically a bug fix or a performance improvement.

Reliance on unpromised behavior is something I was also introduced to as Kranz’s Law (or Scrappy's Law), which asserts that things eventually get used for their inherent properties and effects, without regard for their intended purpose.

"I insisted SIGUSR1 and SIGUSR2 be invented for BSD. People were grabbing system signals to mean what they needed them to mean for IPC, so that (for example) some programs that segfaulted would not coredump because SIGSEGV had been hijacked. This is a general principle — people will want to hijack any tools you build, so you have to design them to either be un-hijackable or to be hijacked cleanly. Those are your only choices." —Ken Arnold in The Art Of Unix Programming
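The clean path the quote describes is still how it's done. A minimal POSIX sketch (Python here for brevity; the convention that SIGUSR1 means "reload" is an assumption of this example, not part of any standard):

```python
import os
import signal

reload_requested = False

def on_usr1(signum, frame):
    # SIGUSR1 carries no kernel-assigned meaning, so a process can use it
    # for its own IPC without hijacking SIGSEGV/SIGHUP semantics.
    global reload_requested
    reload_requested = True

signal.signal(signal.SIGUSR1, on_usr1)
os.kill(os.getpid(), signal.SIGUSR1)  # simulate `kill -USR1 <pid>` from outside
print(reload_requested)  # True
```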


Obligatory XKCD for this

https://xkcd.com/1172/


Amtrak is not for someone who simply wants to get from A to B. I suspect a 50-day bus trip would be the same.

When I take Amtrak, it’s because I want to look out of a window for a few dozen hours and see something new (to me) every time I look out the window.

It’s probably the bus trip that they want, and not simply “go to India.”


> I’ve said many times before that I think Finder is the worst default file manager of any popular desktop environment.

[GNOME enters the chat]: "That's nothing, I'm way worse!"


When using Finder on macOS, I often wish I had something as nice, consistent, and usable as Nautilus.

Finder is genuinely horrible. It’s obvious that no one at Apple cares about files anymore, nor about anyone who works with them.

We’re all supposed to consume the cloud these days, or so it seems.


My go-to example would be the long-lasting issues with SMB support in Finder. All operations are very slow, the search almost unusably so. Operations that are instant on every non-Apple device take ages on a Mac. I first ran into these issues seven years ago when I set up my NAS, and they persist to this day. I tried all the random suggestions and terminal commands, but eventually gave up on trying to make it perform as it does on Linux.

With Apple's focus on cloud services, fixing the bugs that prevent the user from working with their local network storage runs contrary to their financial incentives.


Is it actually, though? It’s cool to criticise Nautilus, but at worst it’s just as bad as Finder, which shouldn’t be surprising given how much it’s styled to look like Finder.

However, in my personal opinion, Nautilus’s breadcrumb picker does give it an edge over Finder.

So I stand by my comment that Finder is the worst.


Nautilus opens a new window for every folder you enter. Finder does not.

That used to be a preference; the last time I used it, it no longer was. It is forced on because that’s how the GNOME developers thought you should use it: “Our way or the highway!”

Finder wins based on that alone. Finder wins so completely because of that one single thing that I’ll never voluntarily use GNOME again.


design for design's sake is bad, and that's what Liquid Glass is. There was no thought behind it.

It is 2026 and UIs are still abysmally slow in many cases. How is that even remotely possible? Now, with that in mind, consider (just for a moment) why people might think that UX people don't know what they're doing.


> It is 2026 and UIs are still abysmally slow in many cases. How is that even remotely possible?

Because UI/UX teams were separated from engineering. (The same thing happened with modern building architecture.)

It's fundamentally impossible to optimize if you're unaware of physical constraints.

We need to get rid of the "It's okay to be a UI/UX designer who doesn't code" cult. (Looking at you, Adobe and Figma...)


> Same thing happened with modern building architecture

Yes. Yes, it has. I'm currently in the midst of a building project that's ten months behind schedule (and I do not know how many millions of dollars over budget), and I'd blame every one of the problems on that. I - the IT guy - was involved in the design stage, and now in construction (as in, actually doing physical labor on-site), and I'm the only person who straddles the divide.

It's utterly bizarre, because everyone gets things wrong - architects and engineers don't appreciate physical constraints; construction crews don't understand functional or design considerations - so the only way to get things right is for someone to understand both, but (apart from me, in my area - which is why I make sure to participate at both stages) literally no one on the project does.

Seen from a perspective of incentives I guess I can understand how we got here: the architects and engineers don't have to leave their offices, and are more "productive" in that they can work on more projects per year, and the construction crews can keep on cashing their sweet overtime checks. Holy shit, though, is it dispiriting to watch from a somewhat detached perspective.


Agreed. The further you are away from how a computer works internally, the worse your product for a computer will be.

We have convinced ourselves as an industry that this is not true, but it is true.


> We need to get rid of the "It's okay to be a UI/UX designer who doesn't code" cult.

I don’t think designers who don’t code are really a problem. They just need to dogfood, and be led by someone who cares (and dogfoods as well).


In the case of Apple, I really doubt its designers don't dogfood. Do you expect them to have Android phones and Linux desktops?


I would think like you, but then some of their design decisions are truly baffling. I like the idea of Liquid Glass, but there are thousands of rough edges that scream lack of care.


I have a strong feeling the people working on and approving Liquid Glass didn't dogfood it in dark mode, because it just looked BAD in the first builds available.


I sometimes wonder if anyone in charge at Apple uses Apple devices the way I do. I expect they have one consistent, all-Apple, high-end setup, and it probably works very well for their style. Some things are great, but others are insane, and it seems like that happens most when using things like non-Apple monitors, or not typing a certain way on the phone, or not driving the same car.

Switching windows between two non-Apple monitors after waking from sleep is wildly unpredictable, with insane UX like windows resizing themselves after a drag.

My CarPlay always starts playing something on my car speakers, even when I wasn't listening to anything before connecting. It's so off it's comical.

The iPhone alarm will go off like normal, loudly from the speaker, even if you're currently on the phone and have it up to your ear. This has been a problem since my very first iPhone.

There has been a bug where plugged-in physical headphones are sometimes unrecognized after waking from sleep, even if they worked fine going into sleep. I checked once, probably in 2014, and Apple's official response was that it literally wasn't physically possible, despite all of us experiencing it. The bug was ancient even at that time, and more than ten years later my M4 MacBook Pro STILL DOES IT.

Apple and Apple fanboys seem to take the stance that these are all user error on my part (remember the "you just aren't a Mac person" era?). I bet some of these are configurable with settings deep in a menu somewhere, so from a certain perspective that's right, but it also underscores my point about the limitations of myopic dogfooding.

As a fun aside, the UX for turning on the VoiceOver tutorial is the worst thing I've ever experienced on an Apple device. I was laughing out loud trying to figure out how to get out of it instead of finishing the unknown number of remaining steps. I feel bad for folks who need that accessibility feature in order to be effective.

