
Perhaps paranoia on my part but this opens them up to deprecate that support whenever they feel like it.


If they have Bluetooth, how do they "deprecate support"?


> nor do I want to invest any time learning about headphones

https://www.nytimes.com/wirecutter/reviews/best-over-ear-hea...

~ 5 minutes of reading time and done.


I used that guide a few months ago and bought the Bose NC 700, Wirecutter's recommended high-end option.

I've had a huge list of problems with it:

- The noise cancellation sets itself to maximum power every time you put it on and there's no way to completely turn off noise cancellation when using a computer; here's a thread of people complaining about it: https://community.bose.com/t5/Around-On-Ear-Headphones/NC-70...

- The only way to use it with no noise cancellation is to pair it with your phone, install the Bose app (there are three, make sure you use the right one), open the app, find the settings (which are not easy to find), turn it off, and repeat this process EVERY TIME you use the headphones (with plenty of lag at every step, of course)

- It requires a Bluetooth connection to use the mic (the cable only supports speakers), which means I can't use it in Boot Camp because Boot Camp Bluetooth is currently broken

- It mispronounces my name every time I put it on, with no way to fix it, which gets very aggravating

- There's sometimes weird static noise when connected to my Android phone, and the headphones automatically reconnect to my phone every time I turn them on, even after I manually disconnect them every time (my AirPods Pro don't do this, so it's not just a problem all headphones share)

It's otherwise pretty good, but these flaws are really frustrating considering how much I paid for it.

The reason people are willing to pay more for Apple is because Apple products usually don't have problems like these.


How is this helpful? You're not proving anything, and likely just pissing off GP.


How is it helpful to post a buying guide for over-ear headphones? It solves the exact problem the OP pointed out: that they don't know what to buy and don't want to spend the time researching themselves. It gives you an answer right at the top of the page, from someone who did the research for you.

EDIT: would love to continue this conversation with those who replied to me but HN has decided I am "posting too fast" (my last post was an hour ago?) so... shrug.


There are many reasons why this doesn't always work.

First of all, you have to trust that the reviewer is truthful (this case, the NY Times, might be given a pass). So you either check a bunch of reviews and frequently come out more confused than you went in, even as someone who's knowledgeable about the subject, or you just buy something from a brand you already trust.

Secondly, the problem with reviews is that they're done by specialists. Take movies as an example: I've watched some of the movies that are top-rated by critics, and I've frequently found that I just don't like them. Too artsy. You know what they say about specialists: they know more and more about less and less until they know everything about nothing. At some point specialists basically turn into aliens; their tastes stop intersecting with those of the average J. Doe.

In many cases you just want a brain-dead solution to a problem because you're using your neurons for something else. And one of the safest brain-dead solutions is to go with a brand you already trust. Another one is to go with the crowd. Apple is kind of both.

And I say this as someone that generally avoids Apple products :-)


>nor do I want to invest any time learning about headphones

That categorically sounds like not wanting to read any such articles. I highly doubt the barrier is to do with Google skills or finding them in the first place. GP just doesn’t want to. Not to mention most people will not feel they have sufficiently researched a topic by reading the first piece they see.


OT, but regarding your edit: it means you've been rate-limited.

https://news.ycombinator.com/item?id=16443213

> We rate limit accounts when they post too many low-quality comments too quickly and/or get involved in flamewars.


Thanks, I suspected as much. I'd contend that I've done neither of those things but will instead take it as an opportunity to re-evaluate my choice in even spending time here. Time to sign out, I think.


No doubt Apple will have made a great product, but that doesn't oblige you to buy one! What would the AirPods Max bring you that your existing AirPods Pro don't? What problem will they solve? It sounds like you're just buying things because Apple decided to sell them.


Buying Apple also buys status.


Excuse me, what?! How do you define status? Owning Apple gear?


Owning Apple gear signals status among a significant portion of the population. Maybe not for you, me, or the majority of HN's clientele. But even for those who don't buy into this definition, adhering to it can still bring tactical social advantages.

I have a very hard time believing you're not aware of this, and feigning ignorance in order to criticize the fact is just disingenuous.


Could you please define what status advantages Apple gear offers?


Apple gear is, to a degree, a classic Veblen good. [1]

[1] https://en.wikipedia.org/wiki/Veblen_good
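
Loosely formalized, and just restating the linked definition rather than claiming anything about Apple's actual demand curve: an ordinary good has demand Q falling as price P rises, while a Veblen good has, over some limited price interval,

    \frac{dQ}{dP} > 0 \quad \text{for } P \in (P_1, P_2)

because the price itself carries the status signal. Outside that interval, ordinary downward-sloping demand resumes, which is why "just keep raising the price" can't work indefinitely.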


Then why did Apple lower the prices of iPhones, iPads, and MacBook Airs?

According to that link’s definition of a “Veblen” good, if Apple had raised prices, then demand would have increased. Either someone in charge of pricing Apple products missed out on huge revenue increases by failing to raise the price, or it’s not a Veblen good.

And there's no contesting that the amortized cost of an iPhone SE, iPad Mini, or MacBook Air gets you the lowest cost-to-quality ratio of any competing product, while also lasting the longest.


They didn't.

The iPad Pro, the iPhone X, and the Mac Pro were all shockingly expensive. I remember the chatter on Hacker News like it was yesterday.

The first two sold like gangbusters and the last is less of a Veblen good and more of an anchor price for the rest of their lineup.

Apple's 'affordable' version of a product is always "a good price for an expensive X", rather than actually cheap. And they go up from there.


And the iPhone SE, iPad Mini, and MacBook Air are extremely competitively priced, especially if you amortize over their lifetimes versus competing products' lifetimes.

There isn't even a competing product for iPad Mini.

Maybe a case can be made that the highest-end Apple products are Veblen goods, but that's a far less interesting claim than that all Apple products are Veblen goods.


Veblen goods as defined on the Wikipedia page (Veblen himself is more careful) are a bit of a spook: a product that sells more every time the price is raised, without limit, is of course impossible.

In contradiction to your post which I replied to, Apple has raised the price of their flagship iPad and iPhone, and continued to bust through sales records. That's Veblen enough for me.

They have also lowered the price of their most affordable product in a category, and in any case, this being tech, each new product isn't the same as the predecessor, which complicates the analysis.

I don't think "people buy the latest flagship iPhone because it's expensive" is as true for Apple as it is for, say, BMW.

But it's reasonably true, and there's a trope to go with it: Apple keeps releasing products where the commentariat say "that's so much money for an X! who is going to buy this?" and then sells millions of that product.


>In contradiction to your post which I replied to, Apple has raised the price of their flagship iPad and iPhone, and continued to bust through sales records. That's Veblen enough for me.

Seems like a useless concept to me. If the flagship phones are getting better and better each year, why is it noteworthy that the price might increase? Is there any business that sells a better product at the same price as the inferior one it replaced?


You're right, there's not a direct and linear increase in demand as price goes up. I concede my Veblen point.


And what's wrong with that, if you can afford it, like the OP said they could?


Waste. Physical electronic waste and opportunity cost.

For example, these cost more money than an average resident of Sierra Leone earns in a month. You could donate to feed their family.

Or for $550, you could spend $300 on buying a Kenyan family 5 sheep, $100 on gifting 5 Serbian women a month of literacy classes each, and $150 on 15 rides to school for refugee children in Egypt. Taxes not included.¹

¹https://cwsbestgift.org/


Or you could buy yourself a really nice set of $200 headphones and still send $350 to charity.


You're asking what's wrong with wasting money?


> They should then think again about their choice of using teams.

What percentage of Teams users do you think have a choice in their use of Teams?


If it's on their work machines then it primarily endangers their employer's data, much less their own.


Funny thing to say when we're in the middle of a global pandemic, and more people are working from home than ever.

I work at a university and I've been forced to install that crap on my home computer because I need to teach from home. And so do all the professors at around half the universities I know in my country.


Interesting, I'm surprised that they don't have to provide you with the tools needed to do your job!

In Australia, the employer is generally responsible for providing any necessary tools or equipment needed to do the job (contractors are another matter, though).


In normal circumstances they do provide the tools needed for the job, as they should. But this was a sudden state of emergency triggered by a pandemic, there were no funds, reactions weren't fast enough... so basically, they didn't.

Anyway, those of us who have research projects (as is my case) typically do have computers provided by the university at home, because research has strange schedules and working from home has always been a need (meeting with colleagues in different timezones, waiting for experiments to complete at night, rushing for deadlines, etc.).

But... it's not really practical to make room for two different desktop computers for my own use in an already space-starved flat, or to work on a laptop for many hours when I could do so on a desktop. So in practice, my home computer and my work computer are one and the same. And it's like that here for most, if not all, people I know.

We are a Latin country and also tend to live in small flats; maybe in other places it's different. I can imagine that if I had one of those American McMansions, it would make sense to have a home office with a sober, black work computer, a good camera setup and a green screen, and then a gaming room with a flashy gaming computer and huge speakers (near the billiards and darts room, probably :)). But that's not really how things work around here. Here, keeping separate home and work computers at home is almost exclusive to jobs with high security restrictions. Most people in normal jobs just don't do it because it's not practical.


And then when the company loses business from the disruption, do you think employees walk away scot-free?


I consider that an inherent risk. Not getting a raise because the company made business decisions that turned out suboptimal (such as gaining short-term profits by not investing in IT security) is a risk that any employee faces. If you want a more stable environment you go for a more risk-averse employer, perhaps even public sector jobs.


That's a silly proposition. If my field of expertise is inherently private-sector, I don't have that choice. Also, I can't solve for every variable when searching for jobs. I choose among the ones I get an offer for, and obviously their IT decisions aren't top of my list (nor do I know what those are prior to hitting the desk).


Ruining companies that can't (or won't) get their act together (whether it's security, finance or any other critical and undervalued area) is a short-term pain that fixes the issue. Refusing to fix it simply prolongs the problem. At some point you have to say "enough is enough" and tear the bandaid off; if you don't, or you don't do so with severe enough consequences, businesses will simply ignore what they're being asked to do.

Necessity is the mother of invention; I have no doubt that the opportunities created by blowing away poorly-behaved incumbents will produce a healthy collection of startups operating within the required framework.


You may not see yourself as having a choice, but that wasn't really my point. What I was getting at is that being an employee in general comes with a diffuse risk from many factors that can result in not getting a raise, or even in the company going bankrupt. Many of them are outside your direct responsibility or influence, and yet you take on the whole risk package when joining that company. The company getting ransomwared is just one more factor; it's not special. Granted, one issue with it is that it requires criminal activity, so it drags us down to a worse equilibrium where more resources have to be spent on countermeasures. But arguably that cat is out of the bag, so the next best thing we can do is make security best practices easy. And Microsoft wasn't doing its part here.


The OP is asking for more detail than “not enough”, though:

“Can it escape the Chromium renderer sandbox? Or is that sandbox disabled?”


To simplify: no, it's not enabled.

The real answer is more complicated, as it's not necessarily a global setting and depends on what you call a "sandbox".


Thanks. I'd pay (moderately) for the more complicated answer. An ebook on Electron security might be a good idea.


I'm not an expert on Electron security!

But if that wasn't addressed to me: there's no need to pay, you can start here:

- https://www.electronjs.org/docs/tutorial/security

- https://github.com/electron/electron/security/advisories

As you can see, there are plenty of considerations and pitfalls to take into account. The best option is to enable contextIsolation for everything.

Further, Electron security is closely tied to Chrome security, so that is one deep rabbit hole.
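
For concreteness, here is a minimal sketch of the contextIsolation advice above, using Electron's documented BrowserWindow options; the preload file name and the URL are illustrative placeholders, not from any real app:

    // main.ts: hardened BrowserWindow sketch, per the Electron security
    // tutorial linked above. File name and URL are placeholders.
    import { app, BrowserWindow } from 'electron';
    import * as path from 'path';

    app.whenReady().then(() => {
      const win = new BrowserWindow({
        webPreferences: {
          contextIsolation: true,  // keep page scripts out of the preload's JS context
          nodeIntegration: false,  // no Node.js APIs in the renderer
          sandbox: true,           // run the renderer in Chromium's OS-level sandbox
          preload: path.join(__dirname, 'preload.js'),
        },
      });
      win.loadURL('https://example.com/');
    });

With contextIsolation enabled, the preload script has to use contextBridge.exposeInMainWorld to hand the page a deliberately narrow API surface instead of raw Node access, which is where a lot of the pitfalls tend to live.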


The best Electron security is not using it in the first place.


Yeah, let's stick with raw C/C++, that would be much safer...

Or maybe let's use some research language made by Wirth, and get access to all 10 packages and 5 devs worldwide using it :-)


For starters, leave it in the browser.

I didn't mention any programming language.


Telegram Desktop is a cross-platform C++ app. What similar remote code execution exploit has existed in the wild for it?



One of them requires the user to click run on a file, much like running an EXE. The other simply saves potentially malicious data to external storage, which would then have to be run by a separate malicious third-party app. These are far from RCE exploits that execute immediately without poor user decision-making, and Rust is not impervious to security exploits similar to these.


C'mon. Just because there is one C++ app without remote exploits doesn't mean all C++ apps are immune.


FYI, it's not just the PL that factors into security. The engineers, for example.


Rather just keep it in the browser? ;-P


This is safer to a significant degree.


I doubt that very much. The iPhone was a true revelation: in user interface, in form factor, in the things it allowed you to do.

The M1 is amazing but it doesn’t compare. Performance (both raw and thermal) is a big boost but it’s not going to reinvent computing.


Loads of people didn’t think much of the iPhone when it came out either.

No physical keyboard, other devices have large touch screens, no stylus, can’t copy and paste. Appreciating the iPhone is largely done in hindsight.

Best not to make predictions too far into the future.


> Appreciating the iPhone is largely done in hindsight.

I’m sorry but that’s definitely not true. Yes, there were a vocal minority of folks who said the iPhone was nothing special but the mass market response was huge and it was heralded as a huge deal by the vast majority.

And to go back to my original point: there’s a difference between “this new invention allows you to do new things, but those things aren’t important” and “this new invention does not allow you to do new things”. Reaction to the iPhone was the former. My reaction to the M1 is the latter. It allows you to do the things you already do faster and cooler and it’s a big technological shift. But it isn’t going to change the way people use computers in the way the iPhone did.


> My reaction to the M1 is the latter. It allows you to do the things you already do faster and cooler and it's a big technological shift. But it isn't going to change the way people use computers in the way the iPhone did.

I think the point is that they have a single unified platform now (or soon). I think MS was right to have the Surface run Windows and to build touch into Windows. But that never really took off.

Now, if I’m not mistaken, the M1 is giving you full performance for a work day running off a battery. And you can run iOS apps on Big Sur.

I think Apple will pull off what MS couldn’t. Single platform, multiple form factors, all capable of running the same software because the hardware is literally the same. Without just shoving everything into a web browser, because native apps are almost always better. And Apple has all the pieces of the puzzle: desktop, laptops, tablets, phones.


Did you forget when the CEO of Microsoft laughed it off and said it was crap? [1]

[1] https://www.youtube.com/watch?v=eywi0h_Y5_U


I mean yeah, what else would the CEO of a competitor say?


Like I said, there was a vocal minority. Did you forget that he was widely mocked for being so dismissive of an obviously very impressive product?


To be fair, if you remember 2007, the first generation was lacking in many areas. You couldn't send MMS messages. Receiving MMS was awkward and was done through an AT&T web site. You couldn't install apps. You couldn't take videos, only photos. It was slow, etc.

My first iPhone was a 3G, but I felt the iPhone really started taking off with the iPhone 4. It felt like a huge leap.


It won't reinvent UI, but it signaled the change to ARM and, by extension, RISC.

To a normal consumer, that's not a big deal (although 17-hour battery life is already pretty huge). To future PC hardware and server/backend infrastructure, it's a huge deal.


Oh, for sure. It's going to be a big deal to us techies and represents a huge shift in terms of who holds the power in processors. (That said, the vast majority of PCs won't be able to use it!)

But entire industries (e.g. ride hailing) were created in the aftermath of the iPhone’s launch. I just don’t think we’ll see that kind of change reflected in most people’s lives.


The big boost here is commodity ARM instances in the cloud. These instances are already more power-efficient and cheaper than their Intel counterparts, but developer machines used to be all Intel. Now that has changed.


It’s not the same, but this is significantly accelerating the end of x86, and the end of Intel (at least the end of the current dominance of both). Such shifts don’t come along that often.


To put your comment in perspective:

Intel has 90+% market share in servers.

Apple has 15% market share in personal computers.


Sure, but I didn’t mean that the whole world will switch to Apple. The world will mostly switch to commodity ARM chips and Intel's margins will crumble. Might take 5-10 years.

This is Intel’s Nokia moment, when they lost a dominant position to Samsung/Android. Apple was the catalyst there as well, but Apple did not directly kill Nokia. The industry shift Apple sparked killed them.


I disagree... laptops and desktops have been “dead” forever. The release cycles are boring and the computers even more so. This chip is a big change - Apple came out of the gate offering it in 3 form factors. That’s meaningful.

For the task worker populations at my employer, we only committed serious cash to upgrades when Windows 10 started cutting off support. The funny thing is that in some ways the newer devices are slower than the old Haswell stuff. Cheaper laptops are lighter but have awful thermals.


I was a little confused by this part, because MS is already happy for Office to run everywhere, in browsers. Given the in-house expertise that's given us VS Code, I wouldn't be at all surprised to see an Electron-powered Office app suite in the not too distant future. Linux would (and does) benefit just as much as Mac.


As someone living in NYC I can attest that the local/express lines are great. But I'd kill for 95% reliability! The on-time performance in NYC is somewhere around 80% and likely to get worse with the huge budget issues NYC will have post-COVID.


Keep in mind that the Paris metro has 0% reliability if you need to ride it between 2AM and 5:30AM. Or rather it is 100% reliably not running during those hours. NYC could probably boost those numbers if they could do track maintenance during guaranteed non-running hours.


NYC can already do that: it's not uncommon for express lines to be shut down overnight for work to be completed, for example.

(the entire system is shut down overnight right now anyway)


I was talking about the scheduled time - when it must work, but it doesn't.


What do you mean by "on-time performance"? I lived for 6 months in Manhattan on 112th and it worked fine. Probably one or two times my station was closed in the morning and I had to walk down to the next one.

And as someone who lived in Moscow for a long time, I can attest it is the gold standard. It's the second busiest (or top 3) in the world and just always works as scheduled. One of the rare things coming from the government that one could rely on.


The NYC Subway has a timetable, except no one gives a shit because it has very little relation to real life.


That’s pretty strange though, I don’t think I’ve seen a subway with a time table before. It’s too frequent for it to matter in the first place, you just need to know when the next train towards your destination is coming!


It has a timetable primarily because

- due to all the merging in and out throughout the system, trains need to be in the right place at the right time or merging delays will cascade throughout the system

- for purposes of employee scheduling, you need to make sure that the right employees are in the right place at the right time; generally a train leaving a terminal is being driven by a crew that had to come in from somewhere else.


It's also still useful as a general approximation for "how long should I expect to wait for the next train". Same with the bus lines.


Isn't that a different idea than a timetable? "We're aiming for one train every 10 minutes" and "there's a train at 12:10, 12:20, 12:30" are similar, but not quite the same. You can get your approximation without a timetable, though.
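
The gap between those two ideas matters more than it looks, though. For a passenger arriving at a random time, the average wait is half the headway only when service is perfectly even; bunching pushes it up even when the trains-per-hour count stays the same. A quick sketch, with made-up numbers:

    // Average wait for a randomly arriving passenger over a cycle of headways
    // (minutes between consecutive trains): E[wait] = sum(h_i^2) / (2 * sum(h_i)).
    function averageWait(headways: number[]): number {
      const total = headways.reduce((acc, h) => acc + h, 0);
      const sumSquares = headways.reduce((acc, h) => acc + h * h, 0);
      return sumSquares / (2 * total);
    }

    averageWait([10, 10]); // even 10-minute service        -> 5.0 minutes
    averageWait([2, 18]);  // same trains per hour, bunched -> 8.2 minutes

So "roughly every 10 minutes" and an actual timetable aren't interchangeable, and the timetable is also what lets you detect the bunching in the first place.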


Tracked vehicles require centralised traffic management and scheduling as they cannot casually overtake one another as street traffic can, and setting switching points and clearing control blocks is required for safety.

Passengers transferring to other lines or transit modes may also appreciate predictability.

https://en.wikipedia.org/wiki/Centralized_traffic_control


Available at https://new.mta.info/schedules

If you don't have one, how do you measure the performance of the system?


Various approaches.

One could make the argument that a train showing up ten minutes late vs. one on time doesn't really matter to a person, so long as it shows up within a reasonable time of them getting to the station and still takes them to their destination on time. If the headways are five minutes and every single train is running at the same speed, just five minutes late, this is a distinction that doesn't matter to the passenger.
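
To make that concrete, a toy comparison; the two-minute tolerance and the times are illustrative, not the MTA's actual methodology:

    // Two ways to score the same service; times are minutes past the hour.
    const scheduled = [0, 5, 10, 15, 20];
    const actual = [5, 10, 15, 20, 25]; // every train exactly 5 minutes late

    // Timetable view: share of trains within `tolerance` minutes of schedule.
    const onTimeShare = (tolerance: number) =>
      scheduled.filter((s, i) => Math.abs(actual[i] - s) <= tolerance).length /
      scheduled.length;

    // Passenger view: the gaps between trains that riders actually experience.
    const headways = actual.slice(1).map((t, i) => t - actual[i]);

    onTimeShare(2); // 0: every train counts as "late"
    headways;       // [5, 5, 5, 5]: service feels perfectly regular

By the timetable measure this service scores zero; by the measure a rider actually feels, nothing is wrong.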


> too frequent

Depends on the line and destination.


Info from the MTA here:

https://www.mta.info/press-release/nyc-transit/new-full-year...

I regret to inform you that the NYC subway has many more problems than you experienced. Before the pandemic I traveled between Brooklyn and Manhattan, and problems were common. Even more common were random slowdowns, sitting between stations, etc., which make it a lot more difficult to know when you'll arrive at your destination.


I'm not sure, but it looks like OTP is a much stricter measure than what I meant. I was talking about: I want to change from line A to B at station X (or exit at station X)... Station X is closed, and I learn about that at station (X-1), or even just by the train not stopping at station X.

It happened in Paris more than an "extraordinary" number of times, so I have to add an extra 15 minutes to every important trip.

OTP sounds like: the train is scheduled at 3:25 and it arrives at 3:26. Correct me if I'm wrong here. But if the normal margin is 5 minutes of total travel time, making it 15 minutes is not nice.


This notion always makes me laugh. Which is better, Google Maps or the old MapQuest pages that had arrows and reloaded the entire page when you wanted to move the map? I know which one I prefer.


That's a lot more overhead. For one, it requires you to spin up an entire JavaScript engine! I don't know if it was ever implemented, but the beauty of doing this at the HTTP level is that (for example) it could be built into the networking layer of iOS or Android.

