
This is addressed in https://open.spotify.com/episode/2z6OL9jLyFQh9juJHJ4DNa?si=X... if you're interested in the creator's reasoning.


The dongle is compatible with Macs too. The only issue is that you experience some interference if you plug it into a USB-C multi-adapter. I resolved this by using a USB extender cable, and I've had zero interference since I started using this setup.


To clarify, Logitech makes two MX Master mice: one generic and one "Mac specific", which bundles a USB-C cable and no dongle adapter. On their website, they do not reference the Mac version as compatible with the optional dongle.

Edited: forgot the words "do not reference"


Previous Discussion from 2019: https://news.ycombinator.com/item?id=20124018


The Reuters article [0] offers more details on this. Perhaps this speech was a catalyst for the action, but scrutiny of the organization seems to be something that has long been on the agenda of the Chinese government.

[0] https://www.reuters.com/article/ant-group-ipo-suspension-reg...


I believe a few automotive companies agree that touch screens are a bad interface: https://www.motorauthority.com/news/1121372_why-mazda-is-pur...


This is going to be more obtuse than you would expect, but I can give it a go.

This is because efficiency isn't a goal; impact is. You might be efficient enough to iterate 10 times a month, but that is not what a customer wants. Planning goes a long way compared to just doing something. As software engineers we often mistake efficiency for "getting shit done", which is great, but there are too many instances when it is shit. The bigger the org and the user base / affected population, the bigger the impact of what you do. I've known users to shift loyalty due to small mistakes that companies make. As a dev deeply connected to the impact of your product, you realize that decisions make a big difference: you plan ahead and mitigate probable failures, and these require time. A good product can go a longer way than an efficient team with no direction.

This does not imply that everything takes time, but the repercussions of a bad decision can be lasting in a big organisation. You have a large number of stakeholders involved who would want to, and can, help you achieve your goal. This is not always good, however: the greater amount of red tape involved can lead to competitors gaining an edge, and also to several cases of "I fucked up".

The bottom line: efficiency isn't merely moving fast. It's about building things which matter and having an impact. If you look at efficiency in that manner, then big orgs aren't any less efficient than smaller ones, since their reach is much greater. Perspective matters.



Thanks. Here is the comment from a GitHub engineer addressing the root cause:

https://github.com/cocoapods/cocoapods/issues/4989#issuecomm...


How does IPFS deal with dynamic content? And how would you make sure that everyone uses an updated version of the website?


I was worried about that too.

Scheduled republication is my best answer so far.

If you promised to sign and republish the same file every day with a new timestamp, then people would know when they had the latest and when they didn't... they'd just have to wonder if you fell off the earth, which we sort of do already with all those abandoned free software projects online.

Republication may be cost prohibitive for large files, so instead you could republish a metadata file that pointed to the latest hash as of the metadata file's publication time.
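
A minimal sketch of what that might look like (TypeScript with Node's crypto module; the file name "latest.json" and the JSON shape are just conventions I made up for illustration, not anything IPFS defines):

    // Hypothetical daily republication: sign a tiny pointer file that names
    // the latest content hash and a fresh timestamp.
    import { createSign } from "crypto";
    import { writeFileSync } from "fs";

    interface LatestPointer {
      latestHash: string;  // hash of the current (possibly large) content
      publishedAt: string; // ISO timestamp proving freshness
    }

    function republishPointer(latestHash: string, privateKeyPem: string): void {
      const pointer: LatestPointer = {
        latestHash,
        publishedAt: new Date().toISOString(),
      };
      const body = JSON.stringify(pointer);

      // Sign the body so readers can verify the publisher, not just the data.
      const signer = createSign("SHA256");
      signer.update(body);
      const signature = signer.sign(privateKeyPem, "base64");

      writeFileSync("latest.json", JSON.stringify({ body, signature }));
      // Only this small latest.json gets republished every day; the large
      // file it points to never has to be republished.
    }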

For the "hit by a bus" problem (or for a server doing this automatically, the "hit by a comet" problem?), it'd be nice to include a dead man's switch from a third party, where they can publish a "FINAL -- EXPECT NO MORE UPDATES"...

But at that point you're trusting a third party. If you're willing to trust a third party, this is far easier. So that might be what we'd end up with... something like DNS providers, but they're suddenly managing indexes and metadata for hosted files? I don't know...
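
To make the third-party piece concrete, here's a rough dead man's switch loop, continuing the sketch above (fetchLatestPointer and publishFinalNotice are assumed helpers, not real APIs):

    // Hypothetical third party: if the publisher's pointer hasn't been
    // refreshed within the agreed window, publish a final notice.
    const MAX_SILENCE_MS = 7 * 24 * 60 * 60 * 1000; // e.g. one week of silence

    async function checkPublisher(
      fetchLatestPointer: () => Promise<{ publishedAt: string }>,
      publishFinalNotice: (msg: string) => Promise<void>,
    ): Promise<void> {
      const pointer = await fetchLatestPointer();
      const ageMs = Date.now() - new Date(pointer.publishedAt).getTime();
      if (ageMs > MAX_SILENCE_MS) {
        // The publisher went silent; mark the feed as final.
        await publishFinalNotice("FINAL -- EXPECT NO MORE UPDATES");
      }
    }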

(Also, this has probably been worked out already by smarter people than me, I haven't looked at IPFS much, this was just a back of the napkin guess.)


See PubSub: https://ipfs.io/blog/29-js-ipfs-pubsub/

You can use it as the base for CRDT structures: https://ipfs.io/blog/30-js-ipfs-crdts.md
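
Roughly, the pubsub flow from the first post looks like this (TypeScript; treat the exact setup and topic/payload names as my assumptions, the linked posts are the authoritative source):

    // Sketch of js-ipfs pubsub based on the linked post.
    import { create } from "ipfs-core";

    async function main(): Promise<void> {
      const ipfs = await create();
      const topic = "my-app-updates"; // hypothetical topic name

      // Every peer subscribed to the topic receives each published message.
      await ipfs.pubsub.subscribe(topic, (msg) => {
        console.log(new TextDecoder().decode(msg.data));
      });

      // A publisher can push e.g. the hash of the newest content to peers.
      await ipfs.pubsub.publish(topic, new TextEncoder().encode("<latest hash>"));
    }

    main().catch(console.error);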


While I love the advice, my qualm is with the measurement of productivity. I use Qbserve too; checking my logs, I see 307 hours as an average over the past 4 months (this includes office work, my personal projects, and my studies, a distance master's). But not every moment you spend in an IDE, terminal, or <insert other tools associated with productivity> is productive. I know that I spend 20% of my time yak shaving, trying to figure out why something simple is broken. That is not time I would count as productive.

Measuring productivity as the completion of goals / output generated per hour spent seems to be a more viable metric; however, goals get modified too. Plus, this is a much more time-consuming process than just measuring hours.

I find measuring time spent on productive apps to be analogous to measuring productivity using LOC: both are opaque and do not convey the actual meaning of what they are measuring.


I really do not expect such journalism from The New York Times:

>But a backdoor is not necessary. When a user installs Kaspersky Lab software, the company gets an all-access pass to every corner of a user’s computer network, including all applications, files and emails.

Isn't this true of all antivirus software?

>The Kremlin hacked our presidential election, is waging a cyberwar against our NATO allies and is probing opportunities to use similar tactics against democracies worldwide

Any proof for this?

Just realized that this was written by a Democratic senator who took a stand against Kaspersky. That explains the article's lack of balance and its tone.

Also, I would have a problem with anyone having my data, be it Symantec and the NSA or Kaspersky and the KGB.


The idea of the Russians hacking the election is just a signal of political bias. There isn't even any accusation of miscounting votes or any typical election fraud. US voters still got what they voted for, exactly as the system is supposed to work.



The NYT has been quite biased for a long time. I think the last time they did any real journalism was during the first major WikiLeaks story, the one where Greenwald played a big part.

