
The unusually downvoted answer holds the key to the riddle: it's the hardcoded tribalism making us justify our purchases as if they're political statements.


I do think it has to do with spending money. You don't see the same level of defense for companies with "free" stuff. If Epic had only sued Google over this issue, they'd have had more support.


I think we're like crabs in a bucket, got to pull each other down if we see someone close to getting what they want.


Most people I know just click "yes" on all popups until the website starts working. In between every other website asking for notification/location permissions and a huge pile of GDPR popups (which are actively hostile towards users that attempt to opt-out), we've managed to condition web users into routinely agreeing to give up their privacy and security. Hooray.


> Most people I know just click "yes" on all popups until the website starts working

This. Defaults matter. You cannot introduce a privacy-sensitive feature with just a permission dialog box. You must expect that most users will just click through, and continue to protect those "dumb" users.


Backups protect against a wider class of problems. For example, a software vulnerability/bug or a human error could result in the deletion of an object in all replicas (because deletions are replicated too), but you'd usually need another incident to occur for you to lose the backup as well.


Being in the market for an enthusiast CPU is predicated on being enthusiastic about CPUs in the first place, and that's become somewhat difficult in the last couple of years.


Indeed. My old self-imposed rule was that upgrading was a waste of money unless I could get at least twice the performance. Manufacturers have generated high benchmark scores by throwing "moar cores" at the problem, but even the best chips have barely a 40% single-thread increase over my seven-year-old Haswell build.


40% single-thread increase, but most likely a 100+% increase in core count. CPU manufacturers didn't slap more cores on a CPU just to get high benchmark scores; it's one of the few viable options left for increasing CPU performance.


Check top/task manager from time to time. It's pretty rare for any desktop application to max out more than 2 or 3 cores.

That means those extra cores won't really help day-to-day performance.


If you’re a dev (as I’m sure many here are), many big builds will max all cores for extended periods. I had to upgrade the cooling in my machine because building Chromium would regularly peg all 8 cores for hours, causing thermal throttling to kick in and lower clock speeds.


Sometimes I manage to extract a little more performance by shutting down the virtual (SMT) cores, trading them for lower thermal stress and better cache hit ratios. It's a very YMMV thing that depends on pretty much everything: the CPU microarchitecture, the built-in power management, the size of the fan, whether you polished the heat spreader on the CPU, the programs you are running, and what the data they are processing looks like.
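If you want to experiment with this on Linux, reasonably recent kernels expose a runtime switch for it (a minimal sketch; the sysfs path below assumes the generic SMT control interface, and older kernels may not have it):

  # check the current state: on / off / forceoff / notsupported
  cat /sys/devices/system/cpu/smt/control

  # turn the SMT siblings off until the next reboot (needs root)
  echo off > /sys/devices/system/cpu/smt/control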


My workloads are more of an outlier compared to the general PC user crowd, but I do keep an eye on my CPU usage and it often sits anywhere between 25% and 100%, depending on the situation.

Workloads where more cores really help out are:

* re-encoding video files to save disk space (it's amazing what speeds I can reach with my Ryzen 5 2600X here!)

* IntelliJ (startup, indexing)

* running tests (can be configured to be very parallel)

* running JS-heavy sites, like Jira (you would be surprised at how resource-intensive websites can be nowadays)

* running more than one Electron-based app (Spotify and Slack are surprisingly CPU-intensive, especially if you have poor/no GPU acceleration support)


A single desktop app probably doesn't, but you don't run just a single one: Chrome, IDE, database server, DB client, web server, VMs, Slack/Teams/Gitter, email client. Try provisioning a VM with a single core and running them all at once inside...

EDIT: that said, I wonder whether you'd notice a difference between 4 and 8 cores in normal life


Have you looked at what you wrote? A 40% increase is MASSIVE. Not only that, but multi-threaded the increase is in the several-hundred-percent range.


40% is pretty big, but over 7 years that's not much. It wasn't that long ago that 40% increases were happening every 2 or 3 years.


Gordon Moore would be very disappointed with 40% every 2-3 years. OTOH, he would probably like how GPUs are progressing.

Even though I'd say benchmarking fp16 GFLOPS against fp64 ones is cheating.


I think Moore's law is widely misunderstood as saying that computer chips double in speed every two years (or something to that effect). In fact, what it says is that the number of transistors doubles every two years, which is still roughly true. In 2020, it's hard to buy single-threaded CPU speed by just throwing more transistors on a chip, which is why most performance gains have been found in multithreaded workloads (because you can trivially get more CPU cores by putting more transistors on a chip).


I would love it if I could find out where my OS collects information on how long my CPUs stall due to lack of free execution ports, insufficient parallelism in the code, L1-L3 cache misses, and so on.


If you're on Linux, look at

  perf list 
then

  perf top -e [...]
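For a one-shot summary of exactly the stalls and cache misses asked about above, `perf stat` can be pointed at a program too (a quick sketch; the generic event names below aren't exposed on every CPU, so check `perf list` first, and `./your-program` is just a placeholder):

  # count cycles, front-/back-end stalls and cache misses for the whole run
  perf stat -e cycles,stalled-cycles-frontend,stalled-cycles-backend,cache-misses ./your-program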


Are all your workloads single-/lightly-threaded? My 9700K compiles C++ two to three times faster than my old 4770K.


Looking at the syntax (as a Go beginner), I'm not really sure whether that's less or more to learn. Seems like one of those arbitrary language design decisions that don't make a huge difference either way.


Yup. They saved a keyword or two, but now you need to count the semicolons to see if there is an increment action.

I don't think it makes a big difference either way.
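Purely as an illustration (my own sketch, not from either comment), here is how the single Go `for` keyword covers all three shapes; whether there's an increment clause really does come down to counting the semicolons:

  package main

  import "fmt"

  func main() {
      // three clauses: init; condition; post (the "increment")
      for i := 0; i < 3; i++ {
          fmt.Println(i)
      }

      // condition only: the while-style loop
      n := 8
      for n > 1 {
          n /= 2
      }
      fmt.Println(n)

      // no clauses: the infinite loop, left via break/return
      for {
          break
      }
  }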


Yeah when I picked up Go, the infinite loop caught me by surprise:

  for {
    ...
  }


Looks like a minor simplification of

  for (;;) {
     ...
  }
I'm biased by my experience with C, but I do like the explicit punctuation.


I've recently started using GitJournal [1] to access my notes on Android, but I haven't used it enough to be able to say whether it's worthwhile. One annoying drawback is that it requires a monthly subscription to get access to all of the (convenience) features.

[1] https://play.google.com/store/apps/details?id=io.gitjournal....


@vhanda (unable to reply directly for some reason)

First of all, congratulations on shipping a decent, polished Android app! These seem to be increasingly rare these days.

Regarding the subscription model, the monthly price quickly adds up, especially for users outside the US. For example, one of the most popular apps in my country offers a pro subscription for a yearly price of $3, compared to a minimum of $24 ($2 * 12) for GitJournal (and this would be on top of whatever I'm paying for my desktop note-keeping app...). I would suggest that you review the pricing for low-income countries (especially since the regionalized "go pro" slider UI seems glitchy, e.g. I can't reselect the default price and can't select values to the right). A lower yearly subscription or a single-purchase option would go a long way towards convincing me to upgrade.

I think not including ads was an excellent choice and I probably wouldn't have continued using the app with them. I was going to suggest hiding the locked pro features to make them less annoying, but upon review it's not that bad.

I'll keep using the app and leave a review on the store later when I gather more thoughts.


Hey Resident Sleeper

Thanks a lot for the feedback. Could you let me know what country you are in? I've changed the pricing for the Indian market as I understand it better; I'll be happy to dramatically reduce the pricing for other countries where the US/EU pricing does not work.


Hi. I'm the author of GitJournal - do you have any suggestions on a different monetization model? I'm not too keen on ads.

Maybe a yearly or 3-year subscription?


Parent comment might come off as offensive, but it very accurately represents my experience with several Google apps that I've been using recently (YouTube, Google Play Music/YouTube Music, Google Photos, Google Maps, ...). But I'm on Firefox, so I guess nobody cares.


The experience on iOS is similar - the lack of respect that Google has for basic QA testing has been painfully obvious. I’m not sure why they want to train people to think that their products are only worth using when they’re free, but they’re doing a great job of it.


I'd be surprised to learn that there's any sensible optimization that the compiler/runtime can make to ensure optimal data locality. The problem is difficult enough that the entire Unity engine is being pretty much rewritten to make use of sequential memory access.

In addition, I really, really wish more JS developers actually cared to learn about optimization and about how the runtime/hardware actually works. We've "developed" the web into such a slow, buggy mess that I've given up on the idea that there's any way to fix it. I hope somebody figures out a way to start over, preferably with no scripting capability, because apparently giving people any half-baked scripting language results in them soon developing nuclear footguns with it.


A significant part of the web is fast and works just fine. Sure, the experience on major news and media sites sucks, but you have to realise that it's caused by their monetisation model and not the platform itself.


On the other hand, I'd wager that the set of words you can recognize is vastly larger than the set of words you're likely to come up with on the spot. Hence, using a generator would still result in higher entropy than trying to come up with a password yourself.

Random numbers picked by humans are notoriously biased. I'm guessing it's even worse when you ask them to come up with random words.
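To put rough (illustrative) numbers on it: four words drawn uniformly from the 7776-word Diceware list give log2(7776^4) ≈ 51.7 bits, whereas four words picked from, say, the 500 words a person is most likely to reach for on the spot give log2(500^4) ≈ 35.9 bits, and since human choices aren't uniform the real figure would be lower still.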


A while ago I deleted my Uber/UberEats account (they didn't make it easy, btw) because of how utterly unusable their apps were compared to the competition. After my girlfriend and I spent 30 minutes trying to order food on 3 different platforms, which has been a recurring experience with them for years, I decided it was time to give up. The same obvious UI/UX bugs as over a year ago - I'm actually dumbfounded at how this is possible; do they even use their own app? To sum up, I tend to look at tech coming from Uber through a somewhat sarcastic lens these days.


Try rooting your phone and installing a CPU monitor like https://f-droid.org/en/packages/com.eolwral.osmonitor/ . Then watch for apps burning CPU in the background. Uber might show up (wink).

