Hacker News: PossiblyKyle's comments

My mother has an Android phone, and she’s not really tech savvy. Her notifications are absolutely bombarded with garbage from various websites, and her lock screen displays ads. I’m moving her to iOS, as I’m quite sick of constantly being the IT guy.


Oh, I can relate to this. Bought my mum an Android phone when my daughter was born so I could share photos and do video calls.

It was such a hassle doing support over the phone (she lives in NZ, and I was in Singapore and am now in Taiwan).

Bought her an iPhone 12. Apple support helped her set it up over the phone. Two years and I haven't played IT guy once!


This gets brought up every time the topic shows up, but https://www.nand2tetris.org is a course that builds up how a computer works, layer by layer, and is worth checking out.


Is there a version of something like this that’s purely software?

Like, emulate gates and keep building bigger pieces until you have a function that’s a CPU. Etc.

I guess I’m thinking a bit lower level than an emulator, where you implement an opcode using the normal capabilities of your language of choice.

I guess TFA is pretty close to this in ways, too.
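To make the "purely software" idea concrete, here's a minimal sketch (illustrative Python, not from any particular course): start from a NAND function, derive the other gates from it, then compose a full adder and a 4-bit ripple-carry adder — each layer built only from the one below.

```python
# Everything is derived from NAND, mirroring the gates-up approach.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    return and_(or_(a, b), nand(a, b))

def full_adder(a: int, b: int, cin: int):
    """One-bit adder: returns (sum, carry-out)."""
    s1 = xor(a, b)
    total = xor(s1, cin)
    carry = or_(and_(a, b), and_(s1, cin))
    return total, carry

def add4(a: int, b: int) -> int:
    """Chain four full adders into a 4-bit ripple-carry adder."""
    carry, result = 0, 0
    for i in range(4):
        bit, carry = full_adder((a >> i) & 1, (b >> i) & 1, carry)
        result |= bit << i
    return result & 0xF

print(add4(6, 7))  # 13
```

From here the same composition trick continues: adders plus multiplexers give an ALU, flip-flops give registers, and so on up to a CPU-shaped function.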


I have been building a CPU using: https://github.com/hneemann/Digital

Much faster than Logisim, and the UI is only a little clunky. My CPU runs at around 0.5 MHz, and it has very nice peripherals like Telnet, graphics RAM, VGA, etc.

Terrible name that is hard to google, but great tool.


Yeah. There is this book, which uses the Haskell-based Clash to build up simple CPUs, a VGA controller, and other peripherals.

hardcover:

https://www.lulu.com/shop/gerg%C5%91-%C3%A9rdi/retrocomputin...

or digital:

https://leanpub.com/retroclash/


Oh that’s exactly what I was hoping for! Thank you!


The great Niklaus Wirth wrote a book which you might want to check out: Digital Circuit Design for Computer Science Students: An Introductory Textbook.


In college, we wrote a CPU simulation in C, for a simple architecture called "little computer". You could call it an emulator, but the whole idea was to faithfully implement all the pipelining and register renaming and stuff. A main loop iteration was equivalent to a clock cycle.
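The shape of such a simulator can be sketched in a few lines. This is a much-simplified version (a made-up toy accumulator ISA in Python, with no pipelining or register renaming), but it shows the core idea of a main loop standing in for the clock:

```python
# Toy accumulator machine: each main-loop pass is one fetch/decode/execute step.
LOAD, ADD, STORE, HALT = range(4)  # opcodes for this illustrative ISA

def run(program, memory):
    pc, acc = 0, 0
    while True:
        op, arg = program[pc]      # fetch + decode
        pc += 1
        if op == LOAD:             # execute
            acc = memory[arg]
        elif op == ADD:
            acc += memory[arg]
        elif op == STORE:
            memory[arg] = acc
        elif op == HALT:
            return memory

mem = {0: 2, 1: 3, 2: 0}
prog = [(LOAD, 0), (ADD, 1), (STORE, 2), (HALT, 0)]
print(run(prog, mem)[2])  # 5
```

A cycle-accurate version replaces the single if/elif chain with per-stage state (fetch, decode, execute, writeback) updated every iteration, which is where the pipelining bookkeeping comes in.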


Worth a mention: https://turingcomplete.game I've played for a few hours, it's well made!


MVP, thanks for sharing


Conversely, I was applying to Meta and made it to the final stage. The interview was held with the head PE in the country. Both he and all the preceding interviewers clearly signaled that PEs are 'the ones who do both' here, and their scarcity has made them quite valuable in the branch (according to them). It is also reflected in their salary here, which is 15% higher for interns and 50% higher for FTEs.


Google and abandoning things, name a more iconic duo


Its only killer feature to me is that websites are designed and tested with a Chromium-first attitude. As a regular FF user I might stumble upon a website that’s quite buggy or straight up doesn’t work, which forces me to use Chrome for that specific website. Other than that I don’t really feel like there’s anything, and Edge is currently a better Chrome than Chrome anyway.

EDIT: and for the record, I’m still upset Microsoft didn’t choose FF and willingly increased Google’s grasp


I've been an FF user and a developer of many frontends for years, and the number of times I run into serious differences between browsers is somewhere near zero.


...in the past decade, for me.

It once was a serious problem. No more


I've been on team FF since the switch from IE like 20 years ago; I agree sometimes there is additional jank, but I also want to add that it is rare enough that it's been a non-issue 99.9% of the time.

Even then, I expect some of it is the fact I use an obscene number of plug-ins to break most social media sites (to prevent overuse).


I built an extension[1] so that I could choose which sites are opened in Chrome... so that whenever I need to use Google Meet, it automatically opens it on Chrome.

[1]: https://onchrome.gervas.io


Thanks, this sounds like exactly what I need! I pretty much only use Chrome for Google Meet.


Microsoft had a huge hit with VSCode. They like Blink/chromium.


Can you explain that for those not familiar with VSCode?


Visual Studio Code is built on Electron, like Microsoft Teams. Electron is built on the Chromium rendering engine.


Yeah I'm still surprised MS did that too.


Which preserves the high barrier to entry for Linux. Linux will never be mainstream as long as it and its community primarily embrace the tech-savvy and nobody else.


It may be a business opportunity. Order a bunch of Linux-unfriendly Lenovos, install Linux distro of choice, sell for a few <insert currency here> more.


On one hand, I definitely agree.

On the other, my issue is that motherboards are already one of the least reliable components in your PC, and introducing even more complications to them just makes the situation (and the cost of replacement) worse


Going to 12V-only power to the motherboard doesn't actually make it much more complicated, or really any more complicated. About half of a modern motherboard design today is just power supplies anyway, since all CPUs do voltage scaling to save power. It's not as if your CPU takes 3.3V input directly: most run in the 0.8-1.5V range, even DDR4 is 1.2V, and the CPU is the majority of the consumption on the actual motherboard (high-power graphics cards usually have their own 12V input connector, since the PCIe edge connector can't carry enough).


I expect motherboards could be more reliable if there were a standard pluggable 12V-to-5V/3.3V/1.5V (etc.) converter that they all used. It could cut down on e-waste: instead of having to discard an entire motherboard, the low-voltage converter board could be swapped out and repaired or discarded. The same companies that make ATX power supplies could also make the discrete low-voltage converters. ATX innards could be smaller, with less wiring between components.

No that's silly, we should continue to use a design from 1995, itself replacing a design from 1981.


Well, sure that'd be nice, but it would add quite a lot of cost. High current carrying connectors are not free.

You can buy a nice AMD Ryzen motherboard for like $80 retail (Newegg shows new unopened ones starting at $59). Normal electronics supply-chain and shipping margins mean it costs roughly half that to manufacture. So on a $40 bill-of-materials-and-labor cost, adding even just $1 to make something better/modular/whatever is a significant percentage cost increase.

As much as reducing electronics waste is a worthy goal, the economics of it aren't aligned quite yet in today's world.


From what I know, reddit is not entirely to blame on its own for this. Apple's policy only lets the first-party app open certain domains, like YouTube (please correct me if I'm mistaken) and Twitter. Overall, a decent workaround is an app called Opener (link below), which lets you switch to specific apps from the share sheet. Apollo also has this feature for reddit links specifically.

However, I do agree that reddit's behavior beyond that is obnoxious.

Opener link: https://apps.apple.com/us/app/opener-open-links-in-apps/id98...


That's a tough cookie to solve. So here's the hypothetical. I create a trashware game app. Typical low-hanging fruit. I set it as a handler for amazon.com links. Then in my app I handle these links in a convincing web container.

And obviously, mangle the links with my referrer URL in the process.

I call this a hypothetical, but the only reason it is one is that Apple wouldn't allow me to register as a handler for amazon.com.


Incidentally this is also one of the arguments that people point to for App Store lockdown. Otherwise, they claim that there would be nothing stopping you from making such a "trashware game app" and injecting referral links.


Except they already do. Also, AMD announced SmartShift ("shifts power inside your laptop for the optimal performance for a given task") support for Linux a couple of days ago. They're more than usable nowadays.

Probably a popular opinion here, but any display with a higher res than 1440p (1600p if it's a 16:10 display) is a waste on a laptop. I'd rather have a 1080p one over a 4K one, personally.


> Except they already do. Also, AMD announced SmartShift ("shifts power inside your laptop for the optimal performance for a given task") support for Linux a couple of days ago. They're more than usable nowadays.

Interesting, thanks!

> Probably a popular opinion here, but any display with a higher res than 1440p (1600p if it's a 16:10 display) is a waste on a laptop. I'd rather have a 1080p one over a 4K one, personally.

I don't disagree that 4K is a little nuts on a ~15" laptop LCD, but it seems that until recently your options for decent >1080p displays on laptops were, for the most part, a) MacBooks (2880x1800 at 15.4") and b) 4K PC laptops.


No, this is primarily a case of a populist, broken system that prevents citizens from voting

