Hacker News | NTARelix's comments

A couple years ago I upgraded my desktop hardware, which meant it was time to upgrade my homelab. I had gone through various operating systems and methods of managing my services: systemd on Ubuntu Server, Docker Compose on CentOS, and Podman on NixOS.

I was learning about Kubernetes at work and it seemed like such a powerful tool, so I had this grand vision of building a little cluster in my laundry room with nodes net booting into Flatcar and running services via k3s. When I started building this, I was horrified by the complexity, so I went in the complete opposite direction. I didn't need a cluster, net booting, blue-green deployments, or containers. I landed on NixOS with systemd for everything. Bare git repos over ssh for personal projects. Git server hooks for CI/CD. Email server for phone notifications (upgrade failures, service down, low disk space, etc.). NixOS nightly upgrades.

I never understood the hate systemd gets, but I also never really took the time to learn it until now, and I really love the simplicity when paired with NixOS. I finally feel like I'm satisfied with the operation and management of my server (aside from a semi-frequent kernel panic that I've been struggling to resolve).


I recall in the early or mid 2000s using some cheap earbuds plugged into the microphone port of my family computer as a pair of microphones, since I had neither a real microphone nor the money for one. Then I used Audacity to turn the terrible recording into a passable sound effect for the video games I was making.

Not knowing much about how soundcards work, I imagine it would be feasible to flash some soundcards with custom firmware to use the speaker port for input without the user knowing.


This is common at nightclubs (or was) - a DJ can use their headphones as a microphone, speaking into one channel and listening to the other.

Example https://m.youtube.com/watch?v=1NNP6AFkpjs

:-)


You will still see DJs do this in NYC! Old school flavor. You can also see Skepta rapping into a pair in the music video for That's Not Me: https://www.youtube.com/watch?v=_xQKWnvtg6c

I've seen some theatrical DJs bring a cheap pair, snap them in half, and then use them like a "lollipop." Crowd eats it up. Even older school: using a telephone handset: https://imgur.com/a/1fUghXY


I've been wondering for a long time when we might expect to see a stable WebGPU API in all major browsers (mostly concerned with my daily browser, Firefox), so I've been looking for an official message on the topic. Deno claims the spec is ready

> The [WebGPU] spec has been finalized

but the official WebGPU spec [1] still describes it as a draft. Have I misinterpreted something here or is there some missing context around Deno's statement?

[1]: https://www.w3.org/TR/webgpu/


In the W3C process, in order for a TR to upgrade from a Draft to a finished specification, it needs two shipping implementations. Firefox and Safari are both working on theirs. We hope the only changes involved in this process will be minor and/or editorial. Once a second implementation ships, it will move out of draft.


This might be a better page to get an idea of the status in different browsers:

https://github.com/gpuweb/gpuweb/wiki/Implementation-Status


If you're using any Chromium-based browser, it's on your machine right now.
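
A quick way to check is from the devtools console or a module script (top-level await works in both). This is just a minimal sketch; even where navigator.gpu exists, the adapter request can still come back null depending on hardware, drivers, or flags:

    // Feature-detect WebGPU; navigator.gpu is simply undefined where it isn't shipped.
    if (!navigator.gpu) {
      console.log("WebGPU is not exposed in this browser");
    } else {
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) {
        console.log("WebGPU is exposed, but no suitable GPU adapter was found");
      } else {
        const device = await adapter.requestDevice();
        console.log("Got a WebGPU device:", device.label || "(unnamed)");
      }
    }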


The question is, is it still going to change?


Minor things probably yes (for instance check the "What's new in WebGPU (Chrome xxx)" articles here: https://developer.chrome.com/docs/web-platform/webgpu), breaking changes probably not.


There seem to be no 1.0 discussions yet on the GitHub project, so it seems possible it's a mistake at the Deno end.


I've also been using it for several years and almost completely agree with your sentiment. The only areas that have given me trouble are the dev tools. On my machines the debugger slows down significantly when opening very large JS files, source maps compound that slowdown, and I can't always inspect variables' values when using source maps (possibly a build tool config problem).


A pie chart could serve a similar purpose, but can be much easier to interpret. I like this interactive pie chart for profiling Webpack bundle size. I've used it several times at work to help find and reduce bloat in our bundles.

https://alexkuz.github.io/webpack-chart/


Given how horrifically bad people are at interpreting pie charts, that does not bode well for treemaps.

    "[Pie] charts are bad and that the only thing worse than one pie chart is lots of them"
    -Edward Tufte
https://scc.ms.unimelb.edu.au/resources/data-visualisation-a...

https://www.businessinsider.com/pie-charts-are-the-worst-201...

https://www.data-to-viz.com/caveat/pie.html

etc etc


Correction:

The chart I'm talking about has multiple names, but is not a simple pie chart. Thanks to funcDropShadow for pointing this out. The names: sunburst chart, multilevel pie chart, and radial treemap.

https://www.anychart.com/chartopedia/chart-type/sunburst-cha...


Your example is usually referred to as a sunburst chart, although it shares all the drawbacks of pie charts. Some would say sunburst charts make it even harder to correctly judge the relative size of elements than pie charts do.


My designers take their own snapshots by cloning their work and putting versions in the names of things. Older versions are not to be modified, with few exceptions. It makes for a good linking experience on my end, but I don't know what that kind of maintenance is like for them.


This is my method too, especially because it shows how the design progressed. With tight deadlines this becomes harder, but I still consider it a very valuable way of proving the work (I'm only a minor designer doing freelance work, though).

I can certainly understand the move-fast method of quickly changing things; when I'm doing concept work, that approach makes more sense.

Long term though, if you actually care about your work, you should be making copies or separate boards to show how and when you made certain decisions, especially major design changes. (Granted, I could just be a bad designer who can't come up with a better workflow.)


The series describes creating something like the beginnings of a Deno alternative, upon which the reader could go on to fully recreate Deno or Node.js. It seems to me that the core idea presented is "a JavaScript/TypeScript interface to Rust". The thing that most interests me about something like this is not making yet another alternative to Deno or Node.js, but the potential for adding a scripting language to an application or framework written in Rust. I'm thinking of things like Python scripting in Blender, Lua as a scripting/modding layer for a game engine, scripting Tiled with JS, an Electron alternative using GTK, or your own browser.
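
As a rough sketch of what the embedded side could look like: suppose the Rust host (via deno_core or rusty_v8) binds a small `host` global for the script to call. The `host` object and its methods here are made-up illustrative names, not a real deno_core API:

    // Hypothetical script run inside a Rust-embedded JS/TS engine.
    // The `host` global and its methods are assumed to be registered by the
    // Rust side; they are illustrative names, not part of any real API.
    declare const host: {
      log(message: string): void;
      loadAsset(path: string): Uint8Array;
      onTick(callback: (deltaMs: number) => void): void;
    };

    const sprite = host.loadAsset("sprites/player.png");
    host.log(`loaded ${sprite.byteLength} bytes`);

    host.onTick((deltaMs) => {
      host.log(`frame advanced by ${deltaMs.toFixed(1)}ms`);
    });

The appeal is that the real work (file IO, rendering, the event loop) stays in Rust, and the script layer only sees whatever narrow surface the host chooses to expose.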


I have memories of illegal immigrant workers from longer than 22 years ago and the USA has had immigration laws for much longer than 22 years. Have I misunderstood your statement?


I think you missed the “as we know it now” part.

How did the US southern border work in your memories? Would you be surprised to learn it was almost unrecognizable compared to what it is now?


My Android phone's apps must ask for permission to use some of this data (location, microphone, filesystem, etc.), and Android provides the options "always", "only when using the app", "this time only", and "never", which seems to help with this problem, though I'm sure it's nowhere near a silver bullet. When I leave my home I only feel (mostly) untracked if I do so without my phone and only buy things with cash, which is almost non-existent behavior for myself and the people I know.


Over the last decade or so I've switched between Kodi (even back when it was XBMC on a literal Xbox [the original]), Plex, Emby, and Jellyfin; currently settled on Jellyfin for maybe a year and a half. I've also had a great experience with Jellyfin and love that it doesn't hide features behind a paywall. I agree that it has caught up with the competitors in terms of necessary features, but it occasionally feels a little buggy or in need of some UX polish. Perhaps one of my free weekends I'll see if I can contribute.

One of the primary features I appreciate that the others have behind a paywall is the ability to download content for offline use. 1 less reason to open up my network's SSH port to the world. The feature is great for trips where connectivity is limited, or just at a friend's house with terribly unreliable wifi.

I run Jellyfin on an Athlon II X4 (12 or 13 years old) with several other self-hosted services. Transcoding anything above 720p causes the entire system to come to a screeching halt, so I've pre-transcoded all of my content with handbrake to allow direct-play 4k content on all my home devices (Firefox, Shield TV, Chromecast, Android client, desktop client).

