I wasn't sure how to load the images back into Docker at first. I tried `docker load`, but I got this error:
$ (cd ci-repack && tar cfv - .) | docker load
./
./oci-layout
./index.json
./blobs/
./blobs/sha256/
./blobs/sha256/2ad6ec1b7ff57802445459ed00e36c2d8e556c5b3cad7f32512c9146909b8ef8
./blobs/sha256/9f3908db1ae67d2622a0e2052a0364ed1a3927c4cebf7e3cc521ba8fe7ca66f1
open /var/lib/docker/tmp/docker-import-1084022012/blobs/json: no such file or directory
Then I noticed the `skopeo copy` in one of the GitHub Actions workflows. That got me further: I was able to push the image to a registry. But I'm getting this error when pulling the repacked image:
failed to register layer: duplicates of file paths not supported
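For reference, this is roughly the shape of the skopeo invocation that got me past `docker load` (image names and registry are made up; `ci-repack` is the OCI layout directory from above):

```shell
# Copy the OCI layout directly into the local Docker daemon,
# bypassing `docker load` (which chokes on OCI layouts here)
skopeo copy oci:ci-repack docker-daemon:repacked/image:latest

# Or push the OCI layout straight to a registry instead
skopeo copy oci:ci-repack docker://registry.example.com/repacked/image:latest
```

The `oci:`, `docker-daemon:`, and `docker://` prefixes are skopeo's transport names; the "duplicates of file paths" error only shows up later, at layer-registration time during the pull.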
This is great! I've been using IPv6 on OpenBSD for a while now, starting with a Hurricane Electric tunnel years ago, then native v6 on Comcast, and now Sonic. Configuring IPv6 prefix delegation (PD) hasn't been supported in base this whole time. I recall using wide-dhcp6c years ago and then switching to the dhcpcd mentioned in the article. The situation has improved, slowly, but it will be great to have this in the base system.
They turned on native IPv6 at the end of last year, at least in some areas (including Berkeley, where I live). You wouldn't know it from their help pages, but there are some posts in the forum to that effect.
Vector is fantastic software. I'm currently running a multi-GB/s log pipeline with it: Vector agents run as DaemonSets collecting pod and journald logs, then forward via Vector's protobuf protocol to a central Vector aggregator Deployment with various sinks (S3, GCS/BigQuery, Loki, Prometheus).
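A rough sketch of the agent side of that topology, under assumed names and ports (the aggregator address and component IDs are made up; Vector accepts TOML as well as YAML):

```yaml
# Agent-side Vector config: collect pod + journald logs,
# forward to a central aggregator over Vector's native protocol.
sources:
  pod_logs:
    type: kubernetes_logs
  journal:
    type: journald

sinks:
  to_aggregator:
    type: vector                  # Vector-to-Vector transport
    inputs: [pod_logs, journal]
    address: vector-aggregator.logging.svc:6000
```

The aggregator then runs a matching `vector` source and fans out to the real sinks (S3, BigQuery, Loki, etc.).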
The documentation is great but it can be hard to find examples of common patterns, although it's getting better with time and a growing audience.
My pro tip has been to prefix your searches with "vector dev <query>" for best results on Google. I think "vector" is/was just too generic a term.
I've been using this for a few weeks and converted to a paying customer. I've imported some, but not yet all, of my bookmarks.
I've been collecting bookmarks in Evernote, and now Obsidian, for about a decade. I try to add enough tags that I'll be able to find an article again later, since it's often many months before I need something. My success rate at remembering a term from the title or the right tag is not great. I've been pretty impressed with Zenfetch's ability to search and find exactly what I was looking for. And this doesn't even scratch the surface of what it can do when you want it to synthesize answers from many articles for you.
They're also working on indexing your saved tweets, which I'm excited about. It's a giant pain to try to find liked tweets.
I just happen to be reading Peter F. Hamilton's Confederation universe trilogy (1996–2000), in which Earth has been dealing with the "armada storms" for a couple hundred years due to climate change. From the descriptions, there are multiple such storms occurring at any given time. Humans now live in "arcologies", giant multi-kilometer domes over population centers. There was no way to reverse the damage done. Flying is no longer possible, so everything travels in a very fast underground train system called "vac-trains". I suppose this may be where we're headed (although I would love to see the vac-train system).
I love the idea. I signed up, and now I'd like to import a bunch of bookmarks I've stored in Evernote and Obsidian over the years, but I can't find a way to do this. Is there a way to import all of these?
I suppose I could try the Chrome bookmarks importer if I can get everything into Chrome first, but I can't find the import option after the initial signup workflow.
I've been experimenting with working in a GCP VM for a few months and like it quite a bit. Between ssh+tmux+nvim and VS Code Remote, I have a remote and local dev experience that's basically identical.
Syncthing is my key to keeping my ~/git tree in sync between the two environments. You just have to be careful not to flip back and forth quickly between local and remote, since Syncthing propagation delays can be ~10s.
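If syncing repo metadata makes you nervous, one option Syncthing supports is excluding `.git` directories entirely via a `.stignore` file at the synced folder's root, so only the working trees travel (a sketch, assuming the synced folder is ~/git):

```
// ~/git/.stignore
// Keep Git metadata out of sync; a bare pattern matches at any depth.
.git
```

The tradeoff is you then fetch/push each repo to its remote from both machines instead of syncing `.git` state directly.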
I've always had trouble using Syncthing for .git folders: they usually end up corrupted or messed up if I make changes to both the local and remote .git folders.