noman-land's comments (Hacker News)

Radicle is really cool. I've been running a node for months but haven't pulled the trigger on using it as my primary yet.

We need better forges and they need to be p2p to survive. p2p is the only viable future for the web.


AD: Thank you for your contribution! I also run a permissive seed node, vote by participation!

AD: What's holding you back from using it as your primary?

Not the original OP, but I had been thinking about this for years. My interest is in resiliency rather than sovereignty (though they overlap):

- is there a mirror adapter to push to a non-Radicle node, such as GitHub or, say, SourceHut? (Mirroring nixpkgs, for example.)

- is there a mechanism to control syncs so it can be used on low-bandwidth, unreliable networks, or ad-hoc bluetooth networks?

- is offline seeding possible or in the works?

- language package managers often can reference a git or github. Would I be able to directly reference my local radicle node and have it manage (or perhaps even discover) the correct repos? (Or maybe this is a different problem and package repos themselves could be decentralized and sovereign)

On that last point, I mean that the whole build chain and supply chain can be made sovereign: I see radicle is written in Rust, which means dependencies on Cargo, the Rust toolchain, and so forth.


> is there a mirror adapter to push to a non-Radicle node, such as GitHub or, say, SourceHut?

You can just add a remote for another repository.

    git remote add github git@github.com:example/example.git
You can also create remotes with multiple push URLs, so that with one `git push`, you push to all of them at once.
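The multiple-push-URL setup looks like this (the remote name `all` and the URLs are illustrative):

```shell
# One remote, two push URLs: a single `git push all` updates both mirrors.
git remote add all git@github.com:example/example.git

# The first --add --push replaces the implicit push URL, so re-add the
# original explicitly before adding the second mirror.
git remote set-url --add --push all git@github.com:example/example.git
git remote set-url --add --push all git@git.sr.ht:~example/example

git remote get-url --push --all all   # lists both push URLs
```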

Apart from that, it's possible to use e.g. systemd path units to run `git push` automatically whenever a repository gets updated by `radicle-node`.
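A sketch of that systemd approach, as user units (the unit names, the storage path under `~/.radicle`, the `<repo-id>` placeholder, and the working-copy directory are all assumptions; adjust to your setup):

```ini
# ~/.config/systemd/user/mirror-push.path -- fire when radicle-node
# updates refs in Radicle Storage (path and repo id are placeholders)
[Path]
PathChanged=%h/.radicle/storage/<repo-id>/refs

[Install]
WantedBy=default.target

# ~/.config/systemd/user/mirror-push.service -- triggered by the path
# unit of the same name; pushes the working copy's refs to the mirror
[Service]
Type=oneshot
WorkingDirectory=%h/src/example
ExecStart=/usr/bin/git push --all github
```

A path unit activates the service unit with the same name by default, so no explicit binding is needed.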

This works reasonably well. What else would the adapter have to do?

> is there a mechanism to control syncs so it can be used on low-bandwidth, unreliable networks, or ad-hoc bluetooth networks?

No. The data itself is usually quite small, as the common use case is to send commits. It's not optimized for unreliable networks or Bluetooth in any special way yet. It would certainly be useful.

> is offline seeding possible or in the works?

That's contradictory in my mind. What do you mean? Offline in the sense of "not connected to the internet"? That works just fine. Currently, you still have to connect your node to the existing network by connecting to another known node (via IP address or a DNS name that resolves locally). There are plans to integrate DNS-SD, also via mDNS.

> language package managers often can reference a git or github. Would I be able to directly reference my local radicle node and have it manage (or perhaps even discover) the correct repos?

For now, no. However, it's reasonably simple to deploy a component called `radicle-httpd`, which will expose your repos via Git over HTTP if you like. Looks like this: https://seed.radicle.xyz/z3gqcJUoA1n9HaHKufZs5FCSGazv5.git

> (Or maybe this is a different problem and package repos themselves could be decentralized and sovereign)

Yes. Consider things like https://www.tweag.io/blog/2020-12-16-trustix-announcement/


If the internet is down and you want to onboard someone with, say, a USB thumb drive.

With the mirroring: does radicle have any kind of event hooks?


> If the internet is down and you want to onboard someone with, say, a USB thumb drive.

All the data being synced is in a Git repo, which is in a directory on your filesystem we call "Radicle Storage". You can use `git bundle` or a plain `cp` to copy that directory over. You can also use plain Git to push. Note that for these use-cases there is no polished UX. You need to know what you are doing. The bigger issue will be to install Radicle.
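The `git bundle` route can be sketched end-to-end; here temp directories stand in for the Radicle Storage path, the thumb drive, and the receiving machine:

```shell
src=$(mktemp -d); usb=$(mktemp -d); dst=$(mktemp -d)

# Stand-in for a repo living in Radicle Storage.
git -C "$src" init -q -b main
git -C "$src" -c user.email=a@b -c user.name=a \
    commit -q --allow-empty -m "init"

# Pack every ref into a single file that fits on the drive.
git -C "$src" bundle create "$usb/project.bundle" --all

# On the receiving machine: clone (or fetch) straight from the file.
git clone -q -b main "$usb/project.bundle" "$dst/project"
git -C "$dst/project" log --oneline
```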

> With the mirroring: does radicle have any kind of event hooks?

Yes. You can connect to `radicle-node` via a socket and subscribe to events. This is how Radicle CI, and in particular the Radicle CI Broker, was implemented. You can implement your own event broker; it's just JSON over a socket.

https://radicle-ci.liw.fi/radicle-ci-broker/ci-broker.html
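Subscribing boils down to reading newline-delimited JSON from a socket. A minimal sketch (the socket path and the shape of the events are assumptions; check your `radicle-node` setup for the real control socket and message format):

```python
import json
import socket

def subscribe(sock_path, handle):
    """Stream newline-delimited JSON events from a Unix socket,
    calling handle(event) for each decoded object."""
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
        s.connect(sock_path)
        buf = b""
        while chunk := s.recv(4096):
            buf += chunk
            # Events are one JSON object per line; decode complete lines.
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                if line.strip():
                    handle(json.loads(line))
```

E.g. `subscribe("/path/to/node.sock", print)` would dump events as they arrive, and a mirroring hook would run `git push` from the handler.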


Honestly, it's mostly the lack of other users to interact with. It's the same problem I have with things like Gitea: someone needs an "account" to participate in your project.

I'm also not sure how to sync issues and pull requests with Github.

Largely, I just haven't researched it deeply enough yet.


What if the future of apps is serving a few dozen instead of a few billion?

The act of bidding itself shows interest and raises the price.

Auto bid raises the price to the second highest price among auto bidders, basically running an instant second-price auction. Sniping avoids running these pre-close auctions.

It does not. Even if you submit a snipe bid the normal eBay bidding rules apply.

The act of viewing the item page itself demonstrates activity and is relayed to other users, leaking information about your awareness, if not your intent. If you want something, figure out the details without actually clicking on it.

What was the near miss?

My instructor and I were being winch-launched. As we ascended, another glider passed over us, barely missing us. They claimed they didn't see us, and were later found to be at fault.

Really amazing video. Unfortunately this article is like 60% over my head. Regardless, I actually love reading jargon-filled statements like this that are totally normal to the initiated but are completely inscrutable to outsiders.

    "That data was then brought into Houdini, where the post production team used CG Nomads GSOPs for manipulation and sequencing, and OTOY’s OctaneRender for final rendering. Thanks to this combination, the production team was also able to relight the splats."

Hi, I'm one of the creators of GSOPs for SideFX Houdini.

The gist is that Gaussian splats can replicate reality quite effectively with many 3D ellipsoids (stored as a type of point cloud). Houdini is software that excels at manipulating vast numbers of points, and renderers (such as Octane) can now leverage this type of data to integrate with traditional computer graphics primitives, lights, and techniques.


Can you put "Gaussian splats" in some kind of real-world metaphor so I can understand what it means? Either that or explain why "Gaussian" and why "splat".

I am vaguely aware of stuff like Gaussian blur on Photoshop. But I never really knew what it does.


Sure!

Gaussian splatting is a bit like photogrammetry. That is, you can record video or take photos of an object or environment from many angles and reproduce it in 3D. Gaussians have the capability to "fade" their opacity based on a Gaussian distribution. This allows them to blend together in a seamless fashion.

The splatting process uses gradient descent over each camera/image pair to optimize these ellipsoids (Gaussians) so that they reproduce the original inputs as closely as possible. Given enough imagery and sufficient camera alignment, performed using Structure from Motion, you can faithfully reproduce the entire space.

Read more here: https://towardsdatascience.com/a-comprehensive-overview-of-g....
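The gradient-descent step can be sketched in one dimension, optimizing a single Gaussian's amplitude against observed intensities. This is a toy analogue: real splatting jointly optimizes position, covariance, opacity, and color of millions of 3D Gaussians with a differentiable renderer.

```python
import math

def fit_amplitude(samples, mu=0.0, sigma=1.0, lr=0.1, steps=200):
    """Gradient-descend amplitude `a` so that a*exp(-(x-mu)^2/(2*sigma^2))
    reproduces the observed (x, intensity) samples as closely as possible."""
    a = 0.0
    for _ in range(steps):
        grad = 0.0
        for x, target in samples:
            g = math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))
            grad += 2 * (a * g - target) * g  # d/da of the squared error
        a -= lr * grad
    return a
```

Because the error is quadratic in the amplitude, this toy version provably converges; the full problem is non-convex, which is why splat training also needs good initialization from Structure from Motion.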


I think this means that you could produce more versions of this music video from other points of view without having to shoot the video again. For example, the drone-like effects could take a different path through the scene. Or you could move people/objects around and still get the lighting right.

Given where this technology is today, you could imagine 5-10 years from now people will watch live sports on TV, but with their own individual virtual drone that lets them view the field from almost any point.


> I am vaguely aware of stuff like Gaussian blur on Photoshop. But I never really knew what it does.

Blurring is a convolution, or filter, operation. You take a small patch of the image (say, 5x5 pixels) and convolve it with a fixed matrix, called a kernel. Convolution means: multiply element-wise and sum. You replace the center pixel with the result.

https://en.wikipedia.org/wiki/Box_blur is the simplest kernel: all ones, divided by the kernel size. Every pixel becomes the average of itself and its neighbors, which looks blurry. A Gaussian blur is calculated in an identical way, but the matrix elements follow the "height" of a 2D Gaussian with some amplitude. It gives a smoother result, as farther pixels have less influence. The bigger the kernel, the blurrier the result. There are a lot of these basic operations:

https://en.wikipedia.org/wiki/Kernel_(image_processing)
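The blur described above, as a dependency-free sketch (nested lists of floats stand in for a grayscale image; border pixels are left untouched for simplicity):

```python
import math

def gaussian_kernel(size=5, sigma=1.0):
    """Sample a 2D Gaussian on a size x size grid, normalized to sum 1."""
    c = size // 2
    k = [[math.exp(-((x - c) ** 2 + (y - c) ** 2) / (2 * sigma ** 2))
          for x in range(size)] for y in range(size)]
    total = sum(map(sum, k))
    return [[v / total for v in row] for row in k]

def convolve(img, kernel):
    """Replace each interior pixel with its neighborhood multiplied
    element-wise by the kernel and summed -- i.e. a convolution."""
    kh, kw = len(kernel), len(kernel[0])
    oy, ox = kh // 2, kw // 2
    out = [row[:] for row in img]
    for y in range(oy, len(img) - oy):
        for x in range(ox, len(img[0]) - ox):
            out[y][x] = sum(kernel[j][i] * img[y + j - oy][x + i - ox]
                            for j in range(kh) for i in range(kw))
    return out
```

Because the kernel sums to 1, flat regions are unchanged, while a lone bright pixel gets smeared into a soft blob.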

If you see "Gaussian", it implies the distribution is used somewhere in the process, but splatting and image kernels are very different operations.

For what it's worth I don't think the Wikipedia article on Gaussian Blur is particularly accessible.


> explain why "Gaussian" and why "splat".

Happily. Gaussian splats are a technique for 3D images, related to point clouds. They do the same job (take a 3D capture of reality and generate pictures later from any point of view "close enough" to the original).

The key idea is that instead of a bunch of points, it stores a bunch of semi-transparent blobs, or "splats". The transparency increases quickly with distance, following a normal distribution, also known as the "Gaussian distribution".

Hence, "Gaussian splats".
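The "fade with distance" idea in code, assuming circular (isotropic) splats for simplicity; real splats are anisotropic 3D ellipsoids with view-dependent color:

```python
import math

def splat_alpha(px, py, cx, cy, sigma, opacity=1.0):
    """Opacity of a circular Gaussian splat centered at (cx, cy),
    evaluated at pixel (px, py): full at the center, fading with
    distance along a Gaussian falloff."""
    d2 = (px - cx) ** 2 + (py - cy) ** 2
    return opacity * math.exp(-d2 / (2 * sigma ** 2))

def composite(splats, px, py):
    """Blend splats front-to-back at one pixel via alpha compositing.
    splats: (cx, cy, sigma, intensity) tuples, sorted near to far."""
    color, transmittance = 0.0, 1.0
    for cx, cy, sigma, intensity in splats:
        a = splat_alpha(px, py, cx, cy, sigma)
        color += transmittance * a * intensity
        transmittance *= 1.0 - a  # what's left for splats behind this one
    return color
```

The soft falloff is what lets neighboring splats blend seamlessly instead of showing the hard edges a plain point cloud would.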


Somehow this hit right in the sweet spot at my level of knowledge. Thanks!

How can you expect someone to tailor a custom explanation when they don't know your level of mathematical understanding, or even your level of curiosity? You don't know what a Gaussian blur does; do you know what a Gaussian is? How deeply do you want to understand?

If you’re curious start with the Wikipedia article and use an LLM to help you understand the parts that don’t make sense. Or just ask the LLM to provide a summary at the desired level of detail.


There's a Corridor Digital video being shared that explains it perfectly. With very little math.

https://youtube.com/watch?v=cetf0qTZ04Y


Amazing video, thanks for sharing this.

> How can you expect someone to tailor a custom explanation when they don't know your level of mathematical understanding, or even your level of curiosity?

The other two replies did a pretty good job!


My bad! I am the author. Gaussian splatting allows you to take a series of normal 2D images or a video and reconstruct very lifelike 3D from it. It’s a type of radiance field, like NeRFs or voxel based methods like Plenoxels!

Corridor has done some great stuff with Gaussian Splats, I recommend this video for a primer!

https://youtube.com/watch?v=cetf0qTZ04Y


Reminds me of Kurtwood Smith’s piping sales pitch in The Patriot

This is not an "if" scenario. It is possible and happening.

All your conversations live as JSON files inside `~/.claude/`.

But that includes a ton of dead ends and stuff.

So who is going to make some mesh firmware for these and all other garbage computers?

Link is broken

It doesn't look broken here. Can I ask what you're seeing?

Sorry I guess it must have been on my side. It's working now.

The positive contributions he never got a chance to make are sad to contemplate.
