A version of it powers my local rubber-duck thoughts and voice-note store.
Like an explicit ChatGPT memory store, it helps with information FOMO, because I know finding the needle in the haystack will be easy.
It would probably be cost-prohibitive to use a 10M context to its fullest each time.
Instead, I hope for an API that exposes the context as a datastore: like RAG, we control what gets stored, but unlike RAG, all the data stays within the context.
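A minimal sketch of what that "context as a datastore" API might look like. Everything here is hypothetical (the `ContextStore` class, its methods, and the crude whitespace token estimate are all invented for illustration); the point is only that, unlike a RAG index, stored entries live inside a single token-budgeted context window:

```python
# Hypothetical sketch: a datastore-like API over one large context window.
# Like RAG, the caller decides what to store; unlike RAG, everything stored
# stays in-context rather than in an external index. All names are made up.

class ContextStore:
    def __init__(self, max_tokens=10_000_000):
        self.max_tokens = max_tokens   # e.g. a 10M-token context budget
        self.entries = []              # (key, text) pairs kept in-context

    def _tokens(self, text):
        # crude estimate: ~1 token per whitespace-separated word
        return len(text.split())

    def used(self):
        return sum(self._tokens(t) for _, t in self.entries)

    def put(self, key, text):
        # evict oldest entries until the new text fits the budget
        while self.entries and self.used() + self._tokens(text) > self.max_tokens:
            self.entries.pop(0)
        self.entries.append((key, text))

    def drop(self, key):
        # explicit deletion: the caller controls what stays in context
        self.entries = [(k, t) for k, t in self.entries if k != key]

    def render(self):
        # the full context that would be handed to the model on each call
        return "\n\n".join(t for _, t in self.entries)


store = ContextStore(max_tokens=50)
store.put("note1", "rubber duck thought about caching")
store.put("note2", "voice note transcript on API design")
print(len(store.entries))  # 2
```

The oldest-first eviction is just one possible policy; the real appeal of the idea is that `put`/`drop` give the application explicit control over context contents, the way RAG gives control over an index.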
Extremely passionate (and fresh graduate) full-stack developer based in India/Abu Dhabi. Have experience working at startups and corporates. Previously an engineering intern at a startup in Paris, doing QA and creating testing pipelines. Interested in backend/full-stack, but also open to Data/MLE positions.
By adding content to a user's newsfeed, they (Facebook's algorithm) have made an explicit decision to share that piece of content and an implicit decision that they deem the post safe for sharing. In that case, shouldn't a speaker operator be responsible for the voices they amplify?
You only see things in your newsfeed from people you've explicitly added to your network (i.e. explicitly opted in to receiving content from). Is Facebook wrong for showing you the things you said you want it to show you?
> You only see things in your newsfeed from people you've explicitly added to your network (i.e. explicitly opted in to receiving content from).
Eh, I'm pretty sure I see random posts from political Facebook groups that were commented on by a Facebook friend I added in like 2006, when that sort of "content proliferation" didn't exist yet (or at least wasn't anywhere near as prominent). I explicitly added that friend, yes, but I definitely didn't explicitly consent to receiving every single piece of Internet content that friend ever interacts with.
I think this is an unhelpful framing of the situation because, as you imply, Facebook is of course not wrong.
If you have one friend and your feed is entirely full of their posts, Facebook is off the hook. But for most users the situation is far more complex. People have many friends, and they may also be in groups. They probably will not be on Facebook long enough to see all of the posts, and if they are, their attention levels will differ over the entire corpus.
In this situation Facebook begins to have agency, which is different from being wrong. They didn't make the content, they didn't create the link that brought the content to you, but of all the content they could have shown, they did show you this and not that. It's a relatively new kind of agency, one we didn't have much practical experience with until very recently, so they can be forgiven somewhat for their difficulty grappling with it. However, they're an important part of the chain of information organization, and it would be foolish to pretend they have no responsibility.
I don't follow enough users to have a flourishing newsfeed, so Facebook gives me suggestions to follow pages and shows posts from those pages. These suggestions are based purely on what Facebook thinks I may like.
Once people have enough sources feeding their newsfeed, Facebook decides the order and cherry-picks which posts are shown.
> Is facebook wrong for showing you the things you said you want it to show you?
I do agree with this. Arriving at a solution would be a hard problem: users would have to give finer-grained information about the types of posts they like from each creator. I doubt there are any real-world incentives for a big company to build a system like this.
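To make the "finer-grained preferences" idea concrete, here is a toy sketch, with all names and the post-type taxonomy invented for illustration: each user holds per-creator sets of post types they've opted into, and the feed only shows posts matching those preferences.

```python
# Hypothetical sketch of per-creator, per-type feed preferences.
# The "creator"/"type" fields and the prefs shape are invented for this example.

def filter_feed(posts, prefs):
    """Keep only posts whose (creator, post type) the user has opted into.

    posts: list of dicts with "creator" and "type" keys
    prefs: {creator: set of allowed post types}; absent creators show nothing
    """
    return [p for p in posts if p["type"] in prefs.get(p["creator"], set())]


posts = [
    {"creator": "alice", "type": "photo", "text": "vacation pics"},
    {"creator": "alice", "type": "political", "text": "hot take"},
    {"creator": "bob", "type": "photo", "text": "cat"},
]
prefs = {"alice": {"photo"}}  # only photos from alice, nothing from bob

feed = filter_feed(posts, prefs)
print([p["text"] for p in feed])  # ['vacation pics']
```

Collecting `prefs` honestly from users is the hard part; the filtering itself is trivial, which is arguably why the missing piece is the incentive, not the engineering.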
It’s moderation through omission. No different from traditional TV news choosing to only produce stories painting their political allies positively or their foes negatively. Any form of selection can be a means for bias.
https://github.com/DumbMachine/pg-fs