I have been looking for resources about contextual bandits too, but I failed to find anything good. The closest thing I found was Vowpal Wabbit, same as you, and with the same problem: unmaintained. I tried searching HN about bandits but didn't find anything useful either.
I was looking for algorithms that can discover new interests of users. I felt like after all this research, all I learned is the ancient technique of showing x percent of random items to users.
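For what it's worth, that "show x percent random items" trick is essentially epsilon-greedy exploration, the simplest bandit baseline. A minimal sketch (the item IDs and reward table are made up for illustration):

```python
import random

def epsilon_greedy_pick(items, avg_reward, epsilon=0.1):
    """With probability epsilon, explore a random item (surfacing new
    interests); otherwise exploit the item with the best observed
    average reward. Unseen items default to a reward of 0.0."""
    if random.random() < epsilon:
        return random.choice(items)  # explore
    return max(items, key=lambda i: avg_reward.get(i, 0.0))  # exploit

# toy usage: "b" has the best observed reward, so ~90% of picks are "b"
items = ["a", "b", "c"]
avg_reward = {"a": 0.1, "b": 0.5, "c": 0.2}
pick = epsilon_greedy_pick(items, avg_reward)
```

A contextual bandit replaces the flat `avg_reward` table with a per-user (per-context) reward model, but the explore/exploit split stays the same.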
Currently, it uses both Google and Bing, but I am planning to add a small index of curated pages from forums like HN and Reddit, similar to what I did before.
Do you mean Google Search Liaison? And as for the other part of your comment: they don't need statements, because the quarterly report is up and doing well, as seen in the latest report.
Google search surprisingly gave me usable results for the search term, and yes, it seems you are right! I was thinking about Danny Sullivan, the Search Liaison.
The average SERP has 10 results. What if all 10 match your blacklist? Not to mention you can't do anything if the engine doesn't search deeper.
How many can you block and filter manually? 10? 100? 10k? Who will test sites for the blocklist? The domain block feature is great, but unless it's a collaborative list it's not going to be super effective.
It’s super effective for me because I just block stuff as things pop up that I don’t want. I’ve also added more weight to certain domains that I want more results from. I wouldn’t want anyone touching my config, it’s mine and it works great!
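A personal block-and-boost config like that can be sketched as a tiny re-ranker over the engine's results (the domain names and boost weights below are hypothetical, not anyone's actual config):

```python
BLOCKED = {"seo-spam.example", "content-farm.example"}  # hypothetical blocklist
WEIGHTS = {"news.ycombinator.com": 2.0, "stackoverflow.com": 1.5}  # boosts

def rerank(results):
    """Drop results from blocked domains, then sort the rest by the
    engine's score multiplied by a per-domain boost (default 1.0).
    Each result is a (domain, score) pair."""
    kept = [(d, s) for d, s in results if d not in BLOCKED]
    return sorted(kept, key=lambda r: r[1] * WEIGHTS.get(r[0], 1.0),
                  reverse=True)

results = [("seo-spam.example", 0.9),
           ("stackoverflow.com", 0.5),
           ("example.org", 0.6)]
# the spam domain is dropped; stackoverflow's boosted score (0.5 * 1.5 = 0.75)
# outranks example.org's unboosted 0.6
ranked = rerank(results)
```

The nice property of keeping this per-user is exactly the one described above: it only has to match one person's taste, not survive an adversarial arms race the way a shared list would.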
We've had this problem of "good defaults" before with ad-tracker domain blocklists. I'm sure it'll be sooner rather than later that some community lists become popular and begin being followed en masse.
I meant that your average user can test a handful of sites for whether they are SEO spam or good sites, but a single search returns 10+ results, and even more when a user searches for multiple things, multiple times a day. The average user doesn't have the time to test that many websites.
So it'll turn into yet another arms race, similar to CAPTCHAs, cybersecurity, and nuclear weapons. SEO will use AI to fill in fluff inside AI-generated content (which is already being done).
It won't directly match ChatGPT logs, and OpenAI would just be pouring precious compute into a bottomless pit trying to partial-match.
Let's see: if I go to "⋮ -> History -> Grouped History" at the top right of the Chrome browser, I see a "Search History" (chrome://history/grouped).
For example `8 hours ago: "autohotkey hotkeys"` with 4 links to pages which I visited while searching.
But this is a Chrome feature, not a Google Search feature. https://myactivity.google.com/myactivity does (sometimes? can't see it right now) have a feature for grouping all the searches made, but this is more of a search log than a search management feature.
So chrome://history/grouped is the closest to what I mean, but I can't pin or manage these history groups, or enrich them with comments or even files, like PDFs, which could then be stored in Google Drive as well as indexed for better searches.
You don't think SEO-LLMs will evolve to redirect search-LLMs to 'see the world' the way the SEO-LLMs want them to? I foresee SEO-LLM brinkmanship as the inevitable outcome. Soon THIS will be the catalyst for the real Skynet: battling smart ad engines.