Bärchelor of Science@social.tchncs.de to DeGoogle Yourself@lemmy.ml • In an age of LLMs, is it time to reconsider human-edited web directories?
8 months ago
@ajsadauskas @degoogle I mean, we can still use all the modern tools. I'm self-hosting a SearXNG instance, and there's an ever-growing block list of AI-generated websites that I regularly import to keep it up to date. You could also flip it into an allow-list approach: block all websites by default and admit sites gradually.
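For anyone curious what "importing" such a list looks like in practice, here's a minimal sketch that turns a plain-text domain block list into regex entries of the kind SearXNG's hostname-filtering plugin accepts. The regex shape and the sample domains are assumptions for illustration; check your SearXNG version's settings documentation before pasting anything in.

```python
# Sketch: convert 'domain-per-line' block list entries into host regexes
# for SearXNG's hostname-filtering plugin. The regex form "(\.|^)host$"
# is an assumption -- verify against your SearXNG settings.yml docs.

import re

def to_hostname_regexes(lines):
    """Skip comments/blank lines; match each domain and its subdomains."""
    patterns = []
    for line in lines:
        domain = line.split("#", 1)[0].strip()
        if not domain:
            continue
        # "(\.|^)" matches either a subdomain boundary or start of host.
        patterns.append(r"(\.|^)" + re.escape(domain) + "$")
    return patterns

# Hypothetical sample entries, not a real block list.
blocklist = [
    "# AI-generated content farms (example entries)",
    "aispam.example.com",
    "slopfarm.net",
]
for pat in to_hostname_regexes(blocklist):
    print(pat)
```

In a real setup you would fetch the list on a schedule (cron plus `urllib.request` or `curl`) and regenerate the plugin's config section from it, which is what keeps the instance current without manual edits.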
@ajsadauskas @degoogle I started doing that because it bothered me that you couldn't just report a website to DuckDuckGo that was obviously a Stack Overflow scraper. This problem has existed for as long as Reddit and Stack Overflow themselves. Why are there no countermeasures from search engines to get a hold of it?
I never understood that.