upboundspiral 3 hours ago

I feel like a government funded search engine would resolve a lot of the issues with the monetized web.

The purpose of a search engine is to display links to web pages, not their entire content. As such, it can be argued that it falls under fair use. It provides value both to the people searching for content and to those providing it.

However, we left such a crucially important public utility in the hands of private companies, which have changed their algorithms many times in order to maximize their profits rather than the public good.

I think there needs to be real competition, and I am increasingly certain that the government should be part of that competition. Both "private" companies and "public" government are biased, but they are biased in different ways, and I think there is real value to be created in this clash. It makes it easier for individuals to pick and choose the best option for themselves, and for third, independent options to be developed.

The current cycle of knowledge generation is academia doing foundational research -> private companies expanding this research and monetizing it -> nothing. If the last step were expanded to the government providing a barebones but usable service to commoditize it, years after private companies have been able to reap immense profits, then the capabilities of the entire society would be increased. If the last step is prevented, then the ruling companies turn to rent-seeking and sitting on their laurels, and turn from innovating to extracting.

digiown 3 hours ago | parent | next [-]

We can start by forcing sites to treat crawlers equally. Google's main moat is less its physical infrastructure or its algorithms, and more the fact that sites allow only Google to scrape and index them.

They can charge money for access or disallow all scrapers, but they should not be permitted to selectively allow only Google.
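Concretely, this selective-access pattern is easy to see at the robots.txt level. Below is a minimal sketch assuming a hypothetical site whose policy allows only Googlebot; the domain and crawler names are made up for illustration, and real exclusivity is usually enforced well beyond robots.txt (IP blocks, bot-detection services, private agreements).

    # Minimal sketch: how a robots.txt policy can admit one crawler and block the rest.
    # "example.com" and "SmallSearchBot" are hypothetical stand-ins.
    from urllib.robotparser import RobotFileParser

    robots_txt = """
    User-agent: Googlebot
    Allow: /

    User-agent: *
    Disallow: /
    """.splitlines()

    parser = RobotFileParser()
    parser.parse(robots_txt)  # parse() accepts an iterable of robots.txt lines

    for agent in ("Googlebot", "SmallSearchBot"):
        allowed = parser.can_fetch(agent, "https://example.com/article/123")
        print(f"{agent}: {'allowed' if allowed else 'blocked'}")

    # Output: Googlebot is allowed everything, any other well-behaved crawler
    # is shut out at the politeness layer before it fetches a single page.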

charcircuit 3 hours ago | parent [-]

It's not like allowing only Google actually means that only Google is allowed forever. Crawlers are free to make agreements with sites so they can crawl more easily, or to pretend they are regular users to bypass whatever block is in place.
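As a rough illustration of the "pretend they are a regular user" tactic, the sketch below sends a browser-style User-Agent instead of a bot identifier. The URL and header string are hypothetical, and in practice sites layer on JavaScript challenges, rate limits, and IP reputation checks that a header alone will not defeat.

    # Minimal sketch: a crawler identifying itself as a desktop browser
    # rather than as a bot. Purely illustrative; doing this may violate a
    # site's terms of service.
    import urllib.request

    BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                  "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36")

    req = urllib.request.Request(
        "https://example.com/article/123",
        headers={"User-Agent": BROWSER_UA},  # instead of e.g. "MyCrawler/1.0"
    )
    with urllib.request.urlopen(req) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    print(len(html), "bytes fetched")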

LPisGood 3 hours ago | parent | prev | next [-]

The government having the power to curate access to information seems bad. You could try to separate it as an independent agency, but as the current US administration is showing, that’s not really a thing.

upboundspiral 17 minutes ago | parent [-]

The idea is that the government is biased towards hiding certain information, while private companies are biased towards hiding a different set.

While unlikely, the ideal would be for the government to provide foundational open search infrastructure that people could build on and expand to fit their needs, in a way that is hard to do when a private company eschews competition and hides its techniques.

Perhaps it would be better for there to be a sanctioned crawler, funded by the government, that then sells the unfiltered information to third parties like Google. This would protect IP rights while ensuring open access to information.

underlipton 3 hours ago | parent | prev [-]

I'm feeling it. Addressing the other reply: zero moderation or curation, and zero shielding from the crawler, if what you've posted is on a public network. Yes, users will be able to access anything they can think of. And the government will know. I think you don't have to worry about them censoring content; they'll be perfectly happy to know who's searching for CSAM or bomb-making materials. And if people have an issue with what the government does with this information (for example, charging people who search for things the Tangerine-in-Chief doesn't want you to see), you stop it at the point of prosecution, not data access. (This does only work in a society with a functioning democracy... but free information access is also what enables that. As Americans, with our red-hot American blood, do we dare?)