libreplanet-discuss

From: Paul D. Fernhout
Subject: Re: Are websites closing down en masse? (distributed free standards and tools)
Date: Wed, 13 Dec 2023 09:11:21 -0500
User-agent: Mozilla Thunderbird

A search on "average lifespan of a web page" (and similar) produces estimates ranging from forty days to just under three years (including an estimate supposedly derived from archive.org at some point). So, in general you are right that most web pages don't last very long, but this is not especially a new thing.

Still, I do wonder whether there are recent trends that might make this worse.

Part of the issue may be that Google tuned its results several years ago to prioritize recent content over older content (i.e. "freshness signals"). Social media companies likewise tend to promote new posts. Partly as a result, most web pages of various sorts get their greatest number of views in the hours or days after they are posted, so there are diminishing financial returns for keeping advertising-funded content up past a few days. Website design companies also make money by promoting "refreshes" to their clients every two years or so. So there are a lot of obvious financial incentives for various actors to put up new web pages, and fewer incentives to keep up old ones.

That said, I personally like the idea of permanent URLs. I try to keep the web pages I create up at the same URL; some have been up for over twenty-five years. But at some point, after I pass away (or just enter old age and perhaps poverty), will someone else want to keep those websites up? Priorities of individuals and communities can change over time.

One suggestion I've seen for "permanent" URLs is to include the date of creation in URLs for things like blog posts. Then, even if you change your content hosting platform, you can more easily keep the old URLs working. But all too often I see websites redone with all the old content discarded or moved, so old links are broken.
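To sketch that suggestion concretely: a date-based permalink can be derived entirely from facts about a post that never change, so the same path can be regenerated on any future hosting platform. The slug rules below are just one hypothetical scheme, not a standard.

```python
import re
from datetime import date

def permalink(title: str, created: date) -> str:
    """Build a date-based permalink path like /2023/12/13/my-post-title.

    Because the path depends only on the title and creation date (which
    never change), any future platform can reproduce the same URL.
    """
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"/{created:%Y/%m/%d}/{slug}"

print(permalink("Are websites closing down en masse?", date(2023, 12, 13)))
# -> /2023/12/13/are-websites-closing-down-en-masse
```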

Methods that use a hash of the content as the link may help preserve public content if we adopt more distributed systems, because the content is not tied to a specific domain that may expire or a specific server that may be retired.

Archive.org and similar efforts are amazing resources for keeping old content available. Wikipedia works with archive.org to provide archived copies of items linked from Wikipedia articles. I try to ensure content I put on the web on personal sites is archived there. But while I hope archive.org and similar projects will prosper for decades to come, there is no guarantee that archive.org will be around for the long term either, given funding, technology, management, and/or legal/political risks.

I like email as a way to personally archive some forms of content. It's been sad over the years to see Mozilla short-change Thunderbird (implicitly a distributed content system) in favor of Firefox (generally used to access centralized content) in how it has spent the roughly billion dollars a year it has received from Google and similar funders.

I was glad to read about "hyperdrive.el" from another poster as one more alternative for distributed content. I personally enjoy working on software in that area myself in my spare time.

A deeper issue, however, is the need for wide adoption of free standards for persistent distributed content (much as HTTP became a widely adopted standard). While coding is fun and potentially useful, for social software such standards matter more than good free implementations, even if the two ultimately go together and benefit each other, and a really good implementation that is widely adopted can define a de facto social standard.

I made a Lightning Talk for LibrePlanet 2022 related to that issue:
https://media.libreplanet.org/u/libreplanet/m/lightning-talk-free-libre-standards-for-social-media-and-other-communications/

As I say there, I think helping to define and promote such free standards for distributed content, and related tools, is a valuable role the FSF could play in fostering a more libre planet.

Some related content from archive.org:
https://blog.archive.org/tag/distributed-web/

--Paul Fernhout (pdfernhout.net)
"The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."

On 12/3/23 22:53, Akira Urushibata wrote:
Recently I feel I frequently encounter defunct links.  Links to
external material toward the bottom of Wikipedia articles often turn
out to be unavailable.

I don't know if there is any empirical data on this.


