Should I combat link rot with redirects? Can it be done unintrusively?

20 April 2011 at 19:56
Location: Other: Stack Exchange

I try to do my bit to keep the web healthy by minimizing the amount of link rot affecting my own site. This means tracking local 404s and fixing them, as well as manually updating broken external URL references I come across.

I’ve seen redirects suggested as one way to combat link rot. When all links point to local redirect addresses (such as example.com/?r=123456), the actual target URL is kept in a database and can be updated for the whole site in one place when it breaks.
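A minimal sketch of that lookup, assuming the scheme described above. The table, function name, and example id are illustrative assumptions; a real site would back this with a database rather than a dict, and the handler would emit the tuple as an actual HTTP response.

```python
# Hypothetical redirect table: local id -> current external target.
# When an external page moves, only this one entry needs updating.
LINKS = {
    "123456": "https://example.org/some/article",
}

def redirect_response(link_id):
    """Resolve a local redirect id (as in /?r=123456) to a response.

    Returns a (status, location) tuple: a permanent redirect when the
    id is known, a 404 with no location when it is not.
    """
    target = LINKS.get(link_id)
    if target is None:
        return (404, None)
    return (301, target)
```

Whether to answer with a 301 (permanent) or 302 (temporary) is itself a design choice: a 301 lets browsers and crawlers cache the target, which undermines the ability to fix the link later, so a short-lived or uncached redirect may suit this use better.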

However, as a user I generally dislike redirect systems myself, since they make copying the actual target URLs somewhat cumbersome (have a look at the title links on a Google search results page, for example).

Then again, maybe I’ve used sites that do employ redirects but do it transparently enough for me not to even notice.

Is a transparent (or nearly so) redirect system possible at all? Do such systems already exist, or should I roll my own?

Also, I’d be interested to hear whether there are other major downsides to using redirects. So far, the user annoyance mentioned above has been enough to keep me from going ahead with this technique.

Reply to the message in its context (Stack Exchange)