
Re: [Lynx-dev] urls longer than 1024 characters


From: Bela Lubkin
Subject: Re: [Lynx-dev] urls longer than 1024 characters
Date: Fri, 27 Jan 2023 17:24:07 -0800

Jude DaShiell wrote:

> Would it be possible for lynx to count the characters in an url and if the
> url is longer than 1024 characters offer to send the long url to an url
> shortening service and then catch the shortened url the service sent back
> and then open that shortened url instead?

As others have mentioned, this won't work since you'd just get
redirected back to the long URL.

In my experience, long URLs are usually *chock full of crap*.  If you
copy-paste the URL into an editor and break it out into '&this=that'
segments, you'll probably be able to identify one or more huge ones
that are obviously useless.  Remove those, reassemble the rest, and
proceed.
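That split-trim-reassemble step can be sketched in a few lines of
Python (the 100-character cutoff and the example URL are arbitrary
choices of mine, not anything lynx does):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_long_params(url, max_len=100):
    """Drop query parameters whose values exceed max_len characters."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if len(v) <= max_len]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# The 400-character session_key is dropped; the short parameter survives.
url = "https://example.com/search?q=lynx&session_key=" + "x" * 400
print(strip_long_params(url))  # https://example.com/search?q=lynx
```

Doing it by eye in an editor gives you finer control, of course; a
length cutoff is just a crude first pass.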

Is that a gigantic hassle?  Sure.  Not advocating this as a great
procedure to use all the time; but if you're stuck at one URL, it's
worth trying.

I'm talking about things like:

    &session_key=[400 chars of gibberish]
    &previous_session_key=[400 more chars of gibberish]

You can definitely set the backup key on fire!  Use your judgment.
There are almost always heaps and mounds of useless trash.  Another big
pile of junk you'll see is:

    &parameter14=
    &parameter15=
    &sub_query_boots=
    &gibber_jabber=

... a bunch of empty variable settings.  Burn them all to the ground.
Almost all receivers will automatically treat unspecified keys as an
empty default; there is zero reason to spell out explicitly empty
values in the URL parameters.
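Clearing out those empty settings mechanically looks like this (again a
sketch of my hand-editing habit, with a made-up example URL):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def drop_empty_params(url):
    """Remove '&key=' parameters that carry no value at all."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if v]
    return urlunsplit(parts._replace(query=urlencode(kept)))

url = "https://example.com/q?id=42&parameter14=&parameter15=&gibber_jabber="
print(drop_empty_params(url))  # https://example.com/q?id=42
```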

I constantly edit URLs like this.  It works.  Whenever I send someone
(or save for my own purposes) a URL to an item on Amazon, eBay, other
commerce services, libraries, etc. -- I smash it down to the smallest
*readable* URL I can manage.  Like: amazon.com/dp/B081KQ17R3, or
camelcamelcamel.com/product/B081KQ17R3 (well, they actually provide
succinct URLs in the first place...)
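For Amazon specifically, the whole URL boils down to the 10-character
product ID after /dp/.  A rough matcher for that (the /dp/ and
/gp/product/ path forms are the common ones I've seen; treat the regex
as an approximation, not a spec):

```python
import re

def amazon_short(url):
    """Reduce an Amazon product URL to amazon.com/dp/<ID>, if an ID is found."""
    m = re.search(r"/(?:dp|gp/product)/([A-Z0-9]{10})", url)
    return "amazon.com/dp/" + m.group(1) if m else url

long_url = ("https://www.amazon.com/Some-Product-Name/dp/B081KQ17R3"
            "/ref=sr_1_3?keywords=widget&qid=1674868947&sr=8-3")
print(amazon_short(long_url))  # amazon.com/dp/B081KQ17R3
```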

I've been doing this for at least 15 years and have yet to run into a
case where it caused an actual problem (like locking out my account).
You try an edit; either it works or you get the site's rendition of a
404 error, or some database error, and no harm is done.  Try again.

In something like a bug database lookup, you might mistakenly remove a
parameter which was helpfully thinning the result (e.g. the one that
means 'currently open bugs only'); so you get back a valid result, but
broader than intended.  So undo that removal, and carry on.

>Bela<


