bug-bash

Re: wget command with simple url makes bash crash


From: Greg Wooledge
Subject: Re: wget command with simple url makes bash crash
Date: Wed, 26 Aug 2009 08:19:46 -0400
User-agent: Mutt/1.4.2.3i

On Tue, Aug 25, 2009 at 02:40:24PM +0200, Oskar Hermansson wrote:
>     wget
> http://www.kohanaphp.com/download?modules%5Bauth%5D=Auth&vendors%5Bmarkdown%5D=Markdown&languages%5Ben_US%5D=en_US&format=zip
> 
>     If the command is placed in a file instead, the file is successfully
> downloaded:
>     wget `cat url.txt`

In the second version, the & characters in the URL are not seen by
the bash parser, and therefore do not cause bash to see three separate
commands.  Bash just sees the one command, although word splitting
would still be performed on the results of the substitution.  (In
simpler language, this means that if url.txt happened to contain any
whitespace, you'd still get undesired behavior.)
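A minimal sketch of that word-splitting behavior (the file name url.txt and its contents are just for illustration; the sketch writes a scratch file in the current directory):

```shell
# Scratch file whose contents contain a space.
printf 'http://example.com/a b' > url.txt

# Unquoted substitution: the shell splits the result on whitespace,
# so the command receives two separate arguments.
printf '<%s>\n' $(cat url.txt)    # prints two bracketed words

# Quoted substitution: one argument, whitespace preserved.
printf '<%s>\n' "$(cat url.txt)"  # prints one bracketed word
```

Here printf stands in for wget so each argument can be seen on its own line.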

I can't address the reason bash is exiting, but in order to make your
script work the way you intended, you should quote things:

  wget "http://...&foo=bar&baz=quux"
  wget "$myurl"
  wget "$(cat url.txt)"   # POSIX command substitution
  wget "`cat url.txt`"    # old Bourne shell style
  wget "$(<url.txt)"      # bash extension

The double quotes in the first example will prevent the parser from
treating the URL as three separate commands due to ampersands.  In
all five cases, the quotes will prevent word splitting of the URL
or the substitution, so it would work even if you had spaces in your
URL.
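To make the ampersand problem concrete, here is a sketch with a made-up URL (http://host/dl is hypothetical):

```shell
# Unquoted, a line like
#     wget http://host/dl?a=1&b=2&c=3
# is parsed by bash as three commands, because each '&' terminates a
# command and runs it in the background:
#     wget 'http://host/dl?a=1' &
#     b=2 &
#     c=3
# wget never sees the full URL, and the trailing pieces are just
# variable assignments running in the background.

# Quoting keeps the URL as a single word:
url='http://host/dl?a=1&b=2&c=3'
printf '%s\n' "$url"
```

The same reasoning applies to any other shell metacharacter in a URL, such as `?` or `*`, which is why quoting URLs is a good habit in general.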



