There are many ways to download and install wget without having wget itself installed. For example, one can use curl, a sort of competitor to wget, or a package manager with libfetch or some other library-level downloader integrated (such as pacman). One may be able to use SSH’s scp or sftp utility or even use netcat to transfer a wget tarball over a network. But these methods of obtaining wget are not always feasible or even possible whereas a bash shell and a few core utilities are often readily available.
I was introduced to the bash builtin /dev/tcp by warg the other day on x-tab#chat. He explained a basic use of this device by demonstrating how to download wget’s compressed tarball. The download process itself can be done with pure bash, but some post-processing of the downloaded file must be done to remove HTTP headers. I document warg’s application of /dev/tcp here because I found the idea fascinating and want this documentation for myself ;-).
Connecting and Downloading
To read about the /dev/tcp builtin for yourself, check out the following:
$ info '(bash) Redirections'
With the exec line we initiate the connection, allocating a file descriptor and storing the numeric file descriptor in the HTTP_FD variable. Then, with the echo line, we send an HTTP request through the descriptor to the server. After sending the request, we process the server’s response with the sed line, which skips over the HTTP headers sent by the server and stows the result in wget-latest.tar.gz. Note that this last command will sit around for a while; it is this command that performs the bulk of the data transfer. And, since you’re using shell redirections to download the file, you cannot see the download progress. Instead, wait for the command to complete. This also involves waiting for the server to time out your connection, since HTTP/1.1 keeps the connection open to support pipelining. After this process is completed, the wget-latest.tar.gz file is at your disposal.
$ WGET_HOSTNAME='ftp.gnu.org'
$ exec {HTTP_FD}<>/dev/tcp/${WGET_HOSTNAME}/80
$ echo -ne 'GET /gnu/wget/wget-latest.tar.gz HTTP/1.1\r\nHost: '\
${WGET_HOSTNAME}'\r\nUser-Agent: '\
'bash/'${BASH_VERSION}'\r\n\r\n' >&${HTTP_FD}
$ sed -e '1,/^.$/d' <&${HTTP_FD} >wget-latest.tar.gz
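To see what the sed line is doing without touching the network, here is a minimal offline sketch using a mock HTTP response (the file names response.bin and body.bin are made up for the demonstration):

```shell
# Build a mock HTTP/1.1 response: two header lines, a blank CRLF line,
# then the body. printf's \r emits the carriage return ending each header.
printf 'HTTP/1.1 200 OK\r\nContent-Type: application/octet-stream\r\n\r\nTARBALL-BYTES\n' > response.bin

# '1,/^.$/d' deletes from line 1 through the first line containing exactly
# one character -- the lone CR that separates the headers from the body.
sed -e '1,/^.$/d' < response.bin > body.bin

cat body.bin   # prints: TARBALL-BYTES
```

The same range expression is what leaves only the tarball bytes in wget-latest.tar.gz in the commands above.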
Now you have a wget source tarball on your machine. As long as you have tar and a compiler on the machine, you are well on your way to downloading stuff using a self-compiled wget. In the commands above, you may replace “gz” with “bz2” or “lzma” for smaller downloads if the machine you’re using has bzip2 or xz-utils installed. And, of course, it should not be too hard to repurpose the above code to download a particular version of wget or even a completely unrelated software package.
Please feel free to point out problems with this approach or give pointers on porting this to other environments :-).
I am just curious whether this approach would work for pre-made packages. I am currently working on a script that assumes you have wget installed. As far as I can tell, the minimum install of the OS does not have it, which means you have to download and install wget manually. If I could incorporate this code into my script and have it work correctly, the user would only have to run the script and wait for it to finish.
This method of downloading should work for most HTTP URLs. The only exception I can think of is when the server decides to use chunked encoding, which really only happens for streamed responses (normally not the case for file downloads). I can’t tell exactly what you’re trying to do, but if your script uses wget to fetch a particular file over HTTP, you could possibly just download the file you need directly instead of downloading wget and then building it. Or, if you mean you’re downloading a prebuilt wget, that should work just as well as downloading wget’s sources.
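If you want to guard against the chunked-encoding case, you can inspect the headers before stripping them. A hedged sketch using a mock header block (headers.bin is a made-up file name; a real chunked reply would carry this Transfer-Encoding header):

```shell
# Mock response headers ending with the blank CR line.
printf 'HTTP/1.1 200 OK\r\nTransfer-Encoding: chunked\r\n\r\n' > headers.bin

# '/^.$/q' prints lines up to and including the lone-CR separator, so we
# only inspect the header section. If the body would be chunk-framed,
# plain header-stripping would leave chunk-size lines mixed into the file.
if sed -e '/^.$/q' headers.bin | grep -qi '^Transfer-Encoding:[[:space:]]*chunked'; then
    echo 'response is chunked; plain header-stripping will not work'
fi
```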
Note that when using this method of downloading a file via HTTP, you might not notice failure cases that wget handles for you, such as an interrupted connection. This can leave you with a half-downloaded file instead of the full thing, so you should check for that case. For example, if you are downloading a .tar.gz or .tar.bz2 file, you can always run tar in -t mode on the file; that command should exit with a nonzero status if the file is corrupted.
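A small sketch of that integrity check (the file names here are made up; a known-good tarball is created locally just to demonstrate both outcomes):

```shell
# Create a known-good tarball to stand in for a completed download.
echo 'hello' > payload.txt
tar -czf good.tar.gz payload.txt

# tar -t lists the archive contents; a truncated or corrupted archive
# makes it exit nonzero, which catches half-finished downloads.
if tar -tzf good.tar.gz >/dev/null 2>&1; then
    echo 'archive looks intact'
fi

# Simulate a half-downloaded file by keeping only the first 10 bytes.
head -c 10 good.tar.gz > truncated.tar.gz
tar -tzf truncated.tar.gz >/dev/null 2>&1 || echo 'truncated archive detected'
```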
What I am trying to do is build a script to make a Slackware min install into a usable environment. I have tried modifying this code but I get errors when running it.
./test.sh: line 3: $: command not found
./test.sh: line 4: : Name or service not known
./test.sh: line 4: HTTP_FD: Invalid argument
./test.sh: line 5: ${HTTP_FD}: ambiguous redirect
./test.sh: line 8: ${HTTP_FD}: ambiguous redirect
Evidently, on a Slackware min install, the programs required to even do this much are not installed. The command not found error is on the WGET_HOSTNAME line. I am trying to install wget automatically so that the script can then use it to install the rest of the packages needed.
The `$` characters are not meant to be copied into the script; they just represent an interactive shell’s prompt. The line 4 error might be caused by an inability to resolve the hostname, so check that DNS is working (for example, check that getent hosts ${WGET_HOSTNAME} works). Since you have a syntax error on line 3, ${WGET_HOSTNAME} expands to the empty string, which explains the rest: the virtual device path becomes /dev/tcp//80, and bash runs into problems when it tries to look up `80` or the empty string using DNS.
If you need help using this, please just ask me directly in irc://irc.ohnopub.net/protofusion :-p
Interesting. But why would I need this page if I can already read it? And what can I do when I have no wget and cannot read this page because I have no web browser? ;)
You may someday end up in the unlikely scenario on a unix system where a version of bash supporting /dev/tcp is available but ftp, fetch, curl, wget, nc, and sftp (or whatever else you might use) are all unavailable. This trick might be enough to get you by in such a situation. It is certainly much more convenient than downloading the file on another machine and placing it on physical media like a CD or flash drive. I mean, using physical media might require you to stand up from your seat, or might even be impossible if you’re accessing a console remotely! (And if you’re accessing a remote terminal, your local non-dumb terminal (e.g., a PC) might have a web browser able to read this article.)

Hopefully you never end up in a situation where the information in this article is essential to complete the task at hand. But it is a nifty example of using bash’s /dev/tcp which I, again, have to thank warg for showing me.