RE: Function to download a URL and return it as a string
From: Drew Adams
Subject: RE: Function to download a URL and return it as a string
Date: Sat, 8 Dec 2012 21:38:15 -0800
> I recently wanted to be able to download a document from a URL and
> return it as a string. I couldn't find a function that did that
> precisely, but I was able to construct my own:
>
> (defun download (url)
>   (with-current-buffer (url-retrieve-synchronously url)
>     (prog2
>         (when (not (search-forward-regexp "^$" nil t))
>           (error "Unable to locate downloaded data"))
>         (buffer-substring (1+ (point)) (point-max))
>       (kill-buffer))))
>
> This seems rather busy for such a basic-seeming operation -- in
> particular, having to take care to delete the buffer created by
> `url-retrieve-synchronously'.  Is there a better way to do it?
Dunno anything about whether there is a better way, but consider using
`unwind-protect', with `kill-buffer' as your cleanup form.  And to play it
safe, pass the actual buffer (the one returned from
`url-retrieve-synchronously') as the argument to `kill-buffer'.  Using
`unwind-protect' ensures the buffer is killed when you're done, no matter
what, even if an error is signaled partway through.
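
For what it's worth, here is one way that suggestion might look, keeping the
original poster's function name.  This is only a sketch of the
`unwind-protect' idea, not a tested drop-in replacement: the buffer is bound
first, then killed in the cleanup form regardless of whether the regexp
search succeeds:

(require 'url)

(defun download (url)
  "Download URL synchronously and return the response body as a string."
  (let ((buffer (url-retrieve-synchronously url)))
    (unwind-protect
        (with-current-buffer buffer
          ;; The body starts after the first blank line (end of headers).
          (unless (search-forward-regexp "^$" nil t)
            (error "Unable to locate downloaded data"))
          (buffer-substring (1+ (point)) (point-max)))
      ;; Cleanup form: runs even if the `error' above is signaled.
      (kill-buffer buffer))))

Note that `kill-buffer' gets the saved buffer object explicitly, so the
cleanup does not depend on which buffer happens to be current when the
unwind runs.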