
bug#16733: messed up unicode chars in package description


From: Juanma Barranquero
Subject: bug#16733: messed up unicode chars in package description
Date: Thu, 20 Mar 2014 06:12:24 +0100

On Wed, Mar 19, 2014 at 7:05 PM, Glenn Morris <rgm@gnu.org> wrote:

> I'd favour autoloading url-insert rather than requiring.
> (I see url-insert-file-contents has an autoload cookie, so maybe that is
> supposed to be the relevant entry point.)

All in all, trying to use url-insert(-file-contents) has been less
than straightforward.

- url-insert-file-contents can be used, and in fact it simplifies the
code in package--with-work-buffer quite a bit. A nicety is that, if
the HTTP response headers do not set an explicit coding, it guesses.
The catch is that it does not detect missing URLs, i.e., if you point
it to http://www.gnu.org/does-not-exist it will happily, and
successfully, return the 404 Page Not Found error page. (That, BTW,
makes package-handle-response irrelevant, because it is no longer used
anywhere.)
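
For illustration, the straightforward use would look something like
this (the URL is just an example, and this is only a sketch, not part
of any patch):

(with-temp-buffer
  ;; url-insert-file-contents is autoloaded, so no explicit require is
  ;; needed.  It fetches the URL, strips the HTTP headers and decodes
  ;; the body, guessing the coding system when the response does not
  ;; declare one.
  (url-insert-file-contents
   "http://elpa.gnu.org/packages/ascii-art-to-unicode-readme.txt")
  (buffer-string))

;; Pointing it at http://www.gnu.org/does-not-exist "succeeds" just the
;; same, inserting the HTML of the 404 error page instead of signaling.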

- url-insert works on the buffer returned by
url-retrieve-synchronously, so package-handle-response can be used to
detect errors, i.e., response codes not in [200, 300). The problem is
that if the response headers do not include a coding, url-insert
doesn't know what to do. Ironically, that's exactly the case with
Glenn's original report:

C:\> lwp-request -m HEAD http://elpa.gnu.org/packages/ascii-art-to-unicode-readme.txt
200 OK
[...]
Content-Length: 1255
Content-Type: text/plain
[...]
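
For comparison, a sketch of the url-retrieve-synchronously + url-insert
route, with the same check that package-handle-response does (again,
just an illustration with an example URL):

(require 'url-http)      ; for url-http-parse-response
(require 'url-handlers)  ; for url-insert
(let ((buffer (url-retrieve-synchronously
               "http://elpa.gnu.org/packages/ascii-art-to-unicode-readme.txt")))
  (unless buffer
    (error "Retrieving the URL failed"))
  (unwind-protect
      (with-temp-buffer
        ;; Reject response codes outside [200, 300), as
        ;; package-handle-response does.
        (let ((response (with-current-buffer buffer
                          (url-http-parse-response))))
          (when (or (< response 200) (>= response 300))
            (error "Error downloading: %s" response)))
        ;; url-insert decodes the body only when the headers name a
        ;; charset; with a bare "Content-Type: text/plain", as above, it
        ;; inserts the bytes undecoded and it is up to the caller to
        ;; guess the coding.
        (url-insert buffer)
        (buffer-string))
    (kill-buffer buffer)))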

So, at this point, I see the following alternatives:

1) Leave it as it is now, with
detect-coding-string/decode-coding-string (or perhaps
decode-coding-(inserted-)region).

2) Use url-insert-file-contents and accept that it will return the
wrong contents for missing URLs; that could be acceptable, depending
on how consistent the data returned by ELPA and other repositories is
(this is my preferred fix, BTW).

3) Copy all or some of the logic of url-insert-file-contents and
create a package-* version which checks for errors with
package-handle-response (ugly duplication of code; a rough sketch
follows the list).

4) Fix url-insert-file-contents so it can (optionally) check for
errors (after the freeze).
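
To make the duplication in 3) concrete, here is a rough, hypothetical
package--url-insert-file-contents; the name and the details are only an
illustration, not part of the patch below:

(require 'url-handlers)  ; for url-insert

(defun package--url-insert-file-contents (url)
  "Insert the body of URL into the current buffer, checking for HTTP errors.
Hypothetical helper duplicating part of `url-insert-file-contents'."
  (let ((buffer (url-retrieve-synchronously url)))
    (unless buffer
      (error "Error downloading %s" url))
    (unwind-protect
        (progn
          ;; Reuse package-handle-response: response codes outside
          ;; [200, 300) signal an error.
          (with-current-buffer buffer
            (package-handle-response))
          ;; url-insert decodes only when the headers name a charset;
          ;; otherwise guess, as url-insert-file-contents itself does.
          (unless (cadr (url-insert buffer))
            (decode-coding-inserted-region (point-min) (point-max) url)))
      (kill-buffer buffer))))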



The patch for 2), as an example:


=== modified file 'lisp/emacs-lisp/package.el'
--- lisp/emacs-lisp/package.el  2014-03-19 16:14:26 +0000
+++ lisp/emacs-lisp/package.el  2014-03-20 05:09:52 +0000
@@ -770,38 +770,16 @@
 and evaluates BODY while that buffer is current.  This work
 buffer is killed afterwards.  Return the last value in BODY."
   (declare (indent 2) (debug t))
-  `(let* ((http (string-match "\\`https?:" ,location))
-         (buffer
-          (if http
-              (url-retrieve-synchronously (concat ,location ,file))
-            (generate-new-buffer "*package work buffer*"))))
-     (prog1
-        (with-current-buffer buffer
-          (if http
-              (progn (package-handle-response)
-                     (re-search-forward "^$" nil 'move)
-                     (forward-char)
-                     (delete-region (point-min) (point)))
-            (unless (file-name-absolute-p ,location)
-              (error "Archive location %s is not an absolute file name"
-                     ,location))
-            (insert-file-contents (expand-file-name ,file ,location)))
-          ,@body)
-       (kill-buffer buffer))))
-
-(defun package-handle-response ()
-  "Handle the response from a `url-retrieve-synchronously' call.
-Parse the HTTP response and throw if an error occurred.
-The url package seems to require extra processing for this.
-This should be called in a `save-excursion', in the download buffer.
-It will move point to somewhere in the headers."
-  ;; We assume HTTP here.
-  (require 'url-http)
-  (let ((response (url-http-parse-response)))
-    (when (or (< response 200) (>= response 300))
-      (error "Error downloading %s:%s"
-            (url-recreate-url url-http-target-url)
-            (buffer-substring-no-properties (point) (line-end-position))))))
+  `(with-temp-buffer
+     (if (string-match-p "\\`https?:" ,location)
+        (progn
+          (url-insert-file-contents (concat ,location ,file))
+          (goto-char (point-min)))
+       (unless (file-name-absolute-p ,location)
+        (error "Archive location %s is not an absolute file name"
+               ,location))
+       (insert-file-contents (expand-file-name ,file ,location)))
+     ,@body))

 (defun package--archive-file-exists-p (location file)
   (let ((http (string-match "\\`https?:" location)))
@@ -1272,7 +1250,7 @@
                     (car archive)))))
       ;; Read the retrieved buffer to make sure it is valid (e.g. it
       ;; may fetch a URL redirect page).
-      (when (listp (read buffer))
+      (when (listp (read (current-buffer)))
        (make-directory dir t)
        (setq buffer-file-name (expand-file-name file dir))
        (let ((version-control 'never)
@@ -1531,8 +1509,7 @@
                        (setq readme-string (buffer-string))
                        t))
                 (error nil))
-              (let ((coding (detect-coding-string readme-string t)))
-                (insert (decode-coding-string readme-string coding t))))
+              (insert readme-string))
              ((file-readable-p readme)
               (insert-file-contents readme)
               (goto-char (point-max))))))))




