
url-retrieve fails on most HTTPS sites

From: Nicolas Graner
Subject: url-retrieve fails on most HTTPS sites
Date: Mon, 28 Sep 2020 22:41:02 +0200

Hello all,

Using Emacs 26.1 on Debian, I am trying to open various web pages with
url-retrieve. It works with all HTTP and some HTTPS connections, but
fails with most HTTPS sites. For instance with Wikipedia:

  (switch-to-buffer (url-retrieve "" '(lambda (&rest ignore))))
The buffer that shows up is empty. Tracing url-http-async-sentinel gives:

1 -> (url-http-async-sentinel #<process> "open
1 <- url-http-async-sentinel: nil
1 -> (url-http-async-sentinel #<process> "connection broken by remote peer
1 <- url-http-async-sentinel: nil
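
One way to get more detail on what happens before the peer closes the
connection might be to turn on the url library's debug logging and raise
the GnuTLS log level (a sketch; `url-debug` and `gnutls-log-level` are
standard variables in Emacs 26, and the Wikipedia front-page URL here is
only an assumed example, since the archive stripped the original one):

  ;; Sketch: verify GnuTLS support and log the failing request.
  (require 'gnutls)
  (gnutls-available-p)       ; nil would mean Emacs lacks GnuTLS support
  (setq url-debug t)         ; log url activity to the *URL-DEBUG* buffer
  (setq gnutls-log-level 2)  ; have GnuTLS report handshake details
  (url-retrieve "https://www.wikipedia.org/"  ; assumed example URL
                (lambda (&rest _ignore) (message "done")))

After the failure, the *URL-DEBUG* buffer and *Messages* should show how
far the handshake got.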

In contrast, The Conversation is OK:
  (switch-to-buffer (url-retrieve "" '(lambda (&rest ignore))))
works fine: the buffer contains the HTTP headers followed by an HTML page.

I don't use a proxy. The result is the same whether url-privacy-level
is 'none or 'paranoid.

Any suggestions as to what could make some sites close the connection,
or how I could find out more?
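
One thing I could try is bypassing url-retrieve entirely and opening a
TLS connection by hand, to see whether the handshake itself is what the
server rejects (a sketch; `open-network-stream` with `:type 'tls` is the
standard built-in, and wikipedia.org is just an assumed example host):

  ;; Sketch: direct TLS connection, no url library involved.
  (let ((proc (open-network-stream
               "tls-test" "*tls-test*" "wikipedia.org" 443  ; example host
               :type 'tls)))
    ;; Send a minimal request so the server has something to answer.
    (process-send-string
     proc "HEAD / HTTP/1.1\r\nHost: wikipedia.org\r\nConnection: close\r\n\r\n")
    (sleep-for 2)
    (with-current-buffer "*tls-test*" (buffer-string)))

If this also dies with "connection broken by remote peer", the problem
would sit in the TLS layer rather than in url-retrieve itself.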

Thanks for your help.
