
[bug#28129] [PATCH 0/1] gnu: python-internetarchive: Update to 1.7.1.


From: Oleg Pykhalov
Subject: [bug#28129] [PATCH 0/1] gnu: python-internetarchive: Update to 1.7.1.
Date: Thu, 17 Aug 2017 23:35:23 +0300
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/25.2 (gnu/linux)

https://debbugs.gnu.org/cgi/bugreport.cgi?bug=27699

Danny Milosavljevic <address@hidden> writes:
> After I fixed up the test invocation, still 11 tests of 105 fail,
> apparently mostly because the Requests mock doesn't work.  Could you
> take a look?

> The mocking is done in tests/conftest.py in internetarchive-1.6.0.

11 tests failed, and it looks like all of them require an Internet
connection.  When Guix builds a package, there is no networking inside
the chroot, is there?

So we cannot pass those tests.  Could we just disable them selectively
(not all 105)?
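
Something along these lines might work -- an untested sketch that
replaces the check phase.  Only "test_get_item_with_kwargs" is known by
name from the log below, so the "-k" expression and the ignored files
are guesses that would need to be completed:

--8<---------------cut here---------------start------------->8---
(arguments
 `(#:phases
   (modify-phases %standard-phases
     (replace 'check
       (lambda _
         ;; The build chroot has no network, so deselect the tests
         ;; that talk to archive.org.  The test and file names below
         ;; are examples taken from the failure log, not a complete
         ;; list.
         (zero? (system* "py.test"
                         "-k" "not test_get_item_with_kwargs"
                         "--ignore=tests/cli/test_ia.py"
                         "--ignore=tests/cli/test_ia_download.py")))))))
--8<---------------cut here---------------end--------------->8---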

Thanks.

--8<---------------cut here---------------start------------->8---
starting phase `check'
============================= test session starts ==============================
platform linux -- Python 3.5.3, pytest-3.0.7, py-1.4.32, pluggy-0.4.0
rootdir: /tmp/guix-build-python-internetarchive-1.7.1.drv-0/internetarchive-1.7.1, inifile: setup.cfg
plugins: hypothesis-3.1.0, capturelog-0.7
collected 105 items

tests/test_api.py ......F.............
tests/test_bad_data.py .
tests/test_config.py .........
tests/test_exceptions.py .
tests/test_item.py .............................
tests/test_session.py ...
tests/test_utils.py .........
tests/cli/test_argparser.py ..
tests/cli/test_ia.py F
tests/cli/test_ia_download.py FFFFFFFFF
tests/cli/test_ia_list.py ........
tests/cli/test_ia_metadata.py ...
tests/cli/test_ia_search.py ..
tests/cli/test_ia_upload.py ........

=================================== FAILURES ===================================
__________________________ test_get_item_with_kwargs ___________________________

self = <requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>

    def _new_conn(self):
        """ Establish a socket connection and set nodelay settings on it.
    
            :return: New socket connection.
            """
        extra_kw = {}
        if self.source_address:
            extra_kw['source_address'] = self.source_address
    
        if self.socket_options:
            extra_kw['socket_options'] = self.socket_options
    
        try:
            conn = connection.create_connection(
>               (self.host, self.port), self.timeout, **extra_kw)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connection.py:141:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('archive.org', 443), timeout = 1e-13, source_address = None
socket_options = [(6, 1, 1)]

    def create_connection(address, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, socket_options=None):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        An host of '' or port 0 tells the OS to use the default.
        """
    
        host, port = address
        if host.startswith('['):
            host = host.strip('[]')
        err = None
    
        # Using the value from allowed_gai_family() in the context of getaddrinfo lets
        # us select whether to work with IPv4 DNS records, IPv6 records, or both.
        # The original create_connection function always returns all records.
        family = allowed_gai_family()
    
>       for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/util/connection.py:60:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

host = 'archive.org', port = 443, family = <AddressFamily.AF_UNSPEC: 0>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0

    def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
        """Resolve host and port into list of address info entries.
    
        Translate the host/port argument into a sequence of 5-tuples that contain
        all the necessary arguments for creating a socket connected to that service.
        host is a domain name, a string representation of an IPv4/v6 address or
        None. port is a string service name such as 'http', a numeric port number or
        None. By passing None as the value of host and port, you can pass NULL to
        the underlying C API.
    
        The family, type and proto arguments can be optionally specified in order to
        narrow the list of addresses returned. Passing zero as a value for each of
        these arguments selects the full range of results.
        """
        # We override this function since we want to translate the numeric family
        # and socket type values to enum constants.
        # and socket type values to enum constants.
        addrlist = []
>       for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E       socket.gaierror: [Errno -2] Name or service not known

/gnu/store/3aw9x28la9nh8fzkm665d7fywxzbl15j-python-3.5.3/lib/python3.5/socket.py:733: gaierror

During handling of the above exception, another exception occurred:

self = <requests.packages.urllib3.connectionpool.HTTPSConnectionPool object at 0x7f92346f96d8>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; None) Python/3.5.3'}
retries = Retry(total=0, connect=0, read=3, redirect=0), redirect = False
assert_same_host = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92346f9630>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f9234960f60>
is_new_proxy_conn = False

    def urlopen(self, method, url, body=None, headers=None, retries=None,
                redirect=True, assert_same_host=True, timeout=_Default,
                pool_timeout=None, release_conn=None, chunked=False,
                body_pos=None, **response_kw):
        """
            Get a connection from the pool and perform an HTTP request. This is the
            lowest level call for making a request, so you'll need to specify all
            the raw details.
    
            .. note::
    
               More commonly, it's appropriate to use a convenience method provided
               by :class:`.RequestMethods`, such as :meth:`request`.
    
            .. note::
    
               `release_conn` will only behave as expected if
               `preload_content=False` because we want to make
               `preload_content=False` the default behaviour someday soon without
               breaking backwards compatibility.
    
            :param method:
                HTTP request method (such as GET, POST, PUT, etc.)
    
            :param body:
                Data to send in the request body (useful for creating
                POST requests, see HTTPConnectionPool.post_url for
                more convenience).
    
            :param headers:
                Dictionary of custom headers to send, such as User-Agent,
                If-None-Match, etc. If None, pool headers are used. If provided,
                these headers completely replace any pool-specific headers.
    
            :param retries:
                Configure the number of retries to allow before raising a
                :class:`~urllib3.exceptions.MaxRetryError` exception.
    
                Pass ``None`` to retry until you receive a response. Pass a
                :class:`~urllib3.util.retry.Retry` object for fine-grained control
                over different types of retries.
                Pass an integer number to retry connection errors that many times,
                but no other types of errors. Pass zero to never retry.
    
                If ``False``, then retries are disabled and any exception is raised
                immediately. Also, instead of raising a MaxRetryError on redirects,
                the redirect response will be returned.
    
            :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
            :param redirect:
                If True, automatically handle redirects (status codes 301, 302,
                303, 307, 308). Each redirect counts as a retry. Disabling retries
                will disable redirect, too.
    
            :param assert_same_host:
                If ``True``, will make sure that the host of the pool requests is
                consistent else will raise HostChangedError. When False, you can
                use the pool on an HTTP proxy and request foreign hosts.
    
            :param timeout:
                If specified, overrides the default timeout for this one
                request. It may be a float (in seconds) or an instance of
                :class:`urllib3.util.Timeout`.
    
            :param pool_timeout:
                If set and the pool is set to block=True, then this method will
                block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                connection is available within the time period.
    
            :param release_conn:
                If False, then the urlopen call will not release the connection
                back into the pool once a response is received (but will release if
                you read the entire contents of the response such as when
                `preload_content=True`). This is useful if you're not preloading
                the response's content immediately. You will need to call
                ``r.release_conn()`` on the response ``r`` to return the connection
                back into the pool. If None, it takes the value of
                ``response_kw.get('preload_content', True)``.
    
            :param chunked:
                If True, urllib3 will send the body using chunked transfer
                encoding. Otherwise, urllib3 will send the body using the standard
                content-length form. Defaults to False.
    
            :param int body_pos:
                Position to seek to in file-like body in the event of a retry or
                redirect. Typically this won't need to be set because urllib3 will
                auto-populate the value when needed.
    
            :param \\**response_kw:
                Additional parameters are passed to
                :meth:`urllib3.response.HTTPResponse.from_httplib`
            """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get('preload_content', True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == 'http':
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(conn, method, url,
                                                  timeout=timeout_obj,
                                                  body=body, headers=headers,
>                                                 chunked=chunked)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:600:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPSConnectionPool object at 0x7f92346f96d8>
conn = <requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>
method = 'GET', url = '/metadata/nasa'
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f9234960f60>
chunked = False
httplib_request_kw = {'body': None, 'headers': {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; None) Python/3.5.3'}}
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f9234660d68>

    def _make_request(self, conn, method, url, timeout=_Default, chunked=False,
                      **httplib_request_kw):
        """
            Perform a request on a given urllib connection object taken from our
            pool.
    
            :param conn:
                a connection from one of our connection pools
    
            :param timeout:
                Socket timeout in seconds for the request. This can be a
                float or integer, which will set the same timeout value for
                the socket connect and the socket read, or an instance of
                :class:`urllib3.util.Timeout`, which gives you more fine-grained
                control over your timeouts.
            """
        self.num_requests += 1
    
        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout
    
        # Trigger any extra validation we need to do.
        try:
>           self._validate_conn(conn)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:345:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPSConnectionPool object at 0x7f92346f96d8>
conn = <requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>

    def _validate_conn(self, conn):
        """
            Called right before a request is made, after the socket is created.
            """
        super(HTTPSConnectionPool, self)._validate_conn(conn)
    
        # Force connect early to allow us to validate the connection.
        if not getattr(conn, 'sock', None):  # AppEngine might not have  `.sock`
>           conn.connect()

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:844:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>

    def connect(self):
        # Add certificate verification
>       conn = self._new_conn()

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connection.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>

    def _new_conn(self):
        """ Establish a socket connection and set nodelay settings on it.
    
            :return: New socket connection.
            """
        extra_kw = {}
        if self.source_address:
            extra_kw['source_address'] = self.source_address
    
        if self.socket_options:
            extra_kw['socket_options'] = self.socket_options
    
        try:
            conn = connection.create_connection(
                (self.host, self.port), self.timeout, **extra_kw)
    
        except SocketTimeout as e:
            raise ConnectTimeoutError(
                self, "Connection to %s timed out. (connect timeout=%s)" %
                (self.host, self.timeout))
    
        except SocketError as e:
            raise NewConnectionError(
>               self, "Failed to establish a new connection: %s" % e)
E           requests.packages.urllib3.exceptions.NewConnectionError: <requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>: Failed to establish a new connection: [Errno -2] Name or service not known

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connection.py:150: NewConnectionError

During handling of the above exception, another exception occurred:

self = <requests.adapters.HTTPAdapter object at 0x7f92346f0780>
request = <PreparedRequest [GET]>, stream = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92346f9630>
verify = True, cert = None, proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
            :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
            :param stream: (optional) Whether to stream the request content.
            :param timeout: (optional) How long to wait for the server to send
                data before giving up, as a float, or a :ref:`(connect timeout,
                read timeout) <timeouts>` tuple.
            :type timeout: float or tuple
            :param verify: (optional) Whether to verify SSL certificates.
            :param cert: (optional) Any user-provided SSL certificate to be trusted.
            :param proxies: (optional) The proxies dictionary to apply to the request.
            :rtype: requests.Response
            """
    
        conn = self.get_connection(request.url, proxies)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {0}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
>                   timeout=timeout
                )

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/adapters.py:423:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPSConnectionPool object at 0x7f92346f96d8>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; None) Python/3.5.3'}
retries = Retry(total=2, connect=2, read=3, redirect=0), redirect = False
assert_same_host = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92346f9630>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True
err = NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f92346f98d0>: Failed to establish a new connection: [Errno -2] Name or service not known',)
clean_exit = False
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92346f9550>
is_new_proxy_conn = False

    def urlopen(self, method, url, body=None, headers=None, retries=None,
                redirect=True, assert_same_host=True, timeout=_Default,
                pool_timeout=None, release_conn=None, chunked=False,
                body_pos=None, **response_kw):
        """
            Get a connection from the pool and perform an HTTP request. This is the
            lowest level call for making a request, so you'll need to specify all
            the raw details.
    
            .. note::
    
               More commonly, it's appropriate to use a convenience method provided
               by :class:`.RequestMethods`, such as :meth:`request`.
    
            .. note::
    
               `release_conn` will only behave as expected if
               `preload_content=False` because we want to make
               `preload_content=False` the default behaviour someday soon without
               breaking backwards compatibility.
    
            :param method:
                HTTP request method (such as GET, POST, PUT, etc.)
    
            :param body:
                Data to send in the request body (useful for creating
                POST requests, see HTTPConnectionPool.post_url for
                more convenience).
    
            :param headers:
                Dictionary of custom headers to send, such as User-Agent,
                If-None-Match, etc. If None, pool headers are used. If provided,
                these headers completely replace any pool-specific headers.
    
            :param retries:
                Configure the number of retries to allow before raising a
                :class:`~urllib3.exceptions.MaxRetryError` exception.
    
                Pass ``None`` to retry until you receive a response. Pass a
                :class:`~urllib3.util.retry.Retry` object for fine-grained control
                over different types of retries.
                Pass an integer number to retry connection errors that many times,
                but no other types of errors. Pass zero to never retry.
    
                If ``False``, then retries are disabled and any exception is raised
                immediately. Also, instead of raising a MaxRetryError on redirects,
                the redirect response will be returned.
    
            :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
            :param redirect:
                If True, automatically handle redirects (status codes 301, 302,
                303, 307, 308). Each redirect counts as a retry. Disabling retries
                will disable redirect, too.
    
            :param assert_same_host:
                If ``True``, will make sure that the host of the pool requests is
                consistent else will raise HostChangedError. When False, you can
                use the pool on an HTTP proxy and request foreign hosts.
    
            :param timeout:
                If specified, overrides the default timeout for this one
                request. It may be a float (in seconds) or an instance of
                :class:`urllib3.util.Timeout`.
    
            :param pool_timeout:
                If set and the pool is set to block=True, then this method will
                block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                connection is available within the time period.
    
            :param release_conn:
                If False, then the urlopen call will not release the connection
                back into the pool once a response is received (but will release if
                you read the entire contents of the response such as when
                `preload_content=True`). This is useful if you're not preloading
                the response's content immediately. You will need to call
                ``r.release_conn()`` on the response ``r`` to return the connection
                back into the pool. If None, it takes the value of
                ``response_kw.get('preload_content', True)``.
    
            :param chunked:
                If True, urllib3 will send the body using chunked transfer
                encoding. Otherwise, urllib3 will send the body using the standard
                content-length form. Defaults to False.
    
            :param int body_pos:
                Position to seek to in file-like body in the event of a retry or
                redirect. Typically this won't need to be set because urllib3 will
                auto-populate the value when needed.
    
            :param \\**response_kw:
                Additional parameters are passed to
                :meth:`urllib3.response.HTTPResponse.from_httplib`
            """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get('preload_content', True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == 'http':
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(conn, method, url,
                                                  timeout=timeout_obj,
                                                  body=body, headers=headers,
                                                  chunked=chunked)
    
            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None
    
            # Pass method to Response for length checking
            response_kw['request_method'] = method
    
            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(httplib_response,
                                                     pool=self,
                                                     connection=response_conn,
                                                     retries=retries,
                                                     **response_kw)
    
            # Everything went great!
            clean_exit = True
    
        except queue.Empty:
            # Timed out by queue.
            raise EmptyPoolError(self, "No pool connections are available.")
    
        except (BaseSSLError, CertificateError) as e:
            # Close the connection. If a connection is reused on which there
            # was a Certificate error, the next request will certainly raise
            # another Certificate error.
            clean_exit = False
            raise SSLError(e)
    
        except SSLError:
            # Treat SSLError separately from BaseSSLError to preserve
            # traceback.
            clean_exit = False
            raise
    
        except (TimeoutError, HTTPException, SocketError, ProtocolError) as e:
            # Discard the connection for these exceptions. It will be
            # be replaced during the next _get_conn() call.
            clean_exit = False
    
            if isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError('Cannot connect to proxy.', e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError('Connection aborted.', e)
    
            retries = retries.increment(method, url, error=e, _pool=self,
                                        _stacktrace=sys.exc_info()[2])
            retries.sleep()
    
            # Keep track of the error for the retry warning.
            err = e
    
        finally:
            if not clean_exit:
                # We hit some kind of exception, handled or otherwise. We need
                # to throw the connection away unless explicitly told not to.
                # Close the connection, set the variable to None, and make sure
                # we put the None back in the pool to avoid leaking it.
                conn = conn and conn.close()
                release_this_conn = True
    
            if release_this_conn:
                # Put the connection back to be reused. If the connection is
                # expired then it will be None, which will get replaced with a
                # fresh connection during _get_conn.
                self._put_conn(conn)
    
        if not conn:
            # Try again
            log.warning("Retrying (%r) after connection "
                        "broken by '%r': %s", retries, err, url)
            return self.urlopen(method, url, body, headers, retries,
                                redirect, assert_same_host,
                                timeout=timeout, pool_timeout=pool_timeout,
                                release_conn=release_conn, body_pos=body_pos,
>                               **response_kw)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:678:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPSConnectionPool object at 0x7f92346f96d8>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; None) Python/3.5.3'}
retries = Retry(total=1, connect=1, read=3, redirect=0), redirect = False
assert_same_host = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92346f9630>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True
err = NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f92346f9ba8>: Failed to establish a new connection: [Errno -2] Name or service not known',)
clean_exit = False
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92346f9b38>
is_new_proxy_conn = False

    def urlopen(self, method, url, body=None, headers=None, retries=None,
                redirect=True, assert_same_host=True, timeout=_Default,
                pool_timeout=None, release_conn=None, chunked=False,
                body_pos=None, **response_kw):
        """
            Get a connection from the pool and perform an HTTP request. This is the
            lowest level call for making a request, so you'll need to specify all
            the raw details.
    
            .. note::
    
               More commonly, it's appropriate to use a convenience method provided
               by :class:`.RequestMethods`, such as :meth:`request`.
    
            .. note::
    
               `release_conn` will only behave as expected if
               `preload_content=False` because we want to make
               `preload_content=False` the default behaviour someday soon without
               breaking backwards compatibility.
    
            :param method:
                HTTP request method (such as GET, POST, PUT, etc.)
    
            :param body:
                Data to send in the request body (useful for creating
                POST requests, see HTTPConnectionPool.post_url for
                more convenience).
    
            :param headers:
                Dictionary of custom headers to send, such as User-Agent,
                If-None-Match, etc. If None, pool headers are used. If provided,
                these headers completely replace any pool-specific headers.
    
            :param retries:
                Configure the number of retries to allow before raising a
                :class:`~urllib3.exceptions.MaxRetryError` exception.
    
                Pass ``None`` to retry until you receive a response. Pass a
                :class:`~urllib3.util.retry.Retry` object for fine-grained control
                over different types of retries.
                Pass an integer number to retry connection errors that many times,
                but no other types of errors. Pass zero to never retry.
    
                If ``False``, then retries are disabled and any exception is raised
                immediately. Also, instead of raising a MaxRetryError on redirects,
                the redirect response will be returned.
    
            :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
            :param redirect:
                If True, automatically handle redirects (status codes 301, 302,
                303, 307, 308). Each redirect counts as a retry. Disabling retries
                will disable redirect, too.
    
            :param assert_same_host:
                If ``True``, will make sure that the host of the pool requests is
                consistent else will raise HostChangedError. When False, you can
                use the pool on an HTTP proxy and request foreign hosts.
    
            :param timeout:
                If specified, overrides the default timeout for this one
                request. It may be a float (in seconds) or an instance of
                :class:`urllib3.util.Timeout`.
    
            :param pool_timeout:
                If set and the pool is set to block=True, then this method will
                block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                connection is available within the time period.
    
            :param release_conn:
                If False, then the urlopen call will not release the connection
                back into the pool once a response is received (but will release if
                you read the entire contents of the response such as when
                `preload_content=True`). This is useful if you're not preloading
                the response's content immediately. You will need to call
                ``r.release_conn()`` on the response ``r`` to return the connection
                back into the pool. If None, it takes the value of
                ``response_kw.get('preload_content', True)``.
    
            :param chunked:
                If True, urllib3 will send the body using chunked transfer
                encoding. Otherwise, urllib3 will send the body using the standard
                content-length form. Defaults to False.
    
            :param int body_pos:
                Position to seek to in file-like body in the event of a retry or
                redirect. Typically this won't need to be set because urllib3 will
                auto-populate the value when needed.
    
            :param \\**response_kw:
                Additional parameters are passed to
                :meth:`urllib3.response.HTTPResponse.from_httplib`
            """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get('preload_content', True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == 'http':
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(conn, method, url,
                                                  timeout=timeout_obj,
                                                  body=body, headers=headers,
                                                  chunked=chunked)
    
            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None
    
            # Pass method to Response for length checking
            response_kw['request_method'] = method
    
            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(httplib_response,
                                                     pool=self,
                                                     connection=response_conn,
                                                     retries=retries,
                                                     **response_kw)
    
            # Everything went great!
            clean_exit = True
    
        except queue.Empty:
            # Timed out by queue.
            raise EmptyPoolError(self, "No pool connections are available.")
    
        except (BaseSSLError, CertificateError) as e:
            # Close the connection. If a connection is reused on which there
            # was a Certificate error, the next request will certainly raise
            # another Certificate error.
            clean_exit = False
            raise SSLError(e)
    
        except SSLError:
            # Treat SSLError separately from BaseSSLError to preserve
            # traceback.
            clean_exit = False
            raise
    
        except (TimeoutError, HTTPException, SocketError, ProtocolError) as e:
            # Discard the connection for these exceptions. It will be
            # be replaced during the next _get_conn() call.
            clean_exit = False
    
            if isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError('Cannot connect to proxy.', e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError('Connection aborted.', e)
    
            retries = retries.increment(method, url, error=e, _pool=self,
                                        _stacktrace=sys.exc_info()[2])
            retries.sleep()
    
            # Keep track of the error for the retry warning.
            err = e
    
        finally:
            if not clean_exit:
                # We hit some kind of exception, handled or otherwise. We need
                # to throw the connection away unless explicitly told not to.
                # Close the connection, set the variable to None, and make sure
                # we put the None back in the pool to avoid leaking it.
                conn = conn and conn.close()
                release_this_conn = True
    
            if release_this_conn:
                # Put the connection back to be reused. If the connection is
                # expired then it will be None, which will get replaced with a
                # fresh connection during _get_conn.
                self._put_conn(conn)
    
        if not conn:
            # Try again
            log.warning("Retrying (%r) after connection "
                        "broken by '%r': %s", retries, err, url)
            return self.urlopen(method, url, body, headers, retries,
                                redirect, assert_same_host,
                                timeout=timeout, pool_timeout=pool_timeout,
                                release_conn=release_conn, body_pos=body_pos,
>                               **response_kw)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:678:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPSConnectionPool object at 0x7f92346f96d8>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; None) Python/3.5.3'}
retries = Retry(total=0, connect=0, read=3, redirect=0), redirect = False
assert_same_host = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92346f9630>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True
err = NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9234960f28>: Failed to establish a new connection: [Errno -2] Name or service not known',)
clean_exit = False
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92346794e0>
is_new_proxy_conn = False

    def urlopen(self, method, url, body=None, headers=None, retries=None,
                redirect=True, assert_same_host=True, timeout=_Default,
                pool_timeout=None, release_conn=None, chunked=False,
                body_pos=None, **response_kw):
        """
            Get a connection from the pool and perform an HTTP request. This is the
            lowest level call for making a request, so you'll need to specify all
            the raw details.
    
            .. note::
    
               More commonly, it's appropriate to use a convenience method provided
               by :class:`.RequestMethods`, such as :meth:`request`.
    
            .. note::
    
               `release_conn` will only behave as expected if
               `preload_content=False` because we want to make
               `preload_content=False` the default behaviour someday soon without
               breaking backwards compatibility.
    
            :param method:
                HTTP request method (such as GET, POST, PUT, etc.)
    
            :param body:
                Data to send in the request body (useful for creating
                POST requests, see HTTPConnectionPool.post_url for
                more convenience).
    
            :param headers:
                Dictionary of custom headers to send, such as User-Agent,
                If-None-Match, etc. If None, pool headers are used. If provided,
                these headers completely replace any pool-specific headers.
    
            :param retries:
                Configure the number of retries to allow before raising a
                :class:`~urllib3.exceptions.MaxRetryError` exception.
    
                Pass ``None`` to retry until you receive a response. Pass a
                :class:`~urllib3.util.retry.Retry` object for fine-grained control
                over different types of retries.
                Pass an integer number to retry connection errors that many times,
                but no other types of errors. Pass zero to never retry.
    
                If ``False``, then retries are disabled and any exception is raised
                immediately. Also, instead of raising a MaxRetryError on redirects,
                the redirect response will be returned.
    
            :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
            :param redirect:
                If True, automatically handle redirects (status codes 301, 302,
                303, 307, 308). Each redirect counts as a retry. Disabling retries
                will disable redirect, too.
    
            :param assert_same_host:
                If ``True``, will make sure that the host of the pool requests is
                consistent else will raise HostChangedError. When False, you can
                use the pool on an HTTP proxy and request foreign hosts.
    
            :param timeout:
                If specified, overrides the default timeout for this one
                request. It may be a float (in seconds) or an instance of
                :class:`urllib3.util.Timeout`.
    
            :param pool_timeout:
                If set and the pool is set to block=True, then this method will
                block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                connection is available within the time period.
    
            :param release_conn:
                If False, then the urlopen call will not release the connection
                back into the pool once a response is received (but will release if
                you read the entire contents of the response such as when
                `preload_content=True`). This is useful if you're not preloading
                the response's content immediately. You will need to call
                ``r.release_conn()`` on the response ``r`` to return the connection
                back into the pool. If None, it takes the value of
                ``response_kw.get('preload_content', True)``.
    
            :param chunked:
                If True, urllib3 will send the body using chunked transfer
                encoding. Otherwise, urllib3 will send the body using the standard
                content-length form. Defaults to False.
    
            :param int body_pos:
                Position to seek to in file-like body in the event of a retry or
                redirect. Typically this won't need to be set because urllib3 will
                auto-populate the value when needed.
    
            :param \\**response_kw:
                Additional parameters are passed to
                :meth:`urllib3.response.HTTPResponse.from_httplib`
            """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get('preload_content', True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == 'http':
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(conn, method, url,
                                                  timeout=timeout_obj,
                                                  body=body, headers=headers,
                                                  chunked=chunked)
    
            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None
    
            # Pass method to Response for length checking
            response_kw['request_method'] = method
    
            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(httplib_response,
                                                     pool=self,
                                                     connection=response_conn,
                                                     retries=retries,
                                                     **response_kw)
    
            # Everything went great!
            clean_exit = True
    
        except queue.Empty:
            # Timed out by queue.
            raise EmptyPoolError(self, "No pool connections are available.")
    
        except (BaseSSLError, CertificateError) as e:
            # Close the connection. If a connection is reused on which there
            # was a Certificate error, the next request will certainly raise
            # another Certificate error.
            clean_exit = False
            raise SSLError(e)
    
        except SSLError:
            # Treat SSLError separately from BaseSSLError to preserve
            # traceback.
            clean_exit = False
            raise
    
        except (TimeoutError, HTTPException, SocketError, ProtocolError) as e:
            # Discard the connection for these exceptions. It will be
            # be replaced during the next _get_conn() call.
            clean_exit = False
    
            if isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError('Cannot connect to proxy.', e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError('Connection aborted.', e)
    
            retries = retries.increment(method, url, error=e, _pool=self,
                                        _stacktrace=sys.exc_info()[2])
            retries.sleep()
    
            # Keep track of the error for the retry warning.
            err = e
    
        finally:
            if not clean_exit:
                # We hit some kind of exception, handled or otherwise. We need
                # to throw the connection away unless explicitly told not to.
                # Close the connection, set the variable to None, and make sure
                # we put the None back in the pool to avoid leaking it.
                conn = conn and conn.close()
                release_this_conn = True
    
            if release_this_conn:
                # Put the connection back to be reused. If the connection is
                # expired then it will be None, which will get replaced with a
                # fresh connection during _get_conn.
                self._put_conn(conn)
    
        if not conn:
            # Try again
            log.warning("Retrying (%r) after connection "
                        "broken by '%r': %s", retries, err, url)
            return self.urlopen(method, url, body, headers, retries,
                                redirect, assert_same_host,
                                timeout=timeout, pool_timeout=pool_timeout,
                                release_conn=release_conn, body_pos=body_pos,
>                               **response_kw)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:678:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPSConnectionPool object at 0x7f92346f96d8>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; None) Python/3.5.3'}
retries = Retry(total=0, connect=0, read=3, redirect=0), redirect = False
assert_same_host = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92346f9630>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f9234960f60>
is_new_proxy_conn = False

    def urlopen(self, method, url, body=None, headers=None, retries=None,
                redirect=True, assert_same_host=True, timeout=_Default,
                pool_timeout=None, release_conn=None, chunked=False,
                body_pos=None, **response_kw):
        """
            Get a connection from the pool and perform an HTTP request. This is the
            lowest level call for making a request, so you'll need to specify all
            the raw details.
    
            .. note::
    
               More commonly, it's appropriate to use a convenience method provided
               by :class:`.RequestMethods`, such as :meth:`request`.
    
            .. note::
    
               `release_conn` will only behave as expected if
               `preload_content=False` because we want to make
               `preload_content=False` the default behaviour someday soon without
               breaking backwards compatibility.
    
            :param method:
                HTTP request method (such as GET, POST, PUT, etc.)
    
            :param body:
                Data to send in the request body (useful for creating
                POST requests, see HTTPConnectionPool.post_url for
                more convenience).
    
            :param headers:
                Dictionary of custom headers to send, such as User-Agent,
                If-None-Match, etc. If None, pool headers are used. If provided,
                these headers completely replace any pool-specific headers.
    
            :param retries:
                Configure the number of retries to allow before raising a
                :class:`~urllib3.exceptions.MaxRetryError` exception.
    
                Pass ``None`` to retry until you receive a response. Pass a
                :class:`~urllib3.util.retry.Retry` object for fine-grained control
                over different types of retries.
                Pass an integer number to retry connection errors that many times,
                but no other types of errors. Pass zero to never retry.
    
                If ``False``, then retries are disabled and any exception is raised
                immediately. Also, instead of raising a MaxRetryError on redirects,
                the redirect response will be returned.
    
            :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
            :param redirect:
                If True, automatically handle redirects (status codes 301, 302,
                303, 307, 308). Each redirect counts as a retry. Disabling retries
                will disable redirect, too.
    
            :param assert_same_host:
                If ``True``, will make sure that the host of the pool requests is
                consistent else will raise HostChangedError. When False, you can
                use the pool on an HTTP proxy and request foreign hosts.
    
            :param timeout:
                If specified, overrides the default timeout for this one
                request. It may be a float (in seconds) or an instance of
                :class:`urllib3.util.Timeout`.
    
            :param pool_timeout:
                If set and the pool is set to block=True, then this method will
                block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                connection is available within the time period.
    
            :param release_conn:
                If False, then the urlopen call will not release the connection
                back into the pool once a response is received (but will release if
                you read the entire contents of the response such as when
                `preload_content=True`). This is useful if you're not preloading
                the response's content immediately. You will need to call
                ``r.release_conn()`` on the response ``r`` to return the connection
                back into the pool. If None, it takes the value of
                ``response_kw.get('preload_content', True)``.
    
            :param chunked:
                If True, urllib3 will send the body using chunked transfer
                encoding. Otherwise, urllib3 will send the body using the standard
                content-length form. Defaults to False.
    
            :param int body_pos:
                Position to seek to in file-like body in the event of a retry or
                redirect. Typically this won't need to be set because urllib3 will
                auto-populate the value when needed.
    
            :param \\**response_kw:
                Additional parameters are passed to
                :meth:`urllib3.response.HTTPResponse.from_httplib`
            """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get('preload_content', True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == 'http':
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(conn, method, url,
                                                  timeout=timeout_obj,
                                                  body=body, headers=headers,
                                                  chunked=chunked)
    
            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None
    
            # Pass method to Response for length checking
            response_kw['request_method'] = method
    
            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(httplib_response,
                                                     pool=self,
                                                     connection=response_conn,
                                                     retries=retries,
                                                     **response_kw)
    
            # Everything went great!
            clean_exit = True
    
        except queue.Empty:
            # Timed out by queue.
            raise EmptyPoolError(self, "No pool connections are available.")
    
        except (BaseSSLError, CertificateError) as e:
            # Close the connection. If a connection is reused on which there
            # was a Certificate error, the next request will certainly raise
            # another Certificate error.
            clean_exit = False
            raise SSLError(e)
    
        except SSLError:
            # Treat SSLError separately from BaseSSLError to preserve
            # traceback.
            clean_exit = False
            raise
    
        except (TimeoutError, HTTPException, SocketError, ProtocolError) as e:
            # Discard the connection for these exceptions. It will be
            # replaced during the next _get_conn() call.
            clean_exit = False
    
            if isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError('Cannot connect to proxy.', e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError('Connection aborted.', e)
    
            retries = retries.increment(method, url, error=e, _pool=self,
>                                       _stacktrace=sys.exc_info()[2])

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:649:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = Retry(total=0, connect=0, read=3, redirect=0), method = 'GET'
url = '/metadata/nasa', response = None
error = NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>: Failed to establish a new connection: [Errno -2] Name or service not known',)
_pool = <requests.packages.urllib3.connectionpool.HTTPSConnectionPool object at 0x7f92346f96d8>
_stacktrace = <traceback object at 0x7f92349cda88>

    def increment(self, method=None, url=None, response=None, error=None,
                  _pool=None, _stacktrace=None):
        """ Return a new Retry object with incremented retry counters.
    
            :param response: A response object, or None, if the server did not
                return a response.
            :type response: :class:`~urllib3.response.HTTPResponse`
            :param Exception error: An error encountered during the request, or
                None if the response was received successfully.
    
            :return: A new ``Retry`` object.
            """
        if self.total is False and error:
            # Disabled, indicate to re-raise the error.
            raise six.reraise(type(error), error, _stacktrace)
    
        total = self.total
        if total is not None:
            total -= 1
    
        connect = self.connect
        read = self.read
        redirect = self.redirect
        cause = 'unknown'
        status = None
        redirect_location = None
    
        if error and self._is_connection_error(error):
            # Connect retry?
            if connect is False:
                raise six.reraise(type(error), error, _stacktrace)
            elif connect is not None:
                connect -= 1
    
        elif error and self._is_read_error(error):
            # Read retry?
            if read is False or not self._is_method_retryable(method):
                raise six.reraise(type(error), error, _stacktrace)
            elif read is not None:
                read -= 1
    
        elif response and response.get_redirect_location():
            # Redirect retry?
            if redirect is not None:
                redirect -= 1
            cause = 'too many redirects'
            redirect_location = response.get_redirect_location()
            status = response.status
    
        else:
            # Incrementing because of a server error like a 500 in
            # status_forcelist and the given method is in the whitelist
            cause = ResponseError.GENERIC_ERROR
            if response and response.status:
                cause = ResponseError.SPECIFIC_ERROR.format(
                    status_code=response.status)
                status = response.status
    
        history = self.history + (RequestHistory(method, url, error, status, redirect_location),)
    
        new_retry = self.new(
            total=total,
            connect=connect, read=read, redirect=redirect,
            history=history)
    
        if new_retry.is_exhausted():
>           raise MaxRetryError(_pool, url, error or ResponseError(cause))
E           requests.packages.urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='archive.org', port=443): Max retries exceeded with url: /metadata/nasa (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>: Failed to establish a new connection: [Errno -2] Name or service not known',))

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/util/retry.py:376: MaxRetryError

During handling of the above exception, another exception occurred:

self = <internetarchive.session.ArchiveSession object at 0x7f92346f0668>
identifier = 'nasa', request_kwargs = {'timeout': 1e-13}

    def get_metadata(self, identifier, request_kwargs=None):
        """Get an item's metadata from the `Metadata API
            <http://blog.archive.org/2013/07/04/metadata-api/>`__
    
            :type identifier: str
            :param identifier: Globally unique Archive.org identifier.
    
            :rtype: dict
            :returns: Metadata API response.
            """
        request_kwargs = {} if not request_kwargs else request_kwargs
        url = '{0}//archive.org/metadata/{1}'.format(self.protocol, identifier)
        if 'timeout' not in request_kwargs:
            request_kwargs['timeout'] = 12
        try:
>           resp = self.get(url, **request_kwargs)

../../../internetarchive-1.7.1/internetarchive/session.py:237: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <internetarchive.session.ArchiveSession object at 0x7f92346f0668>
url = 'https://archive.org/metadata/nasa'
kwargs = {'allow_redirects': True, 'timeout': 1e-13}

    def get(self, url, **kwargs):
        """Sends a GET request. Returns :class:`Response` object.
    
            :param url: URL for the new :class:`Request` object.
            :param \*\*kwargs: Optional arguments that ``request`` takes.
            :rtype: requests.Response
            """
    
        kwargs.setdefault('allow_redirects', True)
>       return self.request('GET', url, **kwargs)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/sessions.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <internetarchive.session.ArchiveSession object at 0x7f92346f0668>
method = 'GET', url = 'https://archive.org/metadata/nasa', params = None
data = None, headers = None, cookies = None, files = None, auth = None
timeout = 1e-13, allow_redirects = True, proxies = {}, hooks = None
stream = None, verify = None, cert = None, json = None

    def request(self, method, url,
        params=None,
        data=None,
        headers=None,
        cookies=None,
        files=None,
        auth=None,
        timeout=None,
        allow_redirects=True,
        proxies=None,
        hooks=None,
        stream=None,
        verify=None,
        cert=None,
        json=None):
        """Constructs a :class:`Request <Request>`, prepares it and sends it.
            Returns :class:`Response <Response>` object.
    
            :param method: method for the new :class:`Request` object.
            :param url: URL for the new :class:`Request` object.
            :param params: (optional) Dictionary or bytes to be sent in the query
                string for the :class:`Request`.
            :param data: (optional) Dictionary, bytes, or file-like object to send
                in the body of the :class:`Request`.
            :param json: (optional) json to send in the body of the
                :class:`Request`.
            :param headers: (optional) Dictionary of HTTP Headers to send with the
                :class:`Request`.
            :param cookies: (optional) Dict or CookieJar object to send with the
                :class:`Request`.
            :param files: (optional) Dictionary of ``'filename': file-like-objects``
                for multipart encoding upload.
            :param auth: (optional) Auth tuple or callable to enable
                Basic/Digest/Custom HTTP Auth.
            :param timeout: (optional) How long to wait for the server to send
                data before giving up, as a float, or a :ref:`(connect timeout,
                read timeout) <timeouts>` tuple.
            :type timeout: float or tuple
            :param allow_redirects: (optional) Set to True by default.
            :type allow_redirects: bool
            :param proxies: (optional) Dictionary mapping protocol or protocol and
                hostname to the URL of the proxy.
            :param stream: (optional) whether to immediately download the response
                content. Defaults to ``False``.
            :param verify: (optional) whether the SSL cert will be verified.
                A CA_BUNDLE path can also be provided. Defaults to ``True``.
            :param cert: (optional) if String, path to ssl client cert file (.pem).
                If Tuple, ('cert', 'key') pair.
            :rtype: requests.Response
            """
        # Create the Request.
        req = Request(
            method = method.upper(),
            url = url,
            headers = headers,
            files = files,
            data = data or {},
            json = json,
            params = params or {},
            auth = auth,
            cookies = cookies,
            hooks = hooks,
        )
        prep = self.prepare_request(req)
    
        proxies = proxies or {}
    
        settings = self.merge_environment_settings(
            prep.url, proxies, stream, verify, cert
        )
    
        # Send the request.
        send_kwargs = {
            'timeout': timeout,
            'allow_redirects': allow_redirects,
        }
        send_kwargs.update(settings)
>       resp = self.send(prep, **send_kwargs)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/sessions.py:488:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <internetarchive.session.ArchiveSession object at 0x7f92346f0668>
request = <PreparedRequest [GET]>
kwargs = {'allow_redirects': True, 'cert': None, 'proxies': OrderedDict(), 'stream': False, ...}
insecure = False, w = []

    def send(self, request, **kwargs):
        # Catch urllib3 warnings for HTTPS related errors.
        insecure = False
        with warnings.catch_warnings(record=True) as w:
            warnings.filterwarnings('always')
>           r = super(ArchiveSession, self).send(request, **kwargs)

../../../internetarchive-1.7.1/internetarchive/session.py:353: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <internetarchive.session.ArchiveSession object at 0x7f92346f0668>
request = <PreparedRequest [GET]>
kwargs = {'cert': None, 'proxies': OrderedDict(), 'stream': False, 'timeout': 1e-13, ...}
allow_redirects = True, stream = False, hooks = {'response': []}
checked_urls = set()
adapter = <requests.adapters.HTTPAdapter object at 0x7f92346f0780>
start = datetime.datetime(2017, 8, 17, 20, 18, 13, 579671)

    def send(self, request, **kwargs):
        """
            Send a given PreparedRequest.
    
            :rtype: requests.Response
            """
        # Set defaults that the hooks can utilize to ensure they always have
        # the correct parameters to reproduce the previous request.
        kwargs.setdefault('stream', self.stream)
        kwargs.setdefault('verify', self.verify)
        kwargs.setdefault('cert', self.cert)
        kwargs.setdefault('proxies', self.proxies)
    
        # It's possible that users might accidentally send a Request object.
        # Guard against that specific failure case.
        if isinstance(request, Request):
            raise ValueError('You can only send PreparedRequests.')
    
        # Set up variables needed for resolve_redirects and dispatching of hooks
        allow_redirects = kwargs.pop('allow_redirects', True)
        stream = kwargs.get('stream')
        hooks = request.hooks
    
        # Resolve URL in redirect cache, if available.
        if allow_redirects:
            checked_urls = set()
            while request.url in self.redirect_cache:
                checked_urls.add(request.url)
                new_url = self.redirect_cache.get(request.url)
                if new_url in checked_urls:
                    break
                request.url = new_url
    
        # Get the appropriate adapter to use
        adapter = self.get_adapter(url=request.url)
    
        # Start time (approximately) of the request
        start = datetime.utcnow()
    
        # Send the request
>       r = adapter.send(request, **kwargs)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/sessions.py:609:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.adapters.HTTPAdapter object at 0x7f92346f0780>
request = <PreparedRequest [GET]>, stream = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92346f9630>
verify = True, cert = None, proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
            :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
            :param stream: (optional) Whether to stream the request content.
            :param timeout: (optional) How long to wait for the server to send
                data before giving up, as a float, or a :ref:`(connect timeout,
                read timeout) <timeouts>` tuple.
            :type timeout: float or tuple
            :param verify: (optional) Whether to verify SSL certificates.
            :param cert: (optional) Any user-provided SSL certificate to be trusted.
            :param proxies: (optional) The proxies dictionary to apply to the request.
            :rtype: requests.Response
            """
    
        conn = self.get_connection(request.url, proxies)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {0}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )
    
            # Send the request.
            else:
                if hasattr(conn, 'proxy_pool'):
                    conn = conn.proxy_pool
    
                low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
    
                try:
                    low_conn.putrequest(request.method,
                                        url,
                                        skip_accept_encoding=True)
    
                    for header, value in request.headers.items():
                        low_conn.putheader(header, value)
    
                    low_conn.endheaders()
    
                    for i in request.body:
                        low_conn.send(hex(len(i))[2:].encode('utf-8'))
                        low_conn.send(b'\r\n')
                        low_conn.send(i)
                        low_conn.send(b'\r\n')
                    low_conn.send(b'0\r\n\r\n')
    
                    # Receive the response from the server
                    try:
                        # For Python 2.7+ versions, use buffering of HTTP
                        # responses
                        r = low_conn.getresponse(buffering=True)
                    except TypeError:
                        # For compatibility with Python 2.6 versions and back
                        r = low_conn.getresponse()
    
                    resp = HTTPResponse.from_httplib(
                        r,
                        pool=conn,
                        connection=low_conn,
                        preload_content=False,
                        decode_content=False
                    )
                except:
                    # If we hit any problems here, clean up the connection.
                    # Then, reraise so that we can handle the actual exception.
                    low_conn.close()
                    raise
    
        except (ProtocolError, socket.error) as err:
            raise ConnectionError(err, request=request)
    
        except MaxRetryError as e:
            if isinstance(e.reason, ConnectTimeoutError):
                # TODO: Remove this in 3.0.0: see #2811
                if not isinstance(e.reason, NewConnectionError):
                    raise ConnectTimeout(e, request=request)
    
            if isinstance(e.reason, ResponseError):
                raise RetryError(e, request=request)
    
            if isinstance(e.reason, _ProxyError):
                raise ProxyError(e, request=request)
    
>           raise ConnectionError(e, request=request)
E           requests.exceptions.ConnectionError: HTTPSConnectionPool(host='archive.org', port=443): Max retries exceeded with url: /metadata/nasa (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>: Failed to establish a new connection: [Errno -2] Name or service not known',))

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/adapters.py:487: ConnectionError

During handling of the above exception, another exception occurred:

    def test_get_item_with_kwargs():
        with IaRequestsMock(assert_all_requests_are_fired=False) as rsps:
            rsps.add_metadata_mock('nasa')
            item = get_item('nasa', http_adapter_kwargs={'max_retries': 13})
        assert isinstance(item.session.adapters['{0}//'.format(PROTOCOL)].max_retries,
                              urllib3.Retry)
    
        try:
>           get_item('nasa', request_kwargs={'timeout': .0000000000001})

../../../internetarchive-1.7.1/tests/test_api.py:74: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

identifier = 'nasa', config = None, config_file = None
archive_session = <internetarchive.session.ArchiveSession object at 0x7f92346f0668>
debug = None, http_adapter_kwargs = None, request_kwargs = {'timeout': 1e-13}

    def get_item(identifier,
                 config=None,
                 config_file=None,
                 archive_session=None,
                 debug=None,
                 http_adapter_kwargs=None,
                 request_kwargs=None):
        """Get an :class:`Item` object.
    
        :type identifier: str
        :param identifier: The globally unique Archive.org item identifier.
    
        :type config: dict
        :param config: (optional) A dictionary used to configure your session.
    
        :type config_file: str
    :param config_file: (optional) A path to a config file used to configure your session.
    
        :type archive_session: :class:`ArchiveSession`
    :param archive_session: (optional) An :class:`ArchiveSession` object can be provided
                                via the ``archive_session`` parameter.
    
        :type http_adapter_kwargs: dict
        :param http_adapter_kwargs: (optional) Keyword arguments that
                                :py:class:`requests.adapters.HTTPAdapter` takes.
    
        :type request_kwargs: dict
        :param request_kwargs: (optional) Keyword arguments that
                               :py:class:`requests.Request` takes.
    
        Usage:
            >>> from internetarchive import get_item
            >>> item = get_item('nasa')
            >>> item.item_size
            121084
        """
        if not archive_session:
        archive_session = get_session(config, config_file, debug, http_adapter_kwargs)
>       return archive_session.get_item(identifier, request_kwargs=request_kwargs)

../../../internetarchive-1.7.1/internetarchive/api.py:116: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <internetarchive.session.ArchiveSession object at 0x7f92346f0668>
identifier = 'nasa', item_metadata = None, request_kwargs = {'timeout': 1e-13}

    def get_item(self, identifier, item_metadata=None, request_kwargs=None):
        """A method for creating :class:`internetarchive.Item <Item>` and
            :class:`internetarchive.Collection <Collection>` objects.
    
            :type identifier: str
            :param identifier: A globally unique Archive.org identifier.
    
            :type item_metadata: dict
            :param item_metadata: (optional) A metadata dict used to initialize the Item or
                                  Collection object. Metadata will automatically be retrieved
                                  from Archive.org if nothing is provided.
    
            :type request_kwargs: dict
            :param request_kwargs: (optional) Keyword arguments to be used in
                                        :meth:`requests.sessions.Session.get` request.
            """
        request_kwargs = {} if not request_kwargs else request_kwargs
        if not item_metadata:
            logger.debug('no metadata provided for "{0}", '
                         'retrieving now.'.format(identifier))
>           item_metadata = self.get_metadata(identifier, request_kwargs)

../../../internetarchive-1.7.1/internetarchive/session.py:214: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <internetarchive.session.ArchiveSession object at 0x7f92346f0668>
identifier = 'nasa', request_kwargs = {'timeout': 1e-13}

    def get_metadata(self, identifier, request_kwargs=None):
        """Get an item's metadata from the `Metadata API
            <http://blog.archive.org/2013/07/04/metadata-api/>`__
    
            :type identifier: str
            :param identifier: Globally unique Archive.org identifier.
    
            :rtype: dict
            :returns: Metadata API response.
            """
        request_kwargs = {} if not request_kwargs else request_kwargs
        url = '{0}//archive.org/metadata/{1}'.format(self.protocol, identifier)
        if 'timeout' not in request_kwargs:
            request_kwargs['timeout'] = 12
        try:
            resp = self.get(url, **request_kwargs)
            resp.raise_for_status()
        except Exception as exc:
            error_msg = 'Error retrieving metadata from {0}, {1}'.format(url, exc)
            logger.error(error_msg)
>           raise type(exc)(error_msg)
E           requests.exceptions.ConnectionError: Error retrieving metadata from https://archive.org/metadata/nasa, HTTPSConnectionPool(host='archive.org', port=443): Max retries exceeded with url: /metadata/nasa (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>: Failed to establish a new connection: [Errno -2] Name or service not known',))

../../../internetarchive-1.7.1/internetarchive/session.py:242: ConnectionError

During handling of the above exception, another exception occurred:

    def test_get_item_with_kwargs():
        with IaRequestsMock(assert_all_requests_are_fired=False) as rsps:
            rsps.add_metadata_mock('nasa')
            item = get_item('nasa', http_adapter_kwargs={'max_retries': 13})
        assert isinstance(item.session.adapters['{0}//'.format(PROTOCOL)].max_retries,
                              urllib3.Retry)
    
        try:
            get_item('nasa', request_kwargs={'timeout': .0000000000001})
        except Exception as exc:
>           assert 'timed out' in str(exc)
E           assert 'timed out' in "Error retrieving metadata from https://archive.org/metadata/nasa, HTTPSConnectionPool(host='archive.org', port=443): ...PSConnection object at 0x7f9238ef8dd8>: Failed to establish a new connection: [Errno -2] Name or service not known',))"
E            +  where "Error retrieving metadata from https://archive.org/metadata/nasa, HTTPSConnectionPool(host='archive.org', port=443): ...PSConnection object at 0x7f9238ef8dd8>: Failed to establish a new connection: [Errno -2] Name or service not known',))" = str(ConnectionError("Error retrieving metadata from https://archive.org/metadata/nasa, HTTPSConnectionPool(host='archive.o...Connection object at 0x7f9238ef8dd8>: Failed to establish a new connection: [Errno -2] Name or service not known',))",))

../../../internetarchive-1.7.1/tests/test_api.py:76: AssertionError
--------------------------------- Captured log ---------------------------------
session.py                 213 DEBUG    no metadata provided for "nasa", retrieving now.
session.py                 213 DEBUG    no metadata provided for "nasa", retrieving now.
connectionpool.py          818 DEBUG    Starting new HTTPS connection (1): archive.org
retry.py                   378 DEBUG    Incremented Retry for (url='/metadata/nasa'): Retry(total=2, connect=2, read=3, redirect=0)
connectionpool.py          673 WARNING  Retrying (Retry(total=2, connect=2, read=3, redirect=0)) after connection broken by 'NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f92346f98d0>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /metadata/nasa
connectionpool.py          818 DEBUG    Starting new HTTPS connection (2): archive.org
retry.py                   378 DEBUG    Incremented Retry for (url='/metadata/nasa'): Retry(total=1, connect=1, read=3, redirect=0)
connectionpool.py          673 WARNING  Retrying (Retry(total=1, connect=1, read=3, redirect=0)) after connection broken by 'NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f92346f9ba8>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /metadata/nasa
connectionpool.py          818 DEBUG    Starting new HTTPS connection (3): archive.org
retry.py                   378 DEBUG    Incremented Retry for (url='/metadata/nasa'): Retry(total=0, connect=0, read=3, redirect=0)
connectionpool.py          673 WARNING  Retrying (Retry(total=0, connect=0, read=3, redirect=0)) after connection broken by 'NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9234960f28>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /metadata/nasa
connectionpool.py          818 DEBUG    Starting new HTTPS connection (4): archive.org
session.py                 241 ERROR    Error retrieving metadata from https://archive.org/metadata/nasa, HTTPSConnectionPool(host='archive.org', port=443): Max retries exceeded with url: /metadata/nasa (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f9238ef8dd8>: Failed to establish a new connection: [Errno -2] Name or service not known',))
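
The failure above is typical of the others: `test_get_item_with_kwargs'
passes `timeout=1e-13' and expects the error message to contain "timed
out", but inside the build chroot the name `archive.org' never resolves, so
requests raises a plain ConnectionError ("Name or service not known")
before the connect timeout can fire.  A minimal sketch of the difference,
assuming an environment without name resolution (an illustration, not code
from the package):

    import requests

    try:
        requests.get('https://archive.org/metadata/nasa', timeout=1e-13)
    except requests.exceptions.ConnectTimeout as exc:
        # On a networked host the tiny connect timeout trips first,
        # so str(exc) contains "timed out", as the test asserts.
        print('timed out:', exc)
    except requests.exceptions.ConnectionError as exc:
        # In the chroot DNS fails first; ConnectTimeout must be caught
        # before this clause because it subclasses ConnectionError.
        print('no network:', exc)

For disabling these tests selectively, one option is deselecting them by
name in the check phase, e.g. invoking py.test with -k "not
test_get_item_with_kwargs" extended with the other failing names; another
is marking them in tests/conftest.py.  A sketch of such a marker, where
`_network_available' and `needs_network' are hypothetical names, not
existing internetarchive helpers:

    import socket

    import pytest

    def _network_available(host='archive.org'):
        # socket.gaierror (an OSError subclass) is exactly what the
        # log above shows: "Name or service not known".
        try:
            socket.gethostbyname(host)
            return True
        except OSError:
            return False

    # Tests decorated with @needs_network would then be skipped
    # automatically when the build environment has no network.
    needs_network = pytest.mark.skipif(
        not _network_available(),
        reason='no network access inside the build chroot')
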
___________________________________ test_ia ____________________________________

self = <requests.packages.urllib3.connection.HTTPConnection object at 0x7f92347c7b70>

    def _new_conn(self):
        """ Establish a socket connection and set nodelay settings on it.
    
            :return: New socket connection.
            """
        extra_kw = {}
        if self.source_address:
            extra_kw['source_address'] = self.source_address
    
        if self.socket_options:
            extra_kw['socket_options'] = self.socket_options
    
        try:
            conn = connection.create_connection(
>               (self.host, self.port), self.timeout, **extra_kw)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connection.py:141:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('archive.org', 80), timeout = 12, source_address = None
socket_options = [(6, 1, 1)]

    def create_connection(address, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, socket_options=None):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        """
    
        host, port = address
        if host.startswith('['):
            host = host.strip('[]')
        err = None
    
        # Using the value from allowed_gai_family() in the context of getaddrinfo lets
        # us select whether to work with IPv4 DNS records, IPv6 records, or both.
        # The original create_connection function always returns all records.
        family = allowed_gai_family()
    
>       for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/util/connection.py:60:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

host = 'archive.org', port = 80, family = <AddressFamily.AF_UNSPEC: 0>
type = <SocketKind.SOCK_STREAM: 1>, proto = 0, flags = 0

    def getaddrinfo(host, port, family=0, type=0, proto=0, flags=0):
        """Resolve host and port into list of address info entries.
    
        Translate the host/port argument into a sequence of 5-tuples that contain
        all the necessary arguments for creating a socket connected to that service.
        host is a domain name, a string representation of an IPv4/v6 address or
        None. port is a string service name such as 'http', a numeric port number or
        None. By passing None as the value of host and port, you can pass NULL to
        the underlying C API.

        The family, type and proto arguments can be optionally specified in order to
        narrow the list of addresses returned. Passing zero as a value for each of
        these arguments selects the full range of results.
        """
        # We override this function since we want to translate the numeric family
        # and socket type values to enum constants.
        addrlist = []
>       for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E       socket.gaierror: [Errno -2] Name or service not known

/gnu/store/3aw9x28la9nh8fzkm665d7fywxzbl15j-python-3.5.3/lib/python3.5/socket.py:733: gaierror

During handling of the above exception, another exception occurred:

self = <requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 0x7f923370e128>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; foo) Python/3.5.3'}
retries = Retry(total=0, connect=0, read=3, redirect=0), redirect = False
assert_same_host = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f9238eeaa20>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92347c74e0>
is_new_proxy_conn = False

    def urlopen(self, method, url, body=None, headers=None, retries=None,
                redirect=True, assert_same_host=True, timeout=_Default,
                pool_timeout=None, release_conn=None, chunked=False,
                body_pos=None, **response_kw):
        """
            Get a connection from the pool and perform an HTTP request. This is the
            lowest level call for making a request, so you'll need to specify all
            the raw details.
    
            .. note::
    
               More commonly, it's appropriate to use a convenience method provided
               by :class:`.RequestMethods`, such as :meth:`request`.
    
            .. note::
    
               `release_conn` will only behave as expected if
               `preload_content=False` because we want to make
               `preload_content=False` the default behaviour someday soon without
               breaking backwards compatibility.
    
            :param method:
                HTTP request method (such as GET, POST, PUT, etc.)
    
            :param body:
                Data to send in the request body (useful for creating
                POST requests, see HTTPConnectionPool.post_url for
                more convenience).
    
            :param headers:
                Dictionary of custom headers to send, such as User-Agent,
                If-None-Match, etc. If None, pool headers are used. If provided,
                these headers completely replace any pool-specific headers.
    
            :param retries:
                Configure the number of retries to allow before raising a
                :class:`~urllib3.exceptions.MaxRetryError` exception.
    
                Pass ``None`` to retry until you receive a response. Pass a
                :class:`~urllib3.util.retry.Retry` object for fine-grained control
                over different types of retries.
                Pass an integer number to retry connection errors that many times,
                but no other types of errors. Pass zero to never retry.
    
                If ``False``, then retries are disabled and any exception is raised
                immediately. Also, instead of raising a MaxRetryError on redirects,
                the redirect response will be returned.
    
            :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
            :param redirect:
                If True, automatically handle redirects (status codes 301, 302,
                303, 307, 308). Each redirect counts as a retry. Disabling retries
                will disable redirect, too.
    
            :param assert_same_host:
                If ``True``, will make sure that the host of the pool requests is
                consistent else will raise HostChangedError. When False, you can
                use the pool on an HTTP proxy and request foreign hosts.
    
            :param timeout:
                If specified, overrides the default timeout for this one
                request. It may be a float (in seconds) or an instance of
                :class:`urllib3.util.Timeout`.
    
            :param pool_timeout:
                If set and the pool is set to block=True, then this method will
                block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                connection is available within the time period.
    
            :param release_conn:
                If False, then the urlopen call will not release the connection
                back into the pool once a response is received (but will release if
                you read the entire contents of the response such as when
                `preload_content=True`). This is useful if you're not preloading
                the response's content immediately. You will need to call
                ``r.release_conn()`` on the response ``r`` to return the connection
                back into the pool. If None, it takes the value of
                ``response_kw.get('preload_content', True)``.
    
            :param chunked:
                If True, urllib3 will send the body using chunked transfer
                encoding. Otherwise, urllib3 will send the body using the standard
                content-length form. Defaults to False.
    
            :param int body_pos:
                Position to seek to in file-like body in the event of a retry or
                redirect. Typically this won't need to be set because urllib3 will
                auto-populate the value when needed.
    
            :param \\**response_kw:
                Additional parameters are passed to
                :meth:`urllib3.response.HTTPResponse.from_httplib`
            """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get('preload_content', True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == 'http':
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(conn, method, url,
                                                  timeout=timeout_obj,
                                                  body=body, headers=headers,
>                                                 chunked=chunked)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:600:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 0x7f923370e128>
conn = <requests.packages.urllib3.connection.HTTPConnection object at 0x7f92347c7b70>
method = 'GET', url = '/metadata/nasa'
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92347c74e0>
chunked = False
httplib_request_kw = {'body': None, 'headers': {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; foo) Python/3.5.3'}}
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f92347c75f8>

    def _make_request(self, conn, method, url, timeout=_Default, chunked=False,
                      **httplib_request_kw):
        """
            Perform a request on a given urllib connection object taken from our
            pool.
    
            :param conn:
                a connection from one of our connection pools
    
            :param timeout:
                Socket timeout in seconds for the request. This can be a
                float or integer, which will set the same timeout value for
                the socket connect and the socket read, or an instance of
                :class:`urllib3.util.Timeout`, which gives you more fine-grained
                control over your timeouts.
            """
        self.num_requests += 1
    
        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout
    
        # Trigger any extra validation we need to do.
        try:
            self._validate_conn(conn)
        except (SocketTimeout, BaseSSLError) as e:
            # Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
            self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
            raise
    
        # conn.request() calls httplib.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        if chunked:
            conn.request_chunked(method, url, **httplib_request_kw)
        else:
>           conn.request(method, url, **httplib_request_kw)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:356:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connection.HTTPConnection object at 0x7f92347c7b70>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; foo) Python/3.5.3'}

    def request(self, method, url, body=None, headers={}):
        """Send a complete request to the server."""
>       self._send_request(method, url, body, headers)

/gnu/store/3aw9x28la9nh8fzkm665d7fywxzbl15j-python-3.5.3/lib/python3.5/http/client.py:1107:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connection.HTTPConnection object at 0x7f92347c7b70>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; foo) Python/3.5.3'}

    def _send_request(self, method, url, body, headers):
        # Honor explicitly requested Host: and Accept-Encoding: headers.
        header_names = dict.fromkeys([k.lower() for k in headers])
        skips = {}
        if 'host' in header_names:
            skips['skip_host'] = 1
        if 'accept-encoding' in header_names:
            skips['skip_accept_encoding'] = 1
    
        self.putrequest(method, url, **skips)
    
        if 'content-length' not in header_names:
            self._set_content_length(body, method)
        for hdr, value in headers.items():
            self.putheader(hdr, value)
        if isinstance(body, str):
            # RFC 2616 Section 3.7.1 says that text default has a
            # default charset of iso-8859-1.
            body = _encode(body, 'body')
>       self.endheaders(body)

/gnu/store/3aw9x28la9nh8fzkm665d7fywxzbl15j-python-3.5.3/lib/python3.5/http/client.py:1152:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connection.HTTPConnection object at 0x7f92347c7b70>
message_body = None

    def endheaders(self, message_body=None):
        """Indicate that the last header line has been sent to the server.
    
            This method sends the request to the server.  The optional message_body
            argument can be used to pass a message body associated with the
            request.  The message body will be sent in the same packet as the
            message headers if it is a string, otherwise it is sent as a separate
            packet.
            """
        if self.__state == _CS_REQ_STARTED:
            self.__state = _CS_REQ_SENT
        else:
            raise CannotSendHeader()
>       self._send_output(message_body)

/gnu/store/3aw9x28la9nh8fzkm665d7fywxzbl15j-python-3.5.3/lib/python3.5/http/client.py:1103:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connection.HTTPConnection object at 0x7f92347c7b70>
message_body = None

    def _send_output(self, message_body=None):
        """Send the currently buffered request and clear the buffer.
    
            Appends an extra \\r\\n to the buffer.
            A message_body may be specified, to be appended to the request.
            """
        self._buffer.extend((b"", b""))
        msg = b"\r\n".join(self._buffer)
        del self._buffer[:]
    
>       self.send(msg)

/gnu/store/3aw9x28la9nh8fzkm665d7fywxzbl15j-python-3.5.3/lib/python3.5/http/client.py:934:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connection.HTTPConnection object at 
0x7f92347c7b70>
data = b'GET /metadata/nasa HTTP/1.1\r\nHost: archive.org\r\nConnection: 
keep-alive\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nUser-Agent: 
internetarchive/1.7.1 (Linux ; N; en; foo) Python/3.5.3\r\n\r\n'

    def send(self, data):
        """Send `data' to the server.
            ``data`` can be a string object, a bytes object, an array object, a
            file-like object that supports a .read() method, or an iterable 
object.
            """
    
        if self.sock is None:
            if self.auto_open:
>               self.connect()

/gnu/store/3aw9x28la9nh8fzkm665d7fywxzbl15j-python-3.5.3/lib/python3.5/http/client.py:877:
 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connection.HTTPConnection object at 
0x7f92347c7b70>

    def connect(self):
>       conn = self._new_conn()

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connection.py:166:
 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connection.HTTPConnection object at 
0x7f92347c7b70>

    def _new_conn(self):
        """ Establish a socket connection and set nodelay settings on it.
    
            :return: New socket connection.
            """
        extra_kw = {}
        if self.source_address:
            extra_kw['source_address'] = self.source_address
    
        if self.socket_options:
            extra_kw['socket_options'] = self.socket_options
    
        try:
            conn = connection.create_connection(
                (self.host, self.port), self.timeout, **extra_kw)
    
        except SocketTimeout as e:
            raise ConnectTimeoutError(
                self, "Connection to %s timed out. (connect timeout=%s)" %
                (self.host, self.timeout))
    
        except SocketError as e:
            raise NewConnectionError(
>               self, "Failed to establish a new connection: %s" % e)
E           requests.packages.urllib3.exceptions.NewConnectionError: 
<requests.packages.urllib3.connection.HTTPConnection object at 0x7f92347c7b70>: 
Failed to establish a new connection: [Errno -2] Name or service not known

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connection.py:150:
 NewConnectionError
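
Here is the heart of the failure: the mock apparently does not
intercept this request, so urllib3 attempts a real DNS lookup, and the
build chroot has no name service.  The same errno can be reproduced in
any environment without a resolver (a minimal sketch, independent of
the test suite):

    import socket

    try:
        socket.getaddrinfo('archive.org', 80)
    except socket.gaierror as e:
        # Without a working resolver this prints:
        #   [Errno -2] Name or service not known
        print(e)

Presumably the same story holds for the other ten failures.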

During handling of the above exception, another exception occurred:

self = <requests.adapters.HTTPAdapter object at 0x7f923364a4e0>
request = <PreparedRequest [GET]>, stream = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 
0x7f9238eeaa20>
verify = True, cert = None, proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, 
proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
            :param request: The :class:`PreparedRequest <PreparedRequest>` 
being sent.
            :param stream: (optional) Whether to stream the request content.
            :param timeout: (optional) How long to wait for the server to send
                data before giving up, as a float, or a :ref:`(connect timeout,
                read timeout) <timeouts>` tuple.
            :type timeout: float or tuple
            :param verify: (optional) Whether to verify SSL certificates.
            :param cert: (optional) Any user-provided SSL certificate to be 
trusted.
            :param proxies: (optional) The proxies dictionary to apply to the 
request.
            :rtype: requests.Response
            """
    
        conn = self.get_connection(request.url, proxies)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request)
    
        chunked = not (request.body is None or 'Content-Length' in 
request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {0}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
>                   timeout=timeout
                )

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/adapters.py:423:
 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 
0x7f923370e128>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 
'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; foo) 
Python/3.5.3'}
retries = Retry(total=2, connect=2, read=3, redirect=0), redirect = False
assert_same_host = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 
0x7f9238eeaa20>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True
err = NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection 
object at 0x7f923370e358>: Failed to establish a new connection: [Errno -2] 
Name or service not known',)
clean_exit = False
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 
0x7f923370e6d8>
is_new_proxy_conn = False

    def urlopen(self, method, url, body=None, headers=None, retries=None,
                redirect=True, assert_same_host=True, timeout=_Default,
                pool_timeout=None, release_conn=None, chunked=False,
                body_pos=None, **response_kw):
        """
            Get a connection from the pool and perform an HTTP request. This is 
the
            lowest level call for making a request, so you'll need to specify 
all
            the raw details.
    
            .. note::
    
               More commonly, it's appropriate to use a convenience method 
provided
               by :class:`.RequestMethods`, such as :meth:`request`.
    
            .. note::
    
               `release_conn` will only behave as expected if
               `preload_content=False` because we want to make
               `preload_content=False` the default behaviour someday soon 
without
               breaking backwards compatibility.
    
            :param method:
                HTTP request method (such as GET, POST, PUT, etc.)
    
            :param body:
                Data to send in the request body (useful for creating
                POST requests, see HTTPConnectionPool.post_url for
                more convenience).
    
            :param headers:
                Dictionary of custom headers to send, such as User-Agent,
                If-None-Match, etc. If None, pool headers are used. If provided,
                these headers completely replace any pool-specific headers.
    
            :param retries:
                Configure the number of retries to allow before raising a
                :class:`~urllib3.exceptions.MaxRetryError` exception.
    
                Pass ``None`` to retry until you receive a response. Pass a
                :class:`~urllib3.util.retry.Retry` object for fine-grained 
control
                over different types of retries.
                Pass an integer number to retry connection errors that many 
times,
                but no other types of errors. Pass zero to never retry.
    
                If ``False``, then retries are disabled and any exception is 
raised
                immediately. Also, instead of raising a MaxRetryError on 
redirects,
                the redirect response will be returned.
    
            :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
    
            :param redirect:
                If True, automatically handle redirects (status codes 301, 302,
                303, 307, 308). Each redirect counts as a retry. Disabling 
retries
                will disable redirect, too.
    
            :param assert_same_host:
                If ``True``, will make sure that the host of the pool requests 
is
                consistent else will raise HostChangedError. When False, you can
                use the pool on an HTTP proxy and request foreign hosts.
    
            :param timeout:
                If specified, overrides the default timeout for this one
                request. It may be a float (in seconds) or an instance of
                :class:`urllib3.util.Timeout`.
    
            :param pool_timeout:
                If set and the pool is set to block=True, then this method will
                block for ``pool_timeout`` seconds and raise EmptyPoolError if 
no
                connection is available within the time period.
    
            :param release_conn:
                If False, then the urlopen call will not release the connection
                back into the pool once a response is received (but will 
release if
                you read the entire contents of the response such as when
                `preload_content=True`). This is useful if you're not preloading
                the response's content immediately. You will need to call
                ``r.release_conn()`` on the response ``r`` to return the 
connection
                back into the pool. If None, it takes the value of
                ``response_kw.get('preload_content', True)``.
    
            :param chunked:
                If True, urllib3 will send the body using chunked transfer
                encoding. Otherwise, urllib3 will send the body using the 
standard
                content-length form. Defaults to False.
    
            :param int body_pos:
                Position to seek to in file-like body in the event of a retry or
                redirect. Typically this won't need to be set because urllib3 
will
                auto-populate the value when needed.
    
            :param \\**response_kw:
                Additional parameters are passed to
                :meth:`urllib3.response.HTTPResponse.from_httplib`
            """
        if headers is None:
            headers = self.headers
    
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, 
default=self.retries)
    
        if release_conn is None:
            release_conn = response_kw.get('preload_content', True)
    
        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)
    
        conn = None
    
        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn
    
        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == 'http':
            headers = headers.copy()
            headers.update(self.proxy_headers)
    
        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None
    
        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False
    
        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)
    
        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)
    
            conn.timeout = timeout_obj.connect_timeout
    
            is_new_proxy_conn = self.proxy is not None and not getattr(conn, 
'sock', None)
            if is_new_proxy_conn:
                self._prepare_proxy(conn)
    
            # Make the request on the httplib connection object.
            httplib_response = self._make_request(conn, method, url,
                                                  timeout=timeout_obj,
                                                  body=body, headers=headers,
                                                  chunked=chunked)
    
            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None
    
            # Pass method to Response for length checking
            response_kw['request_method'] = method
    
            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(httplib_response,
                                                     pool=self,
                                                     connection=response_conn,
                                                     retries=retries,
                                                     **response_kw)
    
            # Everything went great!
            clean_exit = True
    
        except queue.Empty:
            # Timed out by queue.
            raise EmptyPoolError(self, "No pool connections are available.")
    
        except (BaseSSLError, CertificateError) as e:
            # Close the connection. If a connection is reused on which there
            # was a Certificate error, the next request will certainly raise
            # another Certificate error.
            clean_exit = False
            raise SSLError(e)
    
        except SSLError:
            # Treat SSLError separately from BaseSSLError to preserve
            # traceback.
            clean_exit = False
            raise
    
        except (TimeoutError, HTTPException, SocketError, ProtocolError) as e:
            # Discard the connection for these exceptions. It will be
            # be replaced during the next _get_conn() call.
            clean_exit = False
    
            if isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError('Cannot connect to proxy.', e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError('Connection aborted.', e)
    
            retries = retries.increment(method, url, error=e, _pool=self,
                                        _stacktrace=sys.exc_info()[2])
            retries.sleep()
    
            # Keep track of the error for the retry warning.
            err = e
    
        finally:
            if not clean_exit:
                # We hit some kind of exception, handled or otherwise. We need
                # to throw the connection away unless explicitly told not to.
                # Close the connection, set the variable to None, and make sure
                # we put the None back in the pool to avoid leaking it.
                conn = conn and conn.close()
                release_this_conn = True
    
            if release_this_conn:
                # Put the connection back to be reused. If the connection is
                # expired then it will be None, which will get replaced with a
                # fresh connection during _get_conn.
                self._put_conn(conn)
    
        if not conn:
            # Try again
            log.warning("Retrying (%r) after connection "
                        "broken by '%r': %s", retries, err, url)
            return self.urlopen(method, url, body, headers, retries,
                                redirect, assert_same_host,
                                timeout=timeout, pool_timeout=pool_timeout,
                                release_conn=release_conn, body_pos=body_pos,
>                               **response_kw)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:678:
 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 
0x7f923370e128>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 
'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; foo) 
Python/3.5.3'}
retries = Retry(total=1, connect=1, read=3, redirect=0), redirect = False
assert_same_host = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 
0x7f9238eeaa20>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True
err = NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection 
object at 0x7f92346f9240>: Failed to establish a new connection: [Errno -2] 
Name or service not known',)
clean_exit = False
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 
0x7f923466fcc0>
is_new_proxy_conn = False

    [... urlopen() source identical to the first frame above elided; it
    recurses again at connectionpool.py:678 with the decremented Retry
    shown in this frame's header ...]
 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 
0x7f923370e128>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 
'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; foo) 
Python/3.5.3'}
retries = Retry(total=0, connect=0, read=3, redirect=0), redirect = False
assert_same_host = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 
0x7f9238eeaa20>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True
err = NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection 
object at 0x7f92346f9208>: Failed to establish a new connection: [Errno -2] 
Name or service not known',)
clean_exit = False
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 
0x7f92346f9f28>
is_new_proxy_conn = False

    [... identical urlopen() frame elided again; the recursion continues
    at connectionpool.py:678 ...]
 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 
0x7f923370e128>
method = 'GET', url = '/metadata/nasa', body = None
headers = {'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 
'Accept': '*/*', 'User-Agent': 'internetarchive/1.7.1 (Linux ; N; en; foo) 
Python/3.5.3'}
retries = Retry(total=0, connect=0, read=3, redirect=0), redirect = False
assert_same_host = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 
0x7f9238eeaa20>
pool_timeout = None, release_conn = False, chunked = False, body_pos = None
response_kw = {'decode_content': False, 'preload_content': False}, conn = None
release_this_conn = True, err = None, clean_exit = False
timeout_obj = <requests.packages.urllib3.util.timeout.Timeout object at 
0x7f92347c74e0>
is_new_proxy_conn = False

    [... urlopen() source as above, elided down to the connection-error
    handler where the exhausted Retry finally gives up: ...]
        except (TimeoutError, HTTPException, SocketError, ProtocolError) as e:
            # Discard the connection for these exceptions. It will be
            # be replaced during the next _get_conn() call.
            clean_exit = False
    
            if isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError('Cannot connect to proxy.', e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError('Connection aborted.', e)
    
            retries = retries.increment(method, url, error=e, _pool=self,
>                                       _stacktrace=sys.exc_info()[2])

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:649:
 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = Retry(total=0, connect=0, read=3, redirect=0), method = 'GET'
url = '/metadata/nasa', response = None
error = 
NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object 
at 0x7f92347c7b70>: Failed to establish a new connection: [Errno -2] Name or 
service not known',)
_pool = <requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 
0x7f923370e128>
_stacktrace = <traceback object at 0x7f92349eb8c8>

    def increment(self, method=None, url=None, response=None, error=None,
                  _pool=None, _stacktrace=None):
        """ Return a new Retry object with incremented retry counters.
    
            :param response: A response object, or None, if the server did not
                return a response.
            :type response: :class:`~urllib3.response.HTTPResponse`
            :param Exception error: An error encountered during the request, or
                None if the response was received successfully.
    
            :return: A new ``Retry`` object.
            """
        if self.total is False and error:
            # Disabled, indicate to re-raise the error.
            raise six.reraise(type(error), error, _stacktrace)
    
        total = self.total
        if total is not None:
            total -= 1
    
        connect = self.connect
        read = self.read
        redirect = self.redirect
        cause = 'unknown'
        status = None
        redirect_location = None
    
        if error and self._is_connection_error(error):
            # Connect retry?
            if connect is False:
                raise six.reraise(type(error), error, _stacktrace)
            elif connect is not None:
                connect -= 1
    
        elif error and self._is_read_error(error):
            # Read retry?
            if read is False or not self._is_method_retryable(method):
                raise six.reraise(type(error), error, _stacktrace)
            elif read is not None:
                read -= 1
    
        elif response and response.get_redirect_location():
            # Redirect retry?
            if redirect is not None:
                redirect -= 1
            cause = 'too many redirects'
            redirect_location = response.get_redirect_location()
            status = response.status
    
        else:
            # Incrementing because of a server error like a 500 in
            # status_forcelist and a the given method is in the whitelist
            cause = ResponseError.GENERIC_ERROR
            if response and response.status:
                cause = ResponseError.SPECIFIC_ERROR.format(
                    status_code=response.status)
                status = response.status
    
        history = self.history + (RequestHistory(method, url, error, status, 
redirect_location),)
    
        new_retry = self.new(
            total=total,
            connect=connect, read=read, redirect=redirect,
            history=history)
    
        if new_retry.is_exhausted():
>           raise MaxRetryError(_pool, url, error or ResponseError(cause))
E           requests.packages.urllib3.exceptions.MaxRetryError: 
HTTPConnectionPool(host='archive.org', port=80): Max retries exceeded with url: 
/metadata/nasa (Caused by 
NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object 
at 0x7f92347c7b70>: Failed to establish a new connection: [Errno -2] Name or 
service not known',))

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/packages/urllib3/util/retry.py:376:
 MaxRetryError
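
Each recursive urlopen() call above decrements the Retry counters
(total=2, then 1, then 0) until increment() finally raises.  The
mechanics can be sketched in isolation, using the urllib3 vendored in
requests 2.13 (module paths taken from the store paths in the log; the
OSError is a stand-in for the real connection failure):

    from requests.packages.urllib3.util.retry import Retry
    from requests.packages.urllib3.exceptions import MaxRetryError

    retry = Retry(total=2, connect=2, read=3, redirect=0)
    try:
        while True:
            # Each failed attempt yields a new Retry object with
            # decremented counters; increment() raises MaxRetryError
            # once a counter drops below zero.
            retry = retry.increment('GET', '/metadata/nasa',
                                    error=OSError('simulated failure'))
            print(retry)
    except MaxRetryError as e:
        print(e)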

During handling of the above exception, another exception occurred:

self = <internetarchive.session.ArchiveSession object at 0x7f9233640630>
identifier = 'nasa', request_kwargs = {'timeout': 12}

    def get_metadata(self, identifier, request_kwargs=None):
        """Get an item's metadata from the `Metadata API
            <http://blog.archive.org/2013/07/04/metadata-api/>`__
    
            :type identifier: str
            :param identifier: Globally unique Archive.org identifier.
    
            :rtype: dict
            :returns: Metadat API response.
            """
        request_kwargs = {} if not request_kwargs else request_kwargs
        url = '{0}//archive.org/metadata/{1}'.format(self.protocol, identifier)
        if 'timeout' not in request_kwargs:
            request_kwargs['timeout'] = 12
        try:
>           resp = self.get(url, **request_kwargs)

../../../internetarchive-1.7.1/internetarchive/session.py:237: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
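
As an aside, this is how get_metadata() arrives at the URL shown in
the next frame; the 'http:' protocol value is inferred from the
resulting URL in the log:

    # Reconstructing the request URL built by get_metadata() above.
    protocol, identifier = 'http:', 'nasa'
    url = '{0}//archive.org/metadata/{1}'.format(protocol, identifier)
    assert url == 'http://archive.org/metadata/nasa'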

self = <internetarchive.session.ArchiveSession object at 0x7f9233640630>
url = 'http://archive.org/metadata/nasa'
kwargs = {'allow_redirects': True, 'timeout': 12}

    def get(self, url, **kwargs):
        """Sends a GET request. Returns :class:`Response` object.
    
            :param url: URL for the new :class:`Request` object.
            :param \*\*kwargs: Optional arguments that ``request`` takes.
            :rtype: requests.Response
            """
    
        kwargs.setdefault('allow_redirects', True)
>       return self.request('GET', url, **kwargs)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/sessions.py:501:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <internetarchive.session.ArchiveSession object at 0x7f9233640630>
method = 'GET', url = 'http://archive.org/metadata/nasa', params = None
data = None, headers = None, cookies = None, files = None, auth = None
timeout = 12, allow_redirects = True, proxies = {}, hooks = None, stream = None
verify = None, cert = None, json = None

    def request(self, method, url,
        params=None,
        data=None,
        headers=None,
        cookies=None,
        files=None,
        auth=None,
        timeout=None,
        allow_redirects=True,
        proxies=None,
        hooks=None,
        stream=None,
        verify=None,
        cert=None,
        json=None):
        """Constructs a :class:`Request <Request>`, prepares it and sends it.
            Returns :class:`Response <Response>` object.
    
            :param method: method for the new :class:`Request` object.
            :param url: URL for the new :class:`Request` object.
            :param params: (optional) Dictionary or bytes to be sent in the query
                string for the :class:`Request`.
            :param data: (optional) Dictionary, bytes, or file-like object to send
                in the body of the :class:`Request`.
            :param json: (optional) json to send in the body of the
                :class:`Request`.
            :param headers: (optional) Dictionary of HTTP Headers to send with the
                :class:`Request`.
            :param cookies: (optional) Dict or CookieJar object to send with the
                :class:`Request`.
            :param files: (optional) Dictionary of ``'filename': file-like-objects``
                for multipart encoding upload.
            :param auth: (optional) Auth tuple or callable to enable
                Basic/Digest/Custom HTTP Auth.
            :param timeout: (optional) How long to wait for the server to send
                data before giving up, as a float, or a :ref:`(connect timeout,
                read timeout) <timeouts>` tuple.
            :type timeout: float or tuple
            :param allow_redirects: (optional) Set to True by default.
            :type allow_redirects: bool
            :param proxies: (optional) Dictionary mapping protocol or protocol and
                hostname to the URL of the proxy.
            :param stream: (optional) whether to immediately download the response
                content. Defaults to ``False``.
            :param verify: (optional) whether the SSL cert will be verified.
                A CA_BUNDLE path can also be provided. Defaults to ``True``.
            :param cert: (optional) if String, path to ssl client cert file (.pem).
                If Tuple, ('cert', 'key') pair.
            :rtype: requests.Response
            """
        # Create the Request.
        req = Request(
            method = method.upper(),
            url = url,
            headers = headers,
            files = files,
            data = data or {},
            json = json,
            params = params or {},
            auth = auth,
            cookies = cookies,
            hooks = hooks,
        )
        prep = self.prepare_request(req)
    
        proxies = proxies or {}
    
        settings = self.merge_environment_settings(
            prep.url, proxies, stream, verify, cert
        )
    
        # Send the request.
        send_kwargs = {
            'timeout': timeout,
            'allow_redirects': allow_redirects,
        }
        send_kwargs.update(settings)
>       resp = self.send(prep, **send_kwargs)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/sessions.py:488:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <internetarchive.session.ArchiveSession object at 0x7f9233640630>
request = <PreparedRequest [GET]>
kwargs = {'allow_redirects': True, 'cert': None, 'proxies': OrderedDict(), 'stream': False, ...}
insecure = False, w = []

    def send(self, request, **kwargs):
        # Catch urllib3 warnings for HTTPS related errors.
        insecure = False
        with warnings.catch_warnings(record=True) as w:
            warnings.filterwarnings('always')
>           r = super(ArchiveSession, self).send(request, **kwargs)

../../../internetarchive-1.7.1/internetarchive/session.py:353: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <internetarchive.session.ArchiveSession object at 0x7f9233640630>
request = <PreparedRequest [GET]>
kwargs = {'cert': None, 'proxies': OrderedDict(), 'stream': False, 'timeout': 12, ...}
allow_redirects = True, stream = False, hooks = {'response': []}
checked_urls = set()
adapter = <requests.adapters.HTTPAdapter object at 0x7f923364a4e0>
start = datetime.datetime(2017, 8, 17, 20, 18, 20, 154309)

    def send(self, request, **kwargs):
        """
            Send a given PreparedRequest.
    
            :rtype: requests.Response
            """
        # Set defaults that the hooks can utilize to ensure they always have
        # the correct parameters to reproduce the previous request.
        kwargs.setdefault('stream', self.stream)
        kwargs.setdefault('verify', self.verify)
        kwargs.setdefault('cert', self.cert)
        kwargs.setdefault('proxies', self.proxies)
    
        # It's possible that users might accidentally send a Request object.
        # Guard against that specific failure case.
        if isinstance(request, Request):
            raise ValueError('You can only send PreparedRequests.')
    
        # Set up variables needed for resolve_redirects and dispatching of hooks
        allow_redirects = kwargs.pop('allow_redirects', True)
        stream = kwargs.get('stream')
        hooks = request.hooks
    
        # Resolve URL in redirect cache, if available.
        if allow_redirects:
            checked_urls = set()
            while request.url in self.redirect_cache:
                checked_urls.add(request.url)
                new_url = self.redirect_cache.get(request.url)
                if new_url in checked_urls:
                    break
                request.url = new_url
    
        # Get the appropriate adapter to use
        adapter = self.get_adapter(url=request.url)
    
        # Start time (approximately) of the request
        start = datetime.utcnow()
    
        # Send the request
>       r = adapter.send(request, **kwargs)

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/sessions.py:609:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <requests.adapters.HTTPAdapter object at 0x7f923364a4e0>
request = <PreparedRequest [GET]>, stream = False
timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7f9238eeaa20>
verify = True, cert = None, proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
            :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
            :param stream: (optional) Whether to stream the request content.
            :param timeout: (optional) How long to wait for the server to send
                data before giving up, as a float, or a :ref:`(connect timeout,
                read timeout) <timeouts>` tuple.
            :type timeout: float or tuple
            :param verify: (optional) Whether to verify SSL certificates.
            :param cert: (optional) Any user-provided SSL certificate to be trusted.
            :param proxies: (optional) The proxies dictionary to apply to the request.
            :rtype: requests.Response
            """
    
        conn = self.get_connection(request.url, proxies)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {0}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )
    
            # Send the request.
            else:
                if hasattr(conn, 'proxy_pool'):
                    conn = conn.proxy_pool
    
                low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
    
                try:
                    low_conn.putrequest(request.method,
                                        url,
                                        skip_accept_encoding=True)
    
                    for header, value in request.headers.items():
                        low_conn.putheader(header, value)
    
                    low_conn.endheaders()
    
                    for i in request.body:
                        low_conn.send(hex(len(i))[2:].encode('utf-8'))
                        low_conn.send(b'\r\n')
                        low_conn.send(i)
                        low_conn.send(b'\r\n')
                    low_conn.send(b'0\r\n\r\n')
    
                    # Receive the response from the server
                    try:
                        # For Python 2.7+ versions, use buffering of HTTP
                        # responses
                        r = low_conn.getresponse(buffering=True)
                    except TypeError:
                        # For compatibility with Python 2.6 versions and back
                        r = low_conn.getresponse()
    
                    resp = HTTPResponse.from_httplib(
                        r,
                        pool=conn,
                        connection=low_conn,
                        preload_content=False,
                        decode_content=False
                    )
                except:
                    # If we hit any problems here, clean up the connection.
                    # Then, reraise so that we can handle the actual exception.
                    low_conn.close()
                    raise
    
        except (ProtocolError, socket.error) as err:
            raise ConnectionError(err, request=request)
    
        except MaxRetryError as e:
            if isinstance(e.reason, ConnectTimeoutError):
                # TODO: Remove this in 3.0.0: see #2811
                if not isinstance(e.reason, NewConnectionError):
                    raise ConnectTimeout(e, request=request)
    
            if isinstance(e.reason, ResponseError):
                raise RetryError(e, request=request)
    
            if isinstance(e.reason, _ProxyError):
                raise ProxyError(e, request=request)
    
>           raise ConnectionError(e, request=request)
E           requests.exceptions.ConnectionError: 
HTTPConnectionPool(host='archive.org', port=80): Max retries exceeded with url: 
/metadata/nasa (Caused by 
NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object 
at 0x7f92347c7b70>: Failed to establish a new connection: [Errno -2] Name or 
service not known',))

/gnu/store/avxn9b7hva7p7lnbafyzvngsbsf8nwd0-python-requests-2.13.0/lib/python3.5/site-packages/requests/adapters.py:487: ConnectionError

During handling of the above exception, another exception occurred:

capsys = <_pytest.capture.CaptureFixture object at 0x7f9233fc2358>

    def test_ia(capsys):
        ia_call(['ia', '--help'])
        out, err = capsys.readouterr()
        assert 'A command line interface to Archive.org.' in out
    
>       ia_call(['ia', '--insecure', 'ls', 'nasa'])

../../../internetarchive-1.7.1/tests/cli/test_ia.py:9: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../../internetarchive-1.7.1/tests/conftest.py:51: in ia_call
    ia.main()
../../../internetarchive-1.7.1/internetarchive/cli/ia.py:159: in main
    sys.exit(ia_module.main(argv, session))
../../../internetarchive-1.7.1/internetarchive/cli/ia_list.py:46: in main
    item = session.get_item(args['<identifier>'])
../../../internetarchive-1.7.1/internetarchive/session.py:214: in get_item
    item_metadata = self.get_metadata(identifier, request_kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <internetarchive.session.ArchiveSession object at 0x7f9233640630>
identifier = 'nasa', request_kwargs = {'timeout': 12}

    def get_metadata(self, identifier, request_kwargs=None):
        """Get an item's metadata from the `Metadata API
            <http://blog.archive.org/2013/07/04/metadata-api/>`__
    
            :type identifier: str
            :param identifier: Globally unique Archive.org identifier.
    
            :rtype: dict
            :returns: Metadata API response.
            """
        request_kwargs = {} if not request_kwargs else request_kwargs
        url = '{0}//archive.org/metadata/{1}'.format(self.protocol, identifier)
        if 'timeout' not in request_kwargs:
            request_kwargs['timeout'] = 12
        try:
            resp = self.get(url, **request_kwargs)
            resp.raise_for_status()
        except Exception as exc:
            error_msg = 'Error retrieving metadata from {0}, {1}'.format(url, exc)
            logger.error(error_msg)
>           raise type(exc)(error_msg)
E           requests.exceptions.ConnectionError: Error retrieving metadata from http://archive.org/metadata/nasa, HTTPConnectionPool(host='archive.org', port=80): Max retries exceeded with url: /metadata/nasa (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f92347c7b70>: Failed to establish a new connection: [Errno -2] Name or service not known',))

../../../internetarchive-1.7.1/internetarchive/session.py:242: ConnectionError
--------------------------------- Captured log ---------------------------------
session.py                 213 DEBUG    no metadata provided for "nasa", retrieving now.
connectionpool.py          207 DEBUG    Starting new HTTP connection (1): archive.org
retry.py                   378 DEBUG    Incremented Retry for (url='/metadata/nasa'): Retry(total=2, connect=2, read=3, redirect=0)
connectionpool.py          673 WARNING  Retrying (Retry(total=2, connect=2, read=3, redirect=0)) after connection broken by 'NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f923370e358>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /metadata/nasa
connectionpool.py          207 DEBUG    Starting new HTTP connection (2): archive.org
retry.py                   378 DEBUG    Incremented Retry for (url='/metadata/nasa'): Retry(total=1, connect=1, read=3, redirect=0)
connectionpool.py          673 WARNING  Retrying (Retry(total=1, connect=1, read=3, redirect=0)) after connection broken by 'NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f92346f9240>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /metadata/nasa
connectionpool.py          207 DEBUG    Starting new HTTP connection (3): archive.org
retry.py                   378 DEBUG    Incremented Retry for (url='/metadata/nasa'): Retry(total=0, connect=0, read=3, redirect=0)
connectionpool.py          673 WARNING  Retrying (Retry(total=0, connect=0, read=3, redirect=0)) after connection broken by 'NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f92346f9208>: Failed to establish a new connection: [Errno -2] Name or service not known',)': /metadata/nasa
connectionpool.py          207 DEBUG    Starting new HTTP connection (4): archive.org
session.py                 241 ERROR    Error retrieving metadata from http://archive.org/metadata/nasa, HTTPConnectionPool(host='archive.org', port=80): Max retries exceeded with url: /metadata/nasa (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f92347c7b70>: Failed to establish a new connection: [Errno -2] Name or service not known',))
_________________________________ test_no_args _________________________________

tmpdir_ch = local('/tmp/guix-build-python-internetarchive-1.7.1.drv-0/pytest-of-nixbld/pytest-0/test_no_args0')

    def test_no_args(tmpdir_ch):
        call_cmd('ia --insecure download nasa')
>       assert files_downloaded(path='nasa') == NASA_EXPECTED_FILES
E       AssertionError: assert set() == {'NASAarchiveLogo.jpg', 'g...xml', 'nasa_meta.xml', ...}
E         Extra items in the right set:
E         'nasa_meta.xml'
E         'nasa_archive.torrent'
E         'NASAarchiveLogo.jpg'
E         'nasa_reviews.xml'
E         'nasa_files.xml'
E         'globe_west_540.jpg'
E         'globe_west_540_thumb.jpg'
E         Use -v to get the full diff

../../../internetarchive-1.7.1/tests/cli/test_ia_download.py:8: AssertionError
__________________________________ test_https __________________________________

tmpdir_ch = local('/tmp/guix-build-python-internetarchive-1.7.1.drv-0/pytest-of-nixbld/pytest-0/test_https0')

    def test_https(tmpdir_ch):
        if sys.version_info < (2, 7, 9):
            stdout, stderr = call_cmd('ia download nasa', expected_exit_code=1)
            assert 'You are attempting to make an HTTPS' in stderr
        else:
            call_cmd('ia download nasa')
>           assert files_downloaded(path='nasa') == NASA_EXPECTED_FILES
E           AssertionError: assert set() == {'NASAarchiveLogo.jpg', 'g...xml', 'nasa_meta.xml', ...}
E             Extra items in the right set:
E             'nasa_meta.xml'
E             'nasa_archive.torrent'
E             'NASAarchiveLogo.jpg'
E             'nasa_reviews.xml'
E             'nasa_files.xml'
E             'globe_west_540.jpg'
E             'globe_west_540_thumb.jpg'
E             Use -v to get the full diff

../../../internetarchive-1.7.1/tests/cli/test_ia_download.py:17: AssertionError
_________________________________ test_dry_run _________________________________

    def test_dry_run():
        nasa_url = 'http://archive.org/download/nasa/'
        expected_urls = set([nasa_url + f for f in NASA_EXPECTED_FILES])
    
        stdout, stderr = call_cmd('ia --insecure download --dry-run nasa')
        output_lines = stdout.split('\n')
        dry_run_urls = set([x.strip() for x in output_lines if x and 'nasa:' not in x])
    
>       assert expected_urls == dry_run_urls
E       AssertionError: assert {'http://arch...eta.xml', ...} == set()
E         Extra items in the left set:
E         'http://archive.org/download/nasa/globe_west_540_thumb.jpg'
E         'http://archive.org/download/nasa/nasa_files.xml'
E         'http://archive.org/download/nasa/nasa_reviews.xml'
E         'http://archive.org/download/nasa/NASAarchiveLogo.jpg'
E         'http://archive.org/download/nasa/nasa_archive.torrent'
E         'http://archive.org/download/nasa/globe_west_540.jpg'
E         'http://archive.org/download/nasa/nasa_meta.xml'
E         Use -v to get the full diff

../../../internetarchive-1.7.1/tests/cli/test_ia_download.py:28: AssertionError
__________________________________ test_glob ___________________________________

tmpdir_ch = local('/tmp/guix-build-python-internetarchive-1.7.1.drv-0/pytest-of-nixbld/pytest-0/test_glob0')

    def test_glob(tmpdir_ch):
        expected_files = set([
            'globe_west_540.jpg',
            'NASAarchiveLogo.jpg',
            'globe_west_540_thumb.jpg'
        ])
    
        call_cmd('ia --insecure download --glob="*jpg" nasa')
>       assert files_downloaded(path='nasa') == expected_files
E       AssertionError: assert set() == {'NASAarchiveLogo.jpg', 'g...'globe_west_540_thumb.jpg'}
E         Extra items in the right set:
E         'globe_west_540.jpg'
E         'globe_west_540_thumb.jpg'
E         'NASAarchiveLogo.jpg'
E         Use -v to get the full diff

../../../internetarchive-1.7.1/tests/cli/test_ia_download.py:39: AssertionError
_________________________________ test_format __________________________________

tmpdir_ch = local('/tmp/guix-build-python-internetarchive-1.7.1.drv-0/pytest-of-nixbld/pytest-0/test_format0')

    def test_format(tmpdir_ch):
        call_cmd('ia --insecure download --format="Archive BitTorrent" nasa')
>       assert files_downloaded(path='nasa') == set(['nasa_archive.torrent'])
E       AssertionError: assert set() == {'nasa_archive.torrent'}
E         Extra items in the right set:
E         'nasa_archive.torrent'
E         Use -v to get the full diff

../../../internetarchive-1.7.1/tests/cli/test_ia_download.py:44: AssertionError
_________________________________ test_clobber _________________________________

tmpdir_ch = local('/tmp/guix-build-python-internetarchive-1.7.1.drv-0/pytest-of-nixbld/pytest-0/test_clobber0')

    def test_clobber(tmpdir_ch):
        cmd = 'ia --insecure download nasa nasa_meta.xml'
        call_cmd(cmd)
>       assert files_downloaded('nasa') == set(['nasa_meta.xml'])
E       AssertionError: assert set() == {'nasa_meta.xml'}
E         Extra items in the right set:
E         'nasa_meta.xml'
E         Use -v to get the full diff

../../../internetarchive-1.7.1/tests/cli/test_ia_download.py:50: AssertionError
________________________________ test_checksum _________________________________

tmpdir_ch = local('/tmp/guix-build-python-internetarchive-1.7.1.drv-0/pytest-of-nixbld/pytest-0/test_checksum0')

    def test_checksum(tmpdir_ch):
        call_cmd('ia --insecure download nasa nasa_meta.xml')
>       assert files_downloaded('nasa') == set(['nasa_meta.xml'])
E       AssertionError: assert set() == {'nasa_meta.xml'}
E         Extra items in the right set:
E         'nasa_meta.xml'
E         Use -v to get the full diff

../../../internetarchive-1.7.1/tests/cli/test_ia_download.py:59: AssertionError
_____________________________ test_no_directories ______________________________

tmpdir_ch = local('/tmp/guix-build-python-internetarchive-1.7.1.drv-0/pytest-of-nixbld/pytest-0/test_no_directories0')

    def test_no_directories(tmpdir_ch):
        call_cmd('ia --insecure download --no-directories nasa nasa_meta.xml')
>       assert files_downloaded('.') == set(['nasa_meta.xml'])
E       AssertionError: assert set() == {'nasa_meta.xml'}
E         Extra items in the right set:
E         'nasa_meta.xml'
E         Use -v to get the full diff

../../../internetarchive-1.7.1/tests/cli/test_ia_download.py:69: AssertionError
_________________________________ test_destdir _________________________________

tmpdir_ch = local('/tmp/guix-build-python-internetarchive-1.7.1.drv-0/pytest-of-nixbld/pytest-0/test_destdir0')

    def test_destdir(tmpdir_ch):
        cmd = 'ia --insecure download --destdir=thisdirdoesnotexist/ nasa nasa_meta.xml'
        stdout, stderr = call_cmd(cmd, expected_exit_code=1)
    
        assert '--destdir must be a valid path to a directory.' in stderr
    
        tmpdir_ch.mkdir('thisdirdoesnotexist/')
        call_cmd(cmd)
>       assert files_downloaded('thisdirdoesnotexist/nasa') == set(['nasa_meta.xml'])
E       AssertionError: assert set() == {'nasa_meta.xml'}
E         Extra items in the right set:
E         'nasa_meta.xml'
E         Use -v to get the full diff

../../../internetarchive-1.7.1/tests/cli/test_ia_download.py:80: AssertionError
============================ pytest-warning summary ============================
WI1 /gnu/store/9mmg3cws531bybx4yv976f1s8dj3qir9-python-pytest-capturelog-0.7/lib/python3.5/site-packages/pytest_capturelog.py:171 'pytest_runtest_makereport' hook uses deprecated __multicall__ argument
WC1 None pytest_funcarg__caplog: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0.  Please remove the prefix and use the @pytest.fixture decorator instead.
WC1 None pytest_funcarg__capturelog: declaring fixtures using "pytest_funcarg__" prefix is deprecated and scheduled to be removed in pytest 4.0.  Please remove the prefix and use the @pytest.fixture decorator instead.
=========== 11 failed, 94 passed, 3 pytest-warnings in 76.14 seconds ===========
--8<---------------cut here---------------end--------------->8---
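
All eleven failures above come from tests that need the network: one in
tests/test_api.py (test_get_item_with_kwargs), one in
tests/cli/test_ia.py, and nine in tests/cli/test_ia_download.py.  An
untested sketch of a 'check' phase that deselects exactly those — file
paths taken from the log above, the rest is only a guess at what would
work — could be:

--8<---------------cut here---------------start------------->8---
(arguments
 `(#:phases
   (modify-phases %standard-phases
     (replace 'check
       (lambda _
         ;; Skip the tests that try to reach archive.org; the build
         ;; chroot has no network access.
         (zero? (system* "py.test"
                         "--ignore=tests/cli/test_ia.py"
                         "--ignore=tests/cli/test_ia_download.py"
                         "-k" "not test_get_item_with_kwargs")))))))
--8<---------------cut here---------------end--------------->8---

That would keep the other 94 tests running and drop only the
network-bound ones.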

Oleg Pykhalov (1):
  gnu: python-internetarchive: Update to 1.7.1.

 gnu/packages/web.scm | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

-- 
2.14.1