From: gnunet
Subject: [GNUnet-SVN] [gnurl] 86/150: http: fix the max header length detection logic
Date: Fri, 30 Mar 2018 16:49:00 +0200

This is an automated email from the git hooks/post-receive script.

ng0 pushed a commit to branch master
in repository gnurl.

commit 03370fa5a0ac5c1deae4315f01f19e9f1bc53662
Author: Daniel Stenberg <address@hidden>
AuthorDate: Fri Feb 16 09:49:33 2018 +0100

    http: fix the max header length detection logic
    
    Previously, it would only check for max length if the existing alloc
    buffer was too small to fit it, so an over-long header could still get
    used whenever it happened to fit the already-allocated buffer.
    
    Reported-by: Guido Berhoerster
    Bug: https://curl.haxx.se/mail/lib-2018-02/0056.html
    
    Closes #2315
---
 lib/http.c | 21 ++++++++++-----------
 1 file changed, 10 insertions(+), 11 deletions(-)

diff --git a/lib/http.c b/lib/http.c
index f44b18ae9..c1c7b3908 100644
--- a/lib/http.c
+++ b/lib/http.c
@@ -2880,20 +2880,19 @@ static CURLcode header_append(struct Curl_easy *data,
                               struct SingleRequest *k,
                               size_t length)
 {
-  if(k->hbuflen + length >= data->state.headersize) {
+  size_t newsize = k->hbuflen + length;
+  if(newsize > CURL_MAX_HTTP_HEADER) {
+    /* The reason to have a max limit for this is to avoid the risk of a bad
+       server feeding libcurl with a never-ending header that will cause
+       reallocs infinitely */
+    failf(data, "Rejected %zd bytes header (max is %d)!", newsize,
+          CURL_MAX_HTTP_HEADER);
+    return CURLE_OUT_OF_MEMORY;
+  }
+  if(newsize >= data->state.headersize) {
     /* We enlarge the header buffer as it is too small */
     char *newbuff;
     size_t hbufp_index;
-    size_t newsize;
-
-    if(k->hbuflen + length > CURL_MAX_HTTP_HEADER) {
-      /* The reason to have a max limit for this is to avoid the risk of a bad
-         server feeding libcurl with a never-ending header that will cause
-         reallocs infinitely */
-      failf(data, "Avoided giant realloc for header (max is %d)!",
-            CURL_MAX_HTTP_HEADER);
-      return CURLE_OUT_OF_MEMORY;
-    }
 
     newsize = CURLMAX((k->hbuflen + length) * 3 / 2, data->state.headersize*2);
     hbufp_index = k->hbufp - data->state.headerbuff;

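The fix above moves the size-cap check out of the realloc branch so it runs on every append, not only when the buffer needs to grow. A minimal standalone sketch of that pattern (hypothetical names; `MAX_HTTP_HEADER` and `hdr_append` stand in for `CURL_MAX_HTTP_HEADER` and `header_append`, which carry more state in libcurl):

```c
#include <stdlib.h>
#include <string.h>

#define MAX_HTTP_HEADER (100 * 1024)  /* stand-in for CURL_MAX_HTTP_HEADER */
#define MAXOF(a, b) (((a) > (b)) ? (a) : (b))

struct hdrbuf {
  char *buf;
  size_t len;   /* bytes used */
  size_t size;  /* bytes allocated */
};

/* Append to the header buffer, enforcing the cap *before* deciding
   whether a realloc is needed -- the ordering the commit establishes. */
static int hdr_append(struct hdrbuf *h, const char *data, size_t length)
{
  size_t newsize = h->len + length;
  if(newsize > MAX_HTTP_HEADER)
    return -1; /* reject over-long headers unconditionally */
  if(newsize >= h->size) {
    /* grow by at least 1.5x the needed size or 2x the current size */
    size_t grown = MAXOF(newsize * 3 / 2, h->size * 2);
    char *p = realloc(h->buf, grown);
    if(!p)
      return -1;
    h->buf = p;
    h->size = grown;
  }
  memcpy(h->buf + h->len, data, length);
  h->len = newsize;
  return 0;
}
```

With the check inside the realloc branch (the old code), an over-long header that still fit the current allocation would be appended silently; hoisting it means every append is bounded regardless of the buffer's current size.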