Enabling test2017 to test2022.
Currently tests 1356 to 1362 succeed, but a write failure is logged in
traceNNNN. Test 1363 currently fails, so it is disabled for now.
Currently tests 1348 to 1354 succeed, but a write failure is logged in
traceNNNN. Test 1355 currently fails, so it is disabled for now.
When doing a chunked-encoded POST with -d (CURLOPT_POSTFIELDS) and the
size of the POST was zero, libcurl would first send a zero-length chunk
and then the terminating one. This could confuse a receiver; it should
rather send just the terminating chunk, which it does with this fix.
Test case 1333 is added to verify.
Bug: http://curl.haxx.se/mail/archive-2012-04/0060.html
Reported by: Arnaud Compan
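
As an illustration, a minimal libcurl sketch of the scenario (the URL
is a placeholder, not from the commit): an empty CURLOPT_POSTFIELDS
body combined with a chunked Transfer-Encoding request header.

    #include <curl/curl.h>

    /* Zero-length chunked POST: with the fix, libcurl sends only the
       terminating chunk ("0\r\n\r\n") instead of a zero chunk first. */
    int main(void)
    {
      CURL *curl = curl_easy_init();
      struct curl_slist *hdrs = NULL;

      if(curl) {
        hdrs = curl_slist_append(hdrs, "Transfer-Encoding: chunked");
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
        curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, ""); /* zero-length body */
        curl_easy_perform(curl);
        curl_slist_free_all(hdrs);
        curl_easy_cleanup(curl);
      }
      return 0;
    }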
Verify that cookies are sent back even after a 407 response has been
received
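
A hedged sketch of what the test exercises (proxy address and
credentials are made up): a cookie set on the handle must still be sent
on the request that is retried with proxy credentials after the 407.

    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
        curl_easy_setopt(curl, CURLOPT_PROXY, "http://proxy.example.com:3128");
        curl_easy_setopt(curl, CURLOPT_PROXYUSERPWD, "user:secret");
        /* this cookie must survive the 407-and-retry round trip */
        curl_easy_setopt(curl, CURLOPT_COOKIE, "name=value");
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }
      return 0;
    }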
With FOLLOWLOCATION enabled: when a 3xx page was downloaded and its
download size was known (such as with a Content-Length header), but the
subsequent URL (transferred after the 3xx page) was chunked-encoded,
the previous "known download size" would linger and give the progress
meter incorrect information, i.e. the former value would keep being
used. This could easily result in downloads that were WAY larger than
"expected" and would cause >100% outputs with the curl command line
tool.
Test case 599 was created to repeat the bug and then verify the fix.
Bug: http://curl.haxx.se/bug/view.cgi?id=3510057
Reported by: Michael Wallner
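
One way to observe the symptom from libcurl (a sketch only; the URL is
a placeholder for a server that redirects to a chunked-encoded page):
install a progress callback and watch dltotal, which before the fix
could keep the 3xx page's Content-Length during the follow-up transfer.

    #include <stdio.h>
    #include <curl/curl.h>

    static int progress(void *p, double dltotal, double dlnow,
                        double ultotal, double ulnow)
    {
      (void)p; (void)ultotal; (void)ulnow;
      printf("now %.0f of total %.0f\n", dlnow, dltotal);
      return 0; /* continue the transfer */
    }

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/redirecting");
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
        curl_easy_setopt(curl, CURLOPT_NOPROGRESS, 0L);
        curl_easy_setopt(curl, CURLOPT_PROGRESSFUNCTION, progress);
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }
      return 0;
    }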
These tests check the output of the --libcurl option of curl,
including the improved option handling added in a related patch.
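
For orientation, a file written by 'curl --libcurl out.c <URL>' has
roughly this shape (a sketch, not verbatim tool output; the options
emitted depend on the command line used, and the URL is a placeholder):

    #include <curl/curl.h>

    int main(int argc, char *argv[])
    {
      CURLcode ret;
      CURL *hnd = curl_easy_init();

      (void)argc; (void)argv;
      curl_easy_setopt(hnd, CURLOPT_URL, "http://example.com/");
      curl_easy_setopt(hnd, CURLOPT_USERAGENT, "curl");
      /* ...one setopt per option the command line implied... */
      ret = curl_easy_perform(hnd);
      curl_easy_cleanup(hnd);
      return (int)ret;
    }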
The proxy parser function strips trailing slashes off the proxy name,
which could lead to a mistaken zero-length proxy name that subsequent
functions would treat as no proxy at all!
This is now detected and an error is returned. Verified by the new test
1329.
Reported by: Chandrakant Bagul
Bug: http://curl.haxx.se/mail/lib-2012-02/0000.html
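
The logic of the fix, as an illustrative sketch (this is not the actual
libcurl source; clean_proxy_name is a made-up helper):

    #include <stdlib.h>
    #include <string.h>

    /* Strip trailing slashes from a proxy name, then reject a name that
       became empty instead of silently treating it as "no proxy". */
    static char *clean_proxy_name(const char *input)
    {
      size_t len = strlen(input);
      char *name;

      while(len && input[len - 1] == '/')
        len--;              /* drop trailing slashes */

      if(!len)
        return NULL;        /* only slashes: caller returns an error */

      name = malloc(len + 1);
      if(name) {
        memcpy(name, input, len);
        name[len] = '\0';
      }
      return name;
    }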
We want to continue to the next URL to try even on failures returned
from libcurl. This makes -f with ranges still fetch subsequent URLs even
if occasional ones return an error. This was a regression: it used to
work and broke in the 7.23.0 release.
Added test case 1328 to verify the fix.
Bug: http://curl.haxx.se/bug/view.cgi?id=3481223
Reported by: Juan Barreto
Related to the security vulnerability: CVE-2012-0036
Bug: http://curl.haxx.se/docs/adv_20120124.html
Add simple telnet tests which (ab)use the http server.
The second test checks for an input file handling bug.
This newly specified HTTP status code already works as intended in the
new spec:
http://greenbytes.de/tech/webdav/draft-reschke-http-status-308-02.html
Test 1325 is added to verify that the request method is kept after the
redirect.
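
A minimal sketch of the behavior test 1325 checks (the URL is a
placeholder): after a 308 response, the follow-up request keeps the
original method and body.

    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/old");
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "moo");
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
        /* on a 308, the redirected request is still a POST with "moo" */
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }
      return 0;
    }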
There's a new 'http-proxy' server for tests that runs on a separate port
and lets clients do HTTP CONNECT to other ports on the same host to
allow us to test HTTP "tunneling" properly.
Test cases now have a <proxy> section in <verify> to check that the
proxy protocol part matches correctly.
Test cases 80, 83, 95, 275, 503 and 1078 have been converted. Test 1316
was added.
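
In libcurl terms, the tunneling the new server exercises is what
CURLOPT_HTTPPROXYTUNNEL requests (a sketch; proxy address is made up):

    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
        curl_easy_setopt(curl, CURLOPT_PROXY, "http://127.0.0.1:8888");
        /* CONNECT through the proxy instead of a plain proxied GET */
        curl_easy_setopt(curl, CURLOPT_HTTPPROXYTUNNEL, 1L);
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }
      return 0;
    }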
When an HTTP connection is re-used for a subsequent request without a
proxy, it would always re-use the Host: header of the first request. As
host names are case insensitive, this would make curl send a different
host name casing than what the particular request used.
Now it instead always uses the most recent host name, so the desired
casing is sent.
Added test case 1318 to verify.
Bug: http://curl.haxx.se/mail/lib-2011-12/0314.html
Reported by: Alex Vinnik
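
A sketch of the situation (host names are examples): two requests on
one easy handle that re-use the connection, differing only in URL
casing.

    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://Example.Com/first");
        curl_easy_perform(curl);  /* sends "Host: Example.Com" */

        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/second");
        curl_easy_perform(curl);  /* must now send "Host: example.com" */

        curl_easy_cleanup(curl);
      }
      return 0;
    }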
Test 1317 verifies --resolve (which previously leaked memory)
Bug: http://curl.haxx.se/bug/view.cgi?id=3463121
Reported by: "tw84452852"
When passing multiple files to a single -F, the parser would get all
confused if one of the specified files had a custom type= assigned.
Test case 1315 was added to verify this functionality.
Reported by: Colin Hogben
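
Roughly the libcurl equivalent of such a command line (a sketch; file
names and URL are examples): several files in one part, where one file
carries an explicit content type.

    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      struct curl_httppost *post = NULL, *last = NULL;

      curl_formadd(&post, &last,
                   CURLFORM_COPYNAME, "files",
                   CURLFORM_FILE, "a.txt",
                   CURLFORM_CONTENTTYPE, "text/plain", /* the type= part */
                   CURLFORM_FILE, "b.html",
                   CURLFORM_END);
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
        curl_easy_setopt(curl, CURLOPT_HTTPPOST, post);
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }
      curl_formfree(post);
      return 0;
    }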
test 595: for passive FTP
test 596: for active FTP
Test 815 is disabled for now since libcurl currently doesn't unescape
such lines the way it should. See mail:
http://curl.haxx.se/mail/lib-2011-11/0324.html
"Active FTP hangs if server does not open data connection"
The server first sends a 150 and then when libcurl waits for the data
transfer, the server sends a 425.
By setting PROTOPT_NOURLQUERY in the protocol handler struct, the
protocol will get the "query part" of the URL cut off before the data is
handled by the protocol-specific code. This makes libcurl adhere to
RFC3986 section 2.2.
Test 1220 is added to verify a file:// URL with query-part.
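
A small sketch of what the change means for file:// (the path is an
example): the query part no longer reaches the file handler.

    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        /* opens /tmp/moo, not a file literally named "moo?name=value" */
        curl_easy_setopt(curl, CURLOPT_URL, "file:///tmp/moo?name=value");
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }
      return 0;
    }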
A regression between 7.22.0 and 7.23.0: downloading a file with the
flags -O and -J resulted in the content being written to stdout if and
only if there was no Content-Disposition header in the HTTP response. If
there was a C-D header with a filename attribute, the output was
correctly written.
Reported by: Dave Reisner
Bug: http://curl.haxx.se/mail/archive-2011-11/0030.html
Prefixing a command with '*' means it is allowed to fail without
aborting the chain of actions.
591 -> FTP multi PORT and 425 on upload
592 -> FTP multi PORT and 421 on upload
593 -> FTP multi PORT upload, no data conn and no transient negative reply
594 -> FTP multi PORT upload, no data conn and no positive preliminary reply
1206 -> FTP PORT and 425 on download
1207 -> FTP PORT and 421 on download
1208 -> FTP PORT download, no data conn and no transient negative reply
1209 -> FTP PORT download, no data conn and no positive preliminary reply
This test is created to verify Rene Bernhardt's patch which makes sure
libcurl properly does _not_ deal with Negotiate if not asked to, even
if the proxy says it can serve it.
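
The application-side setup being verified looks like this sketch (proxy
and credentials are made up): auth is restricted to Basic, so libcurl
must not engage Negotiate even if the proxy offers it.

    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
        curl_easy_setopt(curl, CURLOPT_PROXY, "http://proxy.example.com:3128");
        curl_easy_setopt(curl, CURLOPT_PROXYAUTH, (long)CURLAUTH_BASIC);
        curl_easy_setopt(curl, CURLOPT_PROXYUSERPWD, "user:secret");
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }
      return 0;
    }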
I created test 587 in commit 840eff44f2b but forgot to add the file to
the tarball. Added now.
This uses the test 525 code verbatim, but disables EPRT in the server;
it should work just as well anyway.
As commit 5850cc4808ab clarifies, libcurl can deliver header lines that
are longer than CURL_MAX_WRITE_SIZE; only body data is limited to that
size. The curl tool had a check (when built debug-enabled) that made
the wrong comparison, and this new test 1205 verifies that larger
headers work.
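
For context, header data arrives via the header callback, and a single
header line may exceed CURL_MAX_WRITE_SIZE; only body data is capped at
that size (a sketch; the URL is a placeholder):

    #include <stdio.h>
    #include <curl/curl.h>

    static size_t on_header(char *buf, size_t size, size_t nitems, void *userp)
    {
      (void)buf; (void)userp;
      fprintf(stderr, "header line of %zu bytes\n", size * nitems);
      return size * nitems;  /* consume the full header line */
    }

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/bigheader");
        curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, on_header);
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }
      return 0;
    }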
The fix is pretty much the one Nick Zitzmann provided, just edited to do
the right indent levels and with test case 1204 added to verify the fix.
Bug: http://curl.haxx.se/mail/lib-2011-10/0190.html
Reported by: Nick Zitzmann
Verifies the fix from commit 322f3d5af7093
Content-Disposition headers can provide file names containing
semicolons, which previously would be cut off at that point.
Added test cases 1311 and 1312 to verify -J.
Bug: http://curl.haxx.se/bug/view.cgi?id=3375603
Reported by: Peter Hjalmarsson
The 20xx range is for multiple sequential tests.
When libcurl has told the server that there's a POST or PUT coming
(with a Content-Length and all), it has to either deliver that amount
of data or close the connection before trying a second request.
Adds test cases 1129, 1130 and 1131.
The bug report is about the use with 100-continue, but the change is
more generic.
Bug: http://curl.haxx.se/mail/lib-2011-06/0191.html
Reported by: Steven Parkes
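
A hedged sketch of the rule (URL and size are examples): if the
promised upload data is not delivered, the connection must be closed
rather than re-used for the next request.

    #include <curl/curl.h>

    /* Read callback that aborts instead of delivering the promised data */
    static size_t give_up(char *ptr, size_t size, size_t nmemb, void *userp)
    {
      (void)ptr; (void)size; (void)nmemb; (void)userp;
      return CURL_READFUNC_ABORT;
    }

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload");
        curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
        curl_easy_setopt(curl, CURLOPT_INFILESIZE_LARGE, (curl_off_t)1024);
        curl_easy_setopt(curl, CURLOPT_READFUNCTION, give_up);
        curl_easy_perform(curl);  /* fails mid-request */

        /* the second request must get a fresh connection, not the
           half-written one */
        curl_easy_setopt(curl, CURLOPT_UPLOAD, 0L);
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }
      return 0;
    }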