- Bug report #1338648 (http://curl.haxx.se/bug/view.cgi?id=1338648), which really is more of a feature request, pointed out that --max-redirs could not be set to 0, which would make curl return an error code on the first Location: header found. Based on Nis' patch, libcurl now supports CURLOPT_MAXREDIRS set to 0 (permit no redirects) or -1 (for infinity). Added test case 274 to verify.
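  A minimal sketch of how an application would use the new values (hypothetical URL; error handling omitted):

    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
        /* 0 = fail on the first Location: found, -1 = follow without limit */
        curl_easy_setopt(curl, CURLOPT_MAXREDIRS, 0L);
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }
      return 0;
    }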
- Bug report #1337723 (http://curl.haxx.se/bug/view.cgi?id=1337723) showed that curl could not upload binary data from stdin on Windows if the data contained a control-Z (hex 1a), since that byte is treated as end-of-file when stdin is read in text mode. Gisle Vanem pointed out the fix, and I made both -T and --data-binary take advantage of it.
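  The remedy on Windows is to put stdin into binary mode before reading it; a minimal sketch of the idea (the actual patch may differ):

    #include <stdio.h>
    #ifdef _WIN32
    #include <io.h>
    #include <fcntl.h>
    #endif

    static void set_stdin_binary(void)
    {
    #ifdef _WIN32
      /* keep the CRT from treating 0x1a as end-of-file */
      _setmode(_fileno(stdin), _O_BINARY);
    #endif
    }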
- When -r/--range is given a single number, rather than an actual range as described in the man page, curl would send an invalid HTTP Range: header. The correct way is to use "-r [number]-" or even "-r -[number]". Starting now, curl warns when this is detected and automatically appends a dash to the range before passing it to libcurl.
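  For reference (hypothetical URL), the supported forms are:

    curl -r 0-499 http://example.com/file   # the first 500 bytes
    curl -r 500-  http://example.com/file   # everything from offset 500
    curl -r -500  http://example.com/file   # the last 500 bytes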
- Bug report #1334338 (http://curl.haxx.se/bug/view.cgi?id=1334338): when reading an SSL stream from a server and the server requests a "rehandshake", the current code simply returned this as an error. I have no good way to test this, but I've added a crude attempt at dealing with the situation slightly better: it performs a blocking handshake if this happens. It is done like this because fixing it the "proper" way (handshaking asynchronously) will require quite some work, and I really need a good way to test it before making such a change.
- …it could then accidentally crash. Presumably, this concerns FTP connections. http://curl.haxx.se/bug/view.cgi?id=1330310
- …linked to the executable and not to libcurld.lib. http://curl.haxx.se/bug/view.cgi?id=1326676
- libcurl now returns CURLE_COULDNT_RESOLVE_PROXY and CURLE_COULDNT_RESOLVE_HOST on resolving errors (as documented).
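  A caller can thus tell the two failures apart; a minimal sketch (hypothetical URL):

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      CURLcode res;
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://no-such-host.example/");
        res = curl_easy_perform(curl);
        if(res == CURLE_COULDNT_RESOLVE_HOST)
          fprintf(stderr, "could not resolve the host name\n");
        else if(res == CURLE_COULDNT_RESOLVE_PROXY)
          fprintf(stderr, "could not resolve the proxy name\n");
        curl_easy_cleanup(curl);
      }
      return 0;
    }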
- libcurl now copes with a server that (wrongly) sends *two* WWW-Authenticate headers for Digest. While this should never happen in a sane world, libcurl previously got into an infinite loop when it occurred. Dave added test 273 to verify this.
- MSVC makefile improvement: "if you set rtlibcfg=static for the make, then it would build with /MT. The default behaviour is /MD (the original)." http://curl.haxx.se/bug/view.cgi?id=1326665
- Fixed the version-number define: as reported, the define is used by the configure script and is assumed to use the 0xYYXXZZ format. This made "curl-config --vernum" fail in the 7.15.0 release version.
- Added support for the MEST and CEST time zones.
- Bug report #1299181 (http://curl.haxx.se/bug/view.cgi?id=1299181) identified a silly problem with Content-Range: headers when the 'bytes' keyword was written in any case other than all lowercase: it would cause a segfault!
- Reverted the modified FTPS negotiation change of August 19 2005, going back to the pre-7.14.1 behaviour.
- …added. TODO: add them to the docs, add a TFTP server to the test suite, and add TFTP to the list of protocols wherever those are mentioned.
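  Assuming this refers to the new TFTP protocol support, command line usage would look like this (hypothetical host):

    curl tftp://tftp.example.com/bootfile -o bootfile      # download
    curl -T image.bin tftp://tftp.example.com/image.bin    # upload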
- …for Windows, which could lead to an Access Violation when the multi interface was used, due to an issue with how the resolver thread was (and was not) terminated.
- …now also available from the command line tool as --ignore-content-length. This will make it easier to download files from Apache 1.x (and similar) servers that still have problems serving files larger than 2 or 4 GB. With this option enabled, curl simply waits for the server to close the connection to signal the end of the transfer. I wrote test case 269 that runs a simple check that this works.
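  For example (hypothetical URL):

    curl --ignore-content-length -O http://example.com/huge-file.iso

  Note that with the Content-Length: header ignored, the transfer only ends when the server closes the connection, so a prematurely dropped connection cannot be told apart from a complete transfer.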
- …previously failed due to GnuTLS not allowing x509 v1 CA certs by default.
- …that made curl run fine against his server. The key was to make sure we do the SSL/TLS negotiation immediately after the TCP connect and not after a few other commands have been sent, as we did previously. I don't consider this change necessary to obey the standards; I think this server is pickier than the specs allow it to be, but I can't see how the modified libcurl code can cause problems for those who interpret the standards more liberally.
- …if you tell libcurl to read cookies from a file (with CURLOPT_COOKIEFILE), add a cookie (with CURLOPT_COOKIELIST), tell it to write the result to a given cookie jar, and then never actually call curl_easy_perform(), the given file(s) were never read, but the output file was still written, which caused a "funny" result.
- While doing some tests for the bug above, I noticed that Firefox generates large numbers (for the expire time) in its cookies.txt file and libcurl didn't treat them properly. Now it does.
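  A minimal sketch of the sequence described in the first item (hypothetical file names); with the fix, the input file is read even though no transfer is ever performed:

    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl = curl_easy_init();
      if(curl) {
        /* read cookies from this file when the cookie engine starts */
        curl_easy_setopt(curl, CURLOPT_COOKIEFILE, "cookies-in.txt");
        /* inject one cookie via the new list interface */
        curl_easy_setopt(curl, CURLOPT_COOKIELIST,
                         "Set-Cookie: name=value; domain=example.com;");
        /* write the combined result here at cleanup time */
        curl_easy_setopt(curl, CURLOPT_COOKIEJAR, "cookies-out.txt");
        /* note: no curl_easy_perform() at all */
        curl_easy_cleanup(curl);
      }
      return 0;
    }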
- …zone name of a daylight savings time was used, for example PDT vs PST. This flaw was introduced with the new date parser (11 sep 2004 - 7.12.2). Fortunately, no web server, cookie string etc should be using such time zone names, limiting the effect of this bug.
- …HTTP proxy if an FTP URL was given. libcurl now properly switches to pure HTTP internally when an HTTP proxy is used, even for FTP URLs. The problem would also occur with other multi-pass auth methods.
- …--features was used.
- When CURLOPT_HTTPGET is set to 1, CURLOPT_NOBODY will now automatically be set to 0.
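  Assuming the truncated subject is indeed CURLOPT_HTTPGET, the practical effect is that switching a reused handle back from HEAD to GET needs only one call:

    /* first request: HEAD */
    curl_easy_setopt(curl, CURLOPT_NOBODY, 1L);
    curl_easy_perform(curl);

    /* second request: plain GET - HTTPGET now clears NOBODY automatically */
    curl_easy_setopt(curl, CURLOPT_HTTPGET, 1L);
    curl_easy_perform(curl);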
- …a simple interface for extracting and setting cookies in libcurl's internal "cookie jar". See the new cookie_interface.c example code.
- …the trailer is then sent to the normal header callback/stream.
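  This means an existing header callback also receives any trailer lines that follow a chunked-encoded body; a minimal sketch:

    #include <stdio.h>
    #include <curl/curl.h>

    /* invoked once per header line, including trailers after the body */
    static size_t header_cb(char *buffer, size_t size, size_t nitems,
                            void *userdata)
    {
      (void)userdata;
      fwrite(buffer, size, nitems, stderr);
      return size * nitems;
    }

    /* ... curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, header_cb); ... */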
- …it seems the Windows (MSVC) libc time functions may return data one hour off if TZ is not set and automatic DST adjustment is enabled. This made curl_getdate() return a wrong value, and it also affected internal cookie expirations etc.
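  For reference, curl_getdate() is the public entry point affected; its second argument is unused and may be NULL:

    #include <stdio.h>
    #include <time.h>
    #include <curl/curl.h>

    int main(void)
    {
      /* parse an RFC 1123 date string into a time_t */
      time_t t = curl_getdate("Sun, 06 Nov 1994 08:49:37 GMT", NULL);
      printf("%ld\n", (long)t);
      return 0;
    }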
- …fix the CONNECT authentication code with multi-pass auth methods (such as NTLM), as it previously didn't properly ignore response bodies - in fact, it stopped reading after all response headers had been received. This could lead to libcurl sending the next request and reading the body from the first request as the response to the second request. (I also renamed the function, which wasn't strictly necessary but...)
  The best fix would be to once and for all make the CONNECT code use the ordinary request sending/receiving code, treating it as any ordinary request instead of the special-purpose function we have now. That should make it work better with the multi interface too, and possibly lead to less code...
  Added test case 265 for this. It doesn't work as a _really_ good test case since the test proxy is too stupid, but it helps when running the debugger to verify.
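  The scenario in question is a tunnel through a proxy that demands multi-pass authentication, which an application sets up roughly like this (hypothetical proxy and credentials):

    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
    curl_easy_setopt(curl, CURLOPT_PROXY, "proxy.example.com:3128");
    /* tunnel through the proxy using CONNECT */
    curl_easy_setopt(curl, CURLOPT_HTTPPROXYTUNNEL, 1L);
    /* NTLM needs several request/response rounds on the same connection */
    curl_easy_setopt(curl, CURLOPT_PROXYAUTH, CURLAUTH_NTLM);
    curl_easy_setopt(curl, CURLOPT_PROXYUSERPWD, "user:password");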
- …
  1) findtool looks for each tool in the PATH and thinks ./perl is the perl executable, while it is just a local directory (I have . in the PATH)
  2) I got several warnings that head -1 is deprecated in favour of head -n 1
  3) the ares directory is missing some files (missing is missing :-) ) because automake and friends were not run.
  (Let's hope number 2 doesn't break somewhere "out there"; if so, we can always search/replace that back.)
- …address was not possible to use. It is now, but it requires being written RFC2732-style, within brackets - which incidentally is how numerical IPv6 addresses are entered in URLs. Test case 263 added to verify.
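  The bracket syntax looks like this (hypothetical documentation-prefix address; the exact option this entry refers to is truncated above):

    curl "http://[2001:db8::1]/path"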