- Bug report #1338648 (http://curl.haxx.se/bug/view.cgi?id=1338648), which
  really is more of a feature request, pointed out that --max-redirs did not
  allow a value of 0, which would then return an error code on the first
  Location: header found. Based on Nis' patch, libcurl now supports
  CURLOPT_MAXREDIRS set to 0, or -1 for infinity. Added test case 274 to
  verify.
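  A minimal sketch of the behavior described above, assuming a typical easy
  interface setup (the URL is a placeholder; CURLOPT_FOLLOWLOCATION and
  CURLOPT_MAXREDIRS are the options involved):

      #include <stdio.h>
      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(curl) {
          CURLcode res;
          curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/redirects");
          curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
          /* 0 = fail on the first Location: header, -1 = follow forever */
          curl_easy_setopt(curl, CURLOPT_MAXREDIRS, 0L);
          res = curl_easy_perform(curl);
          if(res != CURLE_OK)
            fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(res));
          curl_easy_cleanup(curl);
        }
        return 0;
      }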
- Added CURL_DISABLE_TFTP; tftp.c doesn't compile as-is.
- Bug #1326306.
- Bug report #1334338 (http://curl.haxx.se/bug/view.cgi?id=1334338): when
  reading an SSL stream from a server and the server requests a
  "rehandshake", the current code simply returned this as an error. I have
  no good way to test this, but I've added a crude attempt at dealing with
  the situation slightly better: it performs a blocking handshake if this
  happens. Done like this because fixing it the "proper" way (handshaking
  asynchronously) would require quite some work, and I really need a good
  way to test this before making such a change.
- ... it could then accidentally crash. Presumably, this concerns FTP
  connections. (http://curl.haxx.se/bug/view.cgi?id=1330310)
- ... linked to the executable and not to the libcurld.lib.
  (http://curl.haxx.se/bug/view.cgi?id=1326676)
- libcurl now returns CURLE_COULDNT_RESOLVE_PROXY and
  CURLE_COULDNT_RESOLVE_HOST on resolving errors (as documented).
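  A sketch of how an application can tell the two resolve failures apart
  (the proxy name below is made up and intentionally unresolvable):

      #include <stdio.h>
      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(curl) {
          CURLcode res;
          curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
          curl_easy_setopt(curl, CURLOPT_PROXY, "no-such-proxy.invalid:3128");
          res = curl_easy_perform(curl);
          if(res == CURLE_COULDNT_RESOLVE_PROXY)
            fprintf(stderr, "could not resolve the proxy name\n");
          else if(res == CURLE_COULDNT_RESOLVE_HOST)
            fprintf(stderr, "could not resolve the remote host name\n");
          curl_easy_cleanup(curl);
        }
        return 0;
      }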
- When a server (wrongly) sends *two* WWW-Authenticate headers for Digest:
  while this should never happen in a sane world, libcurl previously got
  into an infinite loop when it occurred. Dave added test 273 to verify
  this.
- ... DEBUG_ADDRINFO to enable.
- ... if you set rtlibcfg=static for the make, then it builds with /MT;
  the default behaviour is /MD (the original).
  (http://curl.haxx.se/bug/view.cgi?id=1326665)
- ... when building the static library.
  (http://curl.haxx.se/bug/view.cgi?id=1326665)
- ... copy them there.
- ... was 0, as it seems at least some AIX versions don't like a "0"
  string there.
- ... modified since the given time, so we should compare with <= and not
  just <.
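  Assuming this entry refers to the time-condition check (the one exposed
  through CURLOPT_TIMECONDITION / --time-cond, which is my reading of the
  fragment rather than something it states), a minimal usage sketch:

      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(curl) {
          curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/file.txt");
          /* only transfer the document if it was modified since the given
             unix timestamp (a placeholder value) */
          curl_easy_setopt(curl, CURLOPT_TIMECONDITION,
                           (long)CURL_TIMECOND_IFMODSINCE);
          curl_easy_setopt(curl, CURLOPT_TIMEVALUE, 1130000000L);
          curl_easy_perform(curl);
          curl_easy_cleanup(curl);
        }
        return 0;
      }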
- ... the MEST and CEST time zones.
- ... "will break strict-aliasing rules".
- ... runtime libs.
- ... now contain the word "proxy" if the hostname is in fact a proxy.
  This will help users detect situations where they mistakenly use a
  proxy.
- Bug report #1299181 (http://curl.haxx.se/bug/view.cgi?id=1299181)
  identified a silly problem with Content-Range: headers in which the
  'bytes' keyword was written in a case other than all lowercase: it
  would cause a segfault!
- ... the modified FTPS negotiation change of August 19 2005. Thus, we
  revert the change back to pre-7.14.1 status.
- ... define SEC_ENTRY and thus fails unless this is done!
- ... protocol sockets even if the resolved address may say otherwise.
- ... Should we do this for all targets?
- ... have this check done in far too many places by now...
- ... added. TODO: add them to the docs, add a TFTP server to the test
  suite, and add TFTP to the list of protocols wherever those are
  mentioned.
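  Since this entry announces TFTP support, a minimal fetch sketch (the host
  and file name are placeholders; tftp:// is the URL scheme curl uses for
  this protocol):

      #include <stdio.h>
      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(curl) {
          CURLcode res;
          curl_easy_setopt(curl, CURLOPT_URL, "tftp://192.0.2.1/boot/image");
          res = curl_easy_perform(curl);  /* data goes to stdout by default */
          if(res != CURLE_OK)
            fprintf(stderr, "TFTP transfer failed: %s\n",
                    curl_easy_strerror(res));
          curl_easy_cleanup(curl);
        }
        return 0;
      }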
- ... Kevin Lussier pointed this out!
- ... for Windows, that could lead to an Access Violation when the multi
  interface was used, due to an issue with how the resolver thread was and
  was not terminated.
- ... docs/TODO
- ... from the command line tool with --ignore-content-length. This will
  make it easier to download files from Apache 1.x (and similar) servers
  that are still having problems serving files larger than 2 or 4 GB. When
  this option is enabled, curl will simply have to wait for the server to
  close the connection to signal the end of the transfer. I wrote test
  case 269 that runs a simple test to verify that this works.
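  For reference, a sketch using the corresponding libcurl option,
  CURLOPT_IGNORE_CONTENT_LENGTH (presumably what the command-line flag maps
  to; the URL is a placeholder):

      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(curl) {
          curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/huge-file.bin");
          /* ignore the (possibly bogus) Content-Length: header and read
             until the server closes the connection */
          curl_easy_setopt(curl, CURLOPT_IGNORE_CONTENT_LENGTH, 1L);
          curl_easy_perform(curl);
          curl_easy_cleanup(curl);
        }
        return 0;
      }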
- ... previously failed due to GnuTLS not allowing x509 v1 CA certs by
  default.
- ... that made curl run fine on his end. The key was to make sure we do
  the SSL/TLS negotiation immediately after the TCP connect is done, and
  not after a few other commands have been sent as we did previously. I
  don't consider this change necessary to obey the standards; I think this
  server is pickier than what the specs allow it to be, but I can't see
  how this modified libcurl code can add any problems for those who
  interpret the standards more liberally.
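  This appears to concern the FTPS negotiation order (my reading of the
  entry, not something it states outright). For context, a sketch of
  requesting SSL/TLS on an FTP transfer with the option names of this era
  (the host and credentials are placeholders):

      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(curl) {
          curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/file.txt");
          curl_easy_setopt(curl, CURLOPT_USERPWD, "user:password");
          /* require SSL/TLS for both the control and the data connection */
          curl_easy_setopt(curl, CURLOPT_FTP_SSL, (long)CURLFTPSSL_ALL);
          curl_easy_perform(curl);
          curl_easy_cleanup(curl);
        }
        return 0;
      }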
- ... tell libcurl to read cookies from a file (with CURLOPT_COOKIEFILE),
  add a cookie (with CURLOPT_COOKIELIST), tell it to write the result to a
  given cookie jar, and then never actually call curl_easy_perform(): the
  given file(s) to read were never read, but the output file was written,
  and thus it caused a "funny" result.
- While doing some tests for the bug above, I noticed that Firefox
  generates large numbers (for the expire time) in the cookies.txt file
  and libcurl didn't treat them properly. Now it does.
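  A sketch of the sequence described in the first item above (the file
  names and the cookie line are placeholders; with CURLOPT_COOKIEJAR set,
  the jar is written when the handle is cleaned up, even without a
  transfer):

      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(curl) {
          /* read existing cookies from a file */
          curl_easy_setopt(curl, CURLOPT_COOKIEFILE, "cookies-in.txt");
          /* add one cookie by hand, in Set-Cookie header format */
          curl_easy_setopt(curl, CURLOPT_COOKIELIST,
                           "Set-Cookie: session=abc123; domain=example.com; path=/");
          /* write the resulting cookies to this jar at cleanup time */
          curl_easy_setopt(curl, CURLOPT_COOKIEJAR, "cookies-out.txt");
          /* note: no curl_easy_perform() call - this is the scenario the
             entry above describes */
          curl_easy_cleanup(curl);
        }
        return 0;
      }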