fail to connect if there is no Common Name field found in the remote cert.
We should deprecate the support for setting this to 1 soon anyway, since the
feature is pointless and most likely never really used by anyone.
The tiny patch below fixes a bug (that I introduced :) which happens
when negotiating authentication with a proxy (probably with web
servers as well) that uses chunked transfer encoding for the 407 error
pages. In this case the ''ignorebody'' flag was ignored (no pun
intended).
to a proxy during CONNECT auth negotiation.
using one of the so-called 'right' time zones that take into account
leap seconds, which causes the tests to fail (as reported by
Daniel Black in bug report #1745964).
message for an scp:// upload failure. If libssh2 has his matching
patch, then the error message returned by the server will be used instead
of a more generic error.
libcurl. This also makes the options change name to --krb (from --krb4) and
CURLOPT_KRBLEVEL (from CURLOPT_KRB4LEVEL) but the old names are still
supported.
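
For illustration, a minimal sketch of requesting a kerberos protection level
with the renamed option (the URL and the chosen "private" level are
placeholders, not taken from this entry); the command-line equivalent would
use --krb:

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/file.txt");
      /* kerberos protection level: "clear", "safe", "confidential" or
         "private" */
      curl_easy_setopt(curl, CURLOPT_KRBLEVEL, "private");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }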
hash function for different hashes, and also expanded the default size for
the socket hash table used in multi handles to greatly enhance speed when
very many connections are added and the socket API is used.
chunked encoding (that also lacks "Connection: close"). It now simply
assumes that the connection WILL be closed to signal the end, as that is how
RFC2616 section 4.4 point #5 says we should behave.
http://curl.haxx.se/mail/lib-2007-06/0238.html, libcurl didn't properly do
no-body requests on FTP files on re-used connections, or at least it didn't
provide the info back in the header callback properly in the subsequent
requests.
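
For context, a minimal sketch of the kind of transfer this concerns: a
no-body request where the file info is delivered through the header
callback, performed twice so the second request re-uses the connection (the
URL and callback name are placeholders):

  #include <stdio.h>
  #include <curl/curl.h>

  /* receives the info/header lines, such as Content-Length for the file */
  static size_t header_cb(char *buffer, size_t size, size_t nitems,
                          void *userdata)
  {
    (void)userdata;
    fwrite(buffer, size, nitems, stdout);
    return size * nitems;
  }

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/file.txt");
      curl_easy_setopt(curl, CURLOPT_NOBODY, 1L); /* no body, info only */
      curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, header_cb);
      curl_easy_perform(curl);
      /* second transfer on the same handle re-uses the connection */
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }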
tool reports and it was indeed a legitimate one, and it is now fixed. It was
a use of a share without doing the proper locking first.
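
The fix itself is internal to libcurl, but for context this is roughly how a
share is meant to be used with lock callbacks on the application side (a
sketch assuming pthreads; all names here are illustrative):

  #include <pthread.h>
  #include <curl/curl.h>

  static pthread_mutex_t locks[CURL_LOCK_DATA_LAST];

  /* called by libcurl before it touches the shared data */
  static void lock_cb(CURL *handle, curl_lock_data data,
                      curl_lock_access locktype, void *userptr)
  {
    (void)handle; (void)locktype; (void)userptr;
    pthread_mutex_lock(&locks[data]);
  }

  /* called by libcurl when it is done with the shared data */
  static void unlock_cb(CURL *handle, curl_lock_data data, void *userptr)
  {
    (void)handle; (void)userptr;
    pthread_mutex_unlock(&locks[data]);
  }

  int main(void)
  {
    int i;
    CURLSH *share;
    for(i = 0; i < CURL_LOCK_DATA_LAST; i++)
      pthread_mutex_init(&locks[i], NULL);

    share = curl_share_init();
    curl_share_setopt(share, CURLSHOPT_LOCKFUNC, lock_cb);
    curl_share_setopt(share, CURLSHOPT_UNLOCKFUNC, unlock_cb);
    curl_share_setopt(share, CURLSHOPT_SHARE, CURL_LOCK_DATA_DNS);
    /* each easy handle then gets:
       curl_easy_setopt(curl, CURLOPT_SHARE, share); */
    curl_share_cleanup(share);
    return 0;
  }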
(http://curl.haxx.se/bug/view.cgi?id=1740263). Adam discovered that when
getting a large number of URLs with curl, they were fetched slower and
slower... which turned out to be because of the --libcurl data collecting,
which wrongly was always enabled but no longer is.
(http://curl.haxx.se/bug/view.cgi?id=1739100) that mentioned that libcurl
could not actually list the contents of the root directory of a given FTP
server if the login directory isn't root. I fixed the problem and added three
test cases (one is disabled for now since I identified KNOWN_BUGS #44: we
cannot use --ftp-method nocwd and list ftp directories).
HTTP CONNECT over a proxy
CURLOPT_FTP_CREATE_MISSING_DIRS works for SFTP connections as well as FTP
ones.
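
A minimal sketch of an SFTP upload relying on that option (host, login and
paths are placeholders):

  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    FILE *src = fopen("local.txt", "rb");
    CURL *curl = curl_easy_init();
    if(curl && src) {
      curl_easy_setopt(curl, CURLOPT_URL,
                       "sftp://user@sftp.example.com/new/sub/dir/remote.txt");
      curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
      curl_easy_setopt(curl, CURLOPT_READDATA, src);
      /* create the missing remote directories before uploading */
      curl_easy_setopt(curl, CURLOPT_FTP_CREATE_MISSING_DIRS, 1L);
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    if(src)
      fclose(src);
    return 0;
  }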
(http://curl.haxx.se/bug/view.cgi?id=1733119) and we collaborated on the fix.
The problem was that for 64-bit HPUX builds, several socket-related functions
would still assume int (32-bit) arguments and not socklen_t (64-bit) ones.
the SOCKS server.
- -s/--silent can now be used to toggle off the silence again if used a second
time.

Daniel S (5 June 2007)
- Added Daniel Black's work that adds the first few SOCKS test cases. I also
fixed two minor SOCKS problems to make the test cases run fine.
to find that it crashed miserably, and this was due to some select()isms left
in the code, a consequence of API restrictions in c-ares 1.3.x. With the
upcoming c-ares 1.4.0 this is no longer the case, so now libcurl runs much
better with c-ares and the multi interface with > 1024 file descriptors in
use.
the maximum size of the connection cache of the multi handle.
overwrite in Curl_select(). While fixing it, I also improved its performance
somewhat by changing calloc to malloc and breaking out of a loop earlier
(when possible).
(http://curl.haxx.se/bug/view.cgi?id=1705802), which was filed by Daniel
Black, identifying that several FTP-SSL test cases fail when we build libcurl
with NSS for TLS/SSL. Listed as #42 in KNOWN_BUGS.
(http://curl.haxx.se/bug/view.cgi?id=1724016) noticing that downloading
glob-ranges for TFTP was broken in CVS.
pointed out that the warnf() function in the curl tool didn't properly deal
with cases where excessively long words were used in the string to chop up.
peer's name in the SSL certificate when built for OpenSSL. The leak happens
for libcurls with CURL_DOES_CONVERSIONS enabled that fail to convert the CN
name from UTF8.
bug report #1715394 (http://curl.haxx.se/bug/view.cgi?id=1715394), and the
transfer-related info "variables" were indeed wrongly overwritten with zeroes
and have now been adjusted. The upload size still isn't accurate.
code for timeouts less than five seconds, and also provided a fix for it.
case 614.
Allow SFTP quote commands chmod, chown, chgrp to set a value of 0.
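
A minimal sketch of issuing such quote commands over SFTP (URL and remote
path are placeholders):

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      struct curl_slist *cmds = NULL;
      /* values of 0 are now accepted by chmod/chown/chgrp */
      cmds = curl_slist_append(cmds, "chmod 0 /tmp/uploaded.txt");
      cmds = curl_slist_append(cmds, "chown 0 /tmp/uploaded.txt");
      curl_easy_setopt(curl, CURLOPT_URL, "sftp://user@sftp.example.com/");
      curl_easy_setopt(curl, CURLOPT_QUOTE, cmds);
      curl_easy_perform(curl);
      curl_slist_free_all(cmds);
      curl_easy_cleanup(curl);
    }
    return 0;
  }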
sftp didn't truncate it first, which would corrupt the file if the new
file was shorter than the old.
because I just made SCP uploads return this value if the file size of
the upload file isn't given with CURLOPT_INFILESIZE*. Docs updated to
reflect this news, and a define for the old name was added to the public
header file.
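
A minimal sketch of an SCP upload that provides the size up front (file name
and host are placeholders):

  #include <stdio.h>
  #include <sys/stat.h>
  #include <curl/curl.h>

  int main(void)
  {
    struct stat info;
    FILE *src;
    CURL *curl;

    if(stat("local.bin", &info) != 0)
      return 1;
    src = fopen("local.bin", "rb");
    curl = curl_easy_init();
    if(curl && src) {
      curl_easy_setopt(curl, CURLOPT_URL, "scp://user@example.com/tmp/remote.bin");
      curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
      curl_easy_setopt(curl, CURLOPT_READDATA, src);
      /* SCP needs to know the upload size before the transfer starts */
      curl_easy_setopt(curl, CURLOPT_INFILESIZE_LARGE, (curl_off_t)info.st_size);
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    if(src)
      fclose(src);
    return 0;
  }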
cache grow a bit too much, beyond the normal 4 * easy_handles.
when CURLOPT_HTTP200ALIASES is used to avoid the problem mentioned below is
not very nice if the client wants to be able to use _either_ an HTTP 1.1
server or one within the aliases list... so starting now, libcurl will
simply consider 200-alias matches to be HTTP 1.0 compliant.
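
For illustration, a minimal sketch of how an application typically uses
CURLOPT_HTTP200ALIASES; note that with the change above, a matched alias
response is treated as HTTP 1.0 (the "ICY 200 OK" alias and URL are the
customary example, not taken from this entry):

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      struct curl_slist *aliases = NULL;
      /* accept this non-standard status line as a valid response */
      aliases = curl_slist_append(aliases, "ICY 200 OK");
      curl_easy_setopt(curl, CURLOPT_URL, "http://radio.example.com/stream");
      curl_easy_setopt(curl, CURLOPT_HTTP200ALIASES, aliases);
      curl_easy_perform(curl);
      curl_slist_free_all(aliases);
      curl_easy_cleanup(curl);
    }
    return 0;
  }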
libcurls, which turned out to be the 25-nov-2006 change which treats HTTP
responses without Content-Length or chunked encoding as without bodies. We
have now added the condition that the above-mentioned response is only
considered to be without a body if the response is HTTP 1.1.
CURLMOPT_TIMERFUNCTION callback option.
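
For context, a minimal sketch of registering a timer callback with the multi
handle (callback and variable names are illustrative):

  #include <curl/curl.h>

  /* libcurl calls this whenever the timeout it wants changes;
     timeout_ms of -1 means the timer should be deleted */
  static int timer_cb(CURLM *multi, long timeout_ms, void *userp)
  {
    long *store = (long *)userp;
    (void)multi;
    *store = timeout_ms; /* the app would (re)arm its event-loop timer here */
    return 0;
  }

  int main(void)
  {
    long current_timeout = -1;
    CURLM *multi = curl_multi_init();
    if(multi) {
      curl_multi_setopt(multi, CURLMOPT_TIMERFUNCTION, timer_cb);
      curl_multi_setopt(multi, CURLMOPT_TIMERDATA, &current_timeout);
      /* ...add easy handles and drive curl_multi_socket_action() from the
         application's own event loop... */
      curl_multi_cleanup(multi);
    }
    return 0;
  }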
when CURLM_CALL_MULTI_PERFORM is returned from curl_multi_socket*/perform,
to make applications using only curl_multi_socket() function properly
when adding easy handles "on the fly". Bug report and test app provided by
Michael Wallner.
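
A minimal sketch of that calling pattern: keep calling the socket function
for the same socket as long as it returns CURLM_CALL_MULTI_PERFORM (the
helper name is illustrative, and this is a fragment rather than a complete
program):

  #include <curl/curl.h>

  /* drive activity on one socket until libcurl stops asking to be
     called again right away */
  static void check_socket(CURLM *multi, curl_socket_t sockfd, int *running)
  {
    CURLMcode rc;
    do {
      rc = curl_multi_socket_action(multi, sockfd, 0, running);
    } while(rc == CURLM_CALL_MULTI_PERFORM);
  }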
for every year is giving us too many files! I also split out the changes
from 2006 from CHANGES to CHANGES.0 now.
the default port numbers, allowing more than one test suite to run
simultaneously on the same host.