These are problems known to exist at the time of this release. Feel free to
join in and help us correct one or more of these! Also be sure to check the
changelog of the current development status, as one or more of these problems
may have been fixed since this was written!

* To get HTTP Negotiate authentication to work, you need to provide a
  (fake) user name (this concerns both curl and the lib), because the code
  wrongly only considers authentication when a user name is provided.
  Bug report #1004841.
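  A minimal libcurl sketch of the workaround (the URL and user name are
  hypothetical, for illustration only):

      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(curl) {
          curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/protected");
          /* ask for Negotiate authentication */
          curl_easy_setopt(curl, CURLOPT_HTTPAUTH, (long)CURLAUTH_GSSNEGOTIATE);
          /* the (fake) user name; without it the auth code never kicks in */
          curl_easy_setopt(curl, CURLOPT_USERPWD, "fakeuser:");
          curl_easy_perform(curl);
          curl_easy_cleanup(curl);
        }
        return 0;
      }

  With the command line tool, the equivalent is passing '-u fakeuser:'
  together with '--negotiate'.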

* If you use a very large number of file descriptors (more than FD_SETSIZE)
  and then use libcurl, it might crash in its use of select(), which then
  stores data out of bounds. Bug report #948950.
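  A sketch of a precaution (it works around, not fixes, the underlying
  problem): cap the process' descriptor limit below FD_SETSIZE before
  libcurl gets a chance to call select():

      #include <sys/resource.h>
      #include <sys/select.h>

      /* keep every file descriptor below FD_SETSIZE so that select()'s
         fd_set can never be written out of bounds */
      static void cap_fd_limit(void)
      {
        struct rlimit rl;
        if(getrlimit(RLIMIT_NOFILE, &rl) == 0 && rl.rlim_cur > FD_SETSIZE) {
          rl.rlim_cur = FD_SETSIZE;
          setrlimit(RLIMIT_NOFILE, &rl);
        }
      }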

* --limit-rate using -d or -F does not work. This is because the limit logic
  is provided by the curl app in its read/write callbacks, and when doing
  -d/-F the callbacks aren't used! Bug report #921395.
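  For illustration only, a rough sketch of how that application-side
  throttling looks when it is in effect: a read callback (installed with
  CURLOPT_READFUNCTION/CURLOPT_READDATA) that deliberately paces the data
  it hands over. Since -d/-F make libcurl provide the data itself, this
  callback never runs and nothing limits the rate:

      #include <string.h>
      #include <unistd.h>

      struct upload {
        const char *data;
        size_t left;
      };

      /* hypothetical throttled read callback: hand over at most ~1K per
         second instead of everything at once */
      static size_t slow_read(void *ptr, size_t size, size_t nmemb, void *userp)
      {
        struct upload *u = userp;
        size_t max = size * nmemb;
        size_t chunk = u->left < max ? u->left : max;
        if(chunk > 1024)
          chunk = 1024;
        memcpy(ptr, u->data, chunk);
        u->data += chunk;
        u->left -= chunk;
        if(chunk)
          sleep(1); /* crude pacing */
        return chunk;
      }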

* Doing a resumed upload over HTTP does not work with '-C -', because curl
  doesn't first do a HEAD request to figure out how much data the server
  already holds. For HTTP PUT resume to work, you need to find that size
  manually and then use '-C [offset]'.
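  A sketch of doing it manually with libcurl (file name and URL are
  hypothetical): first ask the server for the current size with a body-less
  request, then resume the PUT from that offset:

      #include <curl/curl.h>
      #include <stdio.h>

      int main(void)
      {
        double already = 0.0;
        long total = 0;
        FILE *src = fopen("upload.dat", "rb");
        CURL *curl = curl_easy_init();
        if(curl && src) {
          /* learn the local file's total size */
          fseek(src, 0, SEEK_END);
          total = ftell(src);
          rewind(src);

          /* step 1: find out how much the server already has */
          curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/upload.dat");
          curl_easy_setopt(curl, CURLOPT_NOBODY, 1L);
          if(curl_easy_perform(curl) == CURLE_OK)
            curl_easy_getinfo(curl, CURLINFO_CONTENT_LENGTH_DOWNLOAD, &already);
          if(already < 0)
            already = 0;

          /* step 2: resume the PUT from that offset, the manual '-C [offset]';
             libcurl reads and discards that many bytes from the input itself,
             so no manual seeking is done here */
          curl_easy_setopt(curl, CURLOPT_NOBODY, 0L);
          curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
          curl_easy_setopt(curl, CURLOPT_READDATA, src);
          curl_easy_setopt(curl, CURLOPT_INFILESIZE, total);
          curl_easy_setopt(curl, CURLOPT_RESUME_FROM, (long)already);
          curl_easy_perform(curl);
        }
        if(curl)
          curl_easy_cleanup(curl);
        if(src)
          fclose(src);
        return 0;
      }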

* CURLOPT_USERPWD and CURLOPT_PROXYUSERPWD have no way of providing user names
  that contain a colon. This can't be fixed easily in a backwards compatible
  way without adding new options (and then, they should most probably allow
  setting user name and password separately).
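  For illustration (assuming 'curl' is an already created easy handle and the
  credentials are made up): the string is split at the first colon, so this
  sets user "joe" with password "mar:s3cret", and there is no way to express
  the user name "joe:mar":

      curl_easy_setopt(curl, CURLOPT_USERPWD, "joe:mar:s3cret");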

* libcurl ignores empty path parts in FTP URLs, whereas RFC1738 states that
  such parts should be sent to the server as 'CWD ' (without an argument).
  The only exception to this rule is that we knowingly break it if the
  empty part is first in the path, as we then use the double slashes to
  indicate that the user wants to reach the root dir (this exception SHALL
  remain even when this bug is fixed).
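  For illustration, the path in a (hypothetical) URL like
  'ftp://host/a//b/file.txt' should per RFC1738 result in the command
  sequence below, with the empty part becoming an argument-less CWD, whereas
  libcurl currently just skips that step:

      CWD a
      CWD
      CWD b
      RETR file.txt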

* libcurl doesn't treat the content-length of compressed data properly, as
  it seems HTTP servers send the *uncompressed* length in that header and
  libcurl treats it as the *compressed* length. Some explanations are here:
  http://curl.haxx.se/mail/lib-2003-06/0146.html

* Downloading zero-byte files over FTP will not create a zero-byte file
  locally, which is because libcurl doesn't call the write callback with zero
  bytes. Explained here: http://curl.haxx.se/mail/archive-2003-04/0143.html
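  A sketch of a workaround on the application side (file name and URL are
  hypothetical): open/truncate the local file before the transfer, so that it
  exists even though the write callback never gets called:

      #include <curl/curl.h>
      #include <stdio.h>

      int main(void)
      {
        FILE *out = fopen("local.dat", "wb"); /* created even if 0 bytes arrive */
        CURL *curl = curl_easy_init();
        if(curl && out) {
          curl_easy_setopt(curl, CURLOPT_URL, "ftp://example.com/empty.dat");
          curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);
          curl_easy_perform(curl);
        }
        if(curl)
          curl_easy_cleanup(curl);
        if(out)
          fclose(out);
        return 0;
      }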

* IPv6 support on AIX 4.3.3 doesn't work due to a missing sockaddr_storage
  struct. It has been reported to work on AIX 5.1 though.

* GOPHER transfers seem broken

* If an HTTP server responds to a HEAD request with a body (thus violating
  RFC2616), curl won't wait for that body but simply stops reading and
  returns. If a second request (let's assume a GET) is then immediately made
  to the same server, the connection gets re-used just fine of course, and
  the second request is sent off, but when its response is to be read, curl
  reads the left-over body of the previous response instead and havoc is the
  result.
  More details can be found in this libcurl mailing list thread:
  http://curl.haxx.se/mail/lib-2002-08/0000.html
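  When talking to a server that is known to misbehave like this, a sketch of
  a defensive workaround is to keep that connection from being re-used at
  all, at the cost of a fresh connection per request (the URL is
  hypothetical):

      #include <curl/curl.h>

      int main(void)
      {
        CURL *curl = curl_easy_init();
        if(curl) {
          curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
          curl_easy_setopt(curl, CURLOPT_NOBODY, 1L);       /* the HEAD */
          curl_easy_setopt(curl, CURLOPT_FORBID_REUSE, 1L); /* drop it after use */
          curl_easy_perform(curl);

          /* the GET then opens a new connection, never seeing the stray
             body the broken server sent in response to the HEAD */
          curl_easy_setopt(curl, CURLOPT_NOBODY, 0L);
          curl_easy_setopt(curl, CURLOPT_HTTPGET, 1L);
          curl_easy_setopt(curl, CURLOPT_FRESH_CONNECT, 1L);
          curl_easy_perform(curl);

          curl_easy_cleanup(curl);
        }
        return 0;
      }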