Diffstat (limited to 'src')
| -rw-r--r-- | src/hugehelp.c | 1699 | 
1 file changed, 0 insertions, 1699 deletions
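The deleted file's header says "NEVER EVER edit this manually, fix the mkhelp script instead!" — it was machine-generated from the manual text. Curl's actual mkhelp script is not reproduced here; the following is only a hypothetical minimal sketch of the technique visible in the deleted file: escape backslashes and double quotes for C string literals, append `\n` to each line, and periodically start a new `puts()` call (the diff shows breaks like `); puts(`) so no single string literal grows too large for old compilers. The function names and the chunk size are illustrative assumptions, not curl's.

```python
# Hypothetical sketch of a "mkhelp"-style generator, NOT curl's real
# mkhelp script. It turns plain-text manual lines into a C file that
# prints them with puts(), as the deleted hugehelp.c does.

def c_escape(line: str) -> str:
    """Escape a text line for use inside a C string literal."""
    # Backslashes first, then double quotes.
    return line.replace('\\', '\\\\').replace('"', '\\"')

def generate_hugehelp(lines, chunk=500):
    """Emit C source for a hugehelp() function printing the lines.

    `chunk` (an arbitrary illustrative value) controls how often a new
    puts() call is started, mirroring the "); puts(" breaks in the diff.
    """
    out = ['/* NEVER EVER edit this manually, fix the mkhelp script instead! */',
           '#include <stdio.h>',
           'void hugehelp(void)',
           '{',
           'puts (']
    for i, line in enumerate(lines):
        # Start a new puts() call periodically so no single string
        # literal exceeds what old compilers accept.
        if i and i % chunk == 0:
            out.append(');')
            out.append(' puts(')
        out.append('"%s\\n"' % c_escape(line))
    out.append(');')
    out.append('}')
    return '\n'.join(out)
```

This explains the shape of the removed content: every manual line appears as an adjacent string literal ending in `\n`, with quotes escaped as `\"`, and the file is split across several `puts()` calls.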
| diff --git a/src/hugehelp.c b/src/hugehelp.c deleted file mode 100644 index d43a2e2a8..000000000 --- a/src/hugehelp.c +++ /dev/null @@ -1,1699 +0,0 @@ -/* NEVER EVER edit this manually, fix the mkhelp script instead! */ -#include <stdio.h> -void hugehelp(void) -{ -puts ( -"                                  _   _ ____  _     \n" -"  Project                     ___| | | |  _ \\| |    \n" -"                             / __| | | | |_) | |    \n" -"                            | (__| |_| |  _ <| |___ \n" -"                             \\___|\\___/|_| \\_\\_____|\n" -"NAME\n" -"     curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE,\n" -"     HTTP or HTTPS syntax.\n" -"\n" -"SYNOPSIS\n" -"     curl [options] url\n" -"\n" -"DESCRIPTION\n" -"     curl is a client to get documents/files from servers,  using\n" -"     any  of  the supported protocols. The command is designed to\n" -"     work without user interaction or any kind of  interactivity.\n" -"\n" -"     curl  offers  a busload of useful tricks like proxy support,\n" -"     user authentication, ftp upload,  HTTP  post,  SSL  (https:)\n" -"     connections, cookies, file transfer resume and more.\n" -"\n" -"URL\n" -"     The URL syntax is protocol dependent. 
You'll find a detailed\n" -"     description in RFC 2396.\n" -"\n" -"     You can specify multiple URLs or parts of  URLs  by  writing\n" -"     part sets within braces as in:\n" -"\n" -"      http://site.{one,two,three}.com\n" -"\n" -"     or  you can get sequences of alphanumeric series by using []\n" -"     as in:\n" -"\n" -"      ftp://ftp.numericals.com/file[1-100].txt\n" -"      ftp://ftp.numericals.com/file[001-100].txt    (with leading\n" -"     zeros)\n" -"      ftp://ftp.letters.com/file[a-z].txt\n" -"\n" -"     It  is possible to specify up to 9 sets or series for a URL,\n" -"     but no nesting is supported at the moment:\n" -"\n" -"      http://www.any.org/archive[1996-1999]/vol\n" -"     ume[1-4]part{a,b,c,index}.html\n" -"\n" -"OPTIONS\n" -"     -a/--append\n" -"          (FTP) When used in a ftp upload, this will tell curl to\n" -"          append to the target file instead of overwriting it. If\n" -"          the file doesn't exist, it will be created.\n" -"\n" -"          If  this option is used twice, the second one will dis\n" -"          able append mode again.\n" -"\n" -"     -A/--user-agent <agent string>\n" -"          (HTTP) Specify the User-Agent string  to  send  to  the\n" -"          HTTP  server.  Some badly done CGIs fail if its not set\n" -"          to \"Mozilla/4.0\".  To  encode  blanks  in  the  string,\n" -"          surround  the string with single quote marks.  This can\n" -"          also be set with the -H/--header flag of course.\n" -"\n" -"          If this option is used more than  once,  the  last  one\n" -"          will be the one to be used.\n" -"\n" -"     -b/--cookie <name=data>\n" -"          (HTTP) Pass the data to the HTTP server as a cookie. It\n" -"          is supposedly the data  previously  received  from  the\n" -"          server  in a \"Set-Cookie:\" line.  
The data should be in\n" -"          the format \"NAME1=VALUE1; NAME2=VALUE2\".\n" -"\n" -"          If no '=' letter is used in the line, it is treated  as\n" -"          a  filename  to  use  to  read previously stored cookie\n" -"          lines from, which should be used  in  this  session  if\n" -"          they  match.  Using  this  method  also  activates  the\n" -"          \"cookie parser\" which will make  curl  record  incoming\n" -"          cookies too, which may be handy if you're using this in\n" -"          combination with the  -L/--location  option.  The  file\n" -"          format of the file to read cookies from should be plain\n" -"          HTTP headers or the netscape cookie file format.\n" -"\n" -"          NOTE that the file specified with -b/--cookie  is  only\n" -"          used  as  input. No cookies will be stored in the file.\n" -"          To store cookies, save the HTTP headers to a file using\n" -"          -D/--dump-header!\n" -"\n" -"          If  this  option  is  used more than once, the last one\n" -"          will be the one to be used.\n" -"\n" -"     -B/--use-ascii\n" -"          Use ASCII transfer when getting an  FTP  file  or  LDAP\n" -"          info.  For  FTP,  this can also be enforced by using an\n" -"          URL that ends with \";type=A\". This option  causes  data\n" -"          sent to stdout to be in text mode for win32 systems.\n" -"\n" -"          If  this option is used twice, the second one will dis\n" -"          able ASCII usage.\n" -"\n" -"     -c/--continue\n" -"          Deprecated. Use '-C -' instead.  Continue/Resume a pre\n" -"          vious  file  transfer.  This instructs curl to continue\n" -"          appending data on the  file  where  it  was  previously\n" -"          left,  possibly  because  of a broken connection to the\n" -"          server. There must be a named physical file  to  append\n" -"          to  for  this to work.  
Note: Upload resume is depending\n" -"          on a command named SIZE not always present in  all  ftp\n" -"          servers! Upload resume is for FTP only.  HTTP resume is\n" -"          only possible with HTTP/1.1 or later servers.\n" -"\n" -"     -C/--continue-at <offset>\n" -"          Continue/Resume a previous file transfer at  the  given\n" -"          offset.  The  given offset is the exact number of bytes\n" -"          that will be skipped counted from the beginning of  the\n" -"          source file before it is transferred to the destination.\n" -"          If used with uploads, the ftp server command SIZE  will\n" -"          not  be  used  by  curl. Upload resume is for FTP only.\n" -"          HTTP resume is only possible  with  HTTP/1.1  or  later\n" -"          servers.\n" -"\n" -"          If  this  option  is  used several times, the last one\n" -"          will be used.\n" -"\n" -"     -d/--data <data>\n" -"          (HTTP) Sends the specified data in a  POST  request  to\n" -"          the  HTTP server. Note that the data is sent exactly as\n" -"          specified with no extra processing (with  all  newlines\n" -"          cut  off).   The  data is expected to be \"url-encoded\".\n" -"          This will cause curl to pass the  data  to  the  server\n" -"          using  the  content-type  application/x-www-form-urlen\n" -"          coded. Compare to -F. If more than one -d/--data option\n" -"          is used on the same command line, the data pieces spec\n" -"          ified will be merged together with a separating  &-let\n" -"          ter.  Thus, using '-d name=daniel -d skill=lousy' would\n" -"          generate a post chunk that looks like\n" -"          'name=daniel&skill=lousy'.\n" -"\n" -"          If you start the data with the letter @, the rest\n" -"          should be a file name to read the data from, or - if\n" -"          you want curl to read the data from stdin.  The con\n" -"          tents of the file must already be url-encoded. 
Multiple\n" -"          files can also be specified.\n" -"\n" -"          To post data purely binary, you should instead use the\n" -"          --data-binary option.\n" -"\n" -"          -d/--data is the same as --data-ascii.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     --data-ascii <data>\n" -"          (HTTP) This is an alias for the -d/--data option.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     --data-binary <data>\n" -"          (HTTP) This posts data in a similar manner as --data-\n" -"          ascii does, although when using this option the entire\n" -"          context of the posted data is kept as-is. If you want\n" -"          to post a binary file without the strip-newlines fea\n" -"          ture of the --data-ascii option, this is for you.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -D/--dump-header <file>\n" -"          (HTTP/FTP) Write the HTTP headers to this file. Write\n" -"          the FTP file info to this file if -I/--head is used.\n" -"\n" -"          This option is handy to use when you want to store the\n" -"          cookies that a HTTP site sends to you. The cookies\n" -"          could then be read in a second curl invoke by using the\n" -"          -b/--cookie option!\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -e/--referer <URL>\n" -"          (HTTP) Sends the \"Referer Page\" information to the HTTP\n" -"          server. This can also be set with the -H/--header flag\n" -"          of course.  When used with -L/--location you can append\n" -"          \";auto\" to the referer URL to make curl automatically\n" -"          set the previous URL when it follows a Location:\n" -"          header. 
The \";auto\" string can be used alone, even if\n" -"          you don't set an initial referer.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -E/--cert <certificate[:password]>\n" -"          (HTTPS) Tells curl to use the specified certificate\n" -"          file when getting a file with HTTPS. The certificate\n" -"          must be in PEM format.  If the optional password isn't\n" -"          specified, it will be queried for on the terminal. Note\n" -"          that this certificate is the private key and the pri\n" -"          vate certificate concatenated!\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     --cacert <CA certificate>\n" -"          (HTTPS) Tells curl to use the specified certificate\n" -"          file to verify the peer. The certificate must be in PEM\n" -"          format.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -f/--fail\n" -"          (HTTP) Fail silently (no output at all) on server\n" -"          errors. This is mostly done like this to better enable\n" -"          scripts etc to better deal with failed attempts. In\n" -"          normal cases when a HTTP server fails to deliver a doc\n" -"          ument, it returns a HTML document stating so (which\n" -"          often also describes why and more). This flag will\n" -"          prevent curl from outputting that and fail silently\n" -"          instead.\n" -"\n" -"          If this option is used twice, the second will again\n" -"          disable silent failure.\n" -"\n" -"     -F/--form <name=content>\n" -"          (HTTP) This lets curl emulate a filled in form in which\n" -"          a user has pressed the submit button. This causes curl\n" -"          to POST data using the content-type multipart/form-data\n" -"          according to RFC1867. 
This enables uploading of binary\n" -"          files etc. To force the 'content' part to be a file,\n" -"          prefix the file name with an @ sign. To just get the\n" -"          content part from a file, prefix the file name with the\n" -"          letter <. The difference between @ and < is then that @\n" -"          makes a file get attached in the post as a file upload,\n" -"          while the < makes a text field and just get the con\n" -"          tents for that text field from a file.\n" -"\n" -"          Example, to send your password file to the server,\n" -"          where 'password' is the name of the form-field to\n" -"          which /etc/passwd will be the input:\n" -"\n" -"          curl -F password=@/etc/passwd www.mypasswords.com\n" -"\n" -"          To read the file's content from stdin instead of a file,\n" -"          use - where the file name should've been. This goes for\n" -); - puts( -"          both @ and < constructs.\n" -"\n" -"          This option can be used multiple times.\n" -"\n" -"     -h/--help\n" -"          Usage help.\n" -"\n" -"     -H/--header <header>\n" -"          (HTTP) Extra header to use when getting a web page. You\n" -"          may specify any number of extra headers. Note that if\n" -"          you should add a custom header that has the same name\n" -"          as one of the internal ones curl would use, your exter\n" -"          nally set header will be used instead of the internal\n" -"          one. This allows you to make even trickier stuff than\n" -"          curl would normally do. You should not replace inter\n" -"          nally set headers without knowing perfectly well what\n" -"          you're doing. Replacing an internal header with one\n" -"          without content on the right side of the colon will\n" -"          prevent that header from appearing.\n" -"\n" -"          This option can be used multiple times.\n" -"\n" -"     -i/--include\n" -"          (HTTP) Include the HTTP-header in the output. 
The HTTP-\n" -"          header includes things like server-name, date of the\n" -"          document, HTTP-version and more...\n" -"          If this option is used twice, the second will again\n" -"          disable header include.\n" -"\n" -"     --interface <name>\n" -"          Perform an operation using a specified interface. You\n" -"          can enter interface name, IP address or host name. An\n" -"          example could look like:\n" -"\n" -"          curl --interface eth0:1 http://www.netscape.com/\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -I/--head\n" -"          (HTTP/FTP) Fetch the HTTP-header only! HTTP-servers\n" -"          feature the command HEAD which this uses to get nothing\n" -"          but the header of a document. When used on a FTP file,\n" -"          curl displays the file size only.\n" -"\n" -"          If this option is used twice, the second will again\n" -"          disable header only.\n" -"\n" -"     --krb4 <level>\n" -"          (FTP) Enable kerberos4 authentication and use. The\n" -"          level must be entered and should be one of 'clear',\n" -"          'safe', 'confidential' or 'private'. Should you use a\n" -"          level that is not one of these, 'private' will instead\n" -"          be used.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -K/--config <config file>\n" -"          Specify which config file to read curl arguments from.\n" -"          The config file is a text file in which command line\n" -"          arguments can be written which then will be used as if\n" -"          they were written on the actual command line. Options\n" -"          and their parameters must be specified on the same con\n" -"          fig file line. 
If the parameter is to contain white\n" -"          spaces, the parameter must be inclosed within quotes.\n" -"          If the first column of a config line is a '#' charac\n" -"          ter, the rest of the line will be treated as a comment.\n" -"\n" -"          Specify the filename as '-' to make curl read the file\n" -"          from stdin.\n" -"\n" -"          This option can be used multiple times.\n" -"\n" -"     -l/--list-only\n" -"          (FTP) When listing an FTP directory, this switch forces\n" -"          a name-only view.  Especially useful if you want to\n" -"          machine-parse the contents of an FTP directory since\n" -"          the normal directory view doesn't use a standard look\n" -"          or format.\n" -"\n" -"          If this option is used twice, the second will again\n" -"          disable list only.\n" -"\n" -"     -L/--location\n" -"          (HTTP/HTTPS) If the server reports that the requested\n" -"          page has a different location (indicated with the\n" -"          header line Location:) this flag will let curl attempt\n" -"          to reattempt the get on the new place. If used together\n" -"          with -i or -I, headers from all requested pages will be\n" -"          shown. If this flag is used when making a HTTP POST,\n" -"          curl will automatically switch to GET after the initial\n" -"          POST has been done.\n" -"\n" -"          If this option is used twice, the second will again\n" -"          disable location following.\n" -"\n" -"     -m/--max-time <seconds>\n" -"          Maximum time in seconds that you allow the whole opera\n" -"          tion to take.  This is useful for preventing your batch\n" -"          jobs from hanging for hours due to slow networks or\n" -"          links going down.  
This doesn't work fully in win32\n" -"          systems.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -M/--manual\n" -"          Manual. Display the huge help text.\n" -"\n" -"     -n/--netrc\n" -"          Makes curl scan the .netrc file in the user's home\n" -"          directory for login name and password. This is typi\n" -"          cally used for ftp on unix. If used with http, curl\n" -"          will enable user authentication. See netrc(4) for\n" -"          details on the file format. Curl will not complain if\n" -"          that file hasn't the right permissions (it should not\n" -"          be world nor group readable). The environment variable\n" -"          \"HOME\" is used to find the home directory.\n" -"\n" -"          A quick and very simple example of how to setup a\n" -"          .netrc to allow curl to ftp to the machine\n" -"          host.domain.com with user name\n" -"\n" -"          machine host.domain.com login myself password secret\n" -"\n" -"          If this option is used twice, the second will again\n" -"          disable netrc usage.\n" -"\n" -"     -N/--no-buffer\n" -"          Disables the buffering of the output stream. In normal\n" -"          work situations, curl will use a standard buffered out\n" -"          put stream that will have the effect that it will out\n" -"          put the data in chunks, not necessarily exactly when\n" -"          the data arrives.  Using this option will disable that\n" -"          buffering.\n" -"\n" -"          If this option is used twice, the second will again\n" -"          switch on buffering.\n" -"\n" -"     -o/--output <file>\n" -"          Write output to <file> instead of stdout. If you are\n" -"          using {} or [] to fetch multiple documents, you can use\n" -"          '#' followed by a number in the <file> specifier. 
That\n" -"          variable will be replaced with the current string for\n" -"          the URL being fetched. Like in:\n" -"\n" -"            curl http://{one,two}.site.com -o \"file_#1.txt\"\n" -"\n" -"          or use several variables like:\n" -"\n" -"            curl http://{site,host}.host[1-5].com -o \"#1_#2\"\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -O/--remote-name\n" -"          Write output to a local file named like the remote file\n" -"          we get. (Only the file part of the remote file is used,\n" -"          the path is cut off.)\n" -"\n" -"     -p/--proxytunnel\n" -"          When an HTTP proxy is used, this option will cause non-\n" -"          HTTP protocols to attempt to tunnel through the proxy\n" -"          instead of merely using it to do HTTP-like operations.\n" -"          The tunnel approach is made with the HTTP proxy CONNECT\n" -"          request and requires that the proxy allows direct con\n" -"          nect to the remote port number curl wants to tunnel\n" -"          through to.\n" -"\n" -"          If this option is used twice, the second will again\n" -"          disable proxy tunnel.\n" -"\n" -"     -P/--ftpport <address>\n" -"          (FTP) Reverses the initiator/listener roles when con\n" -"          necting with ftp. This switch makes Curl use the PORT\n" -"          command instead of PASV. In practice, PORT tells the\n" -"          server to connect to the client's specified address and\n" -"          port, while PASV asks the server for an ip address and\n" -"          port to connect to. 
<address> should be one of:\n" -"          interface   i.e \"eth0\" to specify which interface's IP\n" -"                      address you want to use  (Unix only)\n" -"\n" -"          IP address  i.e \"192.168.10.1\" to specify exact IP num\n" -"                      ber\n" -"\n" -"          host name   i.e \"my.host.domain\" to specify machine\n" -"\n" -"          -           (any single-letter string) to make it pick\n" -"                      the machine's default\n" -"\n" -"     If this option is used several times, the last one will be\n" -"     used.\n" -"\n" -"     -q   If used as the first parameter on the command line, the\n" -"          $HOME/.curlrc file will not be read and used as a con\n" -"          fig file.\n" -"\n" -"     -Q/--quote <command>\n" -"          (FTP) Send an arbitrary command to the remote FTP\n" -"          server, by using the QUOTE command of the server. Not\n" -"          all servers support this command, and the set of QUOTE\n" -"          commands are server specific! Quote commands are sent\n" -"          BEFORE the transfer is taking place. To make commands\n" -"          take place after a successful transfer, prefix them\n" -"          with a dash '-'. You may specify any amount of commands\n" -"          to be run before and after the transfer. If the server\n" -"          returns failure for one of the commands, the entire\n" -"          operation will be aborted.\n" -"\n" -"          This option can be used multiple times.\n" -"\n" -"     -r/--range <range>\n" -"          (HTTP/FTP) Retrieve a byte range (i.e a partial docu\n" -"          ment) from a HTTP/1.1 or FTP server. 
Ranges can be\n" -"          specified in a number of ways.\n" -"\n" -"          0-499     specifies the first 500 bytes\n" -"\n" -"          500-999   specifies the second 500 bytes\n" -"\n" -"          -500      specifies the last 500 bytes\n" -"\n" -"          9500      specifies the bytes from offset 9500 and for\n" -"                    ward\n" -"\n" -"          0-0,-1    specifies the first and last byte only(*)(H)\n" -"\n" -"          500-700,600-799\n" -"                    specifies 300 bytes from offset 500(H)\n" -"\n" -"          100-199,500-599\n" -"                    specifies two separate 100 bytes ranges(*)(H)\n" -"\n" -"     (*) = NOTE that this will cause the server to reply with a\n" -"     multipart response!\n" -"\n" -"     You should also be aware that many HTTP/1.1 servers do not\n" -"     have this feature enabled, so that when you attempt to get a\n" -"     range, you'll instead get the whole document.\n" -"\n" -"     FTP range downloads only support the simple syntax 'start-\n" -"     stop' (optionally with one of the numbers omitted). It\n" -"     depends on the non-RFC command SIZE.\n" -"\n" -"     If this option is used serveral times, the last one will be\n" -"     used.\n" -"\n" -"     -s/--silent\n" -"          Silent mode. Don't show progress meter or error mes\n" -"          sages.  Makes Curl mute.\n" -"\n" -"          If this option is used twice, the second will again\n" -"          disable mute.\n" -"\n" -"     -S/--show-error\n" -"          When used with -s it makes curl show error message if\n" -"          it fails.\n" -"\n" -"          If this option is used twice, the second will again\n" -); - puts( -"          disable show error.\n" -"\n" -"     -t/--upload\n" -"          Deprecated. Use '-T -' instead.  Transfer the stdin\n" -"          data to the specified file. 
Curl will read everything\n" -"          from stdin until EOF and store with the supplied name.\n" -"          If this is used on a http(s) server, the PUT command\n" -"          will be used.\n" -"\n" -"     -T/--upload-file <file>\n" -"          Like -t, but this transfers the specified local file.\n" -"          If there is no file part in the specified URL, Curl\n" -"          will append the local file name. NOTE that you must use\n" -"          a trailing / on the last directory to really prove to\n" -"          Curl that there is no file name or curl will think that\n" -"          your last directory name is the remote file name to\n" -"          use. That will most likely cause the upload operation\n" -"          to fail. If this is used on a http(s) server, the PUT\n" -"          command will be used.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -u/--user <user:password>\n" -"          Specify user and password to use when fetching. See\n" -"          README.curl for detailed examples of how to use this.\n" -"          If no password is specified, curl will ask for it\n" -"          interactively.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -U/--proxy-user <user:password>\n" -"          Specify user and password to use for Proxy authentica\n" -"          tion. If no password is specified, curl will ask for it\n" -"          interactively.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     --url <URL>\n" -"          Set the URL to fetch. This option is mostly handy when\n" -"          you wanna specify URL in a config file.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -v/--verbose\n" -"          Makes the fetching more verbose/talkative. 
Mostly\n" -"          usable for debugging. Lines starting with '>' means\n" -"          data sent by curl, '<' means data received by curl that\n" -"          is hidden in normal cases and lines starting with '*'\n" -"          means additional info provided by curl.\n" -"\n" -"          If this option is used twice, the second will again\n" -"          disable verbose.\n" -"\n" -"     -V/--version\n" -"          Displays the full version of curl, libcurl and other\n" -"          3rd party libraries linked with the executable.\n" -"\n" -"     -w/--write-out <format>\n" -"          Defines what to display after a completed and success\n" -"          ful operation. The format is a string that may contain\n" -"          plain text mixed with any number of variables. The\n" -"          string can be specified as \"string\", to get read from a\n" -"          particular file you specify it \"@filename\" and to tell\n" -"          curl to read the format from stdin you write \"@-\".\n" -"\n" -"          The variables present in the output format will be sub\n" -"          stituted by the value or text that curl thinks fit, as\n" -"          described below. All variables are specified like\n" -"          %{variable_name} and to output a normal % you just\n" -"          write them like %%. You can output a newline by using\n" -"          \\n, a carriage return with \\r and a tab space with \\t.\n" -"          NOTE: The %-letter is a special letter in the\n" -"          win32-environment, where all occurrences of % must be\n" -"          doubled when using this option.\n" -"\n" -"          Available variables are at this point:\n" -"\n" -"          url_effective  The URL that was fetched last. 
This is\n" -"                         mostly meaningful if you've told curl to\n" -"                         follow location: headers.\n" -"\n" -"          http_code      The numerical code that was found in the\n" -"                         last retrieved HTTP(S) page.\n" -"\n" -"          time_total     The total time, in seconds, that the\n" -"                         full operation lasted. The time will be\n" -"                         displayed with millisecond resolution.\n" -"\n" -"          time_namelookup\n" -"                         The time, in seconds, it took from the\n" -"                         start until the name resolving was com\n" -"                         pleted.\n" -"\n" -"          time_connect   The time, in seconds, it took from the\n" -"                         start until the connect to the remote\n" -"                         host (or proxy) was completed.\n" -"\n" -"          time_pretransfer\n" -"                         The time, in seconds, it took from the\n" -"                         start until the file transfer is just\n" -"                         about to begin. 
This includes all pre-\n" -"                         transfer commands and negotiations that\n" -"                         are specific to the particular proto\n" -"                         col(s) involved.\n" -"\n" -"          size_download  The total amount of bytes that were\n" -"                         downloaded.\n" -"\n" -"          size_upload    The total amount of bytes that were\n" -"                         uploaded.\n" -"\n" -"          size_header    The total amount of bytes of the down\n" -"                         loaded headers.\n" -"\n" -"          size_request   The total amount of bytes that were sent\n" -"                         in the HTTP request.\n" -"\n" -"          speed_download The average download speed that curl\n" -"                         measured for the complete download.\n" -"\n" -"          speed_upload   The average upload speed that curl mea\n" -"                         sured for the complete upload.\n" -"     If this option is used serveral times, the last one will be\n" -"     used.\n" -"\n" -"     -x/--proxy <proxyhost[:port]>\n" -"          Use specified proxy. If the port number is not speci\n" -"          fied, it is assumed at port 1080.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -X/--request <command>\n" -"          (HTTP) Specifies a custom request to use when communi\n" -"          cating with the HTTP server.  The specified request\n" -"          will be used instead of the standard GET. 
Read the HTTP\n" -"          1.1 specification for details and explanations.\n" -"\n" -"          (FTP) Specifies a custom FTP command to use instead of\n" -"          LIST when doing file lists with ftp.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -y/--speed-time <time>\n" -"          If a download is slower than speed-limit bytes per sec\n" -"          ond during a speed-time period, the download gets\n" -"          aborted. If speed-time is used, the default speed-limit\n" -"          will be 1 unless set with -y.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -Y/--speed-limit <speed>\n" -"          If a download is slower than this given speed, in bytes\n" -"          per second, for speed-time seconds it gets aborted.\n" -"          speed-time is set with -Y and is 30 if not set.\n" -"\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -z/--time-cond <date expression>\n" -"          (HTTP) Request to get a file that has been modified\n" -"          later than the given time and date, or one that has\n" -"          been modified before that time. The date expression can\n" -"          be all sorts of date strings or if it doesn't match any\n" -"          internal ones, it tries to get the time from a given\n" -"          file name instead! 
See the GNU date(1) or curl_get\n" -"          date(3) man pages for date expression details.\n" -"\n" -"          Start the date expression with a dash (-) to make it\n" -"          request for a document that is older than the given\n" -"          date/time, default is a document that is newer than the\n" -"          specified date/time.\n" -"          If this option is used serveral times, the last one\n" -"          will be used.\n" -"\n" -"     -3/--sslv3\n" -"          (HTTPS) Forces curl to use SSL version 3 when negotiat\n" -"          ing with a remote SSL server.\n" -"\n" -"     -2/--sslv2\n" -"          (HTTPS) Forces curl to use SSL version 2 when negotiat\n" -"          ing with a remote SSL server.\n" -"\n" -"     -#/--progress-bar\n" -"          Make curl display progress information as a progress\n" -"          bar instead of the default statistics.\n" -"\n" -"          If this option is used twice, the second will again\n" -"          disable the progress bar.\n" -"\n" -"     --crlf\n" -"          (FTP) Convert LF to CRLF in upload. Useful for MVS\n" -"          (OS/390).\n" -"\n" -"          If this option is used twice, the second will again\n" -"          disable crlf converting.\n" -"\n" -"     --stderr <file>\n" -"          Redirect all writes to stderr to the specified file\n" -"          instead. If the file name is a plain '-', it is instead\n" -"          written to stdout. 
This option has no point when you're\n"
-"          using a shell with decent redirecting capabilities.\n"
-"\n"
-"          If this option is used several times, the last one\n"
-"          will be used.\n"
-"\n"
-"FILES\n"
-"     ~/.curlrc\n"
-"          Default config file.\n"
-"\n"
-"ENVIRONMENT\n"
-"     HTTP_PROXY [protocol://]<host>[:port]\n"
-"          Sets proxy server to use for HTTP.\n"
-"\n"
-"     HTTPS_PROXY [protocol://]<host>[:port]\n"
-"          Sets proxy server to use for HTTPS.\n"
-"\n"
-"     FTP_PROXY [protocol://]<host>[:port]\n"
-"          Sets proxy server to use for FTP.\n"
-"\n"
-"     GOPHER_PROXY [protocol://]<host>[:port]\n"
-"          Sets proxy server to use for GOPHER.\n"
-"\n"
-"     ALL_PROXY [protocol://]<host>[:port]\n"
-"          Sets proxy server to use if no protocol-specific proxy\n"
-"          is set.\n"
-"\n"
-"     NO_PROXY <comma-separated list of hosts>\n"
-"          List of host names that shouldn't go through any proxy.\n"
-"          If set to an asterisk '*' only, it matches all hosts.\n"
-"\n"
-"     COLUMNS <integer>\n"
-"          The width of the terminal.  This variable only affects\n"
-"          curl when the --progress-bar option is used.\n"
-"\n"
-"EXIT CODES\n"
-"     A number of different error codes, with corresponding error\n"
-"     messages, may appear during bad conditions. At the time of\n"
-"     this writing, the exit codes are:\n"
-"\n"
-"     1    Unsupported protocol. This build of curl has no support\n"
-"          for this protocol.\n"
-"\n"
-"     2    Failed to initialize.\n"
-"\n"
-"     3    URL malformat. The syntax was not correct.\n"
-"\n"
-"     4    URL user malformatted. The user-part of the URL syntax\n"
-"          was not correct.\n"
-"\n"
-"     5    Couldn't resolve proxy. The given proxy host could not\n"
-"          be resolved.\n"
-"\n"
-"     6    Couldn't resolve host. 
The given remote host was not\n"
-"          resolved.\n"
-"\n"
-"     7    Failed to connect to host.\n"
-"\n"
-"     8    FTP weird server reply. The server sent data curl\n"
-"          couldn't parse.\n"
-"\n"
-"     9    FTP access denied. The server denied login.\n"
-"\n"
-"     10   FTP user/password incorrect. Either one or both were\n"
-);
- puts(
-"          not accepted by the server.\n"
-"\n"
-"     11   FTP weird PASS reply. Curl couldn't parse the reply\n"
-"          sent to the PASS request.\n"
-"\n"
-"     12   FTP weird USER reply. Curl couldn't parse the reply\n"
-"          sent to the USER request.\n"
-"\n"
-"     13   FTP weird PASV reply. Curl couldn't parse the reply\n"
-"          sent to the PASV request.\n"
-"\n"
-"     14   FTP weird 227 format. Curl couldn't parse the 227-line\n"
-"          the server sent.\n"
-"\n"
-"     15   FTP can't get host. Couldn't resolve the host IP we got\n"
-"          in the 227-line.\n"
-"\n"
-"     16   FTP can't reconnect. Couldn't connect to the host we\n"
-"          got in the 227-line.\n"
-"\n"
-"     17   FTP couldn't set binary. Couldn't change transfer\n"
-"          method to binary.\n"
-"\n"
-"     18   Partial file. Only a part of the file was transferred.\n"
-"\n"
-"     19   FTP couldn't RETR file. The RETR command failed.\n"
-"\n"
-"     20   FTP write error. The transfer was reported bad by the\n"
-"          server.\n"
-"\n"
-"     21   FTP quote error. A quote command returned an error from\n"
-"          the server.\n"
-"\n"
-"     22   HTTP not found. The requested page was not found. This\n"
-"          return code only appears if --fail is used.\n"
-"\n"
-"     23   Write error. Curl couldn't write data to a local\n"
-"          filesystem or similar.\n"
-"\n"
-"     24   Malformat user. User name badly specified.\n"
-"\n"
-"     25   FTP couldn't STOR file. The server denied the STOR\n"
-"          operation.\n"
-"\n"
-"     26   Read error. 
Various reading problems.\n"
-"\n"
-"     27   Out of memory. A memory allocation request failed.\n"
-"\n"
-"     28   Operation timeout. The specified time-out period was\n"
-"          reached according to the conditions.\n"
-"\n"
-"     29   FTP couldn't set ASCII. The server returned an unknown\n"
-"          reply.\n"
-"\n"
-"     30   FTP PORT failed. The PORT command failed.\n"
-"\n"
-"     31   FTP couldn't use REST. The REST command failed.\n"
-"\n"
-"     32   FTP couldn't use SIZE. The SIZE command failed. The\n"
-"          command is an extension to the original FTP spec RFC\n"
-"          959.\n"
-"\n"
-"     33   HTTP range error. The range \"command\" didn't work.\n"
-"\n"
-"     34   HTTP post error. Internal post-request generation\n"
-"          error.\n"
-"\n"
-"     35   SSL connect error. The SSL handshaking failed.\n"
-"\n"
-"     36   FTP bad download resume. Couldn't continue an earlier\n"
-"          aborted download.\n"
-"\n"
-"     37   FILE couldn't read file. Failed to open the file.\n"
-"          Permissions?\n"
-"\n"
-"     38   LDAP cannot bind. LDAP bind operation failed.\n"
-"\n"
-"     39   LDAP search failed.\n"
-"\n"
-"     40   Library not found. The LDAP library was not found.\n"
-"\n"
-"     41   Function not found. A required LDAP function was not\n"
-"          found.\n"
-"\n"
-"     42   Aborted by callback. An application told curl to abort\n"
-"          the operation.\n"
-"\n"
-"     43   Internal error. A function was called with a bad\n"
-"          parameter.\n"
-"\n"
-"     44   Internal error. A function was called in a bad order.\n"
-"\n"
-"     45   Interface error. A specified outgoing interface could\n"
-"          not be used.\n"
-"\n"
-"     46   Bad password entered. An error was signalled when the\n"
-"          password was entered.\n"
-"\n"
-"     47   Too many redirects. 
When following redirects, curl hit\n" -"          the maximum amount.\n" -"\n" -"     XX   There will appear more error codes here in future\n" -"          releases. The existing ones are meant to never change.\n" -"\n" -"BUGS\n" -"     If you do find bugs, mail them to curl-bug@haxx.se.\n" -"\n" -"AUTHORS / CONTRIBUTORS\n" -"      - Daniel Stenberg <Daniel.Stenberg@haxx.se>\n" -"      - Rafael Sagula <sagula@inf.ufrgs.br>\n" -"      - Sampo Kellomaki <sampo@iki.fi>\n" -"      - Linas Vepstas <linas@linas.org>\n" -"      - Bjorn Reese <breese@mail1.stofanet.dk>\n" -"      - Johan Anderson <johan@homemail.com>\n" -"      - Kjell Ericson <Kjell.Ericson@haxx.se>\n" -"      - Troy Engel <tengel@sonic.net>\n" -"      - Ryan Nelson <ryan@inch.com>\n" -"      - Björn Stenberg <Bjorn.Stenberg@haxx.se>\n" -"      - Angus Mackay <amackay@gus.ml.org>\n" -"      - Eric Young <eay@cryptsoft.com>\n" -"      - Simon Dick <simond@totally.irrelevant.org>\n" -"      - Oren Tirosh <oren@monty.hishome.net>\n" -"      - Steven G. Johnson <stevenj@alum.mit.edu>\n" -"      - Gilbert Ramirez Jr. <gram@verdict.uthscsa.edu>\n" -"      - Andrés García <ornalux@redestb.es>\n" -"      - Douglas E. Wegscheid <wegscd@whirlpool.com>\n" -"      - Mark Butler <butlerm@xmission.com>\n" -"      - Eric Thelin <eric@generation-i.com>\n" -"      - Marc Boucher <marc@mbsi.ca>\n" -"      - Greg Onufer <Greg.Onufer@Eng.Sun.COM>\n" -"      - Doug Kaufman <dkaufman@rahul.net>\n" -"      - David Eriksson <david@2good.com>\n" -"      - Ralph Beckmann <rabe@uni-paderborn.de>\n" -"      - T. Yamada <tai@imasy.or.jp>\n" -"      - Lars J. 
Aas <larsa@sim.no>\n"
-"      - Jörn Hartroth <Joern.Hartroth@computer.org>\n"
-"      - Matthew Clarke <clamat@van.maves.ca>\n"
-"      - Linus Nielsen <Linus.Nielsen@haxx.se>\n"
-"      - Felix von Leitner <felix@convergence.de>\n"
-"      - Dan Zitter <dzitter@zitter.net>\n"
-"      - Jongki Suwandi <Jongki.Suwandi@eng.sun.com>\n"
-"      - Chris Maltby <chris@aurema.com>\n"
-"      - Ron Zapp <rzapper@yahoo.com>\n"
-"      - Paul Marquis <pmarquis@iname.com>\n"
-"      - Ellis Pritchard <ellis@citria.com>\n"
-"      - Damien Adant <dams@usa.net>\n"
-"      - Chris <cbayliss@csc.come>\n"
-"      - Marco G. Salvagno <mgs@whiz.cjb.net>\n"
-"      - David LeBlanc <dleblanc@qnx.com>\n"
-"      - Rich Gray at Plus Technologies\n"
-"      - Luong Dinh Dung <u8luong@lhsystems.hu>\n"
-"      - Torsten Foertsch <torsten.foertsch@gmx.net>\n"
-"      - Kristian Köhntopp <kris@koehntopp.de>\n"
-"      - Fred Noz <FNoz@siac.com>\n"
-"      - Caolan McNamara <caolan@csn.ul.ie>\n"
-"      - Albert Chin-A-Young <china@thewrittenword.com>\n"
-"      - Stephen Kick <skick@epicrealm.com>\n"
-"      - Martin Hedenfalk <mhe@stacken.kth.se>\n"
-"      - Richard Prescott\n"
-"      - Jason S. Priebe <priebe@wral-tv.com>\n"
-"      - T. 
Bharath <TBharath@responsenetworks.com>\n" -"      - Alexander Kourakos <awk@users.sourceforge.net>\n" -"      - James Griffiths <griffiths_james@yahoo.com>\n" -"\n" -"WWW\n" -"     http://curl.haxx.se\n" -"FTP\n" -"     ftp://ftp.sunet.se/pub/www/utilities/curl/\n" -"\n" -"SEE ALSO\n" -"     ftp(1), wget(1), snarf(1)\n" -"\n" -"LATEST VERSION\n" -"\n" -"  You always find news about what's going on as well as the latest versions\n" -"  from the curl web pages, located at:\n" -"\n" -"        http://curl.haxx.se\n" -"\n" -"SIMPLE USAGE\n" -"\n" -"  Get the main page from netscape's web-server:\n" -"\n" -"        curl http://www.netscape.com/\n" -"\n" -"  Get the root README file from funet's ftp-server:\n" -"\n" -"        curl ftp://ftp.funet.fi/README\n" -"\n" -"  Get a gopher document from funet's gopher server:\n" -"\n" -"        curl gopher://gopher.funet.fi\n" -"\n" -"  Get a web page from a server using port 8000:\n" -"\n" -"        curl http://www.weirdserver.com:8000/\n" -"\n" -"  Get a list of the root directory of an FTP site:\n" -"\n" -"        curl ftp://ftp.fts.frontec.se/\n" -"\n" -"  Get the definition of curl from a dictionary:\n" -"\n" -"        curl dict://dict.org/m:curl\n" -"\n" -"DOWNLOAD TO A FILE\n" -"\n" -"  Get a web page and store in a local file:\n" -"\n" -"        curl -o thatpage.html http://www.netscape.com/\n" -"\n" -"  Get a web page and store in a local file, make the local file get the name\n" -"  of the remote document (if no file name part is specified in the URL, this\n" -"  will fail):\n" -"\n" -"        curl -O http://www.netscape.com/index.html\n" -"\n" -"USING PASSWORDS\n" -"\n" -" FTP\n" -"\n" -"   To ftp files using name+passwd, include them in the URL like:\n" -"\n" -"        curl ftp://name:passwd@machine.domain:port/full/path/to/file\n" -"\n" -"   or specify them with the -u flag like\n" -"\n" -"        curl -u name:passwd ftp://machine.domain:port/full/path/to/file\n" -"\n" -" HTTP\n" -"\n" -"   The HTTP URL doesn't 
support user and password in the URL string. Curl\n"
-"   does support that anyway to provide an ftp-style interface and thus you can\n"
-"   pick a file like:\n"
-"\n"
-"        curl http://name:passwd@machine.domain/full/path/to/file\n"
-"\n"
-"   or specify user and password separately like in\n"
-"\n"
-"        curl -u name:passwd http://machine.domain/full/path/to/file\n"
-"\n"
-"   NOTE! Since HTTP URLs don't support user and password, you can't use that\n"
-"   style when using Curl via a proxy. You _must_ use the -u style fetch\n"
-"   in such circumstances.\n"
-"\n"
-" HTTPS\n"
-"\n"
-"   Probably most commonly used with private certificates, as explained below.\n"
-"\n"
-" GOPHER\n"
-"\n"
-"   Curl features no password support for gopher.\n"
-"\n"
-"PROXY\n"
-"\n"
-" Get an ftp file using a proxy named my-proxy that uses port 888:\n"
-"\n"
-"        curl -x my-proxy:888 ftp://ftp.leachsite.com/README\n"
-"\n"
-" Get a file from an HTTP server that requires user and password, using the\n"
-" same proxy as above:\n"
-"\n"
-"        curl -u user:passwd -x my-proxy:888 http://www.get.this/\n"
-"\n"
-" Some proxies require special authentication. Specify by using -U as above:\n"
-"\n"
-"        curl -U user:passwd -x my-proxy:888 http://www.get.this/\n"
-"\n"
-" See also the environment variables Curl supports, which offer further proxy\n"
-" control.\n"
-"\n"
-"RANGES\n"
-"\n"
-"  With HTTP 1.1, byte-ranges were introduced. Using this, a client can request\n"
-"  to get only one or more subparts of a specified document. Curl supports\n"
-"  this with the -r flag.\n"
-"\n"
-"  Get the first 100 bytes of a document:\n"
-"\n"
-"        curl -r 0-99 http://www.get.this/\n"
-"\n"
-"  Get the last 500 bytes of a document:\n"
-"\n"
-"        curl -r -500 http://www.get.this/\n"
-"\n"
-"  Curl also supports simple ranges for FTP files. 
Then you can only\n" -"  specify start and stop position.\n" -"\n" -"  Get the first 100 bytes of a document using FTP:\n" -"\n" -"        curl -r 0-99 ftp://www.get.this/README  \n" -"\n" -"UPLOADING\n" -"\n" -" FTP\n" -"\n" -"  Upload all data on stdin to a specified ftp site:\n" -"\n" -"        curl -t ftp://ftp.upload.com/myfile\n" -"\n" -"  Upload data from a specified file, login with user and password:\n" -"\n" -"        curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile\n" -"\n" -"  Upload a local file to the remote site, and use the local file name remote\n" -"  too:\n" -" \n" -"        curl -T uploadfile -u user:passwd ftp://ftp.upload.com/\n" -"\n" -"  Upload a local file to get appended to the remote file using ftp:\n" -"\n" -"        curl -T localfile -a ftp://ftp.upload.com/remotefile\n" -"\n" -"  Curl also supports ftp upload through a proxy, but only if the proxy is\n" -"  configured to allow that kind of tunneling. If it does, you can run curl in\n" -"  a fashion similar to:\n" -"\n" -"        curl --proxytunnel -x proxy:port -T localfile ftp.upload.com\n" -"\n" -" HTTP\n" -"\n" -"  Upload all data on stdin to a specified http site:\n" -"\n" -"        curl -t http://www.upload.com/myfile\n" -"\n" -"  Note that the http server must've been configured to accept PUT before this\n" -"  can be done successfully.\n" -"\n" -"  For other ways to do http data upload, see the POST section below.\n" -"\n" -"VERBOSE / DEBUG\n" -"\n" -"  If curl fails where it isn't supposed to, if the servers don't let you\n" -"  in, if you can't understand the responses: use the -v flag to get VERBOSE\n" -"  fetching. Curl will output lots of info and all data it sends and\n" -"  receives in order to let the user see all client-server interaction.\n" -"\n" -"        curl -v ftp://ftp.upload.com/\n" -); - puts( -"\n" -"DETAILED INFORMATION\n" -"\n" -"  Different protocols provide different ways of getting detailed information\n" -"  about specific files/documents. 
To get curl to show detailed information\n"
-"  about a single file, you should use the -I/--head option. It displays all\n"
-"  available info on a single file for HTTP and FTP. The HTTP information is a\n"
-"  lot more extensive.\n"
-"\n"
-"  For HTTP, you can get the header information (the same as -I would show)\n"
-"  shown before the data by using -i/--include. Curl understands the\n"
-"  -D/--dump-header option when getting files from both FTP and HTTP, and it\n"
-"  will then store the headers in the specified file.\n"
-"\n"
-"  Store the HTTP headers in a separate file:\n"
-"\n"
-"        curl --dump-header headers.txt curl.haxx.se\n"
-"\n"
-"  Note that headers stored in a separate file can be very useful at a later\n"
-"  time if you want curl to use cookies sent by the server. More about that in\n"
-"  the cookies section.\n"
-"\n"
-"POST (HTTP)\n"
-"\n"
-"  It's easy to post data using curl. This is done using the -d <data>\n"
-"  option.  The post data must be urlencoded.\n"
-"\n"
-"  Post a simple \"name\" and \"phone\" guestbook.\n"
-"\n"
-"        curl -d \"name=Rafael%20Sagula&phone=3320780\" \\\n"
-"                http://www.where.com/guest.cgi\n"
-"\n"
-"  How to post a form with curl, lesson #1:\n"
-"\n"
-"  Dig out all the <input> tags in the form that you want to fill in. (There's\n"
-"  a perl program called formfind.pl on the curl site that helps with this).\n"
-"\n"
-"  If there's a \"normal\" post, you use -d to post. -d takes a full \"post\n"
-"  string\", which is in the format\n"
-"\n"
-"        <variable1>=<data1>&<variable2>=<data2>&...\n"
-"\n"
-"  The 'variable' names are the names set with \"name=\" in the <input> tags, and\n"
-"  the data is the contents you want to fill in for the inputs. The data *must*\n"
-"  be properly URL encoded. 
That means you replace space with + and that you\n"
-"  write weird letters with %XX where XX is the hexadecimal representation of\n"
-"  the letter's ASCII code.\n"
-"\n"
-"  Example:\n"
-"\n"
-"  (page located at http://www.formpost.com/getthis/)\n"
-"\n"
-"        <form action=\"post.cgi\" method=\"post\">\n"
-"        <input name=user size=10>\n"
-"        <input name=pass type=password size=10>\n"
-"        <input name=id type=hidden value=\"blablabla\">\n"
-"        <input name=ding value=\"submit\">\n"
-"        </form>\n"
-"\n"
-"  We want to enter user 'foobar' with password '12345'.\n"
-"\n"
-"  To post to this, you enter a curl command line like:\n"
-"\n"
-"        curl -d \"user=foobar&pass=12345&id=blablabla&ding=submit\"  (continues)\n"
-"          http://www.formpost.com/getthis/post.cgi\n"
-"\n"
-"\n"
-"  While -d uses the application/x-www-form-urlencoded mime-type, generally\n"
-"  understood by CGIs and similar, curl also supports the more capable\n"
-"  multipart/form-data type. This latter type supports things like file upload.\n"
-"\n"
-"  -F accepts parameters like -F \"name=contents\". If you want the contents to\n"
-"  be read from a file, use <@filename> as contents. When specifying a file,\n"
-"  you can also specify which content type the file is, by appending\n"
-"  ';type=<mime type>' to the file name. You can also post the contents of\n"
-"  several files in one field, so that the field name 'coolfiles' can be sent\n"
-"  three files with different content types in a manner similar to:\n"
-"\n"
-"        curl -F \"coolfiles=@fil1.gif;type=image/gif,fil2.txt,fil3.html\" \\\n"
-"        http://www.post.com/postit.cgi\n"
-"\n"
-"  If content-type is not specified, curl will try to guess from the extension\n"
-"  (it only knows a few), or use the previously specified type (from an earlier\n"
-"  file if several files are specified in a list) or finally use the default\n"
-"  type 'text/plain'.\n"
-"\n"
-"  Emulate a fill-in form with -F. 
Let's say you fill in three fields in a\n"
-"  form. One field is the file name to post, one field is your name and one\n"
-"  field is a file description. We want to post the file we have written named\n"
-"  \"cooltext.txt\". To let curl do the posting of this data instead of your\n"
-"  favourite browser, you have to check out the HTML of the form page to get to\n"
-"  know the names of the input fields. In our example, the input field names are\n"
-"  'file', 'yourname' and 'filedescription'.\n"
-"\n"
-"        curl -F \"file=@cooltext.txt\" -F \"yourname=Daniel\" \\\n"
-"             -F \"filedescription=Cool text file with cool text inside\" \\\n"
-"             http://www.post.com/postit.cgi\n"
-"\n"
-"  So, to send two files in one post you can do it in two ways:\n"
-"\n"
-"  1. Send multiple files in a single \"field\" with a single field name:\n"
-"\n"
-"        curl -F \"pictures=@dog.gif,cat.gif\"\n"
-"\n"
-"  2. Send two fields with two field names:\n"
-"\n"
-"        curl -F \"docpicture=@dog.gif\" -F \"catpicture=@cat.gif\"\n"
-"\n"
-"REFERER\n"
-"\n"
-"  An HTTP request can include information about the address that referred\n"
-"  to the actual page, and curl allows the user to specify that referrer on\n"
-"  the command line. It is especially useful to fool or trick stupid servers\n"
-"  or CGI scripts that rely on that information being available or containing\n"
-"  certain data.\n"
-"\n"
-"        curl -e www.coolsite.com http://www.showme.com/\n"
-"\n"
-"  NOTE: The referer field is defined in the HTTP spec to be a full URL.\n"
-"\n"
-"USER AGENT\n"
-"\n"
-"  An HTTP request can include information about the browser\n"
-"  that generated the request. Curl allows it to be specified on the command\n"
-"  line. 
It is especially useful to fool or trick stupid servers or CGI\n" -"  scripts that only accept certain browsers.\n" -"\n" -"  Example:\n" -"\n" -"  curl -A 'Mozilla/3.0 (Win95; I)' http://www.nationsbank.com/\n" -"\n" -"  Other common strings:\n" -"    'Mozilla/3.0 (Win95; I)'     Netscape Version 3 for Windows 95\n" -"    'Mozilla/3.04 (Win95; U)'    Netscape Version 3 for Windows 95\n" -"    'Mozilla/2.02 (OS/2; U)'     Netscape Version 2 for OS/2\n" -"    'Mozilla/4.04 [en] (X11; U; AIX 4.2; Nav)'           NS for AIX\n" -"    'Mozilla/4.05 [en] (X11; U; Linux 2.0.32 i586)'      NS for Linux\n" -"\n" -"  Note that Internet Explorer tries hard to be compatible in every way:\n" -"    'Mozilla/4.0 (compatible; MSIE 4.01; Windows 95)'    MSIE for W95\n" -"\n" -"  Mozilla is not the only possible User-Agent name:\n" -"    'Konqueror/1.0'             KDE File Manager desktop client\n" -"    'Lynx/2.7.1 libwww-FM/2.14' Lynx command line browser\n" -"\n" -"COOKIES\n" -"\n" -"  Cookies are generally used by web servers to keep state information at the\n" -"  client's side. The server sets cookies by sending a response line in the\n" -"  headers that looks like 'Set-Cookie: <data>' where the data part then\n" -"  typically contains a set of NAME=VALUE pairs (separated by semicolons ';'\n" -"  like \"NAME1=VALUE1; NAME2=VALUE2;\"). 
The server can also specify the\n"
-"  path the \"cookie\" should be used for (by specifying \"path=value\"), when the\n"
-"  cookie should expire (\"expires=DATE\"), what domain to use it for\n"
-"  (\"domain=NAME\") and if it should be used on secure connections only\n"
-"  (\"secure\").\n"
-"\n"
-"  If you've received a page from a server that contains a header like:\n"
-"        Set-Cookie: sessionid=boo123; path=\"/foo\";\n"
-"\n"
-"  it means the server wants that first pair passed on when we get anything in\n"
-"  a path beginning with \"/foo\".\n"
-"\n"
-"  Example, get a page that wants my name passed in a cookie:\n"
-"\n"
-"        curl -b \"name=Daniel\" www.sillypage.com\n"
-"\n"
-"  Curl also has the ability to use previously received cookies in subsequent\n"
-"  sessions. If you get cookies from a server and store them in a file in a\n"
-"  manner similar to:\n"
-"\n"
-"        curl --dump-header headers www.example.com\n"
-"\n"
-"  ... you can then in a second connection to that (or another) site, use the\n"
-"  cookies from the 'headers' file like:\n"
-"\n"
-"        curl -b headers www.example.com\n"
-"\n"
-"  Note that by specifying -b you enable the \"cookie awareness\" and with -L\n"
-"  you can make curl follow a Location: (which often is used in combination\n"
-"  with cookies). So if a site sends cookies and a location, you can\n"
-"  use a non-existing file to trigger the cookie awareness like:\n"
-"\n"
-"        curl -L -b empty-file www.example.com\n"
-"\n"
-"  The file to read cookies from must be formatted using plain HTTP headers OR\n"
-"  as Netscape's cookie file. Curl will determine what kind it is based on the\n"
-"  file contents.\n"
-"\n"
-"PROGRESS METER\n"
-"\n"
-"  The progress meter exists to show a user that something actually is\n"
-"  happening. 
The different fields in the output have the following meaning:\n"
-"\n"
-"  % Total    % Received % Xferd  Average Speed          Time             Curr.\n"
-"                                 Dload  Upload Total    Current  Left    Speed\n"
-"  0  151M    0 38608    0     0   9406      0  4:41:43  0:00:04  4:41:39  9287\n"
-"\n"
-"  From left-to-right:\n"
-"   %             - percentage completed of the whole transfer\n"
-"   Total         - total size of the whole expected transfer\n"
-"   %             - percentage completed of the download\n"
-"   Received      - currently downloaded amount of bytes\n"
-"   %             - percentage completed of the upload\n"
-"   Xferd         - currently uploaded amount of bytes\n"
-"   Average Speed\n"
-"   Dload         - the average transfer speed of the download\n"
-"   Average Speed\n"
-"   Upload        - the average transfer speed of the upload\n"
-"   Time Total    - expected time to complete the operation\n"
-"   Time Current  - time passed since invocation\n"
-"   Time Left     - expected time left to completion\n"
-"   Curr.Speed    - the average transfer speed over the last 5 seconds (the\n"
-"                   first 5 seconds of a transfer are based on less time, of\n"
-"                   course.)\n"
-"\n"
-"  The -# option will display a totally different progress bar that doesn't\n"
-"  need much explanation!\n"
-"\n"
-"SPEED LIMIT\n"
-"\n"
-"  Curl lets the user set conditions regarding transfer speed that must\n"
-"  be met to let the transfer keep going. 
By using the switches -y and -Y you\n"
-"  can make curl abort transfers if the transfer speed doesn't exceed your\n"
-"  given lowest limit for a specified time.\n"
-"\n"
-"  To let curl abandon downloading this page if it's slower than 3000 bytes per\n"
-"  second for 1 minute, run:\n"
-"\n"
-"        curl -Y 3000 -y 60 www.far-away-site.com\n"
-"\n"
-"  This can very well be used in combination with the overall time limit, so\n"
-"  that the above operation must be completed entirely within 30 minutes:\n"
-"\n"
-"        curl -m 1800 -Y 3000 -y 60 www.far-away-site.com\n"
-"\n"
-"CONFIG FILE\n"
-"\n"
-"  Curl automatically tries to read the .curlrc file (or _curlrc file on win32\n"
-"  systems) from the user's home dir on startup.\n"
-"\n"
-"  The config file can be made up of normal command line switches, but you\n"
-"  can also specify the long options without the dashes to make it more\n"
-"  readable. You can separate the options and the parameter with spaces, or\n"
-"  with = or :. Comments can be used within the file. If the first character\n"
-"  on a line is a '#', the rest of the line is treated as a comment.\n"
-);
- puts(
-"\n"
-"  If you want the parameter to contain spaces, you must enclose the entire\n"
-"  parameter within double quotes (\"). Within those quotes, you specify a\n"
-"  quote as \\\".\n"
-"\n"
-"  NOTE: You must specify options and their arguments on the same line.\n"
-"\n"
-"  Example, set default time out and proxy in a config file:\n"
-"\n"
-"        # We want a 30 minute timeout:\n"
-"        -m 1800\n"
-"        # ... 
and we use a proxy for all accesses:\n"
-"        proxy = proxy.our.domain.com:8080\n"
-"\n"
-"  White spaces ARE significant at the end of lines, but all white spaces\n"
-"  leading up to the first characters of each line are ignored.\n"
-"\n"
-"  Prevent curl from reading the default file by using -q as the first command\n"
-"  line parameter, like:\n"
-"\n"
-"        curl -q www.thatsite.com\n"
-"\n"
-"  Force curl to get and display a local help page in case it is invoked\n"
-"  without a URL by making a config file similar to:\n"
-"\n"
-"        # default url to get\n"
-"        url = \"http://help.with.curl.com/curlhelp.html\"\n"
-"\n"
-"  You can specify another config file to be read by using the -K/--config\n"
-"  flag. If you set the config file name to \"-\" it'll read the config from\n"
-"  stdin, which can be handy if you want to hide options from being visible\n"
-"  in process tables etc:\n"
-"\n"
-"        echo \"user = user:passwd\" | curl -K - http://that.secret.site.com\n"
-"\n"
-"EXTRA HEADERS\n"
-"\n"
-"  When using curl in your own very special programs, you may end up needing\n"
-"  to pass on your own custom headers when getting a web page. You can do\n"
-"  this by using the -H flag.\n"
-"\n"
-"  Example, send the header \"X-you-and-me: yes\" to the server when getting a\n"
-"  page:\n"
-"\n"
-"        curl -H \"X-you-and-me: yes\" www.love.com\n"
-"\n"
-"  This can also be useful in case you want curl to send a different text in\n"
-"  a header than it normally does. The -H header you specify then replaces the\n"
-"  header curl would normally send.\n"
-"\n"
-"FTP and PATH NAMES\n"
-"\n"
-"  Do note that when getting files with the ftp:// URL, the given path is\n"
-"  relative to the directory you enter. 
To get the file 'README' from your home\n"
-"  directory at your ftp site, do:\n"
-"\n"
-"        curl ftp://user:passwd@my.site.com/README\n"
-"\n"
-"  But if you want the README file from the root directory of that very same\n"
-"  site, you need to specify the absolute file name:\n"
-"\n"
-"        curl ftp://user:passwd@my.site.com//README\n"
-"\n"
-"  (I.e. with an extra slash in front of the file name.)\n"
-"\n"
-"FTP and firewalls\n"
-"\n"
-"  The FTP protocol requires one of the involved parties to open a second\n"
-"  connection as soon as data is about to get transferred. There are two ways\n"
-"  to do this.\n"
-"\n"
-"  The default way for curl is to issue the PASV command, which causes the\n"
-"  server to open another port and await another connection performed by the\n"
-"  client. This is good if the client is behind a firewall that doesn't allow\n"
-"  incoming connections.\n"
-"\n"
-"        curl ftp.download.com\n"
-"\n"
-"  If the server, for example, is behind a firewall that doesn't allow\n"
-"  connections on ports other than 21 (or if it just doesn't support the PASV\n"
-"  command), the other way to do it is to use the PORT command and instruct\n"
-"  the server to connect to the client on the given (as parameters to the PORT\n"
-"  command) IP number and port.\n"
-"\n"
-"  The -P flag to curl supports a few different options. Your machine may have\n"
-"  several IP-addresses and/or network interfaces and curl allows you to select\n"
-"  which of them to use. 
The default address can also be used:\n"
-"\n"
-"        curl -P - ftp.download.com\n"
-"\n"
-"  Download with PORT but use the IP address of our 'le0' interface (this does\n"
-"  not work on windows):\n"
-"\n"
-"        curl -P le0 ftp.download.com\n"
-"\n"
-"  Download with PORT but use 192.168.0.10 as our IP address:\n"
-"\n"
-"        curl -P 192.168.0.10 ftp.download.com\n"
-"\n"
-"NETWORK INTERFACE\n"
-"\n"
-"  Get a web page from a server using a specified port for the interface:\n"
-"\n"
-"	curl --interface eth0:1 http://www.netscape.com/\n"
-"\n"
-"  or\n"
-"\n"
-"	curl --interface 192.168.1.10 http://www.netscape.com/\n"
-"\n"
-"HTTPS\n"
-"\n"
-"  Secure HTTP requires SSL libraries to be installed and used when curl is\n"
-"  built. If that is done, curl is capable of retrieving and posting documents\n"
-"  using the HTTPS protocol.\n"
-"\n"
-"  Example:\n"
-"\n"
-"        curl https://www.secure-site.com\n"
-"\n"
-"  Curl is also capable of using your personal certificates to get/post files\n"
-"  from sites that require valid certificates. The only drawback is that the\n"
-"  certificate needs to be in PEM format. PEM is a standard and open format to\n"
-"  store certificates with, but it is not used by the most commonly used\n"
-"  browsers (Netscape and MSIE both use the so-called PKCS#12 format). If you\n"
-"  want curl to use the certificates you use with your (favourite) browser, you\n"
-"  may need to download/compile a converter that can convert your browser's\n"
-"  formatted certificates to PEM formatted ones. This kind of converter is\n"
-"  included in recent versions of OpenSSL, and for older versions Dr Stephen\n"
-"  N. Henson has written a patch for SSLeay that adds this functionality. 
You\n"
-"  can get his patch (that requires an SSLeay installation) from his site at:\n"
-"  http://www.drh-consultancy.demon.co.uk/\n"
-"\n"
-"  Example of how to automatically retrieve a document using a certificate with\n"
-"  a personal password:\n"
-"\n"
-"        curl -E /path/to/cert.pem:password https://secure.site.com/\n"
-"\n"
-"  If you neglect to specify the password on the command line, you will be\n"
-"  prompted for the correct password before any data can be received.\n"
-"\n"
-"  Many older SSL servers have problems with SSLv3 or TLS, which newer versions\n"
-"  of OpenSSL etc. use; therefore it is sometimes useful to specify what\n"
-"  SSL version curl should use. Use -3 or -2 to specify the exact SSL version\n"
-"  to use:\n"
-"\n"
-"        curl -2 https://secure.site.com/\n"
-"\n"
-"  Otherwise, curl will first attempt to use v3 and then v2.\n"
-"\n"
-"  To use OpenSSL to convert your favourite browser's certificate into a PEM\n"
-"  formatted one that curl can use, do something like this (assuming Netscape,\n"
-"  but IE is likely to work similarly):\n"
-"\n"
-"    You start by hitting the 'security' menu button in Netscape.\n"
-"\n"
-"    Select 'certificates->yours' and then pick a certificate in the list\n"
-"\n"
-"    Press the 'export' button\n"
-"\n"
-"    enter your PIN code for the certs\n"
-"\n"
-"    select a proper place to save it\n"
-"\n"
-"    Run the 'openssl' application to convert the certificate. 
If you cd to the\n" -"    openssl installation, you can do it like:\n" -"\n" -"     # ./apps/openssl pkcs12 -certfile [file you saved] -out [PEMfile]\n" -"\n" -"\n" -"RESUMING FILE TRANSFERS\n" -"\n" -" To continue a file transfer where it was previously aborted, curl supports\n" -" resume on http(s) downloads as well as ftp uploads and downloads.\n" -"\n" -" Continue downloading a document:\n" -"\n" -"        curl -c -o file ftp://ftp.server.com/path/file\n" -"\n" -" Continue uploading a document(*1):\n" -"\n" -"        curl -c -T file ftp://ftp.server.com/path/file\n" -"\n" -" Continue downloading a document from a web server(*2):\n" -"\n" -"        curl -c -o file http://www.server.com/\n" -"\n" -" (*1) = This requires that the ftp server supports the non-standard command\n" -"        SIZE. If it doesn't, curl will say so.\n" -"\n" -" (*2) = This requires that the web server supports at least HTTP/1.1. If it\n" -"        doesn't, curl will say so.\n" -"\n" -"TIME CONDITIONS\n" -"\n" -" HTTP allows a client to specify a time condition for the document it\n" -" requests. It is If-Modified-Since or If-Unmodified-Since. Curl allows you to\n" -" specify them with the -z/--time-cond flag.\n" -"\n" -" For example, you can easily make a download that only gets performed if the\n" -" remote file is newer than a local copy. It would be done like:\n" -"\n" -"        curl -z local.html http://remote.server.com/remote.html\n" -"\n" -" Or you can download a file only if the local file is newer than the remote\n" -" one. Do this by prepending the date string with a '-', as in:\n" -"\n" -"        curl -z -local.html http://remote.server.com/remote.html\n" -"\n" -" You can specify a \"free text\" date as condition. Tell curl to only download\n" -" the file if it was updated since yesterday:\n" -"\n" -"        curl -z yesterday http://remote.server.com/remote.html\n" -"\n" -" Curl will then accept a wide range of date formats. 
You can make the date\n" -" check work the other way around by prepending it with a dash '-'.\n" -"\n" -"DICT\n" -"\n" -"  For fun try\n" -"\n" -"        curl dict://dict.org/m:curl\n" -"        curl dict://dict.org/d:heisenbug:jargon\n" -"        curl dict://dict.org/d:daniel:web1913\n" -"\n" -"  Aliases for 'm' are 'match' and 'find', and aliases for 'd' are 'define'\n" -"  and 'lookup'. For example,\n" -"\n" -"        curl dict://dict.org/find:curl\n" -"\n" -"  Commands that break the URL description of the RFC (but not the DICT\n" -"  protocol) are\n" -"\n" -"        curl dict://dict.org/show:db\n" -"        curl dict://dict.org/show:strat\n" -"\n" -"  Authentication is still missing (but this is not required by the RFC).\n" -"\n" -"LDAP\n" -"\n" -"  If you have installed the OpenLDAP library, curl can take advantage of it\n" -"  and offer ldap:// support.\n" -"\n" -"  LDAP is a complex thing and writing an LDAP query is not an easy task. I\n" -"  advise you to dig up the syntax description for that elsewhere, RFC 1959 if\n" -"  no other place is better.\n" -"\n" -"  To show you an example, this is how I can get all people from my local LDAP\n" -"  server that have a certain sub-domain in their email address:\n" -"\n" -"        curl -B \"ldap://ldap.frontec.se/o=frontec??sub?mail=*sth.frontec.se\"\n" -"\n" -"  If I want the same info in HTML format, I can get it by not using the -B\n" -"  (enforce ASCII) flag.\n" -"\n" -"ENVIRONMENT VARIABLES\n" -"\n" -"  Curl reads and understands the following environment variables:\n" -"\n" -"        HTTP_PROXY, HTTPS_PROXY, FTP_PROXY, GOPHER_PROXY\n" -"\n" -"  They should be set for protocol-specific proxies. 
A general proxy should be\n" -"  set with\n" -"        \n" -"        ALL_PROXY\n" -"\n" -"  A comma-separated list of host names that shouldn't go through any proxy is\n" -"  set in (only an asterisk, '*' matches all hosts)\n" -"\n" -"        NO_PROXY\n" -"\n" -"  If a tail substring of the domain-path for a host matches one of these\n" -"  strings, transactions with that node will not be proxied.\n" -"\n" -"\n" -"  The usage of the -x/--proxy flag overrides the environment variables.\n" -"\n" -"NETRC\n" -"\n" -"  Unix introduced the .netrc concept a long time ago. It is a way for a user\n" -"  to specify name and password for commonly visited ftp sites in a file so\n" -"  that you don't have to type them in each time you visit those sites. You\n" -"  realize this is a big security risk if someone else gets hold of your\n" -"  passwords, so therefore most unix programs won't read this file unless it is\n" -"  only readable by yourself (curl doesn't care though).\n" -"\n" -"  Curl supports .netrc files if told to (using the -n/--netrc option). This is\n" -"  not restricted to only ftp, but curl can use it for all protocols where\n" -"  authentication is used.\n" -"\n" -"  A very simple .netrc file could look something like:\n" -); - puts( -"\n" -"        machine curl.haxx.se login iamdaniel password mysecret\n" -"\n" -"CUSTOM OUTPUT\n" -"\n" -"  To better allow script programmers to learn about the progress of\n" -"  curl, the -w/--write-out option was introduced. Using this, you can specify\n" -"  what information from the previous transfer you want to extract.\n" -"\n" -"  To display the number of bytes downloaded together with some text and an\n" -"  ending newline:\n" -"\n" -"        curl -w 'We downloaded %{size_download} bytes\\n' www.download.com\n" -"\n" -"KERBEROS4 FTP TRANSFER\n" -"\n" -"  Curl supports kerberos4 for FTP transfers. 
You need the kerberos package\n" -"  installed and used at curl build time for it to work.\n" -"\n" -"  First, get the krb-ticket the normal way, like with the kauth tool. Then use\n" -"  curl in a way similar to:\n" -"\n" -"        curl --krb4 private ftp://krb4site.com -u username:fakepwd\n" -"\n" -"  There's no use for a password on the -u switch, but a blank one will make\n" -"  curl ask for one, and you already entered the real password to kauth.\n" -"\n" -"MAILING LIST\n" -"\n" -"  We have an open mailing list to discuss curl, its development and things\n" -"  relevant to this.\n" -"\n" -"  To subscribe, mail curl-request@contactor.se with \"subscribe <fill in your\n" -"  email address>\" in the body.\n" -"\n" -"  To post to the list, mail curl@contactor.se.\n" -"\n" -"  To unsubscribe, mail curl-request@contactor.se with \"unsubscribe <your\n" -"  subscribed email address>\" in the body.\n" -"\n" - ) ; -}
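The proxy environment variables described in the manual text above can be exercised from a Bourne shell; a minimal sketch, where the proxy and host names are placeholders, not values from the document:

```shell
# A proxy can be set per protocol (placeholder host names):
HTTP_PROXY="http://proxy.example.com:8080";  export HTTP_PROXY
FTP_PROXY="http://proxy.example.com:8080";   export FTP_PROXY
# ...or once for every protocol:
ALL_PROXY="http://proxy.example.com:8080";   export ALL_PROXY
# Hosts that must not be proxied; an asterisk '*' alone matches all hosts:
NO_PROXY="localhost,.example.com";           export NO_PROXY
# A -x/--proxy flag on the command line overrides all of the above, e.g.:
#   curl -x http://other-proxy.example.com:3128 http://www.server.com/
```

Because a tail-substring match is used for NO_PROXY, the `.example.com` entry above would exempt any host under that domain from proxying.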
