/* NEVER EVER edit this manually, fix the mkhelp script instead! */
#include <stdio.h>
void hugehelp(void)
{
puts (
"                                  _   _ ____  _     \n"
"  Project                     ___| | | |  _ \\| |    \n"
"                             / __| | | | |_) | |    \n"
"                            | (__| |_| |  _ <| |___ \n"
"                             \\___|\\___/|_| \\_\\_____|\n"
"NAME\n"
" curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT,\n"
" FILE, HTTP or HTTPS syntax.\n"
"\n"
"SYNOPSIS\n"
" curl [options] url\n"
"\n"
"DESCRIPTION\n"
" curl is a client to get documents/files from servers,\n"
" using any of the supported protocols. The command is\n"
" designed to work without user interaction or any kind of\n"
" interactivity.\n"
"\n"
" curl offers a busload of useful tricks like proxy support,\n"
" user authentication, ftp upload, HTTP post, SSL (https:)\n"
" connections, cookies, file transfer resume and more.\n"
"\n"
"URL\n"
" The URL syntax is protocol dependent. You'll find a\n"
" detailed description in RFC 2396.\n"
"\n"
" You can specify multiple URLs or parts of URLs by writing\n"
" part sets within braces as in:\n"
"\n"
" http://site.{one,two,three}.com\n"
"\n"
" or you can get sequences of alphanumeric series by using\n"
" [] as in:\n"
"\n"
" ftp://ftp.numericals.com/file[1-100].txt\n"
" ftp://ftp.numericals.com/file[001-100].txt (with lead-\n"
" ing zeros)\n"
" ftp://ftp.letters.com/file[a-z].txt\n"
"\n"
" It is possible to specify up to 9 sets or series for a\n"
" URL, but no nesting is supported at the moment:\n"
"\n"
" http://www.any.org/archive[1996-1999]/vol-\n"
" ume[1-4]part{a,b,c,index}.html\n"
"\n"
"OPTIONS\n"
" -a/--append\n"
" (FTP) When used in an ftp upload, this will tell\n"
" curl to append to the target file instead of over-\n"
" writing it. If the file doesn't exist, it will be\n"
" created.\n"
"\n"
" -A/--user-agent <agent string>\n"
" (HTTP) Specify the User-Agent string to send to the\n"
" HTTP server. Some badly done CGIs fail if it's not\n"
" set to \"Mozilla/4.0\". To encode blanks in the\n"
" string, surround the string with single quote\n"
" marks. This can also be set with the -H/--header\n"
" flag of course.\n"
"\n"
" -b/--cookie <name=data>\n"
" (HTTP) Pass the data to the HTTP server as a\n"
" cookie. It is supposedly the data previously\n"
" received from the server in a \"Set-Cookie:\" line.\n"
" The data should be in the format \"NAME1=VALUE1;\n"
" NAME2=VALUE2\".\n"
"\n"
" If no '=' letter is used in the line, it is treated\n"
" as a filename to use to read previously stored\n"
" cookie lines from, which should be used in this\n"
" session if they match. Using this method also acti-\n"
" vates the \"cookie parser\" which will make curl\n"
" record incoming cookies too, which may be handy if\n"
" you're using this in combination with the\n"
" -L/--location option. The file format of the file\n"
" to read cookies from should be plain HTTP headers\n"
" or the netscape cookie file format.\n"
"\n"
" NOTE that the file specified with -b/--cookie is\n"
" only used as input. No cookies will be stored in\n"
" the file. To store cookies, save the HTTP headers\n"
" to a file using -D/--dump-header!\n"
"\n"
" -B/--ftp-ascii\n"
" (FTP/LDAP) Use ASCII transfer when getting an FTP\n"
" file or LDAP info. For FTP, this can also be\n"
" enforced by using an URL that ends with \";type=A\".\n"
"\n"
" -c/--continue\n"
" Continue/Resume a previous file transfer. This\n"
" instructs curl to continue appending data on the\n"
" file where it was previously left, possibly because\n"
" of a broken connection to the server. There must be\n"
" a named physical file to append to for this to\n"
" work. Note: Upload resume depends on a command\n"
" named SIZE not always present in all ftp servers!\n"
" Upload resume is for FTP only. HTTP resume is only\n"
" possible with HTTP/1.1 or later servers.\n"
"\n"
" -C/--continue-at <offset>\n"
" Continue/Resume a previous file transfer at the\n"
" given offset. The given offset is the exact number\n"
" of bytes that will be skipped, counted from the\n"
" beginning of the source file, before it is trans-\n"
" ferred to the destination. If used with uploads,\n"
" the ftp server command SIZE will not be used by\n"
" curl. Upload resume is for FTP only. HTTP resume\n"
" is only possible with HTTP/1.1 or later servers.\n"
"\n"
" -d/--data <data>\n"
" (HTTP) Sends the specified data in a POST request\n"
" to the HTTP server. Note that the data is sent\n"
" exactly as specified with no extra processing. The\n"
" data is expected to be \"url-encoded\". This will\n"
" cause curl to pass the data to the server using the\n"
" content-type application/x-www-form-urlencoded.\n"
" Compare to -F.\n"
"\n"
" If you start the data with the letter @, the rest\n"
" should be a file name to read the data from, or -\n"
" if you want curl to read the data from stdin. The\n"
" contents of the file must already be url-encoded.\n"
"\n"
" -D/--dump-header <file>\n"
" (HTTP/FTP) Write the HTTP headers to this file.\n"
" Write the FTP file info to this file if -I/--head\n"
" is used.\n"
"\n"
" This option is handy to use when you want to store\n"
" the cookies that a HTTP site sends to you. The\n"
" cookies could then be read in a second curl invoca-\n"
" tion by using the -b/--cookie option!\n"
"\n"
" -e/--referer <URL>\n"
" (HTTP) Sends the \"Referer Page\" information to the\n"
" HTTP server. Some badly done CGIs fail if it's not\n"
" set. This can also be set with the -H/--header flag\n"
" of course.\n"
"\n"
" -E/--cert <certificate[:password]>\n"
" (HTTPS) Tells curl to use the specified certificate\n"
" file when getting a file with HTTPS. The certifi-\n"
" cate must be in PEM format. If the optional pass-\n"
" word isn't specified, it will be queried for on the\n"
" terminal. Note that this certificate is the private\n"
" key and the private certificate concatenated!\n"
"\n"
" -f/--fail\n"
" (HTTP) Fail silently (no output at all) on server\n"
" errors. This is mostly done to better enable\n"
" scripts etc to deal with failed attempts. In normal\n"
" cases when a HTTP server fails to deliver a docu-\n"
" ment, it returns a HTML document stating so (which\n"
" often also describes why and more). This flag will\n"
" prevent curl from outputting that and fail silently\n"
" instead.\n"
"\n"
" -F/--form <name=content>\n"
" (HTTP) This lets curl emulate a filled in form in\n"
" which a user has pressed the submit button. This\n"
" causes curl to POST data using the content-type\n"
" multipart/form-data according to RFC1867. This\n"
" enables uploading of binary files etc. To force the\n"
" 'content' part to be read from a file, prefix the\n"
" file name with an @ sign. Example, to send your\n"
" password file to the server, where 'password' is\n"
" the name of the form-field to which /etc/passwd\n"
" will be the input:\n"
" curl -F password=@/etc/passwd www.mypasswords.com\n"
"\n"
" To read the file's content from stdin instead of a\n"
" file, use - where the file name should've been.\n"
"\n"
" -h/--help\n"
" Usage help.\n"
"\n"
" -H/--header <header>\n"
\n" " (HTTP) Extra header to use when getting a web page.\n" " You may specify any number of extra headers. Note\n" " that if you should add a custom header that has the\n" " same name as one of the internal ones curl would\n" " use, your externally set header will be used\n" " instead of the internal one. This allows you to\n" " make even trickier stuff than curl would normally\n" " do. You should not replace internally set headers\n" " without knowing perfectly well what you're doing.\n" "\n" " -i/--include\n" " (HTTP) Include the HTTP-header in the output. The\n" " HTTP-header includes things like server-name, date\n" " of the document, HTTP-version and more...\n" "\n" " -I/--head\n" " (HTTP/FTP) Fetch the HTTP-header only! HTTP-servers\n" " feature the command HEAD which this uses to get\n" " nothing but the header of a document. When used on\n" " a FTP file, curl displays the file size only.\n" "\n" " -K/--config \n" " Specify which config file to read curl arguments\n" " from. The config file is a text file in which com-\n" " mand line arguments can be written which then will\n" " be used as if they were written on the actual com-\n" " mand line. If the first column of a config line is\n" " a '#' character, the rest of the line will be\n" " treated as a comment.\n" "\n" " Specify the filename as '-' to make curl read the\n" " file from stdin.\n" "\n" " -l/--list-only\n" " (FTP) When listing an FTP directory, this switch\n" " forces a name-only view. Especially useful if you\n" " want to machine-parse the contents of an FTP direc-\n" " tory since the normal directory view doesn't use a\n" " standard look or format.\n" "\n" " -L/--location\n" " (HTTP/HTTPS) If the server reports that the\n" " requested page has a different location (indicated\n" " with the header line Location:) this flag will let\n" " curl attempt to reattempt the get on the new place.\n" " If used together with -i or -I, headers from all\n" " requested pages will be shown.\n" "\n" " -m/--max-time \n" " Maximum time in seconds that you allow the whole\n" " operation to take. This is useful for preventing\n" " your batch jobs from hanging for hours due to slow\n" " networks or links going down. This doesn't work\n" " properly in win32 systems.\n" "\n" " -M/--manual\n" " Manual. Display the huge help text.\n" "\n" " -n/--netrc\n" " Makes curl scan the .netrc file in the user's home\n" " directory for login name and password. This is typ-\n" " ically used for ftp on unix. If used with http,\n" " curl will enable user authentication. See netrc(5)\n" " for details on the file format. Curl will not com-\n" " plain if that file hasn't the right permissions (it\n" " should not be world nor group readable). The envi-\n" " ronment variable \"HOME\" is used to find the home\n" " directory.\n" "\n" " A quick and very simple example of how to setup a\n" " .netrc to allow curl to ftp to the machine\n" " host.domain.com with user name\n" "\n" " machine host.domain.com user myself password secret\n" "\n" " -N/--no-buffer\n" " Disables the buffering of the output stream. In\n" " normal work situations, curl will use a standard\n" " buffered output stream that will have the effect\n" " that it will output the data in chunks, not neces-\n" " sarily exactly when the data arrives. Using this\n" " option will disable that buffering.\n" "\n" " -o/--output \n" " Write output to instead of stdout. If you\n" " are using {} or [] to fetch multiple documents, you\n" " can use '#' followed by a number in the \n" " specifier. 
" current string for the URL being fetched. Like in:\n"
"\n"
" curl http://{one,two}.site.com -o \"file_#1.txt\"\n"
"\n"
" or use several variables like:\n"
"\n"
" curl http://{site,host}.host[1-5].com -o \"#1_#2\"\n"
"\n"
" -O/--remote-name\n"
" Write output to a local file named like the remote\n"
" file we get. (Only the file part of the remote file\n"
" is used, the path is cut off.)\n"
"\n"
" -P/--ftpport <address>\n"
\n" " (FTP) Reverses the initiator/listener roles when\n" " connecting with ftp. This switch makes Curl use the\n" " PORT command instead of PASV. In practice, PORT\n" " tells the server to connect to the client's speci-\n" " fied address and port, while PASV asks the server\n" " for an ip address and port to connect to.
\n" " should be one of:\n" "\n" " interface i.e \"eth0\" to specify which interface's\n" " IP address you want to use (Unix only)\n" "\n" " IP address i.e \"192.168.10.1\" to specify exact IP\n" " number\n" "\n" " host name i.e \"my.host.domain\" to specify machine\n" "\n" " - (any single-letter string) to make it\n" " pick the machine's default\n" "\n" " -q If used as the first parameter on the command line,\n" " the $HOME/.curlrc file will not be read and used as\n" " a config file.\n" "\n" " -Q/--quote \n" " (FTP) Send an arbitrary command to the remote FTP\n" " server, by using the QUOTE command of the server.\n" " Not all servers support this command, and the set\n" " of QUOTE commands are server specific! Quote com-\n" " mands are sent BEFORE the transfer is taking place.\n" " To make commands take place after a successful\n" " transfer, prefix them with a dash '-'. You may\n" " specify any amount of commands to be run before and\n" " after the transfer. If the server returns failure\n" " for one of the commands, the entire operation will\n" " be aborted.\n" "\n" " -r/--range \n" " (HTTP/FTP) Retrieve a byte range (i.e a partial\n" " document) from a HTTP/1.1 or FTP server. Ranges can\n" " be specified in a number of ways.\n" "\n" " 0-499 specifies the first 500 bytes\n" "\n" " 500-999 specifies the second 500 bytes\n" "\n" " -500 specifies the last 500 bytes\n" "\n" " 9500 specifies the bytes from offset 9500 and\n" " forward\n" "\n" " 0-0,-1 specifies the first and last byte\n" " only(*)(H)\n" "\n" " 500-700,600-799\n" " specifies 300 bytes from offset 500(H)\n" "\n" " 100-199,500-599\n" " specifies two separate 100 bytes\n" " ranges(*)(H)\n" "\n" " (*) = NOTE that this will cause the server to reply with a\n" " multipart response!\n" "\n" " You should also be aware that many HTTP/1.1 servers do not\n" " have this feature enabled, so that when you attempt to get\n" " a range, you'll instead get the whole document.\n" "\n" " FTP range downloads only support the simple syntax 'start-\n" " stop' (optionally with one of the numbers omitted). It\n" " depends on the non-RFC command SIZE.\n" "\n" " -s/--silent\n" " Silent mode. Don't show progress meter or error\n" " messages. Makes Curl mute.\n" "\n" " -S/--show-error\n" " When used with -s it makes curl show error message\n" " if it fails.\n" "\n" " -t/--upload\n" " Transfer the stdin data to the specified file. Curl\n" " will read everything from stdin until EOF and store\n" " with the supplied name. If this is used on a\n" " http(s) server, the PUT command will be used.\n" "\n" " -T/--upload-file \n" " Like -t, but this transfers the specified local\n" " file. If there is no file part in the specified\n" " URL, Curl will append the local file name. NOTE\n" " that you must use a trailing / on the last direc-\n" " tory to really prove to Curl that there is no file\n" " name or curl will think that your last directory\n" " name is the remote file name to use. That will most\n" " likely cause the upload operation to fail. If this\n" " is used on a http(s) server, the PUT command will\n" " be used.\n" "\n" " -u/--user \n" " Specify user and password to use when fetching. See\n" " README.curl for detailed examples of how to use\n" " this. If no password is specified, curl will ask\n" " for it interactively.\n" "\n" " -U/--proxy-user \n" " Specify user and password to use for Proxy\n" " authentication. If no password is specified, curl\n" " will ask for it interactively.\n" "\n" " -v/--verbose\n" " Makes the fetching more verbose/talkative. 
Mostly\n" " usable for debugging. Lines starting with '>' means\n" " data sent by curl, '<' means data received by curl\n" " that is hidden in normal cases and lines starting\n" " with '*' means additional info provided by curl.\n" "\n" " -V/--version\n" " Displays the full version of curl, libcurl and\n" " other 3rd party libraries linked with the exe-\n" " cutable.\n" "\n" " -w/--write-out \n" " Defines what to display after a completed and suc-\n" " cessful operation. The format is a string that may\n" " contain plain text mixed with any number of vari-\n" " ables. The string can be specified as \"string\", to\n" " get read from a particular file you specify it\n" " \"@filename\" and to tell curl to read the format\n" " from stdin you write \"@-\".\n" "\n" " The variables present in the output format will be\n" " substituted by the value or text that curl thinks\n" " fit, as described below. All variables are speci-\n" " fied like %{variable_name} and to output a normal %\n" " you just write them like %%. You can output a new-\n" " line by using \\n, a carrige return with \\r and a\n" " tab space with \\t.\n" "\n" " NOTE: The %-letter is a special letter in the\n" " win32-environment, where all occurrences of % must\n" " be doubled when using this option.\n" "\n" " Available variables are at this point:\n" "\n" " url_effective The URL that was fetched last. This\n" " is mostly meaningful if you've told\n" " curl to follow location: headers.\n" "\n" " http_code The numerical code that was found in\n" " the last retrieved HTTP(S) page.\n" "\n" " time_total The total time, in seconds, that the\n" " full operation lasted. The time will\n" " be displayed with millisecond reso-\n" " lution.\n" "\n" " time_namelookup\n" " The time, in seconds, it took from\n" " the start until the name resolving\n" " was completed.\n" " time_connect The time, in seconds, it took from\n" " the start until the connect to the\n" " remote host (or proxy) was com-\n" " pleted.\n" "\n" " time_pretransfer\n" " The time, in seconds, it took from\n" " the start until the file transfer is\n" " just about to begin. This includes\n" " all pre-transfer commands and nego-\n" " tiations that are specific to the\n" " particular protocol(s) involved.\n" "\n" " size_download The total amount of bytes that were\n" " downloaded.\n" "\n" " size_upload The total amount of bytes that were\n" " uploaded.\n" "\n" " speed_download The average download speed that curl\n" " measured for the complete download.\n" "\n" " speed_upload The average upload speed that curl\n" " measured for the complete download.\n" "\n" " -x/--proxy \n" " Use specified proxy. If the port number is not\n" " specified, it is assumed at port 1080.\n" "\n" " -X/--request \n" " (HTTP) Specifies a custom request to use when com-\n" " municating with the HTTP server. The specified\n" " request will be used instead of the standard GET.\n" " Read the HTTP 1.1 specification for details and\n" " explanations.\n" "\n" " (FTP) Specifies a custom FTP command to use instead\n" " of LIST when doing file lists with ftp.\n" "\n" " -y/--speed-time