Download a whole site with curl

Use wget instead. Install it with Homebrew (brew install wget) or with MacPorts (sudo port install wget). For downloading files from a directory listing, use -r (recursive), -np (don't follow links to parent directories), and -k (make links in downloaded HTML or CSS point to local files). Credit: xaccrocheur.
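A minimal, self-contained sketch of the flags above. To stay runnable without a real remote server, it serves a throwaway directory with Python's http.server and mirrors it; the port, paths, and file names are invented for the demo, and python3 and wget are assumed to be installed.

```shell
#!/bin/sh
set -e

# Build a tiny "site": an index page linking to one file.
src=$(mktemp -d)
printf '<a href="notes.txt">notes</a>\n' > "$src/index.html"
printf 'hello from notes\n' > "$src/notes.txt"

# Serve it in the background on an arbitrary port.
(cd "$src" && exec python3 -m http.server 8731) >/dev/null 2>&1 &
srv=$!
trap 'kill "$srv" 2>/dev/null' EXIT
sleep 2

dest=$(mktemp -d)
cd "$dest"
# -r  recurse into linked pages/files
# -np never ascend to the parent directory
# -k  rewrite links in saved HTML to point at the local copies
wget -q -r -np -k http://127.0.0.1:8731/
# The mirror now lives under ./127.0.0.1:8731/
```

For a real site you would replace the local URL with the directory listing you want to mirror; everything else stays the same.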

The trick is to use -k to convert the links (images, etc.) so they point at the local copies. HoseynHeydari: you can also install wget via Rudix. Note that the -k option does not always work.

Asked 9 years, 5 months ago. I am using cURL to try to download all files in a certain directory. Do you want to download the whole directory (in which case the wildcard is not really relevant), or just the files in it that match the wildcard? Are you bound to curl? To be honest, it's one of the most hard-to-use tools I've ever seen.
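curl has no recursive mode, but it can expand a range in the URL and fetch each match. A sketch of that, using file:// URLs so it runs offline; the directory and data1.txt..data3.txt names are invented for the demo, and for real use you would substitute ftp:// or http(s):// URLs.

```shell
#!/bin/sh
set -e

# Fake "remote" directory with three numbered files.
src=$(mktemp -d)
for i in 1 2 3; do printf 'file %s\n' "$i" > "$src/data$i.txt"; done

dest=$(mktemp -d)
cd "$dest"
# [1-3] makes curl fetch data1.txt, data2.txt and data3.txt in turn;
# -O saves each one under its remote name.
curl -s -O "file://$src/data[1-3].txt"
```

This only works when the file names follow a predictable pattern; for an arbitrary directory listing, wget -r (above) is the better fit.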

What is your platform? In the post you mention bash, but in the answer comments you mention something else. Also, how is SFTP relevant, and does your example actually use FTP?

The default maximum recursion depth is five levels. When SSH is enabled on the server you should be able to use scp as well, and doing it with scp should be much easier: scp -r user@ftp… You'll need to exchange SSH keys first to make that passwordless. The question is how to do this using curl, not wget. In addition to that, the following parameters could be added for more stability: -nc / --no-clobber (skip downloads that would overwrite existing files) and -c / --continue (resume partial downloads). — Thales Valias
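A small demonstration of the --no-clobber behaviour mentioned in that comment. As in the earlier sketch, it uses a throwaway local server so it runs anywhere; the port and big.iso name are invented, and python3 and wget are assumed to be installed.

```shell
#!/bin/sh
set -e

# "Remote" server offering one file.
src=$(mktemp -d)
printf 'remote version\n' > "$src/big.iso"
(cd "$src" && exec python3 -m http.server 8732) >/dev/null 2>&1 &
srv=$!
trap 'kill "$srv" 2>/dev/null' EXIT
sleep 2

dest=$(mktemp -d)
cd "$dest"
printf 'local version\n' > big.iso   # pretend an earlier download left this

# -nc / --no-clobber: the download is skipped because big.iso already
# exists locally (wget exits non-zero when it skips, hence || true).
wget -q -nc http://127.0.0.1:8732/big.iso || true

cat big.iso   # still the local version: wget did not overwrite it
```

On a flaky connection, combining -nc with -c lets you rerun the same wget command repeatedly and only fetch what is missing or incomplete.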

Another option is a scripted ftp session. Create a script file, name it as you want, and put into it the commands to run, e.g. built from a listing fetched with curl -u login:pass ftp… Finally, save that file and run ftp like this: ftp -i -s:yourscript, where -i disables interactivity (asking before downloading each file) and -s specifies the path to the script you created.
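The script body itself was lost in extraction. A plausible reconstruction, assuming Windows's ftp.exe (the -s: switch is Windows syntax) and using placeholder host, credentials, and paths throughout:

```
open ftp.example.com
login
pass
cd /remote/dir
mget *
bye
```

Run with ftp -i -s:yourscript; because -i turns off per-file prompting, mget * grabs everything in the remote directory without confirmation.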

Our sample file will be the readme for blcli, BitLaunch's command-line interface, which is hosted on GitHub. In the case of our readme, the complete command would look like this: So what if we want to use cURL to save the file rather than print it? For that, we must use the -O option: You'll notice that cURL will display a download progress table rather than the file contents this time: If you'd like the file to have a different file name (perhaps readme is too bossy for you), use the lowercase -o option and give the name after it:
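The article's command examples did not survive extraction. A sketch of the three invocations it describes, using a file:// URL in place of the elided GitHub one so the demo runs offline (the README.md and notes.md names are stand-ins; -s just hides the progress table in the demo):

```shell
#!/bin/sh
set -e

# Stand-in for the remote readme.
src=$(mktemp -d)
printf 'readme contents\n' > "$src/README.md"

dest=$(mktemp -d)
cd "$dest"
curl -s "file://$src/README.md"              # bare curl: prints to stdout
curl -s -O "file://$src/README.md"           # -O: saves under the remote name
curl -s -o notes.md "file://$src/README.md"  # -o: saves under a name you choose
```

The distinction to remember: uppercase -O keeps the remote file name, lowercase -o takes the output name as its argument.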

That's all well and good, but downloading lots of files this way would quickly become a hassle. You can download more than one file in a single command by using the following syntax:
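The syntax in question is simply repeating -O (or -o name) once per URL in a single curl invocation. A sketch, again with file:// URLs and invented a.txt/b.txt names so it is self-contained; real use would list http(s):// URLs instead:

```shell
#!/bin/sh
set -e

# Two stand-in "remote" files.
src=$(mktemp -d)
printf 'first\n'  > "$src/a.txt"
printf 'second\n' > "$src/b.txt"

dest=$(mktemp -d)
cd "$dest"
# One command, two downloads: each -O applies to the URL after it.
curl -s -O "file://$src/a.txt" -O "file://$src/b.txt"
```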
