Swiss File Knife
a command line tools collection, with no need for free external tools.
- download the free Swiss File Knife Base from Sourceforge.
- open the Windows CMD command line, Mac OS X Terminal, or Linux shell.
- OS X: type mv sfk-mac-i686.exe sfk and chmod +x sfk, then ./sfk
- Linux: type mv sfk-linux.exe sfk and chmod +x sfk, then ./sfk

OS X and Linux syntax may differ; check the help within the tool.
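The OS X / Linux preparation steps above can be sketched as a short shell session. Since the real binary comes from a Sourceforge download, a placeholder file stands in for it here; only the filename sfk-linux.exe is taken from the text:

```shell
# sketch of the Linux setup steps; a placeholder file stands in
# for the binary actually downloaded from Sourceforge
touch sfk-linux.exe    # stand-in for the downloaded file
mv sfk-linux.exe sfk   # rename to the short command name
chmod +x sfk           # mark it executable
# then run it:
# ./sfk                (prints the built-in help of the real tool)
```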
sfk wget [options] url [outfile|outdir] [options]

   download content from a given http:// URL. an output filename or
   directory can be specified. existing output files are overwritten
   without asking.

   options
      -proxy host:port    hostname and port of a proxy server. from within
                          a company network it is often required to connect
                          through a proxy. alternatively, set the environment
                          variable SFK_PROXY, e.g.
                             set SFK_PROXY=myproxyhost:8000
                          to find out what proxy your browser is using, see
                          - Firefox: tools/options/advanced/network/settings
                          - IE: tools/internet options/connections/lan settings
      -path2name          include the web path in the generated output name,
                          to create unique names on multiple downloads.
                          this option is the default in chained processing.
      -fullpath           recreate the whole web path within the output dir.
      -nodom              do not include the domain name in the output name.
      -nopath             do not include any path and domain information in
                          the output names. will not work if the URL does
                          not contain a relative filename.
      -quiet or -noprog   show no download progress indicator.
      -quiet=2            show no "done" info line.
      -addext             always add a filename extension like .txt, .html
                          or .dat, even if the URL has no such extension.
      -timeout=n          wait up to n msec for data.
      -verbose            tell the current proxy settings, if any.

   automatic name expansions
      http:// is added automatically. short ip's like .100 are extended
      to e.g. 192.168.1.100, depending on your subnet.

   quoted multi line parameters are supported in scripts using full trim.
   type "sfk script" for details.

   limitations
      although sfk wget can download a list of URLs, it is not a real
      webpage downloader/archiver, as this would require the conversion
      of html pages to adapt contained links.

   chaining support
      output filename chaining is supported.
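The SFK_PROXY variable described under -proxy can be set once per shell session instead of repeating the option on every call. This sketch only demonstrates setting the variable, using the myproxyhost:8000 value from the text; the commented sfk call and its URL are placeholders:

```shell
# Linux / OS X form; on Windows CMD use:  set SFK_PROXY=myproxyhost:8000
export SFK_PROXY=myproxyhost:8000
echo "$SFK_PROXY"   # sfk wget picks this up when -proxy is not given
# sfk wget http://example.com/x.zip out.zip
#   (actual call, sketched; example.com and the filenames are placeholders)
```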
see also
   sfk web   send a simple web request with instant result output
             to terminal
   curl      powerful web request and download tool

web reference
   http://stahlworks.com/sfk-wget

examples
   sfk wget -proxy myproxy:8000 http://foobar.com/x.zip foo.zip
      download x.zip, writing the content into a file foo.zip,
      connecting through a proxy server myproxy on port 8000.

   sfk filt urls.txt +wget mydir
      if urls.txt contains a list of http:// URLs, load it and download
      all contents into mydir. the output names will include path
      information found in the source URLs.

   sfk filt urls.txt +wget -fullpath mydir +list -big
      the same as above, but create the whole dir structure, then list
      the biggest of the downloaded files.

   sfk wget -quiet=2 server/info.xml tmp.txt +ffilter -nofile
      download info.xml from the server, write it as file tmp.txt, and
      instantly print the tmp.txt content to the terminal without any
      status messages or filename infos.
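A urls.txt file for the chained download examples above might be built like this. The URLs are placeholders, and the final sfk call is left commented since it needs the tool and network access:

```shell
# build a small URL list file (URLs are placeholders, not real content)
printf '%s\n' \
  'http://example.com/docs/a.txt' \
  'http://example.com/docs/b.txt' > urls.txt
wc -l < urls.txt    # one URL per line
# sfk filt urls.txt +wget mydir
#   (sketched; loads the list and downloads both files into mydir)
```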
sfk is a free open-source tool that runs instantly, with no installation
effort: no DLLs, no registry changes. just get sfk.exe from the zip
package and use it (binaries for windows, linux and mac are included).
read more about all sfk functions here.