Download files from HTTP URLs instantly in the Windows CMD.EXE command line with the free Swiss File Knife.
  • Download the free Swiss File Knife Base from Sourceforge.
  • Open the Windows CMD command line, Mac OS X Terminal or Linux shell.
  • OS X : type mv sfk-mac-64.exe sfk, then chmod +x sfk, then run ./sfk
  • Linux: type mv sfk-linux-64.exe sfk, then chmod +x sfk, then run ./sfk,
    as shown below. OS X and Linux usage may differ slightly from the
    Windows examples; check the help text within the tool.
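
for example, a first run on Linux looks like this, assuming
sfk-linux-64.exe was downloaded to the current directory
(on OS X, use sfk-mac-64.exe instead):

   mv sfk-linux-64.exe sfk
   chmod +x sfk
   ./sfk
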
sfk wget [options] url [outfile|outdir] [options]

download content from a given http:// URL.
an output filename or directory can be specified.
existing output files are overwritten without asking.

options
   -user=u     and -pw=p set http basic authentication.
               you may also use global options -webuser, -webpw.
               note that passwords are not encrypted on transfer,
               except when using SFK Plus with HTTPS connections.
   -proxy      hostname:port of a proxy server. from within a company
               network, it is often required to connect through a proxy.
               alternatively, set the environment variable SFK_PROXY:
                 set SFK_PROXY=myproxyhost:8000
               to find out what proxy your browser is using, see
               - Firefox: tools/options/advanced/network/settings
               - IE: tools/internet options/connections/lan settings
               see also the combined example after this option list.
   -path2name  include web path in generated output name,
               to create unique names on multiple downloads.
               this option is default on chained processing.
   -fullpath   recreate the whole web path within output dir.
   -nodom      do not include domain name in output name.
   -nopath     do not include any path and domain information
               within the output names. will not work if URL
               does not contain any relative filename.
   -quiet      or -noprog shows no download progress indicator.
   -quiet=2    show no "done" info line.
   -addext     always add a filename extension like .txt, .html
               or .dat even if the URL has no such extension.
   -timeout=n  wait up to n msec for data
   -verbose    tell current proxy settings, if any
   -noclose    do not send "Connection: close" header.
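
   for example, the authentication, proxy and timeout options
   can be combined like this (host, file and credential names
   are placeholders):

      set SFK_PROXY=myproxyhost:8000
      sfk wget -user=jdoe -pw=secret -timeout=5000 http://host/file.zip

   this downloads file.zip through the given proxy, using http
   basic authentication and waiting up to 5000 msec for data.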

automatic name expansions
   http:// is added automatically. short IPs like .100 are
   expanded, for example to 192.168.1.100, depending on your subnet.
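
   for example (a sketch; the expanded address depends on your
   own subnet, and info.txt is a placeholder file name):

      sfk wget .100/info.txt

   within a 192.168.1.x subnet, this is expanded to
   http://192.168.1.100/info.txt before download.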

quoted multi line parameters are supported in scripts
   using full trim. type "sfk script" for details.

limitations
   although sfk wget can download a list of URLs, it is not
   a real webpage downloader/archiver, as that would require
   converting html pages to adapt the links they contain.

HTTPS support
   SSL/TLS downloads are supported with SFK Plus.
   read more under:
      stahlworks.com/sfkplus

chaining support
   output filename chaining is supported.

see also
   sfk web      send a simple web request with instant
                result output to terminal
   curl         powerful web request and download tool

examples
   sfk wget -proxy myproxy:8000 http://foobar.com/x.zip foo.zip
      download x.zip, writing the content into a file foo.zip,
      connecting through a proxy server myproxy on port 8000.

   sfk filt urls.txt +wget mydir
      if urls.txt contains a list of http:// URLs, load it
      and download all contents into mydir. the output names
      will include path information found in the source URL.

   sfk filt urls.txt +wget -fullpath mydir +list -big
      the same as above, but recreate the whole dir structure,
      and then list the biggest of the downloaded files.

   sfk wget -quiet=2 server/info.xml tmp.txt +ffilter -nofile
      download info.xml from server, write it as file tmp.txt
      and instantly print the tmp.txt content to terminal
      without any status messages or filename info.
 

 
sfk is a free open-source tool that runs instantly, without installation effort. no DLLs, no registry changes - just get sfk.exe from the zip package and use it (binaries for windows, linux and mac are included).

 
