Download files from HTTP URLs instantly

in the Windows CMD.EXE command line with the free Swiss File Knife.
- download the free Swiss File Knife Base from Sourceforge.
- open the Windows CMD command line, Mac OS X Terminal or Linux shell.
- OS X:  type mv sfk-mac-i686.exe sfk and chmod +x sfk, then run ./sfk
- Linux: type mv sfk-linux.exe sfk and chmod +x sfk, then run ./sfk
  OS X and Linux syntax may differ; check the help text within the tool.
sfk wget [options] url [outfile|outdir] [options]

download content from a given http:// URL.
an output filename or directory can be specified.
existing output files are overwritten without confirmation.
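
for a quick first try, a minimal call can look like this
(example.com is only a placeholder; any reachable http URL works):

   sfk wget http://example.com/index.html

if no output filename is given, sfk generates one from the URL,
as controlled by the naming options listed below.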

options
   -proxy      hostname:port of a proxy server. from within a company
               network, it is often required to connect through proxies.
               alternatively, set the environment variable SFK_PROXY
               (see also the sketch after this options list):
                 set SFK_PROXY=myproxyhost:8000
               to find out what proxy your browser is using, see
               - Firefox: tools/options/advanced/network/settings
               - IE: tools/internet options/connections/lan settings
   -path2name  include web path in generated output name,
               to create unique names on multiple downloads.
               this option is default on chained processing.
   -fullpath   recreate the whole web path within output dir.
   -nodom      do not include domain name in output name.
   -nopath     do not include any path and domain information
               within the output names. will not work if URL
               does not contain any relative filename.
   -quiet      or -noprog: show no download progress indicator.
   -quiet=2    show no "done" info line.
   -addext     always add a filename extension like .txt, .html
               or .dat even if the URL has no such extension.
   -timeout=n  wait up to n msec for data.
   -verbose    tell current proxy settings, if any.
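
   for example, a download through a proxy taken from the
   environment can look like this, where myproxyhost:8000 stands
   for your own proxy's hostname and port:

      set SFK_PROXY=myproxyhost:8000
      sfk wget http://foobar.com/x.zip foo.zip

   this behaves like passing -proxy myproxyhost:8000 directly.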

automatic name expansions
   http:// is added automatically. short IPs like .100 are
   expanded to a full address like 192.168.1.100, depending
   on your subnet.
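
   for example, within a 192.168.1.x subnet, the shortened call

      sfk wget .100/tmp/test.txt

   should expand to http://192.168.1.100/tmp/test.txt.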

quoted multi-line parameters are supported in scripts
   using full trim. type "sfk script" for details.

limitations
   although sfk wget can download a list of URLs, it is not
   a full webpage downloader/archiver, as that would require
   rewriting the links contained in downloaded html pages.

chaining support
   output filename chaining is supported (see the
   +wget examples below).

see also
   sfk web      send a simple web request with instant
                result output to terminal
   curl         powerful web request and download tool
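
   as a rough comparison, the first sfk wget example below can
   also be done with curl, where -x sets the proxy and -o the
   output file:

      curl -x myproxy:8000 -o foo.zip http://foobar.com/x.zip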

web reference
   http://stahlworks.com/sfk-wget

examples
   sfk wget -proxy myproxy:8000 http://foobar.com/x.zip foo.zip
      download x.zip, writing the content into a file foo.zip,
      connecting through a proxy server myproxy on port 8000.

   sfk filt urls.txt +wget mydir
      if urls.txt contains a list of http:// URLs, load it
      and download all contents into mydir. the output names
      will include path information found in the source URL.

   sfk filt urls.txt +wget -fullpath mydir +list -big
      the same as above, but recreate the whole dir structure,
      and then list the biggest of the downloaded files.

   sfk wget -quiet=2 server/info.xml tmp.txt +ffilter -nofile
      download info.xml from the server, write it as file tmp.txt,
      and instantly print the tmp.txt content to the terminal,
      without any status messages or filename info.
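
   sfk wget -timeout=5000 http://foobar.com/x.zip
      a further sketch, not from the original examples: download
      x.zip, waiting up to 5000 msec for data to arrive.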
sfk is a free open-source tool, running instantly without installation effort. no DLLs,
no registry changes: just get sfk.exe from the zip package and use it (binaries for
windows, linux and mac are included).

read more about all sfk functions on the sfk homepage.
