Send HTTP requests from the command line and automate processing of reply data in batch files with the free Swiss File Knife for Windows, Mac OS X and Linux.
  • Download the free Swiss File Knife Base from Sourceforge.
  • Open the Windows CMD command line, Mac OS X Terminal or Linux shell.
  • OS X: type mv sfk-mac-64.exe sfk and chmod +x sfk, then run ./sfk
  • Linux: type mv sfk-linux-64.exe sfk and chmod +x sfk, then run ./sfk. OS X and Linux syntax may differ; check the help within the tool.
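A quick first test is to request any reachable web page; example.com is just a placeholder host here:

   sfk web example.com

This prints the returned HTML to the terminal. On OS X and Linux, type ./sfk instead of sfk if the tool is not on your PATH.
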
sfk web [options] url [options]
sfk filter ... +tweb [options]

call an http:// URL and print output to terminal,
or pass output to further commands for processing.

sfk ... +web requires a URL parameter.
sfk ... +tweb gets the URL(s) from a previous command.
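
for example, both of the following forms fetch pages in the same way.
the second is a sketch assuming a plain text file urls.txt with one
URL per line, which sfk filter passes on as chain text (the IP and
the status.xml path are placeholders):

   sfk web 192.168.1.100/status.xml
   sfk filter urls.txt +tweb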

options
   -user=u      and -pw=p set http basic authentication.
                you may also use global options -webuser, -webpw.
                note that passwords are not encrypted on transfer,
                except when using SFK Plus with HTTPS connections.
   -nodump      do not print reply data.
   -proxy       hostname:port of a proxy server. from within a company
                network it is often required to connect through proxies.
                alternatively, set the environment variable SFK_PROXY :
                  set SFK_PROXY=myproxyhost:8000
                to find out what proxy your browser is using, see
                - Firefox: tools/options/advanced/network/settings
                - IE: tools/internet options/connections/lan settings
   -timeout=n     wait up to n msec for connection or data.
                  default is blocking access, i.e. connect stops
                  after the operating system default timeout,
                  and data reads may block endlessly.
   -webtimeout=n  same, but can be given as global option
                  for a multi command chain.
   -delay=n     wait n msec after each request.
   -weblimit=n  set download size limit to n mb
   -status[=s]  add a status line after reply data, optionally
                prefixed by string s, which supports slash patterns
                like \n or \t. on command chaining fields are
                separated by tabs, otherwise by blanks.
   -noerr       print no error message
   -quiet       do not print status line in case of -nodump
   -headers     print sent and received http headers
   -header x    or -head adds custom header x to http requests, like
                -header "Accept-Language: de,en-US;q=0.7,en;q=0.3"
                multiple header lines can be given. default headers
                with the same name are replaced.
   -request x   or -req specifies the whole HTTP request, like
                -req "POST / HTTP/1.1
                      Host: localhost
                      Connection: close
                      
                      var1=123&var2=456
                      "
                this can only be used within a script file.
                to create an example script for editing, type:
                   sfk batch webreq.bat
   -reqfromvar a  take the request from variable a. it must contain
                the exact request data, including the empty CRLF
                line after the GET header.
   -headerstovar a   write reply headers as one text block to a
   -headertovar n a  write header n content to variable a
   -showreq     print full URL, may also use -status
   -verbose     tell current proxy settings, if any
   -noclose     do not send "Connection: close" header.
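
   a sketch combining several of the options above; the host, path,
   user name and password are placeholders only:

      sfk web -user=admin -pw=secret -timeout=5000 -status 192.168.1.100/getStatus.xml
      sfk web -headers -showreq -nodump 192.168.1.100

   the first call adds basic authentication, gives up after 5 seconds
   without connection or data, and appends a status line to the reply.
   the second prints the sent and received headers and the full URL,
   but not the reply body.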

automatic name expansions
   http:// is added automatically. short IPs like .100 are
   expanded to e.g. 192.168.1.100, depending on your subnet.

quoted multi-line parameters are supported in scripts
   using full trim. type "sfk script" for details.

limitations
   - by default sfk web reads up to 100 mbytes of data.
     use -weblimit=n to change this to n mbytes.
   - if binary data is found, binary codes are stripped
     on output to terminal.
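
   for example, to allow downloads of up to 500 mbytes
   (host and path are placeholders):

      sfk web -weblimit=500 192.168.1.100/bigexport.xml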

aliases
   cweb  call the web quickly without any output,
         same as web -nodump -quiet.
   tweb  same as web but tells explicitly
         that it expects chain text input.
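
   for example, to trigger a URL only for its side effect, without
   printing anything; reboot.cgi is just a hypothetical endpoint
   on a LAN device:

      sfk cweb .100/reboot.cgi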

HTTPS support
   SSL/TLS connections are supported with SFK Plus.
   read more under:
      stahlworks.com/sfkplus

return codes for chaining
   0 = ok    >0 = any error
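
   in a windows batch file the result can be checked through
   ERRORLEVEL. this is a sketch assuming the return code is also
   set as the process exit code, and that a device answers on .100:

      sfk cweb -timeout=3000 .100/getStatus.xml
      if errorlevel 1 echo no reply from .100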

see also
   sfk wfilt    download web text and filter it directly
   sfk wget     download file from http URL
   sfk view     GUI tool to search and filter text from
                an http URL interactively
   curl         powerful web request and download tool

web reference
   http://stahlworks.com/sfk-web

more in the SFK Book
   the SFK Book contains a 60-page tutorial, including
   an HTTP automation example with detailed explanations.
   type "sfk book" for details.

examples
   sfk web .100/getStatus.xml
      calls, for example, http://192.168.1.100/getStatus.xml
      and prints the xml reply to terminal

   sfk web 192.168.1.200/zones.xml +filter -+status
      calls http://192.168.1.200/zones.xml and extracts
      all lines containing "status".

   sfk web .100 +xex "_<head>**</head>_"
      gets main page from .100 and extracts html head tag.

   sfk filter ips.txt -form "$col1/xml/status.xml"
    +tweb -nodump
      calls many different URLs based on a table of IPs.
      option -nodump suppresses the full reply data, so
      only a single status line is printed per URL.

   --- scripting example: ---
   +setvar error=""
   +setvar uptime=""
   +web -maxwait=2000 -noerr -status=:status:
      ".250/info.xml"
      +xex "_:status:*\tERR
            _[setvar error][part2][endvar]_"
           "_<uptime>*</uptime>
            _[setvar uptime][part2][endvar]_"
   +if -var "#(error) <> "
      stop -var 5 "no access (#(error))"
   +getvar
   --- scripting example end ---
      try to read an xml value "uptime" from info.xml
      on local IP .250 and show it via +getvar.
      if there is no connection or an HTTP error
      then stop instead with the text "no access".
 
sfk is a free open-source tool that runs instantly, with no installation effort: no DLLs, no registry changes. just get sfk.exe from the zip package and use it (binaries for Windows, Linux and Mac are included).

 
