Send HTTP requests from the command line and automate processing of reply data in batch files with the free Swiss File Knife for Windows, Mac OS X and Linux.

sfk web [options] url [options]

call an http:// URL and print output to terminal,
or pass output to further commands for processing.

   -nodump      do not print reply data.
   -proxy       hostname:port of a proxy server. from within a company
                network it is often required to connect through proxies.
                alternatively, set the environment variable SFK_PROXY :
                  set SFK_PROXY=myproxyhost:8000
                to find out what proxy your browser is using, see
                - Firefox: tools/options/advanced/network/settings
                - IE: tools/internet options/connections/lan settings
   -timeout=n     wait up to n msec for connection or data.
                  default is blocking access, i.e. connect stops
                  only after the operating system default timeout,
                  and data reads may block endlessly.
   -webtimeout=n  same, but can be given as global option
                  for a multi command chain.
   -weblimit=n  set download size limit to n mb
   -status[=s]  add a status line after reply data, optionally
                prefixed by string s, which supports slash patterns
                like \n or \t. on command chaining fields are
                separated by tabs, otherwise by blanks.
   -noerr       print no error message
   -quiet       do not print status line in case of -nodump
   -headers     print sent and received http headers
   -showreq     print full URL, may also use -status
   -verbose     tell current proxy settings, if any
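for comparison only, the difference between a blocking read and
-timeout can be sketched in plain Python (this is not SFK code;
the tiny local test server and the 500 msec value are made up
for the demo):

```python
import http.server, socket, threading, time, urllib.request

class SlowHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(2)                 # respond slower than the client timeout
        self.send_response(200)
        self.end_headers()
    def log_message(self, *a):        # keep the demo quiet
        pass

class QuietServer(http.server.HTTPServer):
    def handle_error(self, request, client_address):
        pass                          # client gave up early; ignore broken pipe

server = QuietServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]

try:
    # roughly like -timeout=500: give up after 500 msec
    urllib.request.urlopen(url, timeout=0.5)
    result = "reply"
except (socket.timeout, OSError):
    result = "timeout"                # slow server hit the limit

server.shutdown()
print(result)
```

without the timeout argument the read would simply block until
the server answers, which is the "blocking access" default
described above.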

automatic name expansions
   http:// is added automatically. short ip's like .100 are
   extended to a full address depending on your subnet.
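   the idea of the expansion can be illustrated in Python (an
   illustration of the concept only; SFK derives the prefix from
   your actual network, the local ip here is a made-up example):

```python
def expand_short_ip(short, local_ip="192.168.1.23"):
    """Extend a short ip like '.100' with the local subnet prefix."""
    if short.startswith("."):
        prefix = local_ip.rsplit(".", 1)[0]   # e.g. '192.168.1'
        return prefix + short
    return short

print(expand_short_ip(".100"))   # with the example local ip: 192.168.1.100
```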

quoted multi line parameters are supported in scripts
   using full trim. type "sfk script" for details.

   - by default sfk web reads up to 10 mbytes of data.
     use -weblimit=n to change this to n mbytes.
   - if binary data is found, binary codes are stripped
     on output to terminal.
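   the stripping of binary codes for terminal output could look
   roughly like this in Python (a guess at the behaviour for
   illustration, not SFK's actual implementation):

```python
def strip_binary(data: bytes) -> str:
    """Keep printable ASCII plus tab/newline/cr, drop other byte codes."""
    keep = set(b"\t\n\r") | set(range(0x20, 0x7F))
    return bytes(b for b in data if b in keep).decode("ascii")

# binary header bytes are dropped, readable text survives
print(strip_binary(b"GIF89a\x00\x01\xffhello\nworld"))
```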

chaining support
   output data chaining is supported.

see also
   sfk wfilt    download web text and filter it directly
   sfk wget     download file from http URL
   sfk view     GUI tool to search and filter text from
                      an http URL interactively
   curl         powerful web request and download tool

more in the SFK Book
   the SFK Book contains a 60 page tutorial, including
   an HTTP automation example with detailed explanations.
   type "sfk book" for details.

examples

   sfk web .100/getStatus.xml
      calls the URL expanded from .100, depending on
      your subnet, and prints the xml reply to terminal.

   sfk web .100/getStatus.xml +filter -+status
      calls the URL and extracts
      all lines containing "status".

   sfk web .100 +xex "_<head>**</head>_"
      gets main page from .100 and extracts html head tag.
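   a plain Python counterpart of the head tag extraction, shown
   for comparison (a regex sketch that assumes one simple head
   tag; the html snippet is made up):

```python
import re

html = "<html><head><title>demo</title></head><body>text</body></html>"

# non-greedy match from <head> to the first </head>, across line breaks
match = re.search(r"<head>.*?</head>", html, re.DOTALL)
if match:
    print(match.group(0))
```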

   sfk filter ips.txt -form "$col1/xml/status.xml"
    +web -nodump
      calls many different urls based on a table of ips.
      with option -nodump the full result data is not
      printed, only a single status line per request.
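   the same batch idea in plain Python, forming one url per
   table row (the path pattern is taken from the example above;
   the ip list stands in for the contents of ips.txt):

```python
# one ip per line, as a stand-in for reading ips.txt
rows = ["192.168.1.100", "192.168.1.101"]

urls = ["http://%s/xml/status.xml" % ip for ip in rows]
for u in urls:
    print(u)     # here each url would be requested, e.g. with urllib
```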

   --- scripting example: ---
   +setvar error=""
   +setvar uptime=""
   +web -maxwait=2000 -noerr -status=:status: .250/info.xml
      +xex "_:status:*\tERR_[setvar error][part2][endvar]_"
           "_<uptime>*</uptime>_[setvar uptime][part2][endvar]_"
   +if -var "#(error) <> "
      stop -var 5 "no access (#(error))"
   +getvar
   --- scripting example end ---
      try to read an xml value "uptime" from info.xml
      on local IP .250 and show it by +getvar.
      if there is no connection or an HTTP error
      then stop instead with a text "no access".
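   the same flow as a hedged Python sketch: request the xml with
   a timeout, extract the uptime value, and stop with an error
   text on failure (the local test server stands in for the
   device at .250; the tag name uptime is from the example):

```python
import http.server, re, sys, threading, urllib.request

class InfoHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # serve a tiny stand-in for .250/info.xml
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"<info><uptime>1234</uptime></info>")
    def log_message(self, *a):
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), InfoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/info.xml" % server.server_address[1]

error, uptime = "", ""
try:
    # like -maxwait=2000: give up after 2 seconds
    data = urllib.request.urlopen(url, timeout=2).read().decode()
    m = re.search(r"<uptime>(.*?)</uptime>", data)
    if m:
        uptime = m.group(1)
except Exception as e:
    error = str(e)

server.shutdown()
if error:
    sys.exit("no access (%s)" % error)   # like: stop -var 5 "no access ..."
print(uptime)
```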