Send HTTP requests from the command line and automate processing of reply data in batch files with the free Swiss File Knife for Windows, Mac OS X and Linux.
sfk web [options] url [options]
sfk filter ... +tweb [options]
call an http:// URL and print output to terminal,
or pass output to further commands for processing.
sfk ... +web requires a url parameter.
sfk ... +tweb gets the url(s) from a previous command.
-nodump do not print reply data.
-proxy hostname:port of a proxy server. from within a company
network it is often required to connect through proxies.
alternatively, set the environment variable SFK_PROXY.
to find out what proxy your browser is using, see
- Firefox: tools/options/advanced/network/settings
- IE: tools/internet options/connections/lan settings
-timeout=n wait up to n msec for connection or data.
default is a blocking access, i.e. connect stops
after the operating system default timeout,
and a data read may block endlessly.
-webtimeout=n same, but can be given as global option
for a multi command chain.
-delay=n wait n msec after each request.
-weblimit=n set download size limit to n mb
-status[=s] add a status line after reply data, optionally
prefixed by string s, which supports slash patterns
like \n or \t. on command chaining fields are
separated by tabs, otherwise by blanks.
-noerr print no error message
-quiet do not print status line in case of -nodump
-headers print sent and received http headers
-showreq print full URL, may also use -status
-verbose tell current proxy settings, if any
automatic name expansions
http:// is added automatically. short IPs like .100 are
expanded, e.g. to 192.168.1.100, depending on your subnet.
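the short-IP expansion described above can be sketched in Python for illustration. this is not sfk's actual implementation; the subnet prefix and the scheme handling are assumptions based on the description:

```python
# Illustrative sketch of sfk's short-IP and scheme expansion (not the real code):
# ".100" plus a local subnet like 192.168.1.x becomes "192.168.1.100",
# and "http://" is prepended when no scheme is given.

def expand_url(short: str, subnet_prefix: str = "192.168.1") -> str:
    """Expand a short IP like '.100' using the given subnet prefix."""
    host = short
    if host.startswith("."):
        host = subnet_prefix + host          # .100 -> 192.168.1.100
    if not host.startswith(("http://", "https://")):
        host = "http://" + host              # add scheme automatically
    return host

print(expand_url(".100/getStatus.xml"))      # http://192.168.1.100/getStatus.xml
```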
quoted multi line parameters are supported in scripts
using full trim. type "sfk script" for details.
- by default sfk web reads up to 10 mbytes of data.
use -weblimit=n to change this to n mbytes.
- if binary data is found, binary codes are stripped
on output to terminal.
aliases

cweb call the web quickly without any output,
same as web -nodump -quiet.
tweb same as web but tells explicitly
that it expects chain text input.
since sfk 1.8.7 +web does not use chain text input.
use +tweb to read URLs from chain text, or set
global option -chainweb to use chain input by default.
output data chaining is supported.
see also

sfk wfilt download web text and filter it directly
sfk wget download file from http URL
sfk view GUI tool to search and filter text from
an http URL interactively
curl powerful web request and download tool
web reference

http://stahlworks.com/sfk-web

more in the SFK Book
the SFK Book contains a 60 page tutorial, including
an HTTP automation example with detailed explanations.
type "sfk book" for details.
examples

sfk web .100/getStatus.xml
calls, for example, http://192.168.1.100/getStatus.xml
and prints the xml reply to terminal
sfk web 192.168.1.200/zones.xml +filter -+status
calls http://192.168.1.200/zones.xml and extracts
all lines containing "status".
sfk web .100 +xex "_<head>**</head>_"
gets main page from .100 and extracts html head tag.
sfk filter ips.txt -form "$col1/xml/status.xml" +tweb -nodump
calls many different urls based on a table of ips.
option -nodump suppresses the full reply data,
so only a single status line is printed per url.
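for comparison, the same batch pattern can be sketched in plain Python. the file name ips.txt and the path /xml/status.xml are taken from the example above; everything else is standard library:

```python
# Sketch of the batch pattern above in plain Python (not sfk itself):
# read one IP per line from ips.txt and request /xml/status.xml from each.
from urllib.request import urlopen
from urllib.error import URLError

def build_url(ip: str) -> str:
    """Build the status url for one IP, as in the -form example above."""
    return "http://%s/xml/status.xml" % ip.strip()

def fetch_all(ips):
    """Yield (url, status) pairs; network errors become a status string."""
    for ip in ips:
        url = build_url(ip)
        try:
            with urlopen(url, timeout=2.0) as reply:   # like -timeout=2000
                yield url, "HTTP %d" % reply.getcode()
        except (URLError, OSError) as err:
            yield url, "error: %s" % err

# usage (assumes an ips.txt with one address per line):
#   with open("ips.txt") as f:
#       for url, status in fetch_all(f):
#           print(url, status)    # one status line per request
```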
--- scripting example: ---
+web -maxwait=2000 -noerr -status=:status:
+if -var "#(error) <> "
stop -var 5 "no access (#(error))"
--- scripting example end ---
try to read an xml value "uptime" from info.xml
on local IP .250 and show it by +getvar.
if there is no connection or an HTTP error
then stop instead with the text "no access".
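the same check-and-stop pattern, sketched in Python for comparison. the url and the xml layout are assumptions based on the description above, not output of a real device:

```python
# Python sketch of the scripting example above (not sfk syntax):
# fetch info.xml with a 2 second timeout, extract <uptime>, and stop
# with "no access" on any connection or HTTP error.
import re
import sys
from urllib.request import urlopen
from urllib.error import URLError

def read_uptime(xml_text: str):
    """Extract the text of an <uptime> element, or None if absent."""
    m = re.search(r"<uptime>\s*([^<]*?)\s*</uptime>", xml_text)
    return m.group(1) if m else None

def main(url="http://192.168.1.250/info.xml"):
    try:
        with urlopen(url, timeout=2.0) as reply:
            text = reply.read().decode("utf-8", "replace")
    except (URLError, OSError) as err:
        sys.exit("no access (%s)" % err)   # like the stop command above
    print("uptime:", read_uptime(text))

# main()  # uncomment to run against a real device
```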