Find identical files in one or more directory trees with the free sfk dupfind command for the Windows, Mac OS X, Linux and Raspberry Pi command line.
  • Download the free Swiss File Knife Base from Sourceforge.
  • Open the Windows CMD command line, Mac OS X Terminal or Linux shell.
  • OS X: type mv sfk-mac-64.exe sfk, then chmod +x sfk, then run ./sfk.
  • Linux: type mv sfk-linux-64.exe sfk, then chmod +x sfk, then run ./sfk (see the example below). OS X and Linux option syntax may differ slightly; check the help text within the tool.
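For example, the one-time setup on Linux looks like this, assuming the downloaded binary lies in the current directory:

   mv sfk-linux-64.exe sfk    # rename the downloaded binary
   chmod +x sfk               # make it executable
   ./sfk                      # run it to list the built-in help

On OS X the steps are the same, with sfk-mac-64.exe instead.
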
sfk dupfind -dir anydir [-file .ext1 .ext2]

find and list duplicate files by content alone, independent
of filename. searches for files with the same size, then
compares their contents by md5 checksums.
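
on GNU/Linux, a rough sketch of the same idea is possible with
standard tools. note that this hashes every file instead of
pre-filtering by size, so it is slower than sfk dupfind on
large trees, and it requires GNU coreutils uniq:

   # checksum all files, sort so equal checksums become neighbours,
   # then keep only groups sharing the same 32-character md5 prefix
   find . -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate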

options
   -diffdirs    list only duplicates residing in different
                root directories. this option requires that
                you specify at least two dirs after -dir.
   -listorg     list all original filenames,
                leave out any duplicate filenames.
   -minsize=n   compare only files with size >= n.
                examples for n are:
                    5m = 5000000 bytes (5 mbytes)
                  100k =  100000 bytes (100 kbytes)
                    1M = 1048576 bytes (2^20 bytes)
                 9000b =    9000 bytes
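
   for example, to skip all small files when scanning a
   placeholder directory called photos:

      sfk dupfind -minsize=1m -dir photos
         compare only files of 1 mbyte or more
         within the photos directory tree.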

command chaining
   - by default, this command passes the names
     of found duplicate files to the next command.

   - option -listorg does the opposite: it passes
     only original filenames, but no duplicates,
     to the next chain command.
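
   for instance, assuming the generic +list chain command,
   which simply prints incoming filenames, is available:

      sfk dupfind -dir docs +list
         print only the duplicate filenames.

      sfk dupfind -listorg -dir docs +list
         print only the originals, without duplicates.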

NOTE:
   if identical files are found, the decision which file is
   listed as "original" and which as "duplicate" is currently
   based on file system order: the file found first is listed
   as "original". check carefully that this is what you expect
   before cleaning up any duplicates.

web reference
   http://stahlworks.com/sfk-dupfind

examples
   sfk dupfind .
      find all duplicates within the current directory tree.

   sfk dupfind -dir docs1 docs2 docs3
      find all dups across and within the given directories.

   sfk dupfind -diffdirs -dir docs1 docs2 docs3
      find dups between docs1/docs2, docs2/docs3, docs1/docs3,
      but does NOT list dups within the same root directory.

   sfk dupfind docs .doc +del
      find all duplicate .doc files, within the docs
      directory tree, and delete them.

   sfk dupfind -listorg docs .doc +run "copy $file docs2"
      copy all .doc files from docs to docs2,
      but leave out any duplicate files.

   sfk dupfind -dir pic1 -dir pic2 -dir pic3
      find duplicates across three different directory trees.
      specifying multiple -dirs is also a way of influencing
      the result order; if a file is found both in pic1 and pic3,
      the file from pic1 will be listed as original, the other one
      as the duplicate.

   sfk sel -dir pic1 pic2 pic3 -file .jpg +dup -minsize=1m
      similar to the above, this example uses command chaining:
      list all .jpg files from the pic directories, then pass
      this to the dupfind command, also filtering by size.
 

sfk is a free open-source tool, running instantly without installation effort. no DLLs, no registry changes - just get sfk.exe from the zip package and use it (binaries for windows, linux and mac are included).

 
