sfk - the Swiss File Knife, a free open source command line tool
sfk dupfind -dir anydir [-file .ext1 .ext2]

   find and list duplicate files purely by file content,
   independent of the filename. searches for files with the
   same size, then compares contents by md5 checksums.

   options
      -diffdirs   list only duplicates residing in different
                  root directories. this option requires that
                  you specify at least two dirs after -dir.
      -listorg    list all original filenames, leaving out
                  any duplicate filenames.
      -minsize=n  compare only files with size >= n.
                  examples for n:
                     5m    = 5000000 bytes (5 mbytes)
                     100k  =  100000 bytes (100 kbytes)
                     1M    = 1048576 bytes (1<<20 bytes)
                     9000b =    9000 bytes

   command chaining
      - by default, this command passes the names of found
        duplicate files to the next command.
      - option -listorg does the opposite: it passes only
        original filenames, but no duplicates, to the next
        chain command.

   NOTE: when identical files are found, the decision which
   one is listed as the "original" and which as the "duplicate"
   is currently based on the order in the file system: the file
   found first is listed as the "original". verify that this is
   what you expect before cleaning up any duplicates.

   examples
      sfk dupfind .
         find all duplicates within the current directory tree.
      sfk dupfind -dir docs1 docs2 docs3
         find all dups across and within the given directories.
      sfk dupfind -diffdirs -dir docs1 docs2 docs3
         find dups between docs1/docs2, docs2/docs3 and
         docs1/docs3, but do NOT list dups within the same
         root directory.
      sfk dupfind docs .doc +del
         find all duplicate .doc files within the docs
         directory tree and delete them.
      sfk dupfind -listorg docs .doc +run "copy $file docs2"
         copy all .doc files from docs to docs2, leaving out
         any duplicate files.
      sfk dupfind -dir pic1 -dir pic2 -dir pic3
         find duplicates across three different directory
         trees. specifying multiple -dirs also influences the
         result order: if a file is found both in pic1 and in
         pic3, the file from pic1 is listed as the original,
         the other one as the duplicate.
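note on the -minsize suffixes above: the lowercase suffixes m and k
use decimal multipliers, while uppercase M is binary. a quick shell
check of that arithmetic (illustration only, not part of sfk):

```shell
# lowercase m/k are decimal, uppercase M is binary (1<<20)
echo "5m    = $((5 * 1000000)) bytes"
echo "100k  = $((100 * 1000)) bytes"
echo "1M    = $((1 << 20)) bytes"
```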
      sfk sel -dir pic1 pic2 pic3 -file .jpg +dup -minsize=1m
         similar to the above, but using command chaining:
         list all .jpg files from the pic directories, then
         pass them to the dupfind command, also filtering
         by size.
sfk is a free open-source tool that runs instantly, without any installation effort: no DLLs, no registry changes. just take sfk.exe from the zip package and use it (binaries for Windows, Linux and Mac are included).
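the size-then-checksum strategy described above can be approximated
with standard Unix tools. a minimal sketch, not part of sfk: it
assumes GNU md5sum is available and, for brevity, skips the size
pre-filter that the real tool uses to avoid hashing unique-size files.

```shell
#!/bin/sh
# illustration only: find files with identical content via md5,
# roughly what a dupfind-style tool does internally.
dir=$(mktemp -d)
printf 'same content\n' > "$dir/a.txt"
printf 'same content\n' > "$dir/b.txt"
printf 'different\n'    > "$dir/c.txt"

# hash every file, then print the lines whose checksum appears
# more than once - these form the duplicate groups.
dups=$(find "$dir" -type f -exec md5sum {} + | sort | awk '
    { count[$1]++; line[NR] = $0; sum[NR] = $1 }
    END { for (i = 1; i <= NR; i++)
              if (count[sum[i]] > 1) print line[i] }')
echo "$dups"
rm -rf "$dir"
```

here a.txt and b.txt are reported as a duplicate group, c.txt is not.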