Syntax of the `shred` command in Linux
shred [OPTION]... FILE...
[OPTION] represents the parameters and flags that modify the behavior of the shred command.
FILE refers to the file or files you wish to shred.
Options available in `shred` Command
| Options | Description |
|---|---|
| `-n, --iterations=N` | Specify the number of times the file is overwritten during shredding. By default, shred performs 3 passes. |
| `-u, --remove` | Remove (unlink) the file after the shredding process is complete. |
| `-v, --verbose` | Print detailed progress information about the shredding process. |
| `-z, --zero` | Add a final overwrite pass of all zeros after shredding, which helps hide the fact that the file has been shredded. |
| `-f, --force` | Change permissions if necessary so that read-only or otherwise protected files can be shredded. |
| `--random-source=FILE` | Use FILE as the source of random data for the overwrite passes. |
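These options can be combined in a single invocation. A minimal sketch of a common combination (the filename `secrets.txt` is only an illustration):

```shell
# Create a throwaway file to destroy
printf 'top secret\n' > secrets.txt

# -n 5 : overwrite 5 times instead of the default 3
# -z   : finish with a pass of zeros to disguise the shredding
# -u   : remove the file afterwards
# -v   : report progress for each pass
shred -n 5 -z -u -v secrets.txt
```

After this runs, the file has been overwritten six times in total (five random passes plus the zero pass) and then deleted.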
shred Command in Linux with Examples
When you delete a file in Linux, or in any operating system, the file is not immediately erased from the hard disk. It is typically moved to the trash first, and once you empty the trash it is removed from the file system. Even then, the data remains on the disk and can be recovered.

This is because permanently deleting a file only removes the pointer that references it; the sectors holding the file's data are simply marked as unallocated space, and their contents can be recovered with relative ease. The data is truly gone only once the operating system happens to write new data over those unallocated sectors.

To delete a file completely from a hard disk, Linux provides the `shred` command. It overwrites the contents of a file multiple times, using patterns chosen to maximize the destruction of the residual data, making it harder for even very expensive hardware probing to recover it.
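A basic run illustrates this. By default, `shred` overwrites the file in place with three passes of random data and leaves the (now unreadable) file on disk (the filename `note.txt` is only an example):

```shell
# Create a small file containing recoverable text
printf 'confidential\n' > note.txt

# Overwrite its contents in place (3 random passes by default);
# without -u the file itself is not removed
shred note.txt

# The file still exists, but its original contents are destroyed
ls -l note.txt
```

Pass `-u` as well if you also want the file removed after its contents are destroyed.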