As a Linux system administrator, knowing how to check a file's size is an essential skill for managing storage, monitoring systems, and automating tasks. Bash provides several simple yet powerful commands to get a file's size, which can be easily incorporated into scripts.
In this comprehensive guide, we will explore the various methods to get file sizes in Bash, with code examples for bytes, kilobytes, megabytes, and human-readable formats. Whether you need to loop through files, display sizes to users, or integrate into scripts, this guide has you covered.
Prerequisites
To follow along with the examples, you will need:
- Access to a Linux terminal with Bash shell
- Basic knowledge of Bash scripting
- Ability to create/edit Bash scripts
- Test files to demonstrate the size commands
While the specifics may vary across Linux distributions, these techniques work on all major variants like Ubuntu, Debian, RHEL, CentOS, Fedora, etc.
Getting File Size in Bytes
The most straightforward way to get a file's size in Bash is using the stat command. stat displays detailed file and filesystem metadata.
To get the size in bytes, use:
stat -c %s filename
The %s format specifier returns the file size in bytes.
For example:
$ stat -c %s test.txt
5632
This returns the size of test.txt in bytes.
We can easily incorporate this into a script to get the byte size for any file the user specifies:
#!/bin/bash
echo "Enter file name:"
read filename
size=$(stat -c %s "$filename")
echo "$filename size: $size bytes"
When executed:
$ ./filesize.sh
Enter file name: test.txt
test.txt size: 5632 bytes
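If stat is unavailable, or you want something that behaves the same on BSD/macOS (where stat takes different flags), wc -c also reports a byte count. A minimal sketch; the temp-file path is illustrative:

```shell
#!/bin/bash
# wc -c counts bytes; reading via stdin redirection prints only the
# number, with no filename column to strip afterwards.
printf 'hello' > /tmp/wc-demo.txt   # a 5-byte demo file
size=$(wc -c < /tmp/wc-demo.txt)
echo "size: $size bytes"
rm -f /tmp/wc-demo.txt
```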
Using stat is fast, simple, and works on all Linux environments. But the size in bytes is often not very human-readable. Next, we'll see how to display sizes in KB, MB, or GB formats.
Getting Human-Readable Sizes
While bytes are useful in scripts, human-readable formats like KB, MB, and GB make file sizes much easier to grasp at a glance.
The ls command
The good old ls command can display human-readable file sizes out of the box:
$ ls -lh test.txt
-rw-r--r-- 1 john staff 5.5K Oct 21 12:05 test.txt
Here -h prints sizes in human-readable format automatically (e.g. 5.5K instead of 5632 bytes).
To extract just the size, pipe to awk:
ls -lh test.txt | awk '{print $5}'
5.5K
$5 specifies the 5th column, which contains the size.
We can use this in a script:
#!/bin/bash
echo "Enter file:"
read filename
size=$(ls -lh "$filename" | awk '{print $5}')
echo "$filename size: $size"
Output:
Enter file: test.txt
test.txt size: 5.5K
The size is now more visually understandable.
The du command
Another command that displays human-readable sizes is du:
du -h test.txt
5.5K test.txt
Piping to cut or awk can extract just the size:
du -h test.txt | cut -f1
5.5K
Using this in a script:
#!/bin/bash
read filename
size=$(du -h "$filename" | cut -f1)
echo "$filename is $size"
Choosing between ls and du
ls -lh and du -h can both get human-readable sizes. What's the difference?
- ls shows the apparent file size – the exact number of bytes in the file
- du shows disk usage – the space actually allocated on disk, which can differ because of block rounding or sparse files
So if you care about the exact byte size, ls (or stat) is the right tool; if you care about space consumed on disk, use du.
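The difference is easiest to see with a sparse file. The sketch below creates one with truncate (GNU coreutils) and compares the apparent size from stat against the allocated size from du; the exact du figure depends on the filesystem:

```shell
#!/bin/bash
# A 1 MB sparse file: large apparent size, almost no allocated blocks.
truncate -s 1M /tmp/sparse-demo

echo "apparent size: $(stat -c %s /tmp/sparse-demo) bytes"
echo "disk usage:    $(du -B1 /tmp/sparse-demo | cut -f1) bytes"

rm -f /tmp/sparse-demo
```

On most filesystems the second number is far smaller than the first, because no data blocks were ever written.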
Getting Sizes in KB, MB, GB etc.
The human-readable formats above display a mix of units – KB, MB, GB – depending on the actual size.
In some cases, you may want to force a specific unit, for example always display size in MB or GB only.
The easiest way is to:
- Get the size in bytes with stat
- Convert to the required unit
Here is a script to convert to MB:
#!/bin/bash
file="$1"
#Get size in bytes
size=$(stat -c %s "$file")
#Convert to MB
mb=$(awk -v bytes="$size" 'BEGIN {printf "%.2f", bytes / 1048576}')
echo "$file size: $mb MB"
To break this down:
- We get the original byte size with stat
- Use awk to convert bytes to MB (divide by 1024*1024)
- Display the size in MB only
You can customize this script to convert to KB, GB etc. as needed.
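One way to generalize this is a small helper function covering all three units; a sketch, where the function name to_unit and its interface are illustrative rather than any standard convention:

```shell
#!/bin/bash
# to_unit FILE UNIT – print a file's size in KB, MB, or GB with two
# decimal places (illustrative helper, not a standard command).
to_unit() {
  local bytes
  bytes=$(stat -c %s "$1") || return 1
  case "$2" in
    KB) awk -v b="$bytes" 'BEGIN {printf "%.2f KB\n", b / 1024}' ;;
    MB) awk -v b="$bytes" 'BEGIN {printf "%.2f MB\n", b / 1048576}' ;;
    GB) awk -v b="$bytes" 'BEGIN {printf "%.2f GB\n", b / 1073741824}' ;;
    *)  echo "unknown unit: $2" >&2; return 1 ;;
  esac
}

head -c 2048 /dev/zero > /tmp/to-unit-demo   # 2048-byte demo file
to_unit /tmp/to-unit-demo KB                 # prints "2.00 KB"
rm -f /tmp/to-unit-demo
```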
Getting Folder Sizes
The above techniques work great for individual files. But what about getting the total size for a folder containing multiple files and sub-folders?
The easiest method is using du:
du -sh folder1
243M folder1
This recursively totals the disk usage of everything inside the folder (-s gives a single summary line) and displays it in human-readable format (-h).
We can incorporate into a script:
#!/bin/bash
dir="$1"
size=$(du -sh "$dir" | cut -f1)
echo "$dir folder size: $size"
By recursively calculating sizes, du makes it easy to quickly check folder sizes from the terminal or use programmatically in scripts.
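When one folder dominates, it often helps to see how its immediate subdirectories contribute to the total. GNU du can limit the recursion depth (the /var/log path here is just an example):

```shell
#!/bin/bash
# Show the size of each immediate subdirectory plus the grand total,
# smallest first. --max-depth is GNU du; BSD/macOS du uses -d 1 instead.
du -h --max-depth=1 /var/log 2>/dev/null | sort -h
```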
Comparing Folders
Finding the size for one folder is handy, but sometimes you need to compare the sizes of multiple directories.
For example, to compare the sizes of folder1 and folder2:
du -sh folder1 folder2
243M folder1
364M folder2
We can extend this to a script that compares multiple folders:
#!/bin/bash
echo "Comparing folder sizes..."
for dir in "$@"; do
size=$(du -sh "$dir" | cut -f1)
echo "$dir: $size"
done
Save the script as folder-size.sh, make it executable with chmod +x folder-size.sh, then run it with space-separated folder names:
$ ./folder-size.sh folder1 folder2 folder3
Comparing folder sizes...
folder1: 243M
folder2: 364M
folder3: 1.1G
This makes it very easy to compare sizes across folders.
Finding the Largest Files/Folders
Another common task is finding the largest files or folders taking up space on a filesystem.
This helps identify candidates for clean up when disk space is low.
Here is a simple script to display the top 10 largest files in the current directory:
#!/bin/bash
files=$(ls -Ssh | head -11 | tail -n +2)
echo "Top 10 biggest files:"
echo "$files" | cat -n
Breaking this down:
- ls -Ssh sorts all files by size with human-readable sizes
- head -11 gets the top 11 lines (the "total" line plus 10 files)
- tail -n +2 strips that header row
- Print the top 10 lines numbered
For folders, pass the root directory to du:
du -sh /home/* | sort -h | head -n 10
This will recursively check all folders under /home, sort them by size, and print the 10 biggest.
The ability to sort and filter by size makes it easy to identify space hogs!
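Note that parsing ls output is fragile when filenames contain spaces or newlines. A more robust alternative for the largest files uses GNU find with -printf:

```shell
#!/bin/bash
# Print "size path" for every regular file under the current directory,
# numerically sorted largest first, keeping the top 10 (-printf is GNU find).
find . -type f -printf '%s %p\n' | sort -rn | head -n 10
```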
Continuously Monitoring Size
Sometimes you may want to monitor how a file or folder size progresses in real-time.
The watch command can repeatedly execute a command every 2 seconds and display updated results.
For example, to monitor a file size:
watch -n 2 "ls -lh test.txt | awk '{print \$5}'"
This will display the size of test.txt updating every 2 seconds, useful for catching rapid size changes.
For a folder:
watch -n 2 'du -sh downloads'
Here you can see the downloads folder size updating live!
Getting File Count per Directory
In addition to total size, another useful metric when working with batches of files is the total file count per folder.
The easiest way to get this is using a combination of ls with wc:
ls folder | wc -l
Breaking this down:
- ls lists all files and folders
- Piped to wc -l to count the total lines (-l counts lines)
We can use this to summarize file count per folder:
for dir in /home/*; do
count=$(ls "$dir" | wc -l)
echo "$dir: $count files"
done
This loops through all folders under /home, gets file count per folder using ls | wc, and displays the result.
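Keep in mind that ls | wc -l skips hidden files and miscounts filenames containing newlines. Where that matters, find gives a more robust count (the /etc path here is just an example):

```shell
#!/bin/bash
# Count entries directly inside a directory, including hidden ones.
# -mindepth 1 excludes the directory itself; -maxdepth 1 stops recursion.
find /etc -mindepth 1 -maxdepth 1 | wc -l
```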
Finding Files by Size Range
When working with large sets of files, you may want to search for files within a certain size range.
The find command has a handy way to filter files by size criteria.
For example, to find all files over 2 MB in size:
find . -type f -size +2M
Some more size range examples:
# Over 1 GB
find / -type f -size +1G
# Between 100 MB and 500 MB
find / -type f -size +100M -size -500M
# Up to 5 KB
find / -type f -size -5k
This makes it really easy to locate files based on size specifications.
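Once matched, the files can also be listed with details or acted on directly via -exec. For example, to show human-readable details for every large file found:

```shell
#!/bin/bash
# Show full ls details for every file over 2 MB under the current
# directory; {} + batches results into as few ls invocations as possible.
find . -type f -size +2M -exec ls -lh {} +
```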
Converting Between Units
Sometimes you need to convert a file size between different units like MB to GB.
A quick way is to use Bash arithmetic expansion:
mb=2048
gb=$((mb/1024))
echo "$mb MB = $gb GB"
# Output: 2048 MB = 2 GB
Simply divide by 1024 to convert between units. Note that Bash arithmetic is integer-only, so any remainder is truncated – 1536 MB would come out as 1 GB.
For decimal places, use bc:
mb=1024
gb=$(echo "scale=3;$mb/1024" | bc)
echo "$mb MB = $gb GB"
# Output: 1024 MB = 1.000 GB
So whether whole numbers or decimals, Bash makes conversions easy.
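GNU coreutils also ships numfmt, which converts between raw byte counts and human-readable units in either direction without any manual arithmetic:

```shell
#!/bin/bash
# numfmt (GNU coreutils) formats byte counts as IEC units and back.
echo 1048576 | numfmt --to=iec-i --suffix=B   # bytes -> human-readable
echo 1.5K    | numfmt --from=iec              # human-readable -> bytes
```

Combined with stat, this gives a one-liner for a human-readable file size: stat -c %s file | numfmt --to=iec-i --suffix=B.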
Achieving More With Size in Scripts
While getting the file size is useful in itself, the real power comes from incorporating it into scripts to automate tasks.
Here are just a few examples of what you can achieve:
- Cleanup scripts – Delete files over a certain size
- Archivers – Compress folders once they reach X GB
- Log Rotation – Rotate when log files reach a size threshold
- Backups – Incremental backups based on file change size
- Notifications – Email admins if a folder hits 90% capacity
- Fileservers – Block uploading files above a certain size
And many more!
No matter what the use case, you have all the basic building blocks to get, format and leverage file sizes.
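As one concrete sketch of the log-rotation idea from the list above, the script below renames a log once it exceeds a threshold; the LOGFILE path and MAX_BYTES value are purely illustrative:

```shell
#!/bin/bash
# Sketch of size-based log rotation; LOGFILE and MAX_BYTES are
# illustrative values, not any real daemon's convention.
LOGFILE="/tmp/app.log"
MAX_BYTES=$((10 * 1024 * 1024))   # 10 MB threshold

if [ -f "$LOGFILE" ]; then
  size=$(stat -c %s "$LOGFILE")
  if [ "$size" -gt "$MAX_BYTES" ]; then
    mv "$LOGFILE" "$LOGFILE.$(date +%Y%m%d%H%M%S)"
    : > "$LOGFILE"                # start a fresh, empty log
    echo "rotated $LOGFILE ($size bytes)"
  fi
fi
```

A real deployment would usually delegate this to logrotate, but the size check itself is exactly the stat pattern from earlier in this guide.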
Key Takeaways
Getting and using file sizes in Bash is invaluable both for admins and developers. Here are some key tips:
- Use stat for getting the byte size to use programmatically
- Leverage ls, du, find for human-readable sizes
- Format and convert units to suit your use case
- Integrate size checks into scripts to automate management and monitoring
- Sorting and filtering by size helps identify large space hogs
- Continuously watch size changes to catch anomalies
- Compare directory sizes for easy visualization
Follow the examples in this guide and you will be able to handle all common file size operations in Bash.
In addition to CLI usage, languages like Python, Perl, and Go expose the same stat information through their standard libraries, which you can use to build more complex applications.
I hope you found this guide useful! Let me know if you have any other file size use cases or questions.


