
I've got a big disk, a few TB in size (5 TB), and I need to fill it to about 95%. This is a Linux CentOS box.

I've tried dd as in

dd if=/dev/zero of=o_File1.img bs=209715200 count=1000
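# for reference: bs=209715200 bytes = 200 MiB per block, so count=1000 writes ~210 GB per run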

I even tried running it in a loop, like this:

#!/bin/bash

count_=0
parallel_=10

BS=2097152000      # ~2 GB per block
COUNT=100          # 100 blocks => ~210 GB per file, ~2.1 TB across 10 files

# launch $parallel_ dd writers in the background
while [ "$count_" -lt "$parallel_" ]
do
        file="Dummy_DD_BS_${BS}_$(date +%Y%m%d%H%M%S)_${count_}.img"
        echo "$file"
        dd if=/dev/zero of="$file" bs="$BS" count="$COUNT" > /dev/null 2>&1 &
        count_=$((count_+1))
done
wait    # don't exit until all dd jobs have finished

but 5 TB seems like a hell of a lot.

Is there a faster method? I'm okay with any script (Python/Perl/Bash or anything); I'm looking for a fast, optimized solution.
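For reference, by "fill to about 95%" I mean something like the following; a minimal sketch of the target calculation, assuming GNU coreutils df (the /data mount point and the variable names are just placeholders):

    #!/bin/bash
    # work out how many bytes must be written to bring the filesystem to ~95% used
    mount_point=/data                                        # placeholder; adjust to the real mount
    total=$(df -B1 --output=size "$mount_point" | tail -1)   # filesystem size in bytes
    used=$(df -B1 --output=used "$mount_point" | tail -1)    # bytes already in use
    target=$(( total * 95 / 100 - used ))                    # bytes still to write
    echo "need to write $target bytes to reach 95%"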

Edit:

Okay, so following the suggestions in quickly-create-a-large-file-on-a-linux-system did not really solve much of anything.

Case 1: dd

    time dd if=/dev/zero of=filename10 bs=1G count=10
    10+0 records in
    10+0 records out
    10737418240 bytes (11 GB) copied, 34.0468 s, 315 MB/s

    real    0m34.108s
    user    0m0.000s
    sys     0m13.498s

so at 315 MB/s, filling 5 TB works out to roughly 16,000 seconds, i.e. around 4.5 hours

Case 2: fallocate

 fallocate failed: Operation not supported
   it's a custom file system, not the popular ext*

Case 3: truncate

   this is the weirdest one:
   it doesn't really fill the disk; it seems to just reserve the space for the file (a sparse file), so it definitely doesn't suit my requirements
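You can see the sparse behaviour directly by comparing the file's apparent size with the blocks actually allocated; a quick check (the filename is just an example):

    truncate -s 1G sparse.img
    ls -lh sparse.img    # apparent size: 1.0G
    du -h sparse.img     # actual disk usage: 0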

So, is there a better solution, or do I have to live with dd (or create a smaller partition to test this)?

asio_guy

2 Answers

Try this:

  fallocate -l 25G file
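If the filesystem supports it, fallocate allocates the blocks almost instantly, since no data actually has to be written. A minimal sketch for the 95% goal, reusing the $target byte count computed in the question (the path is a placeholder):

    # preallocate the remaining bytes in one shot
    fallocate -l "$target" /data/fill.img
    df -h /data    # confirm usage is now ~95%

Note, though, that the question's edit reports "fallocate failed: Operation not supported" on this particular filesystem.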

abhishek phukan

Try a zip bomb. You can probably download one from various places online.
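In the same spirit, since long runs of zeros compress to almost nothing, you could generate such a file locally instead of downloading one; the sizes here are just examples:

    # 10 GiB of zeros compresses down to a few MiB
    dd if=/dev/zero bs=1M count=10240 | gzip -1 > zeros.gz
    # expanding writes the full 10 GiB back to disk
    gunzip -c zeros.gz > fill.img

Keep in mind that decompression still has to write every byte, so this saves transfer size but not raw write time.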

omu_negru