Hacker News

While this is out of the range of most consumers, I wonder if any bored sysadmins with a new storage system to test have tried unzipping that file...



It'd be easier to do something like cat /dev/urandom > big


/dev/zero is probably faster


It is way faster (at least with dd):

  $ time dd if=/dev/zero of=10MB.dat  bs=1M  count=10

  real    0m0.213s

  $ time dd if=/dev/urandom of=10MB.dat  bs=1M  count=10

  real    0m8.873s


Or if you want to measure the speed of the source itself:

  $ dd if=/dev/zero of=/dev/null bs=1M count=100
  104857600 bytes (105 MB) copied, 0.0237114 s, 4.4 GB/s

  $ dd if=/dev/urandom of=/dev/null bs=1M count=100
  104857600 bytes (105 MB) copied, 21.501 s, 4.9 MB/s
Also dammit Ubuntu with your Gibis.


Or

  $ truncate  -s 17TB hugefile.dat
...which is just as pointless.


I believe truncate creates sparse files on Linux, so that would not work as a storage test.
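You can see the sparseness directly by comparing the apparent size with the blocks actually allocated. A minimal sketch, assuming GNU coreutils on a filesystem that supports sparse files (the file name is just an example):

```shell
# truncate extends the file's apparent size without writing any data,
# so no blocks are allocated on a sparse-file-capable filesystem.
truncate -s 1G sparse.dat
stat -c %s sparse.dat          # apparent size: 1073741824 bytes
du --block-size=1 sparse.dat   # blocks actually allocated: near 0
rm sparse.dat
```

The apparent size is 1 GiB, but almost nothing is written to disk, which is why truncate tells you nothing about the storage system's write speed.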


I've tried it, and I'm no sysadmin. Fortunately, I had a disk quota set by my sysadmin, so the bomb could only take up the 2 GB of space allowed by my quota.
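The quota is what saved the poster: zip bombs work because their payload is long runs of identical bytes, which deflate compresses by roughly three orders of magnitude, so a tiny archive can expand to fill whatever space is available. A rough sketch of the ratio (using gzip on zeros rather than an actual zip bomb):

```shell
# 10 MiB of zeros compresses down to roughly 10 KB with deflate,
# a ratio of about 1000:1 — nest that a few levels and you get
# a few hundred KB that unpacks to terabytes.
head -c 10485760 /dev/zero | gzip -9 | wc -c
```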




