hammer dedup in HEAD now has a memory limiting option
Matthew Dillon
dillon at apollo.backplane.com
Wed Aug 3 22:27:29 PDT 2011
The hammer dedup and dedup-simulate directives in HEAD now accept
a memory use limit option, e.g. '-m 200m'; the default limit is 1G.
If the dedup code exceeds the limit it automatically restricts the
CRC range it collects information on and loops as many times as
necessary to dedup the whole disk.
This should make dedup viable inside qemu or other virtual
environments.
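The restrict-and-loop behavior described above can be sketched roughly
as follows. This is a toy Python illustration only, not HAMMER's actual
C code: dedup_passes, crc_of, and the entry-count limit are invented
stand-ins (the real '-m' limit is in bytes), but the shape of the loop
is the same: if the CRC table for a pass would blow the memory budget,
halve the CRC range covered per pass and keep looping until the whole
CRC space has been scanned.

```python
# Illustrative sketch only -- not HAMMER's implementation.
def dedup_passes(blocks, crc_of, mem_limit_entries, crc_bits=32):
    """Return groups of blocks sharing a CRC, using bounded memory.

    blocks: iterable of block identifiers (hypothetical stand-in for
    on-disk data blocks); crc_of: maps a block to its CRC;
    mem_limit_entries: max table entries per pass (stand-in for the
    byte-based '-m' limit); crc_bits: width of the CRC space.
    """
    space = 1 << crc_bits
    lo = 0
    span = space              # first try to cover the whole CRC space
    groups = []
    while lo < space:
        hi = min(lo + span, space)
        table = {}
        for b in blocks:
            c = crc_of(b)
            if lo <= c < hi:
                table.setdefault(c, []).append(b)
            if len(table) > mem_limit_entries:
                span //= 2    # table too big: halve the range, redo pass
                break
        else:
            # Pass fit within the limit: record dups, advance the range.
            for dups in table.values():
                if len(dups) > 1:
                    groups.append(dups)
            lo = hi
    return groups
```

For example, with a limit of 3 table entries and blocks whose CRCs
collide mod 3, everything fits in one pass; with a tighter limit the
function keeps halving the per-pass CRC range and makes multiple passes
over the same blocks, trading extra scanning for bounded memory.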
A few minor I/O optimizations were also made to pre-cache the b-tree
metadata blocks and to let the dedup code skip past already-deduped
areas more quickly. Initial dedup runs will still take a long time.
^T is now supported during hammer dedup runs so you can see the
progress. The run has to pre-scan the b-tree first, but once it gets
into the actual dedup work ^T gives a good indication of how far along
it is. ^C, which was previously being ignored, now works as well.
-Matt
Matthew Dillon
<dillon at backplane.com>