
Un tar xz

I dumped the system image from the SD card to the SSD with the dd command:

sudo dd bs=4M if=/dev/mmcblk0 > ~/Desktop/sd-dump-rpi3b+-strech.img

or, optionally, using full dd syntax:

sudo dd bs=4M if=/dev/mmcblk0 of=~/Desktop/sd-dump-rpi3b+-strech.img

In this way I bypassed the data-transfer bottleneck that the SD card is, and could make maximum use of the CPU threads, the RAM and the SSD (read/write ~540 MB/s).

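As a side note, with GNU coreutils dd can report its throughput while the dump is running (a small sketch using the same device and path as above; status=progress is the only addition):

sudo dd bs=4M if=/dev/mmcblk0 of=~/Desktop/sd-dump-rpi3b+-strech.img status=progress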

I deliberately did not compress on the fly (using a pipe), because of the limited read speed of the SD card in my laptop (max ~28 MB/s).
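For comparison, the on-the-fly variant I avoided would look roughly like this sketch (my own illustration, not a command from this setup); the pipe ties the whole run to the ~28 MB/s card reader:

sudo dd bs=4M if=/dev/mmcblk0 | xz -8e -T 8 -M 7000MB > ~/Desktop/sd-dump-rpi3b+-strech.img.xz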


The maximum compression depends on the capabilities of the hardware on which you want to run it. Maximum compression drastically extends the duration of the process and generates a heavy load on hardware resources. For this reason it is not recommended to maximize the use of resources (CPU/RAM/disk) on a server, so as not to slow down the other services running on it. It is worth taking into account the compromise between the degree of compression and its duration/system load. In my case I used xz on a laptop (hence I could use the maximum hardware capabilities), with the parameters chosen to make the most of the CPU threads, the RAM limit and the disk performance. I chose the compression level experimentally; it worked best (for me) with the DictSize = 32 MiB option (the dictionary size used by preset -8). Below is the command used:

xz -k -8e -M 7000MB -T 8 -v sd-dump-rpi3b+-strech.img


  • -k - keep mode - do not delete the input file after compression.
  • -8e - compression preset 8 with the extreme modifier.
  • -M 7000MB - limit xz's memory usage to ~7000 MB.
  • -T 8 - use 8 CPU threads.
  • -v - verbose mode - show progress while compressing data.
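Restoring goes in the opposite direction (a sketch, assuming the card appears again as /dev/mmcblk0); -k again keeps the archive:

xz -d -k -v sd-dump-rpi3b+-strech.img.xz
sudo dd bs=4M if=sd-dump-rpi3b+-strech.img of=/dev/mmcblk0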

It is worth noting that the SD card used has a capacity of 32 GB. The card dump weighs ~29 GB before compression and ~1.7 GB after compression; the empty card space (~28.4 GB) was compressed together with the ~3.6 GB of data, mainly binary files. I deliberately skipped trimming the free space from the image, because while it was being compressed I noticed a rapid drop in the compression time and a momentary increase in SSD usage, up to ~266 MB/s in bursts. Taking 3.6 down to 1.7 gives a little over 50% compression of the actual data, which is a satisfying effect with a compression time of ~15 minutes.

It is also worth mentioning that at a high level of compression, a large number of CPU threads (e.g., 8 threads at -9e in my case) and the available RAM may not be usable properly: xz reduces the number of threads when they would exceed the memory usage limit.
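Both the ratio and the memory behaviour can be checked afterwards (a sketch; xz -l and xz --info-memory are standard xz options, and the file name is the one produced above):

xz -l sd-dump-rpi3b+-strech.img.xz   # compressed vs. uncompressed size and ratio
xz --info-memory                     # RAM and the memory limits xz will respect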


If you have 16 GiB of RAM (and nothing else running), you can try:

tar -cf - foo/ | xz --lzma2=dict=1536Mi,nice=273 -c - > foo.tar.xz

This will need 1.5 GiB for decompression, and about 11x that for compression. Adjust accordingly for lesser amounts of memory. This will only help if the data is actually that big, and in any case it won't help THAT much, but still. If you're compressing binaries, add --x86 as the first xz option. If you're playing with "multimedia" files (uncompressed audio or bitmaps), you can try with --delta=dist=2 (experiment with the value; good values to try are 1 to 4). If you're feeling very adventurous, you can try playing with more LZMA options, like --lzma2=dict=1536Mi,nice=273,lc=3,lp=0,pb=2 (these are the default settings; you can try values between 0 and 4, and lc+lp must not exceed 4). In order to see how the default presets map to these values, you can check the source file src/liblzma/lzma/lzma_encoder_presets.c. Nothing of much interest there though (-e sets the nice length to 273 and also adjusts the depth).
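The matching decompression (a sketch, assuming the archive name from the pipeline above) streams the archive back through tar and needs only the ~1.5 GiB mentioned for the dictionary:

xz -dc foo.tar.xz | tar -xf -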










