As some of you may do, I'm using a combination of dd, gzip and curl to make images of my machines for later deployment to multiple machines. The command I use is as follows:

dd if=/dev/sda | gzip -cf | curl -u USER:PASS ftp://HOST/PATH/image.gz -T -

Everything goes great for the first 2 GB or so, and I get speeds of around 10-20 Mbps between the machine I'm imaging and the FTP server. After that, however, the speed drops to about 1 Mbps and never recovers.
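One way I've tried to narrow down where the slowdown happens is to time the stages in isolation on a bounded sample first. This is only a sketch, assuming GNU dd and gzip; the `count=` cap and `/dev/zero` source are just for a quick test run, and the file names are made up:

```shell
# Read a bounded 64 MiB sample so each stage can be timed on its own.
# (For a realistic compression ratio, substitute a slice of /dev/sda.)
dd if=/dev/zero bs=1M count=64 of=sample.img 2>/dev/null

# Stage 1: compression alone -- if this is slow, gzip is the bottleneck.
time gzip -c sample.img > sample.gz

# Stage 2: upload alone -- if this is slow, the FTP link is the bottleneck.
# (Commented out here; fill in your own credentials and host.)
# time curl -u USER:PASS ftp://HOST/PATH/sample.gz -T sample.gz

# Sanity-check the archive before trusting it for deployment.
gzip -t sample.gz && echo "archive OK"
```

If the `pv` utility is available, dropping it between the pipe stages of the real command also gives a live throughput readout per stage.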

Is this an issue with gzip only being able to feed curl at that pace, or is it sub-par performance from my FTP server? Any tips for optimizing each program? Perhaps adding a "bs=1k" flag to dd, or lowering the compression level on gzip? The goal here is a balance between speed and file size (as it always is). I have plenty of disk space, so more speed is what I'm after.
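For reference, the knobs I'm considering look like this. This is only a sketch, not a measured fix: the `bs=1M` block size and the `-1` compression level are my own assumptions, not values from testing:

```shell
# Hypothetical tuned pipeline (bs=1M and -1 are assumptions, not measured):
#   bs=1M   : read the disk in 1 MiB chunks instead of dd's 512-byte default
#   gzip -1 : fastest compression level, trading file size for speed
# dd if=/dev/sda bs=1M | gzip -1 -cf | curl -u USER:PASS ftp://HOST/PATH/image.gz -T -

# Quick local check that the faster settings still produce a valid archive
# (bounded /dev/zero input stands in for the disk here):
dd if=/dev/zero bs=1M count=16 2>/dev/null | gzip -1 -cf > image-test.gz
gzip -t image-test.gz && echo "archive OK"
```

If compression turns out to be the bottleneck, pigz is a drop-in parallel replacement for gzip in this pipeline, assuming it's installed.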