[Info-vax] SCP2 and large files
Jan-Erik Söderholm
jan-erik.soderholm at telia.com
Thu Aug 13 03:22:56 EDT 2020
Den 2020-08-13 kl. 02:31, skrev Jeremy Begg:
> On 13/8/20 02:13, Phillip Helbig (undress to reply) wrote:
>> In article <rh03ak$121p$1 at gioia.aioe.org>, Jeremy Begg
>> <jeremy at vsm.com.au> writes:
>>
>>> I think where it's going wrong is step 2. The stat() on the 500GB
>>> saveset takes a very long time (seems to be about 90 minutes), by which
>>> time the Linux end has given up waiting for the transfer to begin.
>>
>> Why not do ZIP/VMS/LEVEL=9 on the saveset?
>>
>
> Because it takes too long.
>
> This backup job initially used BACKUP/DATA=COMPRESS but the time required
> to compress the savesets meant the daily backup job was taking over a day
> to run!
>
> I know BACKUP/DATA=COMPRESS uses Zlib not Info-ZIP but I would be very
> surprised if one turned out to be twice as fast as the other. I'm also not
> sure that it would yield a useful improvement in the file size.
>
> Jeremy
>
>
BACKUP/DATA=COMPRESS is only worthwhile if you have spare CPU cycles.
It uses more CPU to produce a smaller saveset. If you are already
CPU constrained, it will increase the total run time, yes.
The space saved can be estimated from the kinds of files being backed up,
or you can simply test on a few sample files.
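As a rough way to test on sample files, a short script can stream a file
through zlib (the same compression family BACKUP/DATA=COMPRESS uses) and
report the ratio. This is just an illustrative sketch, not anything from
BACKUP itself; the function name and default level are my own choices.

```python
import zlib

def compression_ratio(path, level=6, chunk=1 << 20):
    """Estimate the zlib compression ratio of a file by streaming it.

    Returns compressed_size / original_size (smaller is better).
    Streaming in chunks keeps memory flat even for multi-GB files.
    """
    comp = zlib.compressobj(level)
    raw = packed = 0
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            raw += len(data)
            packed += len(comp.compress(data))
    packed += len(comp.flush())
    return packed / raw if raw else 1.0
```

Running it over a few representative files from the backup set should
give a reasonable idea of whether compression is worth the CPU cost.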
More information about the Info-vax mailing list