I have a small maintenance window and low bandwidth to remote locations, so when copying something huge like a Windows Feature Update, it's impossible to finish within that window.
So I made a package that runs a command line to XCOPY the files, and gave it a custom timeout of 5 hours. Originally, it would churn for those 5 hours and then fail with "Exceeded timeout for completion." The next night, it would pick up where it left off and fail again after 5 hours. Eventually, 3-5 days later, all the files would be copied over. Perfect. That was in July '21.
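For context, the command-line step runs a single XCOPY invocation along these lines (the paths here are placeholders, not my actual package, and the switch list is illustrative):

```shell
REM Hypothetical example of the command-line step; server and staging
REM paths are placeholders.
REM /D  skip files that already copied on an earlier night
REM /E  include subdirectories, even empty ones
REM /C  continue copying after individual file errors
REM /Z  copy in restartable mode
REM /Y  suppress overwrite-confirmation prompts
xcopy "\\HQSERVER\Updates\FeatureUpdate\*" "C:\Staging\FeatureUpdate\" /D /E /C /Z /Y
```

With /D, each night's run only transfers files that haven't already made it over, which is what lets the copy resume across multiple maintenance windows.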
But now the same package blows past the timeout setting. I discovered this when I came in one morning to many complaints from the remote sites about not being able to do business: the package was still running 9 hours in and saturating their bandwidth.
Did something change in a recent version? The help file says, "This timeout applies only to the duration of a deployment to a target computer. The timeout does not include the initial process of copying installation files to the target." Fair enough. But I'm not using the embedded "File Copy" step; I'm using a Command Line step. Does Deploy detect that the step is copying files and therefore suspend the timeout I have in place?
If so, is there any way around that? This is the only means I have to get gigabytes' worth of files to my remote sites... chunks at a time over a number of days.