Concurrent deployment limit for a specific package?
Is there any way to set a concurrent deployment limit for a specific package? Or maybe some other way to stagger a deploy?
I have a package that tells all our servers to go check for updates and to download them. This can overload our ASA and WSUS server if 100+ machines all try to download at the exact same time. Is there any way to avoid this?
I thought of either:
1) Set the concurrent limit to 10 servers at a time and let them fire off more as they complete. However, I only see a global setting and nothing that can be customized per package or per schedule.
2) Set some sort of random offset for each job. This wouldn't be quite as good, since we could still end up with a whole bunch going at nearly the same time, but I guess it would at least help a bit.
Any other ways to do this?
Comments
I believe concurrent deployment settings are only global. One recommendation I've heard for staggering deployments is to make multiple collections of the targeted computers in PDQ Inventory, then make a separate schedule targeting each collection, changing the scheduled times accordingly (e.g., schedule 1 on Tuesday, schedule 2 on Wednesday, etc.).
Another option is to set up another instance of Deploy that only handles this task. Licensing is per admin, so you can have as many instances of Deploy and Inventory as you want.
Thanks for the ideas. Based on something I saw online, I ended up creating a new package called Random Delay that just has a single PowerShell line: Start-Sleep -Seconds (1..600 | Get-Random). I can then create a schedule or a Deploy Once that runs that package first and the actual package second. You can change the 600 to whatever number you want if you need a longer window.
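In case it helps anyone, here's the same step written out a bit more verbosely. The $MaxDelaySeconds variable is just a name I made up so the window is easy to tweak; it does the same thing as the one-liner.

    # Random Delay package step (PowerShell) - same idea as the one-liner,
    # just with the maximum delay pulled out into a variable.
    # $MaxDelaySeconds is my own name for it; 600 seconds = 10 minutes.
    $MaxDelaySeconds = 600

    # Pick a random delay between 1 and $MaxDelaySeconds seconds, then wait.
    $delay = Get-Random -Minimum 1 -Maximum ($MaxDelaySeconds + 1)
    Write-Output "Sleeping $delay seconds before kicking off the update check..."
    Start-Sleep -Seconds $delay

If you have a timeout configured on the package step, just make sure it's longer than the maximum delay so the step doesn't get cut off mid-sleep.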
Seems to work pretty well so far.