Auto update collections?
Is there a way to set all of my collections to auto update on a schedule?
We currently have just under 25,000 endpoints in our inventory and having to wait for the collection membership (some of them have over 20,000 members) to update every time I click on one of them is not going to work for us.
Please contact firstname.lastname@example.org or create a ticket here.
They have a customer build that addresses some performance issues.
I'm currently on the customer build (v15 release 1 build 74), and while it has helped, it still takes a long time (45 minutes plus) to update some of the collections since they are so large.
It's not the VM we are running it on (8 cores, 16 GB of memory). The CPU / memory / disk I/O limits of our VM are not even being touched, so I'm guessing this slowness is a result of using only SQLite for the database.
So auto updating collections is not an option? If so, this unfortunately will be a deal breaker for us, as I don't have time to click on a collection and then wait 10-45 minutes for it to update its membership.
I agree with Robert on this issue. While we don't have the quantity of devices in Inventory that he does, it's quite frustrating to go to a collection after a deployment has run only to see that it hasn't been updated. Being told to click on each collection to trigger a refresh isn't an ideal situation. I hope this is something that is addressed in a future release. I'm running Inventory v15 Release 1 build 72.
The collections are updated in a background process that constantly runs. Large environments take a little longer. However, when a collection is referenced it is pushed to the front of the refresh queue. “Referenced” means selecting the collection in the console, running a deployment to the collection via schedule, choosing deployment targets by collection, Auto Reports that reference the collection, etc. In fact, all collection membership is processed this way. To verify this, you can run a deployment to a collection. Doing this will cause the collection membership to be updated. Look at the targets of the deployment, then open that same collection in PDQ Inventory; the members you see should match the deploy targets.
Shane, I think we are talking about two different things...
While the collections themselves might be updated for reports and deployments in the database, the total numbers and lists shown inside the console itself are not updated until we click on each collection. I would like not only the collection database to be updated, but also the numbers and list of endpoints shown for each collection in the console, without me having to click (and wait and wait) on each collection.
With 25,000+ endpoints, it often takes hours for a collection's numbers and list of endpoints to update inside the console.
Instead of showing every single endpoint in a collection list in the console, why not just show 100 on a page and paginate the rest? It sure would help those of us with very large collections.
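To illustrate the pagination idea: since PDQ's actual schema isn't public, here's a minimal sketch against a made-up SQLite table (the `computers` table, column names, and `fetch_page` helper are all invented for illustration) showing how a console could fetch 100 rows per page instead of materializing a 25,000-member list at once.

```python
import sqlite3

PAGE_SIZE = 100  # show 100 endpoints per page instead of the whole collection

# Hypothetical schema standing in for an inventory database; PDQ's real
# schema is not public, so the table and column names here are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE computers (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO computers (name) VALUES (?)",
    [(f"PC-{i:05d}",) for i in range(25_000)],
)

def fetch_page(page: int) -> list:
    """Return one console page of a large collection via LIMIT/OFFSET."""
    rows = conn.execute(
        "SELECT name FROM computers ORDER BY name LIMIT ? OFFSET ?",
        (PAGE_SIZE, page * PAGE_SIZE),
    ).fetchall()
    return [name for (name,) in rows]

first_page = fetch_page(0)   # 100 rows, not 25,000
last_page = fetch_page(249)  # final full page of a 25,000-row collection
```

The point is that each page is a small, fast query even against SQLite; the cost of rendering stays constant regardless of how large the collection grows.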
Robert, this is the response that I got from PDQ support when I asked about Inventory collections showing the correct number. I'm not sure if the undocumented feature will help in your environment or not.
Collection processing (in the GUI) is a background task and can vary in the time it takes depending on how many machines are in the console and the queue depth to move the machines.
You can prioritize, or speed up the processing of a particular collection by selecting it.
The DB update is instant, so any deploy conditions would still be correct, the GUI just takes a back seat to let other tasks like scanning and deploying go faster.
There is an undocumented feature (not documented because it does not exist on all builds, and no guarantee it stays in future builds) to force garbage collection and speed things up.
From the main console window of Deploy or Inventory you can give this key combo: CTRL+ALT+SHIFT+F12. You won't see any dialog boxes or any indication that it's working, but it forces a run of the garbage collection. That may help speed things up.
I'll give that a try, thanks!
Well, the "secret" hot key combo made no difference.
I've spent much more time on this issue (cost wise) than I've spent on the software.
There comes a point in time where I need to cut my losses and move on, so we will now continue to use Desktop Central for our inventory needs and drop our review of PDQ.
Thanks, everyone, for trying to help. Maybe once the PDQ Inventory console is able to update (and reflect the correct info) in the background for large networks like ours, we will take another look.
Thank you for agreeing to work with Jason on this issue. We value feedback from admins like yourself who manage large numbers of machines. We have been able to make tweaks to the applications based off of feedback like this and we will continue to do so.
There will likely need to be some hardware, and configuration changes on your end with 25,000+ machines, most likely the machine will need more RAM, and SSD based storage to keep up.
This is not a hardware issue. We have the exact same VM hardware configuration running Desktop Central. It can show a collection with 20,000+ endpoints in less than 10 seconds while PDQ takes hours (and probably more, because I end up giving up after waiting a few hours).
The difference between Desktop Central's collection view and PDQ's? Desktop Central paginates the list of shown endpoints to only show 100 on each page and uses a more robust database (MS SQL Server). It's as simple as that.
When I click on a collection in Desktop Central only the first 100 endpoints are shown in the collection list, when I click on a collection in PDQ all 20,000+ endpoints are shown in the collection list.
While showing every endpoint in a list might work for a collection of a few hundred (and maybe even a few thousand) endpoints, it's just not reasonable to expect a list of 20,000+ endpoints to populate in a reasonable time with your current database choice.
My suggestion is to offer the ability to show only a subset of endpoints in the collection list instead of the "show every endpoint in the collection" default you have now.
Desktop Central shows 100 endpoints on a page in (let's go for the max) 10 seconds.
That's .1 second for each endpoint.
So even if PDQ could pull the same info (albeit using an inferior database), 25,000 endpoints at .1 second each would take about 41 minutes to load.
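A quick check of that arithmetic, using only the 10-seconds-per-100-endpoints figure quoted above:

```python
# Back-of-envelope check of the figures above: 100 endpoints per 10 seconds.
seconds_per_endpoint = 10 / 100          # 0.1 s each
endpoints = 25_000
total_minutes = endpoints * seconds_per_endpoint / 60
# roughly 41.7 minutes to render the full list at that rate
```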
Now factor in that PDQ uses SQLite while Desktop Central can use the much more robust (and faster) MS SQL Server, which makes PDQ even slower when pulling data.
With the combination of showing every endpoint on every page and using an SQLite database, there is no way in its current form that the PDQ console can hope to come close to being usable for a 25,000+ endpoint collection, no matter how many SSDs or how much memory you throw at it.
I feel as if I can interject here. Where is this collection coming from? Is this the Active Directory tree or a true Collection such as All Computers defined in Inventory?
If I scale back the filter on my OU so I'm looking at my entire university instead of my specific campus, I can traverse all 36,000 computer objects in approximately 3 minutes.
Perhaps you need to shift your focus slightly in what you are trying to do and leverage Active Directory a little more? That negates your issues with the performance of SQLite.
And for what it's worth, it would be nice if a future release provided the option to use the built-in database mechanism or to connect to your own SQL infrastructure if you already have one. I see why they do what they do: the small shops they have as customers likely don't have the budget and/or expertise to run a fully fledged DB environment.
I've been using this product for a good while now and know the ins and outs pretty well. Perhaps we can talk offline (if that's ok with you PDQ) so I can get a full understanding and then come up with a solution that may work for you.
The collection is coming from a true collection such as a group of computers with a certain version of an application.
Just an example; the numbers might not be exact.
I can have 14,000 computers with the latest version of Chrome, over the weekend I then deploy the latest version of Chrome to an additional 6,000 endpoints.
On Monday when I come into the office the latest version of Chrome collection number should say 20,000 endpoints without me even having to click on the collection. But, it has in fact not changed at all (even though I told the weekend 6,000 endpoint deployment to do a new inventory scan after the upgrade).
I then click on the collection and then have to physically sit and watch the collection number and list slowly grow because it's going through all 25,000+ endpoints to see what version of Chrome they are using by pulling that information from the database.
I've been told by PDQ support that the collection numbers are supposed to update in the background (albeit at a low priority) without having to view them, but for me this has not been the case. Unless I have a collection group open in the console, the number of endpoints in the collection is never updated. If I close the console and then log back in at a later date, the numbers are always the same as the last time I closed it.
My biggest issue is not really with the database itself. My reports are always up to date; it's how the console gets its information from the database that seems to be the bottleneck.
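That distinction (database current, console rendering slow) can be illustrated with a toy SQLite example. The schema, the `chrome_version` column, and the version strings are all invented for illustration, not PDQ's actual design:

```python
import sqlite3

# Toy stand-in for an inventory database; schema and values are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE computers (name TEXT, chrome_version TEXT)")
conn.executemany(
    "INSERT INTO computers VALUES (?, ?)",
    [(f"PC-{i:05d}", "latest" if i < 20_000 else "old") for i in range(25_000)],
)

# A dynamic-collection count is a single cheap aggregate query...
(count,) = conn.execute(
    "SELECT COUNT(*) FROM computers WHERE chrome_version = 'latest'"
).fetchone()

# ...whereas rendering the console list means materializing every matching
# row, and that is the part that scales badly with collection size.
members = conn.execute(
    "SELECT name FROM computers WHERE chrome_version = 'latest'"
).fetchall()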
I'd be more than happy to talk with you on the phone to see if we can come up with a solution. I am in the UTC/GMT -7 (Arizona, USA) Time zone. My number is 480-812-7690 and I'm usually available between 8:00am-3:00pm
Understood. I should have some availability tomorrow afternoon/evening to give you a call. I have a couple ideas rattling around that I'd like to run by you, as well as some troubleshooting stuff we could try if support has not gotten to it yet. I'd hate for this to be the reason you leave the software for something else, as it is truly convenient how tightly the two integrate.
Did you manage to figure the issue out? I appear to have the same issue where I have to click on a collection to have it update (rather annoying), and I only have 600 devices. It does not seem to update in the background as you mentioned it should. We are close to making a decision on purchasing the enterprise version of both products, but this is a bit of a stopper.
So this is a few months old, but for anyone else looking: I think he means updating membership of the dynamic groups. The actual groups are up to date in the database, but a group will show, say, 100 next to it; when you select the group it updates membership and then displays 60 or 150 or whatever the actual correct number is, and the group itself drops or adds computers to reach the current correct number. I have been looking for this as well and so far have not come up with a good solution.
Is anyone aware of a solution in the works for this issue? I work as an IT admin for an enterprise-level corporation. We are using PDQ Inventory to monitor and inventory desktops/laptops at multiple sites in over 8 countries. Having to open a dynamic collection and wait for the membership to update on EACH collection is not conducive to the core purpose of the PDQ platform, especially when you have 10-20K machines spread out over many dynamic collections and multiple PDQ servers. Not to mention the issue this causes when trying to automate reporting. Any updates on this issue would be great!