Max number of computers?

Comments

12 comments

  • Shane Corellian

    Yes, we do have plans to allow other relational databases to be used. 

    In the meantime, the limitation isn't with the database (SQLite) as much as it is with the Console machine specs. How much memory do you have on the console machine? 

  • Barrett Puschus

    I have 8 GB of RAM, Win 7 64-bit.

  • Adam Ruth

    Most of the performance tuning in the current version of the software has been geared toward around 500 computers.  We're currently working on performance improvements for larger numbers.  Most of the performance issues seem to revolve around picking up changes written to the database by the scanning service and loading them into the interface; there are a lot of things we can do here to make it faster.

    Where, in particular, do you notice most of the sluggishness?  Is it just a general slowdown, or do you notice it more in certain places or during certain operations? 

  • Barrett Puschus

    Mostly when switching between collections or double-clicking on a computer to bring up the inventory.  Certainly not a show-stopper, but because it happened during data-access operations I assumed it was the SQLite backend causing the latency.

  • Adam Ruth

    You're right that it's the data access code that's the bottleneck, as you would expect in such a heavily database-oriented application.  SQLite may be responsible for some of it; its performance profile is quite a bit different from SQL Server's, in that some operations are faster and others are slower.  We're still very early in the development cycle, so you can look forward to some performance gains in the future.

  • Rich

    When is a SQL Server backend coming?  I have about 1,500 computers for a client and I would love to use this.  SCCM just constantly has all sorts of issues with deploying packages, and the Include Directory mode here is great.  Doing the same with SCCM via code and custom scripts just isn't a solid solution.  PDQ has never let me down; I just need better performance for 1,500 computers/clients!

  • Eduardo Trevino Jr.

    I work at a school district with about 7,000+ PCs/laptops.

    I find that scanning computers in smaller collections, say by school and classroom, works better.

  • Fausto Tavarez

    We have about 2,000 machines and the system has become pretty much useless because it's so slow.  Initially I thought it was because the server needed more resources, so we gave it 8 CPUs and 6 GB of RAM. I even disabled the AV and other services just in case they were slowing access to the DB, but no luck.  It's still throwing this error:

    Timeout connecting to database
    AdminArsenal.Data.SQLite.SqliteBusyException
    Database: C:\ProgramData\Admin Arsenal\PDQ Inventory\Database.db
    Error Code: SQLITE_BUSY
    Windows Error: 33
    SQL: begin immediate
       at AdminArsenal.Data.SQLite.SqliteEngine.TestReturnCode(IntPtr connection, Int32 code, String sql, SqliteEngine db)
       at AdminArsenal.Data.SQLite.SqliteEngine.Statement.ExecuteNonQuery(Object[] parameters)
       at AdminArsenal.Data.SQLite.SqliteEngine.Execute(String sql, Object[] parameters)
       at AdminArsenal.Data.SQLite.SqliteTransaction..ctor(SqliteConnection db)
       at AdminArsenal.PDQInventory.Computer.SetHeartbeat(ComputerId computerId, Boolean isOnline)
       at AdminArsenal.PDQInventory.HeartbeatThread.ThreadProcess()
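
    For context, SQLITE_BUSY on "begin immediate" simply means another connection already held the write lock when this one tried to start a write transaction; SQLite allows only one writer at a time. Below is a minimal sketch of the same failure mode using Python's built-in sqlite3 module (the file name and table are made up for illustration, and this is not PDQ Inventory's actual code):

      import sqlite3

      # Two connections to the same scratch database; timeout=0 means
      # "fail immediately instead of waiting" when the write lock is taken.
      writer = sqlite3.connect("scratch.db", timeout=0, isolation_level=None)
      other = sqlite3.connect("scratch.db", timeout=0, isolation_level=None)

      writer.execute("CREATE TABLE IF NOT EXISTS heartbeat (id INTEGER PRIMARY KEY, online INTEGER)")

      writer.execute("BEGIN IMMEDIATE")        # first writer takes the write lock
      try:
          other.execute("BEGIN IMMEDIATE")     # second writer can't get it -> SQLITE_BUSY
      except sqlite3.OperationalError as err:
          print("second writer got:", err)     # "database is locked"
      finally:
          writer.execute("COMMIT")             # release the lock

      # A busy timeout makes a writer wait and retry for a while instead of
      # failing outright, which is usually enough when each write is short.
      other.execute("PRAGMA busy_timeout = 5000")   # wait up to 5 seconds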

    Any advice would be appreciated since I've come to depend on this tool.  I would use it exclusively instead of SCCM, but I can't if it has that limit.

  • Shane Corellian

    Hi Fausto,

    It looks like the heartbeat process is causing the error. What is your heartbeat interval set to? (File > Preferences > Heartbeat). If it is still set to 300 seconds, I would definitely change it to a higher number (perhaps 900-1200 seconds).

  • Fausto Tavarez

    Wow, that definitely increased performance tremendously.  I set it to 1200 and now everything is performing really well.  Thanks, Shane.

  • Shane Corellian

    Glad to hear that it helped, Fausto. The issue was that 300 seconds was not enough time to send a heartbeat to 2,000 computers. Basically, before all the computers could respond to a heartbeat request, they were put right back into the queue, so the process was just working non-stop.
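
    As a rough back-of-the-envelope illustration of why the interval matters (the per-computer check time below is an assumed figure, not something measured from PDQ Inventory):

      # If one full heartbeat pass over every computer takes longer than the
      # interval, computers are re-queued before the pass finishes and the
      # heartbeat thread never goes idle.
      computers = 2000
      per_check_seconds = 0.25   # assumed average cost to check one computer, sequentially

      pass_duration = computers * per_check_seconds   # ~500 s for one full pass

      for interval in (300, 900, 1200):
          if pass_duration > interval:
              verdict = "queue never drains"
          else:
              verdict = "queue drains between passes"
          print(f"interval={interval}s, full pass ~{pass_duration:.0f}s -> {verdict}")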

    We also recommend keeping an eye on your scheduled scans. If the schedules get too aggressive then you could start seeing a similar problem. 

  • Martin Hugo

    We have 6,500 computers in our inventory.  My heartbeat interval is set to 1800 seconds.  I have my scan interval set to 7 days, but certain deployments are set to scan at the end (applications only), so scanning is limited to smaller numbers of computers at a time.  Are there any other best practices I should be using with a setup of this size?  I did consider having multiple PDQ servers, but I assume that would require the purchase of more licenses.

