I had written before about QuickBooks showing “Not Responding”, particularly when doing data-intensive tasks on large databases. The wait time can be reduced by supercondensing large files, but even users of medium-sized files sometimes experience the “not responding” phenomenon.

I was watching the performance graphs in Windows Task Manager during a QuickBooks company file verify (with the “Not Responding” message showing on my QuickBooks taskbar button). Guess what? QuickBooks wasn’t using much CPU power or memory:

[Screenshot: Task Manager during a QuickBooks verify. Verify not taking advantage of available computing resources]

Why is that? Why didn’t it use more of my Intel i7-2600’s processing power? Why didn’t it want to use more of my 6GB of RAM? Dunno. It did have some peaks at about 22% CPU usage, but no more than that.

I was verifying a large company under Enterprise Solutions 12. Enterprise is said to take better advantage of extra RAM than Premier and Pro editions, and I think it probably does. But even so, it seems to me to be leaving lots of computing power on the table.

I’ve read about people doing hacks on their Windows registry so that QuickBooks uses more resources and things happen faster. Anybody tried that?

Also, I’ve read that keeping your QuickBooks data on a solid-state drive (SSD) speeds things up quite a bit, since an SSD, unlike a spinning hard drive, has no mechanical movement associated with reading and writing data. Anybody have a word of experience on that?

Otherwise, as far as QuickBooks itself is concerned, here’s my wish: that future versions use more memory and processor resources than currently available versions do. That would save us — millions of QuickBooks users and pros — a lot of time. Just an idea.


19 thoughts on “Why QuickBooks Is Sometimes Slow”

  1. It’s been my experience from watching Task Manager with IO read, write, and other bytes columns turned on that it’s all hard drive access time and a little bit of CPU. Unfortunately, it’s not remotely multi-threaded so our quad core was using 25% CPU, aka one core, the entire time.

    Our database is 1.4GB and I just made a portable file for our outside accountant/auditor and it took 15 minutes. The whole time it was a ton of read time. Strangely, it was only about 15MB/s and our disk array can easily do 200MB/s read in sequential reads. I think it skips around the database since not everything is in line and that start and stop kills the throughput. Since SSDs have no moving parts and absolutely zero seek time, it would speed up Quickbooks immensely. I’ve only ever built 1 computer that ran Quickbooks on an SSD but it was unbelievably fast.

    The problem is, in an environment like this, an SSD would die pretty quickly because its cells can only be written a certain number of times. Having 10 users in it all the time, constantly writing data, would give it a usable life of a year or two on lower-end drives. Corporate-grade drives from Intel have a much higher total-write rating, and I think OCZ Vertex 4 drives are still the kings with over 9,000 estimated write cycles (Kingston HyperX 3K’s, for example, are around 3,000), plus really intelligent firmware. The prices just went up but they’re still very reasonable and typically cheaper than the CPU in a high-end desktop.

    Still, they seriously need to make Quickbooks use multiple threads so it can run across multiple cores for operations like this. I guess with the disk IO bottleneck, it’s not a top priority and database verification is a bit difficult to multithread from a programming standpoint so don’t expect that any time soon.
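    That start-and-stop effect can be put into rough numbers. A back-of-envelope sketch of seek-bound throughput (the 8 ms seek time and 128 KB chunk size below are illustrative assumptions, not measurements from this commenter’s array):

```python
# Rough model of why random access kills throughput on a mechanical disk:
# each small read pays a full seek before any data moves.
# All numbers are assumptions for illustration only.
seek_s = 0.008        # ~8 ms average seek time on a spinning disk
chunk_mb = 0.125      # 128 KB read per seek (scattered database pages)
seq_rate_mb_s = 200   # sequential read throughput of the disk array

time_per_chunk = seek_s + chunk_mb / seq_rate_mb_s
effective_mb_s = chunk_mb / time_per_chunk
print(f"{effective_mb_s:.1f} MB/s effective")  # ~14.5 MB/s
```

    Under those assumptions the 200 MB/s array delivers only about 14.5 MB/s, right in the neighborhood of the 15 MB/s the commenter observed; an SSD’s near-zero seek time removes the dominant term entirely.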

  2. I recently built a new computer for my QuickBooks system and I used a RAID SSD array to keep the data on. It is extremely fast, faster than any system I have ever used in the past. I will now use SSDs for all my data storage.

  3. Peter and Curt,

    That is quite interesting. So SSDs have a limited number of rewrites — say, 10,000 — before the drive dies. With a dynamic file like a QBW, I wonder how that plays out? Would certain parts of a QBW file get rewritten more or less constantly in a multi-user, high transaction situation?

    I’ve read about how some SSDs come with a utility that estimates the remaining life of the drive. It would be interesting to track that on a drive where your QBW file lives, and try to figure out the correlation between usage of the QBW file and the remaining lifetime of the drive.

    If you guys have any data to report in that regard from your installations, please share with us. Thanks for your comments.

  4. SSDs are NOT limited to 10,000 write cycles. A simple Google search will verify this. Here is a quote from one article:

    How long have you got before the disk is trashed?

    For this illustrative calculation I’m going to pick the following parameters:-
    Configuration:- a single flash SSD. (Using more disks in an array could increase the operating life.)
    Write endurance rating:- 2 million cycles. (The typical range today for flash SSDs is from 1 to 5 million. The technology trend has been for this to get better.)

    That being said, SSDs can and do go bad, just like mechanical hard disks. You should be backing up your files!

  5. One notable detail in your screen capture is that QB doesn’t seem to be able to utilize multiple processors. Your Core i7 processor appears to Windows as 8 logical processors. One is running near capacity while the others are idle. The QuickBooks application would have to be updated to utilize multiple processors.

  6. Mark, thanks for the comment. It’s difficult for me to get my head around whether 10K or 1M write cycles would really be a concern for a high-volume, high-user QBW file or not. That’s why I’m interested in some stats coming from an SSD performance tracking utility in a QB environment.

    Eric, yes, that was my interpretation of the CPU usage charts as well. Thanks for the comment.
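    For what it’s worth, a back-of-envelope estimate suggests that even a 3,000-cycle rating goes a long way if the controller wear-levels writes across the whole drive. Every input below is an illustrative assumption, not a measurement from anyone’s installation:

```python
# Naive wear estimate: total writable data = capacity x endurance cycles,
# assuming the controller spreads writes evenly over the whole drive.
# All figures are assumptions chosen for illustration.
capacity_gb = 240         # assumed drive size
endurance_cycles = 3000   # per-cell program/erase rating (lower-end drive)
daily_writes_gb = 50      # assumed heavy multi-user QBW write load

total_writable_gb = capacity_gb * endurance_cycles
lifetime_years = total_writable_gb / daily_writes_gb / 365
print(f"~{lifetime_years:.0f} years")  # ~39 years under these assumptions
```

    The flip side is exactly the concern raised above: without wear leveling, a hot database page rewritten constantly would burn out its particular cells far sooner, which is why the controller’s job of spreading writes around matters so much.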

  7. The good thing about SSDs is that when they die, they become read-only. That way your data is not lost; you can still copy from old to new in the unfortunate case a drive does die. Generally, if your file is less than a gig in size, and healthy, you won’t see a whole lot of gain in performance. If you have a troublesome file, unhealthy or large, you will find the SSD’s performance comes in handy. I recently worked with Shannon to repair our file and then condensed it on my own. On my server, with a RAID set of 15K drives, the condense process would take 12 hours or more and be extremely unstable while it ran. On my test machine I had 2 SSDs, and the same process would take 3 hours and be completely stable.
    Now that my file is a reasonable size, 500MB down from 1.8GB, I hardly notice the difference. If I run a large report like trial balances or A/R aging detail, there might be a second or two difference, but nothing major. Another overlooked area is memory. By default, QuickBooks will never utilize more than 512MB of RAM even if you had 32 times that installed. This can be a problem if your healthy file is larger than that. Essentially, because of licensing restrictions with Sybase (the DB QuickBooks runs on), you are limited to one processor, one core, and 128-512MB of memory usage. You can tweak these settings and get more oomph out of it. By allowing it more processors and as much memory as you want to throw at it, you can potentially better utilize your modern hardware. Virtual machines are excellent for testing this out; it helped me when working with my file. However, since this is technically a violation of the license, I would not condone it beyond an R&D standpoint. I also would not advise it for production; there is no official support if things went south. It would be worse than telling them you’re running QB on a virtual server or on a RAID set.

  8. I have modified the amount of RAM made available to open QB files and would like to share my solution with you since this is an active post. I have more than doubled the number of orders that can be processed by bumping up the max RAM allocated to QB files to 1.5GB, with a low-end setting of 512MB. Here is the copy/paste I sent out to some coworkers:

    There is a registry setting that controls the amount of RAM allocated to an open QB file. The default is 128MB for the minimum and 256MB for the maximum. I switched these values to 512MB and 1024MB respectively. I did notice that the settings revert back to default after a system reboot, so a batch file to make the change on logon will be needed for a permanent fix.

    This command will display the settings:
    reg query HKLM\SYSTEM\CurrentControlSet\Services\QuickBooksDB##\Parameters
    (the ## will be different depending on your version of QB)

    The return will look similar to this:
    -n QB_SERVER_NAME -qs -gd ALL -gk all -gp 4096 -gu all -ch 256M -c 128M …

    The location in the registry is the same QuickBooksDB## key shown in the reg query above (the ## will be different depending on your version of QB).

    After this is done, restart the “QuickBooksDB##” service (again, the ## will differ depending on your version of QB) to complete the process.

    Here is a link to a QB support article that covers this procedure:
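    As a sketch of what that tweak actually changes: the Parameters string shown above is a SQL Anywhere command line, where -c sets the initial (minimum) cache and -ch the maximum. The helper below is hypothetical, not an Intuit-sanctioned tool; it just shows how the new string could be produced before writing it back with reg add and restarting the QuickBooksDB## service (and, per the note above about the value reverting on reboot, reapplying it from a logon batch file):

```python
import re

def bump_cache(params, min_mb=512, max_mb=1024):
    """Rewrite the -c (minimum) and -ch (maximum) cache flags in a
    QuickBooksDB Parameters string, leaving the other switches alone.
    Hypothetical helper for illustration only."""
    params = re.sub(r"-c \d+M", f"-c {min_mb}M", params)    # minimum cache
    params = re.sub(r"-ch \d+M", f"-ch {max_mb}M", params)  # maximum cache
    return params

old = "-n QB_SERVER_NAME -qs -gd ALL -gk all -gp 4096 -gu all -ch 256M -c 128M"
print(bump_cache(old))
```

    Note that the `-c \d+M` pattern does not touch `-ch 256M` (the next character after `-c` there is `h`, not a space), so the two flags are rewritten independently.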

  9. What surprised me is that I have probably logged over 100 hours of support calls to Intuit for performance issues and they never suggested it to me. Then I found it on my own, and they acted like they had known about it the whole time and even said it would be fine to increase the memory allocated to those amounts.

  10. You all are talking some pretty ‘deep stuff’ here: solid-state drives and registry tricks for QuickBooks. By the way, Shannon, that Intuit KBA is relatively new; it came out with a download utility to ‘re-set’ QB to use the recommended cache settings, because some ‘code changes’ in newer versions were limiting RAM cache. So without getting into all the truly ‘techy stuff’, I wanted to spend a little bit of time on something Peter briefly mentioned, because I think this can really help concerned users who tend to freak when they see ‘Not Responding’, especially during Verify or Rebuild.

    The ‘not responding’ is just a Windows thing; it is purely arbitrary. In your scenario of Verify (or Rebuild, for that matter), because of the ‘diagnostic characteristics’ of Verify, far less CPU activity is required than in the case of a Rebuild, which may actually be doing ‘what little the QB rebuild really will do’. Nevertheless, the key measures to watch are not CPU usage but the I/O activity associated with these functions. This is because essentially every bit of the QB data must be read from disk into cache and then written back to disk as the utility scans are performed.

    To observe these measures, open Task Manager and select the Processes tab; make sure you checkmark the option to show processes from all users, especially if you are watching the QB network server. Now modify the view to add everything related to I/O (Reads/Writes/Other). Depending on the available RAM and the RAM allowance for the version of QB you are using (even if you have done the registry trick), working memory may be ‘peaked’ at the Peak Working Set Memory for QuickBooks.

    Even though there may be minimal CPU activity, the I/O for QBDBMgr (or QBDBMgrN, if this is the network server) should be churning away as the DB server works its way through every byte of the data file. Since mid-size to large QB files may exceed the RAM cache allowance, the DB server will actively be reading data from the disk and loading it into cache, then writing data back to disk as additional data must be accessed during the ‘verify’.

    So the next time you are scratching your head at that ‘Not Responding’ and thinking to yourself, “Gosh I hope it really hasn’t frozen up”….just remember that message is just ‘Windows foolery’ more often than not, the I/O will give you the real status.


    • Many thanks Murph! My trouble is … I am using the new QBks 2017 on my new machine with 16 gigs of DDR4-2666 memory, an i7-6700, and a Samsung PCIe SSD on the mobo, and the nasty thing is still stinking the place up. I’m at like hour 6 now.
      Started at only 44 gigs, but QBks failed to import an IIF file multiple times without very obvious corruption on the Balance Sheet. (So I have done multiple restores to continue testing before going down this Verify, Rebuild, Condense path. Yep, it reported there were issues fixed during the Rebuild.)
      My Win 10 Task Manager shows a whopping 0.1 MB/s for the Disk column, and the total for all my programs is under 1%, so it shows zero. When I look at the Disk tab in Resource Monitor, it tells me that QBDBMgr and QBW32 are doing totals (B/sec) of 114 and 449,083 respectively.
      I can tell you are a ton more techie-smart than I, so can you help me figure out (or guess) why this thing is crawling so slowly through the Rebuild and Condense functions when clearly more I/O capacity is available? I have not done the tweaks yet; I just came across them while the dog is on step 2 of 4 for the Condense. Any ideas, though? I think I have to try the tweaks, or at least confirm my registry is still set to slow-poke mode. Any insight would be appreciated.

      • I’m getting too old, and frankly, I can hardly even remember writing the contribution shown here. But the fact that you are in Shannon’s blog should tell me that you know an expert when you see one (or his writing) – I’m talking about Shannon.

        So I am not really even sure why you are doing a condense using the QuickBooks utility; it is crap. If you are telling me that your QBW file is 44 GIGS, then you should have had a professional condense (by Shannon Tucker’s team) long, long, long ago. I can say that I have NEVER seen a QBW file of 44 GIGS (4.4 GIGS, that’s another matter).

        Despite being 6 hours into the process, you may want to re-think what you are doing.

        Another issue is importing IIF files, especially if you know they are corrupting the Balance Sheet.

        2016/2017 saw the release of ‘new post-verify/post-rebuild reporting’ but it still does not really beat the value of reading the QBWin.log file, in fact many times the two conflict. And while QB Rebuild will report fixing issues, many times the fix is a figment of Artificial Imagination (moreso than Artificial Intelligence).

        It sounds like you have a file that NOW really needs Shannon – for both repairs and his ‘supercondense’ neither of which rely on QB utilities. With his expedited service he could probably have your file fixed and shrunk and back to you before you get through even the 3rd phase of the QB data-disaster known as ‘condense’, because even when/if you do finish, the odds are better than 50% you will not be happy with the result.


  11. Pingback: 3 Backdoor Ways to Speed Up QuickBooks | QuickBooks and Your Business
