Windows Explored

Everyday Windows Desktop Support, Advanced Troubleshooting & Other OS Tidbits

What Does File Contention Look Like?

Posted by William Diaz on December 20, 2011


Every now and then someone calls in and complains about terrible performance. Often, the tech handling the call will try to recover performance by killing CPU-intensive processes or closing unused applications [1], especially those with large memory footprints [2]. More often than not, though, the issue can be described as file contention, a condition where performance is “penalized” because the disk cannot keep up with file I/O demand.

In the example here, an unexpected virus scan kicks off in the background (these are usually scheduled to run after hours). Examining the two most common aspects of system activity, processor and memory, the workstation is well within the envelope of what is considered acceptable, and the technician is left scratching his head trying to gauge why the system is so slow even though memory usage is minimal and CPU usage averages about 15%. I advise him to start Performance Monitor (perfmon) and add some remote performance counters, mainly disk counters like read and write time, but most importantly Average Disk Queue Length and Current Disk Queue Length:
[Performance Monitor screenshot: processor, memory, and disk counters captured from the workstation]
This is a snapshot of about 2 minutes of monitoring. We can ignore the Memory counter since the majority of system RAM is available. The processor averages about 15% while a virus scan runs minimally in the background. Read times are near their maximum for most of the period, but the real performance bottleneck, file contention, shows up in the Average Disk Queue Length and Current Disk Queue Length counters. Think of these queues as file I/O requests waiting on the disk: if the disk is busy, they get backed up and cause a traffic jam. If you are curious about what kind of disk I/O is happening, Resource Monitor is a good place to look on Vista and later; otherwise, Process Explorer and Process Monitor can assist you.
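
If you would rather script this than click through perfmon, below is a minimal sketch using Python and the pywin32 PDH bindings (win32pdh) to sample the same counters remotely. The machine name, the counter selection, and the one-second interval are assumptions for illustration, not part of the original write-up; run it locally by dropping the machine prefix.

# Minimal sketch: sample disk queue and read/write time counters via PDH.
import time
import win32pdh

MACHINE = r"\\WORKSTATION01"   # hypothetical remote workstation name
COUNTERS = [
    r"\PhysicalDisk(_Total)\Current Disk Queue Length",
    r"\PhysicalDisk(_Total)\Avg. Disk Queue Length",
    r"\PhysicalDisk(_Total)\% Disk Read Time",
    r"\PhysicalDisk(_Total)\% Disk Write Time",
]

query = win32pdh.OpenQuery()
handles = {path: win32pdh.AddCounter(query, MACHINE + path) for path in COUNTERS}

# Prime the query once; averaged counters need two samples to produce a value.
win32pdh.CollectQueryData(query)

for _ in range(120):                      # roughly 2 minutes at 1-second intervals
    time.sleep(1)
    win32pdh.CollectQueryData(query)
    for path, handle in handles.items():
        _, value = win32pdh.GetFormattedCounterValue(handle, win32pdh.PDH_FMT_DOUBLE)
        print(f"{path}: {value:.2f}")
    print("-" * 40)

A sustained queue length well above the number of spindles is the same traffic jam perfmon shows graphically.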

Now, a few years ago this hard drive was zooming along, and our environment has not significantly changed since then. No doubt a virus scan kicking off is going to cause some performance issues, but not to the degree seen here unless the disk has been around for a long time. So, in addition to some steady file I/O, this is also a sign of an old, worn-out hard drive, and the system model confirms it is at least 6 years old. Hard disks do, in fact, degrade in performance over time. Besides the CD/DVD drive (or floppy, if you still have one), the hard drive is the last major component that contains mechanical moving parts, and the disk motor, arm actuators, and other parts are going to get slower and less responsive with time. And despite popular myths on the Internet, reinstalling Windows will not regain any lost performance.

I have encountered this myself at home and practice what I am preaching here. My home systems are 6-8 years old (except for the laptop I am writing this article on), nothing more than dual core, and a mix of Windows XP and Windows 7. Original system performance was regained in all cases by installing new hard drives. Of course, original system performance may not always cut it these days.

The good news is that, going forward, file contention and disk I/O bottlenecks are being mitigated by the arrival of hybrid and solid-state drives (SSDs).


[1] Other things that are often tried are disk defrags, which are mostly a waste of time in our environment. We already have this scheduled to run monthly, and even if one of these defrag tasks is skipped, enough fragmentation is unlikely to accumulate from one month to the next on the average workstation to warrant running it manually. Microsoft (I think) recommended defragging at 20% file fragmentation in the XP days and earlier, but even in those cases I have not seen any noticeable performance gains from running a defrag. Theoretically you might gain some performance, but a 1-4% improvement afterwards is not something you are going to be able to measure as a real experience.
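
If you do want to know whether a manual defrag is even worth considering, you can at least run the built-in analysis first. Here is a minimal sketch; the drive letter is an assumption, the command needs an elevated prompt, and the exact report text you would scan for the fragmentation percentage varies by Windows version.

# Minimal sketch: run the built-in defrag analysis and print the report.
# /A analyzes only; no defragmentation is performed. Requires elevation.
import subprocess

result = subprocess.run(
    ["defrag", "C:", "/A"],           # assumed volume; analyze only
    capture_output=True, text=True, check=False,
)
print(result.stdout)                  # look for the reported fragmentation percentage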

Then there is the clearing of the Internet cache. I see this step often documented in performance-related incidents. The first problem with this, again specifically in our environment, is that this setting is already enforced via group policy for Internet Explorer. The setting is known as “Empty Temporary Internet Files folder when browser is closed.” Users are already opening and closing IE several times a day and doing this unknowingly themselves. At most, all you are doing is freeing some minor storage space and ensuring the most recent web page content is not being served from the browser cache; performance-wise you are not achieving anything. Anyway, IE already has internal controls in place that limit the cache to a percentage of disk space.

And, finally, rebooting. Rebooting actually creates additional file contention at shutdown and startup (and well into the logon process). If the performance problem is being caused by the disk, then all you have done is serve a drunk another drink just as he was sobering up. Of course, if the system was in the middle of doing something fairly taxing, you might see some immediate improvement because that task is no longer running, but ultimately you will get back (or be taken back) to the original point of contention and still need to troubleshoot the original performance issue.

[2] Applications or services consuming large amounts of memory shouldn’t necessarily be a point of concern if you have enough RAM. When you start running low on available RAM, though, file contention can become a problem as pages of memory are written to disk and then read back from disk when they are needed again.
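
If you want to sanity-check that angle, the same win32pdh approach shown earlier can sample the memory counters; the counter pair and the 90-second window here are assumptions for illustration. Sustained high Pages/sec alongside a low Available MBytes value points to the paging described above.

# Minimal sketch: watch available RAM and paging activity for ~90 seconds.
import time
import win32pdh

query = win32pdh.OpenQuery()
available = win32pdh.AddCounter(query, r"\Memory\Available MBytes")
pages_sec = win32pdh.AddCounter(query, r"\Memory\Pages/sec")

win32pdh.CollectQueryData(query)      # prime the rate counter (Pages/sec)
for _ in range(90):
    time.sleep(1)
    win32pdh.CollectQueryData(query)
    _, free_mb = win32pdh.GetFormattedCounterValue(available, win32pdh.PDH_FMT_DOUBLE)
    _, paging = win32pdh.GetFormattedCounterValue(pages_sec, win32pdh.PDH_FMT_DOUBLE)
    print(f"Available MBytes: {free_mb:.0f}  Pages/sec: {paging:.0f}")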
