Windows Confidential: No Limits … Sort Of
What’s the maximum number of files an Explorer directory can handle? There’s no hard limit—it’s just a matter of patience.
More than one customer has asked, “What’s the maximum number of files in a directory that Explorer will support?” The truth is there’s no specific limit on the number of items in a folder. The only real limits are address space and your patience.
Windows XP has to retrieve and sort all the items in a folder before it will display anything in that folder. For a large folder, this can result in a significant delay. You’ll get nothing for a long time, and then—boom—everything shows up. Windows Vista and Windows 7 load the contents of the folder incrementally, which is both good and bad.
It’s good because you get to see the contents as soon as they become available. It’s bad because it means the contents are unstable while you’re reading them.
You’ll see the first hundred items that are loaded. Then the next hundred items show up, and some of them are interleaved with the first hundred due to your sort criteria. Then the next hundred come in, and things shuffle around some more. If you’re trying to click an item, it’s frustrating when the view keeps refreshing and moving your target.
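The shuffling can be modeled with a minimal sketch (Python, purely for illustration; Explorer itself is native code and this is a hypothetical model of the behavior, not its implementation): each arriving batch is merged into the sorted view, so items that sort early but arrive late push earlier items around.

```python
def incremental_view(batches):
    """Model of Explorer-style incremental loading: merge each arriving
    batch into the sorted view and yield a snapshot after every batch."""
    view = []
    for batch in batches:
        view = sorted(view + batch)  # later batches interleave with earlier items
        yield list(view)

# An item that sorts first can arrive in a later batch.
batches = [["banana", "cherry"], ["apple", "date"]]
snapshots = list(incremental_view(batches))
# snapshots[0] == ["banana", "cherry"]
# snapshots[1] == ["apple", "banana", "cherry", "date"]
# "banana" moved from position 0 to position 1 between refreshes,
# which is exactly the moving-target effect described above.
```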
One of the consequences of large folders is that you may run out of address space. Then you may start getting weird errors. A security check that fails due to lack of address space will fail-safe and deny access. When you see the error message, “Access denied,” you’ll probably say, “I don’t understand. I should have access.” Then maybe you wait a while and try again, and it succeeds.
Sometimes we get questions from people who say something like: “We have 1.8 million files and we’re finding that Explorer’s CPU usage goes to 50 percent when we browse into that folder. The interface remains responsive. We can scroll around to see the results, but it sometimes gets a little wonky.”
My reaction to something like this is: “You are way beyond what Explorer can comfortably handle.” If you ask Explorer to keep track of 1.8 million items, then it’s nearly impossible to avoid high CPU and memory usage. Counting to 1.8 million takes time, especially if you have to allocate memory to keep track of all the items you’ve just counted.
Another customer admitted that a “dir /s” command on the entire drive took seven days to complete. The “dir” command doesn’t even have to save the results. It can just print them to the screen and throw away the information. Who knows, maybe someday the “dir /s” command will realize its results are being discarded, skip to the last directory and show the last 50 files.
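The contrast between “dir” and Explorer can be sketched like this (a hypothetical illustration in Python; the real “dir” is a cmd.exe built-in): a streaming walk visits every entry but keeps only a running count, so memory stays flat no matter how many files there are, while Explorer must hold and sort the whole list.

```python
import os

def count_files(root):
    """Walk a tree the way `dir /s` scans it: visit every entry,
    but keep only a running total instead of storing the results."""
    total = 0
    for _dirpath, _dirnames, filenames in os.walk(root):
        total += len(filenames)  # each batch is discarded after counting
    return total
```

The walk still takes as long as the directory tree demands (seven days, in the customer’s case); the point is only that discarding results keeps memory use constant.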
Actually, NTFS itself can handle huge numbers of files. It’s the programs that try to manipulate lists of those files that often run into trouble. If you have millions of files, Explorer is probably not the best tool for the job. You should use an application that specializes in managing huge quantities of files, something with a fancy name like a Document Management System.
When given this explanation, the customer asked, “Should we expect any improvement after setting NtfsDisableLastAccessUpdate and NtfsDisable8Dot3NameCreation?” The simple fact that the customer even asked this question means they didn’t understand the explanation.
Whether NTFS updates the last-access time and whether NTFS has enabled short file names doesn’t change the fact that you still have 1.8 million files. If you expect Explorer to enumerate and allocate memory to track this many files, you’re going to be waiting.
Raymond Chen's Web site, The Old New Thing, and identically titled book (Addison-Wesley, 2007) deal with Windows history, Win32 programming and stolen mice—the computer kind.