How big should you set the Windows NT pagefile? Microsoft's formula is the size of RAM + 12MB, or 2.5 × the size of the SAM file, whichever is larger. If you have 128MB of RAM, that works out to 140MB. The traditional unix advice for the pagefile is 2 × RAM, which works out to 256MB. The maximum legal page file size for Windows NT and Windows 2000 is 4095MB.

The only downside to a too-large page file is that you are consuming some of your disk space unnecessarily. Given the size of modern disks, this seems to be a trivial issue. Under Windows NT, the page file occupies a contiguous block of space in the partition. If possible, split the page file between hard disks. Don't split the page file across partitions created on the same hard drive, since that will decrease performance.

Anyone who has used a defrag tool like Diskeeper with a graphic disk map sees the huge contiguous chunk of the page file in NT. Windows 2000 does not have the contiguous-block requirement.

The downsides of a too-small pagefile are much more serious. At a minimum, your system will experience disk thrashing, where your pagefile fills to maximum and NT is constantly swapping segments of RAM out to disk virtual memory. This can really impact performance. It can also result in a corrupted pagefile. Additionally, the pagefile should be at least the size of RAM + 1MB in order to support core dumps. In any case, you have swapped out the fastest component in your server for the slowest. Given these factors, I recommend the unix formula.

One of Windows 2000's improvements is more control over core dumps. Windows 2000 has three settings: Small memory dump, Kernel memory dump, and Complete memory dump. Windows NT has only the Complete memory dump or none. If you have a lot of RAM, I would pick the Kernel memory dump option. To set these options in Windows 2000, open the System applet in Control Panel, select the Advanced tab, and click Startup and Recovery.

For more information on dump files (.dmp files) see:
Gathering Blue Screen Information After Memory Dump (nt4/w2k)
Windows 2000 Memory Dump Options Overview
Use Dumpchk.exe to Check a Memory Dump File (nt4/w2k)
Reading Small Memory Dump Files Created by Windows 2000

But note that (:-> although I have never seen the unix formula be too small, the actual pagefile size might need to be bigger under some unique set of factors. The only way to be absolutely sure is to use diskperf and monitor the pagefile usage peak object. For information to help with this task, read these KB articles:
How to Monitor Disk Performance with Performance Monitor Q102020
Configuring Page Files for Optimization and Recovery Q197379

Sysinternals has released a freeware tool, PageDefrag, which will generate a fragmentation report and will defragment Windows NT 4's pagefile and registry. Windows NT is very bad about fragmentation.
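The dump options described above are backed by a registry value under CrashControl; as a hedged sketch, the Kernel memory dump choice corresponds to a fragment like the one below (verify the current value on your own system before importing anything):

```
Windows Registry Editor Version 5.00

; CrashDumpEnabled selects the dump type on Windows 2000:
; 0 = none, 1 = Complete memory dump, 2 = Kernel memory dump, 3 = Small memory dump
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\CrashControl]
"CrashDumpEnabled"=dword:00000002
```

The Startup and Recovery dialog edits this same value, so the GUI route is the safer one; the fragment is mainly useful for scripting the setting across many servers.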
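The sizing arithmetic above can be sketched as a quick calculation. This is only an illustration: the function names are mine, and the 20MB SAM size is a made-up example value.

```python
def microsoft_pagefile_mb(ram_mb, sam_mb):
    """Microsoft's rule: RAM + 12 MB, or 2.5 x the SAM file size, whichever is larger."""
    return max(ram_mb + 12, 2.5 * sam_mb)

def unix_pagefile_mb(ram_mb):
    """Traditional unix rule: twice the RAM."""
    return 2 * ram_mb

MAX_PAGEFILE_MB = 4095  # hard limit for Windows NT and Windows 2000

ram = 128  # MB of RAM
sam = 20   # MB -- hypothetical SAM file size, for illustration only

print(microsoft_pagefile_mb(ram, sam))              # 140.0
print(min(unix_pagefile_mb(ram), MAX_PAGEFILE_MB))  # 256
# For core dump support, the pagefile should also be at least ram + 1 MB.
```

With 128MB of RAM the two rules give 140MB and 256MB respectively, matching the figures in the text; the cap matters only on machines with more than about 2GB of RAM.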