Working with files that are larger than the available memory can be difficult on a Windows system. Notepad and similar editors cannot open a file that does not fit in memory.
Excel is one option, as it loads only the first 1,048,576 rows; if those rows are smaller than your system memory you are in luck. Save the output in Excel binary format (.xlsb) as soon as the file opens, as this dramatically shrinks the file size (1000 times for most of my logs) and speeds up everything I then do with the file.
People with Unix experience will note that everything I’ve talked about so far is an editor. For those of you who don’t know Unix, it has a whole suite of programs (more, less, grep, sed and others) that read files without modifying them. On Windows we are accustomed to working with files, particularly from Microsoft Office, that can only be read by the programs that edit them. This is the opposite of what is desirable for a log file; we’d prefer a program that can’t alter the log.
Luckily, Windows does come with programs that read text files and run from the command prompt. More can be used to page through a file of any size; it will quickly display a small portion of even an absurdly large file, and the first few lines of a log are typically a good indication of what is to come.
Find is another program with which you should become familiar. Find prints every line that contains, or does not contain, a given string. Errors in my log are flagged with an “E”, and because the log uses a fixed format I can match them reliably. I often run the following:
find " E: " my.log
If that returns a lot of lines, Ctrl-C will stop it. I can get an idea of the size of the problem by counting the number of errors instead:
find /c " E: " my.log
Often the log is clogged with the same entry over and over. To see whether there is anything else, you can invert the match:
find /v " E: " my.log
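Each of these variants writes to the console, but the output can also be redirected to a file for closer study (other.log is just an illustrative name):

```
find /v " E: " my.log > other.log
```

This leaves the original log untouched, which is exactly the read-only behaviour we wanted.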
A complete list of the Windows command-line commands is available in Microsoft's command reference.