Tutorial
This tutorial guides you through msos's main functionality and useful commands. The support files for this tutorial are located here. These include the dump files you will analyze and inspect using msos, as well as the msos binaries. If you'd rather build your own, head over to the sources.
Note that this tutorial is not a complete reference -- refer to the Home page for the entire suite of msos commands.
In this section, you will open a crash dump of a .NET application and inspect the reason for the crash. Open a command prompt and navigate to the directory where you put the 32-bit version of msos.exe (under msos_x86 in the zip file). Begin by configuring the _NT_SYMBOL_PATH environment variable to point to the Microsoft symbol server (symbols are explained later):
set _NT_SYMBOL_PATH=srv*C:\Symbols*http://msdl.microsoft.com/download/symbols
Run the following command to open msos with the first dump file (specify the full path to crash.dmp as necessary):
msos -z crash.dmp
msos should be able to determine which thread has experienced a crash and switch to that thread. The output should resemble the following:
C:\Temp\msos_x86>msos -z crash.dmp
Opened dump file 'C:\temp\dumps\FileExplorer.exe.736.dmp', architecture X86, 1 CLR versions detected.
#0 Flavor: Desktop, Version: v4.0.30319.34014
Symbol path: srv*c:\Symbols*http://msdl.microsoft.com/download/symbols
Using Data Access DLL at: C:\Windows\Microsoft.NET\Framework\v4.0.30319\mscordacwks.dll
3>
The 3> prompt means you are currently in the context of managed thread #3. If there is a current exception on that thread, you can display it by running the !pe command. Go ahead and do it -- the output should resemble the following:
3> !pe
Exception object: 00000000063e9118
Exception type: System.NullReferenceException
Message: Object reference not set to an instance of an object.
Inner exception: <none>
HResult: 80004003
Stack trace:
SP IP Function
Cannot find symbol file http://msdl.microsoft.com/download/symbols/FileExplorer.pdb/fb1840b4b3084de4a3ce40e6287a7df11/FileExplorer.pdb
Cannot find symbol file http://msdl.microsoft.com/download/symbols/FileExplorer.pdb/fb1840b4b3084de4a3ce40e6287a7df11/FileExplorer.pd_
Cannot find symbol file http://msdl.microsoft.com/download/symbols/FileExplorer.pdb/fb1840b4b3084de4a3ce40e6287a7df11/file.ptr
000000000517F614 0000000000740A6A FileExplorer.MainForm+<>c__DisplayClass1.<treeView1_AfterSelect>b__0(System.Object)
000000000517F650 0000000073258F01 System.Threading.QueueUserWorkItemCallback.WaitCallback_Context(System.Object)
000000000517F658 000000007323DF16 System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
000000000517F6C4 000000007323DE65 System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
000000000517F6D8 000000007329914F System.Threading.QueueUserWorkItemCallback.System.Threading.IThreadPoolWorkItem.ExecuteWorkItem()
000000000517F6EC 0000000073298949 System.Threading.ThreadPoolWorkQueue.Dispatch()
000000000517F73C 0000000073298834 System.Threading._ThreadPoolWaitCallback.PerformWaitCallback()
Note the warning messages displayed about missing symbols -- the FileExplorer.pdb file. Debugging symbols are required to display source-level information (source file name and line number). Exit msos by typing q. Back on the command prompt, append the path to FileExplorer.pdb (it's alongside crash.dmp) to the symbol path environment variable:
set _NT_SYMBOL_PATH=%_NT_SYMBOL_PATH%;C:\Path\To\FileExplorer
Now run msos again, and ask for thread #3's call stack. This time, you should see more detailed information that includes line numbers. Specifically, the first stack frame should look like this (note the source information at the end of the line):
0000000004F1F734 0000000000520A6A FileExplorer.MainForm+<>c__DisplayClass1.<treeView1_AfterSelect>b__0(System.Object) [c:\Temp\crash1\FileExplorer\MainForm.cs:45]
In some cases, this information might be sufficient to diagnose the crash. In other cases, you might need to determine what other threads were doing at the time of the crash. Key in the !Threads command to see a list of the application's threads.
It looks like threads #1, #2, and #4 are also running managed code in this application. Switch to each of the threads by using the ~ number command (e.g. ~ 1). Now, display that thread's call stack by using the !CLRStack command. Threads #2 and #4 aren't particularly interesting, and thread #1 has the following stack:
1> !CLRStack
SP IP Function
000000000032F32C 0000000000000000 InlinedCallFrame
000000000032F328 00000000689458F8 System.Windows.Forms.Application+ComponentManager.System.Windows.Forms.UnsafeNativeMethods.IMsoComponentManager.FPushMessageLoop(IntPtr, Int32, Int32)
000000000032F3B4 0000000068945389 System.Windows.Forms.Application+ThreadContext.RunMessageLoopInner(Int32, System.Windows.Forms.ApplicationContext)
000000000032F404 0000000068945202 System.Windows.Forms.Application+ThreadContext.RunMessageLoop(Int32, System.Windows.Forms.ApplicationContext)
000000000032F430 00000000689215E1 System.Windows.Forms.Application.Run(System.Windows.Forms.Form)
000000000032F444 0000000000520093 FileExplorer.Program.Main()
000000000032F5C8 0000000000000000 GCFrame
It seems that this thread is waiting for user input; it's not related to our crash.
Something else to experiment with:
Go back to thread #3. Recall that the crash occurred in a function called <treeView1_AfterSelect>b__0. See if there are interesting stack variables in that frame by running !CLRStack -a. Identify the FileExplorer.MainForm+<>c__DisplayClass1 instance at address 00000000062daa84 and inspect its contents using the !do 00000000062daa84 command.
In this section, you will experiment with an application that leaks memory. You will use msos to determine which objects are not being reclaimed, and figure out what is holding them in memory.
Navigate to the directory where you put the 64-bit version of msos (under msos_x64 in the zip file), and run the following command to open a dump file of the leaking application (specify the full path as necessary):
msos -z leak.dmp
Begin by inspecting the total memory usage of the CLR and GC by running the !eeheap command. Note the sizes of the various generations. It seems that there are approximately 26MB of memory currently in use on the heap.
1> !eeheap
GC regions:
Address Size Type Heap# Commit/Reserve
0000004900001000 25.879mb Ephemeral 0 Committed
00000049019e2000 230.117mb Ephemeral 0 Reserved
0000004910001000 68.000kb LargeObject 0 Committed
0000004910012000 127.930mb LargeObject 0 Reserved
Gen Size
0 3.858mb
1 12.007mb
2 9.997mb
LOH 33.789kb
Total 25.895mb
Other regions:
... snipped for brevity ...
Next, what are the big objects? To find out, run !dumpheap --stat. Here are some of the last lines in the output:
00007ffed1370bd8 31 1772 System.String
00007ffed1371250 7 34768 System.String[]
00007ffe72dd40f8 2365 56760 MemoryLeak.Employee
00007ffe72dd41d0 2365 56760 MemoryLeak.Schedule
00000049732b4300 312 3284210 Free
00007ffed13766d0 2365 23706760 System.Byte[]
Total 7489 objects
This listing tells you that roughly 23MB are occupied by byte arrays. In fact, there are 2365 of them, and there's an identical number of Schedule and Employee objects. This is suspicious. List the byte arrays using the !dumpheap --type System.Byte\[\]$ command. The --type switch takes a regular expression: the trailing $ indicates that the type name must end with System.Byte[], and the backslashes escape the square brackets, which have a special meaning in a regular expression. Again, here are a few of the last lines:
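The escaping rules described above can be illustrated with Python's re module (a rough analogy -- msos uses .NET regular expressions, but bracket escaping and end-of-string anchoring work the same way; the sample type names below are made up for illustration):

```python
import re

# The pattern passed to !dumpheap --type: brackets escaped, anchored at the end.
pattern = re.compile(r"System\.Byte\[\]$")

type_names = [
    "System.Byte[]",            # matches: ends with the literal System.Byte[]
    "System.Byte[][]",          # no match: ends with an extra pair of brackets
    "MyApp.System.Byte[]Like",  # no match: System.Byte[] is not at the end
]

matches = [name for name in type_names if pattern.search(name)]
print(matches)  # ['System.Byte[]']
```

Without the backslashes, `[\]` would be parsed as a character class rather than literal brackets, and without the $ the pattern would also match jagged array types such as System.Byte[][].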
00007ffed13766d0 00000049019b7068 10024
00007ffed13766d0 00000049019b97c0 10024
00007ffed13766d0 00000049019bbf18 10024
00007ffed13766d0 00000049019be670 10024
00007ffed13766d0 00000049019c0dc8 10024
00007ffed13766d0 00000049019c3520 10024
00007ffed13766d0 00000049019c5c78 10024
00007ffed13766d0 00000049019c83d0 10024
00007ffed13766d0 00000049019cab28 10024
00007ffed13766d0 00000049019cd280 10024
00007ffed13766d0 00000049019cf9d8 10024
00007ffed13766d0 00000049019d2130 10024
00007ffed13766d0 00000049019d4888 10024
00007ffed13766d0 00000049019d6fe0 10024
00007ffed13766d0 00000049019d9738 10024
Statistics:
MT Count TotalSize Class Name
00007ffed13766d0 2365 23706760 System.Byte[]
Total 2365 objects
The first column is the method table, which is the same for all objects of the same type. The second column is the object address, which varies. The third column is the size, which seems to be pretty much the same for all of these objects.
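The Statistics block at the end of the listing is just a grouping of these rows by method table. A minimal Python sketch of that aggregation, using a few hypothetical (method table, address, size) triples shaped like the listing above:

```python
from collections import defaultdict

# Hypothetical rows from a !dumpheap-style listing: (method table, address, size).
rows = [
    ("00007ffed13766d0", "00000049019b7068", 10024),
    ("00007ffed13766d0", "00000049019b97c0", 10024),
    ("00007ffe72dd40f8", "00000049019c0000", 24),
]

stats = defaultdict(lambda: [0, 0])  # method table -> [count, total size]
for mt, _addr, size in rows:
    stats[mt][0] += 1
    stats[mt][1] += size

for mt, (count, total) in stats.items():
    print(mt, count, total)
```

Each distinct method table produces one statistics row, which is why the per-type counts and total sizes in the Statistics section always add up to the individual rows above them.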
What's keeping these byte arrays alive? To find out, you are going to create a heap index. This is an auxiliary data structure that msos can then use to very quickly show you why your objects aren't being reclaimed. To build the index, run !bhi --nofile. This dump file is pretty small, so the index should build in less than a second.
Once the index is ready, pass some of the byte arrays to the !paths command, which tells you why these objects aren't being reclaimed:
1> !paths 00000049019c3520
0000000000000000 -> 00000049019c34f0 finalization handle
-> 00000049019c34f0 MemoryLeak.Employee
-> 00000049019c3508 MemoryLeak.Schedule
-> 00000049019c3520 System.Byte[]
Total paths displayed: 1
1> !paths 00000049019cf9d8
0000000000000000 -> 00000049019cf9a8 finalization handle
-> 00000049019cf9a8 MemoryLeak.Employee
-> 00000049019cf9c0 MemoryLeak.Schedule
-> 00000049019cf9d8 System.Byte[]
Total paths displayed: 1
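Conceptually, the heap index records which objects reference which, so that !paths can search from a GC root (here, a finalization handle) to the target object. A minimal sketch of that search, over a tiny hypothetical reference graph mirroring the output above:

```python
from collections import deque

# Hypothetical forward-reference graph: root handle -> Employee -> Schedule -> Byte[]
refs = {
    "finalization handle": ["Employee"],
    "Employee": ["Schedule"],
    "Schedule": ["Byte[]"],
    "Byte[]": [],
}

def path_from_root(root, target):
    """Breadth-first search from a GC root to the target, returning one retention path."""
    queue = deque([[root]])
    seen = {root}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for ref in refs.get(path[-1], []):
            if ref not in seen:
                seen.add(ref)
                queue.append(path + [ref])
    return None

print(path_from_root("finalization handle", "Byte[]"))
```

Any object reachable from a root along such a path cannot be reclaimed, which is exactly what the !paths output is telling you.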
This is interesting. It looks like the byte arrays aren't going away because they are retained by Schedule objects, which are retained by Employee objects, which are retained because they require finalization. To verify, run !frq --stat, which shows you objects ready for finalization. There should be hundreds of Employee objects waiting for finalization, and in the meantime retaining a lot of memory through byte arrays.
If you'd like, you can review the source code for this application and see why Employee objects are queuing up for finalization. A similar scenario is discussed in Tess Ferrandez's blog on MSDN.
Something else to experiment with:
Now that you know there are Employee objects around, how do you know all the byte arrays are actually retained by these objects without going over each one with the !paths command? Well, this is where a heap query can be useful. Run the following commands to determine which fields the Employee and Schedule classes have:
1> !hq --tabular Class("MemoryLeak.Employee").__Fields
Name      Type
_schedule MemoryLeak.Schedule
Rows: 1
1> !hq --tabular Class("MemoryLeak.Schedule").__Fields
Name  Type
_data System.Byte[]
Rows: 1
Now, run the following query to find the total size of all byte arrays referenced by Employee objects:
1> !hq --tabular (from e in ObjectsOfType("MemoryLeak.Employee") _
   let baSize = (int)e._schedule._data.__Size select baSize).Sum()
23706760
Rows: 1
This helps verify the theory that all the byte arrays are indeed related to Employee instances.
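The LINQ-style heap query maps naturally onto a comprehension. Here is a hedged Python analogy with hypothetical Employee and Schedule stand-ins (the real objects live in the dump; these classes just mirror the fields the query touches):

```python
# Hypothetical stand-ins for the heap objects queried above.
class Schedule:
    def __init__(self, data):
        self._data = data

class Employee:
    def __init__(self, size):
        self._schedule = Schedule(bytearray(size))

employees = [Employee(10024) for _ in range(3)]

# Equivalent of: (from e in ObjectsOfType(...) let baSize = ... select baSize).Sum()
total = sum(len(e._schedule._data) for e in employees)
print(total)  # 30072
```

The shape is the same: enumerate objects of a type, navigate a field path on each, and aggregate the result.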
In this section, you will analyze a dump of a working ASP.NET application. You will use msos to determine which HTTP requests are currently in flight, and inspect custom application objects.
Navigate to the directory where you put the 32-bit version of msos, and run the following command to open the dump (specify the path to the dump as necessary):
msos -z web.dmp
Now, write a query that finds and displays all HTTP requests. The System.Web.HttpRequest class represents an HTTP request, and we can easily find them using !hq (or !dumpheap):
1> !hq --tabular ObjectsOfType("System.Web.HttpRequest")
[0000000005b3ec88 System.Web.HttpRequest]
[0000000006a32500 System.Web.HttpRequest]
[0000000006adb590 System.Web.HttpRequest]
Rows: 3
Let's see which fields HttpRequests have:
1> !hq --tabular Class("System.Web.HttpRequest").__Fields
Name Type
_wr System.Web.HttpWorkerRequest
_context System.Web.HttpContext
_httpMethod System.String
_requestType System.String
_path System.Web.VirtualPath
_rewrittenUrl System.String
_filePath System.Web.VirtualPath
_currentExecutionFilePath System.Web.VirtualPath
_pathInfo System.Web.VirtualPath
_queryStringText System.String
_queryStringBytes System.Byte[]
... snipped for brevity ...
Looks like the _path field could be interesting. Dump out all the paths in these HTTP requests:
1> !hq --tabular from req in ObjectsOfType("System.Web.HttpRequest") select req._path._virtualPath
/
/__browserLink/requestData/4ecfd108db2b4b98a5577024ea22f466
/Home/About
Rows: 3
Cool. Now, are any of these requests still running?
1> !hq --tabular from req in ObjectsOfType("System.Web.HttpRequest") _
select new { req._path._virtualPath, req._context._response._completed }
_virtualPath _completed
/ False
/__browserLink/requestData/4ecfd108db2b4b98a557... False
/Home/About False
Rows: 3
They are all still running! On which thread?
1> !hq --tabular from req in ObjectsOfType("System.Web.HttpRequest") _
where !(bool)req._context._response._completed _
select new { _
req._path._virtualPath, _
req._context._thread.m_ManagedThreadId _
}
_virtualPath m_ManagedThreadId
/ <null>
/__browserLink/requestData/4ecfd108db2b4b98a557... <null>
/Home/About 8
Rows: 3
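The where/select query above also reads naturally as a filter-and-project comprehension. A small sketch over hypothetical request records carrying just the fields the query uses:

```python
# Hypothetical request records mirroring the fields used in the query above.
requests = [
    {"path": "/", "completed": False, "thread_id": None},
    {"path": "/Home/About", "completed": False, "thread_id": 8},
    {"path": "/old", "completed": True, "thread_id": 3},
]

# Equivalent of: where !completed select new { _virtualPath, m_ManagedThreadId }
in_flight = [(r["path"], r["thread_id"]) for r in requests if not r["completed"]]
print(in_flight)
```

A null thread, as in the query output, simply means no thread is currently assigned to the request.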
Something else to experiment with:
At some point, writing these queries by hand gets pretty tiresome. That's where aliases come in. You can create an alias that executes any msos command, and even load these aliases from a file. Exit msos and run it again with the following command (specify the full path to aspnet.aliases as necessary):
msos -z web.dmp -i aspnet.aliases
msos then reads the contents of aspnet.aliases and executes the commands from the file. The only command is .newalias, which creates a new alias called reqthreads. To execute the alias, use %:
1> % reqthreads
Alias 'reqthreads' expanded to '!hq --tabular from req in ObjectsOfType("System.Web.HttpRequest") where !(bool)req._context._response._completed select new { req._path._virtualPath, req._context._thread.m_ManagedThreadId }'
_virtualPath                                       m_ManagedThreadId
/                                                  <null>
/__browserLink/requestData/4ecfd108db2b4b98a557... <null>
/Home/About                                        8
Rows: 3
Aliases are very useful for repetitive commands, or for sharing useful commands with other people.
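Alias expansion is plain textual substitution: the name after % is looked up in a table and replaced by the stored command. A minimal sketch of the idea (the table contents and helper are hypothetical, not msos internals):

```python
# Hypothetical alias table, as .newalias might populate it.
aliases = {
    "reqthreads": '!hq --tabular from req in ObjectsOfType("System.Web.HttpRequest") ...',
}

def expand(command):
    """Expand '% name' into the stored command text; pass other input through."""
    if command.startswith("% "):
        return aliases[command[2:].strip()]
    return command

print(expand("% reqthreads"))
print(expand("!eeheap"))  # non-alias input is unchanged
```

This is why the output first echoes the expanded command before showing its results.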
Neat! Thread #8 seems to be processing the third request, and the other two are probably not started yet, so there is no thread taking care of them for now. Let's switch to thread #8 and see what it's doing:
1> ~ 8
8> !CLRStack -a
SP IP Function
000000001433E938 0000000000000000 HelperMethodFrame
000000001433E9C4 000000007323F540 System.Threading.Thread.Sleep(Int32)
000000001433e95c = 0000000006ae2a00 (hang.Controllers.HomeController)
000000001433E9CC 00000000142BC5DB hang.Controllers.Mystery.HangIfShouldHang() [c:\Temp\hang\hang\Controllers\HomeController.cs:16]
000000001433e9cc = 0000000006aeff74 (hang.Controllers.Mystery)
000000001433E9E0 00000000142BC577 hang.Controllers.Mystery.DoIt() [c:\Temp\hang\hang\App_Start\RouteConfig.cs:16]
000000001433e9e0 = 0000000006aeff74 (hang.Controllers.Mystery)
000000001433E9EC 00000000142BC3F7 hang.Controllers.HomeController.About() [c:\Temp\hang\hang\Controllers\HomeController.cs:31]
000000001433e9ec = 0000000006ae47d0 (System.Runtime.CompilerServices.CallSite<System.Func<System.Runtime.CompilerServices.CallSite,System.Object,System.String,System.Object>>)
... snipped for brevity ...
It looks like Mystery.HangIfShouldHang called Thread.Sleep, and we're sleeping now. Let's take a look at the Mystery object referenced by the HangIfShouldHang frame:
8> !do 0000000006aeff74
Name: hang.Controllers.Mystery
MT: 0000000013bbf1e4
Size: 12(0xc) bytes
Assembly: C:\Windows\Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files\vs\b65849cf\f6c9efc9\assembly\dl3\a3e7706f\06890397_2bbed001\hang.dll
Value: 331084260
Fields:
Offset Type VT Attr Value Name
0 System.Boolean 1 instance True <ShouldHang>k__BackingField
Well, it's no big surprise that the request isn't done yet! Looks like the ShouldHang property is set to true.
In this section, you will attach msos to a live process on your system and inspect the heap and some application objects.
Run Visual Studio and create a new C# console application project. Double-click app.config in Solution Explorer and keep the XML editor window open.
Navigate to the directory where you put the 32-bit version of msos, and run the following command:
msos --pn devenv
If you have multiple instances of Visual Studio running, msos will list their process IDs, and you'll use the --pid switch instead.
Now, create a heap index by running the !bhi --nofile command. This can take a few seconds. Once you're done, run a query that displays all XmlSchemaSet objects on the heap. Here's how:
1> !hq --tabular from xs in ObjectsOfType("System.Xml.Schema.XmlSchemaSet") select xs
[00000000152b77b4 System.Xml.Schema.XmlSchemaSet]
Rows: 1
What's the size of this XmlSchemaSet? Use !objsize to find out:
1> !objsize 00000000152b77b4
00000000152b77b4 graph size is 146136 objects, 7572412 bytes
Whoa! That's a big graph! Over 140 thousand objects, and over 7MB of memory rooted at this XmlSchemaSet. What's keeping it alive? Find out with !paths.
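!objsize reports the whole graph reachable from the object: walk the outgoing references, count each object once, and total the sizes. A minimal sketch over a hypothetical three-object graph (the names and sizes are invented for illustration):

```python
# Hypothetical object graph: node -> (own size in bytes, referenced nodes).
heap = {
    "schemaSet": (24, ["table", "doc"]),
    "table":     (112, ["doc"]),
    "doc":       (4096, []),
}

def objsize(root):
    """Walk the reference graph from root, counting each reachable object once."""
    seen, stack = set(), [root]
    total_objects, total_bytes = 0, 0
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        size, refs = heap[node]
        total_objects += 1
        total_bytes += size
        stack.extend(refs)
    return total_objects, total_bytes

print(objsize("schemaSet"))  # (3, 4232)
```

Note that "doc" is reachable twice but counted once, which is why a retained-size figure like 7,572,412 bytes can be far larger than the root object itself yet never double-counts shared children.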
Some (or all) of the paths go through an XmlDocument object. Run a heap query or the !do command to display the location field of the XmlDocument object. For example:
1> !hq --tabular Object(0x00000000150c5498).location
C:\Temp\leak1\Sources\app.config
Rows: 1
You've reached the end of the tutorial! Well done! If you're looking for more information, refer to the Home page for the entire suite of msos commands. Thanks for reading.
Copyright (C) Sasha Goldshtein, 2015. All rights reserved.