Visual Studio Performance Testing Quick Reference Guide
The document contains two Tables of Contents (high level overview, and list of every topic covered) as
well as an index. The current plan is to update the document on a regular basis as new information is
found.
The information contained in this document represents the current view of Microsoft Corporation
on the issues discussed as of the date of publication. Because Microsoft must respond to changing
market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and
Microsoft cannot guarantee the accuracy of any information presented after the date of
publication.
This document is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS,
IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT.
Microsoft grants you a license to this document under the terms of the Creative Commons
Attribution 3.0 License. All other rights are reserved.
Microsoft, Active Directory, Excel, Internet Explorer, SQL Server, Visual Studio, and Windows are
trademarks of the Microsoft group of companies.
NOTE
All items that are not marked with a version note should be considered to apply to both VS 2008 and VS 2010
TROUBLESHOOTING
How to enable logging for test recording
Diagnosing and fixing Web Test recorder bar issues
How to enable Verbose Logging on an agent for troubleshooting
Troubleshooting invalid view state and failed event validation
Troubleshooting the VS Load Testing IP Switching Feature
EXTENSIBILITY
New Inner-text and Select-tag rules published on Codeplex
How to Add Custom Tabs to the Playback UI
--NEW-- How to extend recorder functionality with plugins
INDEX
There is a full section near the beginning just on new features in Visual Studio 2010. This list is not even close to complete with respect to all of the new performance testing features, let alone the many other testing features in general. You will also find information about changes to 2010 and issues with 2010 throughout the rest of the document. All of these items should be marked with a balloon stating that the item is new or different.
Also please note that the Microsoft Visual Studio team has renamed the suite. The following changes
apply:
Thanks to all of the people who have contributed articles and information. I look forward to hearing
feedback as well as suggestions moving forward.
Sincerely,
Geoff Gray, Senior Test Consultant – Microsoft Testing Services Labs
If you do this and re-record your Web test, the Referrer header should be included in the request like
this:
3) Headers handled automatically by the engine. Two examples: 1) headers sent and received as
part of authentication. These headers are handled in the Web test engine and can’t be
controlled by the test. 2) cookies, which can be controlled through the API.
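Headers other than the engine-managed ones can be set from a plugin through the API. As a minimal sketch (the URL value here is a placeholder, not from the original document), a request plugin can inject a Referer header before each request it is attached to is sent:

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch: a request plugin that adds a Referer header before the
// request is sent. The URL value is a placeholder.
public class AddReferrerPlugin : WebTestRequestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        e.Request.Headers.Add(
            new WebTestRequestHeader("Referer", "http://localhost/previous.aspx"));
    }
}
```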
General Info (including order of execution) of load and web test plugins and rules
WebTestPlugins get tied to a webtest at the main level of the test. The order of precedence is:
class WebTestPluginMethods : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e) { }
    public override void PreTransaction(object sender, PreTransactionEventArgs e) { }
    public override void PrePage(object sender, PrePageEventArgs e) { }
    // ...the corresponding Post* overrides (PostPage, PostTransaction,
    // PostWebTest) fire in the reverse order and are omitted here...
}
WebTestRequestPlugins get set at an individual request level and only operate on the request(s) they are explicitly tied to, and all redirects/dependent requests of that request.
ValidationRules can be assigned at the request level and at the webtest level. If the rule is assigned at
the webtest level, it will fire after every request in the webtest. Otherwise it will fire after the request it
is assigned to.
ExtractionRules can be assigned at the request level and fire after the request they are assigned to.
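A custom validation rule follows the same pattern whether it is attached at the request level or the webtest level. Below is an illustrative sketch (the class name and FindText property are assumptions, not from the original document) of a rule that checks the response body for a string:

```csharp
using System;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Illustrative custom validation rule; fires after each request it is
// attached to (or after every request if attached at the webtest level).
public class BodyContainsRule : ValidationRule
{
    public string FindText { get; set; }

    public override void Validate(object sender, ValidationEventArgs e)
    {
        e.IsValid = e.Response.BodyString != null
                    && e.Response.BodyString.Contains(FindText);
        if (!e.IsValid)
        {
            e.Message = String.Format("'{0}' not found in response body", FindText);
        }
    }
}
```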
1) These fire based on the load test (meaning each one will fire only once during a full test run)
2) These fire once per test iteration, per vUser.
3) Heartbeat fires once every second, on every agent.
4) ThresholdExceeded fires each time a given counter threshold is exceeded.
NOTE: Each method in section 1 will fire once PER physical agent machine. However, since the agent machines are independent of each other, you do not need to worry about locking items to avoid contention.
NOTE: If you create or populate a context parameter inside the LoadTest_TestStarting method, it will
not carry across to the next iteration.
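A load test plugin hooks these events through ILoadTestPlugin.Initialize. The sketch below assumes that TestStartingEventArgs exposes a TestContextProperties dictionary for the starting iteration (the "RunId" key is a placeholder); as noted above, values seeded this way do not carry over to the next iteration:

```csharp
using System;
using Microsoft.VisualStudio.TestTools.LoadTesting;

// Sketch of a load test plugin that seeds a context parameter for each
// starting test iteration. "RunId" is an illustrative key name.
public class SeedContextPlugin : ILoadTestPlugin
{
    public void Initialize(LoadTest loadTest)
    {
        loadTest.TestStarting += (sender, e) =>
        {
            // Applies only to the iteration that is starting.
            e.TestContextProperties["RunId"] = Guid.NewGuid().ToString();
        };
    }
}
```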
Changed in 2010
In VS 2010, you can have more than one LoadTest plugin, although there is no guarantee about the
order in which they will execute.
You can now control whether a validation rule fires BEFORE or AFTER dependent requests.
At the end of recording a Web test, we now automatically add a Response Time Goal validation rule at the Web test level. This doesn't help much, though, unless you click on the toolbar button that lets you edit the response time goal, as well as the Think Time and Reporting Name for the page, for all recorded requests in a single grid.
http://blogs.msdn.com/slumley/pages/web-tests-work-at-the-http-layer.aspx
File Downloads, Download Size and Storage of files during Web Tests
The web test engine does not write responses to disk, so you don’t need to specify a location for the file.
It does read the entire response back to the client, but only stores the first 1.5 MB of the response in memory.
How the “Test Iterations” Setting impacts the total number of tests executed
In the properties for the Run Settings of a load test, there is a property called “Test Iterations” that tells
VS how many tests iterations to run during a load test. This is a global setting, so if you choose to run 5
iterations and you have 10 vusers, you will get FIVE total passes, not fifty. NOTE: you must enable this
setting by changing the property “Use Test Iterations” from FALSE (default) to TRUE.
This particular test timeout is enforced by the agent test execution code, but load test and Web test
execution are tightly coupled for performance reasons and when a load test executes a Web test, the
agent test execution code that enforces the test timeout setting is bypassed.
How user pacing and “Think Time Between Test Iterations” work
The setting “Think Time Between Test Iterations” is available in the properties for a load test scenario.
This value is applied when a user completes one test, then the think time delay is applied before the
user starts the next iteration. The setting applies to each iteration of each test in the scenario mix.
If you create a load test that has a test mix model “Based on user pace”, then the pacing calculated by
the test engine will override any settings you declare for “Think Time Between Test Iterations”.
Cool down:
Changed in 2010
In 2008
The Load test Terminate method does not fire unless you use a cool down period.
In 2010
The Load test Terminate method always fires.
Sequential – This is the default; it tells the Web test to start with the first row and then fetch rows in order from the data source. When it reaches the end of the data source, it loops back to the beginning and starts again, continuing until the load test completes. In a load test, the current row is kept for each data source in each Web test, not for each user. When any user starts an iteration with a given Web test, they are given the next row of data and then the cursor is advanced.
Random – This indicates to choose rows at random. Continue until the load test completes.
Unique – This indicates to start with the first row and fetch rows in order. Once every row is used, stop
the web test. If this is the only web test in the load test, then the load test will stop.
Sequential – This works the same as if you are on one machine. Each agent receives a full copy of the
data and each starts with row 1 in the data source. Then each agent will run through each row in the
data source and continue looping until the load test completes.
Random – This also works the same as if you run the test on one machine. Each agent will receive a full
copy of the data source and randomly select rows.
Unique – This one works a little differently. Each row in the data source will be used once. So if you
have 3 agents, the data will be spread across the 3 agents and no row will be used more than once. As
with one machine, once every row is used, the web test will stop executing.
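In a coded Web test, the same three access methods appear as the DataBindingAccessMethod value on the DataSource attribute. A hedged sketch follows; the file name, column name, and source name are placeholders, and the exact attribute overload may vary between VS versions:

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch: binding a CSV data source to a coded web test with the
// Unique access method. "users.csv" and "username" are placeholder names.
[DataSource("UserSource",
            "Microsoft.VisualStudio.TestTools.DataSource.CSV",
            "|DataDirectory|\\users.csv",
            DataBindingAccessMethod.Unique,
            "users#csv")]
[DataBinding("UserSource", "users#csv", "username",
             "UserSource.users#csv.username")]
public class DataDrivenWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest request = new WebTestRequest("http://localhost/login.aspx");
        // The bound value for this iteration is available via the context:
        request.QueryStringParameters.Add(
            "user", this.Context["UserSource.users#csv.username"].ToString());
        yield return request;
    }
}
```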
There is a property on each request in a Web test named "Cache Control" in the Web test editor (named "Cache" on the WebTestRequest object in the API used by coded Web tests).
When the Cache Control property on a request in the Web test is false, the request is always
issued.
When the Cache Control property is true, the VS load test runtime code attempts to emulate the Internet Explorer caching behavior (with the "Automatically" setting). This includes reading and following the HTTP cache control directives.
The Cache Control property is automatically set to true for all dependent requests (typically for
images, style sheets, etc embedded on the page).
In a load test, the browser caching behavior is simulated separately for each user running in the
load test.
When a virtual user in a load test completes a Web test and a new Web test session is started to keep the user load at the same level, sometimes the load test simulates a "new user" with a clean cache, and sometimes it simulates a return user that has items cached from a previous session. This is determined by the "Percentage of New Users" property on the Scenario in the load test. The default for "Percentage of New Users" is 0.
Important Note: When running a Web test by itself (outside of the load test), the Cache Control
property is automatically set to false for all dependent requests so they are always fetched; this is so
that they can be displayed in the browser pane of the Web test results viewer without broken images.
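In a coded Web test the same property appears as WebTestRequest.Cache; a minimal sketch (URLs are placeholders):

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch: controlling cache emulation per request in a coded web test.
public class CacheDemoWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest page = new WebTestRequest("http://localhost/default.aspx");
        page.Cache = false;   // always issue this request
        yield return page;

        WebTestRequest logo = new WebTestRequest("http://localhost/logo.gif");
        logo.Cache = true;    // emulate IE caching (honor cache directives)
        yield return logo;
    }
}
```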
A better term to describe a new user is "One Time User." This is because a new user goes away at the end of its iteration; it does not "replace" a different user in the pool. Therefore, the term "New User" should be read as meaning a "one time" user.
The "Percentage of New Users" setting affects the following, whether the tests contained within the load test are Web tests or unit tests:
The value of the LoadTestUserId in the LoadTestUserContext object. This only matters for unit
tests and coded Web tests that use this property in their code. On the other hand if you set
the number of test iterations equal to the user load, then you should get a different
LoadTestUserId regardless of the setting of “Percentage of New Users”.
If you are using the load test feature that allows you to define an “Initial Test” and/or a
“Terminate Test” for a virtual user, then it affects when the InitializeTest and TerminateTest are
run: for "new users" (a more accurate name might be "one time users"), the InitializeTest is run
for the virtual user, the “Body Test” is run just once, and then the “Terminate Test” is run. For
users who are NOT “new users”, the InitializeTest is run once, the Body Test is run many times
(until the load test completes), and then the TerminateTest runs (which might be during the
cool-down period).
The “Percentage of New Users” affects the following Web test features that are not applicable for unit
tests:
The simulation of browser caching. The option affects how the virtual user's browser cache is maintained between iterations of tests. "New users" have an empty cache (note that the responses are not actually cached; only the URLs are tracked), while "return users" have a cache. So if this value is 100%, all virtual users starting a test will start with an empty browser cache. If this value is 0%, all virtual users will maintain the state of the browser cache between iterations of Web tests. This setting affects the amount of content that is downloaded: if an object sits in a virtual user's cache and the object has not been modified since the last time the virtual user downloaded it, the object will not be downloaded. Therefore, new users will download more content than returning users with items in their browser cache.
The handling of cookies for a Web test virtual user: new users always start running a Web test with all cookies cleared. When a user who is not a "new user" runs a Web test after the first one, the cookies set during previous Web tests for that virtual user are present.
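For unit tests that need the LoadTestUserId mentioned above, the load test user context is commonly accessed through TestContext. The "$LoadTestUserContext" key and the UserId property name below are assumptions drawn from common usage; verify them against your VS version:

```csharp
using System;
using Microsoft.VisualStudio.TestTools.LoadTesting;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class UserAwareTest
{
    public TestContext TestContext { get; set; }

    [TestMethod]
    public void LogVirtualUser()
    {
        // Only present when the unit test runs inside a load test.
        LoadTestUserContext userContext =
            TestContext.Properties["$LoadTestUserContext"] as LoadTestUserContext;
        if (userContext != null)
        {
            Console.WriteLine("Running as virtual user {0}", userContext.UserId);
        }
    }
}
```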
Zero percent new users shows a graph where each of the 10 vusers is constantly reused.
Fifty percent new users shows a graph where each of the 10 vusers is constantly reused by half of the
iterations, but the other half are split out among new vusers which never get reused.
One hundred percent new users shows a graph where none of the vusers is ever reused.
[Results tables from the cache-simulation test runs (including "TOR 10 - Caching - NewUsers" and "TOR 11 - Caching - NewUsers - Content Expiration"), each listing the VS request counts ("VS Requests" and "VS Requests Cached") alongside the IIS log breakdown: 200 OK versus 304 Not Modified responses and per-file-type counts for HTM, HTML, GIF and BMP files.]
New users are simulated by "clearing" the cache at the start of each new iteration, whereas the cache is carried from iteration to iteration for return users. This results in many more requests being cached with return users. NOTE: The total # of requests made by VS is the sum of the two VS counters.
For more information, see the section "Add an Expires or a Cache-Control Header" from http://developer.yahoo.com/performance/rules.html. Notice that VS honors the content expiration (this is actually handled by the underlying System.NET component). However, VS still reports the cached file request, even though no call went out on the wire. This is expected behavior since the request was a part of the site. In order to see how many requests went out on the wire, you need to use IIS logs or network traces.
SELECT
sc-status, COUNT(*) AS Total
FROM *.log
WHERE
to_timestamp(date, time) between
timestamp('2010-02-12 02:13:22', 'yyyy-MM-dd hh:mm:ss')
and
timestamp('2010-02-12 02:18:22', 'yyyy-MM-dd hh:mm:ss')
GROUP BY
sc-status
There is an extra thread for each unit test execution thread that is used to monitor the execution of the
unit test, implement timing out of the test, etc. However, the stack size for this thread is smaller than
the default size so it should take up less memory.
Each line here is one of the "errors" entries (#1). Any "errors" entry (#1) that has an associated "error details" will have a link in one or both of the last columns. Click on these to get the details about that specific error instance.
In the example above, Hidden1 and Hidden2 represent hidden field buckets. We call the number at the end the bucket number, e.g. $HIDDEN0 is bucket 0.
The easiest example to explain is a frames page with two frames. Each frame will have an independent
bucket, and requests can be interleaved across the frames. Other examples that require multiple
buckets are popup windows and certain AJAX calls (since web tests support correlation of viewstate in
ASP.NET AJAX responses).
The algorithm to determine that a given request matches a particular bucket uses the heuristic that the
hidden fields parsed out of the response will match form post fields on a subsequent request.
If on a subsequent post we see Field1 and Field2 posted, then the request and response match, and a hidden field bucket will be created for them. The first available bucket number is assigned to the hidden field bucket.
Once a bucket is “consumed” by a subsequent request via binding, that bucket is made available again.
So if the test has a single frame, it will always reuse bucket 0:
Page 1
o Extract bucket 0
Page 2
o Bind bucket 0 params
Page 3
o Extract bucket 0
Page 4
o Bind bucket 0 params
If a test has 2 frames that interleave requests, it will use two buckets:
Or if a test uses a popup window, or Viewstate, you would see a similar pattern as the frames page
where multiple buckets are used to keep the window state.
Some hidden field values are modified in JavaScript, such as __EVENTARGUMENT. In that case, it won't work to simply extract the value from the hidden field in the response and play it back. If the recorder detects that this is the case, it puts the actual value that was posted back as the form post parameter value rather than binding it to the hidden field.
A single page will have just one hidden field extraction rule applied. If there are multiple forms on a given page, there is still just one downstream post of form fields, resulting in one application of the hidden field extraction rule.
1. ClassInitialize and ClassCleanup: Since ClassInitialize and ClassCleanUp are static, they are only
executed once even though several instances of a test class can be created by MSTest.
ClassInitialize executes in the instance of the test class corresponding to the first test method in
the test class. Similarly, MSTest executes ClassCleanUp in the instance of the test class
corresponding to the last test method in the test class.
2. Execution Interleaving: Since each instance of the test class is instantiated separately on a
different thread, there are no guarantees regarding the order of execution of unit tests in a
single class, or across classes. The execution of tests may be interleaved across classes, and
potentially even assemblies, depending on how you chose to execute your tests. The key thing
here is – all tests could be executed in any order, it is totally undefined.
3. TestContext Instances: TestContexts are different for each test method, with no sharing between test methods.
[TestClass]
public class VSClass1
{
private TestContext testContextInstance;
[ClassInitialize]
public static void ClassSetup(TestContext a)
{
Console.WriteLine("Class Setup");
}
[TestInitialize]
public void TestInit()
{
Console.WriteLine("Test Init");
}
[TestMethod]
public void Test3()
{
Console.WriteLine("Test3");
}
[TestCleanup]
public void TestCleanUp()
{
Console.WriteLine("TestCleanUp");
}
[ClassCleanup]
public static void ClassCleanUp ()
{
Console.WriteLine("ClassCleanUp");
}
}
(This consists of 3 test methods, ClassInitialize, ClassCleanup, TestInitialize, TestCleanUp and an explicit declaration of TestContext; only Test3 is shown in the listing above.)
Test1 [Thread 1]: new TestContext -> ClassInitialize -> TestInitialize -> TestMethod1 ->
TestCleanUp
Test2 [Thread 2]: new TestContext -> TestInitialize -> TestMethod2 -> TestCleanUp
Test3 [Thread 3]: new TestContext -> TestInitialize -> TestMethod3 -> TestCleanUp -> ClassCleanUp
The output after running all the tests in the class would be:
Class Setup
Test Init
Test1
TestCleanUp
Test Init
Test2
TestCleanUp
Test Init
Test3
TestCleanUp
ClassCleanUp
Controller-Agent Communications
Question: The load test in our scenario is driving integration tests (implemented using the VS unit
testing framework) so I want the data to be available to the unit test while it is running. I am thinking of
writing a lightweight service that acts as the provider of shared state. I will use the
ILoadTestPlugin.Initialize to initialize / reset the data source (using a filter for agent ID so that it runs
only once) by calling the service, retrieve the data from the service in LoadTest.TestStarting event and
then make this data available to the unit test using the test context. This way, the duration of the test
run is not affected by the state retrieval process. However, I need to be careful in implementation of the
shared state provider so that it doesn’t have a major impact on the test run results (because of
synchronization / contention).
Answer: As you said, the service needs to be super-fast and simple. Maintaining a simple list of
name/value pairs would go a long way. The trickiest thing about the service is what locking to provide.
For example, for state variable keeping a count, we don’t want agents setting the value, as they will step
on each other and lose increments. A better design is to have a first class Increment command that the
service handles. There are similar questions for integrity of string data, although that is probably not as
important as providing a simple counter. Another common pattern is maintaining lists of stuff. One user
is adding things to the list, the other user is consuming them. This is probably best implemented with a
database.
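The first-class Increment command described above can be sketched as the core of such a shared-state service. The class below is illustrative only (the transport, e.g. WCF or remoting, is left out), and the name/value store is the simple list of pairs the answer suggests:

```csharp
using System.Collections.Generic;

// Illustrative core of a shared-state service: agents call Increment
// rather than get-then-set, so concurrent updates are never lost.
public class SharedStateStore
{
    private readonly Dictionary<string, long> counters =
        new Dictionary<string, long>();
    private readonly object sync = new object();

    public long Increment(string name)
    {
        lock (sync)
        {
            long current;
            counters.TryGetValue(name, out current);
            counters[name] = current + 1;
            return current + 1;
        }
    }

    public long Get(string name)
    {
        lock (sync)
        {
            long current;
            counters.TryGetValue(name, out current);
            return current;
        }
    }
}
```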
You can also right-click on a form post or query string parameter in the request tab to start a search.
The recording will have the same name appended with "[Recorded]". This gives you the ability to see the requests the browser made and the responses during recording, and compare them to what the Web test is sending and receiving. You can also search the recording for specific values that were recorded.
Tip: if this value changes each time the test is run, the value from the result viewer will not be in the editor. So rather than adding the extraction rule from the test result, add it from the recorder log instead (since this will have the recorded value, which will also be in the Web test).
1) Any Reporting Names you used will show up in the results table.
2) Any requests with the same name but with different methods will be reported separately.
For the LoadTestTestDetail table, the big differences are that you get the outcome of each test, which virtual user executed it, and the end time of the test.
For the LoadTestPageDetail table, you now get the end time of the page as well as the outcome of the
page.
Another change in VS 2010 is that the default for whether or not to collect details has changed. In VS
2005 and VS 2008 the default was to not collect this detail data. In VS 2010, the default is to collect the
detail data. This is controlled by the Timing Details Storage property on the Run Settings node in a load
test.
So you can still run your own analysis on this data, but there is also a new view in VS that you can use to
get a look at the data. The view is the Virtual User Activity Chart. When a load test completes, there will
be a new button enabled on the load test execution toolbar. It is the detail button below:
When you click on this button you will be brought to the Virtual User Activity Chart. It looks like the
following:
If you look at the bottom of the chart, you will see a zoom bar. The zoom bar allows you to change the
range that you are looking at. The zoom bar overlays one of the graphs from the graph view. So
whichever graph is selected in the graph view, you will see that on the zoom bar. This makes it very
easy to correlate spikes in a graph with what tests/pages/transactions are occurring during that spike.
The legend on the left also has some filtering and highlight options. If you uncheck a page, then all
instances of that page are removed from the chart. If you click to Highlight Errors, then all pages that
failed will have their color changed to red. If you look at the bottom part of the legend, you will see all the
errors that occurred during the test. You can choose to remove pages with certain errors or remove all
successful pages so you only see errors.
There is one other very useful feature of this view. You can hover over any line to get more information
about the detail and possibly drill into the tests that the detail belongs to. For example this is what it
looks like when you hover a detail:
You see the full set of details collected for the test in the usual Web test playback view that you are used to. If it had been a unit test, you would have seen the unit test viewer instead.
http://blogs.msdn.com/slumley/archive/2009/11/07/VS-2010-feature-load-testing-run-comparison-report-in-excel.aspx
http://blogs.msdn.com/slumley/archive/2009/05/22/dev10-feature-load-test-excel-report-integration.aspx
Using Visual Studio Ultimate enables you to generate 250 virtual users of load. To go higher than 250
users, you need to purchase a Virtual User Pack, which gives you 1000 users. You can use the 1000 users
on any number of agents. Note that if you install the Virtual User Pack on the same machine as Visual
Studio Ultimate, you do not get 1250 users on the controller. The 250 virtual users you get with Ultimate
can only be used on "local" runs, not on a Test Controller. If you need to generate more than 1000 users, you
purchase additional Virtual User Packs, which aggregate or accumulate on the Test Controller. In other
words, installing 2 Virtual User Packs on one controller gives you 2000 Virtual Users, which can be run
on any number of agents.
This is what you get when you install Visual Studio Ultimate, which is the ability to generate
load “locally” using the test host process on the same machine that VS is running on. In addition
to limiting load to 250 users, it is also limited to one core on the client CPU.
Note that purchasing Ultimate also gives you the ability to collect ASP.NET profiler traces by
using a Test Agent as a data collector on the Web server.
This is a common configuration if you are scaling out your load agents. With this configuration,
the Test Controller and each Test Agent is on a separate machine.
The advantage of this configuration is the controller is easily shared by team members, and
overhead from the controller does not interfere with load generation or operation of the client.
Note the Test Controller must have one or more Virtual User Packs installed to enable load
testing. Load agents in this configuration always use all cores on the machine.
With configuration A, you install the Test Controller and Test Agent on the same machine as VS,
then configure the Test Controller with Virtual User Packs. This enables you to generate >250
virtual users from the client machine, and unlocks all cores in the processor. Configuration B
shows an alternative configuration, enabled if you configure the machine with Virtual User
Packs using the VSTestConfig command line.
Note that a Virtual User Pack can only be used on one machine at a time, and configuring it on a
machine ties it to that machine for 90 days. So you can’t have the same Virtual User Pack
installed on both the VS client and a separate machine running the Test Controller. See the
Virtual User Pack license for details.
In this configuration, the controller is running on the same machine as the Test client, with
distributed agents running as load generators. This configuration is recommended if you have a
solo performance tester. If your test controller and test agents will be shared by a team, we
recommend running the controller on a separate box. Note that test agents are tied to a single test
controller. You can’t have two test controllers controlling the same agent.
If you are using Visual Studio 2008, these options should look familiar to you as the VS 2008
load agents and controller offered the same configuration options. The new twist with VS 2010 is
the Virtual User Packs, which offer you more flexibility in how you configure your load agents.
The Test Controller and Test Agent are “free” when you purchase Ultimate.
It is not recommended to use ordered tests in a load test. In the load test results, you do not get the
pass/fail results, test timings or transaction timings for any of the inner tests. You just get a Pass/Fail
result and duration for the overall ordered test.
To address this issue, there is a new test mix type in VS2010 called Sequential Test Mix. Here is what it
looks like in the load test wizard:
For this mix type, you set the order of tests that each virtual user will run through. You can mix web and
unit tests in the mix and you will get the individual test, page and transaction results. When a virtual
user completes the last test in the mix, it will cycle back to the first test in the mix and start over.
This will launch a dialog; select WebTest1. Then repeat the same steps and add WebTest2. Now just run WebTest3 and you will execute both tests. Web test composition has been available since VS 2008.
When you choose to parameterize the web servers in a Web test, you may see more web servers listed than your test actually calls. This is expected behavior: the parameter parser finds websites that reside inside query strings. Notice this in the .webtest file:
<QueryStringParameter Name="Source"
Value="https%3A%2F%2Ffanyv88.com%3A443%2Fhttp%2Flocalhost%3A17012%2Fdefault%2Easpx"
RecordedValue="https%3A%2F%2Ffanyv88.com%3A443%2Fhttp%2Flocalhost%3A17012%2Fdefault%2Easpx" CorrelationBinding=""
UrlEncode="False" UseToGroupResults="False" />
Any Query String that has a URL gets added to the server list
Any Form Post parameter that has a URL gets added to the server list
NO added header value makes it into the list
If the form post or query parameter NAME is a URL (not the value, but the name of the
parameter), it does NOT get added.
Agents to Use
The agent names that are entered should be the names of agents that are connected to the controller to which the
load test will be submitted. They should be the simple computer names of the agents (as seen in the “Computer
Name” field in the Control Panel). Unfortunately, at this time, if you switch to submitting the load test to a
different controller, you will need to change the value for “Agents to Use” as there is no way to parameterize
this list to vary depending on the controller used. This list of agents designates a subset of the agents that are connected to the controller and are in the Ready state when the load test starts (they may be running a different load test or other test run when the load test is queued, as long as they become Ready when the load test is taken out of the Pending state and starts running), and that meet any agent selection criteria required to allow the test run to be run on the agent. The Scenario will run on all agents in the list that meet these criteria, and the user
load for the Scenario will be distributed among these agents either evenly (by default) or according to any agent
weightings specified in the Agent properties for the agents (from the “Administer Test Controllers” dialog in Visual
Studio).
In Visual Studio 2008, if you wanted to conditionally execute some requests or you wanted to
loop through a series of requests for a given number of times, you had to convert a declarative
web test to a coded web test. In VS2010, these options are exposed directly in declarative
webtests.
The ability to add these is exposed by right-clicking on a request and selecting the option you want from the context menu:
The context menu showing the loop and condition insert options
Open QTAgentService.exe.config
Add "<add key="WorkingDirectory" value="<location to use>"/>" under the <appSettings> node.
Create the <location to use> folder.
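Put together, the edited QTAgentService.exe.config would contain something like the fragment below (the folder path is a placeholder; use whichever location you created in step 3):

```xml
<configuration>
  <appSettings>
    <!-- Placeholder path: create this folder on the agent first -->
    <add key="WorkingDirectory" value="D:\AgentWorkingDir"/>
  </appSettings>
</configuration>
```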
If you wish to use the machine’s IE proxy settings then you can set the Proxy property to “default”
(without the quotes). In this case you should turn off Automatic Proxy Detection on each agent.
Automatic Proxy detection is very slow and can greatly impact the amount of load you can drive on an
agent.
In 2008
By default, web test playback ignores proxy servers set for localhost, so enabling a proxy for 127.0.0.1
(which is where Fiddler captures) will not result in any captured data. To make this work, either add a
plugin with the following code, or put the following code in the Class constructor for a coded web test:
this.Proxy = "http://localhost:8888";
WebProxy webProxy = (WebProxy)this.WebProxy;
webProxy.BypassProxyOnLocal = false;
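Wrapped as a complete plugin, the code above would look something like this sketch:

```csharp
using System.Net;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch: route web test traffic through Fiddler (http://localhost:8888)
// and stop bypassing the proxy for local addresses.
public class FiddlerProxyPlugin : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e)
    {
        e.WebTest.Proxy = "http://localhost:8888";
        WebProxy webProxy = (WebProxy)e.WebTest.WebProxy;
        webProxy.BypassProxyOnLocal = false;
    }
}
```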
In 2010
To get fiddler to work in VS 2010, simply open Fiddler, then start playing the web test. There is no need
to code for anything.
Below is how you can set the memory limit to 512 MB. The size of the memory limit you use will vary based on the machine, the testing being done, and how much memory you have. Change the values as needed, and note that the time is in milliseconds.
Add a key to the "appSettings" section of the file (add the "appSettings" section if needed) with the
name "LoadTestMaxErrorsPerType" and the desired value.
<appSettings>
<add key="LoadTestMaxErrorsPerType" value="5000"/>
</appSettings>
In 2008
To enable your application to use Server GC, you need to modify either the VSTestHost.exe.config or
the QTAgent.exe.config. If you are not using a Controller and Agent setup, then you need to modify the
VSTesthost.exe.config. If you are using a controller and agent, then modify the QTAgent.exe.config for
each agent machine. Open the correct file. The locations are
In 2010
The agent service in VS 2010 is now set to Server GC by default. No need to take any action here.
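For the 2008 case above, the edit itself is the standard .NET server GC switch. A minimal sketch of what the relevant portion of VSTestHost.exe.config or QTAgent.exe.config should contain (merge this into the existing file rather than replacing it):

```xml
<configuration>
  <runtime>
    <!-- Run the test host process with the server garbage collector -->
    <gcServer enabled="true" />
  </runtime>
</configuration>
```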
To retrieve a list of agents assigned to a controller without using the VS IDE, look in:
In 2008
<install point>\Microsoft Visual Studio 9.0 Team Test Load
Agent\LoadTest\QTControllerConfig.xml
In 2010
<install point>\Microsoft Visual Studio
10.0\Common7\IDE\QTControllerConfig.xml
The most common use for IP Switching is load testing against a load balancer. Load balancers typically
use the IP address to route requests to a particular Web server in the farm. So if you have 2 agents
driving load to 3 Web servers, all of the traffic comes from two IPs (one on each agent), and only two of
the three web servers would get any traffic. IP Switching provides a way to have traffic come from
multiple IPs on the same agent, enabling the load balancer to balance load across the farm.
VSTT currently limits the number of unique IP addresses to 256 per agent. In most testing situations, this
will be plenty of addresses. The main place where this limitation might impact you is if you are running a
large test where every single user must have a separate IP Address for some sort of session state. This is
pretty unusual.
In VS 2008, there is no way to have a given virtual user always use the same IP. That is, with IP switching
turned on, a given user will use multiple IPs out of the IP pool, and may use different IPs on subsequent
iterations. In VS 2010, the Web test engine tries to ensure that the same user will always use the same IP
address, but there is no guarantee that this will be the case.
The biggest problem with assigning unique IP Addresses to every user is that currently the IP switching
configuration limits you to a range of 256 IP addresses per agent, which would mean you would also be
limited to 256 virtual users per agent. One solution is to use VMs to get multiple load test agents on a
single physical machine.
Where to enable IP Switching for the Load Test Itself (after configuring the agents to use it)
If you just record the values in a web test and post the recorded values, you can run into ASP.NET error
messages about invalid view state or failed event validation. The Visual Studio web test recorder will
normally automatically detect the __VIEWSTATE and __EVENTVALIDATION hidden fields as dynamic
parameters. This means the dynamically extracted values will be posted back instead of the recorded
values.
However, if the web server is load balanced and part of a web farm you may still run into invalid view
state and failed event validation errors. This occurs when not all servers in the web farm use the same
validationKey and the post back request is routed to a different server in the farm than the one on
which the page was rendered.
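If the farm is ASP.NET, the usual fix is to configure an identical machineKey on every server so that they all share the same validationKey. A hedged web.config sketch (the key values below are placeholders; generate your own keys):

```xml
<system.web>
  <!-- Every server in the farm must use these same values; the keys shown are placeholders -->
  <machineKey validationKey="0123456789ABCDEF...generate-your-own..."
              decryptionKey="0123456789ABCDEF...generate-your-own..."
              validation="SHA1" decryption="AES" />
</system.web>
```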
Visual Studio Client Resolution: The problem is that you have two network adapters on the client
machine. The following entries in the controller log confirm that this is the problem:
In regedit:
Read the following support article for the steps to resolve this issue on a test rig:
http://support.microsoft.com/kb/944496
1. Remove the _NT_SYMBOL_PATH in the environment where you start devenv.exe from.
2. Change _NT_SYMBOL_PATH, by putting a cache location in front of the symbol store location.
For more information about symbol paths and symbol servers, go to:
http://msdn.microsoft.com/en-us/library/ms681416(VS.85).aspx
A common root cause is the _NT_SYMBOL_PATH variable defined in the environment pointing to a
somewhat slow symbol server (like \\symbols\symbols).
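For step 2 above, the usual form uses the symbol-server "srv*" syntax with a local downstream cache listed before the slow share. A sketch (the cache path C:\SymCache is an arbitrary example):

```
rem Cache symbols in C:\SymCache before falling back to the slow share
set _NT_SYMBOL_PATH=srv*C:\SymCache*\\symbols\symbols
```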
The below error may appear several times when running a load test where you are using IP Switching. In
most cases, this can be ignored.
The one situation where the presence of this error may indicate a real issue with the test is when the
application is relying on a given iteration to always come through on the same IP address for purposes of
maintaining a session (such as a load balancer like Microsoft ISA Server with the IP Sticky setting turned
on).
You might encounter timeouts when deploying load tests to agents when the deployment contains
many or large files. In that case you can increase the timeout for deployment. The default value is 300
seconds.
In 2010
You have to change the .testsettings file that corresponds to your active test settings in Visual Studio,
because the deployment timeout setting is not exposed via the Visual Studio UI. Check via the menu
Test | Select Active Test Settings (Visual Studio 2010) which file is active. You can find the file in the
Solution Items folder of your solution. Open it in the XML editor, by right clicking it, choosing “Open
With…” and selecting “XML (Text) Editor”.
The TestSettings element will have an Execution element. Add a child element called Timeouts, if not
already present, to the Execution element. Give it a deploymentTimeout attribute with the desired
timeout value in milliseconds. For example:
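The example itself is missing from this copy; based on the description above, the relevant portion of the .testsettings file would look roughly like this (600000 ms, i.e. 10 minutes, is just an example value):

```xml
<Execution>
  <Timeouts deploymentTimeout="600000" />
</Execution>
```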
In 2008
In 2008 you have to change the .testrunconfig file that corresponds to your active test run configuration.
Check via the menu Test | Select Active Test Run Configuration which file is active. You can find the file
in the Solution Items folder of your solution. Add a child element Timeouts under the
TestRunConfiguration element if no such element is already present, and give it a deploymentTimeout
attribute with the desired timeout value in milliseconds. For example:
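As with the 2010 case, the 2008 example is missing from this copy; based on the description, the element would look roughly like this under TestRunConfiguration (600000 ms is an example value):

```xml
<Timeouts deploymentTimeout="600000" />
```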
Visual Studio 2010 added the feature of handling network emulation within the test harness. This
functionality is based on a toolkit that was unofficially released as NEWT
(http://blog.mrpol.nl/2010/01/14/network-emulator-toolkit/).
The default profiles within Visual Studio are somewhat limited, but these can be enhanced by making
additional emulation files or modifying the existing files.
The sample on the next page shows some of the items that can be set and changed. If you create a new
file, save it as a “*.NETWORK” file in the directory above. The name you assign the profile in the XML is
what will be displayed inside Visual Studio.
If you already have custom profiles you created with NEWT, just make sure to add
<NetworkEmulationProfile name="NAME_OF_PROFILE_HERE"
xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
before the <Emulation> tag, and to close it with </NetworkEmulationProfile> after the </Emulation> tag.
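Putting that together, a wrapped custom profile would look like this (the profile name is a placeholder; your existing NEWT settings go between the Emulation tags unchanged):

```xml
<NetworkEmulationProfile name="NAME_OF_PROFILE_HERE"
    xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Emulation>
    <!-- existing NEWT emulation settings go here -->
  </Emulation>
</NetworkEmulationProfile>
```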
In 2008
C:\Program Files\Microsoft Visual Studio
9.0\Common7\IDE\Templates\LoadTest\CounterSets
In 2010
(x86) C:\Program Files\Microsoft Visual Studio
10.0\Common7\IDE\Templates\LoadTest\CounterSets
These files are standard XML files and can be modified to allow for quick and easy re-use of custom sets.
It is recommended that you copy the counter set you wish to enhance and add the name CUSTOM to it
so you will always remember that it is a custom counter set. Or you can create your own totally
independent counter set. The following shows the layout of the file:
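The layout example is missing from this copy; the general shape of a counter set file is along these lines (the element and attribute names here are a sketch based on the counter-set XML used in .loadtest files, not a verbatim copy of a shipped file):

```xml
<CounterSet Name="MyCounterSet CUSTOM" CounterSetType="SystemUnderTest">
  <CounterCategories>
    <CounterCategory Name="Memory">
      <Counters>
        <Counter Name="Available MBytes" />
      </Counters>
    </CounterCategory>
  </CounterCategories>
</CounterSet>
```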
Having a slow WAN between the controller and agents can definitely cause timeouts or delays in
performance counter collection. Each performance counter category is read in a separate operation:
that is one method call at the level of the .NET classes we call, and each call may result in one or more
network reads.
There are some timeout settings for performance counter collection that you can change by editing the
QTController.exe.config file (or VSTestHost.exe.config file when running locally on VS 2008, or in
devenv.config.exe for 2010) and adding these lines:
<appSettings>
<add key="LoadTestCounterCategoryReadTimeout" value="9000"/>
<add key="LoadTestCounterCategoryExistsTimeout" value="30000"/>
</appSettings>
The values are in ms, so 9000 is 9 seconds. If you make this change, also change the load test sample
rate to be larger than this: at least 10, or preferably 15, seconds. With many agents located far from the
controller, it is also recommended to delete most of the categories in the Agent counter set (perhaps
leaving just Processor and Memory).
<DefaultCountersForAutomaticGraphs>
<DefaultCounter CategoryName="Memory" CounterName="Available MBytes"/>
</DefaultCountersForAutomaticGraphs>
In the Load Test editor, all of the performance counter categories that start with “LoadTest:” (see the
LoadTest counter set in the load test editor) represent data that is collected on the agents by the load
test runtime engine. These are not real Perfmon counters in the sense that if you try to look at them with
Perfmon you won’t see them, though we make them look like Perfmon counters for consistency in the
load test results database and display. The agents send some of this data (see below) in messages to
the controller every 5 seconds, and the controller rolls up the agent data (e.g. Requests / sec across the
entire rig rather than per agent). The controller returns the rolled-up results to Visual Studio for display
during the run and also stores them in the load test results database.
[Requests Per Second Counters] The VS RPS counter does not count cached requests, even though VS is
sending an HTTP GET with If-Modified-Since headers for them.
What data is sent every 5 seconds? We do everything possible to limit how much data is sent back in
that message. What we do send back is the average, min, and max values for all of the pseudo
performance counters in the categories that start with “LoadTest:” that you see under the “Overall”,
“Scenarios” and “Errors” nodes in the load test analyzer tree (nothing under the “Machines” node).
Note that the biggest factor in the size of these result messages is the number of performance counter
instances, which for Web tests is mostly determined by the number of unique URLs reported on during
the load test. We also send back errors in these 5 seconds messages, but details about the failed
requests are not sent until the end of the test, so tests with lots of errors will have bigger messages.
Lastly, we only send back metadata such as the category names and counter names once and use
numeric identifiers in subsequent messages, so the messages at the start of the load test may be slightly
larger than later messages.
One thing you could do to reduce the size of the messages is to reduce the level of reporting on
dependent requests. You could do this by setting the “RecordResult” property of the
WebTestRequest object to false. This eliminates the page- and request-level reporting for that request,
but you could add a transaction around that single request, and that would closely match the
page time for that request.
The first thing to do is create a custom class that does the data initialization (as described in the first
part of this post: http://blogs.msdn.com/slumley/pages/custom-data-binding-in-web-tests.aspx). Next,
instantiate the class inside your unit test as follows:
[TestClass]
public class VSClass1
{
    private TestContext testContextInstance;

    [ClassInitialize]
    public static void ClassSetup(TestContext a)
    {
        // Initialize the custom data source once for the whole test class
        string m_ConnectionString = @"Provider=SQLOLEDB.1;Data
Source=dbserver;Integrated Security=SSPI;Initial Catalog=Northwind";
        CustomDs.Instance.Initialize(m_ConnectionString);
    }

    [TestMethod]
    public void Test1()
    {
        // Note: the casing must match the class name (CustomDs)
        Dictionary<string, string> dictionary = CustomDs.Instance.GetNextRow();
        //......Add the rest of your code here.
    }
}
Verifying saved results when a test hangs in the “In Progress” state after the test has
finished
If you run a test and either the test duration or the number of iterations needed for completion of the
test have been reached, but the test stays in the “In Progress” state for a long time, you can check if all
of the results have been written to the load test results repository by running this SQL query against the
LoadTest database:
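The query itself is missing from this copy; a query along these lines works (the LoadTestRun table and its EndTime column are referenced elsewhere in this document; the other column name is an assumption about the schema):

```sql
-- Check whether the controller has finished writing results for the most recent run
SELECT TOP 1 LoadTestRunId, EndTime
FROM LoadTestRun
ORDER BY LoadTestRunId DESC
```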
If the EndTime has a non-NULL value then the controller is done writing results to the load test results
database and it should be safe to restart the rig (killing anything if needed).
This doesn’t necessarily mean that all results from all agents (if the agents got hung) were successfully
written to the load test database, but it does mean that there’s no point in waiting before killing the
agents/tests.
Comparison of a test with and without warmup. Notice the total number of tests run is different, but the recorded times are
close enough to be valid for reporting.
Scenario 2:
When you compare the summary page results to the detailed results values, there can be a difference in
what is reported. This is due to the implementation of collecting the timing details, which are currently
flushed when a test iteration ends. For iterations that are in progress with in-flight requests, we give the
iteration 10 seconds (configurable via cooldown) to complete any in-flight requests. If they do not
complete, the transactions in those iterations are not counted in the details, but are counted in the
summary page.
Comparing VS Results to IIS Results for 100% new vs. 100% return
This section shows how VS handles caching and how to interpret the numbers shown for total requests
and cached requests.
New users are simulated by “clearing” the cache at the start of each new iteration, whereas the cache is
carried from iteration to iteration for return users. This results in many more requests being cached with
return users. NOTE: the total # of requests made by VS is the sum of the two VS counters shown
(VS Requests and VS Requests Cached).

[Recovered from the original side-by-side panels of IIS log status counts and VS counters:]

Return users: IIS logs: 200 OK: 3,871; 304 Not Modified: 29,462. VS Requests: 33,333; VS Requests
Cached: 84,507.

TOR 10 - Caching - NewUsers: IIS logs: 200 OK: 45,286; 304 Not Modified: 134. VS Requests: 44,742;
VS Requests Cached: 42,090.

TOR 11 - Caching - NewUsers - Content Expiration: IIS logs: 200 OK: 3,874; 304 Not Modified: 75.
VS Requests: 3,949; VS Requests Cached: 84,842.

[The original panels also break the IIS log entries down by file type (GIF, HTM, HTML and BMP counts,
e.g. BMP counts of 32,719, 44,622 and 3,330 across the three runs), but those columns were scrambled
in extraction and their per-run assignment is not reproduced here.]

For background on content expiration, see the section “Add an Expires or a Cache-Control Header” at
http://developer.yahoo.com/performance/rules.html. Notice that VS honors the content expiration (this
is actually handled by the underlying System.NET component). However, VS still reports the cached file
request, even though no call went out on the wire. This is expected behavior since the request was a
part of the site. In order to see how many requests went out on the wire, you need to use IIS logs or
network traces.
SELECT
sc-status, COUNT(*) AS Total
FROM *.log
WHERE
to_timestamp(date, time) between
timestamp('2010-02-12 02:13:22', 'yyyy-MM-dd hh:mm:ss')
and
timestamp('2010-02-12 02:18:22', 'yyyy-MM-dd hh:mm:ss')
GROUP BY
sc-status
Data sources for data-driven tests get read only once
When initializing data driven tests the data is read ahead of time, and only retrieved once. Therefore
there is no need to optimize the connection to the data source.
The amount of space required in the load test results repository to store the Timing Details data may be
very large, especially for longer running load tests. Also, the time to store this data in the load test
results repository at the end of the load test is longer because this data is stored on the load test agents
until the load test has finished executing at which time the data is stored into the repository. For these
reasons, Timing Details is disabled by default. However if sufficient disk space is available in the load
test results repository, you may wish to enable Timing Details to get the percentile data. Note that
there are two choices for enabling Timing Details in the Run Settings properties named "StatisticsOnly"
and "AllIndividualDetails". With either option, all of the individual tests, pages, and transactions are
timed, and percentile data is calculated from the individual timing data. The difference is that with the
StatisticsOnly option, once the percentile data has been calculated, the individual timing data is deleted
from the repository. This reduces the amount of space required in the repository when using Timing
Details. However, advanced users may want to process the timing detail data in other ways using SQL
tools, in which case the AllIndividualDetails option should be used so that the timing detail data is
available for that processing.
However, if you are trying to monitor another SQL Server instance that is not the default SQL server
instance, the names of the performance counter categories for that instance will have different category
names. For example, if your SQL server instance is named "INST_A", then this performance counter
category will be named "MSSQL$INST_A:Locks". To change the load test to collect these performance
counters, the easiest thing to do is open the .loadtest file with the XML editor or a text editor and
replace all instances of "SQLServer:" with "MSSQL$INST_A:" (substituting your own instance name in
the replacement string).
90% of the total transactions were completed in less than <time> seconds
95% of the total transactions were completed in less than <time> seconds
The calculation of the percentile data for transactions is based not on the sampled data that is shown in
the graph, but on the individual timing details data that is stored in the table
LoadTestTransactionDetail. The calculation is done using a SQL stored procedure that orders the data
by the slowest transaction times, uses the SQL “top 10 percent” clause to find the 10% of the slowest
transactions then uses the min() function on that set of rows to get the value for the 90th percentile
time. The stored procedure in the LoadTest database that does this is
“Prc_UpdateTransactionPercentiles”.
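As a sketch of the technique just described (this is not the actual body of Prc_UpdateTransactionPercentiles, and the column names are assumptions):

```sql
-- 90th percentile: the fastest transaction among the slowest 10 percent
SELECT MIN(ElapsedTime) AS Percentile90
FROM (
    SELECT TOP 10 PERCENT ElapsedTime
    FROM LoadTestTransactionDetail
    ORDER BY ElapsedTime DESC
) AS SlowestTenPercent
```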
VS 2010 – There was a change in the way the recovery model was configured in the
loadtestresultsrepository.sql command that ships with VS 2010, but the change does not take effect due
to a different command further down in the script. This issue is known and will be resolved in a future
version.
To change either version - Open SQL Management Studio and connect to the server that has the
LoadTest/LoadTest2010 database. Right click on the LoadTest/LoadTest2010 database in “Object
Explorer” and choose “Properties”. Go to the “Options” page and change the drop down for “Recovery
Model” to Simple.
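The same change can also be scripted rather than done through the Management Studio UI:

```sql
-- Use LoadTest2010 instead of LoadTest for the VS 2010 repository
ALTER DATABASE LoadTest SET RECOVERY SIMPLE
```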
InstanceName field in results database are appended with (002), (003), etc.
Question: In the LoadTest databases, the Instance Names are sometimes appended with “(002)”,
etc. For example, I have a transaction called “Filter Render Request” and in the load test database I
have two transactions. Also, I have a URL pointing to RenderWebPartContent and I have several
entries. Can someone give me a quick explanation?
Answer: To make a long story short, it is a unique identifier that is used mostly internally to distinguish
between cases where you have the same test name in two different scenarios in the load test, or the
same page name (simple file name) in different folders in two different requests.
For VS 2010:
http://blogs.msdn.com/slumley/archive/2010/02/12/description-of-tables-and-columns-in-vs-2010-load-test-database.aspx
Once in the manager, you choose a controller name from the drop down list (or <local> if you want the
results from the local database) and the manager will populate with the tests it finds. You can select
whatever test results you wish to move, and then choose “export” to move them into a file (compressed
with an extension of .ltrar). That file can be moved to another machine and then imported into a new
results store.
You might wonder why there is still an entry in the Test Results window and what effect
importing/exporting the test result would have.
For most result types all of the data needed to display the result can be exported into a TRX file. This is
not true for load tests. The only thing that a TRX file stores for a load test is the connection string to the
database with the results and the run id of the run to load. So if you do not run the load test with
storage type set to database, then exporting the TRX file is useless; it will contain no usable data for
later analysis. So ALWAYS use a database when running load tests.
Notice the call to LoadTest.dbo.LoadTestRun is hardcoded, which is what causes the feature to break.
In general, we recommend you use the LoadTest database name (or in the case of 2010, the database is
named LoadTest2010).
In VS 2008, if a Web test trx file is opened in an XML editor, you may notice a NaN page time for some
of the responses.
<Response url="http://teamtestweb1/storecsvs/"
contentType="text/html; charset=utf-8"
statusLine="HTTP/1.1 200 OK"
pageTime="NaN"
time="0.006"
statusCodeString="200 OK"
contentLength="12609">
This only happens to non top-level requests, i.e. redirects and dependents.
At the end of Web test execution, all results (objects and their members) are serialized to a trx file,
including the pageTime. NaN is the result of calling .ToString() on a float or double value that was never
initialized; it means the pageTime was not known at the time this entry was written to the trx.
The following is the screenshot of the Web test result file opened in the Playback window. It shows how
this property is set in the code.
The highlighted one is the top-level page. It is redirected, and the redirected-to page has some
dependent requests. The ‘Total Time’ for the top-level page, i.e. the page time, refers to the time to
send all requests and receive all responses (including the redirects and dependents) from the Web
server. It is only calculated and populated for the primary request, not for the ‘redirected to’ request or
the dependents. This is why you see a NaN page time in the XML file.
TRX files are the result files created when you run a unit or web test in Visual Studio. There are two
pieces here. The first describes how TRX files are constructed in VS 2008, and the second part shows
how things have changed for VS 2010
In 2008
In VS 2008, if you run a Web test outside a load test, the entire Web test result is serialized to the trx
file. So each request and response in the test is serialized. If the test runs multiple iterations, the trx file
can get very large.
We added optimizations to control the amount of data stored in the TRX for request/response
bodies by storing only one copy of each unique response body (in multi-iteration runs you may end up
with multiple identical responses). Also, the request and response bodies are compressed to
dramatically reduce the amount of space they require in the TRX.
There is a test context snapshot stored before every request (including dependent requests).
Sometimes, you’ll find really large VIEWSTATE in a test context that can make them really large.
The request/response headers and the test context snapshots are not compressed and duplicates are
not eliminated, so they have the potential to become bloated.
In 2010
In VS 2010, there is one major change in how the WebTestResultDetails class is persisted upon test
completion. Instead of writing the WebTestResultDetails class to a trx file, VS serializes the object to a
*.webtestResult file. The relative path of this file is added as an element to the trx file; “relative” here
means relative to the path of the corresponding trx file.
The file only exists on the machine that you run the Web test from, i.e. the VS / mstest machine.
For a local run, the file goes to \TestResults\prefix_Timestamp\In\TestExecuId.
For a remote run, the file goes to \TestResults\prefix_Timestamp\In\Agent\TestExecuId.
When you open a Web test trx file from the Test Results window, VS reads the value of
WebTestResultFilePath from the trx file, and then loads the .webtestResult file from
TrxDirectory\WebTestResultFilePath into the Web Test Result window.
This behavior has been changed in SP1, HOWEVER, there are a couple of gotchas to be aware of:
The compressed size will only be reported in VS if the response is NOT using “chunked
encoding”
The test results window will not indicate whether the reported size is the compressed or the
uncompressed size.
VS has a receive buffer that defaults to 1,500,000 bytes and it throws away anything over that.
The number reported is what is saved in the buffer, not the number of bytes received. You can
increase the size of this buffer by altering the ResponseBodyCaptureLimit at the start of your
test. This needs to be done in code and cannot be modified in a declarative test.
CSV files created in VS or saved as Unicode will not work as data sources
If you create a CSV file in VS, it saves the file with a hidden 2-byte prefix (a byte-order mark) that
indicates the encoding type. When you select the file as a data source, the first column will be prefixed
with two unusual characters. The problem is the two bytes at the front of the file, which cannot be seen
unless the file is viewed in hex format. The solution is to open the file in Notepad and save it as ANSI.
Also, if a data file is created in Windows® Notepad or Microsoft® Excel® and saved as Unicode, it looks
good in Notepad or VS, but cannot be read in web tests. The solution is to open the file in notepad and
save as ANSI.
When you are recording a web test, VS uses the time between steps as you record to generate the
ThinkTime values after each request. When you add a comment, the recorder switches from RECORD
mode to PAUSE mode, however, the timer to calculate think times does not pause, so you end up with
think times that include the time you spent typing in the comment. This is also true if you manually
pause the recording for any other reason. To fix this, do the following:
In 2008
Go through the test after recording is complete and adjust the think times manually.
In 2010
VS 2010 offers a new dialog to make this easy. See the section New “Reporting Name” property for web
requests
In the VS IDE, you can right click on a webtest file and choose to “Open in XML Editor”. Once you do that
and then close the window, the next time you double click on the webtest to open it, the file should
open in the default declarative view. However, in VS 2010 there is a known issue that causes the
webtest to always be opened in XML mode.
---------------------------------------------------------------------
<None Include="WebTest1.webtest">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
<SubType>Designer</SubType>
</None>
---------------------------------------------------------------------
Possible DESKTOP HEAP errors when driving command line unit tests
When you run a large number of unit tests that call command line apps, and they are run on a test rig
(this does not happen when running tests locally), you could have several of the tests fail due to running
out of desktop heap. You need to increase the amount of heap that is allocated to a service and
decrease the amount allocated to the interactive user. See the following post for in depth information,
and consider changing the registry as listed below:
http://blogs.msdn.com/ntdebugging/archive/2007/01/04/desktop-heap-overview.aspx
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems
OLD SETTING: “Windows SharedSection=1024,3072,512”
NEW SETTING: “Windows SharedSection=1024,1024,2460”
Changed in 2010
There is a QFE available that fixes the following bugs with Goal Based Load Patterns that were
introduced in VS 2008 SP1:
If you defined a goal based load pattern using a performance counter from any of the
“LoadTest:*” categories, an error would occur and the user load would not be adjusted
according to the goal.
If you defined a goal based load pattern using a “single instance” performance counter (for
example Memory\Available MBytes), an error would occur and the user load would not be
adjusted according to the goal.
If the Machine Name property entered for the goal based performance counter did not exactly
match the casing of the computer name, an error would occur and the user load would not be
adjusted according to the goal.
Also, even if you know the value, you may see the error near the beginning of the run, since the
transaction may not have run yet, so the instance to check may not yet exist.
Resolution: We've analyzed this memory leak and determined that it is a bug in the
System.Net.HttpWebRequest class (used to issue Web test requests) that occurs when the Web test
targets HTTPS Web sites. A workaround is to set the Load Test to use the "Connection Pool" connection
model. This problem is fixed in VS 2010.
This can occur if you have signed code in your test harness and you make changes to some of the code
without resigning it. You can try either one of the below options to attempt to resolve it:
OPTION 1:
1. In the .NET Framework 2.0 Configuration, Go to Runtime Security Policy | Machine | All_Code
2. Right click All_Code, select "New...", and select any name for your new group. Click Next
3. Select URL as your condition
4. Type \\machine_name\shared_folder\assembly.dll or \\machine_name\shared_folder\* and
click Next
5. Make sure permission is set to FullTrust
6. Click Next, and Finish
7. Close all your Visual Studio IDEs, restart, and try again
OPTION 2:
This issue can also occur if you have a downloaded zip file (or other file) that is flagged in its properties
as “Blocked”. You need to unblock it before use. Right click on the file and go to the properties:
When you use the new feature in VS 2010, “Save Log on Test Failure”, you may get an “Out of disk
space” error. Depending on the number of “Maximum Test Logs” and the size of data for each iteration,
the logs being saved can be very large (for instance, a webtest that uploads and/or downloads large
files).
When a particular request encountered an error in VS 2008 while running a load test (with “Timing
Details Storage” set to “All Individual Details”), you could go to the details of the error and see the
information specific to that request. This option is no longer in VS 2010. It has been replaced by the new
detailed logging feature that logs the entire Web test or unit test result for a failed virtual user iteration.
The limitation in the product still exists in 2010; however, you can unlock all processors by
installing a vUser license on the local machine. See “New Load Test and Load Test Rig Licensing and
configurations” for more information.
If you are experiencing the bug, you can work around it by:
These are often due to exhaustion of available connection ports either on the VS machine(s) or on the
machines under test. To see if this could be happening, open a CMD window on your VS machine(s) and
on the machine(s) under test, and run the following command:
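The command itself is not reproduced in this copy of the document; a typical check (assuming the standard Windows netstat tool in a CMD window) is:

```
rem Count connections currently in the TIME_WAIT state
netstat -ano | find /c "TIME_WAIT"
```

If the count is anywhere near the size of the ephemeral port range, port exhaustion is the likely cause.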
The TIME_WAIT state is a throwback from the old days (well, more accurately, the default of 4 minutes is
the throwback). When a connection is closed, the side that closed it first holds the socket in TIME_WAIT
so that any stray packets still in flight for the old connection can expire before the same port is reused
for a new connection. The 4-minute default dates from years ago, when networks were very slow and
creating a TCP connection was a costly operation; on modern networks it mainly serves to tie up
ephemeral ports under heavy load.
To get around this issue, you need to make more connections available and/or decrease the amount of
time that a connection is kept in TIME_WAIT. In the machine’s registry, open the following key and
either add or modify the values for the two keys shown:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
"TcpTimedWaitDelay"=dword:0000001e (30 seconds)
"MaxUserPort"=dword:0000fffe (65,534 ports)
If you are experiencing the issue on one of the VSTT load machines, you may also need to change the
load test connection model to “Connection Pooling” and increase the pool size considerably.
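The registry values above can also be set from an elevated command prompt; a sketch using reg add (values in decimal):

```shell
rem 30-second TIME_WAIT instead of the 4-minute default
reg add HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters /v TcpTimedWaitDelay /t REG_DWORD /d 30 /f

rem Raise the maximum ephemeral port to 65534 (0xfffe)
reg add HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters /v MaxUserPort /t REG_DWORD /d 65534 /f
```

A reboot is required before these TCP parameters take effect.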
This error will occur if you get a NULL value in the LoadTest column of the LoadTestRun table. To fix it,
go to the table and delete the row that has the NULL value. The occurrence of this issue should be
extremely rare.
A default Hidden extraction rule was added to the request. When the rule fired, the result was:
$HIDDEN2._ListSchemaVersion_{9fcdfcc2-6d4f-4a22-a379-8224954c1d9a
It should have been
$HIDDEN2._ListSchemaVersion_{9fcdfcc2-6d4f-4a22-a379-8224954c1d9a}
This is not a bug, but just a side effect of how VS processes context parameters.
Test results iteration count may be higher than the max test iterations set
When a test run that defines a specific number of test iterations is complete, you may see more tests
run than the number of iterations set in the run properties. This is rare and is caused by the load test
process crashing and restarting. The issue exists in both VS 2008 and VS 2010. The reason is that the
restart file used to resume a load test after QTAgent dies was never updated to include information
about the tests already completed, so the run will always execute the initial number of test iterations
after a restart.
Resolution:
Find out what is causing QTAgent to crash and fix that issue.
A way to control this is to specify a Cool-Down period of 10 minutes in the Load Test’s run
settings. Assuming that the requests in your Web test have the default request timeout of 5 minutes,
all requests still in flight when the load test completes at the one-hour mark should either finish or be
timed out within 5 minutes, and the in-flight tests should then be displayed in the User Details Test chart.
Here’s what I’ve discovered. There is an option in VSTT that allows you to keep VSTestHost alive after a
test run completes: go to “Tools”, “Options”, “Test Tools”, “Test Execution” and see the check box “Keep
test execution engine running between test runs”. This is on by default, and I’m guessing it is on for
you. When you run just a unit test in a test run, this option works and VSTestHost does not get killed
when the test run completes, so neither do its child processes. However, when you run a Web test, this
option seems to be ignored and VSTestHost is killed by a call to Process.Kill(), which I believe kills the
child processes of VSTestHost as well (if you uncheck this option, you’ll see that running the unit test has
the same behavior). I’m not sure why VSTestHost goes away even when this option is set when a Web
test is run – this may have been intentional. Here’s a workaround that seems to work instead:
1. Create a unit test that sleeps for 10 seconds (or whatever time is needed).
2. Create an ordered test that includes your coded Web test first, then the unit test that sleeps.
3. Run the ordered test rather than the coded Web test.
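A sketch of the sleeping unit test from step 1 (the class and method names are illustrative):

```csharp
using System.Threading;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class KeepAliveTests
{
    // Sleeps long enough for any processes started by the Web test
    // (for example NETCAP.EXE) to finish and flush their output.
    [TestMethod]
    public void SleepToKeepHostAlive()
    {
        Thread.Sleep(10000); // 10 seconds; adjust as needed
    }
}
```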
NOTE: an example of this scenario is firing off a batch file that starts a NETCAP.EXE window to gather
trace data during the test run. This NETCAP process must run asynchronously so it will not block the web
test. It must also complete by itself or the resultant trace file will not get written.
Web tests should not start other processes or perform blocking operations, as these will cause
problems with the load test engine. For the NETCAP example, a better solution is to write this as a
VS 2010 data collector.
return _goalLoadProfile;
}
The problem is that if a dependent request has an error, the test will be flagged as failed and
the log for that iteration will be stored, but the log does not contain any details for any
dependent requests. Therefore you do not get any details about why the failure occurred.
To work around this issue, make sure any dependent requests that are having problems are
moved back up to main requests, at least during a test debugging phase.
If you encounter time-outs when running a load test against a WCF service that uses message-level
security, this could be caused by the WCF service running out of security sessions. The maximum
number of simultaneous security sessions is a WCF configuration setting with a default value of 10. Any
additional requests to the service that would lead to more security sessions will be queued.
If you want the service to support more than 10 simultaneous clients, you will need to change this WCF
configuration setting. Another reason you might run out of security sessions is that the client isn’t
properly closing those sessions after it is done with the service.
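If the limit is enforced by WCF’s standard service throttle, the service-side change might look like this sketch (the behavior name is illustrative; attach it to your service):

```xml
<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior name="HigherSessionLimit">
        <!-- Default is 10; raise it to match the expected client load -->
        <serviceThrottling maxConcurrentSessions="100" />
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>
```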
A WCF security session is established by a security handshake between client and service in which
asymmetric encryption is used to establish a symmetric encryption key for additional requests in the
same session. The initial asymmetric encryption is more computationally expensive than the symmetric
encryption that is used for subsequent requests. A client must explicitly close the security session to
release server resources or they will only be released by the server after a time-out in the order of
minutes.
If the client only needs to call the web service once, the message exchange with the symmetric key is
unnecessary and you can save a roundtrip by disabling security sessions. Set the
‘establishSecurityContext’ to false in the app.config of the client. This can also serve as a workaround for
clients that do not properly close the session, but do keep in mind that this will skew your performance
results. So only use this workaround while you fix the client.
For more details on secure sessions and the ‘establishSecurityContext’ property see
http://msdn.microsoft.com/en-us/library/ms731107.aspx
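A sketch of the client-side app.config change, assuming the service uses wsHttpBinding with message security (the binding name is illustrative):

```xml
<system.serviceModel>
  <bindings>
    <wsHttpBinding>
      <binding name="NoSecureSession">
        <security mode="Message">
          <!-- Skip the secure-session handshake; each call stands alone -->
          <message establishSecurityContext="false" />
        </security>
      </binding>
    </wsHttpBinding>
  </bindings>
</system.serviceModel>
```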
During a load test the load agents will write to a file called loadtestitemsresults.dat. If you are planning
to execute a long running load test, you need to be sure that the loadtestitemsresults.dat file will be on
a drive with enough disk space because it can grow into many GBs.
The loadtestitemsresults.dat file is created by the QTAgent or QTAgent32 process. You should add the
key WorkingDirectory to QTAgent.exe.config and/or QTAgent32.exe.config to point to the right drive.
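A sketch of the QTAgent.exe.config fragment (the drive and folder are placeholders):

```xml
<configuration>
  <appSettings>
    <!-- Point the agent's working directory (and loadtestitemsresults.dat)
         at a drive with plenty of free space -->
    <add key="WorkingDirectory" value="D:\LoadTestWorkingDir" />
  </appSettings>
</configuration>
```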
You may run into an issue where a web request fails with an HTTP 411 Length Required
response. This happens on a POST request with no body. It will not always occur, as some web servers
may ignore the missing header; however, RFC 2616 specifies that even with a content length of
zero, the header should still be sent (http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html).
Visual Studio uses its own header collection class to allow for a single collection per request, which
makes the code more efficient. The internal method used to build this collection first removes all
headers that are restricted by the System.Net.HttpWebRequest class (http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.headers.aspx)
and then adds back the appropriate headers. However, the internal code does not add a Content-Length
header when the length is zero. Also, VS does not allow you to directly set any headers that are
controlled by the system (such as Content-Type and Content-Length).
To work around this issue, add a dummy body to your request. Here is an example:
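A sketch of adding a dummy body in a coded Web test (the URL is a placeholder):

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

// Give the empty POST a one-character body so a Content-Length
// header is emitted and the server does not return HTTP 411.
WebTestRequest request = new WebTestRequest("http://myserver/submit");
request.Method = "POST";
StringHttpBody body = new StringHttpBody();
body.ContentType = "application/x-www-form-urlencoded";
body.BodyString = " "; // dummy body
request.Body = body;
```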
--NEW-- Web and load tests stay in “Pending” state for a long time before starting
While this can be caused by many different things, one possible cause is having an _NT_SYMBOL_PATH
environment variable set for the environment where devenv.exe runs.
You may receive the following error when trying to start a load test:
Warning 5/25/2010 4:58:53 PM Could not run load test 'LoadTest1' on agent 'PRITAMB1':
Network emulation is required for load test 'LoadTest1' but the driver is not installed on agent
PRITAMB1.
This is most likely caused by the network emulation drivers failing to install during VS setup. There are
two methods you can try to resolve this issue:
NOTE: If you install just VS and not the remote agent, the Network Emulation driver is not installed. You
must run the command “VSTestConfig NETWORKEMULATION /install” from an elevated VS Command
Prompt. This will install the driver so that you can use it from VS.
Missing ‘>’
--NEW-- Request failure with improperly encoded query strings calling SharePoint
2010
Applies only to 2010
When testing a site built on SharePoint 2010, requests may fail. When running this in a 2010 Web test,
the query string is not encoded at all and the request fails:
1. POST to /global/pages/search.aspx
a. Response – HTTP 302 with location header: /global/pages/Search.aspx?k=ALL(Developer
OR Support Engineer)AND(prname="Engineering" OR
prname="ITOperations")AND(lvl=59 OR lvl=60 OR lvl=61 OR lvl=62)
2. GET to /global/pages/Search.aspx?k=ALL(Developer OR SUPPORT
ENGINEER)AND(PRNAME="ENGINEERING" OR PRNAME="ITOPERATIONS")AND(LVL=59 OR
LVL=60 OR LVL=61 OR LVL=62) HTTP/1.1
a. Response – HTTP 400 Bad Request
b. Fiddler only shows the request as /global/pages/Search.aspx?k=ALL(Developer
c. VS is set to follow redirects on the initial POST so this request was automatic
Resolution:
Visual Studio now has a property on requests called EncodeRedirectedUrl. Set this to true and it should
work as expected. The property is not available in the UI, so you need either a plugin or a coded test to set it.
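A sketch of a Web test plugin that sets this property for every request (the class name is illustrative):

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

// Turns on EncodeRedirectedUrl for every request in the test, so
// redirect locations containing spaces or parentheses get encoded.
public class EncodeRedirectedUrlPlugin : WebTestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        e.Request.EncodeRedirectedUrl = true;
    }
}
```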
--NEW-- Network Emulation does not work in any mode other than LAN
Applies only to 2010
Let’s say you have two NICs and two IP addresses assigned on the agent machine. One is used to
communicate with the controller (intranet) and the other to communicate with an external web site (extranet).
As you may already know, for network emulation the load test has to specify a port number from a port
range that is set for the network type to be emulated. Unfortunately, it also has to specify a source
IP address in the .NET call (HttpWebRequest.ServicePoint.BindIPEndPointDelegate), and it assumes the first
IP address returned by System.Net.Dns.GetHostAddresses is the correct one. In this case, we are
getting the intranet IP address first and end up binding HTTP requests to it.
The solution that worked is to enable IP switching and specify an IP address range consisting of the one
IP address that works. (To set this, open Test > Manage Test Controllers in VS, click Properties for the
agent machine, and fill in the appropriate fields.)
This will enable the load test to use the correct IP address when communicating with the web site.
Error that Browser Extensions are disabled when recording a web test
You might see the following error when trying to record a web test:
To fix this, go to “Tools” -> “Internet Options” and set the following:
The number of HTTP requests per minute exceeded the configured limit. Contact your Forefront TMG
administrator
--NEW-- MaxConnection value in App.Config is not honored when running a load test
If you have a unit test that reads an App.Config file, and you set a maxconnection value in that config,
Visual Studio will ignore that value and default to a connection max of 100. Here is what happens:
Here is my sample test that writes the value of maximum connections to a file –
[TestMethod]
public void TestMethod1()
{
File.WriteAllText("c:\\out.txt", "The current connection limit is "
+ System.Net.ServicePointManager.DefaultConnectionLimit.ToString());
}
When run in a load test with 1 iteration, I see the following output –
The load test code does set DefaultConnectionLimit to 100 (otherwise it defaults to something very
low), so the load test code is overriding the config setting. If you write a line of code anywhere in your
unit test (such as the TestInitialize or ClassInitialize method) to set DefaultConnectionLimit explicitly,
that will override the load test setting, because the load test sets this before running any unit test code.
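A sketch of setting the limit explicitly in a ClassInitialize method (the value 500 is a placeholder for your desired maximum):

```csharp
using System.Net;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class MyLoadTests
{
    // Runs after the load test engine has already applied its default
    // of 100, so the value set here wins for the rest of the run.
    [ClassInitialize]
    public static void SetConnectionLimit(TestContext context)
    {
        ServicePointManager.DefaultConnectionLimit = 500;
    }
}
```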
With each release of VS we have made major strides in Web test authoring and debugging. With VS
2008, we added a number of features to address the most common challenges with Web test authoring,
the most important being a low-level HTTP recorder and an automatic correlation tool. This covered the
most prevalent challenges outlined in Web Test Authoring and Debugging Techniques. Again with VS
2010 we have made major strides in Web test authoring and debugging:
If you really want to test the user experience from the browser, use a Coded UI test to drive the
browser.
In order to be successful working with Web Performance Tests, it is important you understand the
fundamentals about how they work.
OK, so Web tests work at the HTTP layer. What about requests sent and received by JavaScript or
browser plugins? The best example of JavaScript generating HTTP traffic is AJAX calls. The most
common examples of browser plugins are Silverlight and Flash. The Web test recorder will record HTTP
traffic from AJAX calls and from most (but not all) browser plugins.
This page looks like it failed, when in fact it succeeded! Looking closely at the response and subsequent
requests, it is clear the operation succeeded. As stated above, the browser control displays this message
because JavaScript has been disabled in the control.
Another variant of this is plugins, such as this page that is using Silverlight:
With VS 2010, we have again made tremendous strides across the tool in recording, editing, and
debugging to help you be successful doing this. Some of the high-level features are:
Editor Improvements
Recorder Improvements
Our goal with this release was to build tooling around the flow for debugging a web test, mostly to help
find and fix dynamic parameters. This flow is described in Sean’s seminal post, How to Debug a Web
Test. The flow is this:
In VS 2010, you’ll find commands in Web test playback and the editor that seamlessly support this flow:
1) A new recorder log that enables you to see the http traffic that was generated from IE. This is a
huge new feature critical for debugging tests. You can jump from a request, request parameter,
or response in playback to the same context in the recording log to compare them.
2) Search in playback and search and replace in the Web test editor. These features are super-
important for quickly finding and fixing dynamic parameters.
3) Jump from a request in playback to that same request in the editor. This greatly increases the
efficiency of the workflow.
4) Create an extraction rule directly from playback, automatically setting the correct parameters
on the extraction rule. Again, this increases efficiency.
In the editor, you can see this value is not bound to a context parameter:
Now go back to the results viewer. At this point, you want to find the dynamic values in the response of
one of the previous requests, as the dynamic parameter value had to have come from a response body
(since that’s how http and browsers work). To do this, you want to go to the recorder log. The reason
you want to do this from the recorder log is that the recording will have the original recorded value in it.
To do this, click on the recorder log icon (we really should have put this on the context menu too!).
This will take you to the same request with the same parameter selected. Now right-click on the
parameter and do a quick find to find the parameter value in a previous response. Again, you want to do
this from the recording log: since the parameter is dynamic, the value will be in the recording log but
not in the playback log.
Once the extraction rule is added, you also need to bind the parameter values. Choose yes to the
message box to launch search and replace from the Web test editor.
1) First, we changed the persistence mechanism for Web test results to store results in a separate
log file rather than in the .trx file.
2) We created a full public API for the Web test result.
3) We stamp request ids in each http request (enables jumping between playback and the editor).
4) The recorder generates a Web test result file and saves it as part of the recording.
The recorder log is persisted in the same file format as a Web test result. There is a full API over this data
(see the WebTestResult and WebTestResultDetails classes).
Recorder plugins are a new, super-powerful capability of the VS 2010 recorder. They are an
extensibility hook that gives you full access to the recorded result and the recorded Web test, and lets
you move seamlessly from a recorded request to the corresponding request in the Web test. This enables
you to make any modifications you see fit to the generated Web test. It is in effect a “catch-all”: the
ultimate power and productivity tool in your hands to save time fixing up Web tests.
Recorder plugins can be used for any number of reasons: fixing up dynamic parameters (adding
extraction rules and bindings), automatically adding validation rules, automatically adding data sources
and doing data bindings, filtering out recorded dependents, etc.
Recorder plugins are pretty straightforward to code up and install. They derive from the
WebTestRecorderPlugin class. Once you have implemented a plugin, just drop the assembly into either
of these directories, and then restart VS:
using System;
using System.Collections.Generic;
using System.Text;
using System.ComponentModel;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;
using System.Diagnostics;
namespace RecorderPlugins
{
    [DisplayName("Correlate ReportSession")]
    [Description("Adds extraction rule for Report Session and binds this to querystring parameters that use ReportSession")]
    public class CorrelateSessionId : WebTestRecorderPlugin
    {
        public override void PostWebTestRecording(object sender, PostWebTestRecordingEventArgs e)
        {
            // Loop through the responses in the recording, looking for the session Id.
            bool foundId = false;
            foreach (WebTestResultUnit unit in e.RecordedWebTestResult.Children)
            {
                WebTestResultPage recordedWebTestResultPage = unit as WebTestResultPage;
                if (recordedWebTestResultPage == null)
                {
                    continue;
                }

                // Find the corresponding request in the recorded Web test.
                WebTestRequest requestInWebTest =
                    e.RecordedWebTest.GetItem(recordedWebTestResultPage.DeclarativeWebTestItemId)
                    as WebTestRequest;

                // If we haven't found the session Id yet, look for it in this response.
                if (!foundId)
                {
                    // Look for the "ReportSession" string in the response body of a recorded request.
                    int indexOfReportSession =
                        recordedWebTestResultPage.RequestResult.Response.BodyString.IndexOf("ReportSession");
                    if (indexOfReportSession > -1)
                    {
                        // This is the page we want to add an extraction rule to.
                        Debug.Assert(requestInWebTest != null);
                        if (requestInWebTest != null)
                        {
                            foundId = true;
                            string startsWith = "?ReportSession=";
                            string endsWith = "&";
                            string contextParamName = "ReportSession";
                            AddExtractTextRule(requestInWebTest, startsWith, endsWith, contextParamName);
                            e.RecordedWebTestModified = true;
                        }
                    }
                }

                if (requestInWebTest != null)
                {
                    // BindQueryStringParameter is a helper (not shown in this excerpt)
                    // that binds the request's query string parameter to the extracted
                    // context parameter.
                    BindQueryStringParameter(requestInWebTest, "SessionId", "SessionId");
                }
            }
        }

        /// <summary>
        /// Code to add an ExtractText rule to the request.
        /// </summary>
        /// <param name="request"></param>
        /// <param name="startsWith"></param>
        /// <param name="endsWith"></param>
        /// <param name="contextParameterName"></param>
        private static void AddExtractTextRule(WebTestRequest request, string startsWith,
            string endsWith, string contextParameterName)
        {
            // Add an extraction rule to the corresponding request in the declarative Web test.
            ExtractionRuleReference ruleReference = new ExtractionRuleReference();
            ruleReference.Type = typeof(ExtractText);
            ruleReference.ContextParameterName = contextParameterName;
            ruleReference.Properties.Add(new PluginOrRuleProperty("EndsWith", endsWith));
            ruleReference.Properties.Add(new PluginOrRuleProperty("StartsWith", startsWith));
            ruleReference.Properties.Add(new PluginOrRuleProperty("HtmlDecode", "True"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("IgnoreCase", "True"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("Index", "0"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("Required", "True"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("UseRegularExpression", "False"));
            request.ExtractionRuleReferences.Add(ruleReference);
        }
    }
}
Binary POST bodies are now handled correctly; they were not always handled correctly with VS
2008.
The recorder now also automatically handles File Uploads so they will “just work”. Files that are
uploaded are automatically added to the project, and the file upload file name will be dynamically
generated to enable you to upload the same file to different names automatically.
1) Conditional logins. In a load test, if you want to simulate a user logging in once and then doing
many operations in the test, this can be accomplished easily in a conditional rule. Session IDs are
typically handled by cookies, and you can easily set up a rule to only go to the login pages if the
login has not happened yet.
2) Variability in your scripts. If you want users to occasionally skip steps in a script, or randomly
repeat some steps, this is easily achieved with the probability rule which will only execute some
requests based on the probability you specify.
3) Loop until some operation succeeds. If an operation is expected to fail for some users but will
succeed on retry, and you need to model the retry, you can do this by looping while the
operation is not successful. To do this, use an extraction rule to indicate whether or not the
action was successful, then use the Context Parameter Exists condition to loop until it is successful.
You can debug your loops and conditions using the results viewer, which shows the results of
conditional evaluations.
Use the Reporting Name and Response Time Goal properties to really light up your Excel load test
reports, as both are propagated to the reports.
Setting the response time goal will also help you to find slow requests in a load test, as by default there
is a new Response Time Goal validation rule added to the test. This rule will fail pages that exceed the
goal by the threshold you specify (by default the tolerance is 0). This rule will cause slow requests to fail,
and enable you to collect logs on the failures, which may help you determine why the page is slow.
To this end, we have enabled two extensibility points that will enable us to address this out of band:
This third scenario is the one I want to delve into more in this section. Just as you want a rich editing
experience for working with Web services, REST, or JSON requests, you want a rich way to view this data
in the result viewer as well. The Web test result viewer plugins provide the perfect extensibility point for
this.
Like the response body editor, we are working on out of band plugins for the Web test result viewer.
Here is a screen shot of the result view plugin for binary data:
Notice the tree views in the bottom panes, showing binary data as a tree.
Conclusion
Your takeaway after reading this blog post should be - “Wow, VS 2010 is fantastic and will save me tons
of time creating and maintaining Web tests, I have to have it!”
By working with you directly on the forums and through our blogs, we saw the types of problems you
are hitting developing scripts. We also listened to your feedback and folded it back into the tool. In
places we didn’t have time to address, we’ve added extensibility points to enable us to deliver features
to you out of band, and for you to create your own solutions.
Now you can say: “I’m a performance tester, and Visual Studio 2010 was my idea!”
Recently we had a customer support issue troubleshooting the Network Emulation driver in VS 2010
Ultimate while doing load testing. I thought a blog post on how we troubleshot and isolated the
problem would be helpful, so here it is. In this post, I discuss the problem and symptoms, and explain
how network emulation works in 2010. I also suggest specific steps to consider to isolate and narrow
down the problem.
Scope
This applies to Visual Studio 2010 Ultimate
Customer Scenario
The troubleshooting in this document is applicable to situations where you are attempting to use the
network emulation capability newly available in VS 2010 Ultimate: while creating a new load test, in
the "Edit Network Mix" screen of the wizard, you select any network type other than LAN.
True network emulation also provides flexibility in filtering network packets based on IP addresses or
protocols such as TCP, UDP, and ICMP. This can be used by network-based developers and testers to
emulate a desired test environment, assess performance, predict the effect of change, or make
decisions about technology optimization. When compared to hardware test beds, true network
emulation is a much cheaper and more flexible solution.
To use Network Emulation, you will need to install the Visual Studio 2010 Ultimate SKU. Network
Emulation is configured while adding a new load test in Visual Studio and following the wizard
screens (see above). Once you have set up network emulation following the instructions at
http://msdn.microsoft.com/en-us/library/dd997557.aspx, you will run your load tests. When the load test
starts, it allocates a range of available ports for each of the network profiles (DSL, 56K modem, etc.) that
you have selected in your network mix. This port range is available to the Network Emulation Driver,
which is enabled at run time (by default the network emulation driver is disabled).
During load testing, when the load generator sends a request to the application under test, it specifies a
port from the port range. When the network emulation driver sees a port from the selected port range,
it can associate that port with the network profile the request should follow. This enables the
driver to throttle the load in software to ensure it meets the network profile you have selected.
NOTE: There may be other conditions that may be causing such socket exceptions as well. The load test
may continue to work, but the socket exceptions get logged. The next section will help you isolate and
troubleshoot where the problem lies.
1. Ensure that you have full network connectivity across all the machines that are participating
in your load test.
2. Ensure you have configured the Network Emulation correctly by following the instructions
and making sure admin rights are available for the agent.
3. Ensure that any/all firewalls are disabled (at least for troubleshooting) so that a firewall
is NOT blocking specific ports or traffic on the lab network.
a. Run tcpview (available here) to ensure that any socket connections are actually visible
during run time (check for "red" highlights). You may also run your favorite port
monitoring tool (Portmon is another example).
4. Ensure that there is no antivirus software on the load generator machine that could be
interfering with this software.
5. To isolate whether the problem is with the Network Emulation Driver or the Load Test
Components you should:
a. Eliminating the network emulation driver as a cause
Run the load test with network emulation configured correctly (even though
you may be getting socket exceptions)
Ping another host to see whether the output shows a network slowdown and/or
higher latency. Check whether the delay value matches the selected network profile.
If the latency values match the profile you have selected, then the network driver is
working well.
From the agent machine where you are running the load test, attempt a
connection to any outside host (like your favorite web page). This test verifies
that while the load test is running and the network driver is enabled, external
or lab connectivity is NOT a problem. This will eliminate the network emulation
driver as a problem area.
b. Eliminating the load test components as a cause
You should download and run this sample test program (available as is, not
Microsoft supported) on the same machine as the load generator (agent
machine). This sample program simulates the exact set of socket connection
calls used in the load testing components. If this test program also displays
Socket Exceptions (like in the image below) then this eliminates the Load Testing
product as a cause for the socket exceptions and indicates the problem lies in
the environment, machine, network or something external to the tooling. Please
debug the external problem first before trying to run the load test again.
Also, if IPSEC is enabled, the ports in the network packets are encrypted, so the network
emulation driver will not be able to determine that the packets are from the designated port range
set by the load test engine (described above in "How Network Emulation Works in VS2010"). You must
disable IPSEC for network emulation to work.
Additional Resources:
http://msdn.microsoft.com/en-us/library/dd505008(VS.100).aspx
http://blogs.msdn.com/b/lkruger/archive/2009/06/08/introducing-true-network-emulation-in-visual-
studio-2010.aspx
This guide is to help troubleshoot connection issues between the Visual Studio Test Controller and Agent as
well as remote test execution issues. It gives an overview of the main connection points used by the Test
Controller and Agent and walks through general troubleshooting steps. At the end it provides a list of
common errors we have seen and ways to fix them, and a description of tools that can be useful for
troubleshooting, as well as how to obtain diagnostics information for test execution components.
We would like to use this guide as a running document; please reply to this post to add your comments.
All connectivity issues can be divided into two main groups: network issues and security/permission issues.
2.2. Permissions
There are two scenarios which are different by how Test Controller is operating, and the permissions
used by Controller differ depending on the scenario:
3. Step-by-step troubleshooting
Let’s walk through the general troubleshooting procedure for Test Controller/Agent connection issues. For
simplicity we’ll do that in a step-by-step manner.
Before following these steps you may take a look at Known Issues section in the Appendix to see if your
issue is one of known common issues. The troubleshooting is based on the key connection points and in
essence involves making sure that:
Step 1. Make sure that the Controller is up and running and Client can connect to Controller.
Use Visual Studio or Microsoft Test Manager (see Tools section above) to view Controller status.
If you can’t connect to Controller, make sure that Controller service is running:
On Controller machine (you can also do that remotely) re/start controller service (see
Tools section in Appendix).
(if you still can’t connect) On Controller machine make sure that it can accept incoming
connections through Firewall
Open port 6901 (or create exception for the service program/executable).
Add Firewall Exception for File and Printer Sharing.
(if you still can’t connect) make sure that the user you run the Client under has permissions to
connect to Controller:
On Controller machine, add Client user to the TeamTestControllerAdmins local group.
(if you still can’t connect) On Client machine make sure that Firewall is not blocking incoming
and outgoing connections:
Make sure that there is Firewall exception for Client program (devenv.exe, mstest.exe,
mlm.exe) so that it can accept incoming connections.
Make sure that Firewall is not blocking outgoing connections.
(if you still can’t connect)
VS2010 only: the simplest at this time is to re-configure the Controller:
On Controller machine log on as local Administrator, run the Test Controller
Configuration Tool (see Tools section above) and re-configure the Controller.
All steps should be successful.
(if you still can’t connect) Restart Controller service (see the Service Management commands
section in Tools section above)
Step 2. Make sure that there is at least one Agent registered on the Controller.
Use Visual Studio (Manage Test Controllers dialog) or Microsoft Test Manager (see the Tools section
in the Appendix) to view the connected Agents.
If there are no Agents on the Controller, connect the Agent(s).
VS2010 only:
    On the Agent machine, log in as a user that belongs to the TeamTestAgentServiceAdmins group.
    On the Agent machine, open a command line and run the Test Agent Configuration
    Tool (see the Tools section in the Appendix).
Step 3. Make sure that each Agent is running and Ready.
Agent status can be one of: Ready / Offline (temporarily excluded from the Test Rig) / Not Responding /
Running Tests.
Use Visual Studio or Microsoft Test Manager (see the Tools section in the Appendix) to check the Agent
status.
If one of the Agents is not shown as Ready, make sure that the Agent service is running:
    On the Agent machine (you can also do this remotely), start or restart the Agent service (see the
    Tools section in the Appendix).
(if the Agent is still not Ready)
VS2010 only: the simplest fix at this time is to re-configure the Agent:
    On the Agent machine, log on as a local Administrator, run the Test Agent
    Configuration Tool (see the Tools section in the Appendix) and re-configure the
    Agent. All steps should be successful.
(if the Agent is still not Ready)
    If the Agent is shown as Offline, select it and click the Online button.
    On the Agent machine, make sure that the Agent service can accept incoming connections on
    port 6910 (if the Firewall is on, there must be a Firewall exception either for the port or for
    the service program/executable).
    Make sure that the Agent service account belongs to the TeamTestAgentService group on the
    Controller:
        On the Controller machine, use Computer Management -> Local Groups to add the Agent
        user to the TeamTestAgentService group.
        Restart the services: stop the Agent service, stop the Controller service, start the Controller
        service, then start the Agent service.
    Make sure that the Agent machine can reach the Controller machine (use ping).
    Restart the Agent service (see the Service Management commands in the Tools section in the
    Appendix).
Step 4. If all of the above did not help, it is time to analyze the diagnostics information.
(VS2010 only) The Agent/Controller services by default log errors to the Application Event Log (see the
Tools section in the Appendix).
Step 5. Take a look at the Known Issues section in the Appendix to see if your issue is similar to one of those.
Step 6. Collect the appropriate diagnostics information and send it to Microsoft (create a Team Test Forum post
or a Microsoft Connect bug).
Appendix 1. Tools
Visual Studio: Premium (VS2010 only), Team Test Edition (VS2008 only).
Manage Test Controllers dialog (Main menu->Test->Manage Test Controllers): see status
of Controller and all connected Agents, add/remove Agents to Controller, restart
Agents/the whole test rig, bring Agents online/offline, configure Agent properties.
Note: on VS2008 this dialog is called Administer Test Controllers.
Run tests remotely:
VS2008: update Test Run Configuration to enable remote execution (Main Menu->Test-
>Edit Test Run Configurations->(select run config)->Controller and Agent->Remote-
>provide Test Controller name), then run a test.
VS2010: update Test Settings to use remote execution role (Main Menu->Test->Edit Test
Settings -> (select test settings)->Roles->Remote Execution), then run a test.
Microsoft Test Manager (VS2010 only)
Lab Center->Controllers: see status of Controller and all connected Agents, add/remove
Agents to Controller, restart Agents/the whole test rig, bring Agents online/offline,
configure Agent properties. Note that Lab Center only shows controllers that are
associated with this instance of TFS.
Test Controller Configuration Tool (TestControllerConfigUI.exe, VS2010 only):
It runs as the last step of Test Controller setup.
You can use it any time after setup to re-configure the Controller. The tool has embedded
diagnostics, which makes it easier to detect issues.
Test Agent Configuration Tool (TestAgentConfigUI.exe, VS2010 only):
It runs as the last step of Test Agent setup.
Trace files:
Controller: vsttcontroller.log
Agent Service: vsttagent.log
Agent Process: VSTTAgentProcess.log
For Client, add the following section to appropriate .config file
(devenv.exe.config, mstest.exe.config, mlm.exe.config):
Inside the <configuration> section (note: “Verbose” is equivalent
to “4”):
<system.diagnostics>
<trace autoflush="true" indentsize="4">
<listeners>
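The snippet above is cut off after the opening <listeners> tag. Putting the pieces together, a complete section might look like the following; the listener name and log file path are placeholders chosen for this example, not values prescribed by the product:

```xml
<configuration>
  <system.diagnostics>
    <trace autoflush="true" indentsize="4">
      <listeners>
        <add name="EqtListener"
             type="System.Diagnostics.TextWriterTraceListener"
             initializeData="c:\temp\devenv.log" />
      </listeners>
    </trace>
    <switches>
      <!-- 0/1/2/3/4 = Off/Error/Warning/Info/Verbose -->
      <add name="EqtTraceLevel" value="4" />
    </switches>
  </system.diagnostics>
</configuration>
```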
Notes:
In the case of the Test Controller/Agent services, HKEY_CURRENT_USER is the registry hive of
the user the services run under.
TraceLevel: 0/1/2/3/4 = Off/Error/Warning/Info/Verbose.
LogsDirectory is optional. If it is not specified, %TEMP% will be used.
Trace file name is <Process name>.EqtTrace.log, e.g.
devenv.EqtTrace.log.
Tracing from the Test Controller Configuration Tool and Test Agent Configuration Tool:
To get the trace file, click Apply, then in the “Configuration Summary” window
click the view log hyperlink at the bottom.
SysInternals’ DebugView can also be used to capture diagnostics information.
Application configuration files
The Controller, Agent and Client use settings from application configuration files:
Controller Service:
<appSettings><add key="ControllerServicePort"
value="6901"/></appSettings>
Agent Service:
<appSettings><add key="AgentServicePort"
value="6910"/></appSettings>
Client: add the following registry values (DWORD). The Client will use one of the
ports from this range for receiving data from the Controller:
HKEY_LOCAL_MACHINE\SOFTWARE\MICROSOFT\VisualStudio\10.0\EnterpriseTools\Qu
alityTools\ListenPortRange\PortRangeStart
HKEY_LOCAL_MACHINE\SOFTWARE\MICROSOFT\VisualStudio\10.0\EnterpriseTools\Qu
alityTools\ListenPortRange\PortRangeEnd
2.1. The message or signature supplied for verification has been altered (KB968389)
Symptom: Agent cannot connect to Controller.
Additional information:
Event Log (Agent): The message or signature supplied for verification has been altered.
Trace file (Agent) contains:
I, <process id>, <thread id>, <date>, <time>, <machine name>\QTAgentService.exe, AgentService: The message or signature supplied for verification has been altered.
I, <process id>, <thread id>, <date>, <time>, <machine name>\QTAgentService.exe, AgentService: Failed to connect to controller.
Microsoft.VisualStudio.TestTools.Exceptions.EqtException: The agent can connect to the controller but the controller cannot connect to the agent because of the following reason: An error occurred while processing the request on the server: System.IO.IOException: The write operation failed, see inner exception. ---> System.ComponentModel.Win32Exception: The message or signature supplied for verification has been altered
at System.Net.NTAuthentication.DecryptNtlm(Byte[] payload, Int32 offset, Int32 count, Int32& newOffset, UInt32 expectedSeqNumber)
at System.Net.NTAuthentication.Decrypt(Byte[] payload, Int32 offset, Int32 count,
2.2. Controller/Agent in untrusted Windows domains, or one is in a workgroup and the other
is in a domain
Symptom: Agent cannot connect to Controller.
Affected scenarios: The Test Controller and Agent are not in the same Windows domain. They are either in
untrusted domains, or one of them is in a domain and the other is in a workgroup.
Additional information:
Root cause: Due to Windows security, Agent cannot authenticate to Controller, or vice versa.
Resolution:
http://blogs.msdn.com/b/edglas/archive/2009/06/13/increasing-the-roi-of-our-automation.aspx
Best Practice: Blog on various considerations for web tests running under load
The following blog entry describes a number of different features and settings to consider when running
web tests under a load test in VSTT (a link to the blog entry is at the bottom of this topic). The following
topics are covered:
http://blogs.msdn.com/billbar/articles/517081.aspx
Workgroup authentication
In a Microsoft® Windows® domain environment, there is a central authority to validate credentials. In a
workgroup environment, there is no such central authority. Still, we should be able to have computers in
a workgroup talk to each other and authenticate users. To enable this, local accounts have a special
characteristic that allows the local security authority on the computer to authenticate a "principal" in a
special way.
If you have two computers and a principal "UserXYZ" on both machines, the security identifiers are
different for MACHINE1\UserXYZ and MACHINE2\UserXYZ, and for all practical purposes they are two
completely different principals. However, if the passwords are the same for them on each of these
computers, the local security authority treats them as the same principal.
So when MACHINE1\UserXYZ tries to authenticate to MACHINE2\UserXYZ, and the passwords are the
same, then on MACHINE2 the UserXYZ is authenticated successfully and is treated as
MACHINE2\UserXYZ. Note the last sentence: the user MACHINE1\UserXYZ is authenticated as
MACHINE2\UserXYZ if the passwords are the same.
http://blogs.msdn.com/dgorti/archive/2007/10/02/vstt-controller-and-agent-setup.aspx
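In practice this means creating the same local account, with the same password, on both machines. A sketch using the built-in commands follows; the account name and password are placeholders, and the group membership step reuses the TeamTestAgentService group described in Step 3 above:

```
rem Run on BOTH the controller and the agent machine, with an identical password
net user TestRigUser StrongP@ssw0rd1 /add

rem On the controller, grant the agent's account access to the controller
net localgroup TeamTestAgentService TestRigUser /add
```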
In 2008
You can create a log file of each recording, which will show the request headers and post body as well as
the returned headers and response. The way to enable this is to add the following two keys:
[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\EnterpriseTools
\QualityTools\WebLoadTest]
When you start recording a web test and the recorder bar is disabled or doesn’t show up it can be hard
to diagnose and fix the issue.
Michael Taute's blog provides a list with common reasons for this to happen and potential fixes for each.
Most of the time the reasons are security-related. One of the most common causes of this problem is:
Issue: the recorder bar comes up, but the controls are disabled.
Fix: the web test recorder bar does not work with Internet Explorer Enhanced Security Configuration (IE
ESC) enabled. IE ESC can be removed from the Control Panel -> Add/Remove Programs -> Windows
Components by unchecking ESC (Windows Server 2003, Vista).
Windows Server 2008 requires a different process to disable this security feature: start the Server
Manager, browse to the Security Information section and click Configure IE ESC. In the next window,
decide for whom you want to enable or disable this feature. For more details and screenshots:
http://blogs.techrepublic.com.com/datacenter/?p=255
1. Go to c:\Program files\Microsoft Visual Studio 2008 Team Test Load Agent\LoadTest on the agent
machine.
2. Edit the QTAgentServiceUI.exe.config file
a. Change the EqtTraceLevel to 4:
<switches>
  <add name="EqtTraceLevel" value="4" />
</switches>
b. Change the CreateTraceListener value to yes:
<appSettings>
  <add key="CreateTraceListener" value="yes"/>
</appSettings>
The above settings also apply to the QTAgent.exe.config, QTController.exe.config and the
QTControllerService.exe.config files.
Note: These files have moved in VS 2010 to C:\Program Files\Microsoft Visual Studio
10.0\Common7\IDE.
If you just record the values in a web test and post the recorded values, you can run into ASP.NET error
messages about invalid view state or failed event validation. The Visual Studio web test recorder will
normally automatically detect the __VIEWSTATE and __EVENTVALIDATION hidden fields as dynamic
parameters. This means the dynamically extracted values will be posted back instead of the recorded
values.
However, if the web server is load balanced and part of a web farm you may still run into invalid view
state and failed event validation errors. This occurs when not all servers in the web farm use the same
validationKey and the post back request is routed to a different server in the farm than the one on
which the page was rendered.
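A common fix in that situation (described here as a sketch, since this copy of the guide omits the resolution) is to configure every server in the farm with the same explicit machineKey in web.config, so that any server can validate the view state rendered by any other. The key values below are placeholders and must be replaced with your own generated keys:

```xml
<system.web>
  <!-- Identical on every server in the farm; keys shown are placeholders -->
  <machineKey validationKey="REPLACE-WITH-YOUR-GENERATED-VALIDATION-KEY"
              decryptionKey="REPLACE-WITH-YOUR-GENERATED-DECRYPTION-KEY"
              validation="SHA1" />
</system.web>
```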
2) Make sure that none of the IP addresses in the range specified for a particular agent are already
configured on the chosen NIC.
* Edit the file QTAgentService.exe.config: (located at: <Program Files>\Microsoft Visual Studio
9.0 Team Test Load Agent\LoadTest\QTAgentService.exe.config)
* Change:
<add key="CreateTraceListener" value="no"/> to "yes"
* Change:
<add name="EqtTraceLevel" value="3" /> to "4"
* Restart the Load Test Agent service
* Re-run the load test with verbose logging configured, and look for lines in the log file that contain the
text "Attempting to configure IP address:" and "Configured IP address:". This will tell you whether the
agent service is attempting to configure the IP address you've specified. If you see the
"Configured IP address:" line, it has succeeded in configuring this IP address. If not, there should be
some error logged.
If you have verified the items in step 1 & 2 above, and the log indicates that the configuration of the IP
address is failing but you cannot determine the cause of the failure from the error message in the log (or
if there is no error message in the log), post a new thread to the Web and Load testing forum, or open a
Microsoft Support incident for further assistance, and provide details on the setup including the relevant
portions of the log file.
4) Make sure that the load test you are running is set to use IP Switching: Click on each of the "Scenario"
nodes in the load test editor, go to the property sheet, and verify that the "IP Switching" property is set
to True (normally it should be since this is the default, but it's worth checking).
If the log file created in step 3 shows that the IP addresses are being successfully configured, the next
step is to check the agent process log file to verify that the load test is actually sending requests using
those IP addresses.
* Change:
<add key="CreateTraceListener" value="no"/> to "yes"
* Change:
<add name="EqtTraceLevel" value="3" /> to "4"
6) If the number of unique IP addresses being used as shown by the log entries in step 5 is less than the
number in the range that was configured, it could be because your load test is configured to use a
connection pool with a smaller number of connections than the number of IP addresses specified. If
this is the case, you can increase the size of the connection pool, or switch to "Connection per User"
mode in the load test's run settings properties.
If you try to connect the two web tests before generating any code, your test will fail with the following
error:
There is no declarative Web test with the name 'DrillDown_Coded' included in this Web
test; the string argument to IncludeWebTest must match the name specified in an
IncludeDeclarativeWebTest attribute.
How to use methods other than GET and POST in a web test
Summary
FormPostHttpBody and StringHttpBody are the two built-in classes for generating HTTP request bodies.
If you need to generate requests containing something other than form parameters and strings then you
can implement an IHttpBody class.
More information
http://blogs.msdn.com/joshch/archive/2005/08/24/455726.aspx
Summary
One of the new Web Test features in Visual Studio 2008 is the ability to filter dependent requests. If you
have a request in your web test that fetches a lot of content such as images, JavaScript files or CSS files,
it’s possible to programmatically determine which requests are allowed to execute during the course of
the web test, and which aren't.
More information
http://blogs.msdn.com/densto/pages/new-in-orcas-filtering-dependent-requests.aspx
The following link describes how to use X509 certificate collections to make a SOAP request in .NET;
code for using them in a web test will be similar.
More information
http://msdn.microsoft.com/en-us/library/ms819963.aspx
However, with VS 2008, if you want to completely disable caching of all dependent requests and always
fetch them, you can do so with the following WebTestPlugin:
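The plugin itself is missing from this copy of the document; the following is a minimal sketch of how such a plugin could look, assuming that clearing the Cache flag on each dependent request is enough to force a re-fetch:

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

namespace SampleWebTestRules
{
    // Forces every dependent request (images, CSS, JavaScript files) to be
    // fetched on each page by marking it as non-cacheable.
    public class DisableDependentCaching : WebTestPlugin
    {
        public override void PostRequest(object sender, PostRequestEventArgs e)
        {
            foreach (WebTestRequest dependent in e.Request.DependentRequests)
            {
                dependent.Cache = false;
            }
        }
    }
}
```

Attach the plugin to the web test (Add Web Test Plug-in in the editor, or construct it in a coded test) as with any other WebTestPlugin.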
In 2008
Summary
It is possible to create a custom data binding to bind to something other than a table, such as a select
statement. This blog post describes one possible method – creating one class which will manage the
data and creating a web test plug-in to add the data into the web test context.
More information
http://blogs.msdn.com/slumley/pages/custom-data-binding-in-web-tests.aspx
In 2010
http://blogs.msdn.com/slumley/archive/2010/01/04/VS-2010-feature-data-source-enhancements.aspx
this.Context.Add("ContextNameToUse", this.Datasource1["ColumnToUse"]);
http://blogs.msdn.com/slumley/pages/load-testing-web-services-with-unit-tests.aspx
Or, in a declarative test this can be achieved by setting the username value to:
UserName{{$Random(0,10000)}}{{$WebTestUserId}}UserNameExt
validationRule.Validate(source, validationEventArgs);
//*************************************************************************************************
// WebTestDependentFilter.cs
// Owner: Ed Glas
//
// This web test plugin filters dependents from a particular site.
// For example, if the site you are testing has ads served by another company,
// you probably don't want to hit that site as part of a load test.
// This plugin enables you to filter all dependents from a particular site.
//
// Copyright(c) Microsoft Corporation, 2008
//*************************************************************************************************
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;
namespace SampleWebTestRules
{
public class WebTestDependentFilter : WebTestPlugin
{
string m_startsWith;
public string FilterDependentRequestsThatStartWith
{
get { return m_startsWith; }
set { m_startsWith = value; }
}
public override void PostRequest(object sender, PostRequestEventArgs e)
{
    List<WebTestRequest> depsToRemove = new List<WebTestRequest>();
    // Note: you can't modify the collection inside a foreach, hence the second
    // collection of requests to remove.
foreach (WebTestRequest r in e.Request.DependentRequests)
{
if (!string.IsNullOrEmpty(FilterDependentRequestsThatStartWith) &&
r.Url.StartsWith(FilterDependentRequestsThatStartWith))
{
depsToRemove.Add(r);
}
}
foreach (WebTestRequest r in depsToRemove)
{
e.Request.DependentRequests.Remove(r);
}
}
}
}
{{Datasource.Table.Column}}
Here is a sample:
Define a static member variable of the unit test class, or if you have multiple unit test classes that need
to share the data, create a singleton object that is accessed by all of the unit tests. The only case in
which this would not work is if you have multiple unit test assemblies being used in the same load test
that all need to share the global data and you also need to set the “Run Unit Tests in Application
Domain” load test setting to true. In that case each unit test assembly has its own app domain and its
own copy of the static or singleton object.
CAVEAT: This will not work in a multi-agent test rig. If you have a multi-agent rig and you want truly
global data, you’d either need to create a common Web service or use a database that all of the agents
access.
using System.Threading;
using System.Diagnostics;
using System.IO;
.......
[TestMethod]
public void TestMethod1()
{
int x = 0;
int iDuration = 10000;
try
{
    // Process.Start takes the executable path and a single arguments string
    Process myProcess = Process.Start("c:\\temp\\conapp2.exe", "arg1 arg2");
How to add Console Output to the results store when running Unit tests under load
The following link points to a write-up on how to allow unit tests to write custom output messages to
the Load Test Results Store database from Unit tests while they are running in a load test:
http://blogs.msdn.com/billbar/pages/adding-console-output-to-load-tests-running-unit-tests.aspx
Any ThinkTime that has a value of zero will remain zero regardless of the distribution settings.
((LoadTestScenario)m_loadTest.Scenarios[0]).CurrentLoad = newLoad;
In VS 2008 SP1 and later, you can access the load profile using the LoadTestScenario.LoadProfile
property, and casting this to the appropriate LoadProfile class (such as LoadTestConstantLoadProfile).
How to create a webtest plugin that will only execute on a predefined interval
If you want to write a webtest plugin that will only fire on certain intervals (maybe for polling or
reporting), then use the following as a starting point.
The WebTestIteration property is guaranteed to be unique, so no need to worry about locking. If you
run this web test by itself it will “do something” because the WebTestIteration will be 1 (unless you run
the web test by itself with multiple iterations or data binding).
Rather than hard-coding the frequency as 1 in 100, you could make the frequency a property of the
plugin that you set in the Web test editor, a Web test context parameter, or a load test context
parameter; in the last case the LoadTestPlugin would need to pass the value down to the WebTestPlugin
by setting it in the WebTestContext.
Note that the WebTestIteration property is incremented separately for each Scenario (on each agent)
in the load test, but if you want the frequency to apply across all web test iterations on an agent, you
could define a static int in the WebTestPlugin (and use Interlocked.Increment to atomically increment
it).
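Sketched out, such a plugin might look like the following; the Frequency property and the static counter follow the suggestions above, so treat this as a starting point rather than the canonical implementation:

```csharp
using System.Threading;
using Microsoft.VisualStudio.TestTools.WebTesting;

namespace SampleWebTestRules
{
    public class IntervalWebTestPlugin : WebTestPlugin
    {
        // Fire once out of every Frequency iterations; settable as a plugin
        // property in the Web test editor.
        public int Frequency { get; set; }

        // Shared across all web test iterations on this agent.
        private static int s_iterationCount;

        public override void PreWebTest(object sender, PreWebTestEventArgs e)
        {
            // Interlocked.Increment keeps the counter safe across load test threads.
            int count = Interlocked.Increment(ref s_iterationCount);
            if (Frequency <= 1 || count % Frequency == 0)
            {
                // Do the periodic work (polling, reporting, ...) here.
            }
        }
    }
}
```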
If you develop a plug-in or an extraction rule and you want to allow the properties you expose to be
Context Parameters that the user specifies, you need to add some code to your plugin to check for the
existence of a Context Parameter using the curly brace “{{xyz}}” syntax.
For example, suppose the user has a Context Parameter {{ComparisonEventTarget}} that they want to
provide as the property value for the EventTarget property in your plugin (see the screen shot). Use
the following code snippet to have your extraction rule/plugin check the value supplied to determine if
it contains the “{{” syntax.
// If the property value is wrapped in {{ }}, resolve it from the web test
// context (the key-extraction shown here is a reconstruction of the elided code)
if (this.EventTarget.Contains("{{"))
{
    string contextParamKey = this.EventTarget.Replace("{{", "").Replace("}}", "");
    this.EventTarget = e.WebTest.Context[contextParamKey].ToString();
}
// . . . code to do your work starts here
How To: Modify the ServicePointManager to force SSLv3 instead of TLS (Default)
If you need to modify the type of SSL connection to force SSLv3 instead of TLS (Default) then you must
modify the ServicePointManager.SecurityProtocol property to force this behavior. This can happen if
you are working with a legacy server that requires an older SSLv3 protocol and cannot negotiate for the
higher TLS security protocol. In addition, you may need to write code in your test to handle the
ServerCertificateValidationCallback to determine if the server certificate provided is valid. A code
snippet is provided below.
[TestMethod]
public void TestMethod1()
{
    // We're using SSL3 here and not TLS. Without this line, nothing works.
    ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
    // We wire up the callback so we can override the behavior and force it to
    // accept the certificate from the server.
    ServicePointManager.ServerCertificateValidationCallback = RemoteCertificateValidationCB;
    // ... rest of the test elided ...
}

// Example callback (test use only): accepts the server certificate unconditionally.
private static bool RemoteCertificateValidationCB(object sender,
    X509Certificate certificate, X509Chain chain, SslPolicyErrors sslPolicyErrors)
{
    return true;
}
coded test.
In a coded test, you can easily add new requests that get returned in the body of the test. You would
just create a new WebTestRequest object and “yield return” it. For example, if the rule adds a context
parameter called ErrorUrl, you would have the following in code:
if(this.Context.ContainsKey("ErrorUrl"))
{
WebTestRequest request4 = new WebTestRequest(this.Context["ErrorUrl"].ToString());
request4.Encoding = System.Text.Encoding.GetEncoding("utf-8");
yield return request4;
request4 = null;
}
Validation rule.
First you will need to add a dummy request after the page you want to check. The URL is not important
because you are going to change it based on the outcome of the validation rule. In your validation rule,
set a context parameter that contains the URL you want to redirect to. Here is a very simple rule that
does this: if the return code is greater than 400, it adds the URL to the context. In this case, it just
redirects to the home page of the site.
Here is what the web test looks like: The dummy request is http://localhost.
A solution for VS 2010 using the new conditional rule logic that works in the declarative editor: in VS 2010
you can now do branching and looping in the declarative editor. So instead of a web test request plug-in,
we can do the redirect with a conditional rule. You would do the following:
In 2008
http://blogs.msdn.com/slumley/pages/load-testing-web-services-with-unit-tests.aspx
In 2010
You will get the following dialog and can add the reference there:
TCPVCON is a sysinternals tool that is part of “TCPView” and can be downloaded from:
http://technet.microsoft.com/en-us/sysinternals/bb795532.aspx
If you need to run this command (or others) remotely, you can also look at the tool “PsTools” at the
same web page.
You can change this stored procedure by appending a call to your own stored procedure that
implements or starts your custom action. That stored procedure could be implemented as .NET code by
employing a CLR SQL Stored Procedure (see http://msdn.microsoft.com/en-us/library/5czye81z.aspx).
NOTE: changing the LoadTest database is an unsupported action that might interfere with automatic
upgrades to new versions of the database schema.
You can use MSTEST.EXE to start your load test outside Visual Studio. In that case you might run into
errors with missing DLLs for plugins that you do not encounter when running your load test inside Visual
Studio. Visual Studio looks at references to figure out what to deploy, while MSTEST.EXE does not. To fix
this you have to manually add the DLLs as deployment items in the test settings (VS2010) or test run
configuration file (VS2008).
Select the test settings file that you want to use with MSTEST.EXE. This will be one of the files in the
Solution Items folder of your solution with the
.testsettings extension [In 2010]
.testrunconfig extension [In 2008]
Open it in the Test Settings Editor. Go to the Deployment page. Select “Add File…” and select the DLLs
you want to deploy.
Specify the test settings file you have edited on the command line for MSTEST.EXE with the
/testsettings switch (VS2010) or the /runconfig switch (VS2008).
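For example, to run a load test with a specific test settings file on VS2010 (the file names are placeholders):

```
mstest /testcontainer:LoadTest1.loadtest /testsettings:Remote.testsettings
```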
using System;
using Microsoft.VisualStudio.TestTools.WebTesting;
using System.Net;
namespace WebTestPluginNamespace
{
public class MyWebTestPlugin : WebTestPlugin
{
public override void PreWebTest(object sender, PreWebTestEventArgs e)
{
// Create credentials to authenticate to your proxy
NetworkCredential proxyCredentials = new NetworkCredential();
proxyCredentials.Domain = "yourDomain";
proxyCredentials.UserName = "yourUserName";
proxyCredentials.Password = "yourPassword";
// Set the WebProxy so that even local addresses use the proxy
// (reconstruction of the elided lines; WebTest.WebProxy returns an IWebProxy)
WebProxy webProxy = (WebProxy)e.WebTest.WebProxy;
webProxy.BypassProxyOnLocal = false;
webProxy.Credentials = proxyCredentials;
e.WebTest.PreAuthenticate = true;
}
}
}
// Web Test
// using System.Collections.Generic;
// using Microsoft.VisualStudio.TestTools.WebTesting;
public static void DumpArgs(WebTestContext context)
{
foreach (KeyValuePair<string, object> kvp in context)
{
Debug.WriteLine(kvp.Key + " = " + kvp.Value);
}
}
// Unit Test
// using System.Collections;
// using Microsoft.VisualStudio.TestTools.UnitTesting;
public static void DumpArgs(TestContext context)
{
foreach (DictionaryEntry kvp in context.Properties)
{
Debug.WriteLine(kvp.Key + " = " + kvp.Value );
}
}
this.MoveDataTableCursor("DataSource1", "Products");
New to 2010
this.MoveDataTableCursor("DataSource1", "Products", 32);
DeclarativeWebTest exposes all of the properties, requests, and rules of the loaded web test so they can
be manipulated in whatever way necessary and then resaved.
For example, if something in your web application has changed that affects a large group of your existing
Web Tests, rather than modify the tests by hand you could write some code to do this for you. Here's an
example of modifying an existing declarative web test in a C# console application:
// (decWebTest was loaded with DeclarativeWebTestSerializer.Open and reqToModify
// was located by iterating its requests; those lines are elided in this copy)
if (reqToModify != null)
{
    reqToModify.ExpectedHttpStatusCode = 404;
}
// Save the modified test
DeclarativeWebTestSerializer.Save(decWebTest, @"c:\test.webtest");
}
Next, open VS and open up the Test Rig Management dialog (Test -> Administer Test Controllers) and
add each agent back to the list.
Or if you have VS 2010, you can go to each agent and re-run the config tool, which will automatically add
the agent back to the controller.
http://blogs.msdn.com/billbar/archive/2007/07/31/configuring-a-non-default-port-number-for-the-vs-
team-test-controller.aspx
using System;
using System.Diagnostics;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Microsoft.VisualStudio.TestTools.LoadTesting;
namespace TestProject1
{
public class MyUserObject
{
public int UserId { get; set; }
public string SomeData { get; set; }
}
[TestClass]
public class UnitTestWithUserObjects
{
private static object s_userObjectsLock = new object();
private static Dictionary<int, MyUserObject> s_userObjects = new
Dictionary<int, MyUserObject>();
private TestContext testContextInstance;
public UnitTestWithUserObjects()
{
}
[TestMethod]
public void TestWithUserObjects()
{
MyUserObject userObject = GetUserObject();
Console.WriteLine("UserId: " + userObject.UserId);
DoSomeThingWithUser(userObject);
}
private MyUserObject GetUserObject()
{
    // The load test user id is available from the test context when the unit
    // test runs under a load test (VS 2008 SP1 and later).
    int userId = 0;
    LoadTestUserContext loadTestUserContext =
        testContextInstance.Properties["$LoadTestUserContext"] as LoadTestUserContext;
    if (loadTestUserContext != null)
    {
        userId = loadTestUserContext.UserId;
    }
    MyUserObject userObject;
lock (s_userObjectsLock)
{
if (!s_userObjects.TryGetValue(userId, out userObject))
{
userObject = new MyUserObject();
userObject.UserId = userId;
s_userObjects.Add(userId, userObject);
}
}
return userObject;
}
The solution proposed gives you a unique ID (a load test user Id) as an int. You would need to write
code to map the integer value to a unique user name. There are several ways to do this; I would
suggest that you use a DB table (or .csv file) where each row contains the load test user ID integer as
well as the data you need for each user (username, password, anything else). You would then need to
write code in your unit test (not using the unit test data binding feature) that reads a row from the
database using the LoadTestUserId to get the correct row for that user. A more efficient and only
slightly more complex solution would be to load all of the data from this user DB table into memory in
the unit test’s ClassInitialize method and store it in a static member variable of type Dictionary<int,
UserObject> where the int key is the LoadTestUserId. Then as each test method runs, it gets the
LoadTestUserId as shown in the code above and looks up the user data in this
static Dictionary.
http://blogs.msdn.com/b/billbar/archive/2006/02/09/528649.aspx
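The ClassInitialize approach described above can be sketched as follows; the CSV path and column layout are assumptions made for illustration:

```csharp
using System.Collections.Generic;
using System.IO;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class PerUserDataTests
{
    // Hypothetical per-user record; extend with whatever each user needs.
    private class UserRecord
    {
        public string UserName;
        public string Password;
    }

    // Keyed by LoadTestUserId; loaded once per test class, shared by all tests.
    private static Dictionary<int, UserRecord> s_users;

    [ClassInitialize]
    public static void LoadUserTable(TestContext context)
    {
        s_users = new Dictionary<int, UserRecord>();
        // Assumed columns: LoadTestUserId,UserName,Password
        foreach (string line in File.ReadAllLines(@"c:\temp\users.csv"))
        {
            string[] cols = line.Split(',');
            s_users[int.Parse(cols[0])] = new UserRecord
            {
                UserName = cols[1],
                Password = cols[2]
            };
        }
    }
}
```

Each test method would then look up its row with the LoadTestUserId obtained as shown in the code sample earlier in this topic.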
--NEW-- How to set default extensions that the WebTest recorder will ignore
The following registry entries will dictate the behavior of the webtest recorder:
[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\10.0\EnterpriseTools\QualityTools\WebLoadTest]
"WebTestRecorderMode"="exclude"
"ExcludeMimeTypes"="image;application/x-javascript;application/x-ns-proxy-autoconfig;text/css"
"ExcludeExtensions"=".js;.vbscript;.gif;.jpg;.jpeg;.jpe;.png;.css;.rss"
Gotcha: Check Your Validation Level in the Load Test Run Settings
By default, all validation rules added to a web test are marked HIGH. By default, all load tests have a
validation level of LOW. This means that NONE of the validation rules will run in a load test by default.
You either need to lower the level in the web test, or raise the level in the load test.
Caching for all dependent requests is disabled when you are playing back a web test in Visual Studio.
You will notice that if, for example, the same image file is used in multiple web pages in your web test,
the image will be fetched multiple times from the web server.
Each time you run a web test, the web test result chews up memory on the client. This can result in the
following out of memory exception:
This is not a Test Rig exception but a VS client exception. The resolution is to restart VS to release
memory. It is fixed in 2010.
--NEW-- Gotcha: Timeout attribute in coded web test does not work during a load
test
If you use the [Timeout()] attribute in a coded web test, it works as expected. However, if you then
run that web test inside a load test, the attribute is ignored. This is expected behavior. To set timeouts,
set the Timeout property on the individual requests instead.
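For example, in the coded web test’s GetRequestEnumerator (a sketch; the URL and timeout value are placeholders):

```csharp
public override IEnumerator<WebTestRequest> GetRequestEnumerator()
{
    WebTestRequest request1 = new WebTestRequest("http://localhost/slowpage.aspx");
    // Per-request timeout in seconds; honored when the test runs under load.
    request1.Timeout = 300;
    yield return request1;
}
```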
Best Practice: Coded web tests and web test plug-ins should not block threads
http://blogs.msdn.com/billbar/archive/2007/06/13/coded-web-tests-and-web-test-plug-ins-should-not-
block-the-thread.aspx
// Since the heartbeat handler is set up inside the conditional, the event will be
// wired up on only one machine. All LoadProfile changes are sent to the controller
// and propagated across the rig automatically.
loadTest.Heartbeat += new EventHandler<HeartbeatEventArgs>(_loadTest_Heartbeat);
}
}
Any comments and descriptions added will show up in the “Manage Load Test Results” dialog and will
make it much easier to determine which result set maps to the test run you wish to look at.
In 2008
All of the rules in this release on CodePlex relate to the inner text of a tag. For example, for a select tag
(list box and combo box), the option text is stored in inner text rather than an attribute:
<select name="myselect1">
<option>Milk </option>
<option>Coffee</option>
<option selected="selected">Tea</option>
</select>
In order to extract the value of the list box, we need to parse out the inner text of the selected option.
TextArea is another tag that does this, but there are also a lot of other examples in HTML where you
might want to extract or validate inner text. The new project has these new rules as well as a parser for
inner text and select tag:
1. ExtractionRuleInnerText
2. ExtractionRuleSelectTag
3. ValidationRuleInnerText
4. ValidationRuleSelectTag
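As a stand-alone sketch of the parsing such a rule has to perform (illustrative only, not the CodePlex implementation), a regular expression can pull the selected option's inner text out of the markup shown above:

```
using System;
using System.Text.RegularExpressions;

class SelectTagDemo
{
    static void Main()
    {
        string html =
            "<select name=\"myselect1\">" +
            "<option>Milk </option>" +
            "<option>Coffee</option>" +
            "<option selected=\"selected\">Tea</option>" +
            "</select>";

        // Find the <option> tag carrying the selected attribute and capture
        // its inner text (the value is not stored in an attribute).
        Match m = Regex.Match(html,
            @"<option[^>]*\bselected\b[^>]*>(.*?)</option>",
            RegexOptions.IgnoreCase | RegexOptions.Singleline);

        Console.WriteLine(m.Success ? m.Groups[1].Value.Trim() : "(none)");
        // Prints: Tea
    }
}
```

The real rules also have to handle tags such as TextArea, but the core idea is the same: the interesting value lives in inner text rather than in an attribute.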
Download location
http://codeplex.com
In 2010
Many of the features above are now built into VS 2010. Here is a list of these:
http://msdn.microsoft.com/en-us/library/bb385904(VS.100).aspx
Create a Visual Studio Add-In: Create a new Visual Studio Add-In project (see picture below). This starts the Add-In Wizard. Complete the wizard.
Set the margins for the main control. This corrects the size of the listbox from the previous step. Make sure the value for TOP is large enough that the listbox does not cover the other controls.
You will need to do a fair amount of work inside the “connect.cs” file to make the plugin work. However,
you should have your functional code (or at least the shell of it) in place before doing the connect.cs
work so the methods you reference will already exist. For my example, the only extra code I need is the
backing code for the user control. Double-click on the listview and add the following methods:
Now we can jump into the connect.cs code. Here are the main items of interest for us:
The highlighted method names correspond to the matching method definitions below.
m_controls.Remove(e.WebTestResultViewer.TestResultId);
}
}
This is the code that does the work for the add-in. Here I get all of the user controls you created:
if (e.WebTestRequestResult != null)
{
foreach (UserControl userControl in m_controls[e.TestResultId].Values)
{
UserControl1 userControl1 = userControl as UserControl1;
if (userControl1 != null)
{
WebTestResponse response = e.WebTestRequestResult.Response;
In this post I am going to talk about a new feature that can help with web test recording. The feature is
extensible recorder plug-ins for modifying recorded web tests. Basically we are giving you the
opportunity to modify the recorded web test after you click stop on the web test recorder bar but prior
to the web test being fully saved back to the web test editor. So what problems does this help with?
The main one is performing your own custom correlation. In VS 2008 we added a process which runs
post recording that attempts to find dynamic fields. You can read this blog post for more information:
http://blogs.msdn.com/slumley/pages/web-test-correlation-helper-feature-in-orcas.aspx
This process still exists, but it does not always find all of the dynamic fields for an application. So if it did not find the dynamic fields in your application, you had to perform the correlation manually. Here is a blog post that goes into detail about the manual process:
http://blogs.msdn.com/slumley/pages/how-to-debug-a-web-test.aspx
Also, there are cases where our correlation process does not find the dynamic values, such as dynamic values in the URL.
This new feature allows you to write your own plug-in which can perform correlation or modify the web
test in many ways prior to it being saved back to the web test editor. So once you figure out that certain
dynamic variables have to be correlated for each of your recordings, you can automate the process. To
demonstrate how this works, I am going to write a recorder plug-in which will perform the correlation
that I manually walked through in my previous post. Please quickly read that:
http://blogs.msdn.com/slumley/pages/vs-2010-feature-web-test-playback-enhancements.aspx
Overview
using System.ComponentModel;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;
namespace RecorderPlugins
{
[DisplayName("Correlate ReportSession")]
[Description("Adds extraction rule for Report Session and binds this to
querystring parameters that use ReportSession")]
public class CorrelateSessionId : WebTestRecorderPlugin
{
public override void PostWebTestRecording(object sender,
PostWebTestRecordingEventArgs e)
{
//first find the session id
bool foundId = false;
// Note: the declaration of ruleReference was lost in this excerpt; it is an
// ExtractionRuleReference, as used by ExtractionRuleReferences.Add below.
ExtractionRuleReference ruleReference = new ExtractionRuleReference();
ruleReference.Type = typeof(ExtractText);
ruleReference.ContextParameterName = "SessionId";
ruleReference.Properties.Add(new
PluginOrRuleProperty("EndsWith", "&ControlID="));
ruleReference.Properties.Add(new
PluginOrRuleProperty("HtmlDecode", "True"));
ruleReference.Properties.Add(new
PluginOrRuleProperty("IgnoreCase", "True"));
ruleReference.Properties.Add(new
PluginOrRuleProperty("Index", "0"));
ruleReference.Properties.Add(new
PluginOrRuleProperty("Required", "True"));
ruleReference.Properties.Add(new
PluginOrRuleProperty("StartsWith", "ReportSession="));
ruleReference.Properties.Add(new
PluginOrRuleProperty("UseRegularExpression", "False"));
WebTestRequest requestInWebTest =
e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
if (requestInWebTest != null)
{
requestInWebTest.ExtractionRuleReferences.Add(ruleReference);
e.RecordedWebTestModified = true;
}
foundId = true;
}
}
else
{
//now update query string parameters
WebTestRequest requestInWebTest =
e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
if (requestInWebTest != null)
{
foreach (QueryStringParameter param in
requestInWebTest.QueryStringParameters)
{
if (param.Name.Equals("ReportSession"))
{
param.Value = "{{SessionId}}";
}
}
}
}
b. Now that we found the response, we need to add an extraction rule. This code creates
the extraction rule and then finds the correct request in the web test to add the
extraction rule to. Each result object has a property called DeclarativeWebTestItemId,
which is what we will use to get the correct request from the web test.
WebTestRequest requestInWebTest =
e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
if (requestInWebTest != null)
{
requestInWebTest.ExtractionRuleReferences.Add(ruleReference);
e.RecordedWebTestModified = true;
}
c. Now we need to find all query string parameters that have ReportSession as name and
change the value to {{SessionId}}
WebTestRequest requestInWebTest =
e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
6) Now that we have our plug-in, we need to compile and deploy it to one of the locations listed
above.
7) Restart VS
8) Open a test project and create a new web test. I now see the following dialog with my plug-in
available:
12) Now look at the first request in the web test and you will see the extraction rule.
This is a slightly more advanced feature, but it provides a huge time savings for automating
changes to your recorded web test. If you have multiple people creating web tests, you can use
this plug-in to make sure the same parameters or rules are added to each web test. And of
course you can automate correlation of parameters or URLs which the built-in correlation tool
does not find.
In my post “Creating a Stand-Alone Network Emulator using VS2010 – Beta 1”, I showed you how to
create a stand-alone network emulator using the network emulation functionality introduced in the Beta
1 release of VS2010. Since then, the API for network emulation has gone through several changes and,
long story short, the Beta 1 API will not work for RC or RTM. To make things a bit easier, I have
created a new “Stand-Alone” Network Emulator UI (NEUI) that will allow you to take advantage of the
Network Emulation features in VS2010 without having to fire up VS2010 and start a unit or load test. I
have posted the source for this project on CodePlex for everyone to enjoy :). For now, it is in the project
“Web and Load Test Plugins for Visual Studio Team Test”, but my hope is that it will gain enough community
support and involvement that it will warrant going through the process of creating and maintaining it as
a separate project.
The emulator:
- uses WPF
- allows the user to select one network profile to emulate a specific network
- when minimized, displays in the system tray
- when in the system tray, allows starting and stopping network emulation and selecting the network profile
Please feel free to download and use the emulator. Also, if you feel strongly enough, feel free to
suggest or contribute new features.
http://blogs.msdn.com/profiler/archive/2008/10/15/walkthroughs-using-VS-test-and-profilers-to-find-performance-issues.aspx
http://msdn.microsoft.com/en-us/magazine/cc337887.aspx?pr=blog
http://www.codeguru.com/cpp/v-s/devstudio_macros/visualstudionet/article.php/c14823__1/
http://blogs.msdn.com/profiler/archive/2007/10/19/articles-on-new-visual-studio-team-system-2008-profiler-features.aspx
Average, max, and min time taken for each page type
logparser -i:IISW3C -recurse:-1 -Q:on "SELECT EXTRACT_EXTENSION(cs-uri-stem) as Type, AVG(time-
taken) AS Average, MAX(time-taken) AS Maximum, MIN(time-taken) AS Minimum INTO
PageTimes.txt FROM ex*.log WHERE time-taken > 0 GROUP BY Type ORDER BY
Average DESC"
Pulling data from inside the body string of event viewer logs
logparser -i:evt "SELECT extract_prefix(extract_suffix(Strings,0,'left text'),0,'right
text') as String INTO optimizer.txt FROM *.EVT WHERE Strings LIKE '%Optimizer
Results%'" -q:ON
(variation) Pulling data from inside the body string of event viewer logs constrained by timeframe
logparser -i:evt -q:ON "SELECT Count(*) AS Qty, SUBSTR(extract_suffix(Message, 0,
'Message :'), 0, 75) as String FROM \\<machine name>\Application WHERE SourceName LIKE
'%Enterprise%' AND Message LIKE '%Timestamp: %' AND TimeGenerated > TIMESTAMP
('2008-06-06 07:23:15', 'yyyy-MM-dd hh:mm:ss' ) GROUP BY String ORDER BY Qty DESC"
List of exceptions from saved event logs searching for keywords in the text output
-I:evt "SELECT QUANTIZE(TimeGenerated, 3600) AS Hour, COUNT(*) As Total, ComputerName
FROM *.evt WHERE EventID = 100 AND strings like '%overflow%' GROUP BY ComputerName,
hour"
Logparser command for querying netstat
netstat.exe -anp TCP | LogParser "SELECT [Local Address] AS Server, [Foreign Address]
AS Client, State FROM STDIN WHERE Server LIKE '%:443' OR Server LIKE '%:80'" -i:TSV
-iSeparator:space -nSep:2 -fixedSep:OFF -nSkipLines:3 -o:TSV -headers:ON
Command to query Netmon file and list out data on each TCP conversation
LogParser -fMode:TCPConn -rtp:-1 "SELECT DateTime, TO_INT(TimeTaken) AS Time,
DstPayloadBytes, SUBSTR(DstPayload, 0, 128) AS Start_Of_Payload INTO IE-Take2.txt FROM
IE-Take2.cap WHERE DstPort=80 ORDER BY DateTime ASC" -headers:ON
Command to query Netmon and find frame numbers based on specific text in payload
LogParser -fMode:TCPIP -rtp:-1 "SELECT Frame, Payload INTO 3dvia.txt FROM 3dvia.cap
WHERE DstPort=80 AND Payload LIKE '%ppContent%' " -headers:ON
The problem is that in the Microsoft-Ajax partial rendering (update panel) responses, hidden fields can
appear in two places: a field that is marked by the type “|hiddenField|” (where we were looking), but
also in a regular hidden field input tag in the HTML within an “|updatePanel|” field in the Ajax response
(which we were not looking at).
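To illustrate, a partial-rendering response is a pipe-delimited series of length|type|id|content entries; the fragment below is invented for illustration (the lengths, names, and values are placeholders, not captured from a real application), but it shows both places a hidden field can appear:

```
142|updatePanel|UpdatePanel1|
  <div>
    ...
    <input type="hidden" name="SomeHiddenField" value="..." />
  </div>|
56|hiddenField|__VIEWSTATE|/wEPDwUK...|
```

A hidden-field scan therefore has to look both at the |hiddenField| entries and inside the HTML carried by each |updatePanel| entry.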
http://blogs.msdn.com/irenak/archive/2008/02/22/sysk-365-how-to-get-your-unit-tests-test-project-in-visual-studio-2008-a-k-a-mstest
KB 956397 (http://support.microsoft.com/kb/956397/en-us)
http://blogs.msdn.com/billbar/archive/2008/08/04/bug-in-VS-2008-sp1-causes-think-time-for-redirected-requests-to-be-ignored-in-a-load-test.aspx
Four New Methods added to the WebTestPlugin Class for 2008 SP1
http://blogs.msdn.com/billbar/pages/web-test-api-enhancements-available-in-VS-2008-sp1-beta.aspx
D
Data Collectors, 85
data source, 15, 74, 77, 87, 150
declarative web test, 10, 52, 53, 86, 88, 92, 148, 149, 151, 162, 170, 171, 174, 176
dependent requests, 12, 16, 72, 84, 85, 97, 148, 150, 153, 175
Deployment, 55, 63, 64, 65, 167

E
Execution Interleaving, 26
extract, 11, 24, 25, 34, 94, 149, 158, 174, 178, 197

F
Fiddler, 54, 86

H
HIDDEN parameters, 94, 149, 198
HTTP Headers, 10, 31, 72, 85, 143, 150, 197
  Content-Type, 10
  If-Modified-Since, 150
  Pragma, 10
  Referrer, 10
  SOAPAction, 10

N
Network
  Firewall, 129, 130, 131, 132, 137, 138
  Netmon, 86, 197
  Netstat, 93, 197
  TCP Parameters, 93
  TCPView, 166
  Tracing, 79, 133, 135, 136, 141, 195
NUnit, 26

P
Parameter Data
  Data Source, 15, 74, 77, 87
  Random, 15, 129, 130, 151
  Sequential, 15, 48
  Unique, 15, 58, 72, 81, 85, 147, 151, 157
Parameters, 12, 25, 31, 32, 34, 50, 62, 87, 93, 94, 144, 148, 150, 151, 154, 156, 157, 158, 160, 161, 162, 169, 198
performance counters, 21, 68, 70, 71, 72, 79, 89, 195
Performance Monitor, 71, 72
Permissions, 130, 131
phishing, 88
processor, 21, 46, 69, 70, 92
proxy server, 54, 168