Best Practices To Improve ASP.NET Web Application Performance
1. Turn Off Tracing Unless Required
Enabling tracing adds performance overhead and might expose private information, so it should be enabled only while an application is being actively analyzed.
Solution
Unless you are actively profiling or debugging, leave tracing disabled. You can turn it off for the whole application in web.config by setting enabled="false" on the trace element (<trace enabled="false" />).
2. Turn Off Session State If Not Required
ASP.NET manages session state by default, so you pay its cost in memory even if you never use it. Whether you store session data in-process, on a state server, or in a SQL Server database, session state consumes memory, and storing or retrieving data from it takes time.
Solution
You may not require session state when your pages are static or when you do not need to store information captured in the page. In such cases, disable it on the web form using the directive:
<%@ Page EnableSessionState="false" %>
If you use session state only to retrieve data and never to update it, make the session state read-only by using the directive:
<%@ Page EnableSessionState="ReadOnly" %>
3. Disable View State When Possible
There are a number of drawbacks to the use of view state, however. It increases the total payload of the page, both when it is served and when it is posted back. There is also additional overhead incurred when serializing and deserializing the view state data that is posted back to the server.
Solution
Pages that do not have any server postback events can have view state turned off. View state is enabled by default, but if you don't need it, you can turn it off at the control or page level. Within a control, simply set the EnableViewState property to false, or set it globally within the page using this setting:
<%@ Page EnableViewState="false" %>
If you turn view state off for a page or control, make sure you thoroughly test your pages to verify that they continue to function correctly.
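As a small sketch, view state can also be switched off for an individual control from code-behind; the control name here is hypothetical:
protected void Page_Load(object sender, EventArgs e)
{
    // grdOrders is a hypothetical GridView whose contents are re-bound on every request,
    // so persisting its view state across postbacks buys nothing.
    grdOrders.EnableViewState = false;
}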
4. Set debug="false" in web.config
Setting debug="true" requires PDB information to be inserted into the compiled files. This results in comparatively larger files, and processing will be slower.
Solution
Always set debug="false" on the compilation element in web.config (<compilation debug="false" />) before deploying the application to production.
5. Avoid Response.Redirect
The Response.Redirect() method simply tells the browser to visit another page.
How it affects performance
Redirects are also very chatty: each one costs an extra round trip to the browser. They should only be used when you are transferring users to another physical web server.
Solution
For any transfer within the same server, use Server.Transfer. You will save a lot of needless HTTP requests: instead of telling the browser to redirect, it simply changes the "focus" on the web server and transfers the request. This means fewer HTTP round trips, which eases the pressure on your web server and makes your application run faster.
Tradeoffs
Server.Transfer can really help streamline data-entry flows, although it may cause confusion when debugging, because the URL in the browser no longer matches the page being rendered. Also, to reduce the CLR exception count, use Response.Redirect(".aspx", false) instead of Response.Redirect(".aspx"); passing false avoids the ThreadAbortException that Response.Redirect otherwise raises when it ends the response.
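As a rough sketch (page names and the handler are hypothetical), the two options look like this in code-behind:
protected void btnNext_Click(object sender, EventArgs e)
{
    // Same web server: hand the request to the next page without an extra browser round trip.
    Server.Transfer("Confirm.aspx");

    // Different server, or when the browser URL must change: redirect, passing false so the
    // current response is not terminated with a ThreadAbortException.
    // Response.Redirect("http://other-server/Confirm.aspx", false);
}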
6. Use StringBuilder to Concatenate Strings
The String type is costly when you repeatedly append or concatenate text. Strings are immutable: when a string is modified, the runtime creates a new string and returns it, leaving the original to be garbage collected. Most of the time this is a fast and simple way to work, but when a string is modified repeatedly it becomes a burden on performance: all of those allocations eventually get expensive.
Solution
Use StringBuilder whenever string concatenation is needed; it appends to a single internal buffer instead of allocating a new string object for every operation.
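A minimal sketch of the difference:
// Repeated += creates a brand-new string on every iteration.
string csvSlow = string.Empty;
for (int i = 0; i < 1000; i++)
{
    csvSlow += i + ",";
}

// StringBuilder appends into one growable buffer and allocates a string only at the end.
var csv = new System.Text.StringBuilder();
for (int i = 0; i < 1000; i++)
{
    csv.Append(i).Append(',');
}
string result = csv.ToString();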
7. Avoid Throwing Exceptions Unnecessarily
Exceptions are probably one of the heaviest resource hogs and causes of slowdowns you will ever see, in web applications as well as in Windows applications.
Solution
You can use as many try/catch blocks as you want; using exceptions gratuitously is where you lose performance. For example, stay away from using exceptions for control flow.
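For example, parsing user input with TryParse avoids raising an exception on bad input; a small sketch (txtQuantity is a hypothetical TextBox):
// Exception-based control flow: every bad input pays the cost of a thrown exception.
int quantity;
try
{
    quantity = int.Parse(txtQuantity.Text);
}
catch (FormatException)
{
    quantity = 0;
}

// Cheaper: TryParse reports failure through its return value instead of throwing.
if (!int.TryParse(txtQuantity.Text, out quantity))
{
    quantity = 0;
}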
8. Use the finally Block to Release Resources
The finally block executes regardless of whether the code in the try block succeeded or control passed to the catch block. Use it to release resources such as database connections, file handles, and other unmanaged resources so that they are always cleaned up.
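A short sketch of the pattern (the connection string is a placeholder):
// Requires using System.Data.SqlClient;
SqlConnection connection = new SqlConnection(connectionString); // connectionString assumed to be defined elsewhere
try
{
    connection.Open();
    // ... execute commands ...
}
finally
{
    // Runs whether the try block succeeded or an exception was thrown.
    connection.Close();
}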
9. Use Client-Side Validation
Client-side validation can help reduce the round trips required to process a user's request. In ASP.NET, you can use validation controls that run on the client to validate user input. However, always validate on the server side as well, to handle the infamous JavaScript-disabled scenario.
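A minimal sketch of the server-side check (the button and handler names are hypothetical):
protected void btnSubmit_Click(object sender, EventArgs e)
{
    // The validators also run on the server during the postback; never trust
    // the client-side pass alone, since JavaScript may be disabled or bypassed.
    if (!Page.IsValid)
    {
        return;
    }
    // ... safe to process the submitted values here ...
}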
10. Avoid Unnecessary Round Trips to the Server
Round trips significantly affect performance. They are subject to network latency and to downstream server latency. Many data-driven Web sites heavily access the database for every user request. While connection pooling helps, the increased network traffic and processing load on the database server can adversely affect performance.
Solution
1. Keep round trips to an absolute minimum.
2. Implement an Ajax UI whenever possible. The idea is to avoid a full page refresh and update only the portion of the page that needs to change (see the sketch below).
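One lightweight way to fetch just a fragment of data is an ASP.NET page method, sketched here under the assumption that the page has a ScriptManager with EnablePageMethods="true"; the names are hypothetical:
[System.Web.Services.WebMethod]
public static string GetOrderStatus(int orderId)
{
    // Called from client script via PageMethods.GetOrderStatus(id, callback);
    // returns only the small piece of data needed, instead of re-rendering the whole page.
    return OrderRepository.GetStatus(orderId); // OrderRepository is a hypothetical data-access class
}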
11. Use Page.IsPostBack
Make sure you don't execute code needlessly. Use the Page.IsPostBack property to ensure that you only perform page-initialization logic when a page is loaded for the first time and not in response to client postbacks.
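A typical sketch in Page_Load (the data-binding method is hypothetical):
protected void Page_Load(object sender, EventArgs e)
{
    if (!Page.IsPostBack)
    {
        // Runs only on the first request for the page,
        // not on every button click or other postback.
        BindCountryDropDown();
    }
}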
12. Include Return Statements in Your Functions/Methods
Explicitly using return allows the JIT to perform slightly more optimizations. Without a return statement, each function/method is given several local variables on the stack to transparently support returning values without the keyword. Keeping these around makes it harder for the JIT to optimize, and can impact the performance of your code. Look through your functions/methods and insert return as needed. It doesn't change the semantics of the code at all, and it can help you get more speed from your application.
13. Use Foreach loop instead of For loop for String Iteration
Foreach is far more readable, and in the future it will become as fast as a for loop for special cases like strings. Unless string manipulation is a real performance hog for you, the slightly messier for-loop code may not be worth it.
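The two iteration styles over a string, for comparison (Process is a hypothetical method):
string text = "performance";

// foreach: clearer, and the compiler handles the indexing for you.
foreach (char c in text)
{
    Process(c);
}

// for: equivalent, but you manage the index yourself.
for (int i = 0; i < text.Length; i++)
{
    Process(text[i]);
}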
14. Avoid Unnecessary Indirection
When you pass byRef, you pass a pointer instead of the actual object. Many times this makes sense (for side-effecting functions, for example), but you don't always need it. Passing pointers results in more indirection, which is slower than accessing a value that is on the stack.
Solution
When you don't need to go through the heap, it is best to avoid the extra indirection.
15. Use ArrayLists in Place of Arrays
An ArrayList has everything that is good about an array plus automatic sizing, Add, Insert, Remove, Sort, and BinarySearch. All these helper methods are available because ArrayList implements the IList interface.
Tradeoffs
17. Use Paging
Take advantage of paging's simplicity in .NET. Only show small subsets of data at a time, allowing the page to load faster.
Tradeoffs
Just be careful when you mix in caching. Don't cache all the data in the grid.
18. Use Caching
ASP.NET allows you to cache entire pages, fragments of pages, or controls. You can also cache variable data by specifying the parameters that the data depends on. By using caching, you help the ASP.NET engine return data for repeated requests for the same page much faster.
When and Why to Use Caching
Proper use and fine-tuning of your caching approach will result in better performance and scalability for your site. However, improper use of caching will actually slow the site down and consume a lot of server memory and processing. Good candidates for caching are data that changes infrequently and static portions of a web page.
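A small data-caching sketch using the ASP.NET cache, assuming it runs inside a page or user control; the key, the loader method, and the expiry are illustrative:
// Requires using System.Data;
DataTable products = (DataTable)Cache["ProductList"];
if (products == null)
{
    products = LoadProductsFromDatabase();   // hypothetical, relatively expensive query
    Cache.Insert("ProductList", products, null,
                 DateTime.UtcNow.AddMinutes(10),                    // absolute expiration
                 System.Web.Caching.Cache.NoSlidingExpiration);
}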
19. Choose the Authentication Mechanism Carefully
Authentication can also have an impact on the performance of your application. For example, Passport authentication is slower than forms-based authentication, which in turn is slower than Windows authentication.
20. Minimize the Use of Web Server Controls
Heavy use of web server controls increases the response time of your application because they need to be processed on the server side before they are rendered on the client.
Solution
One way to minimize the number of web server controls is to use plain HTML elements where they are suitable, for example when you only want to display static text.
21. Reduce Calls Between Managed and Unmanaged Code
Try to reduce the number of calls between managed and unmanaged code. Consider doing more work in each call rather than making frequent calls to do small tasks.
22. Avoid Chatty Calls in Distributed Applications
If you are working with distributed applications, each call involves additional overhead for negotiating network- and application-level protocols, and network speed can also be a bottleneck. Try to do as much work as possible in as few calls over the network as possible.
23. Clean Up Style Sheets and Script Files
Many websites use a single CSS style sheet or script file for the entire site. Sometimes just going through these files and cleaning them up can improve the performance of your site by reducing the page size. If your style sheet references images that are no longer used on the site, it's a waste to leave them in and have them loaded every time the style sheet is loaded. Also, run a web-page analyzer against the pages in your website so you can see exactly what is being loaded and what takes the most time to load.
24. Use Structs Judiciously
Use simple structs when you can, and when you don't do a lot of boxing and unboxing.
Tradeoffs
Value types are far less flexible than objects, and they end up hurting performance if used incorrectly. You need to be very careful about when you treat them like objects: each such use adds boxing and unboxing overhead to your program, and that can end up costing you more than if you had stuck with objects.
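A sketch of the boxing cost being described:
int total = 42;

// Storing a value type in an object (or in an ArrayList) boxes it: a new object
// is allocated on the heap and the value is copied into it.
object boxed = total;

// Reading it back as an int unboxes it: another copy, plus a runtime type check.
int copy = (int)boxed;

// An ArrayList of ints boxes every element; a plain int[] avoids that entirely.
var list = new System.Collections.ArrayList();
list.Add(total); // boxed here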
25. Minimize Assemblies
Minimize the number of assemblies you use to keep your working set small. If you load an entire assembly just to use one method, you're paying a tremendous cost for very little benefit. See if you can duplicate that method's functionality using code that you already have loaded.
26. Use ASCII Encoding When UTF-8 Is Not Needed
By default, ASP.NET is configured to encode requests and responses as UTF-8. If ASCII is all your application needs, eliminating the UTF-8 overhead can give you back a few cycles. Note that this can only be done on a per-application basis, via the globalization element in web.config.
27. Avoid Nested Loops and Deep Recursion
This is general advice for any programming language: nested loops and recursive functions can consume a lot of CPU time and memory. Avoid them where possible to improve performance.
28. Use ToString() Instead of Format()
When you can, use ToString() instead of String.Format(). In most cases it will provide the functionality you need with much less overhead.
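For example:
int pageCount = 12;

string a = pageCount.ToString();              // direct, minimal overhead
string b = string.Format("{0}", pageCount);   // same output, but pays for parsing the format
                                              // string and for boxing the int into an object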
29. Place Style Sheets in the Header
Web developers who care about performance want the browser to render whatever content it has as soon as possible. This is especially important for pages with a lot of content and for users with slow Internet connections. When the browser loads a page progressively, the header, the logo, and the navigation components serve as visual feedback for the user. When style sheets are placed near the bottom of the HTML, most browsers stop rendering to avoid having to redraw elements whose styles change, which hurts the perceived performance of the page. So always place style sheets in the header.
30. Place Scripts at the End of the Document
Unlike style sheets, it is better to place scripts at the end of the document. Progressive rendering is blocked until all style sheets have been downloaded, and scripts block progressive rendering for all content below them until they are fully loaded. Moreover, while downloading a script the browser does not start any other downloads, even from different hostnames. So always put scripts at the end of the document.
31. Use External JavaScript and CSS Files
Using external files generally produces faster pages because the browser caches the JavaScript and CSS files. Inline JavaScript and CSS increase the HTML document size but reduce the number of HTTP requests. With cached external files, the HTML stays small without increasing the number of HTTP requests, improving overall performance.
Return multiple resultsets in a single database request, so that you can cut the total time spent communicating with the database. You'll be making your system more scalable, too, as you'll cut down on the work the database server is doing managing requests.
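A hedged sketch of reading two resultsets from one command; the SQL, tables, and connection string are placeholders:
// Requires using System.Data.SqlClient;
using (var connection = new SqlConnection(connectionString))   // connectionString assumed defined
using (var command = new SqlCommand(
    "SELECT Id, Name FROM Customers; SELECT Id, Total FROM Orders;", connection))
{
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // ... first resultset: customers ...
        }
        reader.NextResult();   // move to the second resultset without another database call
        while (reader.Read())
        {
            // ... second resultset: orders ...
        }
    }
}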
Queries that process and then return more columns or rows than necessary waste processing cycles that could best be used for servicing other requests.
Causes of Inefficient Queries
1. Too much data in your results is usually the result of inefficient queries.
2. The SELECT statement often does not need to return all the columns in a row, so request only the columns you actually use. Also analyze the WHERE clause in your queries to ensure that you are not returning too many rows; make it as specific as possible so that the least number of rows is returned (see the sketch below).
3. Queries that do not take advantage of indexes may also cause poor performance.
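For example (the table, columns, and parameter are illustrative, and the connection is assumed to be open):
// Select only the columns you use, and keep the WHERE clause as selective as possible.
using (var command = new SqlCommand(
    "SELECT OrderId, OrderDate, Total FROM Orders WHERE CustomerId = @customerId", connection))
{
    command.Parameters.AddWithValue("@customerId", customerId);
    // ... ExecuteReader and read only what the page actually displays ...
}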
Opening and closing database connections for every operation is expensive, and connections are a limited resource. Connection pooling mitigates this cost, but it only works well when connections are released promptly and all requests share the same connection string.
Solution
To ensure the efficient use of connection pooling, avoid keeping connections open and avoid varying connection strings.
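A minimal sketch of pooling-friendly usage; the configuration key is hypothetical:
// Keep the connection string identical across requests (e.g. read it once from configuration)
// so all requests draw from the same pool. Requires using System.Data.SqlClient;
string connectionString =
    System.Configuration.ConfigurationManager.ConnectionStrings["MainDb"].ConnectionString;

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();      // taken from the pool
    // ... do the work ...
}                           // Dispose returns the connection to the pool immediately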
If you select the wrong type of transaction management, you may add latency to each operation. Additionally, if you keep transactions active for long periods of time, the active transactions may cause resource pressure.
Solution
Transactions are necessary to ensure the integrity of your data, but you need to ensure that you use the appropriate type of transaction for the shortest duration possible and only where necessary.
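One way to keep a transaction short and scoped, sketched here with TransactionScope; the data-access calls are hypothetical:
// Requires a reference to System.Transactions
using (var scope = new System.Transactions.TransactionScope())
{
    DebitAccount(fromAccountId, amount);   // hypothetical data-access calls that enlist
    CreditAccount(toAccountId, amount);    // in the ambient transaction

    scope.Complete();   // commit; leaving the using block without Complete() rolls back
}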
10. Reduce Serialization
DataSet serialization is implemented more efficiently in .NET Framework version 1.1 than in version 1.0. However, DataSet serialization often introduces performance bottlenecks. You can reduce the performance impact in a number of ways:
1. Use column name aliasing.
2. Avoid serializing multiple versions of the same data.
11. Avoid Using CommandBuilder Objects at Run Time
CommandBuilder objects such as SqlCommandBuilder and OleDbCommandBuilder are useful when you are designing and prototyping your application. However, you should not use them in production applications, because the processing required to generate the commands affects performance.
Solution
Manually create stored procedures for your commands, or use the Visual Studio .NET design-time wizard and customize them later if necessary.
12. Use Stored Procedures
Stored procedures do not have to be interpreted, compiled, or even transmitted from the client, and they cut down on both network traffic and server overhead. Use stored procedures instead of ad hoc SQL statements whenever possible.
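Calling a manually created stored procedure from ADO.NET, as a sketch; the procedure and parameter names are hypothetical and the connection is assumed to be open:
using (var command = new SqlCommand("usp_GetCustomerOrders", connection))
{
    command.CommandType = CommandType.StoredProcedure;   // CommandType lives in System.Data
    command.Parameters.AddWithValue("@CustomerId", customerId);
    using (SqlDataReader reader = command.ExecuteReader())
    {
        // ... read results ...
    }
}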
13. Avoid Auto-Generated Commands
When using a data adapter, avoid auto-generated commands. They require additional trips to the server to retrieve metadata and give you a lower level of control over the interaction. While auto-generated commands are convenient, it's worth the effort to write the commands yourself in performance-critical applications.
14. Use CommandBehavior.SequentialAccess for Large Values
With a data reader, use CommandBehavior.SequentialAccess. This is essential when dealing with BLOB data types, since it allows data to be read off the wire in small chunks. Although you can only work with one piece of the data at a time, the latency of loading a large value disappears. If you don't need the whole object at once, SequentialAccess will give you much better performance.
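A hedged sketch of streaming a BLOB column with SequentialAccess; the table, columns, and output stream are hypothetical, and the connection is assumed to be open:
using (var command = new SqlCommand("SELECT FileName, FileData FROM Documents WHERE Id = @id", connection))
{
    command.Parameters.AddWithValue("@id", documentId);
    using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        if (reader.Read())
        {
            // With SequentialAccess, columns must be read in order.
            string fileName = reader.GetString(0);

            byte[] buffer = new byte[8192];
            long offset = 0;
            long bytesRead;
            while ((bytesRead = reader.GetBytes(1, offset, buffer, 0, buffer.Length)) > 0)
            {
                outputStream.Write(buffer, 0, (int)bytesRead);  // outputStream: hypothetical destination
                offset += bytesRead;
            }
        }
    }
}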
15. Use Early Binding
When you compile, use early binding. This tells the compiler to perform type coercion only where it is explicitly requested. This has two major effects:
1. Strange errors become easier to track down.
2. Unneeded coercions are eliminated, leading to substantial performance improvements.
When you use an object as if it were of a different type, Visual Basic will coerce the object for you if you don't specify the conversion. This is handy, since it means less code for the programmer to write, but the implicit coercions carry a runtime cost; in Visual Basic, turning on Option Strict enforces early binding.