Net Unit 3
Data Access
Introduction
• Data access refers to the process of retrieving and interacting with data stored in various
sources, such as databases, files, APIs and other data repositories.
• It plays a crucial role in modern information systems, enabling applications and users to
efficiently retrieve, manipulate, and update data for various purposes, such as analysis,
reporting, decision-making, and more.
• Data access is the key to unlocking the valuable insights and information stored in
various data sources.
• Data access involves querying, updating, inserting, and deleting data for Retrieval and
Manipulation.
A History of Microsoft Data Access Technologies
• Microsoft has played a key role in providing developers with tools to develop database
applications since the release of SQL Server 1.0 in 1989
• Microsoft's tools targeted not only their own database products, but also thousands of
other databases and data sources through a uniform set of data access clients
• ADO.NET's features and functionality today are a result of Microsoft's many years of
experience developing data-access solutions and the evolution of data into a
platform-agnostic, industry-standard XML format
• The evolution of data into a common format is a powerful and simple concept that can be
traced back to Microsoft's history of providing these technologies
• The evolution involved moving from a platform-dependent aggregation of data through
technologies like ODBC drivers or OLEDB providers to a platform-agnostic format
represented in XML.
Open Database Connectivity
• After the initial release of Microsoft SQL Server, Microsoft saw the need to provide a standard
method of connectivity for developers to utilize SQL Server from their applications
• To address this need, Microsoft, IBM, and other manufacturers developed the Open Database
Connectivity API (ODBC)
• The ODBC specification provided a standard low-level API for database vendors to provide
database-specific drivers, allowing developers to interact with databases through a standard
interface
• ODBC has become the most widely accepted interface for accessing nearly every popular relational
database and a wide variety of nonrelational data sources
• The primary drawback of using the ODBC API was the difficulty in using a low-level API for data
access, which led to the introduction of higher-level, object-oriented means of accessing the ODBC
API
Visual Basic 3.0
• Adoption of Visual Basic by corporate developers began with the release of Visual Basic
3.0 in 1993
• Visual Basic 3.0 provided developers with a method to easily connect to a variety of data
sources and build more robust, data-driven applications
• Corporations could utilize Visual Basic's rapid application development strengths to
quickly build solutions that utilized new and existing databases throughout their
organizations
• The two primary technologies that made this possible in VB 3.0 were the Jet database
engine and a new object model called Data Access Objects
Jet database engine
• The Jet database engine was initially developed as the core database engine built into the Microsoft Access database
• Until the release of VB 3.0, the Jet engine was specific to Access and its features could not be used from any other product
• With the release of VB 3.0, Microsoft shipped a version of the Jet engine that allowed developers to use its services to interact
with any database that provided an ODBC driver
• The Jet engine's initial focus was on ISAM databases such as Microsoft Access, FoxPro, or dBase
• The Jet engine's flexibility in utilizing any underlying ODBC driver provided a standard interface to a growing number of data
sources, making it an extremely popular data-access solution
• One of the primary problems of early adoption of the Jet database engine through VB was its size, which was over one
megabyte in memory during use
• The Jet engine provided a thick layer between the client application and the database, which added a large amount of overhead
to even the most basic database functions
• All query processing occurred on the client, which meant that any client request for a small subset of data required that the
entire table's data be moved across the network and onto the client computer, where filtering occurred within the Jet engine
itself
Data Access Objects (DAO)
• Data Access Objects is an abstract object model and framework that provides a standardized and simplified
way for developers to interact with data sources, primarily using the Microsoft Jet Database Engine.
• DAO provided a simple and flexible method for connecting to and manipulating data in any data source
compatible with the Jet engine
• Although DAOs were subject to architectural limitations imposed by the Jet engine, the simplicity of data
access through the DAO structure allowed developers to quickly develop robust and powerful database
applications in VB
• DAO provided a platform for third-party vendors to begin building a huge market of data-bound controls and
widgets
• The new capability to rapidly build database applications using DAO served to increase demand for tools and
controls that made many of the more difficult development tasks easier
• As new versions of Access and Visual Basic were released, and new features and functionality were added to
the underlying Jet engine, the DAO object model also grew to become a much more powerful data-access
tool utilized by millions of VB developers worldwide.
Visual Basic 4.0
• The popularity of Visual Basic began to skyrocket with the release of VB 3.0 and the
sudden availability of data-access tools.
• Microsoft began to build on those tools and address the architectural and functional
limitations they imposed.
• With the release of VB 4.0, Microsoft not only extended the functionality already
available through the DAO/Jet paradigm but also delivered two new database-access
methods.
• These new methods enabled developers to take advantage of the growing power of full
RDBMS systems.
VBSQL
• Visual Basic 4.0 included support for a SQL Server-specific API called VBSQL.
• VBSQL provided a low-level API for connecting directly to a SQL Server database.
• VBSQL was built around the C-based DB-Library and served as a lightweight, high-speed
interface that was relatively easy to code against from VB.
• VBSQL could be used only to connect to a SQL Server database, which severely
hindered its acceptance by the development community.
• More object-oriented and database-neutral methods for database access became available,
and developers became less likely to code directly in a database-specific API.
RDO
• VB 4.0 included a new object model for data access called Remote Data Objects (RDO).
• RDO was included to address design and scalability issues that developers were facing when
developing large distributed client/server applications with DAO and Jet.
• DAO and Jet required a heavy amount of processing and memory on the client, while
RDO provided a much smaller and faster client-side object model and allowed the RDBMS
to bear the brunt of the processing.
• RDO was not intended to replace the DAO/Jet data access method, which was still suitable for
Access and other ISAM databases.
• RDO served as a thin object interface directly to the underlying ODBC drivers.
• The RDO object model consisted of just 10 objects, as compared to the 17 objects provided by
DAO.
• By allowing the database to handle user accounts and security, the RDO object model did not need
to include specific object interfaces to expose them to developers.
OLEDB
• Microsoft adopted OLEDB providers as the core technology for interacting with diverse data
stores.
• Microsoft aimed to offer developers a data consumer for various client applications.
• Goal: Provide a user-friendly, object-oriented interface for any OLEDB-exposed data source.
• The data consumer should balance power and simplicity for developers.
• The envisioned consumer would enable different applications to connect to OLEDB-exposed
data.
• The data consumer would contribute to versatile data access across relational and nonrelational
data sources.
ADO
• ActiveX Data Objects (ADO) is the most widely adopted object-oriented data-access technology developed by
Microsoft
• ADO was initially used heavily only from Active Server Pages (ASP) to develop dynamic Web sites
• Microsoft included both its current OLEDB providers and the newest version of its ADO objects in a single data-access
package called Microsoft Data Access Components (MDAC)
• The ADO object model consists of just seven objects, providing developers the ability to query and manipulate data from any
OLEDB-compliant provider
• ADO developers can create and manipulate ADO Recordset-type objects directly, which allows them immediate access to the
underlying data and requires them to write much less plumbing code
• ADO does not always provide the optimal solution for data access in the disconnected world of the Internet
• ADO has evolved over the years to help address the growing disconnected nature of the Internet by providing functionality
such as Remote Data Services, disconnected Recordsets, and XML-based persistence
• The ADO Recordset object plays a key role in providing developers the flexibility to solve their data access needs with a single
object model.
Recordsets
• The entire development paradigm presented by ADO centers around the Recordset object.
• The Recordset object serves as a developer's primary interface when using ADO to interact with a
database.
• All manipulation of the underlying data using ADO occurs through the Recordset object.
• The ADO subsystem handles the details of making sure all changes are made back to the database.
• The Recordset object allows you to programmatically manipulate a subset of data from a database
using a variety of different objects, techniques, and cursor models.
• ADO lacks a simple way to expose the extended types and features provided by any
individual database or product in a standard fashion for developers to work with, a
shortcoming that ADO.NET later addressed.
• ADO provides the capability to disconnect a Recordset from a data source by removing the
reference to its open database connection.
• Microsoft has evolved the current iteration of ADO to address some of the issues facing developers in highly distributed
Web-based application development.
• Developers have needed a way to pass sets of data between processes and computers, which was initially addressed by
providing the capability to use disconnected Recordsets in ADO.
• Remote Data Services (RDS) was another solution presented for the problem of passing sets of data between processes and
computers, but it did not gain wide acceptance among developers.
• ADO.NET has its foundation in the capability to transfer a set of data to and from an industry-standard XML format.
• DAO, RDO, and ADO share a lot of common characteristics in regards to the object models and interfaces exposed to
developers.
• All Microsoft's previous implementations of data-access technologies are tied to the Windows platform.
• The Internet has given rise to a more open and platform-agnostic environment built on standards such as HTML and XML.
• Microsoft assessed the current development community and the types of issues facing developers today to build upon the
lessons learned from developers using previous Microsoft data-access technologies.
Data Access Today
• ADO.NET was developed to solve the problems that existed with the available methods of data access and to look forward to the
types of applications developers will be building in the future.
• The basic development community and focus have shifted toward the Internet and Web-based development for the better part of
the last decade.
• Today's applications target a Web-server environment with possibly thousands of disconnected users who perform updates to one
or more back-end databases.
• The Web-based environment has been built heavily on a large number of industry-adopted standards, such as TCP/IP, HTTP,
HTML, and XML.
• The adoption of standards such as these has allowed the Internet and applications built upon the Internet to transcend beyond a
single platform or development tool.
• The Internet provides a common platform for devices such as cell phones or PDAs to communicate.
• The rise in popularity of handheld devices has increased the need for a thinner, more disconnected form to manipulate relational
data away from the server.
• This type of environment presents an entirely new set of challenges beyond those specifically addressed by DAO, RDO, or even ADO.
Visual Basic and the Internet
• The Internet has created the most dramatic shift in the development structure of most VB developers.
• VB developers are familiar with building applications consisting of multiple forms and controls tied together
by common variables and an extremely event-driven paradigm.
• The Web works on an entirely different development model where applications are pieced together through a
set of related but stateless pages that post information from page to page to maintain a consistent
programmatic flow.
• Prior to the introduction of .NET, the differences in the basic development paradigm presented by both VB
and the standard ASP Web structure provided a steep learning curve for most VB developers.
• Finding the best way to utilize data access within each of these models presented a significant challenge to
most developers.
• In the Web's stateless environment, traditional data-access methods such as DAO, RDO, or ADO
required all the base objects (Connections, Recordsets, and so on) to be rebuilt during each call to a page.
• The concept of Web farms (applications running on a large number of identical Web servers) evolved,
requiring even more database interaction for applications that didn't generally use the database for the bulk of
their work.
Enterprise Application Integration (EAI)
• Enterprise Application Integration (EAI) is a key concept being addressed by developers today.
• A common interface for integration has become an integral piece to the enterprise puzzle due to the deployment of diverse
and powerful applications throughout the corporate enterprise.
• Integrating disparate applications is complex due to the many hardware and software platforms in today's corporate
environment.
• Microsoft needed to address the manipulation and relation of data from a wide range of applications and platforms with its
next generation of data access technology.
• Microsoft's BizTalk Server 2000 provides a standard platform for EAI built heavily on an XML-messaging paradigm.
• The next generation of data access tools needed to provide developers with a way to easily manipulate XML documents
from products such as BizTalk Server without requiring them to learn an entire new set of development technologies.
• Microsoft's next evolutionary step in data-access technologies needed to build on its past successes and solutions while
providing flexibility for developers to take advantage of current technologies and platforms.
• The solution needed to provide the type of connected access that developers have become familiar with while
accommodating a new disconnected paradigm shift.
• Microsoft presented the first release of its new ADO.NET Framework for data access with flexibility in mind.
Overview of ADO.NET
• ADO.NET provides a new mindset for data access that differs from anything previously available.
• ADO.NET uses XML as the underlying format to represent and manipulate data, allowing for greater flexibility and
cross-platform compatibility.
• Prior versions of ADO relied on OLEDB providers, which forced database vendors to standardize on Microsoft's
specification for the format and structure of an OLEDB provider.
• ADO.NET provides a new type of provider that translates data from data stores into XML and
translates the returned XML back into a format understandable by the data store.
• ADO.NET supports viewing and manipulating XML data from any source through a very database-like object model.
• The .NET Framework offers a number of different possibilities for developers when manipulating XML documents.
• ADO.NET provides a DataSet object that offers a table/row/field-like interface for interacting with data, which should
be familiar to database developers.
• The XML objects provided by the .NET Framework allow a more DOM-like metaphor for manipulating the same
data.
• Microsoft has provided a high level of overlap between these two methods for editing XML documents, which
provides developers with a tightly integrated environment that offers complete control over the structure of and the
data stored in the underlying XML document.
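The overlap between the two editing metaphors can be sketched with the XmlDataDocument class, which keeps a DOM view synchronized with a DataSet; the file and element names here are hypothetical:

```csharp
using System.Data;
using System.Xml;

// Load an XML document through the table/row/field metaphor.
DataSet ds = new DataSet();
ds.ReadXml("customers.xml");   // hypothetical file

// Wrap the same data in a DOM-like view; the two stay synchronized,
// so an edit through either object model is visible through the other.
XmlDataDocument doc = new XmlDataDocument(ds);
XmlNodeList names = doc.GetElementsByTagName("CompanyName");
```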
ADO.NET structure
• The ADO.NET object model revolves around two main groups of objects: DataSets and data providers.
• DataSets and related objects provide a database-neutral view of any data that can be exposed as an XML document.
• Data providers serve as the bridge between DataSets and data sources, allowing DataSets to remain isolated from
any specific data implementation or source.
• A single ADO.NET DataSet can encapsulate a large group of disparate tables while maintaining a consistent
relationship between them all.
• Developers can add tables, rows, and columns programmatically without any direct contact with an underlying data
store.
• The entire relational structure can be passed safely from machine to machine as a simple XML stream, while
retaining all structures and integrity.
• ADO.NET provides developers with the tools to develop the next generation of distributed applications on
the .NET Framework.
DataSets
• The most dramatic new addition to the ADO.NET architecture is the DataSet.
• DataSet provides an object model to manipulate one or more tables of data and maintain
relationships between tables.
• A DataSet can be loaded through a data provider or other means, and does not require any
particular implementation of a driver or provider.
• DataSets can be used to manipulate any data that can be exposed as XML, including file-
based configuration files.
• It is easily possible to load certain tables from one or more databases, certain tables from
text files, and even programmatically create certain tables within the same DataSet.
DataSet object model
• The root of the DataSet object model is the DataSet object itself, which handles all the base services for the entire
underlying structure, such as serializing to and from XML.
• The DataSet object provides access to the many objects that make up the entire DataSet object model, including
DataTable, DataRow, DataColumn, DataRelation, and Constraint.
• DataTable represents a single table of data within a DataSet, and a DataSet may contain multiple DataTable objects.
• DataRow represents the data within a DataTable as an array of fields defined by the DataColumn collection.
• DataColumn collection specifies information about the individual columns of data in each table, beyond just name and
data type.
• DataRelation collection specifies information regarding the specific table and column relationships that need to be
maintained between two DataTable objects in a single DataSet.
• The Constraint objects provide a means for developers to specify rules, such as uniqueness, that must be enforced on a particular
column in a DataTable.
• All these objects work together to provide a robust and dynamic object model for data access.
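How these objects cooperate can be sketched by building a small two-table DataSet entirely in memory; the table and column names are invented for illustration:

```csharp
using System.Data;

DataSet ds = new DataSet("SalesData");

// Two DataTables, each defined by its DataColumn collection.
DataTable customers = ds.Tables.Add("Customer");
customers.Columns.Add("CustomerID", typeof(int));
customers.Columns.Add("Name", typeof(string));

DataTable orders = ds.Tables.Add("Order");
orders.Columns.Add("OrderID", typeof(int));
orders.Columns.Add("CustomerID", typeof(int));

// A DataRelation ties the two tables together on CustomerID
// (and creates a matching foreign-key constraint).
ds.Relations.Add("CustomerOrders",
    customers.Columns["CustomerID"],
    orders.Columns["CustomerID"]);

// DataRows are added as arrays of fields matching the column layout.
customers.Rows.Add(1, "Contoso");
orders.Rows.Add(100, 1);

// The whole structure, relationships included, serializes to XML.
ds.WriteXml("sales.xml", XmlWriteMode.WriteSchema);
```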
DataSets and XML schema
• A DataSet has no notion of databases and behaves the same way regardless of where the data
originated.
• A DataSet is a database-like wrapper around an XML document that allows developers to
manipulate that document in a fashion familiar to database developers.
• A DataSet relies on an XML schema document to describe the data it is currently manipulating.
• An XML schema document defines which elements are included in an XML document, the
structure and relationship between those elements, the data types of those elements, and more
detailed information about the document's structure.
• Because the schema is expressed in an industry-standard format (XML Schema), the DataSet
needs to maintain no database-specific information to perform its function.
• The structure of an ADO.NET DataSet can come from two possible places: it can be designed and built at
design time, or the DataSet can take a best guess at runtime through a process called inference.
• The process of inferring schema imposes a significant amount of overhead on the process of loading a
DataSet, and it is entirely dependent on the specific instance of an XML document for which the schema is
inferred.
• Allowing the appropriate data provider to query a specific data store for pieces of schema when filling
a DataTable imposes such a significant performance hit that it can take the underlying data
provider longer to query the system tables for schema than to query and return the actual requested
data.
• Providing the XML schema document at design time allows modifying the schema by hand, should there
be a need, and saves the overhead of runtime inference.
• An ADO.NET DataSet can persist all of its structure and data into a single XML document, providing an
extremely powerful data-access paradigm when dealing with distributed applications.
Data providers
• ADO.NET Framework needed an object or set of objects to serve as the translation of specific databases and
data sources to and from DataSets.
• ADO.NET implementation was developed to place more of the specific implementation details back into the
hands of the developers.
• ADO.NET data providers are the set of objects and interfaces that provide all of the database-specific
implementations used to access data.
• ADO.NET data providers expose the implementation details programmatically, so the developer has more
control over a specific implementation.
• Microsoft implemented the base data provider through six distinct objects: Connection, Command,
Parameter, DataReader, DataAdapter, and Transaction.
• The ADO.NET Framework specifies only the interfaces that make up the base implementation of a data
provider. Each implementation provides its own set of objects that implement these base interfaces.
• Each of the initial data providers supplies object implementations with a unique set of names, but all of them
implement the same set of base interfaces.
• When true database neutrality is a priority, developers can develop against the interfaces themselves and
maintain the ability to plug and play different providers at any time.
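Coding against the base interfaces rather than a concrete provider might look like the following sketch; the connection string and table name are placeholders:

```csharp
using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

// Return a concrete provider behind a neutral interface.
static IDbConnection CreateConnection(bool useSqlServer, string connectionString)
{
    if (useSqlServer)
        return new SqlConnection(connectionString);
    return new OleDbConnection(connectionString);
}

// The calling code never mentions a concrete provider type again.
IDbConnection conn = CreateConnection(
    true, "Server=(local);Database=Northwind;Integrated Security=SSPI");
conn.Open();
IDbCommand cmd = conn.CreateCommand();
cmd.CommandText = "SELECT COUNT(*) FROM Customers";
object customerCount = cmd.ExecuteScalar();
conn.Close();
```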
ADO.NET
• Before manipulating and changing data through a DataSet object, you should
focus on the objects that make up a data provider.
• Each data provider targets a different type of data connection and implements
its own set of objects that expose a standard interface.
• Each provider provides a separate set of objects, each having a unique name
and located within unique namespaces.
• The SQL provider implements a set of objects with names preceded by a SQL
prefix, while the OleDB provider implements a similar set of objects with
names preceded by an OleDB prefix.
Connections
1. The starting point for dealing with ADO.NET data providers is the Connection
object, which provides the basis for all other ADO.NET data provider objects to
acquire and maintain their connections with their associated data stores.
2. Connection objects provided by data providers each provide at least two constructor
overrides.
3. The first constructor configures the Connection object by accepting the connection string
when the object is instantiated, while the second creates an empty object whose
ConnectionString property is set later.
4. Each data provider in the .NET Framework must implement certain interfaces to
make them consistent for all developers, but this does not restrict data provider
vendors from extending the functionality of their providers beyond these interfaces
to provide database-specific functionality and information.
• The SqlConnection object can be created with or without specifying the connection string, but
specifying the connection string at instantiation can simplify the code.
• The SqlConnection object exposes a WorkstationId property that identifies the client workstation
connecting to the data source.
• Different data providers in .NET may have different connection string formats and features. For
example, the SQL Server data provider does not require specifying a specific driver implementation,
while the .NET OLEDB data provider requires specifying an OLEDB provider in the connection
string.
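Both constructor forms, the WorkstationId extension, and the connection-string difference between providers can be sketched as follows; a local SQL Server with a Northwind-style database is assumed:

```csharp
using System;
using System.Data.OleDb;
using System.Data.SqlClient;

// Form 1: supply the connection string at instantiation.
SqlConnection sqlConn = new SqlConnection(
    "Server=(local);Database=Northwind;Integrated Security=SSPI");
sqlConn.Open();

// A provider-specific extension beyond the base interfaces.
Console.WriteLine(sqlConn.WorkstationId);
sqlConn.Close();

// Form 2: create an empty object and set its connection string later.
SqlConnection lateConn = new SqlConnection();
lateConn.ConnectionString =
    "Server=(local);Database=Northwind;Integrated Security=SSPI";

// The OLEDB provider requires an explicit Provider= clause in its string.
OleDbConnection oleConn = new OleDbConnection(
    "Provider=SQLOLEDB;Data Source=(local);" +
    "Initial Catalog=Northwind;Integrated Security=SSPI");
```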
Commands
• ADO.NET data providers use a Command object to execute data source-specific commands against a specific data
source.
• The most common types of commands to be executed against a data source are SQL commands, but a separate
Command object can be implemented whenever this isn't possible.
• The SqlCommand implementation provided in the System.Data.SqlClient provider namespace is used to create a
Command object to execute a single Select statement against a server in the example provided.
• A Command object associates valid commands (and their parameters) with data store connections and possibly
transactions.
• The Connection object exposes a CreateCommand method that returns a new Command object already associated
with that Connection object.
• Command objects in ADO.NET offer no way to view any possible results of executing its command. Viewing and
manipulating the results of executing a query fall to objects such as DataReader or DataSet.
• Command objects and SQL commands in general fall into two basic categories: those that return results and those
that do not.
• A DataReader object provides a very fast, forward-only method for iterating through a view of data, such as is
provided by executing a Command object.
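The relationship between Connection, Command, and DataReader can be sketched as follows, again assuming a hypothetical Northwind-style database:

```csharp
using System;
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection(
    "Server=(local);Database=Northwind;Integrated Security=SSPI");
conn.Open();

// CreateCommand returns a Command already tied to this Connection.
SqlCommand cmd = conn.CreateCommand();
cmd.CommandText = "SELECT CustomerID, CompanyName FROM Customers";

// The Command exposes no results itself; a DataReader iterates them.
SqlDataReader reader = cmd.ExecuteReader();
while (reader.Read())
    Console.WriteLine("{0}: {1}", reader["CustomerID"], reader["CompanyName"]);
reader.Close();
conn.Close();
```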
ExecuteNonQuery Method of a Command Object
The Command object uses this same paradigm to execute queries that return other views of the data,
and also provides an ExecuteNonQuery method that executes the command without returning any data.
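A minimal ExecuteNonQuery sketch, using invented column values against a hypothetical Northwind-style database:

```csharp
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection(
    "Server=(local);Database=Northwind;Integrated Security=SSPI");
conn.Open();

SqlCommand update = new SqlCommand(
    "UPDATE Customers SET City = @city WHERE CustomerID = @id", conn);
update.Parameters.AddWithValue("@city", "Seattle");
update.Parameters.AddWithValue("@id", "ALFKI");

// No result set comes back; only the number of rows affected.
int rowsAffected = update.ExecuteNonQuery();
conn.Close();
```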
The Parameters Collection of a Command Object
One important use of the Parameters collection is to handle any output parameters returned by stored
procedures.
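Reading an output parameter back from a stored procedure might look like this sketch; the procedure itself is hypothetical and shown in the comment:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection(
    "Server=(local);Database=Northwind;Integrated Security=SSPI");
conn.Open();

// Assumes a stored procedure such as:
//   CREATE PROC GetCustomerCount @Count int OUTPUT
//   AS SELECT @Count = COUNT(*) FROM Customers
SqlCommand proc = new SqlCommand("GetCustomerCount", conn);
proc.CommandType = CommandType.StoredProcedure;

SqlParameter outParam = proc.Parameters.Add("@Count", SqlDbType.Int);
outParam.Direction = ParameterDirection.Output;

proc.ExecuteNonQuery();
Console.WriteLine("Customers: {0}", outParam.Value);
conn.Close();
```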
DataReaders and the ExecuteScalar Method
• The Read function of a DataReader moves sequentially from row to row in a forward-only fashion
and returns a single Boolean result indicating whether there was actually another row to retrieve.
• As a DataReader maintains only a copy of a single row in memory at any given point in time, no
programmatic way to determine the total number of rows returned by a Command object when
using a DataReader object exists.
• Two main methods are available to developers for retrieving specific field values out of a
DataReader: using the Item property and passing either the name of the requested column or a
column index, or using a set of methods that each DataReader provides to return typed values from
specific rows given the column index.
• To retrieve a typed value using the second method, you need to know the index of the column;
you cannot index field values by column name (as is possible with the Item property).
• It is possible to retrieve the name and data type of a specific column given an index into the
columns collection using the GetName and GetFieldType methods of the DataReader.
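The two field-access methods, the column-metadata calls, and the ExecuteScalar workaround for the missing row count can be sketched together; the database is again a hypothetical Northwind:

```csharp
using System;
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection(
    "Server=(local);Database=Northwind;Integrated Security=SSPI");
conn.Open();

SqlCommand cmd = new SqlCommand(
    "SELECT CustomerID, CompanyName FROM Customers", conn);
SqlDataReader reader = cmd.ExecuteReader();
while (reader.Read())
{
    // Method 1: the Item property, indexed by name or ordinal.
    string byName = (string)reader["CompanyName"];

    // Method 2: a typed getter, which requires the column ordinal.
    string byOrdinal = reader.GetString(1);

    // Column metadata is also available from the ordinal.
    Console.WriteLine("{0} ({1}) = {2}",
        reader.GetName(1), reader.GetFieldType(1), byOrdinal);
}
reader.Close();

// ExecuteScalar returns the first column of the first row -- the usual
// way to obtain a row count, since a DataReader cannot report one.
SqlCommand countCmd = new SqlCommand("SELECT COUNT(*) FROM Customers", conn);
int total = (int)countCmd.ExecuteScalar();
conn.Close();
```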
DataAdapter objects
• DataAdapter objects provide a means for a disconnected DataSet object to interact with a
database.
• DataAdapter objects map the database into the XML format understood by the DataSet
and map the XML returned by a DataSet into the appropriate Insert, Update, and Delete
Command objects, and set all parameters accordingly.
• DataSet objects provide the capability to have many interrelated tables in a single DataSet
object, and each of these tables can be filled and updated through a separate DataAdapter.
• Each DataAdapter serves as host to four ADO.NET Command objects: SelectCommand,
InsertCommand, UpdateCommand, and DeleteCommand.
• Each DataAdapter exposes its four main Command objects as properties, allowing
developers to preconfigure each of these four Command objects and have complete
control over a particular set of data's interaction with a database.
• The basic task of a DataAdapter object is to select data out of a database and populate a
DataSet.
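Filling a DataSet and preconfiguring one of the adapter's four Command properties might look like this sketch (table, column, and parameter names are assumptions):

```csharp
using System.Data;
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection(
    "Server=(local);Database=Northwind;Integrated Security=SSPI");

SqlDataAdapter adapter = new SqlDataAdapter(
    "SELECT CustomerID, CompanyName FROM Customers", conn);

// Preconfigure the UpdateCommand property; the last argument of each
// Parameters.Add call names the DataSet column the parameter maps to.
adapter.UpdateCommand = new SqlCommand(
    "UPDATE Customers SET CompanyName = @name WHERE CustomerID = @id", conn);
adapter.UpdateCommand.Parameters.Add("@name", SqlDbType.NVarChar, 40, "CompanyName");
adapter.UpdateCommand.Parameters.Add("@id", SqlDbType.NChar, 5, "CustomerID");

// Fill opens and closes the connection itself.
DataSet ds = new DataSet();
adapter.Fill(ds, "Customers");

// Work with the disconnected data, then push the changes back.
ds.Tables["Customers"].Rows[0]["CompanyName"] = "Contoso";
adapter.Update(ds, "Customers");
```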
CommandBuilder
• ADO.NET DataAdapter objects provide the ability to configure Command objects in three ways:
manually, using CommandBuilder object, or using wizards available in Visual Studio .NET.
• CommandBuilder object generates appropriate insert, update, and delete commands and all
associated Parameter objects based on the provided Select statement in the SelectCommand
object.
• CommandBuilder works only with the most basic SQL statements and does not work when using
stored procedures or advanced SQL in the Select statement.
• Using CommandBuilder can be slow and resource-intensive as it must parse the Select statement
and dynamically decide on the appropriate statements to perform for insert, update, and delete
actions.
• Wizards in Visual Studio .NET allow you to graphically configure a DataAdapter, generate the
baseline "plumbing" code required for a specific database interaction and then go tweak the
resulting code to gain more control over the specific interaction and SQL statements.
• Visual Studio .NET data wizards and tools provide a better picture of the type of options
available to you when using them with ADO.NET.
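The CommandBuilder approach, sketched against the same hypothetical Northwind database:

```csharp
using System.Data;
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection(
    "Server=(local);Database=Northwind;Integrated Security=SSPI");
SqlDataAdapter adapter = new SqlDataAdapter(
    "SELECT * FROM Customers", conn);

// The builder parses the Select statement at runtime and derives the
// matching Insert, Update, and Delete commands and their Parameters.
SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

DataSet ds = new DataSet();
adapter.Fill(ds, "Customers");
ds.Tables["Customers"].Rows[0]["ContactName"] = "New Contact";
adapter.Update(ds, "Customers");   // uses the generated commands
```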
TableMappings
• Table mapping is a feature that allows developers to associate the table and column names
exposed by a DataSet with those actually used by the data store.
• The DataAdapter maintains this set of table and column mappings.
• This feature enables DataAdapter developers to present a separate, friendlier set of table and
column names to the front-end developers and users of the DataSet.
• All mappings are maintained in a single, consistent place.
• The DataAdapter serves as the "bridge" between a DataSet and the database.
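A table-mapping sketch, assuming a data store with legacy-style names such as tblCust and cust_id:

```csharp
using System.Data;
using System.Data.Common;
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection(
    "Server=(local);Database=Sales;Integrated Security=SSPI");
SqlDataAdapter adapter = new SqlDataAdapter(
    "SELECT cust_id, cust_nm FROM tblCust", conn);

// "Table" is the default source name used when Fill is called without
// an explicit table name; map it and its columns to friendlier names.
DataTableMapping map = adapter.TableMappings.Add("Table", "Customers");
map.ColumnMappings.Add("cust_id", "CustomerID");
map.ColumnMappings.Add("cust_nm", "Name");

DataSet ds = new DataSet();
adapter.Fill(ds);   // produces a DataTable named Customers
```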
DataSet
• The DataSet is a database-neutral, disconnected collection of data that can be used for a wide variety
of client-side purposes.
• It provides a simple, reusable data structure that can be utilized to solve a large number of
development problems.
• It is a key component in ADO.NET, and it represents an in-memory cache of data retrieved from a
data source.
• It consists of a collection of DataTable objects that represent tables of data in a relational database.
• It can be used to hold the result of a SQL query or to represent a single table in a database.
• It can be used to update data in the database by using a DataAdapter to update the data source with
changes made to the DataSet.
Creating DataTables
• A single DataSet can contain many DataTable objects, and a DataSet provides the capability to enforce relationships between any two DataTables.
• The Tables collection of a DataSet contains each of the DataTables within the DataSet.
• The structure of each DataTable is defined by a collection of DataColumn objects.
• This collection can be accessed directly from a DataTable object using the Columns property.
• Each DataColumn object exposes a large amount of information defining the structure of that particular column.
• By manipulating a DataColumn object, you can configure exactly how a column of data behaves in a particular DataTable.
• The basic column features, such as name, data type, length, and initial value are easily available, as well as a host of more advanced properties.
• Properties such as AutoIncrement, ReadOnly, Caption, and Unique provide powerful features that you can use to make a DataTable behave more like
a client-side database.
• The DataColumn object has a ColumnMapping property, which allows you to control how a specific column is added to the underlying XML.
• The available ColumnMapping types are Element, Attribute, Hidden, and SimpleContent.
• These settings affect how a DataColumn is written out when using the WriteXml method of a DataSet to write its contents out as XML.
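A minimal sketch of configuring DataColumn behavior (the table and column names here are illustrative, not from the original listing):

```csharp
using System;
using System.Data;

class ColumnDemo
{
    static void Main()
    {
        var orders = new DataTable("Orders");

        // An auto-incrementing, read-only, unique key column.
        DataColumn id = orders.Columns.Add("OrderID", typeof(int));
        id.AutoIncrement = true;
        id.ReadOnly = true;
        id.Unique = true;
        id.Caption = "Order #";   // friendly name for bound UI controls

        // A column with a default value, written out as an XML attribute.
        DataColumn status = orders.Columns.Add("Status", typeof(string));
        status.DefaultValue = "Open";
        status.ColumnMapping = MappingType.Attribute;

        orders.Rows.Add(orders.NewRow());
        Console.WriteLine(orders.Rows[0]["OrderID"]); // auto-numbered by the column
    }
}
```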
Programmatically creating a DataSet
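The original notes accompanied this heading with a code listing that is not reproduced here; a minimal sketch of building a two-table DataSet with an enforced relationship might look like this (all names are illustrative):

```csharp
using System.Data;

class BuildDataSet
{
    static void Main()
    {
        var ds = new DataSet("Store");

        DataTable customers = ds.Tables.Add("Customers");
        customers.Columns.Add("CustomerID", typeof(int));
        customers.Columns.Add("Name", typeof(string));
        customers.PrimaryKey = new[] { customers.Columns["CustomerID"] };

        DataTable orders = ds.Tables.Add("Orders");
        orders.Columns.Add("OrderID", typeof(int));
        orders.Columns.Add("CustomerID", typeof(int));

        // Enforce the parent/child relationship between the two tables.
        ds.Relations.Add("CustomerOrders",
            customers.Columns["CustomerID"],
            orders.Columns["CustomerID"]);

        customers.Rows.Add(1, "Contoso");
        orders.Rows.Add(100, 1);   // must reference an existing CustomerID
    }
}
```

Because of the DataRelation, adding an order row with a CustomerID that has no matching parent row would throw, giving the client-side cache database-like integrity.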
Filling DataSets from files
• A DataSet can also load its schema and data from files or from any other source that
serves XML.
• A DataSet can accept an XML document containing only data and then "infer" a schema from it, or it can
accept both data and schema to build its structure and populate it with data.
• The DataSet can read in an XML document from a file and have the DataSet perform all the work of
deciding on the best schema with a single line of code.
• The amount of work involved in "inferring" the structure of a DataSet becomes apparent when the
inferred schema is written back out and examined.
• After the structure of the first DataSet has been inferred, the WriteXmlSchema method is used to
write it back out to a file.
• After the second DataSet has been created, the ReadXmlSchema method is used to configure the
second DataSet, and the ReadXml method is used to read in the same file.
• The Visual Studio .NET design time environment makes this type of DataSet configuration trivial.
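The inference-then-explicit-schema round trip described above can be sketched as follows (the file names are hypothetical):

```csharp
using System.Data;

class XmlDemo
{
    static void Main()
    {
        // First DataSet: read data only and let ADO.NET infer the schema.
        var first = new DataSet();
        first.ReadXml("data.xml");              // hypothetical XML data file
        first.WriteXmlSchema("inferred.xsd");   // persist the inferred schema

        // Second DataSet: load the schema explicitly, then the same data.
        // No inference work is needed this time.
        var second = new DataSet();
        second.ReadXmlSchema("inferred.xsd");
        second.ReadXml("data.xml");
    }
}
```

Loading an explicit schema first is generally both faster and safer than inference, since inference has to guess at types and table boundaries from the data alone.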
Using DataSets with DataAdapters
• The most basic way to fill a DataSet is to call a DataAdapter's Fill method.
• A single DataAdapter is usually used for filling a single DataTable within a
DataSet.
• The DataAdapter's Update method is used for synchronizing modified data back into
the database.
• One of the key pieces of functionality a DataAdapter provides is mapping all
modifications to a DataSet back to the appropriate Command objects encapsulated
by the DataAdapter.
• When reconciling data with the database, only the rows that need to be updated,
inserted, or deleted are of interest.
• The DataSet offers the ability to easily extract all modified rows into a new
DataSet, a new DataTable, or a new collection of DataRows through the use of the
GetChanges method on a DataSet.
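The Fill / GetChanges / Update cycle might be sketched like this in C# (the connection string and table are hypothetical, and a SqlCommandBuilder is used here to generate the update commands, which requires the SELECT to include the table's primary key):

```csharp
using System.Data;
using System.Data.SqlClient;

class UpdateDemo
{
    static void Main()
    {
        // Hypothetical connection string; CustomerID is assumed to be the primary key.
        var adapter = new SqlDataAdapter(
            "SELECT CustomerID, Name FROM Customers",
            "Server=.;Database=Demo;Integrated Security=true");

        // Generates the INSERT/UPDATE/DELETE commands the adapter will map changes to.
        var builder = new SqlCommandBuilder(adapter);

        var ds = new DataSet();
        adapter.Fill(ds, "Customers");

        ds.Tables["Customers"].Rows[0]["Name"] = "Updated Name";

        // Extract only the modified rows, then reconcile just those with the database.
        DataSet changes = ds.GetChanges();
        adapter.Update(changes, "Customers");
        ds.AcceptChanges();   // mark the original DataSet's rows as clean
    }
}
```

Passing only the GetChanges result to Update keeps the reconciliation payload down to the rows that actually need INSERT, UPDATE, or DELETE statements.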
Tracking and maintaining relationships
• The DataView object helps minimize excess calls to the database by re-sorting and filtering data that is already held on the client.
• It is possible to utilize the DataView object to allow multiple clients to view the same DataSet in
different ways.
• Each DataTable within a DataSet contains a default DataView object that is what all controls and tools
bind to by default when binding to a DataTable.
• DataView provides the ability to sort and filter the data in a DataTable.
• DataViews only allow filtering complete rows and cannot be used to limit the number of columns
available to a client.
• DataView object can filter rows of data based on their RowState property.
• A DataView's RowStateFilter accepts additional DataViewRowState values beyond the RowState values exposed by the rows of a DataTable.
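A short sketch of two clients viewing the same DataTable differently, one by a row filter and sort, the other by row state (the table and data are illustrative):

```csharp
using System.Data;

class ViewDemo
{
    static void Main()
    {
        var people = new DataTable("People");
        people.Columns.Add("Name", typeof(string));
        people.Columns.Add("Age", typeof(int));
        people.Rows.Add("Ann", 34);
        people.Rows.Add("Bob", 19);
        people.AcceptChanges();          // mark existing rows as unchanged
        people.Rows.Add("Cal", 42);      // this row's RowState is Added

        // Two views over the same table, for two different clients.
        var adults  = new DataView(people) { RowFilter = "Age >= 21", Sort = "Name" };
        var pending = new DataView(people) { RowStateFilter = DataViewRowState.Added };

        // adults sees Ann and Cal; pending sees only the newly added Cal.
    }
}
```

Note that RowFilter restricts which rows appear but, as the bullets above state, every column of each matching row remains visible to the client.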
Visual Studio .NET Database Tools
• Developers can use these tools to view and edit the data within data sources to simplify debugging and monitoring
of database applications without having to use a separate tool such as SQL Server Enterprise Manager.
• The Visual Studio .NET Server Explorer window lets developers manipulate database objects and remote servers
directly from the Visual Studio .NET IDE.
• The Server Explorer window has two sections: Data Connections and Servers.
• The Data Connections section lets developers add and reuse regularly used data connections, and the Servers
section allows developers to connect to remote servers and program against a variety of services that may be
available, such as Web services.
Adding connections
• To add a connection to the Server Explorer, right-click the node and select Add Connection from the pop-up menu.
• After clicking Add Connection, the Data Link Properties dialog box opens, which provides all the options available
for adding a valid data connection using a variety of available drivers and data sources.
• With a connection available in the Server Explorer, developers can easily drag it onto forms or
components, and the Visual Studio .NET IDE automatically adds the appropriate code to configure the connection
within that item of a project.
• The Data Link Properties dialog box requires selecting the appropriate provider and connection information, such as
server name, database name, and login credentials, to properly configure a database connection in Visual
Studio .NET.
• After the settings have been properly configured, developers can access this connection information and manipulate
database objects underneath this connection if the data source supports this, regardless of the Visual Studio .NET
solution that is currently open.
• This feature allows developers to add connections to their most commonly used databases and, if required, easily
share them across solutions without having to reconnect or reconfigure.
Manipulating database tables in Visual Studio .NET
• By using the Server Explorer window to expand a specific database connection, developers can navigate to the Tables node to
gain access to a list of all existing user tables in the database.
• By right-clicking the Tables node, developers can select the New Table option from the resulting pop-up menu to add a new
table to the database.
• By right-clicking an existing table and selecting the Design Table option, developers can modify the structure of existing data
tables.
• The table designer provided by Visual Studio .NET allows developers to modify the structure of tables by providing complete
control over a table's column information.
• The column information for a table can be easily edited in the grid provided by the table designer.
• Other features available from within the table designer that can be accessed from the Visual Studio .NET toolbar when the
table designer is open include the ability to modify column indexes and foreign key relationships.
• The Server Explorer window provides developers with the ability to view the data in a table, export all the data to a file,
generate the appropriate SQL script to re-create the table on another database, and add a trigger to the database—all from
within the standard development environment.
• Each of these functions is available by right-clicking the appropriate table node in the Server Explorer window.
Manipulating database views in Visual Studio .NET
• Visual Studio .NET utilizes a full-featured, built-in query analyzer when working with database views.
• The query analyzer allows construction or editing of SQL statements in a manner similar to using the query analyzer in SQL Server Enterprise
Manager.
• Visual Studio .NET provides a powerful query builder for graphically designing and constructing SQL statements and views.
• The query builder can build queries graphically or by manually keying the required SQL statements.
• The query builder can verify the validity of the SQL statements against the appropriate data store, and even execute the statement and preview
the results.
• The query builder is broken into four vertically positioned panes: Diagram Pane, Grid Pane, SQL Pane, and Results Pane.
• The Diagram Pane provides a graphical representation of the tables included in a SQL statement and the joins between them.
• The Grid Pane provides the ability to see all the columns selected to be returned as a result of executing the current query.
• The SQL Pane displays the current SQL statement as defined by your modifications to the Diagram and Grid panes.
• The Results Pane displays the results of a query executed after clicking the Run button on the toolbar.
• The query builder integrates tightly with the query analyzer in Visual Studio .NET to allow the quick building of complex stored procedures
made up of a number of logical "blocks" of SQL statements.
Manipulating database stored procedures in Visual Studio .NET
• Visual Studio .NET provides a built-in query analyzer that provides a powerful editing environment for coding
complex SQL statements required in stored procedures.
• The built-in query analyzer has a lot of the same functionality as that provided by the commonly used query
analyzers that ship with recent versions of Microsoft SQL Server.
• The Visual Studio .NET query analyzer can integrate with the SQL query builder to build or modify SQL
statements in logical "blocks" of SQL.
• When editing a stored procedure, you have a simplified interface that provides the capability to wrap SQL
statements into logical blocks that can then be edited by using the query builder.
• This integration allows logical SQL blocks to be verified and tested independently in an intuitive interface to
simplify the creation of the entire stored procedure.
• The capability to debug SQL Server stored procedures from directly within the IDE is another powerful new
feature available from within Visual Studio .NET when working with stored procedures.
• For large applications that are distributed across multiple tiers, this capability to debug from line to line across
multiple projects and down into the underlying stored procedures provides a powerful solution to solve the most
difficult development problems.