MySQL Workbench: A Data Modeling Guide for Developers and DBAs
Modern, successful companies leave nothing to chance when it comes to the definition, design, and
implementation of their data. This equates to a professional and process-driven approach to the creation
and management of the data that flows through their business systems – a process led by data
management professionals who take a model-driven approach to data and employ the right tools to
ensure that data is properly captured and administered.
This paper looks at the various types of data that modern businesses need to manage, examines the
reasons why a model-driven approach to data management is necessary, and outlines the benefits such
an approach provides. It also highlights how the MySQL Workbench product from MySQL can be an
indispensable aid in the hands of experienced data modelers, developers, and DBAs who are tasked with
managing the complex data management infrastructure of a dynamic and growing business.
While finer-grained classifications exist, the data that a modern business must manage generally falls
into the following categories:
1. Operational data – normally transactional processing data that exists in the form of new/updated
customer orders and other data that supports the products and services that companies sell. This
data is generally found in relational databases that support transactional data flows.
2. Business Intelligence data – exists in the form of current and past operational data that is being
used to understand things like customer purchasing trends, the impact of marketing programs, and
more. This data typically resides in staging areas known as data warehouses or analytic data
stores, and is separated from the operational data to improve system response times for those
systems.
3. Historical data – represents the historical activity of business systems or audit trails of data usage
throughout an organization. It differs from business intelligence data in that it is seldom accessed
and is primarily kept online to meet government or industry compliance regulations.
4. Integration data – used to manage the flow of data from operational systems to analytic or
historical data stores. It most often defines the process of how transactional data is transformed
into business intelligence data (a brief sketch of such a transformation appears after this list).
5. Master data – equates to “reference data”. Reference data does not depend on other data
elements for its identity or meaning, and usually serves as consensus data that is shared
consistently across systems.
6. Metadata – is “data about data” and serves as the definition point of data elements along with
describing how they should be used.
7. Unstructured data – is typically handled in content management systems (although some
organizations are moving this into traditional RDBMS engines), which manage the evolutionary life
cycle of digital information (video files, documents, etc.).
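To make the integration category concrete, the following is a minimal sketch of the kind of
transformation it describes – moving transactional rows into an analytic summary table. The schema
and table names (app_db.orders, analytics.daily_sales) are hypothetical, not taken from any particular
system:

    -- Hypothetical nightly load: summarize yesterday's operational orders
    -- into a business intelligence table kept in a separate analytic schema.
    INSERT INTO analytics.daily_sales (sales_date, orders_placed, revenue)
    SELECT DATE(ordered_at), COUNT(*), SUM(total_amount)
    FROM app_db.orders
    WHERE ordered_at >= CURRENT_DATE - INTERVAL 1 DAY
      AND ordered_at <  CURRENT_DATE
    GROUP BY DATE(ordered_at);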
Again, there may be more narrowly defined classifications of data, but the above represents the bulk of
what today's enterprise tackles in the area of data management.
Models are the best means for representing the definitions of the data elements that support the various
data stores found throughout an enterprise. This being the case, it is not surprising that most IT
organizations utilize practices such as entity relationship diagramming (ERD) or other forms of modeling
to capture and preserve their data structures. The prevalence of model-driven data management is borne
out by a late-2005 North American developer study jointly conducted by IDC and InfoWorld, which
showed that most organizations look to modeling tools to help capture and implement their data.
It is interesting to note that even small companies utilize a model-driven approach to managing their
data, and nearly three-quarters of large companies are either using a model-driven approach now or will
be shortly.
Such widespread adoption implies there are real and tangible benefits to a model-driven approach to
data management. It is helpful to briefly outline these advantages and then drill down into each one to
understand why a model-driven approach to data management is preferred over any other:
• Metadata management – ensures data consistency, enforces standards of data elements used
throughout an organization, and assists in identifying and cataloging elements for data governance
• Rapid application delivery – reduces the time it takes to craft and implement a new physical data
design and also the application that makes use of the underlying database
• Change management – helps to manage change between different iterations of data designs
• Packaged application management – removes the ‘black box’ feel of packaged applications by
graphically rendering the heart of any application, which is the database
• Reporting and communication – greatly simplifies the communication and reporting of new and
modified data designs
• Performance management – helps to more quickly pinpoint design flaws in data designs that
contribute to inefficient response times in actual data-driven systems
Each one of these areas is explored in more detail in the sections that follow.
There is really no substitute for what a model-driven approach offers when it comes to quickly delivering
good physical database designs. Many times a database design begins in a conceptual form, which is
then progressively refined into a logical design and finally a physical design that can be implemented on
a target database server.
In addition to modeling standard data-related objects (tables, indexes, etc.), code may also be included in
certain physical models so that everything that touches the data is kept in one place. This
equates to objects such as stored procedures, triggers, and more being included in a model. In addition,
the security aspects of a design can also be included so that user login details and individual object
permissions are recorded. Such capabilities (all of which are included in MySQL Workbench) greatly
enhance a developer's ability to move a project forward, as everything needed to create and manage
the data aspects of an application resides in the model-driven tool.
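As a brief illustration of the kind of code a physical model can carry, the sketch below shows an audit
trigger and a restricted reporting login stored alongside the tables they relate to. All names here
(orders, order_audit, report_user, app_db) are hypothetical:

    -- Hypothetical audit trigger kept in the model next to its tables:
    -- record every change to an order's status.
    CREATE TABLE order_audit (
      audit_id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
      order_id   INT UNSIGNED NOT NULL,
      old_status VARCHAR(20),
      new_status VARCHAR(20),
      changed_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    ) ENGINE=InnoDB;

    DELIMITER $$
    CREATE TRIGGER trg_order_status_audit
    AFTER UPDATE ON orders
    FOR EACH ROW
    BEGIN
      IF NOT (OLD.status <=> NEW.status) THEN
        INSERT INTO order_audit (order_id, old_status, new_status)
        VALUES (OLD.order_id, OLD.status, NEW.status);
      END IF;
    END$$
    DELIMITER ;

    -- Security details can live in the same model: a reporting login
    -- restricted to read-only access on the application schema.
    CREATE USER 'report_user'@'%' IDENTIFIED BY 'change_me';
    GRANT SELECT ON app_db.* TO 'report_user'@'%';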
All good modeling tools, like MySQL Workbench, support forward engineering of physical designs, which
means that all SQL code used to create a database and its dependent objects is automatically written and
runs right the first time. This eliminates the error-prone and time-consuming process of a developer or DBA
hand coding database creation syntax.
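For instance, forward engineering a small two-table design might emit a script along the following lines.
The schema and tables shown are hypothetical examples, not output copied from the tool:

    -- Illustrative forward-engineered creation script:
    CREATE SCHEMA IF NOT EXISTS app_db;
    USE app_db;

    CREATE TABLE customers (
      customer_id INT UNSIGNED NOT NULL AUTO_INCREMENT,
      name        VARCHAR(100) NOT NULL,
      email       VARCHAR(255) NOT NULL,
      PRIMARY KEY (customer_id),
      UNIQUE KEY uq_customers_email (email)
    ) ENGINE=InnoDB;

    CREATE TABLE orders (
      order_id     INT UNSIGNED NOT NULL AUTO_INCREMENT,
      customer_id  INT UNSIGNED NOT NULL,
      status       VARCHAR(20) NOT NULL DEFAULT 'new',
      ordered_at   DATETIME NOT NULL,
      total_amount DECIMAL(10,2) NOT NULL,
      PRIMARY KEY (order_id),
      CONSTRAINT fk_orders_customer FOREIGN KEY (customer_id)
        REFERENCES customers (customer_id)
    ) ENGINE=InnoDB;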
Perhaps one of the most difficult challenges facing data management professionals in the area of change
management is successfully performing complex alterations to existing physical database designs.
Performing the detailed impact analysis of proposed changes and preparing the database change scripts
can be a long and error-ridden task, which is unfortunate as mistakes made during database alterations
can be very costly.
Fortunately, most good modeling tools like MySQL Workbench alleviate such problems as they contain
synchronization utilities that allow a DBA or developer to make changes to a physical data model and then
sync those changes with an existing physical database. The tool does all the work of performing the
impact analysis and generating the right database alteration code, with all changes being easily previewed
before they are executed against the target database.
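For example, if a nullable column and an index were added to a table in the model, the previewed
alteration script might look something like this (table and column names hypothetical):

    -- Illustrative synchronization output: bring the live database
    -- in line with the updated model.
    ALTER TABLE orders
      ADD COLUMN shipped_at DATETIME NULL AFTER ordered_at,
      ADD INDEX idx_orders_status (status);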
In most tools, the reverse can be done as well – a data model can be altered to reflect changes that have
been made to an existing physical database. This capability is important because emergency changes
must often be made directly to a database; if models are being used for change management and revision
purposes, they need to be updated to reflect what is actually running in the IT infrastructure.
A model-driven approach can greatly simplify the management of packaged applications. Models
are the perfect way to visually understand the data relationships and data definitions of a complex
are the perfect way to visually understand the data relationships and data definitions of a complex
database structure. Via a reverse-engineering utility found in nearly all good modeling tools, a DBA or
developer can quickly visualize the inner workings of a packaged application's database and understand
how its tables, columns, and relationships fit together.
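Under the covers, a reverse-engineering pass gathers the same kind of schema metadata a DBA could
query by hand; for instance (schema and table names hypothetical):

    -- The kind of metadata a reverse-engineering pass collects:
    SELECT table_name, engine, table_rows
    FROM information_schema.tables
    WHERE table_schema = 'app_db';

    SHOW CREATE TABLE app_db.orders;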
For reporting and communication, models again come to the rescue, as they are a perfect way for both
technical and non-technical people to grasp how data is defined and how it can be accessed. Good
modeling tools like MySQL
Workbench can help on this front as they all offer the ability to create and export picture files of model
designs, and most provide reporting facilities that create Web-based and/or text-based reports that break
down large models into tabular report formats that are easily read and navigated.
In the pursuit of better overall database performance, many professionals ignore what is perhaps the
number one contributor to excellent RDBMS speed – the physical database design. If a database runs
slowly, those implementing it often add more hardware, but throwing hardware at a bad design never
works in the long run. In the short term things may appear to improve, and if the database is relatively
static in nature they may remain that way. But if the database is dynamic and the data/user load
continues to grow, the situation will slide back to where it once was. Instead, the physical design of the
database should be interrogated for performance flaws and tuning opportunities. The reason is
foundational: if the foundation is flawed, the house must be put in order at that level before anything
else is done.
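As a simple, hypothetical illustration of fixing the design rather than the hardware: a query that scans an
entire table usually points to a missing index, and adding the index at the design level cures every query
of that shape rather than one symptom:

    -- A full table scan on a large table usually signals a design gap:
    EXPLAIN SELECT * FROM orders WHERE customer_id = 42;
    -- (access type 'ALL': every row is read)

    -- Fix the design instead of adding hardware:
    ALTER TABLE orders ADD INDEX idx_orders_customer (customer_id);

    EXPLAIN SELECT * FROM orders WHERE customer_id = 42;
    -- (access type 'ref' via idx_orders_customer: only matching rows are read)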
One of the keys to understanding the discipline of performance monitoring is this: When a DBA monitors
a database for performance, they are really validating their physical design implementation. If the
performance monitor they choose to use is blasting them with flashing lights, alarm bells, and pager alerts,
it's probably because their physical design is failing. If all is quiet in the performance monitor, then their
physical design is likely sound. Some will argue that SQL code is the number one factor in database
performance, but in reality it is the physical design (code comes next).
Creating robust, efficient physical designs can be difficult and intricate work. IT professionals need to arm
themselves with some serious power tools that have the capability to slice through the difficulties involved
in building and retrofitting complex physical database designs. Long gone are the days when a DBA or
modeler could handle most of their work with a SQL query interface and a drawing tool. Today, relational
databases are just too robust and contain too many complexities for such primitive aids.
At a minimum, those creating databases will need two things flanking both ends of their arsenal: a solid
data modeling tool and a robust performance monitoring product. As has already been mentioned,
performance monitoring is really the validation of a database's physical design. When foundational cracks
are identified with the monitor, DBAs and developers will need a high-quality design tool to aid in rectifying
the situation – one that makes quick work of sometimes complicated schema changes.
MySQL Workbench’s interface and automated processes ensure out-of-the-box success for a variety of
audiences within modern enterprises, including database administrators, application developers, data
architects, and IT management. The product runs on Windows, Linux, and Mac OS X, so users can design
their databases from all the popular desktop operating systems. A quick tour of MySQL Workbench’s
feature set showcases how the tool supplies the benefits previously discussed in the area of model-driven
data management.
MySQL Workbench allows a developer or DBA to create one or multiple models within its interface and
supplies different views of the objects (tables, views, stored procedures, etc.) being designed.
In addition, MySQL Workbench has a number of other aids that help DBAs and developers quickly create
database designs right the first time. A Model Validation utility checks a data model for any possible
mistakes and reports all found issues back to the user. For large models that are difficult to navigate, a
Zoom feature allows a data modeler to easily zoom in and out to get either a bird’s eye view of an entire
data model or just focus on one specific area. To locate various objects (tables, column names, etc.) in a
large model, an advanced Find utility locates all occurrences of whatever search criteria the user supplies,
with the results providing point-and-click navigation to whatever is selected in the output.
Finally, a variety of other helpful functions exist in the tool, such as support for different modeling
notations, an Autolayout function that automatically arranges tables on a diagram, and a scripting
facility that lets advanced users extend the tool via Lua.
To help DBAs and developers with change management, MySQL Workbench offers a Synchronization
utility that compares a data model against a target MySQL server and performs a synchronization between
the two. Using MySQL Workbench, a DBA or developer first connects to a target MySQL server, and then
the tool compares all aspects of the currently used model with the physical MySQL database. The
Synchronization utility then presents a graphical display of all the differences that it found and lets the
developer or DBA decide what they want to do next.
MySQL Workbench provides three options for any found differences between a model and a physical
database:
1. Ignore differences
2. Synchronize the model with the physical database server
3. Synchronize the physical database server with the model
MySQL Workbench allows a user to make such decisions on either a global or per-object basis, so
everything can be handled in the exact way a DBA desires. Once the synchronization operation is
complete, a DBA can save the MySQL Workbench model to preserve versions of how the model and
database looked at a particular point in time.
6 Conclusion
Modern businesses know the value of using a model-driven approach to manage the definitions and designs
of the data used in their key production systems. Models are second to none when it comes to quickly
understanding, organizing, and managing both custom-built and packaged data-driven applications.
To help data professionals tackle the challenges that come with designing and understanding complex
databases, MySQL offers MySQL Workbench, which helps DBAs and developers both quickly create new
MySQL databases and manage the lifecycle of existing MySQL databases with change management and
reporting functions. With MySQL Workbench, the productivity of a database professional is increased as the
tool eliminates the complex and error-prone processes of manually performing the previously mentioned tasks,
with the end result being a quicker and more efficient delivery of MySQL-based applications.
In addition, other white papers are available from MySQL on both business (TCO, etc.) and technical topics
(storage engines, etc.) – see http://www.mysql.com/why-mysql/white-papers/ for free paper downloads.