Part 1 Section F - Technology and Analytics

Study Guide
Section F: Technology and Analytics

Data are available in greater quantities and forms than ever before, and advances in technology have made it possible for organizations to use data more effectively
for planning and decision making. The knowledge needed for effective use of data includes an understanding of how information systems are designed, how they
operate, and how they are managed. The skills needed to gather, analyze, report, and use data for decision making are increasingly vital for finance professionals to
add value in their organizations.

Topics covered in this section include information systems, data governance, technology-enabled finance transformation, data analytics, business intelligence, data
mining, analytical tools, and visualization.
As indicated in the Content Specification Outlines, candidates are assumed to have an understanding of basic statistics, including measures of central tendency and
dispersion.

Section F.1 Information Systems


The candidate should be able to:

a. Identify the role of the accounting information system (AIS) in the value chain.
A. The primary role of the accounting information system (AIS) is to provide reliable and timely information to decision makers both inside and outside of
the organization. The AIS provides this information in the form of official financial statements or as performance reports for internal users.
B. The AIS adds value by providing necessary information that is used for analysis, evaluation, regulation, and strategic decision making.
b. Demonstrate an understanding of the accounting information system cycles, including revenue to cash, expenditures, production, human resources and
payroll, financing, and property, plant, and equipment, as well as the general ledger (GL) and reporting system.
A. The Revenue to Cash Cycle refers to the process of taking orders, shipping products or delivering services, billing customers, and collecting cash from
sales. The relevant records or documents for this cycle are customer purchase orders, sales orders, picking tickets, shipping documents, invoices, and
cash receipts.
B. The Expenditure Cycle is the process of placing orders, receiving shipment of products or delivery of services, approving invoices, and making cash
payments. The relevant records or documents for this cycle are purchase requisitions, purchase orders, receiving reports, and invoices.
C. The Production Cycle is the process by which materials are converted into finished goods. The relevant records or documents for this cycle are cost
accounting reports, bills of materials, customer orders, production schedules, production orders, material requisitions, move tickets, operations reports,
job-time tickets, and cost of goods manufactured reports.
D. The Human Resources and Payroll Cycle is the process of recruiting, interviewing, and hiring personnel, paying employees for their work, promoting
employees, and finalizing employees’ status upon retirement, firing, or voluntary termination. The relevant records or documents for this cycle are
master payroll files, time reports, hiring, promotion, transfer, and firing records, tax and insurance rate records, and individual employment records with
data such as withholdings and deductions.
E. The Financing Cycle is the process of obtaining funding, through debt or equity, to run an organization’s activities and to purchase PPE; servicing the
financing; and ultimately repaying financial obligations. The relevant records and documents for this cycle include cash budgets, debt instrument
records, equity holding records, and repayment schedules.
F. The Property, Plant, and Equipment Cycle is the process of acquiring resources (e.g., land, buildings, and machinery) needed to enable an organization’s
business activities. The relevant records and documents for this cycle are acquisition records, depreciation schedules, and disposal reports.
G. The General Ledger and Reporting System is the process of recording, classifying, and categorizing an organization's economic transactions and
producing summary financial reports. The relevant records and documents for this system are the general and subsidiary ledgers, financial statements,
and managerial reports.
c. Identify and explain the challenges of having separate financial and nonfinancial systems.
A. The primary challenge of separate financial and nonfinancial systems is data maintenance—that is, making sure the data are accurately linked in both
systems. When the two systems are separate, the data must be reconciled to make sure they are measuring the same thing. If the data the systems draw
upon are not located in the same place or database, extensive controls must be created and maintained in order to avoid costly errors and inconsistencies.
d. Define enterprise resource planning (ERP) and identify and explain the advantages and disadvantages of ERP.
A. ERP is the integrated management of core business processes. ERP brings together business functions such as inventory management, operations,
accounting, finance, human resources, and supply chain management.
B. Advantages of ERP include the availability of real-time data, wide distribution of information, single system learning, and lower operational costs.
C. Disadvantages of ERP include high initial monetary, implementation, and training costs.
e. Explain how ERP helps overcome the challenges of separate financial and nonfinancial systems, integrating all aspects of an organization's activities.
A. ERP enables a single information system to provide both financial and nonfinancial information to users. This reduces the errors that can arise when
different systems draw on different information sources. For example, a customer relationship management (CRM) system is a nonfinancial application
that can draw data from the same underlying information system as the AIS, which processes financial data about sales activity. An ERP can link the CRM
system to the AIS to reduce errors and increase information usefulness.
f. Define relational database and demonstrate an understanding of a database management system.
A. A relational database is a formally described set of data tables that recognizes relationships among data items. Each row of a table has a primary key, or a
unique identifier, that can be used to link tables together.
B. A database management system is the interface or program between a company's database and the application programs that access the database. The
DBMS defines, reads, manipulates, updates, and deletes data in a database. It optimizes how data in databases are stored and retrieved, and facilitates
an organization's administrative operations.
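For illustration, here is a minimal sketch of a relational database and a DBMS at work, using Python's built-in sqlite3 module; the table names, columns, and sample rows are hypothetical.

```python
import sqlite3

# An in-memory relational database; SQLite acts as the DBMS.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Each table row has a primary key; customer_id in Orders links
# the two tables together (hypothetical schema).
cur.execute("""CREATE TABLE Customers (
                   customer_id INTEGER PRIMARY KEY,
                   name        TEXT)""")
cur.execute("""CREATE TABLE Orders (
                   order_id    INTEGER PRIMARY KEY,
                   customer_id INTEGER REFERENCES Customers(customer_id),
                   amount      REAL)""")

# The DBMS defines, reads, manipulates, updates, and deletes the data.
cur.execute("INSERT INTO Customers VALUES (1, 'Acme Co.')")
cur.execute("INSERT INTO Orders VALUES (100, 1, 2500.00)")

# The primary key relationship lets related tables be joined for reporting.
cur.execute("""SELECT c.name, o.amount
               FROM Customers c JOIN Orders o
                 ON c.customer_id = o.customer_id""")
print(cur.fetchall())  # [('Acme Co.', 2500.0)]
```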
g. Define a data warehouse.
A. A data warehouse is a storage system used to aggregate data from multiple sources into a central integrated data repository. Data warehouses are used
to facilitate analysis of business activities and operations.
h. Define enterprise performance management (EPM), also known as corporate performance management (CPM) or business performance management (BPM).

A. Enterprise Performance Management (EPM) is a process that facilitates the linking of an organization's strategies to specific plans and actions. The
overall process can be broken down into subprocesses such as planning, budgeting, and forecasting; performance reporting; and profitability and cost
analysis.
i. Discuss how EPM can facilitate business planning and performance management.
A. EPM software packages can improve efficiencies in planning, budgeting, and reporting processes by relying on a centralized database and workflow. EPM
can also reduce or even eliminate the need for spreadsheet-based business activities by acting as a central repository for performance data. EPM also
provides a more holistic view of an organization's performance by linking its financial and operational data and metrics. This helps facilitate the analysis
and reporting of the organization's activities.

Section F.2 Data Governance


The candidate should be able to:

a. Define data governance (i.e., managing the availability, usability, integrity, and security of data).
A. Data governance refers to the overall management of data within an organization. It comprises the procedures, policies, rules, and processes that
oversee the following data attributes:
i. Availability: the ability to make data accessible when and where it is needed
ii. Usability: the delivery of data to end users in formats and structures that allow for its successful use
iii. Integrity: the accuracy and consistency of data
iv. Security: the protection of data from unauthorized access and possible corruption
b. Demonstrate a general understanding of data governance frameworks, COSO's Internal Control framework, and ISACA's COBIT (Control Objectives for
Information and Related Technologies).
A. Data governance frameworks help organizations design and manage the structure of data governance systems. Two of the primary data governance
frameworks typically used in the accounting profession are the Committee of Sponsoring Organizations’ (COSO) Internal Control – Integrated Framework
and the Information Systems Audit and Control Association's (ISACA) Control Objectives for Information and Related Technologies (COBIT).
B. COSO's Internal Control – Integrated Framework (ICIF) defines five components of internal control: control environment, risk assessment, control
activities, information and communication, and monitoring activities. These components should be considered for internal control over operations,
reporting, and compliance activities, and can be designed from the entity level down to the level of individual business functions.

C. The COBIT framework primarily focuses on internal controls related to information technology. The framework divides IT into four major parts: Plan and
Organize, Acquire and Implement, Deliver and Support, and Monitor and Evaluate. COBIT also provides best practices for IT management in the form of
resources, technical guides, and training.
c. Identify the stages of the data life cycle; i.e., data capture, data maintenance, data synthesis, data usage, data analytics, data publication, data archival, and
data purging.
A. Data Capture: The recording or securing of data. Data can be entered by hand, scanned by computers, or acquired by sensors.
B. Data Maintenance: The process of creating usable data, which may include cleansing, scrubbing, and processing through an extract-transform-load (ETL) methodology.
C. Data Synthesis: The use of statistical methods to obtain a better overall estimate or answer to the questions for which data are used. Sometimes called
data modeling.
D. Data Usage: Action taken with data to support the mission of the business, such as processing invoices, contacting customers, and sending purchase orders to vendors.
E. Data Analytics: The use of data analysis methodologies to answer questions and make decisions.
F. Data Publication: The act of sending data outside the organization. This typically means sending data to business partners such as sending a statement
to a customer.
G. Data Archival: The process of removing data from active use to be stored for potential future use.
H. Data Purging: Deleting data that is no longer useful or needed.
d. Demonstrate an understanding of data preprocessing and the steps to convert data for further analysis, including data consolidation, data cleaning
(cleansing), data transformation, and data reduction.
One of the most important steps in extracting information from data is preprocessing the data before it gets used in analytics models. Preprocessing is
necessary because big data (which usually comes from different sources) is typically not standardized or commonly formatted. Data preprocessing is
commonly divided into four processes.

A. Data consolidation: the process of collecting and bringing together data from multiple sources. Data consolidation consists of cycling through defined
data sources, connecting to each one, reading data from the source, and storing newly collected data in a central location. Data redundancy can be
avoided by ensuring that only new data is imported.
B. Data cleaning (cleansing): the process of ensuring data matches the requirements for analysis. Criteria for clean data include validity, accuracy,
completeness, consistency, and uniformity. Data cleaning is typically the most time-consuming and effortful of the data preprocessing steps.
C. Data transformation: the process of applying algorithms to convert data from its raw form into an output form that meets analytical requirements.
Examples include conversion of temperature data into a common scale or financial values into a common currency.
D. Data reduction: the process of aggregating or otherwise decreasing the volume of raw data so that it can be handled efficiently. Examples include
reducing granular data into metrics that provide more actionable data for analytical models.
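As a minimal sketch of these four preprocessing steps, the following uses pandas; the source data, column names, and exchange rate are hypothetical.

```python
import pandas as pd

# Data consolidation: collect data from multiple (hypothetical) sources
# into one central DataFrame.
us_sales = pd.DataFrame({"date": ["2022-10-01", "2022-10-02"],
                         "amount": [1200.0, None], "currency": ["USD", "USD"]})
eu_sales = pd.DataFrame({"date": ["2022-10-01", "2022-10-01"],
                         "amount": [900.0, 900.0], "currency": ["EUR", "EUR"]})
sales = pd.concat([us_sales, eu_sales], ignore_index=True)

# Data cleaning: enforce completeness and consistency by dropping
# duplicate rows and rows with missing amounts.
sales = sales.drop_duplicates().dropna(subset=["amount"])

# Data transformation: convert financial values to a common currency
# (an assumed EUR-to-USD rate of 1.10, for illustration only).
rate = {"USD": 1.0, "EUR": 1.10}
sales["amount_usd"] = sales["amount"] * sales["currency"].map(rate)

# Data reduction: aggregate granular rows into an actionable daily metric.
daily = sales.groupby("date")["amount_usd"].sum()
print(daily)
```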
e. Discuss the importance of having a documented record retention (or record management) policy.
A. A documented record retention policy is an important control to ensure data is secure. Records must be kept and maintained for internal use as long as
they are needed by users to research, analyze, and document past events and decisions. In addition, records must be preserved to meet legal and
regulatory requirements.
f. Identify and explain controls and tools to detect and thwart cyberattacks, such as penetration and vulnerability testing, biometrics, advanced firewalls, and
access controls.
A. A cyberattack is an attempt by an individual or organization to gain unauthorized access to the information system or computer of another individual or organization.
Examples include malware, phishing, and denial-of-service attacks.
B. Controls and tools to detect and thwart cyberattacks include the following:
i. Vulnerability testing: Actions taken to identify existing vulnerabilities. It does not attempt to assess if and how the vulnerability could be exploited.
In contrast, penetration testing is undertaken to actively exploit potential weaknesses in a system and identify potential resulting damages.
ii. Biometrics: The use of physical features and measurements for identity verification. Biometrics can include using fingerprints, facial recognition,
and even stride patterns to verify individuals. Biometrics help ensure that only authorized personnel are allowed to be physically present in certain
locations, to access and alter data, and/or to perform specified business functions.
iii. Firewalls: Security rules and monitors of incoming and outgoing traffic used in computer networks to prevent unauthorized users from gaining
access. Firewalls can take the form of hardware, software, or a combination of both.
iv. Access controls: Limits on who can access a place or a resource. Physical access controls restrict who can enter into geographic areas, buildings,
and/or rooms. Logical access controls restrict which individuals can connect to computer networks, system files, and data. Access control can take
the form of passwords, personal identification numbers (PINs), credentials, or other authorization and authentication forms.

Section F.3 Technology-Enabled Finance Transformation


The candidate should be able to:

a. Define the systems development life cycle (SDLC), including systems analysis, conceptual design, physical design, implementation and conversion, and
operations and maintenance.
A. The systems development life cycle (SDLC) is a structured road map for designing and implementing a new information system. It consists of the
following five steps:
i. Systems analysis: identifying the needs of the organization and assembling the information regarding modifications to the current system and/or
the purchase and development of a new system.
ii. Conceptual design: creating a plan for meeting the needs of the organization. Design alternatives are prepared and detailed specifications are
created for the desired system.
iii. Physical design: creating detailed specifications for building the system based on its conceptual design. The design includes specifications for
computer code, inputs, outputs, data files and databases, processes and procedures, and proper controls.
iv. Implementation and conversion: the installation of the new system, including hardware and software. The new system is tested and users are
trained. New standards, procedures, and controls are instituted.
v. Operations and maintenance: the execution of the system, including checking performance, making adjustments as necessary, and maintaining
the system. Improvements are made and fixes are put in place until it is determined that the cost of maintaining the old system exceeds its benefits,
and the cycle starts again.
b. Explain the role of business process analysis in improving system performance.
A. Business process analysis is a systematic method of examining a company's business processes to determine how they can be improved in
effectiveness, efficiency, or both. Common approaches to business process analysis include clearly establishing process objectives, diagramming or
flowcharting current and optimal process flows, and identifying and eliminating non-value-adding activities.
c. Define robotic process automation (RPA) and its benefits.
A. Robotic process automation (RPA) is the use of software to complete routine, repetitive tasks, typically in settings with high volumes of routinized
actions. Rather than employing human labor to perform these functions, robotic systems can manipulate data, record transactions, process information,
and perform many other business and IT processes.
B. RPA can provide greater consistency and speed in work performed. Rule-based processing allows computers to execute routine tasks rather than having
an individual perform the work in front of a computer. RPA can also allow organizations to scale processes faster than by hiring and training workers to
perform identical tasks.
d. Evaluate where technologies can improve efficiency and effectiveness of processing accounting data and information (e.g., artificial intelligence (AI)).
A. As distinct from RPA, artificial intelligence (AI) involves computers performing tasks requiring critical analysis and pattern recognition. For example, AI
can be used to recognize speech or textual patterns, and to analyze various inputs and provide recommendations. AI is adaptive in that it allows
computers to learn from prior information processing experiences and revise and update future processing.
B. Because AI can process information more quickly and in larger quantities than the human mind can, and because it does not suffer computational
fatigue, it can improve efficiency and effectiveness of accounting processes. For example, AI can be used to classify or categorize transactions into
appropriate accounts or to identify potential errors or irregularities in accounting data, which could be used to improve financial reporting or by auditors
to detect misstatements and/or fraudulent activities. AI can also analyze cost data and create reports on cost behaviors and patterns.
e. Define cloud computing and describe how it can improve efficiency.

A. Cloud computing is a shared resource setup using a network of remote servers that are connected by the Internet. The remote servers are used to store,
manage, and process data. Cloud computing can provide access to larger data storage, faster processing speeds, and numerous software applications.
B. Cloud computing can help avoid data loss due to localized hardware failures and malfunctions because of networked backups and redundancies. The
“cloud” or network of servers provides a safeguard by storing information on multiple servers at multiple geographic locations.
f. Define software as a service (SaaS) and explain its advantages and disadvantages.
A. Software as a service (SaaS) is a software distribution model in which a third-party provider hosts applications and makes them available to customers
over the Internet. Advantages include lower IT costs for customers in the form of reduced need for local installation and equipment, as well as reduced
responsibility for maintenance and troubleshooting. Disadvantages include potential limitations on functionality and customization.
g. Recognize potential applications of blockchain, distributed ledger, and smart contracts.
A. Blockchain refers to a distributed, digital ledger of economic transactions within a peer-to-peer network. Transaction data is not stored in a single
location, but across thousands of computers and servers simultaneously. This improves data validity because the ledger is extremely difficult to hack or
corrupt: no single copy can be altered without detection.
B. Blockchain allows for cryptocurrencies such as Bitcoin to function because it facilitates economic exchange based on a public, digital ledger for the
recording and verification of transactions. Blockchain also facilitates smart contracts that can be completed, verified, and carried out without involving
third parties because computerized protocols are used to execute and enforce contract terms.

Section F.4 Data Analytics


The candidate should be able to:
Business Intelligence

a. Define Big Data, explain the four Vs: volume, velocity, variety, and veracity, and describe the opportunities and challenges of leveraging insight from this data.
A. Big Data refers to datasets that are extremely large and/or complex, usually requiring special software and computational power to be processed and
analyzed.
B. Big data is often characterized along four dimensions, known as the four Vs:
i. Volume refers to the quantity or scale of the data.
ii. Velocity refers to the speed with which big data is generated and analyzed.
iii. Variety refers to the different types of data that may be involved (e.g., numerical, textual, images, audio, video, etc.).
iv. Veracity refers to the accuracy or quality of the data.
b. Explain how structured, semi-structured, and unstructured data is used by a business enterprise.
A. Structured data is easily searchable because it has fixed fields and unique identifiers, such as data organized in a spreadsheet with column or row
identifiers. Semi-structured data lacks neat, organized fields but may still have organizing features such as tags or markers. Examples include Extensible
Markup Language (XML) and email. Unstructured data is unorganized and not easy to search or categorize. Examples include Twitter and text messages,
photos, and videos.
c. Describe the progression of data, from data to information to knowledge to insight to action.
A. When structure and organization are applied, data is transformed into information. Information differs from data because information carries meaning
and understanding. From information, knowledge can be created; knowledge is what we know and how we understand the way things are. Knowledge is
then refined through critical analysis and logical thinking into insight about situations and context, and insight is what enables action.
d. Describe the opportunities and challenges of managing data analytics.
A. Data analytics processes data into information by organizing data and using analysis techniques to identify and understand relationships, patterns,
trends, and causes. Data analytics can help individuals develop and refine information into knowledge, which requires human understanding.
B. Challenges to data analytics include the following: the compilation of disparate data sources into a unified structure; costly procedures and processes for
data validation and verification; and the need for specialized training and frequent updating of expertise.
e. Explain why data and data science capability are strategic assets.
A. Data analytics can help identify new opportunities and evaluate the efficacy of organizational practices and philosophies. Data analytics can help
increase value by improving the understanding of operations and providing information for the evaluation of performance and strategic options.
f. Define business intelligence (BI), i.e., the collection of applications, tools, and best practices that transform data into actionable information in order to make
better decisions and optimize performance.
A. Business intelligence refers to the applications, tools, and best practices that transform data into actionable information.
Data Mining
g. Define data mining.
A. Data mining refers to the use of statistical methods, machine learning, artificial intelligence, and large-scale computing power to analyze large amounts
of data in order to extract useful information about relationships, trends, patterns, and anomalies.
h. Describe the challenges of data mining.
A. Challenges of data mining include:
i. Data quality: errors or missing values can limit the ability to conduct rigorous data mining.
ii. Multiple sources: combining data from multiple sources can make matching observations and data transformation difficult.
iii. Data volume: analysis of large datasets can require a challenging level of computational power.
iv. Output volume: data mining techniques can produce enormous amounts of output that can require significant time and effort to navigate.
i. Explain why data mining is an iterative process and both an art and a science.
A. Data mining is iterative in that datasets often need to be simplified and statistical tools and queries need to be refined repeatedly to focus results and
provide actionable findings.
B. Data mining is a science in the sense that statistical tools and analyses need to be used with precision in order to produce reliable insights. It is an art in
the sense that patterns and trends can often be seen only by looking at the data in different ways, and by relying on both experience and creativity to
transform data into knowledge and productive actions.
j. Explain how query tools (e.g., Structured Query Language (SQL)) are used to retrieve information.
A. Query tools such as Structured Query Language (SQL) are used to manipulate and extract information from a database. They are the primary
mechanisms by which we can communicate with a database.
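For instance, here is a minimal sketch of an SQL query executed through Python's built-in sqlite3 module; the sales table and its contents are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                [("East", "A", 100.0), ("East", "B", 250.0),
                 ("West", "A", 175.0)])

# SQL is the mechanism for communicating with the database:
# SELECT retrieves, WHERE filters, GROUP BY summarizes.
cur.execute("""SELECT region, SUM(amount) AS total_sales
               FROM sales
               WHERE amount > 50
               GROUP BY region
               ORDER BY total_sales DESC""")
print(cur.fetchall())  # [('East', 350.0), ('West', 175.0)]
```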

k. Describe how an analyst would mine large data sets to reveal patterns and provide insights.
A. An analyst could use data mining techniques to identify sales trends by products, regions, customer segments, or other categories. Data mining could
also be used to better understand cost behavior and identify cost drivers. These insights could result from techniques such as data clustering,
longitudinal analysis, and regression, among others.
Analytic Tools
l. Explain the challenge of fitting an analytic model to the data.
A. Part of the challenge when analyzing data is developing expectations (or models) of how different variables are connected. Another key consideration is
whether the data meet the assumptions underlying the statistical analyses being conducted. Incomplete or misspecified models can lead to inaccurate
or erroneous conclusions or estimates of effect sizes.
m. Define the different types of data analytics, including descriptive, diagnostic, predictive, and prescriptive.
A. There are four general categories of data analytics approaches:
i. Descriptive: observational analysis designed to report the characteristics of historical data. It describes statistical properties such as the mean,
median, range, or standard deviation.
ii. Diagnostic: analysis designed to uncover and understand why certain outcomes take place. It focuses on correlations and the size and
strength of statistical associations.
iii. Predictive: analysis designed to build upon descriptive and diagnostic analytics to make predictions about future events. Predictive analysis
considers risk assessments, usually as outcome likelihoods and uncertainties.
iv. Prescriptive: analysis that draws upon the other forms of analytics to infer or recommend the best course of action. Prescriptive analysis can take
the form of optimization or simulation analyses to identify and prescribe optimal actions.
n. Define the following analytic models: clustering, classification, and regression; determine when each would be the appropriate tool to use.
A. Clustering involves grouping similar objects or data points together. It is focused on identifying patterns of similarity and dissimilarity. Because it is an
exploratory technique, it is usually used in the beginning stages of data analysis.
B. Classification attempts to predict which category or class an item belongs to. Classification typically begins with predefined categories and then
attempts to sort an item into one of those categories.
C. Regression analyzes the correlation of an outcome (dependent variable) with explanatory or independent variables. It is used to understand the
statistical properties (i.e., strength or weakness) of a hypothesized relationship and/or to make predictions based on that relationship.
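A minimal sketch of all three model types using scikit-learn; the toy customer data and labels are hypothetical and chosen only to illustrate when each tool applies.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression, LogisticRegression

# Hypothetical customer data: monthly orders and average order size.
X = np.array([[2, 50], [3, 60], [20, 400], [22, 380], [21, 410], [1, 55]])

# Clustering (exploratory): group similar customers with no labels given.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster labels:", clusters)

# Classification: sort items into predefined categories
# (here, 1 = likely to churn, a hypothetical label).
y_class = np.array([1, 1, 0, 0, 0, 1])
clf = LogisticRegression().fit(X, y_class)
print("predicted class:", clf.predict([[2, 45]]))

# Regression: relate a numeric outcome (annual spend) to the same drivers.
y_spend = np.array([1200, 2100, 96000, 100300, 103000, 700])
reg = LinearRegression().fit(X, y_spend)
print("predicted spend:", reg.predict([[10, 200]]))
```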
o. Identify the elements of both simple and multiple regression equations.
A. A regression equation typically looks like this: Y = a + bX. In this case, Y is the dependent variable. The intercept is represented by a. The slope is
represented by b. X is the independent variable. In multiple regression, more independent variables are added. For example, Y = a + bX + cW + dZ.
B. In the equations above, a (the intercept) refers to the value of the dependent variable when the independent variable takes the value of 0. The
coefficients on each of the independent variables (b, c, and d) represent the amounts by which the dependent variable would be predicted to increase or
decrease given a one unit increase or decrease in the value of that independent variable. See item p below for more detail.
p. Calculate the result of regression equations as applied to a specific situation.
A. Assume a cost equation model based on a single cost driver (units): Total Cost = Fixed Cost + Variable Cost × Units. If the regression output estimated the
fixed cost (intercept) to be $1,000 and the variable cost (slope) to be $20, then the total cost equation would look like this: Total Cost = $1,000 + $20 ×
Units. This equation could then be used to predict a total cost based on the output level. If output were 100 units, then the total cost would be estimated
to be $3,000 = $1,000 + ($20 × 100 units).
B. If the cost equation were based on two variable cost drivers, such as machine hours and labor hours, the equation would be: Total Cost = Fixed Cost +
(Variable Cost × Machine Hours) + (Variable Cost × Labor Hours). If the multiple regression output estimated the fixed cost (intercept) to be $1,000 and the
variable cost coefficients (slopes) to be $15 per machine hour and $20 per labor hour, the total cost equation would look like this: Total Cost = $1,000 +
($15 × Machine Hours) + ($20 × Labor Hours). Then for a scenario involving 50 machine hours and 75 labor hours, the total cost would be estimated to be
$3,250 = $1,000 + (50 machine hours × $15) + (75 labor hours × $20).
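These calculations can be verified with a few lines of Python, using the figures from the examples above.

```python
# Simple regression: Total Cost = $1,000 fixed + $20 per unit.
fixed, var_per_unit = 1000, 20
units = 100
print(fixed + var_per_unit * units)  # 3000

# Multiple regression: $1,000 fixed + $15/machine hour + $20/labor hour.
mh_rate, lh_rate = 15, 20
machine_hours, labor_hours = 50, 75
print(fixed + mh_rate * machine_hours + lh_rate * labor_hours)  # 3250
```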
q. Demonstrate an understanding of the coefficient of determination (R squared) and the correlation coefficient (R).
A. The coefficient of determination (R squared) is a measure of how well a regression equation fits the data. It is interpreted as the percent of variation in
the dependent variable that is explained by variation in the independent variables. Its range is from 0 to 1, with a higher percentage indicating greater
explanatory power.
B. The coefficient of correlation (R) is a measure of how two variables are related, in both direction and strength. Its range is from −1 to 1, with −1 indicating
perfect negative correlation, 1 indicating perfect positive correlation, and 0 indicating no correlation.
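A minimal sketch computing R and R squared with NumPy; the paired observations are hypothetical, and the identity R squared = R × R shown here holds for simple (one-variable) regression.

```python
import numpy as np

# Hypothetical paired observations (e.g., units produced vs. total cost).
x = np.array([10, 20, 30, 40, 50])
y = np.array([1190, 1420, 1580, 1790, 2010])

# Correlation coefficient R: direction and strength, between -1 and 1.
r = np.corrcoef(x, y)[0, 1]

# For simple regression, the coefficient of determination is R squared:
# the share of variation in y explained by variation in x (0 to 1).
print(round(r, 4), round(r**2, 4))
```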
r. Demonstrate an understanding of time series analyses, including trend, cyclical, seasonal, and irregular patterns.
A. Time series analyses consider data points over time, allowing the identification of patterns that can facilitate prediction. Time series
patterns (in which future values can be predicted based on past observations) include long-term trends (such as steady sales growth), cyclical patterns
(such as macroeconomic cycles), seasonal patterns (such as retail spikes around holidays), and irregular patterns (such as fluctuations due to unforeseen
events like natural disasters).
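A minimal sketch of separating trend from seasonal effects with a moving average in pandas; the quarterly sales figures, with a holiday spike every fourth quarter, are hypothetical.

```python
import pandas as pd

# Hypothetical quarterly sales with a spike in every fourth quarter.
sales = pd.Series([100, 105, 110, 160, 112, 118, 122, 175],
                  index=pd.period_range("2021Q1", periods=8, freq="Q"))

# A centered 4-quarter moving average smooths out the seasonal pattern,
# leaving the underlying trend; the residual highlights seasonality.
trend = sales.rolling(window=4, center=True).mean()
print(pd.DataFrame({"sales": sales, "trend": trend,
                    "seasonal+irregular": sales - trend}))
```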
s. Identify and explain the benefits and limitations of regression analysis and time series analysis.
A. Regression analysis provides valuable information about the nature and uncertainty of statistical relationships. It can facilitate estimation and prediction
based on principles that are relatively easy to understand, while also providing information about the confidence levels with which such estimates and
predictions can be made.
B. Limitations of regression include that it should only be relied on to make predictions within the range of values covered by the observed data (this is
called the relevant range). Outside of this range, the model may not fit the data as the regression indicates, and poor predictions may result. Regression
can also be influenced by outliers; that is, extreme observations can distort the regression output.
t. Define standard error of the estimate, goodness of fit, and confidence interval.
A. Standard error of the estimate is a measure of the average distance between the regression line and the data points. The smaller the standard error is,
the more accurate the predictions are.
B. Goodness of fit refers to how well the regression model fits the observed data.
C. A confidence interval is a range of values that, at a given confidence level, can be expected to contain the true parameter value. All else equal, the
greater the confidence required, the wider the confidence interval will be.
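A minimal sketch computing the standard error of the estimate for a simple regression with NumPy; the data points are hypothetical.

```python
import numpy as np

# Hypothetical data and a simple regression fit (degree-1 polynomial).
x = np.array([10, 20, 30, 40, 50])
y = np.array([1190, 1420, 1580, 1790, 2010])
b, a = np.polyfit(x, y, 1)   # slope and intercept
y_hat = a + b * x            # predicted values on the regression line

# Standard error of the estimate: average distance between the regression
# line and the data points (n - 2 degrees of freedom for simple regression).
n = len(x)
se = np.sqrt(np.sum((y - y_hat) ** 2) / (n - 2))
print(round(se, 2))
```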
u. Explain how to use predictive analytic techniques to draw insights and make recommendations.
A. Predictive data analysis employs various methods and techniques to generate insights and create recommendations for decision makers. Predictive
analysis uses tools such as data mining techniques, big data, statistical modeling, and machine learning to create predictive data models. These models
are used to identify patterns and relationships within the data.

v. Describe exploratory data analysis and how it is used to reveal patterns and discover insights.
A. Exploratory data analysis is used to summarize the characteristics of a dataset. This technique often involves using visual methods to examine the data
for patterns or anomalies.
w. Define sensitivity analysis and identify when it would be the appropriate tool to use.
A. Sensitivity analysis is the exploration of how a dependent variable might be affected given different possible values of the independent variables. For
example, when predicting a company's net income in future years, an assumption is likely made about how fast sales will grow. Sensitivity analysis
examines how results might change if assumptions about the model or prediction are changed. This can be referred to as what-if analysis.
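A minimal sketch of the net income example as a sensitivity (what-if) analysis; the base sales figure, cost ratio, and growth assumptions are hypothetical.

```python
# Hypothetical base-year figures for the net income prediction.
base_sales = 1_000_000
cost_ratio = 0.80  # costs assumed to run at 80% of sales

# Sensitivity (what-if) analysis: recompute predicted net income
# under different assumed sales growth rates.
for growth in (0.02, 0.05, 0.10):
    sales = base_sales * (1 + growth)
    net_income = sales * (1 - cost_ratio)
    print(f"growth {growth:.0%}: net income ${net_income:,.0f}")
```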
x. Demonstrate an understanding of the uses of simulation models, including the Monte Carlo technique.
A. Simulation models are computational algorithms that use specified assumptions to model possible outcomes. Monte Carlo simulation is a specific type
of simulation that generates the probability that an outcome will occur. A Monte Carlo simulation randomly draws values for variables of interest, each
with specified possible ranges and probability distributions, and then summarizes the realized outcomes across thousands of iterations.
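A minimal Monte Carlo sketch using NumPy; the variables, their assumed distributions, and the cost figures are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
iterations = 10_000

# Randomly draw each variable of interest from its assumed
# (hypothetical) probability distribution.
units = rng.normal(loc=10_000, scale=1_500, size=iterations)
price = rng.uniform(low=18.0, high=22.0, size=iterations)
fixed_cost = 120_000     # hypothetical fixed cost
variable_cost = 8.0      # hypothetical variable cost per unit

# Realized outcome for each of the thousands of iterations.
profit = units * (price - variable_cost) - fixed_cost

# Probability that the outcome of interest occurs (a profit is earned).
print("P(profit > 0):", (profit > 0).mean())
```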
y. Identify the benefits and limitations of sensitivity analysis and simulation models.
A. Sensitivity analyses and simulation models use the uncertainty in statistical relationships and model assumptions to help decision makers assess risk and
consider the possibilities and probabilities of the different outcomes from a course of action. This can be a significant benefit for organizational strategy and
decision making. The primary limitation of these tools is that they are only as good as the inputs on which they are based. Poor assumptions about future
events or about the data on which these tools are used can lead to uninformative or misleading analyses.
z. Demonstrate an understanding of what-if (or goal-seeking) analysis.
A. What-if and goal-seeking analyses are similar to sensitivity analysis in that outcome predictions are made under a variety of different assumptions about
the values of independent variables or the future state of the organization or its environment. Goal-seeking analysis may also work backward to identify
exactly what assumptions or conditions must hold in order for a certain outcome to occur.

aa. Identify and explain the limitations of data analytics.

A. Limitations of data analytics include the following:


i. Data quality. If data collection has been inconsistent, if the data are incomplete, or if there are errors in the data itself, analytics could lead to poor
results.
ii. Data reliability. Analysis of data that comes from humans (such as survey results) needs to be considered in light of the possibility of memory
errors, bias, incentives to provide inaccurate information, and other issues that may reduce the reliability of data.
iii. Over-reliance on correlations. Evidence of correlation in the data may be spurious, meaning it may be observed by chance. Alternatively,
correlations may lead decision makers to assume a causal relationship that may in fact operate in reverse or be explained by a third variable that
is not in the data. Overreliance on correlations could lead to poor assumptions and inaccurate predictions.
iv. Failure to consider uncertainty. Future outcomes can only be predicted with uncertainty. Data analytic tools often provide valuable information
about the uncertainty in the data and predictions flowing from that data. But if decision makers fail to account for that uncertainty, they may
make decisions and pursue actions with unjustified confidence.

Visualization

bb. Utilize table and graph design best practices to avoid distortion in the communication of complex information.

A. Best practices for table and graph design include the following:
i. Planning: the purpose and content of the table or graph should be planned in advance.
ii. Focus: The focus of the table/graph should be the most prominent part of the design.
iii. Alignment: For tables, text should be aligned or justified on the left of the cell and numerical data should be aligned on the right side of the cell.
iv. Size: characters should be large enough to read, and the use of common fonts is recommended.
v. Clutter: Tables/graphs should leave sufficient white space to help the reader focus on the message.
vi. Color: Color can provide depth, focus, and contrast, but too much or poorly planned color can distract.

cc. Evaluate data visualization options and select the best presentation approach (e.g., histograms, boxplots, scatter plots, dot plots, tables, dashboards, bar
charts, pie charts, line charts, bubble charts).

A. Histograms show the distribution of numerical data. Histograms are useful to see the distribution of data points, but they are not useful for comparison.
B. Boxplots show the distribution of data by displaying the quartiles in which data occur. Box plots do not show individual values and can be skewed, but
they are one of the few techniques that display outliers. They are also useful in showing a comparison among distributions.
C. Scatterplots show how two variables are related. Scatterplots are useful visual tools, but it can be difficult to see the extent of correlation.
D. Dot plots are similar to histograms. Values are typically represented by small circles stacked on one another in each category.
E. Tables list information in rows and columns. They can provide a large quantity of information, but they can be difficult to use as a visual tool for quick
reporting.
F. Dashboards are a quick, summary view of key performance indicators.
G. Bar charts are used with categorical data to show the proportion of data in each category with horizontal or vertical bars. Bar charts are limited in that
they are not designed to show trends.
H. Pie charts are used with categorical data to show the proportion of data in each category with slices in a circle. Pie charts have limited usefulness when
proportions are not significantly different.
I. Line charts are used to show a series of data points for one variable. Multiple lines may be plotted on the same chart to show multiple variables. Line
charts tend to be used primarily to show data over time.
J. Bubble charts are an enhancement of a scatter chart wherein an additional dimension of the data is shown by the size of the circle. As with pie charts,
bubble charts are best when bubble sizes display significant variation.
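A minimal sketch of several of these chart types with matplotlib; the sample data is randomly generated for illustration.

```python
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(seed=0)
values = rng.normal(loc=100, scale=15, size=200)    # numerical data
x = rng.uniform(0, 10, size=50)
y = 2 * x + rng.normal(0, 3, size=50)               # two related variables
categories, counts = ["A", "B", "C"], [40, 25, 35]  # categorical data

fig, axes = plt.subplots(2, 2, figsize=(8, 6))
axes[0, 0].hist(values, bins=20)        # histogram: distribution of data
axes[0, 0].set_title("Histogram")
axes[0, 1].boxplot(values)              # boxplot: quartiles and outliers
axes[0, 1].set_title("Boxplot")
axes[1, 0].scatter(x, y)                # scatterplot: relationship
axes[1, 0].set_title("Scatterplot")
axes[1, 1].bar(categories, counts)      # bar chart: category proportions
axes[1, 1].set_title("Bar chart")
fig.tight_layout()
plt.show()
```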

dd. Understand the benefits and limitations of visualization techniques.

A. See section cc above for a discussion of the benefits and limitations of each visualization technique.


ee. Determine the most effective channel to communicate results.

A. See section cc above for a discussion of the conditions under which each of the visualization options may be the best communication channel.

ff. Communicate results, conclusions, and recommendations in an impactful manner using effective visualization techniques.

A. Keeping graphs clear and concise is vital for effective visualization. Avoid including irrelevant information so that users of the visuals focus on what
matters most.
B. Be sure to provide context to the data you are sharing. Often, the individuals who are tasked with creating visual aids are more familiar with the topic
than others. Visual aids should be created such that anyone in the audience can quickly grasp the significance of the presented information.
C. Revisit visual aids frequently and make incremental improvements where possible.

