Business Analytics Using R
Module 1
Education is still one of the major foundations around which careers are born and built, and Business Analytics is a trending field whose fundamentals, scope, importance, and benefits are worth understanding before choosing it as a career. Let's start with a simple explanation of what Business Analytics is before going into the fundamentals that every beginner should know.
Business Analytics may be defined as the practice of refining past and present business data, using modern technologies, to build sophisticated models that drive future growth. A general Business Analytics process may include data collection, data mining, sequence identification, text mining, forecasting, predictive analytics, optimization, and data visualization.
Every business today produces a considerable amount of data. Businesses now leverage statistical methods and technologies to analyze this past data, uncovering new insights that help them make strategic decisions for the future.
Business Intelligence, a subset of the Business Analytics field, plays an essential role here, applying tools and techniques such as machine learning and artificial intelligence to predict outcomes and embed insights into daily operations.
Thus, Business Analytics brings together the fields of business management and computing to produce actionable insights. These insights are then used to remodel business procedures, generating more efficiency and a more productive system.
With today's advancements, Business Analytics tools can use past and present data to point businesses in the right direction for the future.
Types of Business Analytics Techniques
1. Descriptive Analytics: This technique describes the past or present situation of the organization's activities.
2. Diagnostic Analytics: This technique discovers the factors or reasons behind past or current performance.
3. Predictive Analytics: This technique predicts figures and outcomes using a combination of business analytics tools.
4. Prescriptive Analytics: This technique recommends actions to take based on those predictions, typically using optimization and simulation.
A complete business analytics life cycle starts with raw data received from devices or services; this data is collected (often in unstructured form), then processed and analysed to draw actionable insights. These insights are then integrated into business procedures to deliver better outcomes in the future.
With Business Analytics tools, businesses gain a more profound understanding of the primary and secondary data emerging from their activities. This helps them refine their procedures further and be more productive.
To stay competitive, companies need to be ahead of their peers, with the latest toolsets to assist their decision making, improve efficiency, and generate more profits.
Now that we have seen why Business Analytics is important, let us next understand its scope.
Business Analytics has been applied to a wide variety of applications. Descriptive analytics is widely used by businesses to understand their current market position, while predictive and prescriptive analytics help businesses find more reliable measures to propel their growth in a competitive environment.
In the last decade, business analytics has been among the leading career choices for professionals, offering high earning potential while helping businesses drive growth with actionable inputs.
Having understood what Business Analytics is, let us next look at its benefits.
To put it in one phrase: Business Analytics brings actionable insights to businesses. Its main benefits are:
1. The insights help in decision making and planning for the future.
2. They discover hidden trends, generate leads, and scale the business in the right direction.
Having covered what Business Analytics is, let us next see how it differs from related fields.
Data analytics is the process of analyzing data sets to make decisions about the information contained within them. Pursuing business goals or insights is not a prerequisite for using data analytics; business analytics is one part of this broader practice.
Data science also uses analytics to guide decision-making. Data scientists investigate data using cutting-edge statistical techniques and let the data's features direct their analysis. Even when sophisticated statistical algorithms are applied to data sets, data science is not necessarily involved, because genuine data science investigates open-ended questions, whereas business analytics aims to address a particular query or issue.
When attempting to implement a business analytics strategy, organizations may run into
issues with both business analytics and business intelligence:
Too many data sources: Business data is produced by a broad range of internet-connected devices, which frequently create different data types that must all be incorporated into an analytics strategy.
Lack of skills: Some companies, mainly small and medium-sized businesses (SMBs), may
find it tough to find candidates with the necessary business analytics knowledge and
abilities.
Data storage limitations: Before determining how to process data, a company must decide
where to store it.
Business Analytics Examples and Tools
Many business analytics and business intelligence tools can automate advanced data analytics
tasks. Here are a few examples of commercial business analytics software:
Dundas Business Intelligence offers automated trend forecasting and an intuitive interface.
Qlik's QlikView offers data visualization and automated data association features.
Sisense is renowned for its data warehousing and dynamic text analysis capabilities.
Tableau offers sophisticated capabilities for natural language processing and unstructured text analysis.
Tibco Spotfire is a powerful automated statistical and unstructured text analysis tool.
When choosing among these tools, organizations should weigh their own requirements carefully.
The primary duty of business analytics professionals is to gather and analyze data to affect
the strategic choices that an organization makes. The following are some projects for which
they could perform the analysis:
Identifying potential issues the company might face and possible solutions
Comprehending KPIs
Comprehending regulatory and reporting requirements
Employers typically look for the following skills when hiring for these positions:
Analytical problem-solving
Attention to detail
Insights
Data insights are knowledge that a company gains from analysing sets of information pertaining to a given topic or situation. Analysis of this information provides insights that help businesses make informed decisions and reduce the risk that comes with trial-and-error testing methods.
Importance of data in business analytics
Data can help businesses measure whether certain actions, products or services are profitable
and where their greatest expenses might be. Identifying expenses is often the key to
increasing profits because businesses can reduce those expenses and keep more of the
revenue they earn.
Data supports five key areas (in no particular order of importance): 1) decision-making, 2) problem solving, 3) understanding, 4) improving processes, and 5) understanding customers.
Data analytics is used to track customers' behaviour towards products or services. You can use it to identify why sales are low, what products people buy, why they buy them, how much they spend on these products, how you can sell your products better, and many other questions.
The data is all the information you store about your company and how it operates. It tells you
who your customers are, how much money you have in receivables, the state of your supply
chain and more. The bottom line? Data is your most important asset.
Some common data collection methods include surveys, interviews, observations, focus
groups, experiments, and secondary data analysis. The data collected through these methods
can then be analyzed and used to support or refute research hypotheses and draw conclusions
about the study's subject matter.
Data, in the context of databases, refers to all the single items that are stored in a database,
either individually or as a set. Data in a database is primarily stored in database tables, which
are organized into columns that dictate the data types stored therein.
Data, Information and Knowledge
Data are plain facts; the word "data" is the plural of "datum." When data are processed, organized, structured, or presented in a given context so as to make them useful, they are called information.
It is not enough to have data (such as statistics on the economy). Data by themselves are fairly useless, but when they are interpreted and processed to determine their true meaning, they become useful and can be called information. Data is the raw material that can be processed by any computing machine and can be represented in many forms.
Information: Information is data that has been converted into a more useful or intelligible form, organized for direct human use; information helps human beings in their decision-making process. Examples are timetables, merit lists, report cards, headed tables, printed documents, pay slips, receipts, and reports. Information is obtained by assembling items of data into a meaningful form. For example, the marks obtained by students and their roll numbers form data; the report card is the information. Other forms of information are pay slips, schedules, reports, worksheets, bar charts, invoices, account returns, and so on. It may be noted that information may be further processed and interpreted to form knowledge.
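The data-to-information idea above can be shown in a short sketch (Python is used purely for illustration; the roll numbers and marks are invented):

```python
# Raw data: isolated facts, marks keyed by roll number.
marks = {101: [78, 85, 62], 102: [91, 74, 88], 103: [55, 60, 58]}

def report_card(marks_by_roll):
    """Assemble raw marks into information: a per-student summary."""
    report = {}
    for roll, scores in marks_by_roll.items():
        total = sum(scores)
        report[roll] = {"total": total, "average": round(total / len(scores), 1)}
    return report

print(report_card(marks))
```

The input dictionary is data; the totals and averages, assembled into a meaningful form, are the information a teacher or parent would actually use.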
Stages of Data Maturity
The second stage involves understanding the importance of data and how it can
improve business operations. At this level, a company is incorporating data into its
decision-making. Employees analyze data to measure the results of business
actions and track the progress of their goals. Because of this increased data usage,
there are security measures to protect company data and often incorporation of
automated processes to maintain data flow.
This advanced stage of data maturity involves observing data usage from previous
stages and using that knowledge to become more competitive in the marketplace.
Companies in this stage create a user-friendly process for accessing data and promote data literacy and basic data analytics skills among all professionals, not just data scientists or analysts.
Another key feature of this data maturity stage is how professionals share this data.
At this level, employees can share data internally and externally, which means they
can use the company's data to help improve client satisfaction and internal
productivity.
5. The driven or innovator stage
In the last stage, the company uses its data to implement change within the
company. At this maturity level, company executives consider data when setting
company goals. This can help them create more innovative business practices. For
example, a company may use production data to set new productivity goals and
research new practices to help professionals succeed in their daily tasks.
How do professionals use data maturity models?
Professionals use data maturity models to analyze their company's current data
management practices. This helps them determine which goals they can set and
how they can help the company advance to a higher level. Being in a higher stage of
the data maturity model can help companies integrate data analysis into their
business practices, which can lead to effective and data-driven choices.
To use a data maturity model effectively, professionals mark certain data milestones
within their company. For example, if several employees interpret data but don't hold
the job titles of data analyst or data scientist, the company is at least in the second
stage of maturity. By observing milestones and tracking overall data usage,
companies can better target their goals.
Using data maturity models helps an organization understand how each employee
incorporates data into their decisions. By making company data accessible and
encouraging access, companies can help professionals make more informed
decisions. For example, when management professionals feel comfortable
interpreting data, they can use this information to support the top performers in their
company as they advance their careers and guide the low performers toward better
productivity.
(b) Expands employees' skill sets
Once organizations reach the second level of data maturity, having data and information literacy skills becomes more of an asset. When many professionals within the company share and interpret data, it can encourage other employees to learn more about data and how to incorporate it. Management professionals can then recognize which employees show the most potential in data management, which can lead to promotions and high productivity levels.
Using data maturity models can help companies integrate data analysis into their
business operations. These tools often allow companies to become more efficient
because data can offer objective guidance regarding decisions and goal-setting. For
example, at higher maturity levels, companies can use data to make more informed
budget plans or set feasible goals for growth
When non-executive professionals are comfortable accessing and using data, they can apply the information to their own short- and long-term goals. This can increase their productivity levels and confidence. For example, if a professional notices that one of their regular clients often updates their budget around the end of the month, they can use this information to their advantage to make better business proposals.
Data quality
Measures of quality are all around us. Quality frameworks are designed to communicate
information on how a specific item measures up against a trusted standard. Overall, quality
frameworks help define what good looks like for a particular industry or issue.
Generally, data is considered high quality if it fits the intended purpose of its use. In addition, several dimensions are commonly associated with high-quality data:
Accuracy – all data correctly reflects the object or event in the real world
Completeness – all data that should be present is present
Relevance – all data meets the requirements for intended use
Timeliness – all data reflects the correct point in time
Consistency – values and records are represented in the same way within/across
datasets
Other dimensions, like uniqueness, validity, or openness, may be added to these five to capture elements of the data that are important to particular users. But overall, if your data meets the definitions of all or several of the dimensions noted above, it is high quality. Some organizations take it a step further and create their own data quality scores to make the term more meaningful for their own users.
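Dimensions like completeness and accuracy can be scored mechanically. A minimal sketch (Python for illustration; the customer records and the "plausible age" rule are invented assumptions):

```python
# Hypothetical customer records; None marks a missing value.
records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": None,            "age": 29},
    {"id": 3, "email": "c@example.com", "age": -5},   # implausible age
]

def completeness(records, field):
    """Share of records where the field is present (completeness dimension)."""
    present = sum(1 for r in records if r.get(field) is not None)
    return present / len(records)

def accuracy_age(records):
    """Share of records whose age falls in a plausible range (accuracy proxy)."""
    valid = sum(1 for r in records if r["age"] is not None and 0 <= r["age"] <= 120)
    return valid / len(records)

print(f"email completeness: {completeness(records, 'email'):.0%}")
print(f"age accuracy:       {accuracy_age(records):.0%}")
```

Combining several such per-dimension scores is exactly how organizations build the custom data quality scores mentioned above.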
No matter how you define it, having good data quality is important. So how do we improve
data quality in our organizations?
How do we improve data quality?
Improving data quality starts with understanding the data
lifecycle.
Across all stages of the lifecycle, describing your data well is critical. Good description and metadata help provide context for data, standardize formats and rules within and across organizations, and improve the use of data overall.
Good metadata improves data quality by improving consistency (one of the five dimensions mentioned above) and by creating a mechanism for assessing quality on the other four dimensions throughout the data life cycle.
Several business analytics tools are available in the market that offer specific solutions to match requirements. Professionals may need business analytics skills, like an understanding of statistics or SQL, to manage them.
Financial Modelling
Financial modelling is the process of creating a summary of a company's expenses and earnings in
the form of a spreadsheet that can be used to calculate the impact of a future event or decision.
A financial model has many uses for company executives. Financial analysts most often use it to
analyze and anticipate how a company's stock performance might be affected by future events or
executive decisions.
Financial analysts use them to explain or anticipate the impact of events on a company's stock,
from internal factors such as a change of strategy or business model to external factors such as a
change in economic policy or regulation.
Financial models are used to estimate the valuation of a business or to compare businesses to their
peers in the industry. They also are used in strategic planning to test various scenarios, calculate
the cost of new projects, decide on budgets, and allocate corporate resources.
The objective of financial modeling is to combine accounting, finance, and business metrics to create a forecast of a company's future results. A financial model is simply a spreadsheet, usually built in Microsoft Excel, that forecasts a business's financial performance into the future.
Knowing how to build a financial model is a must for financial officers, investors, and others involved in
the financial operations of a business or organization.
Here are the six basic steps for building a financial model:
1. Gather historical data. You’ll need at least the last three years of financial data for the company.
2. Calculate ratios and metrics. Using the historical data from the first step, you'll calculate historical ratios and metrics, like gross margins, growth rates, asset turnover ratios, and inventory changes.
3. Make informed assumptions. Armed with your historical data, ratios, and metrics, use this information to build projections of future ratios and metrics: assumed future gross margins, growth rates, asset turnover, and projected changes in inventory.
4. Create a forecast. Use all the above data to forecast the usual accounting documents: the future income statement, balance sheet, and cash flow statement. Do this by applying your historical ratio and metric calculations in reverse; specifically, use your assumptions to build out the forecasted statements.
5. Value the company. After you’ve forecasted, you can now value the company using the DCF,
or Discounted Cash Flow, method.
6. Review. Once you have this information before you, use your drafted statements to decide how
different scenarios may play out.
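The six steps can be compressed into a toy sketch (Python for illustration; the revenue history, flat growth assumption, and 10% discount rate are invented, and revenue stands in for free cash flow, which a real model would forecast from full statements):

```python
# Steps 1-2: historical revenue and the growth rates it implies.
history = [100.0, 110.0, 121.0]                       # three years of revenue
growth_rates = [b / a - 1 for a, b in zip(history, history[1:])]
avg_growth = sum(growth_rates) / len(growth_rates)    # step 3: assumption

# Step 4: forecast five more years at the assumed growth rate.
forecast = []
revenue = history[-1]
for _ in range(5):
    revenue *= 1 + avg_growth
    forecast.append(revenue)

# Step 5: DCF value of the forecast (revenue used as a cash-flow proxy).
discount_rate = 0.10
dcf_value = sum(cf / (1 + discount_rate) ** t
                for t, cf in enumerate(forecast, start=1))
print(round(dcf_value, 1))
```

Step 6, the review, corresponds to rerunning the sketch with different assumptions and comparing outcomes.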
1. Creates a Better Understanding of the Business: The process forces the business to think about the various changes that may happen internally as well as in the external environment. Hence, it would be fair to say that companies which create financial models are forced to do more due diligence than their counterparts. Building financial models therefore has a spillover effect which leads to a better understanding of the underlying business.
2. Helps Decide on a Funding Strategy: When companies develop financial models, they are
able to clearly understand what their cash flow situation will be. The cash flow requirements
that the company would face as well as the ability to borrow and make interest payments
can be easily ascertained. This helps the company choose an appropriate funding strategy.
For instance, start-up firms have uncertain revenues. However, their expenses are more or
less fixed.
Using financial modelling, they can decide on the amount of money that they need to have on hand in order to ensure that they survive until the revenues start flowing in. Start-up companies are therefore able to ascertain the amount of equity stake they should sell so as to reach the next milestone.
3. Helps Reach the Correct Valuation: Financial modelling allows companies to understand their true worth. In the absence of detailed modelling, the worth of a company is decided using simplistic discounted cash flow models, some of which assume linear relationships between revenues and expenses; such assumptions simply do not hold.
Financial models make it possible to ascertain the exact amount of free cash flow that will
accrue to the firm at different points in time. This helps companies to know their exact
worth when they are selling out their stakes to third party investors such as investment
bankers and private equity funds.
1. Expensive and Time-Consuming: The data needs to be collected, the underlying factors have to be identified, and the model needs to be tested for financial as well as technical irregularities. The model then needs to be made intuitive and user-friendly. Needless to say, all this costs a lot of time and money. Many companies, particularly smaller ones, may not have the resources to spare for this exercise. Hence, in many cases, financial models have very limited applicability.
2. Inaccurate: In many cases, financial models have proven to be woefully inadequate. The
subprime mortgage crisis of 2008 is widely quoted while trying to explain this point.
However, it needs to be understood that inaccuracy is built into the model itself.
Nobody has the knowledge required to predict factors such as interest rates, tax rates, and
market shares with utmost precision. If a person did have such an ability, they would make a
killing by trading in the stocks and derivatives market and would not need to create financial
models!
Therefore, the numbers provided by the financial model need to be taken with a pinch of
salt. If numbers are being projected far into the future, then one can be almost certain that
these numbers will not be met.
3. Soft Factors Not Considered: Lastly, many mergers have failed because of soft factors, such as difficulties integrating the cultures of the two merging companies. It is impossible to build such factors into financial models. On the one hand, models take into account the synergies that will be created by reducing expenses as a result of the merger; on the other hand, they do not take into account the expenses that will arise due to a lack of cultural compatibility. This leads to an overvaluation of assets in the long run. Many mergers have failed in the past even though the financial models had predicted they would be successful.
1. Data quality: Financial models rely on accurate and reliable data to make predictions.
However, data can be incomplete, inconsistent, or biased, which can lead to inaccurate
results.
2. Model complexity: Financial models can be complex and difficult to understand, which
can make it challenging for decision-makers to interpret and use the results.
3. Assumptions: Financial models are based on assumptions about the future, which are
inherently uncertain. If these assumptions turn out to be incorrect, the model's
predictions may be inaccurate.
4. Changing economic conditions: Financial models may not accurately reflect changing
economic conditions, such as shifts in interest rates or market volatility, which can impact
the accuracy of the model's predictions.
5. Human error: Financial modelling involves a lot of calculations and data entry, which can
be prone to human error. Even small mistakes can have significant impacts on the model's
predictions.
6. Validation: It can be challenging to validate financial models, as there may be limited
historical data to compare the model's predictions against.
Overall, financial modeling requires careful attention to data quality, model complexity,
assumptions, changing economic conditions, human error, and validation to ensure accurate and
reliable results.
Predictive Analytics
Predictive analytics is the process of using data to forecast future outcomes. The process uses data
analysis, machine learning, artificial intelligence, and statistical models to find patterns that might
predict future behavior.
Data scientists use predictive models to identify correlations between different elements in selected
datasets. Once data collection is complete, a statistical model is formulated, trained, and modified to
generate predictions.
The workflow for building predictive analytics frameworks follows five basic steps:
1. Define the problem: A prediction starts with a good thesis and set of requirements. For instance, can
a predictive analytics model detect fraud? Determine optimal inventory levels for the holiday
shopping season? Identify potential flood levels from severe weather? A distinct problem to solve
will help determine what method of predictive analytics should be used.
2. Acquire and organize data: An organization may have decades of data to draw upon, or a continual flood of data from customer interactions. Before predictive analytics models can be developed, data flows must be identified, and then datasets can be organized in a repository such as a data warehouse like BigQuery.
3. Pre-process data: Raw data is only nominally useful by itself. To prepare the data for the predictive
analytics models, it should be cleaned to remove anomalies, missing data points, or extreme
outliers, any of which might be the result of input or measurement errors.
4. Develop predictive models: Data scientists have a variety of tools and techniques to develop
predictive models depending on the problem to be solved and nature of the dataset. Machine
learning, regression models, and decision trees are some of the most common types of predictive
models.
5. Validate and deploy results: Check on the accuracy of the model and adjust accordingly. Once
acceptable results have been achieved, make them available to stakeholders via an app, website, or
data dashboard.
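The five steps above can be miniaturized into a sketch (Python for illustration; the advertising-spend dataset and the hand-fitted least-squares line are invented stand-ins for a real pipeline):

```python
# Steps 2-3: a tiny, already-cleaned dataset (invented numbers):
# advertising spend vs. units sold.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [52.0, 61.0, 68.0, 81.0, 89.0]

# Step 4: fit the model y = a + b*x by ordinary least squares.
n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n
b = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
     / sum((xi - mean_x) ** 2 for xi in x))
a = mean_y - b * mean_x

# Step 5: deploy the fitted model as a prediction function.
def predict(spend):
    return a + b * spend

print(f"predicted units sold at spend 6.0: {predict(6.0):.1f}")
```

A real step 5 would also validate the model on held-out data before publishing its predictions to a dashboard.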
Regression analysis
Decision trees
Decision trees are classification models that place data into different categories based on distinct
variables. The method is best used when trying to understand an individual's decisions. The model
looks like a tree, with each branch representing a potential choice, with the leaf of the branch
representing the result of the decision. Decision trees are typically easy to understand and work well
when a dataset has several missing variables.
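The branch-and-leaf structure can be written out by hand (Python for illustration; the lending categories and thresholds are invented, and in practice a tree would be learned from data by a library):

```python
def classify(customer):
    """A tiny hand-built decision tree: each branch is a choice,
    each leaf a predicted category."""
    if customer["income"] >= 50_000:
        if customer["credit_score"] >= 650:
            return "approve"
        return "review"
    if customer["has_guarantor"]:
        return "review"
    return "decline"

applicants = [
    {"income": 80_000, "credit_score": 700, "has_guarantor": False},
    {"income": 30_000, "credit_score": 710, "has_guarantor": False},
]
print([classify(a) for a in applicants])
```

Reading the nested `if`s from top to bottom traces one path from the root to a leaf, which is why decision trees are easy to explain to non-specialists.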
Neural networks
Neural networks are machine learning methods that are useful in predictive analytics when
modeling very complex relationships. Essentially, they are powerhouse pattern recognition engines.
Neural networks are best used to determine nonlinear relationships in datasets, especially when no
known mathematical formula exists to analyze the data. Neural networks can be used to validate
the results of decision trees and regression models.
Time Series
Hypothesis Testing
Sensitivity Analysis
Sensitivity analysis determines how different values of an independent variable affect a particular
dependent variable under a given set of assumptions. In other words, sensitivity analyses study
how various sources of uncertainty in a mathematical model contribute to the model's overall
uncertainty. This technique is used within specific boundaries that depend on one or more input
variables.
Sensitivity analysis is used in the business world and in the field of economics. It is commonly used
by financial analysts and economists and is also known as a what-if analysis.
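A minimal what-if sketch (Python for illustration; the flat cash flows and the range of discount rates are invented): one input, the discount rate, is swept over a range while everything else is held fixed.

```python
def npv(cash_flows, rate):
    """Net present value of future cash flows at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

cash_flows = [100.0] * 5          # five years of flat cash flow (invented)

# Sensitivity analysis: vary one independent variable (the rate) only.
for rate in [0.06, 0.08, 0.10, 0.12]:
    print(f"rate {rate:.0%}: NPV = {npv(cash_flows, rate):.1f}")
```

The output shows how the dependent variable (NPV) falls as the single varied input rises, which is exactly the kind of relationship a sensitivity analysis is meant to expose.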
Scenario analysis
Scenario analysis is a method for predicting the possible occurrence of an event or the consequences of a situation, assuming that a phenomenon or a trend will continue in the future (Kishita et al., 2016).
Definition: Scenario Analysis is a process to ascertain and analyze possible events that can take place
in the future. This is an important tool in the world of finance and economics, and is used extensively
to make projections for the future.
Scenario analysis is the process of estimating the expected value of a portfolio after a given period of
time, assuming specific changes in the values of the portfolio's securities or key factors take place,
such as a change in the interest rate.
What is the essential difference between a sensitivity analysis and a scenario analysis? Sensitivity analysis examines the effect of changing just one variable at a time, usually over a broad range of values, while scenario analysis assesses the effect of changing all the input variables at the same time, each over a limited range.
Example
Howard Ben Enterprises is considering whether to open a new office building in downtown Austin next year. They conduct a scenario analysis to see what may occur if the cost of raw materials goes up. They also examine the business demand for more office space in Austin.
A one-dimensional sensitivity analysis can also be implemented directly in Excel.
A sensitivity analysis, otherwise known as a "what-if" analysis or a data table, is another in a long line of powerful Excel tools that lets a user see what the result of the financial model would be under different circumstances. Sensitivity analysis in Excel helps us study the uncertainty in the model's output given changes in the input variables. It essentially stress-tests our modeled assumptions and leads to value-added insights. In the context of DCF valuation, sensitivity analysis in Excel is especially useful for modeling the sensitivity of a share price or valuation to assumptions like growth rates or the cost of capital.
By creating sensitivity cases, you can see how changes in factors like growth rates, interest rates, or GDP affect a valuation such as a DCF or DDM, and how the model behaves financially and operationally. It is important to use common sense when developing sensitivity cases.
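What Excel's two-variable data table produces can be imitated in a few lines (Python purely for illustration; the cash flow and the grids of growth and discount rates are invented): a Gordon-growth valuation tabulated across both inputs at once.

```python
def gordon_value(cash_flow, growth, discount):
    """Gordon growth formula: value of a cash flow growing forever."""
    return cash_flow * (1 + growth) / (discount - growth)

growth_rates = [0.02, 0.03, 0.04]
discount_rates = [0.08, 0.10, 0.12]

# Build the same grid an Excel two-variable data table would show:
# growth rates down the side, discount rates across the top.
print("       " + "  ".join(f"{r:>7.0%}" for r in discount_rates))
for g in growth_rates:
    row = "  ".join(f"{gordon_value(100.0, g, r):7.0f}" for r in discount_rates)
    print(f"g={g:.0%}  {row}")
```

Scanning the grid shows the valuation rising with the growth rate and falling with the discount rate, the same read an analyst takes from the Excel table.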
https://www.youtube.com/watch?v=7s4sHwnIyPk
https://breakingintowallstreet.com/kb/excel/sensitivity-analysis-excel/
Under What-If Analysis on the Data tab, click the Scenario Manager and then Add. Variable cells (called changing cells) can be adjusted before saving the scenario for future use. For example, an analyst might save sales of 2,500 items as the best-case scenario.
https://www.youtube.com/watch?v=AvaaiWyypX4
An Excel dashboard is a one-pager (mostly, though not always) that helps managers and business leaders track key metrics or KPIs and take decisions based on them. It contains charts, tables, and views that are backed by data. A dashboard is often called a report; however, not all reports are dashboards.
Dashboard Reporting Benefits
To monitor the organization's overall performance, dashboards allow you to capture and report specific data points from each of the departments in the organization, providing a snapshot of current performance, a comparison with earlier performance, and a visual presentation of performance measures.
Demand forecasting refers to the process of predicting the quantity of goods and services that will
be demanded by consumers at a future point in time. More specifically, the methods of demand
forecasting entail using predictive analytics to estimate customer demand in consideration of key
economic conditions
Demand forecasting is a combination of two words: demand, meaning the market's requirement for a product or service, and forecasting, meaning making an estimate in the present of an event that will occur in the future. Here we are going to discuss demand forecasting and its usefulness.
What is Demand?
Demand in terms of economics may be explained as the consumers’ willingness and ability to
purchase or consume a given item/good. Furthermore, the determinants of demand go a long way in
explaining the demand for a particular good.
For instance, an increase in the price of a good will lead to a decrease in the quantity that may be
demanded by consumers. Similarly, a decrease in the cost or selling price of a good will most likely
lead to an increase in the demanded quantity of the goods.
This indicates the existence of an inverse relationship between the price of the article and the
quantity demanded by consumers. This is commonly known as the law of demand and can be
graphically represented by a line with a downward slope.
The graphical representation is known as the demand curve. The determinants of demand are
factors that cause fluctuations in the economic demand for a product or a service.
Determinants of Demand
Some of the important determinants of demand are as follows.
1] Price of the Product
People use price as a parameter to make decisions if all other factors remain constant or equal. According to the law of demand, a reduction in the price of a good increases the quantity demanded, and an increase in its price decreases the quantity demanded.
The demand curve and the demand schedule help determine the quantity demanded at each price level. An elastic demand implies a strong change in quantity in response to a change in price, while an inelastic demand implies that quantity does not change much even when the price changes.
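Elastic versus inelastic demand can be illustrated with the midpoint (arc) elasticity formula. A minimal Python sketch, with hypothetical prices and quantities:

```python
# Arc (midpoint) price elasticity of demand, with hypothetical figures.

def price_elasticity(p1, p2, q1, q2):
    """Midpoint-formula elasticity: % change in quantity / % change in price."""
    pct_q = (q2 - q1) / ((q1 + q2) / 2)
    pct_p = (p2 - p1) / ((p1 + p2) / 2)
    return pct_q / pct_p

# Price rises from 10 to 12 and quantity demanded falls from 100 to 80:
e = price_elasticity(10, 12, 100, 80)
print(round(e, 2))   # negative sign reflects the law of demand;
                     # magnitude > 1 means demand is elastic here
```

A magnitude below 1 would instead indicate inelastic demand: quantity barely responds to the price change.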
2] Income of the Consumer
Rising incomes lead to a rise in the quantity of goods demanded by consumers. Similarly, a drop in income is accompanied by reduced consumption levels. This relationship between income and demand is not linear; marginal utility determines the proportion of change in demand levels.
3] Price of Related Goods
Complementary products – An increase in the price of one product will cause a decrease in the quantity demanded of a complementary product. Example: a rise in the price of bread will reduce the demand for butter, because the two products are complementary in nature.
Substitute Product – An increase in the price of one product will cause an increase in the
demand for a substitute product. Example: Rise in price of tea will increase the demand for
coffee and decrease the demand for tea.
4] Consumer Expectations
Expectations of a higher income, or of an increase in the prices of goods, will lead to an increase in the quantity demanded. Similarly, expectations of a reduced income, or of lower prices of goods, will decrease the quantity demanded.
Consumer's Equilibrium
The term equilibrium describes a state of rest from which there is no tendency to change. A consumer is in equilibrium when he or she does not wish to change the level of consumption, i.e. when maximum satisfaction has been attained. Consumer equilibrium therefore refers to the situation in which the consumer has obtained the maximum possible satisfaction from the quantity of commodities purchased, given his or her income and the prices of the commodities in the market.
A consumer is said to be in equilibrium when he feels that he cannot improve his situation either by earning more, by spending more, or by changing the quantities of the things he buys. A rational consumer will purchase a commodity up to the point where its price equals the marginal utility (MU) obtained from it. If this condition is not fulfilled, the consumer will purchase either more or less. If he purchases more, MU will fall until the price paid exceeds the marginal utility; to avoid this dissatisfaction, he will reduce his consumption, and MU will rise until price = marginal utility. On the other hand, if marginal utility is greater than the price paid, the consumer enjoys additional satisfaction from each unit consumed, which urges him to buy more units, leading to successive falls in MU until it equals the price. Hence, by buying more or less, a consumer eventually reaches the point where P = MU. Here, his total utility is maximum.
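The P = MU stopping rule can be illustrated numerically. In this Python sketch, the diminishing marginal utility schedule and the price are both hypothetical:

```python
# A rational consumer buys units until marginal utility (in money terms)
# falls to the price. Hypothetical diminishing-MU schedule for one good.

marginal_utility = [50, 40, 30, 20, 10]   # MU of the 1st, 2nd, ... unit
price = 30

# Buy every unit whose MU is at least the price paid.
units_bought = sum(1 for mu in marginal_utility if mu >= price)
print(units_bought)   # → 3: equilibrium at the unit where P = MU
```

Buying a fourth unit would yield MU of 20 against a price of 30, i.e. the dissatisfaction the text describes, so the consumer stops at three units.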
Importance of Consumer Equilibrium
It enables consumers to maximize their utility from the consumption of one or more commodities. It helps consumers arrange a combination of two or more products, based on taste and preference, for maximum utility.
In the case of a single commodity, we assume:
- The purchase is restricted to the single commodity.
- The price of the commodity is given in the market; the consumer only determines how much to purchase at that price.
- Being a rational human being, the consumer's goal is to maximize consumer surplus, i.e. the surplus of utility earned over expenditure on the good at the point of purchase.
- There are no limitations on consumer expenditure, i.e. he has sufficient money to buy whatever quantity he decides to buy at the given price.
Types of Demand
In economics, demand is the quantity of a good or service that a consumer is willing and able to
purchase at different price levels available during a given time period. Although the demand is the
desire of a consumer to purchase a commodity, it is not the same as desire. Desire is just a wish of a
consumer to purchase a commodity even though he is unable to buy it. However, demand is a
consumer’s desire to purchase a commodity, provided he is willing to spend and has sufficient
purchasing power.
Hence, we can say that the four essential elements of demand are: the desire for a commodity, the willingness to purchase it, sufficient purchasing power, and a given price and period of time.
1. Price Demand:
Assuming other factors as constant, the relationship between the price and demand of a commodity is known as Price Demand. Price Demand can be shown as:
Dx = f(Px)
Where,
Dx = Demand for commodity x, Px = Price of commodity x, and f = Functional Relationship
2. Cross Demand:
Assuming other things remaining as constant, the relationship between the demand for a given commodity and the price of related commodities is known as Cross Demand. Cross Demand can be shown as:
Dx = f(Py)
Where,
Py = Price of the related commodity y
3. Income Demand:
Assuming other factors as constant, the relationship between the consumer’s income and the quantity demanded of a commodity is known as Income Demand. Income Demand can be shown as:
Dx = f(Y)
Where,
Y = Income of the consumer
4. Joint Demand:
When demand for two or more goods arises simultaneously for satisfying a particular want of the
consumer, then such type of demand is known as Joint Demand. For example, the demand for milk,
coffee beans, and sugar is a joint demand as all these goods are demanded together to prepare
coffee.
5. Composite Demand:
When a commodity can be used for more than one purpose, then such type of demand is known as
Composite Demand. For example, the demand for water is a composite demand as it can be used
for various purposes like bathing, drinking, cooking, etc.
6. Derived Demand:
The kind of demand for a commodity which depends on the demand for other goods is known as Derived Demand. For example, the demand for the workers/labour producing bags is a derived demand, as it depends on the demand for bags.
7. Direct Demand:
When a commodity directly satisfies the demand of consumers, then its demand is known as Direct
Demand. For example, demand for books, stationery, clothes, food, etc., is a direct demand as these
goods directly satisfy the wants.
8. Competitive Demand:
When two commodities are close substitutes of each other and an increase in the demand for one
commodity will decrease the demand for the other commodity, then the demand for any one of the
commodities is known as Competitive Demand. For example, an increase in demand for tea might
decrease the demand for coffee, which makes the demand for these goods competitive demand.
This happens because when consumers purchase more of one commodity (say tea), it leads to a
lesser requirement for the other commodity (say coffee).
9. Alternative Demand:
Demand for a commodity is known as Alternative Demand when it can be satisfied by using different alternatives. For example, there are a number of alternatives to satisfy the demand for clothes, such as jeans, shirts, trousers, suits, sarees, pants, etc.
Law of Demand
Now the law of demand states that all conditions being equal, as the price of a product increases,
the demand for that product will decrease. Consequently, as the price of a product decreases, the
demand for that product will increase. For instance, a consumer may buy two dozens of bananas if
the price is Rs.50.
However, if the price increases to Rs.70, then the same consumer may restrict the purchase to one
dozen. Hence, the demand for the bananas, in this case, was reduced by one dozen. Therefore, the
law of demand defines an inverse relationship between the price and quantity factors of a product.
Graphically, the downward-sloping demand curve demonstrates this inverse relationship: as the price of a product rises, the quantity demanded falls, and vice versa.
Giffen Goods
Giffen Goods is a concept that was introduced by Sir Robert Giffen. Giffen goods are strongly inferior goods, in contrast to luxury goods. Their unique characteristic is that as their price increases, the demand for them also increases, and this feature is what makes them an exception to the law of demand.
The Irish Potato Famine is a classic example of the Giffen goods concept. Potatoes were a staple of the Irish diet. During the famine, when the price of potatoes increased, people spent less on luxury foods such as meat and bought more potatoes to sustain their diet. So as the price of potatoes increased, so did the demand, a complete reversal of the law of demand.
Veblen Goods
The second exception to the law of demand is the concept of Veblen goods. Veblen Goods is a
concept that is named after the economist Thorstein Veblen, who introduced the theory of
“conspicuous consumption“. According to Veblen, there are certain goods that become more
valuable as their price increases. If a product is expensive, then its value and utility are perceived to
be more, and hence the demand for that product increases.
And this happens mostly with precious metals and stones such as gold and diamonds and luxury cars
such as Rolls-Royce. As the price of these goods increases, their demand also increases because
these products then become a status symbol.
In addition to Giffen and Veblen goods, another exception to the law of demand is the expectation of price change. When the price of a product is rising and market conditions suggest it may rise further, consumers may buy more of the product before the price increases any further. Conversely, when the price drops or is expected to drop further, consumers might postpone purchases to avail the benefit of a lower price.
For instance, in recent times, the price of onions had increased to quite an extent. Consumers
started buying and storing more onions fearing further price rise, which resulted in increased
demand.
There are also times when consumers may buy and store commodities due to a fear of shortage.
Therefore, even if the price of a product increases, its associated demand may also increase as the
product may be taken off the shelf or it might cease to exist in the market.
Another exception to the law of demand is necessary or basic goods. People will continue to buy
necessities such as medicines or basic staples such as sugar or salt even if the price increases. The
prices of these products do not affect their associated demand.
Change in Income
Sometimes the demand for a product may change according to the change in income. If a
household’s income increases, they may purchase more products irrespective of the increase in their
price, thereby increasing the demand for the product. Similarly, they might postpone buying a
product even if its price reduces if their income has reduced. Hence, change in a consumer’s income
pattern may also be an exception to the law of demand.
There is no easy or simple formula to forecast the demand. Proper judgment along with the scientific
formula is needed to correctly predict the future demand for a product or service. Some methods of
demand forecasting are discussed below:
I Survey Methods
1] Survey of Buyers' Intentions
When demand needs to be forecasted for the short run, say a year, the most feasible method is to ask customers directly what they intend to buy in the forthcoming time period. Thus, under this method, potential customers are directly interviewed. This survey can be done in any of the following ways:
a. Complete Enumeration Method: Under this method, nearly all the potential buyers are
asked about their future purchase plans.
b. Sample Survey Method: Under this method, a sample of potential buyers are chosen
scientifically and only those chosen are interviewed.
c. End-use Method: It is especially used for forecasting the demand of the inputs. Under this
method, the final users i.e. the consuming industries and other sectors are identified. The
desirable norms of consumption of the product are fixed, the targeted output levels are
estimated and these norms are applied to forecast the future demand of the inputs.
Hence, it can be said that under this method the burden of demand forecasting is on the buyer.
However, the judgments of the buyers are not completely reliable and so the seller should take
decisions in the light of his judgment also.
Customers may misjudge their demands and may also change their decisions in the future, which in turn may mislead the survey. This method is suitable when goods are supplied in bulk to industries, but not in the case of household customers.
2] Collective Opinion Method
Under this method, the salesperson of a firm predicts the estimated future sales in their region. The
individual estimates are aggregated to calculate the total estimated future sales. These estimates are
reviewed in the light of factors like future changes in the selling price, product designs, changes in
competition, advertisement campaigns, the purchasing power of the consumers, employment
opportunities, population, etc.
The principle underlying this method is that as the salesmen are closest to the consumers they are
more likely to understand the changes in their needs and demands. They can also easily find out the
reasons behind the change in their tastes.
Therefore, a firm having good sales personnel can utilize their experience to predict the demands.
Hence, this method is also known as Salesforce opinion or Grassroots approach method. However,
this method depends on the personal opinions of the sales personnel and is not purely scientific.
3] Barometric Method
This method is based on the past demands of the product and tries to project the past into the
future. The economic indicators are used to predict the future trends of the business. Based on
future trends, the demand for the product is forecasted. An index of economic indicators is formed.
There are three types of economic indicators, viz. leading indicators, lagging indicators, and
coincidental indicators.
The leading indicators are those that move up or down ahead of some other series. The lagging
indicators are those that follow a change after some time lag. The coincidental indicators are those
that move up and down simultaneously with the level of economic activities.
4] Market Experiment Method
Another one of the methods of demand forecasting is the market experiment method. Under this method, demand is forecasted by conducting market studies and experiments on consumer behaviour under actual, but controlled, market conditions.
Certain determinants of demand that can be varied are changed while other factors are kept constant. However, this method is very expensive and time-consuming.
5] Expert Opinion Method
Usually, market experts have explicit knowledge about the factors affecting demand. Their opinion
can help in demand forecasting. The Delphi technique, developed by Olaf Helmer is one such
method.
Under this method, experts are given a series of carefully designed questionnaires and are asked to forecast the demand. They are also required to give suitable reasons for their forecasts. The responses are shared among the experts over successive rounds to arrive at a conclusion. This is a fast and cheap technique.
II Statistical Methods
Statistical forecasting models, also known as quantitative forecasting models, use business statistics to establish relationships and correlations in data. This method can help a business determine how its operations compare with those of businesses in a similar sector or market. You can also use it to assess benchmarks, profitability, and growth rates. Statistical forecasting methods include straight-line, moving average, simple linear regression, and multiple linear regression. They help you determine levels and repetitions and compare one or more independent variables with a dependent variable to measure their effects on one another.
The statistical method is one of the important methods of demand forecasting. Statistical methods
are scientific, reliable and free from biases. The major statistical methods used for demand
forecasting are:
a. Trend Projection Method: This method is useful where the organization has a sufficient amount of accumulated past sales data. This data is arranged chronologically to obtain a time series. The time series depicts the past trend, and on the basis of it the future market trend can be predicted, assuming that the past trend will continue. On the basis of the projected trend, the demand for a product or service is forecasted.
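A trend projection can be sketched as an ordinary least-squares trend line fitted to past sales and extrapolated one period ahead. The sales figures below are hypothetical:

```python
# Trend projection sketch: fit a linear trend to past sales (a time series)
# and extrapolate it one period ahead. Sales figures are hypothetical.

sales = [120, 132, 141, 155, 162, 170]          # chronological past data
n = len(sales)
t = list(range(1, n + 1))                       # time index 1..n

# Least-squares slope and intercept for sales = a + b*t.
t_mean = sum(t) / n
s_mean = sum(sales) / n
b = sum((ti - t_mean) * (si - s_mean) for ti, si in zip(t, sales)) \
    / sum((ti - t_mean) ** 2 for ti in t)
a = s_mean - b * t_mean

forecast_next = a + b * (n + 1)                 # project the trend forward
print(round(forecast_next, 1))
```

Because the method assumes the past trend continues, the forecast simply extends the fitted line to the next time index.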
b. Regression Analysis: This method establishes a relationship between the dependent variable and the independent variables. In our case, the quantity demanded is the dependent variable, and income, the price of the good, the price of related goods, the price of substitute goods, etc. are independent variables. The regression equation is derived assuming the relationship to be linear. Regression Equation: Y = a + bX, where Y is the forecasted demand for a product or service, X is the independent variable, a is the intercept, and b is the slope coefficient.
Multiple linear regression
Summary. Multiple linear regression refers to a statistical technique that uses two or more
independent variables to predict the outcome of a dependent variable. The technique enables
analysts to determine the variation of the model and the relative contribution of each independent
variable in the total variance.
Multiple linear regression (MLR), also known simply as multiple regression, is a statistical technique
that uses several explanatory variables to predict the outcome of a response variable. The goal of
multiple linear regression is to model the linear relationship between the explanatory (independent)
variables and response (dependent) variables. In essence, multiple regression is the extension of
ordinary least-squares (OLS) regression because it involves more than one explanatory variable.
Multiple regression is an extension of simple linear (OLS) regression, which uses just one explanatory variable.
MLR is used extensively in econometrics and financial inference.
Y = β0 + β1X + β2Z + β3K + ϵ
Where,
Y = dependent variable
X, Z, K = explanatory/independent variables
β0 = y-intercept (constant term)
β1, β2, β3 = slope coefficients for each explanatory variable
ϵ = the model’s error term (also known as the residuals)
Multiple regression is based on the assumption that there is a linear relationship between the dependent variable and each independent variable. It also assumes no major correlation among the independent variables.
A positive coefficient indicates that as the value of the independent variable increases, the mean of the dependent variable also tends to increase. A negative coefficient suggests that as the independent variable increases, the dependent variable tends to decrease.
In a multiple regression model, the constant represents the value that would be predicted for the dependent variable if all the independent variables were simultaneously equal to zero, a situation which may not be physically or economically meaningful.
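The MLR equation above can be fitted by ordinary least squares. A minimal sketch with NumPy, using hypothetical data for two explanatory variables:

```python
# Multiple linear regression sketch, Y = b0 + b1*X + b2*Z, fitted by
# ordinary least squares with NumPy. All figures are hypothetical.
import numpy as np

X = np.array([10, 12, 14, 16, 18], dtype=float)   # e.g. price
Z = np.array([50, 55, 53, 60, 64], dtype=float)   # e.g. income
Y = np.array([200, 210, 215, 230, 240], dtype=float)

# Design matrix with a column of ones for the intercept b0.
A = np.column_stack([np.ones_like(X), X, Z])
coeffs, *_ = np.linalg.lstsq(A, Y, rcond=None)
b0, b1, b2 = coeffs                     # intercept and slope coefficients

fitted = A @ coeffs
residuals = Y - fitted                  # the model's error term
print([round(float(c), 3) for c in coeffs])
```

With an intercept column in the design matrix, the OLS residuals sum to (numerically) zero, which is a quick sanity check on the fit.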
Logistic Regression
Logistic regression is a data analysis technique that uses mathematics to find the relationships
between two data factors. It then uses this relationship to predict the value of one of those factors
based on the other. The prediction usually has a finite number of outcomes, like yes or no. Logistic Regression is another statistical analysis method borrowed by Machine Learning. It is used when the dependent variable is dichotomous or binary, i.e. a variable with only two possible outputs, for example, whether a person will survive an accident, or whether a student will pass an exam.
The Differences Between Linear Regression and Logistic Regression: Linear Regression is used to handle regression problems, whereas Logistic Regression is used to handle classification problems. Linear regression provides a continuous output, but logistic regression provides discrete output.
Logistic regression is used to predict the categorical dependent variable. It's used when the
prediction is categorical, for example, yes or no, true or false, 0 or 1. For instance, insurance
companies decide whether or not to approve a new policy based on a driver's history, credit history
and other such factors.
Binary logistic regression: In this approach, the response or dependent variable is dichotomous in
nature—i.e. it has only two possible outcomes (e.g. 0 or 1). Some popular examples of its use include
predicting if an e-mail is spam or not spam or if a tumor is malignant or not malignant.
Logistic regression is much easier to implement than other methods, especially in the
context of machine learning: A machine learning model can be described as a mathematical
depiction of a real-world process. The process of setting up a machine learning model
requires training and testing the model. Training is the process of finding patterns in the
input data, so that the model can map a particular input (say, an image) to some kind of
output, like a label. Logistic regression is easier to train and implement as compared to other
methods.
Logistic regression works well for cases where the dataset is linearly separable: A dataset is said to be linearly separable if it is possible to draw a straight line that separates the two classes of data from each other. Logistic regression is used when your Y variable can take only two values, and if the data is linearly separable, it is efficient to classify it into two separate classes.
Logistic regression provides useful insights: Logistic regression not only gives a measure of how relevant an independent variable is (i.e. the coefficient size), but also tells us the direction of the relationship (positive or negative). Two variables are said to have a positive association when an increase in the value of one variable also increases the value of the other. For example, the more hours you spend training, the better you become at a particular sport. However, it is important to be aware that correlation does not necessarily indicate causation! In other words, logistic regression may show a positive correlation between outdoor temperature and sales, but this does not necessarily mean that sales are rising because of the temperature.
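A minimal illustration of binary logistic regression, trained by plain gradient descent on a tiny, linearly separable toy dataset (hours studied versus pass/fail; all figures are hypothetical):

```python
# Minimal logistic regression by gradient descent on a 1-D, linearly
# separable toy dataset: hours studied (X) vs pass/fail (Y).
import math

hours  = [1, 2, 3, 4, 6, 7, 8, 9]       # X: independent variable
passed = [0, 0, 0, 0, 1, 1, 1, 1]       # Y: binary (dichotomous) outcome

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

w, b = 0.0, 0.0                          # weight and intercept
lr = 0.1
for _ in range(5000):                    # gradient descent on the log-loss
    dw = db = 0.0
    for x, y in zip(hours, passed):
        err = sigmoid(w * x + b) - y     # prediction error for this point
        dw += err * x
        db += err
    w -= lr * dw / len(hours)
    b -= lr * db / len(hours)

def predict(x):
    """Predicted probability of passing for x hours studied."""
    return sigmoid(w * x + b)

print(round(predict(2), 2), round(predict(8), 2))   # low vs high probability
```

The positive weight on hours studied gives both the relevance of the variable (its size) and the direction of the association, as described above.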
To run the regression, arrange your data in columns. Click on the “Data” menu, and then choose the “Data Analysis” tab. You will now see a window listing the various statistical tests that Excel can perform. Scroll down to find the regression option and click “OK”.
Interpretation of R-square
R-Squared (R² or the coefficient of determination) is a statistical measure in a regression model that
determines the proportion of variance in the dependent variable that can be explained by the
independent variable. In other words, r-squared shows how well the data fit the regression model.
- If 0.3 < R² < 0.5, this is generally considered a weak or low effect size.
- If 0.5 < R² < 0.7, this is generally considered a moderate effect size.
- If R² > 0.7, this is generally considered a strong effect size.
Correlation and R-squared are two important measures in statistical analysis. Correlation measures
the strength of the relationship between two variables, while R-squared measures the amount of
variation in the data that is explained by the model.
Interpretation of p-value
You can also see the p-value, which indicates whether or not the test is statistically significant (i.e. if p < 0.05). In this example, the p-value is 0.00018, which indicates that the regression is statistically significant.