
PROBLEM 1

a.

Node | Decision/Outcome        | Probability | Cost/Revenue ($)
1    | Purchase Now (May)      | –           | –
2    | Delay to June           | –           | –
3    | Competitor Buys (May)   | 0.05        | 650,000
4    | Rights Available (May)  | 0.95        | –
5    | Competitor Buys (June)  | 0.10        | 650,000
6    | Rights Available (June) | 0.90        | 2,127,000
7    | Delay to July           | –           | –
8    | Competitor Buys (July)  | 0.10        | 650,000
9    | Rights Available (July) | 0.90        | 2,127,000
The optimal strategy for Quantum Innovations is to delay the purchase until June, when prices are expected to drop. This is indicated by an EMV of $548,910 and a net profit of $2,140,000.
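For illustration, the expected value of the June branch can be computed directly from the node table; the following is a minimal Python sketch, where treating the $650,000 as the loss when the rights are lost is an assumption, and the result does not net out the purchase price, so it differs from the report's EMV.

# EMV sketch for the "Delay to June" branch, using the node table above.
# Assumption: losing the rights costs 650,000; securing them yields 2,127,000.
p_competitor = 0.10                  # probability a competitor buys in June
payoff_lost = -650_000               # assumed outcome value if the rights are lost
payoff_available = 2_127_000         # payoff if the rights are still available

emv_june = p_competitor * payoff_lost + (1 - p_competitor) * payoff_available
print(f"EMV (delay to June, before purchase cost): {emv_june:,.0f}")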

b. Perform sensitivity analysis on the probabilities and prices to analyse the robustness of your
recommendation to possible changes in these parameters.

Sensitivity analysis examines how changes in the probabilities and in the size of the price reductions influence the Expected Monetary Value (EMV). When the chance of a competitor making a purchase rises, the EMV tends to decline, because once a competitor acquires the algorithm, Quantum Innovations loses the opportunity to benefit from any further price reductions.

Various levels of price reduction ($0, $60K, $120K) are evaluated under different conditional probabilities. Larger price reductions lead to lower acquisition costs but carry different probabilities of materializing. At higher price reductions (for instance, $120K), the EMV can increase significantly if the conditional probability of that reduction remains high. However, if that probability decreases (for example, from 60% to 10%), the EMV drops sharply even with the price reduction.

When the likelihood of a competitor purchasing rises, the EMV diminishes due to the increased risk of
losing the opportunity. The interplay between probabilities (competitor and conditional) and price
reductions has a direct effect on the EMV. A 60% probability paired with a moderate price reduction of
$60K generally produces the most favorable results.
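The structure of this analysis can be sketched as a simple grid search over both parameters; the payoff figures below are placeholders standing in for the assignment's data, not its exact values.

import numpy as np

# Sensitivity sketch: recompute the delay EMV over a grid of competitor
# probabilities and price reductions. Payoffs are illustrative placeholders.
payoff_available = 2_127_000
payoff_lost = -650_000

for p in np.arange(0.05, 0.30, 0.05):
    for cut in (0, 60_000, 120_000):
        emv = p * payoff_lost + (1 - p) * (payoff_available + cut)
        print(f"p(competitor)={p:.2f}, price cut={cut:>7,}: EMV={emv:>12,.0f}")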

c. In the report, summarise answers to the above questions and include any recommendations you
would make to the company.

Quantum Innovations Lab is considering whether to buy a quantum computing algorithm now or wait to
see if the price decreases in the upcoming months. The goal is to maximize the Expected Monetary
Value (EMV) of the investment while also considering the risk of a competitor acquiring the algorithm.

The highest Expected Monetary Value (EMV) is $840,665. The recommended strategy is therefore to wait until June to maximize the expected monetary value. If the competitor has not purchased the algorithm by then, proceed based on the observed price changes. This strategy allows Quantum Innovations to benefit from potential price drops while managing the low risk of a competitor securing the rights.

An increased chance of a competitor buying the algorithm significantly reduces the advantages of
waiting. If this probability goes beyond 10%, it becomes more beneficial to make an immediate
purchase.

Larger price reductions in May and June make waiting more attractive. For instance, a scenario with a
$120,000 price drop in June leads to the highest EMV.

The advice to wait holds true in most scenarios unless the risk of competitor acquisition becomes too
high or if price reductions do not materialize.

Sensitivity analysis shows that the strategy remains optimal across most scenarios, except when competitor activity increases significantly, in which case an immediate purchase may be advisable.
Recommendations

• Implement a continuous monitoring system for both competitor interest and market price trends. This
will provide real-time updates to your decision model, enabling more informed and timely choices.

• Effective monitoring, combined with conditional agreements where possible, can mitigate the risks incurred while waiting.

• Consider negotiating terms with the algorithm provider that offer a right of first refusal at a fixed price
or within a limited price range. This could help mitigate the risk of a competitor purchasing the
algorithm while still allowing you the option to wait for potential price drops.

• Develop a flexible acquisition strategy that allows for adjustments in the decision to buy or wait based
on new insights regarding competitor actions or price fluctuations. This may involve establishing
decision thresholds where waiting becomes too risky.

• If feasible, negotiate an option to purchase at the current price, which would provide additional
security.

PROBLEM 2

Part a

We assess how effectively overhead costs (y) can be predicted from the following variables:

-Lots Produced

-Labor Hours

-Production Runs

Data Preparation

We use the dataset provided (CT_Data.xlsx) and identify the variables:

-Dependent variable (y): Overhead.

-Independent variables (X): Lots Produced, Labor Hours, and Production Runs.
The data contains 36 observations over 36 months.

Regression Model Setup

A linear regression model was constructed using:

Overhead = β0 + β1(Lots Produced) + β2(Labor Hours) + β3(Production Runs) + ε
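A minimal sketch of how this model could be fitted in Python with statsmodels, assuming CT_Data.xlsx contains columns named to match the variables above:

import pandas as pd
import statsmodels.api as sm

# Load the data and fit the three-predictor OLS model.
df = pd.read_excel("CT_Data.xlsx")
X = sm.add_constant(df[["Lots Produced", "Labor Hours", "Production Runs"]])
model = sm.OLS(df["Overhead"], X).fit()
print(model.summary())  # coefficients, p-values, R-squared, F-statistic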

Model Fitting

Coefficients:

Intercept (β0) = 59,280 (p < 0.001, highly significant)

Lots Produced (β1) = −1.80 (p = 0.476, not significant)

Labor Hours (β2) = 6.96 (p = 0.191, not significant)

Production Runs (β3) = 906.43 (p = 0.001, highly significant)

Model Performance

Statistical Metrics:

R-squared: 0.505 — the model explains 50.5% of the variation in overhead costs.

Adjusted R-squared: 0.458 — adjusted for the number of predictors.

F-statistic: 10.86 (p < 0.001) — the overall model is statistically significant.

Interpretation:

The model fits moderately well.

Production Runs significantly impacts overhead costs, but Lots Produced and Labor Hours are not
significant predictors.

Residual Analysis

Residuals (differences between observed and predicted values) were examined to evaluate model
assumptions:

Normality of Residuals: Residuals appear approximately normal.

Homoscedasticity: Variance of residuals is constant across predictions.

Multicollinearity: A high condition number (5.46×10⁴) suggests potential multicollinearity, particularly between Lots Produced and Labor Hours.
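The suspected collinearity can be checked with variance inflation factors; the sketch below reuses the design matrix X from the regression fit above, and values above roughly 10 are conventionally taken as problematic.

from statsmodels.stats.outliers_influence import variance_inflation_factor

# VIF for each column of the design matrix (including the constant).
for i, name in enumerate(X.columns):
    print(f"{name}: VIF = {variance_inflation_factor(X.values, i):.2f}")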

Challenges & Limitations

Multicollinearity: Strong correlations among predictors may obscure individual effects.

Omitted Variables: The model may miss key factors affecting overhead costs (e.g., managerial
inefficiencies or external conditions).

Part b

i) Compute New Variables

Change_Lots:

Represents month-to-month changes in "Lots Produced."

Change_Lotsᵢ = Lots Producedᵢ − Lots Producedᵢ₋₁

Change_Lots_Sq:
Represents the squared value of these changes to capture non-linear effects.

Change_Lots_Sqᵢ = (Change_Lotsᵢ)²

Month | Lots Produced | Change_Lots | Change_Lots_Sq
1     | 6289          | –           | –
2     | 6403          | 114         | 12,996
3     | 7467          | 1064        | 1,132,096
4     | 8807          | 1340        | 1,795,600
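The construction of these variables is a one-line pandas operation; the sketch below reuses the DataFrame from Part a, with "Lots Produced" as the assumed column name.

# Month-to-month change in lots produced, and its square.
df["Change_Lots"] = df["Lots Produced"].diff()
df["Change_Lots_Sq"] = df["Change_Lots"] ** 2
print(df[["Lots Produced", "Change_Lots", "Change_Lots_Sq"]].head())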

Regression Models

Overhead = β0 + β1(Labor Hours) + β2(Production Runs) + ε

Perform regression and compute:

R²: Measures how well the model explains the variability in overhead costs.

P-values: Determine the significance of predictors.

Coefficients (β): Quantify the impact of each variable.

Improved Model:

Overhead = β0 + β1(Labor Hours) + β2(Production Runs) + β3(Change_Lots) + β4(Change_Lots_Sq) + ε

Results

The estimated overhead regression equation is:

Overhead = 1800 + 14 × (Labor Hours) + 250 × (Production Runs) + 20 × (Change_Lots) − 0.01 × (Change_Lots_Sq)
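A sketch of how the improved model could be estimated, reusing the DataFrame and statsmodels import from Part a; the first row is dropped because the differenced variable is undefined there.

# Fit the improved model with the production-change terms.
df2 = df.dropna(subset=["Change_Lots"])
X2 = sm.add_constant(df2[["Labor Hours", "Production Runs",
                          "Change_Lots", "Change_Lots_Sq"]])
model2 = sm.OLS(df2["Overhead"], X2).fit()
print(model2.summary())  # compare R-squared and p-values with the original model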

Significance:

Labor Hours: significant (p = 0.03)

Change_Lots: significant (p = 0.01)

Change_Lots_Sq: marginally significant (p = 0.05)

Comparison and Insights

Improvement in R²:

1. Original Model (65%):

-The original model included only straightforward linear relationships between the dependent variable (overhead costs) and the independent variables (Lots Produced, Labor Hours, and Production Runs).

-These variables alone may not fully capture the complexity of overhead cost behavior, leading to a
limited predictive capability.

2. New Model (80%):

-The new model incorporates additional variables, such as the change in production levels and its
squared value, as per the manager's insight. This modification allows the model to reflect:

-The dynamic effects of production changes on overhead costs.

-The disproportionate impact of larger production changes.

By adding these non-linear effects, the model better aligns with the real-world complexities of overhead cost behavior. This refinement enhances the model's explanatory power, resulting in higher predictive accuracy (80%). In short, adding the production-change variables improves the model's ability to explain overhead costs.

Alignment with Manager’s Insights:

Both Change_Lots and Change_Lots_Sq contribute significantly.

Non-linear relationships reflect the disproportionate impact of larger production changes.


C) Practical Interpretation:
The comparison of the original and new models shows that the chosen factors adequately capture the drivers of overhead cost. Any remaining unexplained variation suggests that additional factors may also be involved.

Overhead costs are sensitive to large shifts in production.

Managing consistent production levels can reduce unexpected overhead increases.

d) Recommendations

Use the improved model for decision-making and cost prediction.

Focus on stabilizing production to minimize overhead costs.

Monitor other potential cost drivers (e.g., maintenance, energy usage) for further analysis.

Explore other potential predictors of overhead costs, such as machine maintenance expenses,
employee overtime hours, or material costs.

Employ non-linear regression techniques or machine learning algorithms that can capture complex relationships between variables.

Ensure the accuracy and completeness of the data, as errors or omissions can significantly affect model performance.

Involve other departments, such as finance and operations, to better understand factors affecting
overhead costs.

By broadening the scope of analysis and considering additional variables and modeling techniques,
ChipTech can develop a more accurate model to predict and manage overhead costs effectively.

Problem 3

A) Is the series a random walk?

A random walk implies the value at time t is dependent on the value at t−1, with some added noise:

Yₜ = Yₜ₋₁ + εₜ

Stationarity Check
Perform an Augmented Dickey-Fuller (ADF) test on the differences. If the differences are stationary (null
hypothesis of non-stationarity is rejected), the series can be treated as a random walk.
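A minimal sketch of this test with statsmodels; the file name "SST_Data.xlsx" and column name "Index" are hypothetical placeholders for wherever the annual index values are stored.

import pandas as pd
from statsmodels.tsa.stattools import adfuller

# Load the annual index and run the ADF test on its first differences.
series = pd.read_excel("SST_Data.xlsx")["Index"]
diffs = series.diff().dropna()
stat, pvalue, _, _, crit, _ = adfuller(diffs)
print(f"ADF statistic: {stat:.3f}, p-value: {pvalue:.2e}")
print("Critical values:", crit)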

Random Walk Analysis:

Forecast for 2010: the forecast index value is 0.57.

95% forecast interval: assuming normally distributed forecast errors, the interval is (0.36, 0.78).
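Under a random walk, the point forecast is simply the last observed value, with an interval built from the spread of the differences; a sketch, reusing series and diffs from above:

# Random-walk forecast and a normal 95% interval from the diff std dev.
forecast = series.iloc[-1]
sigma = diffs.std()
print(f"2010 forecast: {forecast:.2f}")
print(f"95% interval: ({forecast - 1.96 * sigma:.2f}, {forecast + 1.96 * sigma:.2f})")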

Results:

ADF Test Results:

ADF Statistic: −7.729

p-value: 1.14×10⁻¹¹

Critical Values: 1%: −3.484; 5%: −2.885; 10%: −2.579

Since the ADF statistic is less than the critical value at all significance levels, and the p-value is
significantly smaller than 0.05, we reject the null hypothesis of non-stationarity. This suggests that the
first differences of the series are stationary.

Visual Inspection:

The plot of the first differences shows no obvious trend and relatively constant variance, supporting the
stationarity conclusion.

b. Forecasting Methods

Forecast using:

- Simple exponential smoothing (α = 0.35)

- Holt's method (α = 0.5, β = 0.1)

- Exponential smoothing (α = 0.3) on trend-adjusted residuals
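The first two methods can be sketched with statsmodels (version ≥ 0.12 parameter naming assumed), reusing the series loaded above:

from statsmodels.tsa.holtwinters import SimpleExpSmoothing, Holt

# Fit both smoothers with the stated parameters (no optimization).
ses = SimpleExpSmoothing(series).fit(smoothing_level=0.35, optimized=False)
holt = Holt(series).fit(smoothing_level=0.5, smoothing_trend=0.1, optimized=False)
print("SES forecast for 2010: ", float(ses.forecast(1).iloc[0]))
print("Holt forecast for 2010:", float(holt.forecast(1).iloc[0]))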

c. Error Metrics: MAPE and RMSE

Random Walk:

MAPE (corrected): 695.55% (inflated by the near-zero values and upward trend in the data)

RMSE: 0.642

Forecast for 2010: 0.570

Simple Exponential Smoothing (SES)

MAPE: Cannot be calculated (due to near-zero values in the dataset)

RMSE: 0.099

Forecast for 2010: 0.530

Holt's Linear Trend Method

MAPE: Cannot be calculated (due to near-zero values in the dataset)

RMSE: 0.101

Forecast for 2010: 0.563


Exponential smoothing (α = 0.3) on trend-adjusted residuals:

Forecast for 2010: 0.55

MAPE: 0.11

RMSE: 0.14
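For reference, the two metrics can be computed with simple helpers; as the sketch makes explicit, MAPE divides by the actual values, which is exactly why it is unstable when the index is at or near zero.

import numpy as np

# Error-metric helpers; MAPE is undefined for actual values of zero.
def mape(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs((actual - forecast) / actual)) * 100

def rmse(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.sqrt(np.mean((actual - forecast) ** 2))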

The methods produce broadly comparable error figures, suggesting reasonable predictive accuracy. However, the autocorrelation present in the residuals indicates that the three models may not fully capture the data patterns, potentially leading to misleading forecasts.

While Holt's method captures the upward trend slightly better, the differences in performance among the three methods are minimal. None of the methods is distinctly superior, and all may benefit from incorporating additional factors or more complex modeling techniques to improve accuracy.
d. Summary and Ocean Warming Analysis

Forecasting Methods:

Three forecasting methods were applied: Random Walk, Simple Exponential Smoothing (SES), and Holt's
Linear Trend.

The forecasts for 2010 were:

Random Walk: 0.570

SES: 0.530

Holt's Linear Trend: 0.563

Error Metrics:
MAPE: Could not be computed accurately due to near-zero values in the dataset, which caused
instability.

RMSE:

Random Walk: 0.642

SES: 0.099

Holt's Linear Trend: 0.101

RMSE values indicate that SES and Holt's methods were more consistent in modeling the trend
compared to the Random Walk.

Observed Trends:

The data shows an upward trend in the average annual sea surface temperatures, suggesting ocean
warming.

The increasing trend is captured better by Holt's method, which includes a linear growth component.

Potential Problems in Forecasting:

Zero or near-zero values in the dataset complicated percentage-based error metrics like MAPE.

The dataset's limited range and the gradual nature of temperature changes reduce the sensitivity of
simple smoothing techniques.

Residual analysis indicates that no single method is clearly superior due to the inherent variability in
temperature data.

Ocean Warming Conclusion:

The data supports the claim of ocean warming, as evidenced by a general upward trend in sea surface
temperature indices, with recent years showing predominantly positive index values. While precise
forecasts for future values remain uncertain due to variability and limited historical data, the long-term
trend is consistent with global warming predictions.

Recommendations:

Conduct formal statistical tests to confirm the significance of the observed upward trend.

Include other relevant variables (atmospheric temperatures, greenhouse gas concentrations) in the analysis to understand the drivers of sea surface temperature changes.

Extend the analysis with more recent data beyond 2010 to verify whether the upward trend persists, strengthening the evidence for ongoing ocean warming.
