BUAD 306 Chapter 3 – Forecasting

Everyday Forecasting
• Weather
• Time
• Traffic
• Other examples?
What is Forecasting?
Forecast: a statement about the future
• Used to help managers:
  – Plan the system
  – Plan the use of the system
Use of Forecasts
• Accounting – cost/profit estimates
• Finance – cash flow and funding
• Human Resources – hiring/recruiting/training
• Marketing – pricing, promotion, strategy
• MIS – IT/IS systems, services
• Operations – schedules, MRP, workloads
• Product/service design – new products and services
Forecasting Basics
• Assumes a causal system: past ⇒ future
• Forecasts are rarely perfect because of randomness
• Forecasts are more accurate for groups than for individual items
• Forecast accuracy decreases as the time horizon increases
Elements of a Good Forecast
• Timely – feasible horizon
• Reliable – works consistently
• Accurate – degree of accuracy should be stated
• Expressed in meaningful units
• Written – for consistency of usage
• Easy to use – KISS
Approaches to Forecasting
• Judgmental – subjective inputs
• Time series – historical data
• Associative – explanatory variables
Judgmental Forecasts
• Executive opinions – bias?
• Outside opinions – accuracy?
• Consumer surveys – guarantee?
• Sales force feedback – bias?
What would you rather evaluate?

Period    A    B    C
  1      30   18   46
  2      34   17   26
  3      32   19   27
  4      34   19   23
  5      35   22   22
  6      30   23   48
  7      34   23   29
  8      36   25   20
  9      29   24   14
 10      31   26   18
 11      35   27   47
 12      31   28   26
 13      37   29   27
 14      34   31   24
 15      33   33   22
 16       ?    ?    ?
Time Series Forecasts
• Based on observations over a period of time
• Identifies:
  – Trend – long-term movement in the data
  – Seasonality – short-term, regular variations
  – Cycles – longer, wavelike variations
  – Irregular variations – unusual events
  – Random variations – chance/residual
Forecast Variations
(figure: time-series plots illustrating trend, cycles, seasonal variations, and irregular/random variation)
Naïve Forecasting
• Simple to use
• Minimal to no cost
• Data analysis is almost nonexistent
• Easily understandable
• Cannot provide high accuracy
• Can serve as a standard for comparing accuracy
• RULE: "Whatever happened 'yesterday' is going to happen tomorrow, as long as I apply logic."
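As a minimal sketch (not from the slides; the function name and demand values are illustrative), the naïve rule in Python simply carries the last observation forward:

```python
# Naive forecast: the forecast for the next period is simply the
# most recently observed value.
def naive_forecast(history):
    return history[-1]

demand = [30, 34, 32, 34]          # illustrative actuals, periods 1-4
print(naive_forecast(demand))      # forecast for period 5
```

Because it needs no parameters or fitting, the naïve forecast is often used as the baseline that more elaborate techniques must beat.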
HW Problem 1

Day   Muffins  Buns  Cupcakes
 1       30     18      46
 2       34     17      26
 3       32     19      27
 4       34     19      23
 5       35     22      22
 6       30     23      48
 7       34     23      29
 8       36     25      20
 9       29     24      14
10       31     26      18
11       35     27      47
12       31     28      26
13       37     29      27
14       34     31      24
15       33     33      22
16        ?      ?       ?
Techniques for Averaging
• Moving average
• Weighted moving average
• Exponential smoothing
Simple Moving Average

MA_n = (Σ_{i=1}^{n} A_i) / n

Where:
  i   = index that corresponds to periods
  n   = number of periods (data points)
  A_i = actual value in time period i
  MA  = moving average
  F_t = forecast for time period t
Example 1: Moving Average

Period  1     2     3     4     5     6     7     8
Sales   3520  2860  4005  3740  4310  5001  4890  ?

• Four-period moving average for period 7:
• Four-period moving average for period 8:
• Four-period moving average for period 9, if the actual for period 8 = 5025:
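The three forecasts asked for above can be sketched in Python (a hedged example; the helper function name is mine, the data is from the slide):

```python
# 4-period simple moving average: average of the n most recent actuals.
def moving_average(data, n):
    return sum(data[-n:]) / n

sales = [3520, 2860, 4005, 3740, 4310, 5001]   # actuals, periods 1-6
f7 = moving_average(sales, 4)                  # forecast for period 7
sales.append(4890)                             # actual for period 7
f8 = moving_average(sales, 4)                  # forecast for period 8
sales.append(5025)                             # given actual for period 8
f9 = moving_average(sales, 4)                  # forecast for period 9
print(f7, f8, f9)  # 4264.0 4485.25 4806.5
```

Note how each new actual pushes the oldest value out of the four-period window before the next forecast is made.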
Weighted Moving Average
• Similar to a moving average, but assigns more weight to the most recent observations
• Total of the weights must equal 1
Example 2: Weighted Moving Average

Period  1     2     3     4     5     6     7     8
Sales   3520  2860  4005  3740  4310  5001  4890  ?

Compute a weighted moving average forecast for period 8 using the following weights: .40, .30, .20, and .10:
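A sketch of the weighted calculation (function name assumed; weights are listed most recent first, matching the slide's .40 on the latest period):

```python
# Weighted moving average: weights (most recent period first) must sum to 1.
def weighted_ma(data, weights):
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must total 1"
    recent = data[-len(weights):][::-1]    # most recent actual first
    return sum(w * a for w, a in zip(weights, recent))

sales = [3520, 2860, 4005, 3740, 4310, 5001, 4890]  # periods 1-7
f8 = weighted_ma(sales, [0.40, 0.30, 0.20, 0.10])
print(round(f8, 1))  # 4692.3
```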
Calculating Error
• Mathematically: e_t = A_t − F_t
Let's discuss examples on the board…
Premise – Exponential Smoothing
• The most recent observations might have the highest predictive value…
• And since all forecasts have error…
• We should give more weight to the error in the more recent time periods when forecasting.
Exponential Smoothing

F_t = F_{t−1} + α(A_{t−1} − F_{t−1})

Next forecast = previous forecast + α(actual − previous forecast),
where α is the smoothing constant.
About α
• α = smoothing constant selected by the forecaster
• It is a percentage of the forecast error
• The closer the value is to zero, the slower the forecast will be to adjust to forecast errors (greater smoothing)
• The closer the value is to 1.00, the greater the responsiveness to errors and the less smoothing
Example 3: Exponential Smoothing

F_t = F_{t−1} + α(A_{t−1} − F_{t−1})

Period  1     2     3     4     5     6     7     8
Sales   3520  2860  4005  3740  4310  5001  4890  ?

Assume a starting forecast of 4030 for period 3. Given the data above and α = .10, what would the forecast be for period 8?
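The repeated update can be sketched as a short loop (an illustrative helper, not from the slides; it starts from the assumed period-3 forecast of 4030):

```python
# Exponential smoothing: F_t = F_{t-1} + alpha * (A_{t-1} - F_{t-1}).
def exp_smooth(actuals, start_forecast, alpha):
    f = start_forecast
    for a in actuals:              # each actual updates the forecast once
        f = f + alpha * (a - f)
    return f

# Actuals for periods 3-7 carry the forecast forward to period 8.
f8 = exp_smooth([4005, 3740, 4310, 5001, 4890], 4030, 0.10)
print(round(f8, 2))  # 4203.29
```

With α = .10 the forecast adjusts slowly: even after five periods it still sits well below the recent actuals near 5000.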
HW #2 – Let's Discuss

Month  Feb  Mar  Apr  May  June  July  Aug
Sales   19   18   15   20    18    22   20
Techniques for Seasonality
• Seasonal variations – regularly repeating movements in series values that can be tied to recurring events
• Examples in life/around campus?
• Computing seasonal relatives: although we will discuss how relatives are created in class, you do not have to know this for the exam – just how to apply the relatives to a forecast.
Using Seasonal Relatives
• Allows you to incorporate seasonality or deseasonalize data
  – Seasonalize (×): factors seasonality into the trend forecast so that you can see peaks and valleys
  – Deseasonalize (÷): removes seasonal components to get a clearer picture of the underlying trend
• Value of each to business?
Example 4: Using Seasonal Relatives
A publisher wants to predict quarterly demand for a certain book for periods 12 and 15, which happen to fall in the 4th and 1st quarters of a particular year. The data series consists of both trend and seasonality. The trend portion of demand is projected using the equation y_t = 12,500 + 150.5t. Quarter relatives are Q1 = 1.3, Q2 = .8, Q3 = 1.4, Q4 = .9.
Use this information to predict demand for periods 12 and 15:
  – Calculate the trend values (plugging in the t value)
  – Incorporate the relatives to calculate seasonalized values
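The two steps above can be sketched directly (function name assumed; equation and relatives are from the slide):

```python
# Quarter relatives from the slide.
relatives = {1: 1.3, 2: 0.8, 3: 1.4, 4: 0.9}

def seasonalized(t, quarter):
    trend = 12500 + 150.5 * t       # step 1: trend value for period t
    return trend * relatives[quarter]  # step 2: multiply by the relative

print(round(seasonalized(12, 4), 2))  # period 12 falls in Q4
print(round(seasonalized(15, 1), 2))  # period 15 falls in Q1
```

Dividing by the relative instead of multiplying would deseasonalize an observed value, per the previous slide.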
HW #11 – Let's Discuss
The following equation summarizes the trend portion of quarterly sales of condos over a long cycle. Prepare a forecast for each quarter of next year and the first quarter of the following year.

F_t = 40 − 6.5t + 2t²

F_t = unit sales; t = 0 at Q1 of last year

Quarter   Relative
   1        1.1
   2        1.0
   3         .6
   4        1.3
Assoc. Forecasting Technique: Simple Linear Regression
• Predictor variables – used to predict values of the variable of interest
• Regression – technique for fitting a line to a set of points
• Least squares line – minimizes the sum of squared deviations around the line
Linear Regression Assumptions
• Variations around the line are random
  – No patterns are apparent
• Deviations around the line should be normally distributed
• Predictions are being made only within the range of observed values
• Should use a minimum of 20 observations for best results
Suppose you analyze the following data…
The regression line has the following equation:

y_c = a + bx

Where:
  y_c = predicted (dependent) variable
  x   = predictor (independent) variable
  b   = slope of the line
  a   = value of y_c when x = 0

b = [n Σxy − (Σx)(Σy)] / [n Σx² − (Σx)²]
a = (Σy − b Σx) / n
Example 5 – Linear Regression
Suppose that a manufacturing company made batches of a certain product. The accountant for the company wished to determine the cost of a batch of product given the following data:

Size of batch            20   30   40   50   70   80   100   120   150
Cost of batch ($1000s)  1.4  3.4  4.1  3.8  6.7  6.6  7.8  10.4  11.7

Question: which is the dependent (y) variable and which is the independent (x) variable?
We are now ready to determine the values of b and a:

b = [n Σxy − (Σx)(Σy)] / [n Σx² − (Σx)²]
  = [9(5264) − (660)(55.9)] / [9(63600) − (660)²]
  = (47376 − 36894) / (572400 − 435600)
  = 10482 / 136800 ≈ .0766

a = (Σy − b Σx) / n = (55.9 − .0766(660)) / 9 ≈ .594
Our linear regression equation:

y_c = a + bx = .594 + .0766x

What is the cost of a batch of 125 pieces?
y_c = .594 + .0766(125) ≈ 10.17, i.e., about $10,170
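Working the slide's numbers through in code (a sketch, not the course's required method; variable names are mine) reproduces the sums, the coefficients, and the 125-piece prediction:

```python
# Least-squares slope and intercept for the Example 5 batch-cost data.
# x = batch size (independent), y = batch cost in $1000s (dependent).
x = [20, 30, 40, 50, 70, 80, 100, 120, 150]
y = [1.4, 3.4, 4.1, 3.8, 6.7, 6.6, 7.8, 10.4, 11.7]

n = len(x)
sx, sy = sum(x), sum(y)                       # 660 and 55.9
sxy = sum(xi * yi for xi, yi in zip(x, y))    # 5264
sx2 = sum(xi ** 2 for xi in x)                # 63600

b = (n * sxy - sx * sy) / (n * sx2 - sx ** 2)
a = (sy - b * sx) / n
print(round(b, 4), round(a, 4))   # slope and intercept
print(round(a + b * 125, 2))      # predicted cost of a 125-piece batch
```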
Problem #7
Freight car loadings at a busy port are as follows:

Week  Loadings    Week  Loadings
  1      220       10      380
  2      245       11      420
  3      280       12      450
  4      275       13      460
  5      300       14      475
  6      310       15      500
  7      350       16      510
  8      360       17      525
  9      400       18      541
Problem #7

b = [n Σxy − (Σx)(Σy)] / [n Σx² − (Σx)²]
a = (Σy − b Σx) / n
Correlation (r)
• A measure of the relationship between two variables
  – Strength
  – Direction (positive or negative)
• Ranges from −1.00 to +1.00
  – Correlation close to 0 signifies a weak relationship – other variables may be at play
  – Correlation close to +1 or −1 signifies a strong relationship
Example 6: Continued

r = [n(Σxy) − (Σx)(Σy)] / [√(n Σx² − (Σx)²) × √(n Σy² − (Σy)²)]
  = [9(5264) − (660)(55.9)] / [√(9(63600) − (660)²) × √(9(439.11) − (55.9)²)]
  = (47376 − 36894) / (√136800 × √827.18)
  = 10482 / (369.86 × 28.76) ≈ .985
Coefficient of Determination (r²)
• How well a regression line "fits" the data
• Ranges from 0.00 to 1.00
• The closer to 1.00, the better the fit

r = .985, so r² = .985² ≈ .97
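As a sketch (not from the slides), the r and r² values for the Example 5 data can be checked in a few lines:

```python
# Correlation r and coefficient of determination r^2
# for the Example 5 batch-cost data.
import math

x = [20, 30, 40, 50, 70, 80, 100, 120, 150]
y = [1.4, 3.4, 4.1, 3.8, 6.7, 6.6, 7.8, 10.4, 11.7]

n = len(x)
sx, sy = sum(x), sum(y)
sxy = sum(xi * yi for xi, yi in zip(x, y))
sx2 = sum(xi * xi for xi in x)
sy2 = sum(yi * yi for yi in y)   # 439.11, as on the slide

r = (n * sxy - sx * sy) / (
    math.sqrt(n * sx2 - sx ** 2) * math.sqrt(n * sy2 - sy ** 2))
print(round(r, 3), round(r ** 2, 2))  # ~0.985 and ~0.97
```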
Forecast Accuracy
• Error – difference between the actual value and the predicted value
• Mean absolute deviation (MAD)
  – Average absolute error
• Mean squared error (MSE)
  – Average of squared errors
• Why can't we simply calculate the error for each observed period and then select the technique with the lowest error?
Error Example

Period  Actual  3-P MA    5-P MA  3-P WMA (.6,.3,.1)  Exp Sm (α=.2)  LR
  1       55                                                         55.69
  2       60                                           55.00         60.66
  3       75                                           56.00         65.63
  4       58    63.33333                 68.5          59.80         70.60
  5       80    64.33333                 63.3          59.44         75.57
  6       90    71.0      65.6           72.9          63.552        80.54
  7       70    76.0      72.6           83.8          68.8416       85.51
  8       92    80.0      74.6           77.0          69.07328      90.48
  9      100    84.0      78.0           85.2          73.65862      95.45
 10       ?     87.33333  86.4           94.6          78.9269      100.42

# errors:         6         4              6             8             9

Does the number of errors calculated impact the "accuracy" comparison?
Calculating Error
• Mathematically: e_t = A_t − F_t
What do the negative errors mean? How do they affect total error?
Calculating MAD and MSE

MAD = Σ|Actual − Forecast| / n

MSE = Σ(Actual − Forecast)² / (n − 1)
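Both measures can be sketched in a few lines (function names are mine; the data is the 3-period moving-average column of the error example, periods 4–9):

```python
# MAD and MSE over the periods that have both an actual and a forecast.
def mad(actuals, forecasts):
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / len(actuals)

def mse(actuals, forecasts):
    # The slides divide by n - 1 for MSE.
    n = len(actuals)
    return sum((a - f) ** 2 for a, f in zip(actuals, forecasts)) / (n - 1)

actual   = [58, 80, 90, 70, 92, 100]                 # periods 4-9
forecast = [63.33333, 64.33333, 71, 76, 80, 84]      # 3-period MA forecasts
print(round(mad(actual, forecast), 2), round(mse(actual, forecast), 2))
```

Because MSE squares each error, it penalizes one large miss more heavily than MAD does, so the two measures can rank techniques differently.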
Conclusions with MAD & MSE
• MAD and MSE can be used as comparison tools across several forecasting techniques.
• The forecasting technique that yields the lowest MAD and MSE is the preferred forecasting method.