Forecasting
Name
Institution
Course
Date
1.1 Definition of forecasting
Jupp (2006) defines forecasting as the process through which a forecast is derived. According to the author, forecasting is estimating the value or condition of a situation or investment in the future, for instance next year's inflation level or the anticipated demand for a product or service (Jupp, 2006). Forecasting is therefore predicting a situation based on past patterns and trends. All forecasts, and thereby forecasting, work on the assumption that a pattern in history repeats itself and will continue to do so. Additionally, Jupp (2006) states that forecasting also involves examining the cause-and-effect relationships that might exist between dependent and independent variables.
According to Jupp (2006), forecasting varies and can range from a simple to a complex process, depending on the question to be answered and the outcome expected. There exist various ways of deriving a forecast, and none is inherently better than another; the choice depends on the forecast needed. However, the forecasting process remains the same, and it is in essence based on the scientific school of inquiry. Jupp (2006) explains that the forecasting process commences with problem definition and a review of the available information on the problem. After this, a hypothesis is formulated and a research strategy is designed to address it. This is then followed by data collection, analysis of the results, and a decision on whether or not to take action.
1.2 Examples of general data patterns
General data patterns include those revealed by time series analysis, the study of a time-ordered sequence of observations taken at equally spaced time intervals. It forecasts by projecting patterns identified in recent time series observations. Time series patterns include seasonal, trend-based, and stationary patterns (David & Kenneth, 1999). Examples of time series include mean temperature, daily stock exchange prices, and monthly sales inventory.
1.3 Types of qualitative forecasting methods
Qualitative forecasting is subjective and is based on opinions and judgment. Judgmental forecasts are qualitative and nonmathematical. Qualitative forecasting can bias the forecast and reduce forecast accuracy, but it can incorporate the latest changes in the environment and inside information. When a firm chooses qualitative forecasting techniques, there are two main approaches. The first is compiling information from experts or company employees and then forming a consensus; the second is the life cycle method, which compares a new product to those already in the market. The most vital step is deciding which people will give their opinion on the matter at hand. The Delphi method is normally used in some qualitative forecasting techniques. In this method, the people comprising the panel are selected and asked questions on a significant business forecast (David & Kenneth, 1999). They are then shown the answers of the other panel members and discuss them, and a summary is later forwarded to the management of the firm. When developing a new product, a company needing a qualitative forecasting technique can choose the life cycle approach. The idea is to mirror existing products in terms of their life cycles: the growth, maturity, and decline periods of a comparable product serve as the model for the company's expectations for the new product.
1.4a. Naïve Approach
In the naïve approach, the previous period's actuals are used as the current period's forecast without adjusting them or establishing causal factors. The naïve model serves as a baseline for comparison with more sophisticated methods, since it assumes that the most recent period is the best predictor of the future. Naïve forecasts are the most cost-effective form of forecasting, and they provide a standard against which more complicated models can be compared. For stationary time series data, this approach states that the forecast for any period equals the previous period's actual value; for time series data that are stationary in first differences, the naïve forecast equals the previous value plus the most recent observed change.
Ft+1 = Yt
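As an illustration, the naïve rule can be sketched in a few lines of Python (the function name and demand figures below are hypothetical, not from the source):

```python
def naive_forecast(history):
    """Naive approach: the forecast for the next period (Ft+1)
    equals the last observed actual value (Yt)."""
    if not history:
        raise ValueError("history must contain at least one observation")
    return history[-1]

demand = [120, 135, 128, 142]  # hypothetical demand history
print(naive_forecast(demand))  # -> 142
```

Because it requires no parameters and no fitting, this rule is often the cheapest benchmark against which more elaborate models are judged.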
1.4b. Moving Average
In a moving average forecast, the forecast is calculated as the average of actual demands from a specified number of previous periods. Each new forecast drops the demand from the oldest period in the average and replaces it with the demand from the most recent period; thus the data used in the calculation move over time.
Simple moving average: At = (Dt + Dt-1 + Dt-2 + … + Dt-N+1) / N
Where N = total number of periods in the average
Forecast for period t+1: Ft+1 = At
Key Decision: N – the number of periods to be considered in the forecast
Tradeoff: a higher value of N gives lower responsiveness but greater smoothing; a lower value of N gives more responsiveness but less smoothing. The more periods (N) over which the moving average is calculated, the less susceptible the forecast is to random variations, but the less responsive it is to changes. A large value of N is appropriate if the underlying pattern of demand is stable; a smaller value of N is appropriate if the underlying pattern is changing or if it is important to identify short-term fluctuations.
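The averaging formula above can be sketched directly in Python (the demand figures are made up for illustration):

```python
def moving_average_forecast(history, n):
    """Forecast for period t+1 as the mean of the last n actual demands:
    Ft+1 = At = (Dt + Dt-1 + ... + Dt-N+1) / N."""
    if len(history) < n:
        raise ValueError("need at least n observations")
    return sum(history[-n:]) / n

demand = [100, 110, 120, 130]  # hypothetical demand history
print(moving_average_forecast(demand, 3))  # -> 120.0
```

Trying different values of n on the same history is a simple way to see the responsiveness/smoothing tradeoff described above.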
1.4c. Weighted Moving Averages
Weighted moving average is a moving average in which each historical demand is weighted differently. Thus:
Average: At = W1 Dt + W2 Dt-1 + W3 Dt-2 + … + WN Dt-N+1
Where by:
N = total number of periods in the average
Wt = weight applied to period t’s demand
Sum of all the weights = 1
Forecast: Ft+1 = At = forecast for period t+1
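A minimal Python sketch of the weighted moving average, following the definition above (weights listed most-recent-period first; the figures are hypothetical):

```python
def weighted_moving_average(history, weights):
    """Weighted moving average: At = W1*Dt + W2*Dt-1 + ... + WN*Dt-N+1,
    where the weights sum to 1 and are listed most-recent-period first."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    recent = history[-len(weights):][::-1]  # most recent demand first
    return sum(w * d for w, d in zip(weights, recent))

demand = [100, 110, 120, 130]  # hypothetical demand history
# 0.5*130 + 0.3*120 + 0.2*110 = 123.0
print(weighted_moving_average(demand, [0.5, 0.3, 0.2]))
```

Giving the most recent period the largest weight makes the forecast more responsive than a simple moving average with the same N.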
1.4d. Exponential Smoothing
Exponential smoothing is very popular as a forecasting method for a variety of time series statistics. It was introduced by Brown and Holt, who used the technique to forecast the demand for spare parts. Holt developed exponential smoothing models for processes with linear trends, regular processes, and seasonal data (David & Kenneth, 1999).
Simple exponential smoothing
A pragmatic and simple model treats each observation of a time series as made up of a constant (b) and an error component (epsilon), thus: Xt = b + εt. The constant b is relatively stable within each segment of the series; however, it may change gradually over time. One way to isolate the true value of b, the systematic part of the series, is to compute a kind of moving average in which the current and immediately preceding observations are given greater weight than the respective older observations. Simple exponential smoothing achieves exactly this kind of weighting, with exponentially smaller weights assigned to older observations. The formula for simple exponential smoothing is:
St = α*Xt + (1 – α)*St-1
Regardless of the theoretical model of the process that generated the observed time series, simple exponential smoothing will often produce quite accurate forecasts. It is best used for one-period-ahead forecasting.
Exponential smoothing gives less weight to demand in earlier periods and more weight to demand in more recent periods:
Average: At = a Dt + (1 – a) At-1 = a Dt + (1 – a) Ft
Forecast for period t+1: Ft+1 = At
Where:
At-1 = “series average” calculated by the exponential smoothing model to period t-1
a = smoothing parameter between 0 and 1
The larger the smoothing parameter, the more weight is given to the most recent demand.
Exponential Smoothing:
This is the most commonly used time series method because it is easy to use and requires a minimal quantity of data. The method needs only three pieces of data to start: the last period's forecast (Ft), the last period's actual value (At), and a selected value of the smoothing coefficient, α, between 0 and 1.0.
When the old forecast is unavailable, use the naïve method or average the last few periods.
Higher values of α (e.g. 0.7 or 0.8) place more weight on the last period's actual value, and therefore on its random deviation.
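To make the recursion concrete, here is a minimal Python sketch of simple exponential smoothing; the seeding choice and demand figures are illustrative assumptions, not from the source:

```python
def exponential_smoothing(history, alpha, initial_forecast=None):
    """One-step-ahead forecasts via At = alpha*Dt + (1 - alpha)*At-1.
    Returns the per-period forecasts and the forecast for the next period."""
    if not 0 < alpha <= 1:
        raise ValueError("alpha must lie in (0, 1]")
    # When no old forecast exists, seed with the first actual (a common choice)
    forecast = history[0] if initial_forecast is None else initial_forecast
    forecasts = []
    for demand in history:
        forecasts.append(forecast)
        forecast = alpha * demand + (1 - alpha) * forecast
    return forecasts, forecast

demand = [100, 110, 120]  # hypothetical demand history
per_period, next_forecast = exponential_smoothing(demand, 0.5)
print(next_forecast)  # -> 112.5
```

Note how each new forecast blends the latest actual with the previous forecast, so older observations receive exponentially decaying weight.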
1.4e. Least Squares Method
This is a simple regression method.
The LSR method creates an equation describing a straight-line relationship between the passage of time and historical sales data. LSR fits a line to the chosen range of data so as to minimize the sum of the squares of the differences between the regression line and the actual sales data points. The forecast is a projection of this straight line into the future.
This method requires sales data history for the period represented by the number of periods of best fit plus the specified number of historical data periods. The minimum requirement is two historical data points. This method is useful for forecasting demand when a linear trend is present in the data.
Example: Least Squares Regression
Linear Regression, or Least Squares Regression (LSR), is the most popular method for identifying a linear trend in historical sales data. The method calculates the values for a and b to be used in the formula:
Y = a + b X
This equation describes a straight line, where Y represents sales and X represents time. Linear regression is slow to recognize turning points and step function shifts in demand. Linear regression fits a straight line to the data, even when the data is seasonal or better described by a curve. When sales history data follows a curve or has a strong seasonal pattern, forecast bias and systematic errors occur.
Forecast specifications: n equals the periods of sales history used in calculating the values for a and b. For example, specify n = 4 to use the history from September through December as the basis for the calculations. When data is available, a larger n (such as n = 24) would ordinarily be used. LSR defines a line for as few as two data points. For this example, a small value for n (n = 4) was chosen to reduce the manual calculations that are required to verify the results.
Minimum required sales history: n periods plus the number of time periods that are necessary for calculating the forecast performance.
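The a and b coefficients can be computed with the standard least squares formulas; a minimal Python sketch follows (the four-period sales history is hypothetical, standing in for the September through December figures mentioned above):

```python
def least_squares_trend(sales):
    """Fit Y = a + b*X by least squares, with time periods X = 1..n.
    Returns the intercept a and slope b of the fitted line."""
    n = len(sales)
    xs = range(1, n + 1)
    x_mean = sum(xs) / n
    y_mean = sum(sales) / n
    b = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, sales))
         / sum((x - x_mean) ** 2 for x in xs))
    a = y_mean - b * x_mean
    return a, b

history = [128, 117, 115, 125]  # hypothetical sales history, n = 4
a, b = least_squares_trend(history)
print(a + b * 5)  # projection of the fitted line into period 5
```

As the text warns, this projection is a straight line: if the underlying data are seasonal or curved, the fitted trend will carry a systematic bias.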
1.5a. Error (Bias)
A forecast is biased if it errs more in one direction than in the other; the method used tends to under-forecast or over-forecast the observations. The bias value is calculated as the average of the error values, where each error is the actual value minus the one-step-ahead forecast. The main disadvantage of this calculation is that negative and positive error values may cancel each other out, which is why this measure is not a good indicator of overall fit.
Error = Actual – Forecast
1.5b. Mean Absolute Deviation (MAD)
The mean absolute deviation value is calculated as the average absolute error value. When this value is 0 (zero), the fit (forecast) is perfect. MAD weighs all errors evenly.
Mean square error (MSE), mean absolute deviation (MAD), and the tracking signal are three useful measures of forecast error.
There are four factors to consider when selecting a model: the amount and type of data available, the degree of accuracy required, the length of the forecast horizon, and the patterns present in the data.
1.5c. Mean Absolute Percentage Error (MAPE)
A better measure of relative overall fit is the mean absolute percentage error. This measure is more interpretable than the mean squared error: knowing that the average forecast is "off" by ±5% is a useful result in itself, whereas a mean squared error is not instantly interpretable. Generally, MAPE weighs errors according to their relative size (David & Kenneth, 1999).
1.5d. Mean Square Error (MSE)
The sum of squared errors (SSE) and the mean squared error (MSE) are both based on the squared error values; the MSE is their average. The MSE is the most commonly used lack-of-fit indicator in statistical fitting procedures.
MSE weighs errors according to their squared value.
1.5e. Standard Error
It seems reasonable to express the lack of fit in terms of the relative deviation of the forecasts from the observed values, that is, relative to the magnitude of the observed values; the absolute errors may not be of as much interest as the relative errors in the forecasts. To evaluate the relative error, various indices have been proposed. The percentage error value is calculated as:
PEt = 100*(Xt – Ft)/Xt
where Xt is the observed value at time t and Ft is the forecast.
All the above evaluations depend on the actual value.
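The error measures discussed in this section can be computed together in one short Python sketch; the actual and forecast series below are hypothetical, chosen so that the +5 and -5 errors cancel and illustrate why bias alone is a poor indicator of fit:

```python
def forecast_errors(actual, forecast):
    """Bias, MAD, MSE, and MAPE for paired actual/forecast values
    (error = actual - forecast, as defined in the text)."""
    errors = [a - f for a, f in zip(actual, forecast)]
    n = len(errors)
    return {
        "bias": sum(errors) / n,                 # signed errors can cancel out
        "MAD": sum(abs(e) for e in errors) / n,  # weighs all errors evenly
        "MSE": sum(e ** 2 for e in errors) / n,  # weighs errors by their square
        "MAPE": 100 * sum(abs(e) / a for e, a in zip(errors, actual)) / n,
    }

# Errors are +5, -5, 0: bias is 0.0 even though the forecast is not perfect
metrics = forecast_errors([100, 110, 120], [95, 115, 120])
print(metrics["bias"], metrics["MAD"])
```

Comparing the bias of 0.0 with the nonzero MAD on the same data shows why MAD (or MSE, or MAPE) is preferred as an overall lack-of-fit measure.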
References
Jupp, V. (2006). The SAGE Dictionary of Social Research Methods. SAGE.
David, B., & Kenneth, T. (1999). Forecasting new product penetration with flexible substitution patterns. Journal of Econometrics, 89, 101–129. Retrieved from http://else.berkeley.edu/.