# Mathematics Time Series and Spectral Analysis Worksheet

## Description

- Double-check the attached example questions before making an offer; the worksheet questions are difficult.
- A grade of 80% or higher is needed.
- Answer the questions on paper or in Word.
- Notes and exercises will be provided.

Topics included:

1. Overview. Stationarity; outline of the Box-Jenkins approach through model identification, fitting, diagnostic checking, and forecasting. Mean, autocorrelation function, partial autocorrelation function.

2. Models. Autoregressive (AR) models, moving average (MA) models, ARMA models, their autocorrelation functions, and partial autocorrelation functions. Transformations and differencing to achieve stationarity, ARIMA models.

3. Estimation and diagnostics. Identifying possible models using the autocorrelation and partial autocorrelation functions. Estimation; outline of maximum likelihood, conditional and unconditional least squares approaches. Diagnostic checking, methods, and suggestions for possible model modification.

4. Forecasting. Minimum mean square error forecast and forecast error variance, confidence intervals for forecasts, updating forecasts, other forecasting procedures.

5. Seasonality, time series regression.

6. The frequency representation of a stationary time series.

7. The use of a periodogram to carry out harmonic analysis.
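The identification tools these topics lean on, the sample autocorrelation function and the periodogram, are only a few lines of numpy. A minimal sketch for practising with the definitions (the function names and the AR(1) example are my own, not part of the course notes):

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation r_k = c_k / c_0, with c_k the sample autocovariance."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    c0 = xc @ xc / n
    return np.array([1.0] + [(xc[:n - k] @ xc[k:]) / (n * c0)
                             for k in range(1, max_lag + 1)])

def periodogram(x):
    """Periodogram I(f_j) = |sum_t x_t e^{-2 pi i f_j t}|^2 / n at f_j = j/n."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = np.fft.rfft(x - x.mean())
    return np.abs(d) ** 2 / n

# Example: an AR(1) series with alpha = 0.7 has a geometrically decaying ACF.
rng = np.random.default_rng(0)
eps = rng.normal(size=5000)
x = np.empty(5000)
x[0] = eps[0]
for t in range(1, 5000):
    x[t] = 0.7 * x[t - 1] + eps[t]

acf = sample_acf(x, 3)   # roughly [1.0, 0.7, 0.49, 0.34]
pg = periodogram(x)
```

Conventions for the periodogram differ by constant factors between textbooks, so match whichever definition the provided notes use.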



Module Code: MATH580201
1. Let $\{X_t\}$ be a stochastic process defined for $t \in \mathbb{Z}$ by

$$X_t = \sum_{i=0}^{p} \alpha_i \varepsilon_{t-i} + \sum_{i=0}^{q} \beta_i \eta_{t-i},$$

where $\{\varepsilon_t\}$ and $\{\eta_t\}$ are mutually independent, normally distributed white noise processes with finite variances $\sigma_\varepsilon^2$ and $\sigma_\eta^2$ respectively. Assume that $p, q > 0$ and that $\alpha_p$ and $\beta_q$ are non-zero.
(a) Is the process $\{X_t\}$ strongly stationary, weakly stationary, or not stationary? Justify your answer.

(b) Find an expression for the autocovariance function of $\{X_t\}$ for general values of $p$, $q$, $\alpha_i$ and $\beta_i$. Now compute the numerical values of the autocorrelation function if $p = 1$, $q = 2$, $\alpha_0 = 1$, $\alpha_1 = 1/2$, $\beta_0 = 1$, $\beta_1 = 1/2$, $\beta_2 = 1/4$, and $\sigma_\varepsilon^2 = \sigma_\eta^2 = 2$.

(c) Show that if $p = q$ and $\alpha_i = \beta_i$ for $i = 0, \dots, q$, then $\{X_t\}$ is a moving average process, and write down a definition of this process.

(d) For the moving average process in (c), assuming $\sigma_\eta^2 = 0$, explain briefly what you understand by invertibility. Why is it a desirable property?

(e) Let $X_t = \varepsilon_t + \beta_1 \varepsilon_{t-1} + 0.5\,\varepsilon_{t-2}$. Give two values of $\beta_1$, one of which gives an invertible MA(2) process and one of which gives a non-invertible MA(2) process. Explain why your values of $\beta_1$ have the desired properties.

(f) Consider the moving average process $X_t = \varepsilon_t + \varepsilon_{t-1}$. By writing $\varepsilon_t$ in terms of $X_t, X_{t-1}, X_{t-2}, \dots$, verify that $X_t$ is not invertible.
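One way to sanity-check an answer to 1(b): by independence of the two noise sequences, the autocovariance of this linear process is $\gamma(k) = \sigma_\varepsilon^2\sum_i \alpha_i\alpha_{i+k} + \sigma_\eta^2\sum_i \beta_i\beta_{i+k}$. A numerical sketch (not a model answer) that evaluates this for the part (b) parameters and compares it with a simulated sample ACF:

```python
import numpy as np

# Part (b) parameters: p = 1, q = 2, both noise variances equal to 2.
alpha = np.array([1.0, 0.5])
beta = np.array([1.0, 0.5, 0.25])
s2_eps = s2_eta = 2.0

def gamma(k):
    """Theoretical autocovariance of X_t (zero for lags beyond max(p, q))."""
    a = s2_eps * (alpha[k:] @ alpha[:len(alpha) - k]) if k < len(alpha) else 0.0
    b = s2_eta * (beta[k:] @ beta[:len(beta) - k]) if k < len(beta) else 0.0
    return a + b

rho = [gamma(k) / gamma(0) for k in (1, 2, 3)]
# gamma(0) = 2 * 1.25 + 2 * 1.3125 = 5.125, gamma(1) = 2.25, gamma(2) = 0.5

# Monte Carlo check against the sample ACF of a long simulated realisation.
rng = np.random.default_rng(1)
n = 200_000
eps = rng.normal(0.0, np.sqrt(s2_eps), n + 2)
eta = rng.normal(0.0, np.sqrt(s2_eta), n + 2)
t = np.arange(2, n + 2)
x = (alpha[0] * eps[t] + alpha[1] * eps[t - 1]
     + beta[0] * eta[t] + beta[1] * eta[t - 1] + beta[2] * eta[t - 2])
xc = x - x.mean()
r = [(xc[:-k] @ xc[k:]) / (xc @ xc) for k in (1, 2, 3)]
```

The sample values `r` should sit within Monte Carlo error of the theoretical `rho`, and both cut off to zero beyond lag $q = 2$, as expected for a moving-average structure.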
2. (a) Define the partial autocorrelation coefficient $\alpha_{kk}$ of a stochastic process $\{X_t\}$. State the autocorrelations $\rho_1$ and $\rho_2$, and hence find $\alpha_{11}$ and $\alpha_{22}$, for an ARMA(1, 1) process $X_t = \alpha X_{t-1} + \varepsilon_t + \beta \varepsilon_{t-1}$ with $\varepsilon_t \sim (0, \sigma_\varepsilon^2)$, $|\alpha| < 1$ and $|\beta| < 1$. Use your results to comment on how $\rho_k$ and $\alpha_{kk}$ can be used in model selection.

(b) The following tables show the first ten sample autocorrelations and partial autocorrelations of $X_t$ and $Y_t = \nabla X_t$ for a series of $n = 750$ observations.

$X_t$: $\bar{x} = 0.043$, $s_x^2 = 6.104$.

| $k$ | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| $\hat{\rho}_k$ | 0.91 | 0.83 | 0.75 | 0.68 | 0.61 | 0.55 | 0.49 | 0.42 | 0.36 | 0.31 |
| $\hat{\alpha}_{kk}$ | 0.91 | 0.02 | 0.00 | 0.01 | 0.04 | 0.03 | 0.01 | 0.06 | 0.02 | 0.01 |

$Y_t = \nabla X_t$: $\bar{y} = 0.006$, $s_y^2 = 1.074$.

| $k$ | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| $\hat{\rho}_k$ | 0.03 | 0.04 | 0.05 | 0.02 | 0.00 | 0.02 | 0.01 | 0.02 | 0.04 | 0.04 |
| $\hat{\alpha}_{kk}$ | 0.03 | 0.04 | 0.06 | 0.01 | 0.01 | 0.03 | 0.01 | 0.02 | 0.04 | 0.04 |

Identify a model for this time series and obtain preliminary estimates for the parameters of your model.

(c) Let $e_t$ be the residuals after fitting a model to time series data. Sketch a correlogram of the $e_t$ that would indicate problems with the fitted model.

(d) Consider an ARIMA(1, 2, 0) process $X_t$: let $\nabla^2 X_t = Y_t$; then $Y_t = \alpha Y_{t-1} + \varepsilon_t$ is a stationary AR(1) process. Here we assume $\varepsilon_t \sim N(0, \sigma_\varepsilon^2)$ is white noise.

   i. What do you understand by the expression "$\nabla^2 X_t$"?
   ii. Derive $\operatorname{var}(\bar{Y})$, where $\bar{Y}$ is the sample mean of $Y_1, \dots, Y_n$.
   iii. For large samples, i.e. when $n$ is large, show that $\operatorname{var}(\bar{Y}) \approx \dfrac{\sigma_\varepsilon^2}{n}\cdot\dfrac{1}{1-\alpha^2}\cdot\dfrac{1+\alpha}{1-\alpha}$.
   iv. What implications does your result in 2(d)iii above have?

3. (a) Consider the first-order AR(1) time series $X_t = \alpha X_{t-1} + \varepsilon_t$ with independent $\varepsilon_t \sim N(0, \sigma_\varepsilon^2)$.

   i. Choose a proper non-zero real value for $\alpha$ and a proper non-zero real value for $\sigma_\varepsilon^2$.
   ii. Obtain the $l$-step-ahead forecast $X_n(l)$ for this model. Explain what happens to $X_n(l)$ as $l \to \infty$.
   iii. Find the correlation between the one-step-ahead forecast error $e_n(1)$ and the two-step-ahead forecast error $e_n(2)$.
   iv. What implications does your result in 3(a)iii above have for forecast errors?
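The large-sample result asked for in 2(d)iii simplifies, since $(1-\alpha^2)(1-\alpha) = (1+\alpha)(1-\alpha)^2$, to $\operatorname{var}(\bar{Y}) \approx \sigma_\varepsilon^2 / (n(1-\alpha)^2)$, which for $\alpha > 0$ is larger than the i.i.d. value $\sigma_\varepsilon^2/(n(1-\alpha^2))$. A Monte Carlo sketch of this (the parameter values $\alpha = 0.6$, $\sigma_\varepsilon^2 = 1$ are my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 4000, 500          # m independent replicates of a length-n AR(1) series
a, s2 = 0.6, 1.0          # alpha and white-noise variance (illustrative choices)

# Stationary initialisation so every replicate is exactly stationary.
y = np.empty((m, n))
y[:, 0] = rng.normal(0.0, np.sqrt(s2 / (1 - a**2)), m)
eps = rng.normal(0.0, np.sqrt(s2), (m, n))
for t in range(1, n):
    y[:, t] = a * y[:, t - 1] + eps[:, t]

empirical = y.mean(axis=1).var()       # variance of the sample mean across replicates
theory = s2 / (n * (1 - a) ** 2)       # large-sample approximation, = 0.0125 here
iid = (s2 / (1 - a**2)) / n            # what independence would wrongly predict
```

Here `theory` is roughly four times `iid`, the factor $(1+\alpha)/(1-\alpha)$: positive autocorrelation makes the sample mean much more variable than an i.i.d. calculation suggests, which is one implication 2(d)iv is after.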
(b) For a time series $X_1, \dots, X_{100}$ with values $X_1 = 98$, $X_2 = 101$, $X_{99} = 105$, $X_{100} = 102$, the following model is regarded as a good fit:

$$X_t - 100 = 0.8(X_{t-1} - 100) + \varepsilon_t + 0.4\,\varepsilon_{t-1},$$

where $\varepsilon_t$ is white noise and its estimated variance is 6.

   i. Pick a suitable non-zero value for $X_0$ and $\varepsilon_0$ to use in the subsequent parts. Derive the residuals at $t = 1$ and $t = 2$.
   ii. Pick suitable non-zero values for the residuals at $t = 99$ and $t = 100$ to use in the subsequent parts. Obtain the corresponding forecast values $X_{100}(1)$ and $X_{100}(2)$.
   iii. Derive the variance of your one-step-ahead forecast error $e_{100}(1)$ and two-step-ahead forecast error $e_{100}(2)$.
   iv. Calculate the 95% confidence intervals for $X_{101}$ and $X_{102}$ based on the results you have derived so far.

4. (a) Consider modelling a time series in terms of periodic components of the form $A\cos(2\pi f t) + B\sin(2\pi f t)$, where $t$ denotes time and $f$ denotes frequency. Assume that observations $X_k$ are taken at times $t_k = k\tau$, $k = 0, \dots, n-1$.

   i. Explain the concept of aliasing and show that the pair of frequencies $\{f, f + 1/\tau\}$ is aliased.
   ii. Deduce the greatest frequency $f_\tau$ which can be distinguished from these data.
   iii. Assuming $\tau = 1$, determine whether $f = 1$ and $f = 0.1$ can be distinguished or not.

(b) Let $X_0, \dots, X_3$ be the last four digits of your student ID.

   i. Calculate the DFT of $X_0, \dots, X_3$.
   ii. Derive the corresponding inverse DFT at time point 1.
   iii. Calculate the corresponding spectral density $I(f_j)$ and explain what $I(f_j)$ measures.
   iv. Draw the corresponding periodogram and comment.

(c) How can you use results in spectral analysis to determine whether a time series is white noise or not? Derive the expected value of the spectral density of an ARMA($p$, $q$) process. You may use without proof standard results on the expected spectral density of white noise, but you should state carefully any results you use.
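For 4(b), a four-point DFT is small enough to compute by hand, and `numpy.fft` can check the arithmetic. A sketch using the illustrative digits 1, 2, 3, 4 in place of a real student ID (periodogram conventions differ by constant factors between texts, so match the course's definition of $I(f_j)$):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])   # stand-in for the last four ID digits
n = len(x)

d = np.fft.fft(x)                    # d_j = sum_t x_t e^{-2 pi i j t / n}
# d = [10, -2+2j, -2, -2-2j]

# The inverse DFT evaluated at time point 1 recovers x_1:
x1 = np.sum(d * np.exp(2j * np.pi * np.arange(n) * 1 / n)) / n
# x1 = 2 (up to rounding)

# One common periodogram convention: I(f_j) = |d_j|^2 / n at f_j = j/n.
I = np.abs(d) ** 2 / n
```

Note $I(f_0)$ is dominated by the mean of the digits; the remaining ordinates measure how much variation sits at each Fourier frequency.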
(d) Sketch the spectral density function for each of the following time series, explaining what features are present and why. In each case, assume $t = 0, 1, \dots, n-1$ for $n = 200$.

   i. $X_t = \sin(40\pi t/n)\cos(40\pi t/n) + \cos(60\pi t/n) + 2\sin(80\pi t/n)$
   ii. $Y_t = \cos(23\pi t/n) + \cos(160\pi t/n)$
   iii. The time series $Z_t$ given in Figure 1 below.

[Figure 1: plot of $Z_t$ against Time, $t = 0, \dots, 200$.]

(e) In Figure 2, which smoothed periodogram corresponds to which time series? Explain your reasoning.

[Figure 2: six time series (a)-(f) plotted against Time, 0 to 300, alongside six smoothed periodograms (i)-(vi) plotted against frequency, 0 to 0.5.]
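For 4(d)i, the identity $\sin(40\pi t/n)\cos(40\pi t/n) = \tfrac12\sin(80\pi t/n)$ reduces the series to $\cos(60\pi t/n) + \tfrac52\sin(80\pi t/n)$, so the spectrum should spike at the Fourier frequencies $j/n$ with $j = 30$ and $j = 40$ and be essentially zero elsewhere. A quick numerical check of the spike locations:

```python
import numpy as np

n = 200
t = np.arange(n)
x = (np.sin(40 * np.pi * t / n) * np.cos(40 * np.pi * t / n)
     + np.cos(60 * np.pi * t / n)
     + 2 * np.sin(80 * np.pi * t / n))

# Periodogram at Fourier frequencies f_j = j/n, j = 0..n/2 (one common convention).
I = np.abs(np.fft.rfft(x)) ** 2 / n

peaks = np.argsort(I)[-2:]   # indices of the two largest ordinates
# Expect j = 30 (amplitude 1) and j = 40 (combined amplitude 5/2)
```

Because both frequencies are exact Fourier frequencies for $n = 200$, there is no leakage: every other ordinate is numerically zero, and the $j = 40$ spike is the larger of the two.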
