Module Code: MATH380202
3. (a) Let {ε_t} be a white noise process with variance σ².
Define an ARMA(p, q) process {X_t} in terms of {ε_t} and state (without proof) conditions for {X_t} to be (i) weakly stationary and (ii) invertible.
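For reference, one standard formulation (the symbols φ_i, θ_j and the backshift operator B, with B X_t = X_{t−1}, are the usual conventions rather than anything fixed by the question) is

    X_t = φ_1 X_{t−1} + … + φ_p X_{t−p} + ε_t + θ_1 ε_{t−1} + … + θ_q ε_{t−q},

equivalently φ(B) X_t = θ(B) ε_t with φ(z) = 1 − φ_1 z − … − φ_p z^p and θ(z) = 1 + θ_1 z + … + θ_q z^q. The process is weakly stationary when every root of φ(z) = 0 lies outside the unit circle, and invertible when every root of θ(z) = 0 lies outside the unit circle.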
Define what is meant by an ARIMA(p, d, q) process. Let {Y_t} be such an ARIMA(p, d, q) process and show how it can also be represented as an ARMA process, giving the AR and MA orders of this representation.
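A sketch of the usual argument for the last part (assuming the φ(B), θ(B) notation above and writing ∇ = 1 − B): {Y_t} is ARIMA(p, d, q) when W_t = ∇^d Y_t is a stationary ARMA(p, q) process, i.e.

    φ(B)(1 − B)^d Y_t = θ(B) ε_t.

The combined AR operator φ(B)(1 − B)^d is a polynomial in B of degree p + d, while θ(B) still has degree q, so {Y_t} satisfies an ARMA(p + d, q) equation (its AR polynomial has d unit roots, so this representation is not a stationary ARMA model).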
(b) The following tables show the first nine sample autocorrelations and partial autocorrelations of X_t and of Y_t = ∇X_t for a series of n = 1095 observations. (Notice that the notation in this part has no relationship with the notation in part (a) of this question.)
Identify a model for this time series and obtain preliminary estimates for the pa-
rameters of your model.
X_t: x̄ = 15.51, s² = 317.43.

    k      1      2      3      4      5      6      7      8      9
    ρ_k    0.981  0.974  0.968  0.963  0.957  0.951  0.943  0.935  0.927
    α_kk   0.981  0.327  0.121  0.104  0.000  0.014  -0.067 -0.068 -0.012

Y_t = ∇X_t: ȳ = 0.03, s² = 11.48.

    k      1 …
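For context, tables like these are straightforward to reproduce numerically. The sketch below is illustrative only: the series x is placeholder data (the 1095 observations are not supplied here), and the acf and pacf helpers are the ones from statsmodels.

    import numpy as np
    from statsmodels.tsa.stattools import acf, pacf

    rng = np.random.default_rng(0)
    x = rng.normal(size=1095).cumsum()    # placeholder series standing in for X_t
    y = np.diff(x)                        # Y_t = ∇X_t = X_t - X_{t-1}

    print(x.mean(), x.var(ddof=1))        # sample mean and variance (analogues of x̄ and s² above)
    print(acf(x, nlags=9)[1:])            # sample autocorrelations ρ_1, …, ρ_9
    print(pacf(x, nlags=9)[1:])           # sample partial autocorrelations α_11, …, α_99
    print(y.mean(), y.var(ddof=1))        # summary statistics of the differenced series
    print(acf(y, nlags=9)[1:], pacf(y, nlags=9)[1:])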
Let G be a graph with n ≥ 2 vertices x_1, x_2, …, x_n, and let A be the adjacency matrix of G. Prove that if G is connected, then every entry of the matrix A^{n−1} + A^n is positive.
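A sketch of one standard argument (walk counting): (A^k)_{ij} counts walks of length k from x_i to x_j, and all entries of powers of A are nonnegative. Since G is connected on n vertices, d(x_i, x_j) ≤ n − 1 for every pair i, j, and any walk can be lengthened by 2 by stepping to a neighbour of its endpoint and back (every vertex has a neighbour because G is connected and n ≥ 2). Starting from a shortest walk of length ℓ = d(x_i, x_j) ≤ n − 1, there are therefore walks of every length ℓ, ℓ + 2, ℓ + 4, …; exactly one of n − 1 − ℓ and n − ℓ is even, and both are nonnegative, so there is a walk of length n − 1 or of length n from x_i to x_j. Hence (A^{n−1})_{ij} > 0 or (A^n)_{ij} > 0, and so (A^{n−1} + A^n)_{ij} > 0.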
Module Code: MATH380202
1. (a) Define the terms "strongly stationary" and "weakly stationary".
Let {X_t} be a stochastic process defined for all t ∈ ℤ. Assuming that {X_t} is weakly stationary, define the autocorrelation function (acf) ρ_k for lag k.
What conditions must a process {X_t} satisfy for it to be white noise?
(b) Let ε_t ~ N(0, 1) for t ∈ ℤ, with the {ε_t} being mutually independent. Which of the following processes {X_t} are weakly stationary for t > 0? Briefly justify your answers.
i. X_t = ε_t for all t > 0.
ii. X_0 ~ N(0, ·) and X_t = 2X_{t−1} + ε_t for t > 0.
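A quick numerical illustration for (ii), as a sketch only: the recursion and the N(0, 1) noise come from the question, while the replication count, time horizon and the choice to start at X_0 = 0 are assumptions made for the simulation.

    import numpy as np

    rng = np.random.default_rng(1)
    reps, T = 10_000, 20                       # replications and horizon (arbitrary choices)
    x = np.zeros(reps)                         # assumption for the sketch: start at X_0 = 0
    for t in range(1, T + 1):
        x = 2 * x + rng.normal(size=reps)      # X_t = 2 X_{t-1} + eps_t, eps_t ~ N(0, 1)
        print(t, round(float(x.var()), 1))     # sample Var(X_t) grows roughly like (4**t - 1) / 3

The variance grows without bound, which is why this process cannot be weakly stationary.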
(c) Provide an expression for estimating the autocovariance function from a sample X_1, …, X_n believed to be from a weakly stationary process. How is the autocorrelation function ρ_k then estimated, and a correlogram (or acf plot) constructed?
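One common set of estimates (conventions vary slightly, e.g. divisor n versus n − k) is

    c_k = (1/n) Σ_{t=1}^{n−k} (x_t − x̄)(x_{t+k} − x̄),  k = 0, 1, 2, …,   and   r_k = c_k / c_0.

The correlogram then plots r_k against k, usually with guide lines at ±1.96/√n as an approximate pointwise band under a white-noise hypothesis.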
(d) Consider the weakly stationary stochastic process X_t = ε_t + ε_{t−1} + ε_{t−2}, where {ε_t} is a white noise process with variance 1. Compute the population autocorrelation function ρ_k for all k = 0, 1, ….
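As a check on the sort of answer expected (taking the process to be X_t = ε_t + ε_{t−1} + ε_{t−2} with σ² = 1, as written above):

    γ_0 = Var(X_t) = 3,  γ_1 = 2,  γ_2 = 1,  γ_k = 0 for k ≥ 3,

so ρ_0 = 1, ρ_1 = 2/3, ρ_2 = 1/3 and ρ_k = 0 for all k ≥ 3.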