Modern Stochastics: Theory and Applications

Fredholm representation of multiparameter Gaussian processes with applications to equivalence in law and series expansions✩
Volume 2, Issue 3 (2015): PRESTO-2015, pp. 287–295
Tommi Sottinen, Lauri Viitasaari 1

https://doi.org/10.15559/15-VMSTA39CNF
Pub. online: 2 October 2015. Type: Research Article. Open Access.

✩ The authors thank the referees for their useful comments.
1 Lauri Viitasaari was partially funded by Emil Aaltonen Foundation.

Received: 25 June 2015
Revised: 21 September 2015
Accepted: 21 September 2015
Published: 2 October 2015

Abstract

We show that every multiparameter Gaussian process with integrable variance function admits a Wiener integral representation of Fredholm type with respect to the Brownian sheet. The Fredholm kernel in the representation can be constructed as the unique symmetric square root of the covariance. We analyze the equivalence of multiparameter Gaussian processes by using the Fredholm representation and show how to construct series expansions for multiparameter Gaussian processes by using the Fredholm kernel.

1 Introduction

In this article, we consider multiparameter processes, that is, our time is multidimensional. Throughout the paper, the dimension of time $n\ge 1$ is arbitrary but fixed.
We use the following notation throughout this article: $\mathbf{t},\mathbf{s},\mathbf{u}\in {\mathbb{R}}^{n}$ are n-dimensional multiparameters of time: $\mathbf{t}=(t_{1},\dots ,t_{n})$, $\mathbf{s}=(s_{1},\dots ,s_{n})$, $\mathbf{u}=(u_{1},\dots ,u_{n})$; $\mathbf{0}$ is an n-dimensional vector of 0s, and $\mathbf{1}$ is an n-dimensional vector of 1s. We denote $\mathbf{s}\le \mathbf{t}$ if $s_{k}\le t_{k}$ for all $k\le n$. For $\mathbf{s}\le \mathbf{t}$, the set $[\mathbf{s},\mathbf{t}]\subset {\mathbb{R}}^{n}$ is the n-dimensional rectangle $\{\mathbf{u}\in {\mathbb{R}}^{n};\mathbf{s}\le \mathbf{u}\le \mathbf{t}\}$.
Let $X=(X_{\mathbf{t}})_{\mathbf{t}\in [\mathbf{0},\mathbf{1}]}$ be a real-valued centered Gaussian multiparameter process or field defined on some complete probability space $(\varOmega ,\mathcal{F},\mathbb{P})$. We assume that the Gaussian field X is separable, that is, its linear space, or the 1st chaos,
\[\mathcal{H}_{1}=\mathrm{cl}\hspace{0.1667em}\hspace{0.1667em}\big(\mathrm{span}\big\{X_{\mathbf{t}}\hspace{0.1667em};\hspace{0.1667em}\mathbf{t}\in [\mathbf{0},\mathbf{1}]\big\}\big)\]
is separable. Here $\mathrm{cl}$ means closure in ${L}^{2}(\varOmega ,\mathcal{F},\mathbb{P})$.
Our main result, Theorem 1, shows when the Gaussian field X can be represented in terms of the Brownian sheet. Recall that the Brownian sheet $W=(W_{\mathbf{t}})_{\mathbf{t}\in [\mathbf{0},\mathbf{1}]}$ is the centered Gaussian field with the covariance
\[\mathbb{E}[W_{\mathbf{t}}W_{\mathbf{s}}]=\prod \limits_{k=1}^{n}\min (t_{k},s_{k}).\]
The Brownian sheet can also be considered as the Gaussian white noise on $[\mathbf{0},\mathbf{1}]$ with the Lebesgue control measure. This means that $\mathrm{d}W$ is a random measure on $([\mathbf{0},\mathbf{1}],\mathcal{B}([\mathbf{0},\mathbf{1}]),\mathrm{Leb}([\mathbf{0},\mathbf{1}]))$ characterized by the following properties:
  • 1. $\int _{A}\mathrm{d}W_{\mathbf{t}}\sim \mathcal{N}(0,\mathrm{Leb}(A))$,
  • 2. $\int _{A}\mathrm{d}W_{\mathbf{t}}$ and $\int _{B}\mathrm{d}W_{\mathbf{s}}$ are independent if $A\cap B=\varnothing $.
If $f,g:[\mathbf{0},\mathbf{1}]\to \mathbb{R}$ are simple functions, then we have the Wiener–Itô isometry
(1)
\[\mathbb{E}\bigg[\int _{[\mathbf{0},\mathbf{1}]}f(\mathbf{t})\hspace{0.1667em}\mathrm{d}W_{\mathbf{t}}\int _{[\mathbf{0},\mathbf{1}]}g(\mathbf{s})\hspace{0.1667em}\mathrm{d}W_{\mathbf{s}}\bigg]=\int _{[\mathbf{0},\mathbf{1}]}f(\mathbf{t})g(\mathbf{t})\hspace{0.1667em}\mathrm{d}\mathbf{t}.\]
Consequently, the integral $\int _{[\mathbf{0},\mathbf{1}]}f(\mathbf{t})\hspace{0.1667em}\mathrm{d}W_{\mathbf{t}}$ can be extended for all $f\in {L}^{2}([\mathbf{0},\mathbf{1}])$ by using the isometry (1), and the isometry (1) will also hold for this extended integral.
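The two defining properties of the white noise $\mathrm{d}W$ are easy to check numerically. The following sketch (an illustration added here, not part of the original article; grid size and tolerances are arbitrary choices) builds the Brownian-sheet covariance $\prod _{k}\min (t_{k},s_{k})$ on a finite grid of the unit square (the case $n=2$) and verifies that it is a genuine covariance matrix:

```python
import numpy as np

# Covariance of the Brownian sheet, E[W_t W_s] = min(t1,s1) * min(t2,s2),
# evaluated on a finite grid of the unit square (the case n = 2).
m = 6  # grid resolution; an arbitrary choice for illustration
grid = [(i / m, j / m) for i in range(1, m + 1) for j in range(1, m + 1)]
N = len(grid)

C = np.empty((N, N))
for a, (t1, t2) in enumerate(grid):
    for b, (s1, s2) in enumerate(grid):
        C[a, b] = min(t1, s1) * min(t2, s2)

# A genuine covariance matrix must be symmetric and positive semidefinite.
assert np.allclose(C, C.T)
assert np.linalg.eigvalsh(C).min() > -1e-10
```

A discrete sample of the sheet on the grid can then be drawn as `L @ rng.standard_normal(N)`, where `L` is a Cholesky factor of `C`.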
In Section 2, Theorem 1, we prove the Fredholm representation for Gaussian fields satisfying the trace condition (3). In Section 3, we apply the Fredholm representation to give a representation for Gaussian fields that are equivalent in law, and in Section 4, we show how to generate series expansions for Gaussian fields by using the Fredholm representation. The Fredholm representation of Theorem 1 can also be used to provide a transfer principle that builds stochastic analysis and Malliavin calculus for Gaussian fields from the corresponding well-known theory for the Brownian sheet. We do not pursue this here, although it would be quite straightforward given the results for the one-dimensional case provided in [9].

2 Fredholm representation

Recall that X is a separable centered Gaussian field with covariance function R and W is a Brownian sheet. Suppose that X is defined on a complete probability space $(\varOmega ,\mathcal{F},\mathbb{P})$ that is rich enough to carry Brownian sheets.
The following theorem states that the field X can be realized as a Wiener integral with respect to a Brownian sheet. Note that it is not always possible to construct the Brownian sheet W directly from the field X: the trivial field $X\equiv 0$ already shows this. As a consequence, the Karhunen representation theorem (see, e.g., [2, Thm. 41]) cannot be applied here, and the Brownian sheet in representation (2) is not guaranteed to exist on the same probability space as X.
In any case, representation (2) holds in law. This means that for a given Brownian sheet W, the field given by (2) is a Gaussian field with the same law as X.
Theorem 1 (Fredholm representation).
Let $(\varOmega ,\mathcal{F},\mathbb{P})$ be a probability space such that $\sigma \{\xi _{k};k\in \mathbb{N}\}\subset \mathcal{F}$, where $\xi _{k}$, $k\in \mathbb{N}$, are i.i.d. standard normal random variables. Let X be a separable centered Gaussian field defined on $(\varOmega ,\mathcal{F},\mathbb{P})$. Let R be the covariance of X.
Then there exist a kernel $K\in {L}^{2}([\mathbf{0},\mathbf{1}])$ and a Brownian sheet W, possibly defined on a larger probability space, such that the representation
(2)
\[X_{\mathbf{t}}=\int _{[\mathbf{0},\mathbf{1}]}K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}W_{\mathbf{s}}\]
holds if and only if R satisfies the trace condition
(3)
\[\int _{[\mathbf{0},\mathbf{1}]}R(\mathbf{t},\mathbf{t})\hspace{0.1667em}\mathrm{d}\mathbf{t}<\infty .\]
Proof.
From condition (3) it follows that the covariance operator
\[\mathrm{R}f(\mathbf{t})=\int _{[\mathbf{0},\mathbf{1}]}f(\mathbf{s})R(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{s}\]
is Hilbert–Schmidt. Indeed, the Hilbert–Schmidt norm of the operator R satisfies, by the Cauchy–Schwarz inequality,
\[\begin{array}{r@{\hskip0pt}l}\displaystyle \| \mathrm{R}\| _{\mathrm{HS}}& \displaystyle =\sqrt{\int _{[\mathbf{0},\mathbf{1}]}\int _{[\mathbf{0},\mathbf{1}]}R{(\mathbf{t},\mathbf{s})}^{2}\hspace{0.1667em}\mathrm{d}\mathbf{t}\hspace{0.1667em}\mathrm{d}\mathbf{s}}\\{} & \displaystyle \le \sqrt{\int _{[\mathbf{0},\mathbf{1}]}\int _{[\mathbf{0},\mathbf{1}]}R(\mathbf{t},\mathbf{t})R(\mathbf{s},\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{t}\hspace{0.1667em}\mathrm{d}\mathbf{s}}\\{} & \displaystyle =\int _{[\mathbf{0},\mathbf{1}]}R(\mathbf{t},\mathbf{t})\hspace{0.1667em}\mathrm{d}\mathbf{t}.\end{array}\]
Since Hilbert–Schmidt operators are compact operators, it follows from, for example, [7, p. 233] that the operator R admits the eigenfunction representation
(4)
\[\mathrm{R}f(\mathbf{t})=\sum \limits_{k=1}^{\infty }\lambda _{k}\int _{[\mathbf{0},\mathbf{1}]}f(\mathbf{s})\phi _{k}(\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{s}\hspace{0.1667em}\hspace{0.1667em}\phi _{k}(\mathbf{t}).\]
Here ${(\phi _{k})_{k=1}^{\infty }}$, the eigenfunctions of R, form an orthonormal system on ${L}^{2}([\mathbf{0},\mathbf{1}])$. In particular, this means that
(5)
\[R(\mathbf{t},\mathbf{s})=\sum \limits_{k=1}^{\infty }\lambda _{k}\hspace{0.1667em}\phi _{k}(\mathbf{t})\phi _{k}(\mathbf{s}).\]
From this it follows that the square root of the covariance operator R admits a kernel K if and only if
(6)
\[\sum \limits_{k=1}^{\infty }\lambda _{k}<\infty .\]
Note that condition (6) is equivalent to condition (3). Consequently, we can define
(7)
\[K(\mathbf{t},\mathbf{s})=\sum \limits_{k=1}^{\infty }\sqrt{\lambda _{k}}\hspace{0.1667em}\phi _{k}(\mathbf{t})\phi _{k}(\mathbf{s})\]
since the series on the right-hand side of (7) converges in ${L}^{2}([\mathbf{0},\mathbf{1}])$, and the eigenvalues ${(\lambda _{k})_{k=1}^{\infty }}$ of the positive-definite operator R are nonnegative.
Now,
\[\begin{array}{r@{\hskip0pt}l}\displaystyle R(\mathbf{t},\mathbf{s})& \displaystyle =\sum \limits_{k=1}^{\infty }\lambda _{k}\hspace{0.1667em}\phi _{k}(\mathbf{t})\phi _{k}(\mathbf{s})\\{} & \displaystyle =\sum \limits_{k=1}^{\infty }\sum \limits_{\ell =1}^{\infty }\sqrt{\lambda _{k}}\sqrt{\lambda _{\ell }}\phi _{k}(\mathbf{t})\phi _{\ell }(\mathbf{s})\int _{[\mathbf{0},\mathbf{1}]}\phi _{k}(\mathbf{u})\phi _{\ell }(\mathbf{u})\hspace{0.1667em}\mathrm{d}\mathbf{u}\\{} & \displaystyle =\int _{[\mathbf{0},\mathbf{1}]}\Bigg(\sum \limits_{k=1}^{\infty }\sqrt{\lambda _{k}}\phi _{k}(\mathbf{t})\phi _{k}(\mathbf{u})\hspace{0.1667em}\sum \limits_{\ell =1}^{\infty }\sqrt{\lambda _{\ell }}\phi _{\ell }(\mathbf{s})\phi _{\ell }(\mathbf{u})\Bigg)\mathrm{d}\mathbf{u}\\{} & \displaystyle =\int _{[\mathbf{0},\mathbf{1}]}K(\mathbf{t},\mathbf{u})K(\mathbf{s},\mathbf{u})\hspace{0.1667em}\mathrm{d}\mathbf{u},\end{array}\]
where the interchange of summation and integration is justified by the fact that series (7) converges in ${L}^{2}([\mathbf{0},\mathbf{1}])$. From this calculation and from the Wiener–Itô isometry (1) of the integrals with respect to the Brownian sheet it follows that the centered Gaussian processes on the left-hand side and the right-hand side of Eq. (2) have the same covariance function. Consequently, since they are Gaussian fields, they have the same law. This means that representation (2) holds in law.
Finally, we need to construct a Brownian sheet W associated with the field X such that representation (2) holds in ${L}^{2}(\varOmega ,\mathcal{F},\mathbb{P})$. Let ${(\tilde{\phi }_{k})_{k=1}^{\infty }}$ be any orthonormal basis on ${L}^{2}([\mathbf{0},\mathbf{1}])$. Set
\[\phi _{k}(\mathbf{t})=\int _{[\mathbf{0},\mathbf{1}]}\tilde{\phi }_{k}(\mathbf{s})K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{s}.\]
Then ${(\phi _{k})_{k=1}^{\infty }}$ is an orthonormal basis (possibly finite or even empty!) on the reproducing kernel Hilbert space (RKHS) of the Gaussian field X (defined after Remark 1 below). Let Θ be an isometry from the RKHS to ${L}^{2}(\varOmega ,\sigma (X),\mathbb{P})$. Set $\xi _{k}=\varTheta (\phi _{k})$. Then the $\xi _{k}$ are i.i.d. standard normal random variables, and by the reproducing property we have that
\[X_{\mathbf{t}}=\sum \limits_{k=1}^{\infty }\phi _{k}(\mathbf{t})\hspace{0.1667em}\xi _{k}\]
in ${L}^{2}(\varOmega ,\mathcal{F},\mathbb{P})$. Now, it may be that only finitely many $\xi _{k}$ are obtained this way. If this is the case, then we augment the finite sequence ${(\xi _{k})_{k=1}^{n}}$ with independent standard normal random variables. Then set
\[W_{\mathbf{t}}=\sum \limits_{k=1}^{\infty }\int _{[\mathbf{0},\mathbf{t}]}\tilde{\phi }_{k}(\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{s}\hspace{0.1667em}\hspace{0.1667em}\xi _{k}.\]
For this Brownian sheet, representation (2) holds in ${L}^{2}(\varOmega ,\mathcal{F},\mathbb{P})$. Indeed,
\[\begin{array}{r@{\hskip0pt}l}\displaystyle \int _{[\mathbf{0},\mathbf{1}]}K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}W_{\mathbf{s}}& \displaystyle =\int _{[\mathbf{0},\mathbf{1}]}K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}\sum \limits_{k=1}^{\infty }\int _{[\mathbf{0},\mathbf{t}]}\tilde{\phi }_{k}(\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{s}\hspace{0.1667em}\hspace{0.1667em}\xi _{k}\\{} & \displaystyle =\sum \limits_{k=1}^{\infty }\int _{[\mathbf{0},\mathbf{1}]}K(\mathbf{t},\mathbf{s})\tilde{\phi }_{k}(\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{s}\hspace{0.1667em}\hspace{0.1667em}\xi _{k}\\{} & \displaystyle =\sum \limits_{k=1}^{\infty }\phi _{k}(\mathbf{t})\hspace{0.1667em}\xi _{k}\\{} & \displaystyle =X_{\mathbf{t}}.\end{array}\]
Here the interchange of summation, differentiation, and integration is justified by the fact that everything is square integrable.  □
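In a discretized setting, the construction of the symmetric square root kernel (7) reduces to matrix algebra. The sketch below is illustrative only (it is not part of the article): it uses the one-parameter Brownian-motion covariance $\min (t,s)$ as a stand-in for R, with an arbitrary grid size, diagonalizes the covariance matrix, and checks the factorization $R(\mathbf{t},\mathbf{s})=\int K(\mathbf{t},\mathbf{u})K(\mathbf{s},\mathbf{u})\,\mathrm{d}\mathbf{u}$ established in the proof:

```python
import numpy as np

# Discrete analogue of the Mercer square root (7): diagonalise the
# covariance matrix and set K = sum_k sqrt(lambda_k) phi_k phi_k^T.
# Toy covariance: Brownian motion, R(t, s) = min(t, s), i.e. the case n = 1.
m = 50
t = np.arange(1, m + 1) / m
R = np.minimum.outer(t, t)

lam, phi = np.linalg.eigh(R)          # eigenpairs of the symmetric matrix R
lam = np.clip(lam, 0.0, None)         # guard against tiny negative round-off
K = (phi * np.sqrt(lam)) @ phi.T      # symmetric square root of R

# K is symmetric, and K @ K reproduces R, mirroring the identity
# R(t, s) = int K(t, u) K(s, u) du from the proof.
assert np.allclose(K, K.T)
assert np.allclose(K @ K, R, atol=1e-8)
```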
Remark 1.
  • 1. The kernel $(\mathbf{t},\mathbf{s})\mapsto K(\mathbf{t},\mathbf{s})$ given by the eigenfunction expansion (7) is symmetric in $\mathbf{t}$ and $\mathbf{s}$. Consequently, it is always possible to have a symmetric kernel in representation (2); that is, in principle it is always possible to transfer from a given representation
    \[X_{\mathbf{t}}=\int _{[\mathbf{0},\mathbf{1}]}K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}W_{\mathbf{s}}\]
    to
    \[X_{\mathbf{t}}=\int _{[\mathbf{0},\mathbf{1}]}\tilde{K}(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}\tilde{W}_{\mathbf{s}}\]
    where $\tilde{W}$ is some other Brownian sheet, and the kernel $\tilde{K}$ is symmetric. Unfortunately, for a given kernel K and Brownian sheet W, the authors do not know how to do this analytically.
  • 2. In general, it is not possible to choose a Volterra kernel K in (2). By a Volterra kernel we mean a kernel that satisfies $K(\mathbf{t},\mathbf{s})=0$ if $s_{k}>t_{k}$ for some k. To see why a Volterra representation is not always possible, consider the following simple counterexample: $X_{\mathbf{t}}\equiv \xi $, where ξ is a standard normal random variable. This field cannot have a Volterra representation since Volterra fields vanish at the origin. A Fredholm representation for this field is simply $X_{\mathbf{t}}=\int _{[\mathbf{0},\mathbf{1}]}\mathrm{d}W_{\mathbf{s}}$ (with a suitable Brownian sheet W depending on ξ).
    For a more complicated counterexample (with $X_{0}=0)$ see [9, Example 3.2].
    Consequently, in general, it is not possible to generate a Gaussian field X on the rectangle $[\mathbf{0},\mathbf{t}]$ from the noise W on the same rectangle $[\mathbf{0},\mathbf{t}]$. Instead, the information from the whole cube $[\mathbf{0},\mathbf{1}]$ may be needed.
  • 3. If the family $\{K(\mathbf{t},\hspace{0.1667em}\cdot \hspace{0.1667em})\hspace{0.1667em};\mathbf{t}\in [\mathbf{0},\mathbf{1}]\}$ is total in ${L}^{2}([\mathbf{0},\mathbf{1}])$, then a Brownian sheet in representation (2) exists on the same probability space $(\varOmega ,\mathcal{F},\mathbb{P})$. Moreover, in this case, it can be constructed from the Gaussian field X. Indeed, in this case, we can apply the Karhunen representation theorem [2, Thm. 41].
The reproducing kernel Hilbert space (RKHS) of the Gaussian field X is the Hilbert space $\mathcal{H}$ that is isometric to the linear space $\mathcal{H}_{1}$, and the defining isometry is $R(\mathbf{t},\cdot )\mapsto X_{\mathbf{t}}$. In other words, the RKHS is the Hilbert space of functions over $[\mathbf{0},\mathbf{1}]$ extended and closed linearly by the relation
\[\big\langle R(\mathbf{t},\cdot ),R(\mathbf{s},\cdot )\big\rangle _{\mathcal{H}}=R(\mathbf{t},\mathbf{s}).\]
The RKHS is of paramount importance in the analysis of Gaussian processes. In this respect, the Fredholm representation (2) is also very important. Indeed, if the kernel K of Theorem 1 is known, then the RKHS is also known as the following reformulation of Lifshits [6, Prop. 4.1] states.
Proposition 1.
Let X admit representation (2). Then
\[\mathcal{H}=\bigg\{f\hspace{0.1667em};\hspace{0.1667em}f(\mathbf{t})=\int _{[\mathbf{0},\mathbf{1}]}\tilde{f}(\mathbf{s})K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{s},\tilde{f}\in {L}^{2}\big([\mathbf{0},\mathbf{1}]\big)\bigg\}.\]
Moreover, the inner product in $\mathcal{H}$ is given by
\[\langle f,g\rangle _{\mathcal{H}}=\underset{\tilde{f},\tilde{g}}{\inf }\hspace{0.1667em}\int _{[\mathbf{0},\mathbf{1}]}\tilde{f}(\mathbf{t})\tilde{g}(\mathbf{t})\hspace{0.1667em}\mathrm{d}\mathbf{t},\]
where the infimum is taken over all such $\tilde{f}$ and $\tilde{g}$ that
\[\begin{array}{r@{\hskip0pt}l}\displaystyle f(\mathbf{t})& \displaystyle =\int _{[\mathbf{0},\mathbf{1}]}\tilde{f}(\mathbf{s})K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{s},\\{} \displaystyle g(\mathbf{t})& \displaystyle =\int _{[\mathbf{0},\mathbf{1}]}\tilde{g}(\mathbf{s})K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{s}.\end{array}\]
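Proposition 1 can likewise be illustrated on a grid. In the sketch below (an illustration under the same toy discretization as above with $R(t,s)=\min (t,s)$, not part of the article), the preimage of $R(\mathbf{t},\cdot )$ under the square-root operator is the row $K(\mathbf{t},\cdot )$, and the RKHS inner product computed through the preimages reproduces the covariance:

```python
import numpy as np

# Discrete sketch of Proposition 1 for the toy covariance R(t,s) = min(t,s):
# with the symmetric square root K of R, the function R(t_i, .) lies in the
# image K L^2 with preimage K(t_i, .), and the RKHS inner product of
# R(t_i, .) and R(t_j, .) equals R(t_i, t_j).
m = 40
t = np.arange(1, m + 1) / m
R = np.minimum.outer(t, t)

lam, phi = np.linalg.eigh(R)
K = (phi * np.sqrt(np.clip(lam, 0.0, None))) @ phi.T

i, j = 5, 17  # two arbitrary grid points
assert np.allclose(K @ K[i], R[i])        # K maps K(t_i, .) to R(t_i, .)
assert np.isclose(K[i] @ K[j], R[i, j])   # <R(t_i,.), R(t_j,.)>_H = R(t_i, t_j)
```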

3 Application to equivalence in law

Two random objects ξ and ζ are equivalent in law if their distributions satisfy $\mathbb{P}[\xi \in B]>0$ if and only if $\mathbb{P}[\zeta \in B]>0$ for all measurable sets B. In contrast, the random objects ξ and ζ are singular in law if there exists a measurable set B such that $\mathbb{P}[\xi \in B]=1$ but $\mathbb{P}[\zeta \in B]=0$. For centered Gaussian random objects, there is the well-known dichotomy that two centered Gaussian objects are either equivalent or singular in law; see [4, Thm. 6.1].
There is a complete characterization of the equivalence between any two Gaussian processes due to Kallianpur and Oodaira; see [5, Thms. 9.2.1 and 9.2.2]. It is possible to extend this to Gaussian fields and formulate it in terms of the operator K. The result would remain quite abstract, though. Therefore, we do not pursue that direction. Instead, the following Proposition 2 gives a partial answer to the question of what Gaussian fields equivalent to a given Gaussian field X look like. Proposition 2 uses only the Hitsuda representation theorem, which, unlike the Kallianpur–Oodaira theorem, is quite concrete.
Let $\tilde{X}=(\tilde{X}_{\mathbf{t}})_{\mathbf{t}\in [\mathbf{0},\mathbf{1}]}$ be a centered Gaussian field with covariance function $\tilde{R}$, and let $X=(X_{\mathbf{t}})_{\mathbf{t}\in [\mathbf{0},\mathbf{1}]}$ be a centered Gaussian field with covariance function R.
Proposition 2 (Representation of equivalent Gaussian fields).
Suppose that X has representation (2) with kernel K and Brownian sheet W. If
(8)
\[\tilde{X}_{\mathbf{t}}=\int _{[\mathbf{0},\mathbf{1}]}K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}W_{\mathbf{s}}-\int _{[\mathbf{0},\mathbf{1}]}\int _{[\mathbf{0},\mathbf{s}]}K(\mathbf{t},\mathbf{s})L(\mathbf{s},\mathbf{u})\hspace{0.1667em}\mathrm{d}W_{\mathbf{u}}\hspace{0.1667em}\mathrm{d}\mathbf{s}\]
for some Volterra kernel $L\in {L}^{2}([\mathbf{0},\mathbf{1}])$, then $\tilde{X}$ is equivalent in law to X.
Proof.
By [8, Prop. 4.2] we have the following multiparameter version of the Hitsuda representation theorem: A centered Gaussian field $\tilde{W}=(\tilde{W}_{\mathbf{t}})_{\mathbf{t}\in [\mathbf{0},\mathbf{1}]}$ is equivalent in law to a Brownian sheet if and only if it admits the representation
(9)
\[\tilde{W}_{\mathbf{t}}=W_{\mathbf{t}}-\int _{[\mathbf{0},\mathbf{t}]}\int _{[\mathbf{0},\mathbf{s}]}L(\mathbf{s},\mathbf{u})\hspace{0.1667em}\mathrm{d}W_{\mathbf{u}}\hspace{0.1667em}\mathrm{d}\mathbf{s}\]
for some Volterra kernel $L\in {L}^{2}([\mathbf{0},\mathbf{1}])$.
Let then X have the Fredholm representation
(10)
\[X_{\mathbf{t}}=\int _{[\mathbf{0},\mathbf{1}]}K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}W_{\mathbf{s}}.\]
Then $\tilde{X}$ is equivalent to X if it admits the representation
(11)
\[\tilde{X}_{\mathbf{t}}=\int _{[\mathbf{0},\mathbf{1}]}K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}\tilde{W}_{\mathbf{s}},\]
where $\tilde{W}$ is related to W by (9). But Eq. (8) implies precisely this.  □
Remark 2.
On the kernel level, Eq. (8) states that
\[\tilde{K}(\mathbf{t},\mathbf{s})=K(\mathbf{t},\mathbf{s})-\int _{[\mathbf{s},\mathbf{1}]}K(\mathbf{t},\mathbf{u})L(\mathbf{u},\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{u}\]
for some Volterra kernel $L\in {L}^{2}([\mathbf{0},\mathbf{1}])$.

4 Application to series expansions

The Mercer square root (7) can be used to build the Karhunen–Loève expansion for the Gaussian process X. But the Mercer form (7) is seldom known. However, if we can somehow find any kernel K such that representation (2) holds, then we can construct a series expansion for X by using the Fredholm representation of Theorem 1 as follows.
Proposition 3 (Series expansion).
Let X be a separable Gaussian process with representation (2), and let ${(\phi _{k})_{k=1}^{\infty }}$ be any orthonormal basis on ${L}^{2}([\mathbf{0},\mathbf{1}])$. Then X admits the series expansion
(12)
\[X_{\mathbf{t}}=\sum \limits_{k=1}^{\infty }\int _{[\mathbf{0},\mathbf{1}]}\phi _{k}(\mathbf{s})K(\mathbf{t},\mathbf{s})\hspace{0.1667em}\mathrm{d}\mathbf{s}\cdot \xi _{k},\]
where ${(\xi _{k})_{k=1}^{\infty }}$ is a sequence of independent standard normal random variables. The series (12) converges in ${L}^{2}(\varOmega ,\mathcal{F},\mathbb{P})$, and it also converges almost surely uniformly if and only if X is continuous.
The proof below uses reproducing kernel Hilbert space techniques. For more details on these, we refer to [3], where a series expansion is constructed for fractional Brownian motion by using the transfer principle.
Proof.
The Fredholm representation (2) implies immediately that the reproducing kernel Hilbert space of X is the image $\mathrm{K}{L}^{2}([\mathbf{0},\mathbf{1}])$ and that K is an isometry from ${L}^{2}([\mathbf{0},\mathbf{1}])$ onto the reproducing kernel Hilbert space of X. Indeed, this is what Proposition 1 states.
The ${L}^{2}$-expansion (12) then follows from [1, Thm. 3.7], and the equivalence of the almost sure uniform convergence of (12) and the continuity of X follows from [1, Thm. 3.8].  □
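A discrete version of the expansion (12) is easy to test: for any orthonormal basis, the coefficient functions $f_{k}(\mathbf{t})=\int \phi _{k}(\mathbf{s})K(\mathbf{t},\mathbf{s})\,\mathrm{d}\mathbf{s}$ satisfy $\sum _{k}f_{k}(\mathbf{t})f_{k}(\mathbf{s})=R(\mathbf{t},\mathbf{s})$. The sketch below is illustrative only (random orthogonal basis, toy covariance $\min (t,s)$, arbitrary grid size) and verifies this identity:

```python
import numpy as np

# Discrete check of the series expansion (12): for any orthonormal basis
# (here the columns of an orthogonal matrix Q), the coefficient functions
# f_k = K phi_k satisfy sum_k f_k(t) f_k(s) = R(t, s).
rng = np.random.default_rng(0)
m = 30
t = np.arange(1, m + 1) / m
R = np.minimum.outer(t, t)           # toy covariance: min(t, s)

lam, phi = np.linalg.eigh(R)
K = (phi * np.sqrt(np.clip(lam, 0.0, None))) @ phi.T

Q, _ = np.linalg.qr(rng.standard_normal((m, m)))  # random orthonormal basis
F = K @ Q                            # column k holds the coefficients f_k

# Covariance of X_t = sum_k f_k(t) xi_k with i.i.d. standard normal xi_k:
assert np.allclose(F @ F.T, R, atol=1e-8)
```

A sample path of the truncated expansion is then simply `F @ rng.standard_normal(m)`.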

References

[1] 
Adler, R.J.: An Introduction to Continuity, Extrema, and Related Topics for General Gaussian Processes. Lect. Notes Monogr. Ser., vol. 12. Institute of Mathematical Statistics, Hayward, CA (1990), 160 p. MR1088478 (92g:60053)
[2] 
Berlinet, A., Thomas-Agnan, C.: Reproducing Kernel Hilbert Spaces in Probability and Statistics. Kluwer Academic Publishers, Boston, MA (2004), 355 p. MR2239907. doi:10.1007/978-1-4419-9096-9
[3] 
Gilsing, H., Sottinen, T.: Power series expansions for fractional Brownian motions. Theory Stoch. Process. 9(3–4), 38–49 (2003). MR2306058
[4] 
Hida, T., Hitsuda, M.: Gaussian Processes. Transl. Math. Monogr., vol. 120. American Mathematical Society, Providence, RI (1993), 183 p. Translated from the 1976 Japanese original by the authors. MR1216518 (95j:60057)
[5] 
Kallianpur, G.: Stochastic Filtering Theory. Stoch. Model. Appl. Probab., vol. 13, p. 316. Springer (1980). MR583435 (82f:60089)
[6] 
Lifshits, M.: Lectures on Gaussian Processes. Springer Briefs Math. Springer (2012), 121 p. MR3024389. doi:10.1007/978-3-642-24939-6
[7] 
Riesz, F., Sz.-Nagy, B.: Functional Analysis. Frederick Ungar Publishing Co., New York (1955), 468 p. Translated by Leo F. Boron. MR0071727 (17,175i)
[8] 
Sottinen, T., Tudor, C.A.: On the equivalence of multiparameter Gaussian processes. J. Theor. Probab. 19(2), 461–485 (2006). MR2283386 (2008e:60100). doi:10.1007/s10959-006-0022-5
[9] 
Sottinen, T., Viitasaari, L.: Stochastic analysis of Gaussian processes via Fredholm representation. Preprint, arXiv:1410.2230 (2014)
Copyright
© 2015 The Author(s). Published by VTeX. Open access article under the CC BY license.

Keywords
Equivalence in law; Gaussian sheets; multiparameter Gaussian processes; representation of Gaussian processes; series expansions

MSC2010
60G15, 60G60
