1 Introduction
We study iterations of a finite family of circle homeomorphisms. This topic has already been studied from a number of different points of view. One may, for example, take a purely deterministic approach and study the associated action of the group of circle homeomorphisms (the special case of the group of orientation preserving circle diffeomorphisms is treated in [12, 19, 13]). Or one may, as we will, take a probabilistic approach and investigate Markov chains generated by random independent and identically distributed (i.i.d.) iterations of functions from the family (such as in [16, 8, 21]).
We restrict our attention to families of functions which are forward minimal in the sense that for any two points on the circle, there are orbits from the first point arbitrarily close to the second one using some concatenations of functions from the family. The set of distances which are preserved simultaneously by all maps allows us to distinguish between distinct types of ergodic behavior for such Markov chains.
Under the additional assumption that the system of inverse maps is also forward minimal, we find a topologically conjugate system which is non-expansive on average. This allows us to prove limit theorems: almost sure synchronization of random trajectories (sometimes also referred to as Antonov’s theorem [1]), provided that the system is not topologically conjugate to a family containing only isometries, as well as uniqueness and fiberwise properties of stationary distributions.
In contrast to many previous authors we do not assume that all maps preserve orientation or, a priori, that the system of inverse maps is forward minimal (such as in [1, 12, 14, 19, 13, 21]) or contains at least one map which is minimal (as in [21]). Our setting is also studied in [17] (without any minimality condition), where a different approach is used and ideas of [3] are adapted which in turn are built on ideas of [15, 8]. See also [22]. One further precursor in a more specific setting is the work by Furstenberg [10] where the homeomorphisms are the projective actions of elements of $SL_{2}(\mathbb{R})$.
2 Random iterations
Let K be a compact topological space equipped with its Borel sets. We call a finite set $F=\{f_{1},\dots ,f_{N}\}$ of continuous functions $f_{j}:K\to K$, $j=1,\dots ,N$, an iterated function system (IFS). If all maps $f_{j}$ are homeomorphisms, as we will in general assume here, then we also consider the associated IFS ${F}^{-1}:=\{{f_{1}^{-1}},\dots ,{f_{N}^{-1}}\}$ of the inverse maps.
We will discuss different points of view on random and deterministic iterations of functions from an IFS and recall some standard notations and facts.
Given $(I_{n})_{n\ge 1}$ a stochastic sequence with values in $\{1,\dots ,N\}$, for $x\in K$ define
\[{Z_{0}^{x}}:=x,\hspace{2em}{Z_{n}^{x}}:=f_{I_{n}}\big({Z_{n-1}^{x}}\big)=(f_{I_{n}}\circ \cdots \circ f_{I_{1}})(x),\hspace{1em}n\ge 1.\]
We may consider without loss of generality the (a priori) unspecified common domain of the random variables $I_{n}$ as $\varSigma ={\{1,\dots ,N\}}^{\mathbb{N}}$, equipped with a probability measure P defined on its Borel subsets, with $I_{n}$ being defined as $I_{n}(\omega )=\omega _{n}$ for every $\omega =(\omega _{1}\omega _{2}\dots )\in \varSigma $ and $n\ge 1$.
We will later also consider the shift map $\sigma :\varSigma \to \varSigma $ defined by $\sigma (\omega _{1}\omega _{2}\dots ):=(\omega _{2}\omega _{3}\dots )$.
For any $\omega =(\omega _{1}\omega _{2}\dots )\in \varSigma $, any $n\ge 0$ and any $x\in K$ we thus define ${Z_{n}^{x}}(\omega )=Z_{n}(x,\omega )$, where
(1)
\[Z_{n}(x,\omega ):=(f_{\omega _{n}}\circ \cdots \circ f_{\omega _{1}})(x),\hspace{2em}Z_{0}(x,\omega )=x.\]
The sequence $(Z_{n}(x,\omega ))_{n\ge 0}$ is called the trajectory corresponding to the realization ω of the random process $({Z_{n}^{x}})_{n\ge 0}$ starting at $x\in K$. It is common to also consider iterates in the reversed order and to define
(2)
\[\widehat{Z}_{n}(x,\omega ):=(f_{\omega _{1}}\circ \cdots \circ f_{\omega _{n}})(x),\hspace{2em}\widehat{Z}_{0}(x,\omega )=x.\]
If F is an IFS of homeomorphisms, then we also consider the associated sequence $({Z_{n}^{-}}(x,\omega ))_{n\ge 0}$ defined by
\[{Z_{n}^{-}}(x,\omega ):=\big({f_{\omega _{n}}^{-1}}\circ \cdots \circ {f_{\omega _{1}}^{-1}}\big)(x),\hspace{2em}{Z_{0}^{-}}(x,\omega )=x,\]
and the sequence $({\widehat{Z}_{n}^{-}}(x,\omega ))_{n\ge 0}$ defined by
(3)
\[{\widehat{Z}_{n}^{-}}(x,\omega ):=\big({f_{\omega _{1}}^{-1}}\circ \cdots \circ {f_{\omega _{n}}^{-1}}\big)(x),\hspace{2em}{\widehat{Z}_{0}^{-}}(x,\omega )=x.\]
Note that for every $\omega \in \varSigma $, every $n\ge 0$ and every $x\in K$ the map ${\widehat{Z}_{n}^{-}}(\cdot ,\omega )$ is the inverse of $Z_{n}(\cdot ,\omega )$, that is,
\[{\widehat{Z}_{n}^{-}}\big(Z_{n}(x,\omega ),\omega \big)=x\hspace{1em}\text{and}\hspace{1em}Z_{n}\big({\widehat{Z}_{n}^{-}}(x,\omega ),\omega \big)=x.\]
2.1 Iterated function systems with probabilities and Markov chains
Let $(I_{n})_{n\ge 1}$ be i.i.d. variables. The probability measure P is then a Bernoulli measure determined by a probability vector $p=(p_{1},\dots ,p_{N})$. It then follows that ${Z_{n}^{x}}=Z_{n}(x,\cdot )$ defined in (1) and ${\widehat{Z}_{n}^{x}}=\widehat{Z}_{n}(x,\cdot )$ defined in (2) both have the same distribution for any fixed $n\ge 1$, and $({Z_{n}^{x}})_{n\ge 0}$ is a (time-homogeneous) Markov chain with transfer operator T defined for bounded measurable functions $h:K\to \mathbb{R}$ by
\[Th(x):=\mathbb{E}\big(h\big(f_{I_{1}}(x)\big)\big)=\sum \limits_{j=1}^{N}p_{j}h\big(f_{j}(x)\big).\]
If p is non-degenerate, that is, if $p_{j}>0$ for every $j=1,\dots ,N$, then we call the pair $(F,p)$ an IFS with probabilities. The Markov chain $({Z_{n}^{x}})_{n\ge 0}$ is obtained by independent random iterations where in each iteration step the functions $f_{j}$ are chosen with probability $p_{j}$.
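For readers who wish to experiment, the following minimal Python sketch simulates the Markov chain $({Z_{n}^{x}})_{n\ge 0}$ generated by an IFS with probabilities; the two circle homeomorphisms and the probability vector are illustrative choices of ours, not maps taken from the references.

```python
import numpy as np

# Toy IFS on the circle S^1 = R/Z (illustrative choices only).
f1 = lambda x: (x + 0.05 + 0.02 * np.sin(2 * np.pi * x)) % 1.0  # orientation-preserving homeomorphism
f2 = lambda x: (x + np.sqrt(2)) % 1.0                            # rotation by an irrational angle
maps = [f1, f2]
p = [0.5, 0.5]                                                   # non-degenerate probability vector

rng = np.random.default_rng(0)

def sample_chain(x, n_steps):
    """Simulate Z_0^x, ..., Z_n^x: in each step f_j is chosen i.i.d. with probability p_j."""
    traj = [x]
    for _ in range(n_steps):
        j = rng.choice(len(maps), p=p)      # the random index I_{n+1}
        traj.append(maps[j](traj[-1]))      # Z_{n+1}^x = f_{I_{n+1}}(Z_n^x)
    return np.array(traj)

print(sample_chain(0.3, 10))
```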
Markov chains generated by IFSs with probabilities form a particular class of Markov chains that has received considerable attention in recent years. The IFS terminology was coined by Barnsley and Demko [4].
A Borel probability measure μ on K is an invariant probability measure for the IFS with probabilities $(F,p)$ if
\[T_{\ast }\mu =\mu ,\hspace{1em}\text{where}\hspace{1em}T_{\ast }\mu (\cdot )=\sum \limits_{j}p_{j}\mu \big({f_{j}^{-1}}(\cdot )\big).\]
Such a measure μ is also called a stationary distribution for the corresponding Markov chain, since if X is a μ-distributed random variable, independent of $(I_{n})_{n\ge 0}$, then $({Z_{n}^{X}})_{n\ge 0}$ will be a stationary stochastic sequence.

Remark 1.
By continuity of all functions $f_{j}$, $j=1,\dots ,N$, it follows that $({Z_{n}^{x}})_{n\ge 0}$ has the weak Feller property, that is, T maps the space of real valued continuous functions on K to itself. It is well known that Markov chains with the weak Feller property have at least one stationary distribution, see for example [18]. Hence, any IFS with probabilities $(F,p)$ has at least one invariant probability measure.
Remark 2.
Another formalism (which will not be used here) for analyzing stochastic sequences related to an IFS with probabilities is the one of a (deterministic) step skew product map $(\omega ,x)\mapsto (\sigma (\omega ),f_{\omega _{1}}(x))$ with the shift map $\sigma :\varSigma \to \varSigma $ in the base and locally constant fiber maps. The Bernoulli measure is a σ-invariant measure in the base. Invariant measures (and hence stationary distributions) are closely related to measures which are invariant for the step skew product (see, for example, [23, Chapter 5]).
Given a positive integer n, define by ${T}^{n}=T\circ \cdots \circ T$ and ${T_{\ast }^{n}}=T_{\ast }\circ \cdots \circ T_{\ast }$ (each n times) the n-fold compositions of T and $T_{\ast }$, respectively. We call a stationary distribution μ for $({Z_{n}^{x}})_{n\ge 0}$ attractive if for any $x\in K$ we have ${T_{\ast }^{n}}\delta _{x}\to \mu $ as $n\to \infty $ in the weak∗ topology, where $\delta _{x}$ denotes the Dirac measure concentrated in $x\in K$. In other words, for any continuous $h:K\to \mathbb{R}$ and for any $x\in K$ we have
\[{T}^{n}h(x)=\int h\hspace{0.1667em}d\big({T_{\ast }^{n}}\delta _{x}\big)\to \int h\hspace{0.1667em}d\mu \hspace{1em}\text{as}\hspace{2.5pt}n\to \infty .\]
An attractive stationary distribution is uniquely stationary.
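Attractiveness can be probed by Monte Carlo: estimating $T^{n}h(x)=\mathbb{E}(h({Z_{n}^{x}}))$ for two different starting points and watching the estimates approach a common value. The sketch below is a rough check only, with an illustrative IFS of our own choosing.

```python
import numpy as np

rng = np.random.default_rng(1)
maps = [lambda x: (x + np.sqrt(2)) % 1.0,                        # irrational rotation
        lambda x: (x + 0.15 * np.sin(2 * np.pi * x)) % 1.0]      # non-isometric homeomorphism
p = [0.5, 0.5]
h = lambda x: np.cos(2 * np.pi * x)                              # a continuous test function

def Tn_h(x, n, n_samples=50_000):
    """Monte Carlo estimate of T^n h(x) = E[h(Z_n^x)]."""
    z = np.full(n_samples, float(x))
    for _ in range(n):
        js = rng.choice(2, size=n_samples, p=p)
        z = np.where(js == 0, maps[0](z), maps[1](z))
    return h(z).mean()

for n in (1, 5, 20, 50):
    # If the stationary distribution is attractive, the two columns approach each other.
    print(n, round(Tn_h(0.1, n), 3), round(Tn_h(0.6, n), 3))
```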
Let ρ be some metric on K. We say that an IFS with probabilities $(F,p)$ is contractive on average with respect to ρ if for any $x,y\in K$ we have
(6)
\[\sum \limits_{j=1}^{N}p_{j}\rho \big(f_{j}(x),f_{j}(y)\big)\le c\hspace{0.1667em}\rho (x,y)\]
for some constant $c<1$, and non-expansive on average if (6) holds for some constant $c\le 1$.
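Inequality (6) can be tested empirically on random pairs of points. The sketch below estimates the worst sampled value of $\sum _{j}p_{j}\rho (f_{j}(x),f_{j}(y))/\rho (x,y)$ for the standard metric d and an illustrative IFS of our own choosing; a sampled maximum below 1 is of course only evidence, not a proof, of contraction on average.

```python
import numpy as np

def d(x, y):
    """Standard metric on S^1 = R/Z."""
    a = abs(x - y) % 1.0
    return min(a, 1.0 - a)

# Illustrative IFS with probabilities (our own example, not taken from the text).
maps = [lambda x: (x + 0.3) % 1.0,                               # a rotation, hence a d-isometry
        lambda x: (x + 0.05 * np.sin(2 * np.pi * x)) % 1.0]      # a small perturbation of the identity
p = [0.5, 0.5]

rng = np.random.default_rng(2)
worst = 0.0
for _ in range(10_000):
    x, y = rng.random(2)
    if d(x, y) < 1e-9:
        continue
    avg = sum(pj * d(f(x), f(y)) for pj, f in zip(p, maps))
    worst = max(worst, avg / d(x, y))

print("largest sampled ratio:", round(worst, 4))
```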
Remark 3.
It is well known that a Markov chain $({Z_{n}^{x}})_{n\ge 0}$ generated by an IFS with probabilities $(F,p)$ which is contractive on average has an attractive (and hence unique) stationary distribution. More generally the distribution of ${Z_{n}^{x}}$ then converges (in the weak∗ topology) to the stationary distribution with an exponential rate that can be quantified for example by the Wasserstein metric, see e.g. [20].
Far less is known for non-expansive systems. The theory for Markov chains generated by non-expansive systems can be regarded as belonging to the realm of Markov chains where $\{{T}^{n}h\}$ is equicontinuous for any continuous $h:K\to \mathbb{R}$, or “stochastically stable” Markov chains (see [18] for a survey).
The Markov chain $({Z_{n}^{x}})_{n\ge 0}$ is topologically recurrent if for any open set $O\subset K$ and any $x\in K$ we have
In the present paper we are going to study a special class of topologically recurrent Feller continuous Markov chains generated by IFSs with probabilities of homeomorphisms on the circle. The topology of the circle, and the monotonicity of the maps that it implies, play a crucial role for our results.
3 IFSs with homeomorphisms on the circle
From now on we will always assume $K={\mathbb{S}}^{1}=\mathbb{R}/\mathbb{Z}$ to be the unit circle and consider an IFS $F={\{f_{j}\}_{j=1}^{N}}$ of homeomorphisms $f_{j}:{\mathbb{S}}^{1}\to {\mathbb{S}}^{1}$. Let $d(x,y):=\min \{|y-x|,1-|y-x|\}$ be the standard metric on ${\mathbb{S}}^{1}$.
3.1 Deterministic iterations and simultaneously preserved distances
An IFS $F={\{f_{j}\}_{j=1}^{N}}$ is forward minimal if for any open set $O\subset K$ and any $x\in K$ there exist some $n\ge 0$ and some $\omega \in \varSigma $ such that
\[Z_{n}(x,\omega )\in O.\]
In other words, for a forward minimal IFS it is possible to go from any point x arbitrarily close to any point y by applying some concatenations of functions in the IFS. We say that the IFS $F={\{f_{j}\}_{j=1}^{N}}$ of homeomorphisms $f_{j}$ is backward minimal if the IFS ${\{{f_{j}^{-1}}\}_{j=1}^{N}}$ is forward minimal.
Remark 4.
Note that F is forward (backward) minimal if and only if for every nonempty closed set $A\subset {\mathbb{S}}^{1}$ satisfying $f_{j}(A)\subset A$ (${f_{j}^{-1}}(A)\subset A$) for every j we have $A={\mathbb{S}}^{1}$.
Note that not every forward minimal IFS is automatically backward minimal if $N>1$ (see [5] for a discussion and counterexamples). By [5, Corollary E], an IFS is both forward and backward minimal if and only if there exists an $\omega \in \varSigma $ such that $(Z_{n}(x,\omega ))_{n\ge 0}$ is dense, for any $x\in {\mathbb{S}}^{1}$. (By forward minimality this property trivially holds for some fixed $x\in {\mathbb{S}}^{1}$, but the choice of ω might depend on $x\in {\mathbb{S}}^{1}$.) A simple sufficient condition for an IFS of circle homeomorphisms to be both forward and backward minimal is that at least one of the maps has a dense orbit. A class of IFSs which are forward and backward minimal (so-called expanding-contracting blenders) but without a map with a dense orbit can be found in [9, Section 8.1].
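Forward minimality is a reachability property and can therefore be probed (though not proved) numerically, by discretizing the circle into small cells and checking that every cell can be reached from every starting cell through compositions of the maps. A crude sketch with an illustrative IFS (our own choice; the irrational rotation already has dense orbits, which is the simple sufficient condition mentioned above) follows.

```python
import numpy as np
from collections import deque

# Illustrative IFS (assumed example, not from the references).
maps = [lambda x: (x + np.sqrt(2) - 1.0) % 1.0,                  # irrational rotation: dense orbits
        lambda x: (x + 0.1 * np.sin(2 * np.pi * x)) % 1.0]

M = 500                                                          # grid of M cells of size 1/M
cell = lambda x: int(x * M) % M

def reachable_cells(start_cell):
    """Breadth-first search over grid cells approximating reachable orbits."""
    seen, queue = {start_cell}, deque([start_cell])
    while queue:
        c = queue.popleft()
        x = (c + 0.5) / M                                        # cell midpoint as representative
        for f in maps:
            nc = cell(f(x))
            if nc not in seen:
                seen.add(nc)
                queue.append(nc)
    return seen

# A heuristic check: for a forward minimal IFS every cell should be (approximately) reachable.
print(all(len(reachable_cells(c0)) == M for c0 in range(0, M, 50)))
```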
The following is somewhat related to the study of the well-known concept of rotation numbers of orientation-preserving circle homeomorphisms, which was introduced by Poincaré and which provides an invariant that (almost completely) characterizes topological conjugacy. Rotation numbers are also important when studying an IFS (which can be considered as a special group action) of orientation-preserving circle homeomorphisms. The surveys [12, 19] review these facts, see also [13].
Here we deal with a more general class of IFSs in which not necessarily all maps preserve orientation.
Given F and a metric ρ on ${\mathbb{S}}^{1}$, let $L=L(F,\rho )$, defined by
(7)
\[\begin{array}{r@{\hskip0pt}l}\displaystyle L:=\big\{s\in [0,1/2]:& \displaystyle \rho (x,y)=s\hspace{2.5pt}\text{implies that}\hspace{2.5pt}\rho \big(f_{j}(x),f_{j}(y)\big)=s\\{} & \displaystyle \text{ for any }j=1,\dots ,N\text{ and }(x,y)\in {\mathbb{S}}^{1}\times {\mathbb{S}}^{1}\big\},\end{array}\]
be the set of ρ-distances which are simultaneously preserved by all maps in F.
Remark 5.
Note that since all maps of the IFS are homeomorphisms it follows that for every $x,y\in {\mathbb{S}}^{1}$ with $\rho (x,y)\in L(F,\rho )$ we have
\[\rho (x,y)=\rho \big(f_{j}(x),f_{j}(y)\big)=\rho \big({f_{j}^{-1}}(x),{f_{j}^{-1}}(y)\big)\hspace{1em}\text{for all}\hspace{2.5pt}j=1,\dots ,N,\]
and thus $L(F,\rho )=L({F}^{-1},\rho )$. Moreover, note that by continuity of the maps of the IFS, the set L is closed.

We have the following dichotomy.
Lemma 1.
If $L=L(F,\rho )$ is finite, then
\[L=\big\{0,1/k,2/k,\dots ,\lfloor k/2\rfloor /k\big\}\]
for some $k\ge 1$.
If $L=L(F,\rho )$ is infinite, then $L=[0,1/2]$. All IFS maps are then isometries (with respect to ρ).
Proof.
Consider the operation $\oplus :L\times L\to {\mathbb{S}}^{1}$ defined by
Note that L is closed under this operation, that is, $\oplus :(L\times L)\to L$. Indeed, given $s_{1},s_{2}\in L$, if $x,z\in {\mathbb{S}}^{1}$ are such that $\rho (x,z)=s_{1}\oplus s_{2}$, then there is a point $y\in {\mathbb{S}}^{1}$ such that $\rho (x,y)=s_{1}$ and $\rho (y,z)=s_{2}$. Thus, we have $\rho (f_{j}(x),f_{j}(y))=s_{1}$ and $\rho (f_{j}(y),f_{j}(z))=s_{2}$ for every $j=1,\dots ,N$. Since all maps $f_{j}$ are homeomorphisms, it follows that $\rho (x,z)=\rho (f_{j}(x),f_{j}(z))$ for all $j=1,\dots ,N$ and hence $s_{1}\oplus s_{2}\in L$.
It follows that if L is finite (and nontrivial) then the smallest positive element of L must be a rational number of the form $1/k$ for some integer $k>1$, and hence L must have the given form. Indeed, if s denotes the smallest positive element of L, then, since L is closed under ⊕, all distances realized by the multiples of s belong to L; if s were not of the form $1/k$, some such multiple would realize a positive distance smaller than s, contradicting minimality.
If L is infinite, then $L=[0,1/2]$, since L has then arbitrary small positive elements and must therefore be a dense, and by continuity of all maps in F, also a closed subset of $[0,1/2]$. All IFS maps are then isometries. □
Remark 6.
If $L(F,d)$ is finite and $1/k$ is its smallest positive element, then the IFS $\widetilde{F}=\{\tilde{f}_{j}\}$ with maps $\tilde{f}_{j}(x)=k(f_{j}(x/k)\hspace{2.5pt}\text{mod }1/k)$, $j=1,\dots ,N$, satisfies $L(\widetilde{F},d)=\{0\}$. Thus, we can describe the dynamical properties of an IFS with the set of preserved distances $L(F,d)$ being finite in terms of the dynamics of an IFS with no positive preserved distances. Observe that each of the maps $f_{j}$ is semiconjugate with $\tilde{f}_{j}$ by means of the map $\pi :{\mathbb{S}}^{1}\to {\mathbb{S}}^{1}$ defined by $\pi (x)=kx\hspace{0.3em}\mathrm{mod} \hspace{0.3em}1$, that is, we have $\pi \circ f_{j}=\tilde{f}_{j}\circ \pi $.
Intuitively we may in all cases regard the infimum of all positive elements of $L=L(F,d)$ as the “common prime period” of all maps, where the case when L is infinite corresponds to a degenerate case. As mentioned above, for orientation-preserving homeomorphisms this number can be compared with the rotation number functions in [12, 19, 13].
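To make the construction of Remark 6 concrete, the sketch below takes an illustrative homeomorphism of our own choosing that commutes with the rotation by $1/k$ (so that $1/k\in L(F,d)$), builds $\tilde{f}(x)=k(f(x/k)\hspace{0.2em}\mathrm{mod}\hspace{0.2em}1/k)$, and checks the semiconjugacy $\pi \circ f=\tilde{f}\circ \pi $ with $\pi (x)=kx\hspace{0.2em}\mathrm{mod}\hspace{0.2em}1$ on a grid.

```python
import numpy as np

k = 3
# Illustrative homeomorphism commuting with the rotation by 1/k (our own choice),
# so that the distance 1/k is simultaneously preserved.
f = lambda x: (x + (0.1 / k) * np.sin(2 * np.pi * k * x)) % 1.0

pi = lambda x: (k * x) % 1.0                          # factor map of Remark 6
f_tilde = lambda x: k * (f(x / k) % (1.0 / k))        # induced map on the factor circle

xs = np.linspace(0.0, 1.0, 1001, endpoint=False)
gap = np.abs(pi(f(xs)) - f_tilde(pi(xs)))
gap = np.minimum(gap, 1.0 - gap)                      # compare the two sides modulo 1
print("maximal defect of the semiconjugacy:", gap.max())   # numerically ~ 0
```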
3.2 Random iterations
First, recall the following well-known fact about forward minimal IFSs with probabilities on ${\mathbb{S}}^{1}$ (compare also [19, Lemma 2.3.14]). We say that a measure μ has full support if the support of μ is ${\mathbb{S}}^{1}$.
Lemma 2.
Let $(F,p)$ be an IFS with probabilities of homeomorphisms on ${\mathbb{S}}^{1}$ and $\mu _{+}$ be an invariant probability measure for $(F,p)$. If F is forward minimal then $\mu _{+}$ is nonatomic and has full support.
Proof.
By contradiction, suppose that $\mu _{+}$ is atomic. Let $x\in {\mathbb{S}}^{1}$ be a point of maximal positive $\mu _{+}$-mass. By invariance of $\mu _{+}$, we obtain
\[\mu _{+}\big(\{x\}\big)=\sum \limits_{j=1}^{N}p_{j}\mu _{+}\big(\big\{{f_{j}^{-1}}(x)\big\}\big)\]
and hence, since we assume that p is non-degenerate, we have $\mu _{+}(\{{f_{j}^{-1}}(x)\})=\mu _{+}(\{x\})$ for every j. Hence, we obtain that the (nonempty) set
\[A:=\big\{y\in {\mathbb{S}}^{1}:\mu _{+}\big(\{y\}\big)=\mu _{+}\big(\{x\}\big)\big\}\]
satisfies ${f_{j}^{-1}}(A)\subset A$ for every j. Since $\mu _{+}$ is finite, A is finite (and, in particular, closed). Hence, since every ${f_{j}^{-1}}$ is bijective, we in fact have ${f_{j}^{-1}}(A)=A$ and $f_{j}(A)=A$ for every j. Assuming that F is either backward minimal or forward minimal, we hence obtain $A={\mathbb{S}}^{1}$, which is a contradiction. Hence $\mu _{+}$ is nonatomic.

An analogous argument shows that $\mu _{+}$ has full support. Indeed, let the (closed) set $A=\operatorname{supp}\mu _{+}$ denote the support of $\mu _{+}$. By invariance of $\mu _{+}$, for every j we have $\mu _{+}({f_{j}^{-1}}(A))=\mu _{+}(A)=1$ which implies $A\subset {f_{j}^{-1}}(A)$, i.e. $f_{j}(A)\subset A$ for every j, so if $(F,p)$ is forward minimal, then $\mu _{+}$ has full support. □
We say that a probability measure μ on ${\mathbb{S}}^{1}$ is s-invariant for $s\in [0,1]$ if $(R_{s})_{\ast }\mu =\mu $, where $R_{s}(x)=(x+s)\text{ mod }1$. Analogously, we say that an ${\mathbb{S}}^{1}$-valued random variable X is s-invariant if its distribution is s-invariant, in which case X and $R_{s}(X)$ have the same distribution.
Lemma 3.
Let $(F,p)$ be an IFS with probabilities of homeomorphisms on ${\mathbb{S}}^{1}$ which is forward minimal. Then any invariant probability measure for $(F,p)$ is s-invariant for any $s\in L(F,d)$.
Proof.
Let μ be an invariant probability measure for $(F,p)$. Let $s\in L(F,d)$. Consider an arbitrary interval I of length s satisfying $\mu (I)\ge \mu ({I^{\prime }})$ for all other intervals ${I^{\prime }}$ of length s. By invariance of μ we have $\mu (I)=\sum _{j}p_{j}\mu ({f_{j}^{-1}}(I))$. Hence, since p is non-degenerate, it follows that $\mu (I)=\mu ({f_{j}^{-1}}(I))$ for every j.
Since I is of length $s\in L(F,d)=L({F}^{-1},d)$, the interval ${f_{j}^{-1}}(I)$ is also of length s for any j. More generally, the image of I under an arbitrary finite concatenation of functions from ${F}^{-1}$ is an interval of length s with μ-measure equal to $\mu (I)$. By forward minimality and continuity of the maps in F it therefore follows that all intervals of length s have the same μ-measure, equal to $\mu (I)$.
This property implies that μ is s-invariant. Indeed, consider an arbitrary interval $(c,d)$ in ${\mathbb{S}}^{1}$, where $d=R_{\alpha }(c)$, for some $0<\alpha \le 1/2$. If α is larger than s then
\[\begin{array}{r@{\hskip0pt}l}\displaystyle \mu ((c,d))& \displaystyle =\mu \big(\big(c,R_{s}(c)\big)\big)+\mu \big(\big(R_{s}(c),d\big)\big)=\mu \big(\big(d,R_{s}(d)\big)\big)+\mu \big(\big(R_{s}(c),d\big)\big)\\{} & \displaystyle =\mu \big(\big(R_{s}(c),R_{s}(d)\big)\big).\end{array}\]
Otherwise, if α is smaller than or equal to s, then
\[\begin{array}{r@{\hskip0pt}l}\displaystyle \mu ((c,d))+\mu \big(\big(d,R_{s}(c)\big)\big)& \displaystyle =\mu \big(\big(c,R_{s}(c)\big)\big)=\mu (I)=\mu \big(\big(d,R_{s}(d)\big)\big)\\{} & \displaystyle =\mu \big(\big(d,R_{s}(c)\big)\big)+\mu \big(\big(R_{s}(c),R_{s}(d)\big)\big),\end{array}\]
which also implies $\mu ((c,d))=\mu \big(\big(R_{s}(c),R_{s}(d)\big)\big)$. □

Given a measurable transformation $\varPhi :{\mathbb{S}}^{1}\to {\mathbb{S}}^{1}$ and a probability measure μ, we denote by $\varPhi _{\ast }\mu $ the pushforward of μ defined by $\varPhi _{\ast }\mu (E)=\mu ({\varPhi }^{-1}(E))$ for each Borel set E of ${\mathbb{S}}^{1}$.
Remark 7.
Recall that if μ is a nonatomic (i.e. continuous) and fully supported Borel measure on ${\mathbb{S}}^{1}$, then its distribution function defines a homeomorphism $\varPhi :{\mathbb{S}}^{1}\to {\mathbb{S}}^{1}$ and $\varPhi _{\ast }\mu =\mu _{\mathrm{Leb}}$.
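Remark 7 is easy to visualize: for a nonatomic, fully supported measure μ (below an illustrative density of our own choosing), the distribution function $\varPhi (x)=\mu ([0,x])$ pushes μ-distributed samples to (approximately) uniformly distributed ones.

```python
import numpy as np

# Illustrative fully supported probability density on S^1 (assumed example).
dens = lambda x: 1.0 + 0.5 * np.cos(2 * np.pi * x)    # positive and integrates to 1

grid = np.linspace(0.0, 1.0, 10_001)
cdf = np.cumsum(dens(grid)) / dens(grid).sum()        # Phi(x) = mu([0, x]) on the grid
Phi = lambda x: np.interp(x % 1.0, grid, cdf)

rng = np.random.default_rng(3)
u = rng.random(100_000)
samples = np.interp(u, cdf, grid)                     # approximately mu-distributed (inverse transform)
pushed = Phi(samples)                                 # Phi_* mu should be close to Lebesgue

print(np.round(np.histogram(pushed, bins=10, range=(0.0, 1.0))[0] / len(pushed), 3))
```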
We state a preliminary result.
Proposition 1.
Let $(F,p)$ be an IFS with probabilities of homeomorphisms on ${\mathbb{S}}^{1}$ which is backward minimal. Let $\mu _{-}$ be an invariant measure for $({F}^{-1},p)$ and let $\varPhi _{-}:{\mathbb{S}}^{1}\to {\mathbb{S}}^{1}$ be defined by $\varPhi _{-}(x):=\mu _{-}([0,x])$. Then
\[\rho (x,y):=\min \big\{\mu _{-}\big([x,y]\big),\mu _{-}\big([y,x]\big)\big\}\]
is a metric on ${\mathbb{S}}^{1}$ and $(F,p)$ is non-expansive on average with respect to ρ.
The IFS $G={\{g_{j}\}_{j=1}^{N}}$ given by the maps $g_{j}:=\varPhi _{-}\circ f_{j}\circ {\varPhi _{-}^{-1}}$, $j=1,\dots ,N$, with probabilities p is non-expansive on average with respect to d and we have $L(G,d)=L(F,\rho )$.
Proof.
Let $(F,p)$ be an IFS with probabilities of homeomorphisms on ${\mathbb{S}}^{1}$ which is backward minimal. Let $\varPhi _{-}(x)=\mu _{-}([0,x])$, where $\mu _{-}$ is an invariant probability measure for $({F}^{-1},p)$, and define
\[\rho (x,y):=\min \big\{\mu _{-}\big([x,y]\big),\mu _{-}\big([y,x]\big)\big\}.\]
Clearly, $L(G,d)=L(F,\rho )$. By Lemma 2 applied to $({F}^{-1},p)$, $\mu _{-}$ is nonatomic and has full support and hence we have $\rho (x,y)\ge 0$ and $\rho (x,y)=0$ if and only if $x=y$. Moreover, clearly $\rho (x,y)=\rho (y,x)$ and $\rho (x,y)\le \rho (x,z)+\rho (z,y)$. Hence, ρ defines a metric on ${\mathbb{S}}^{1}$. The definition of ρ and the invariance of $\mu _{-}$ together imply
\[\begin{array}{r@{\hskip0pt}l}\displaystyle \sum \limits_{j=1}^{N}p_{j}\rho \big(f_{j}(x),f_{j}(y)\big)& \displaystyle =\sum \limits_{j=1}^{N}p_{j}\min \big\{\mu _{-}\big(\big[f_{j}(x),f_{j}(y)\big]\big),\mu _{-}\big(\big[f_{j}(y),f_{j}(x)\big]\big)\big\}\\{} & \displaystyle =\sum \limits_{j=1}^{N}p_{j}\min \big\{\mu _{-}\big(f_{j}\big([x,y]\big)\big),\mu _{-}\big(f_{j}\big([y,x]\big)\big)\big\}\\{} & \displaystyle \le \min \Bigg\{\sum \limits_{j=1}^{N}p_{j}\mu _{-}\big(f_{j}\big([x,y]\big)\big),\sum \limits_{j=1}^{N}p_{j}\mu _{-}\big(f_{j}\big([y,x]\big)\big)\Bigg\}\\{} & \displaystyle =\min \big\{\mu _{-}\big([x,y]\big),\mu _{-}\big([y,x]\big)\big\}=\rho (x,y),\end{array}\]
which proves that $(F,p)$ is non-expansive on average with respect to ρ. □

The following result can be regarded as the heart of the paper.
Theorem 1.
Let $(F,p)$ be an IFS with probabilities of homeomorphisms on ${\mathbb{S}}^{1}$ which is forward minimal and non-expansive on average with respect to some metric ρ. Then $\rho ({Z_{n}^{x}},{Z_{n}^{y}})$ converges almost surely to an L-valued random variable for any $x,y\in {\mathbb{S}}^{1}$, where $L=L(F,\rho )$.
As an immediate corollary of Proposition 1 and Theorem 1 we get the following result. This type of result is usually referred to as Antonov’s theorem (see [2], where all maps in the IFS are assumed to preserve orientation, see also [13, 14]). Also in our generality, the present corollary is not new and follows (although not explicitly stated) from results by Malicet [17], who studied an even more general setting (without assuming minimality).
Corollary 1.
Let $(F,p)$ be an IFS with probabilities of homeomorphisms on ${\mathbb{S}}^{1}$ which is forward and backward minimal. Then exactly one of the following cases occurs:
-
1) (synchronization) For any $x,y\in {\mathbb{S}}^{1}$ and almost every $\omega \in \varSigma $ we have $d(Z_{n}(x,\omega ),Z_{n}(y,\omega ))\to 0$ as $n\to \infty $.
-
2) (factorization) There exists a positive integer $k\ge 2$ and a homeomorphism $\varPsi :{\mathbb{S}}^{1}\to {\mathbb{S}}^{1}$ of order k (that is, ${\varPsi }^{k}=\mathrm{id}$) which commutes with all $f_{j}$. Moreover, there is a naturally associated IFS $\check{F}=\{\check{f}_{j}\}$ where each map $\check{f}_{j}$ is a topological factor (with a common factoring map) of the corresponding map $f_{j}$ of F such that $(\check{F},p)$ has the synchronization property claimed in item 1).
-
3) (invariance) All maps $f_{j}$ are conjugate (with a common conjugation map) to an isometry (with respect to d). There exists a probability measure which is invariant for all maps $f_{j}$, $j=1,\dots ,N$, and hence also uniquely invariant for $(F,p)$.
Proof.
Apply Proposition 1 to $(F,p)$ and consider the homeomorphism $\varPhi _{-}:{\mathbb{S}}^{1}\to {\mathbb{S}}^{1}$, and the metric ρ such that $(F,p)$ is non-expansive on average with respect to ρ. Consider the IFS $(G,p)$, conjugate to $(F,p)$ through the conjugating map $\varPhi _{-}$, which is non-expansive on average with respect to d and recall $L=L(F,\rho )=L(G,d)$. We consider three cases:
Case $L=\{0\}$. By Theorem 1, we have $\rho ({Z_{n}^{x}},{Z_{n}^{y}})\to 0$ a.s. and thus $d({Z_{n}^{x}},{Z_{n}^{y}})\to 0$ a.s., proving item 1).
Case L finite and nontrivial. By Lemma 1, $L(G,d)=\{0,1/k,\dots ,\lfloor k/2\rfloor /k\}$ for some $k\ge 2$. By Remark 6 applied to $(G,d)$, with $\check{f}_{j}(x)=\tilde{g}_{j}(x):=k(g_{j}(x/k)\hspace{0.3em}\mathrm{mod} \hspace{0.3em}1/k)$ we have $\check{f}_{j}\circ \varPsi =\varPsi \circ f_{j}$, where $\varPsi =\pi \circ {\varPhi _{-}^{-1}}$ with $\pi (x)=kx\hspace{0.3em}\mathrm{mod} \hspace{0.3em}1$, and the IFS $(\check{F},p)$ satisfies $L(\check{F},d)=L(\widetilde{G},d)=\{0\}$.
Since by Lemma 3 we have ${\varPhi _{-}^{-1}}(R_{1/k}(x))=({\varPhi _{-}^{-1}}(x)+1/k)\hspace{0.3em}\mathrm{mod} \hspace{0.3em}1$, it follows that
\[\begin{array}{r@{\hskip0pt}l}\displaystyle \varPsi \big(R_{1/k}(x)\big)& \displaystyle =\big(\pi \circ {\varPhi _{-}^{-1}}\big)\big(R_{1/k}(x)\big)\\{} & \displaystyle =\pi \big(\big({\varPhi _{-}^{-1}}(x)+1/k\big)\hspace{0.3em}\mathrm{mod} \hspace{0.3em}1\big)=\big(\pi \circ {\varPhi _{-}^{-1}}\big)(x)=\varPsi (x),\end{array}\]
and thus Ψ is an order k homeomorphism having the claimed properties, proving item 2).
Case L infinite. By Lemma 1, we have $L(G,d)=[0,1/2]$. All maps in G are thus isometries (with respect to d) and hence simultaneously preserve the Lebesgue measure. The measure $\mu _{+}:=({\varPhi _{-}^{-1}})_{\ast }\mu _{\mathit{Leb}}$ is invariant for all maps of F, and by Lemma 3 uniquely invariant for $(F,p)$, proving item 3). □
Remark 8.
IFSs with nontrivial L can be regarded as degenerate systems. For a typical system satisfying the conditions of Theorem 1 we thus have that $\rho ({Z_{n}^{x}},{Z_{n}^{y}})\to 0$ as $n\to \infty $ a.s. for any $x,y\in {\mathbb{S}}^{1}$. Using techniques from [17, Theorem D] it seems plausible that it should be possible to prove that the convergence is exponential (see also [16]), and that $(F,p)$ is contractive on average with respect to some metric in this case.
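The generic synchronizing behaviour is easy to observe in simulation. The sketch below drives two starting points with the same realization ω for an illustrative forward and backward minimal IFS of our own choosing (an irrational rotation together with a non-isometric homeomorphism) and prints the decaying distance.

```python
import numpy as np

rng = np.random.default_rng(4)

def d(x, y):
    a = abs(x - y) % 1.0
    return min(a, 1.0 - a)

# Illustrative IFS (our own choice): an irrational rotation plus a homeomorphism
# with one attracting and one repelling fixed point.
maps = [lambda x: (x + np.sqrt(2)) % 1.0,
        lambda x: (x + 0.15 * np.sin(2 * np.pi * x)) % 1.0]
p = [0.5, 0.5]

x, y = 0.1, 0.7
dists = []
for n in range(201):
    dists.append(round(d(x, y), 5))
    j = rng.choice(2, p=p)                 # the same omega drives both trajectories
    x, y = maps[j](x), maps[j](y)

print(dists[::40])                         # the distance typically decays towards 0
```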
Proof of Theorem 1.
Let $(F,p)$ be a forward minimal IFS which is non-expansive on average with respect to ρ. Let $\mathscr{F}_{n}$ be the sigma field generated by $I_{1},\dots ,I_{n}$. Fix $x,y\in {\mathbb{S}}^{1}$. Note that ${Z_{n}^{x}}$ and ${Z_{n}^{y}}$ are both measurable with respect to $\mathscr{F}_{n}$ and
\[\begin{array}{r@{\hskip10.0pt}c@{\hskip10.0pt}l}\displaystyle \mathbb{E}\big(\rho \big({Z_{n+1}^{x}},{Z_{n+1}^{y}}\big)|\mathscr{F}_{n}\big)& \displaystyle =& \displaystyle \mathbb{E}\big(\rho \big(f_{I_{n+1}}\big({Z_{n}^{x}}\big),f_{I_{n+1}}\big({Z_{n}^{y}}\big)\big)|\mathscr{F}_{n}\big)\\{} & \displaystyle =& \displaystyle \sum \limits_{j=1}^{N}p_{j}\rho \big(f_{j}\big({Z_{n}^{x}}\big),f_{j}\big({Z_{n}^{y}}\big)\big)\le \rho \big({Z_{n}^{x}},{Z_{n}^{y}}\big),\end{array}\]
so the stochastic sequence $(\rho ({Z_{n}^{x}},{Z_{n}^{y}}))_{n\ge 0}$ is a bounded supermartingale with respect to the filtration $\{\mathscr{F}_{n}\}$. By the martingale convergence theorem it follows that $\rho ({Z_{n}^{x}},{Z_{n}^{y}})\stackrel{\text{a.s.}}{\to }\xi $ as $n\to \infty $ for some random variable $\xi ={\xi }^{x,y}$.

Let $L=L(F,\rho )$. We will now show that ξ is L-valued a.s., that is, we will show that the distance between any two points $a,b\in {\mathbb{S}}^{1}$ with $\rho (a,b)=\xi (\omega )$ is preserved by all the maps in F for P a.a. $\omega \in \varSigma $.
We will show that any two points $a,b\in {\mathbb{S}}^{1}$ with $\rho (a,b)=\xi (\omega )$ can simultaneously be (almost) reached by $\{Z_{n}(x,\omega ),Z_{n}(y,\omega )\}$, followed by an application of an arbitrary map, for infinitely many n, and that this leads to a contradiction for a typical ω if the distance between some pair of points at distance $\xi (\omega )$ is not preserved by all maps in F.
Let us first prove the following claim: for any z and any index j, for typical realizations ω the trajectory $(Z_{n}(z,\omega ))_{n\ge 0}$ visits any open set in ${\mathbb{S}}^{1}$, followed by an application of the map $f_{j}$, infinitely many times.
Claim 1.1.
For any $z\in {\mathbb{S}}^{1}$, any open set $O\subset {\mathbb{S}}^{1}$ and any $j\in \{1,\dots ,N\}$ we have
\[P\big({Z_{n}^{z}}\in O,\hspace{0.2778em}I_{n+1}=j\hspace{2.5pt}\text{for infinitely many}\hspace{2.5pt}n\big)=1.\]
Proof.
Let $z\in {\mathbb{S}}^{1}$. Consider an open set $O\subset {\mathbb{S}}^{1}$ and an index $j\in \{1,\dots ,N\}$. By forward minimality, for every $q\in {\mathbb{S}}^{1}$ there exist some positive integer $n_{q}$ and some $c_{q}>0$ such that
(8)
\[P\big({Z_{n_{q}}^{q}}\in O,I_{n_{q}+1}=j\big)=P\big({Z_{n_{q}}^{q}}\in O\big)P(I_{n_{q}+1}=j)>c_{q}>0.\]
Considering the left hand side expression in (8) as a function of q, by continuity (recall the weak Feller property) one concludes that there exist an open set $O_{q}$ containing q and some ${c^{\prime }_{q}}>0$ such that
\[P\big({Z_{n_{q}}^{z}}\in O,I_{n_{q}+1}=j\big)>{c^{\prime }_{q}}>0\]
for any $z\in O_{q}$. Thus, by compactness, there exists a positive integer N such that
\[\underset{q\in {\mathbb{S}}^{1}}{\inf }P\big({Z_{n}^{q}}\in O,I_{n+1}=j\hspace{2.5pt}\text{for some}\hspace{2.5pt}n<N\big)=:s>0.\]Let
\[\begin{array}{r@{\hskip0pt}l}\displaystyle A_{m}& \displaystyle :=\big\{\omega :Z_{n}(z,\omega )\in O,\omega _{n+1}=j,\text{ for some }n\in \big\{mN,\dots ,(m+1)N-1\big\}\big\}\\{} & \displaystyle =\big\{\omega :Z_{n-mN}\big(Z_{mN}(z,\omega ),{\sigma }^{mN}(\omega )\big)\in O,\omega _{n+1}=j,\\{} & \displaystyle \phantom{=\{\omega :Z_{n}(z,\omega )\in O,\omega _{n+1}=j,}\text{ for some }n\in \big\{mN,\dots ,(m+1)N-1\big\}\big\}.\end{array}\]
For any $m\ge 1$ we have
\[\begin{array}{r@{\hskip0pt}l}\displaystyle P(A_{m})& \displaystyle \ge \underset{q\in {\mathbb{S}}^{1}}{\inf }P\big(\big\{\omega :Z_{n-mN}\big(q,{\sigma }^{mN}(\omega )\big)\in O,\omega _{n+1}=j,\\{} & \displaystyle \phantom{=\{\omega :Z_{n}(z,\omega )\in O,\omega _{n+1}=j,}\text{ for some }n\in \big\{mN,\dots ,(m+1)N-1\big\}\big\}\big)\\{} & \displaystyle =\underset{q\in {\mathbb{S}}^{1}}{\inf }P\big({Z_{n}^{q}}\in O,I_{n+1}=j\text{ for some }n<N\big)=s>0\end{array}\]
and hence $P({A_{m}^{c}})\le 1-s$. More generally, we can similarly show that
\[P\big({A_{j}^{c}}\cap {A_{j+1}^{c}}\cap \cdots \cap {A_{k}^{c}}\big)\le {(1-s)}^{k-j+1}\]
for any $j<k$, which implies
\[P\Bigg(\bigcap \limits_{m\ge j}{A_{m}^{c}}\Bigg)=0\hspace{1em}\text{for every}\hspace{2.5pt}j\ge 1.\]
This implies the assertion. □

We can now choose Ω with $P(\varOmega )=1$ such that for any $\omega \in \varOmega $, for any a priori fixed index j, the trajectory $(Z_{n}(x,\omega ))_{n\ge 0}$ visits infinitely many times any open interval followed by an application of $f_{j}$. Indeed, let $\{a_{k}\}_{k}$ be a dense set in ${\mathbb{S}}^{1}$ and for every index pair $(k,\ell )\in {\mathbb{N}}^{2}$ let ${\varOmega _{k,\ell }^{j}}$ be the set provided by the Claim for the point x, the index j, and the open set $O_{k,\ell }=(a_{k}-1/\ell ,a_{k}+1/\ell )$. Let
\[\varOmega :=\bigcap \limits_{j=1}^{N}\bigcap \limits_{k\in \mathbb{N}}\bigcap \limits_{\ell \in \mathbb{N}}{\varOmega _{k,\ell }^{j}}\]
and note that $P(\varOmega )=1$.

By the above, without loss of generality, we can also assume that Ω is such that for every $\omega \in \varOmega $ we have $\rho (Z_{n}(x,\omega ),Z_{n}(y,\omega ))\to \xi (\omega )$ as $n\to \infty $.
Fix $\omega \in \varOmega $. Let $a,b,c$ be points in ${\mathbb{S}}^{1}$, with $\rho (a,b)=\rho (a,c)=\xi (\omega )$, where b is obtained from a by a clockwise rotation and c is obtained from a by a counter-clockwise rotation. Note that if $0<\xi (\omega )<1/2$ then the points $a,b,c$ are distinct, and otherwise $b=c$. By definition of Ω we know that if $O_{a}$ is an open set containing a, $O_{b}$ is an open set containing b, and $O_{c}$ is an open set containing c, then there are infinitely many n such that $Z_{n}(x,\omega )\in O_{a}$ and either $Z_{n}(y,\omega )\in O_{b}$ or $Z_{n}(y,\omega )\in O_{c}$. We say that a is clockwise nice if for arbitrarily small open sets $O_{a}$ and $O_{b}$ containing a and b, respectively, either $Z_{n}(x,\omega )\in O_{a}$ and $Z_{n}(y,\omega )\in O_{b}$ simultaneously, or $Z_{n}(y,\omega )\in O_{a}$ and $Z_{n}(x,\omega )\in O_{b}$ simultaneously, for infinitely many n, and counterclockwise nice if for arbitrarily small open sets $O_{a}$ and $O_{c}$ containing a and c, respectively, either $Z_{n}(x,\omega )\in O_{a}$ and $Z_{n}(y,\omega )\in O_{c}$ simultaneously, or $Z_{n}(y,\omega )\in O_{a}$ and $Z_{n}(x,\omega )\in O_{c}$ simultaneously, for infinitely many n. We call a nice if a is both clockwise nice and counterclockwise nice.
Claim 1.2.
Every point $a\in {\mathbb{S}}^{1}$ is nice.
Proof.
We first prove that there exist both clockwise nice and counterclockwise nice points. Indeed, by definition of Ω, any $a\in {\mathbb{S}}^{1}$ is either clockwise nice, counterclockwise nice, or nice. By contradiction, suppose that all points $a\in {\mathbb{S}}^{1}$ are only clockwise nice (the case that all points are only counterclockwise nice is analogous). Then, in particular, a given point a and the point c obtained from a by a counterclockwise rotation would both be only clockwise nice. But c being clockwise nice would imply that a is counterclockwise nice, a contradiction.
Thus, there exist points of either type which are arbitrarily close to each other. Since, by their definition, the sets of clockwise nice and of counterclockwise nice points are closed, there exists at least one point in ${\mathbb{S}}^{1}$ which is nice.
By definition of Ω it follows that nice points are mapped to nice points by all maps, so by forward minimality it follows that every point in ${\mathbb{S}}^{1}$ is nice. □
Let us now prove that the distance between any two points $a,b\in {\mathbb{S}}^{1}$ with $\rho (a,b)=\xi (\omega )$ is preserved by all the maps in F. Arguing by contradiction, suppose that $\xi (\omega )\notin L$, and consider an interval $[a,b]$ with $\rho (a,b)=\xi (\omega )$ such that for some $j\in \{1,\dots ,N\}$ we have
\[\rho \big(f_{j}(a),f_{j}(b)\big)\ne \rho (a,b).\]
By continuity of $f_{j}$, there exist open intervals $O_{a}$ and $O_{b}$ containing a and b, respectively, and some positive number ε such that for any ${a^{\prime }}\in O_{a}$ and any ${b^{\prime }}\in O_{b}$ we have
\[\big|\rho \big({a^{\prime }},{b^{\prime }}\big)-\rho \big(f_{j}\big({a^{\prime }}\big),f_{j}\big({b^{\prime }}\big)\big)\big|>\varepsilon .\]
By choice of Ω and the fact that a is nice, there exist arbitrarily large integers n such that $I_{n+1}(\omega )=\omega _{n+1}=j$ and either $Z_{n}(x,\omega )\in O_{a}$ and $Z_{n}(y,\omega )\in O_{b}$ simultaneously, or $Z_{n}(y,\omega )\in O_{a}$ and $Z_{n}(x,\omega )\in O_{b}$ simultaneously. Hence
\[\big|\rho \big(Z_{n}(x,\omega ),Z_{n}(y,\omega )\big)-\rho \big(Z_{n+1}(x,\omega ),Z_{n+1}(y,\omega )\big)\big|>\varepsilon ,\]
contradicting the assumption that $\omega \in \varOmega $.

This completes the proof that for any $x,y\in {\mathbb{S}}^{1}$, $\rho ({Z_{n}^{x}},{Z_{n}^{y}})$ converges almost surely to an L-valued random variable. □
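The conclusion of Theorem 1, that $\rho ({Z_{n}^{x}},{Z_{n}^{y}})$ settles on a value in L, can be watched in a toy example with nontrivial L. In the sketch below (an illustrative IFS of our own choosing) both maps commute with the rotation by $1/2$, so that $1/2\in L(F,d)$; over independent realizations the final distances cluster near 0 or near $1/2$.

```python
import numpy as np

def d(x, y):
    a = abs(x - y) % 1.0
    return min(a, 1.0 - a)

# Illustrative IFS (our own choice) whose maps commute with the rotation by 1/2,
# so that the distance 1/2 is simultaneously preserved by all maps.
maps = [lambda x: (x + np.sqrt(2)) % 1.0,
        lambda x: (x + 0.05 * np.sin(4 * np.pi * x)) % 1.0]
p = [0.5, 0.5]

rng = np.random.default_rng(5)
finals = []
for _ in range(10):                        # ten independent realizations omega
    x, y = 0.1, 0.52
    for _ in range(500):
        j = rng.choice(2, p=p)
        x, y = maps[j](x), maps[j](y)
    finals.append(round(d(x, y), 3))

print(finals)                              # each entry lies close to an element of {0, 1/2}
```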
The following result about uniqueness of invariant probability measures is not new and was, to the best of our knowledge, first proved in [17]. A simple direct proof based on equicontinuity was recently presented in [22]. Note that equicontinuity of $\{{T}^{n}h\}$, where ${T}^{n}h(x)=\int h(Z_{n}(x,\omega ))\hspace{0.1667em}dP(\omega )$ for any Lipschitz continuous function $h:{\mathbb{S}}^{1}\to \mathbb{R}$, follows trivially from Proposition 1. Indeed, if ρ is the metric of Proposition 1, then $\int \rho (Z_{n}(x,\omega ),Z_{n}(y,\omega ))\hspace{0.1667em}dP(\omega )\le \rho (x,y)$. For completeness we will show that uniqueness of invariant probability measures is also a very simple consequence of Theorem 1.
Corollary 2.
Any IFS $(F,p)$ with probabilities of homeomorphisms on ${\mathbb{S}}^{1}$ which is forward and backward minimal has a unique invariant probability measure $\mu _{+}$.
Proof.
Let $\mu _{-}$ be an invariant probability measure for $({F}^{-1},p)$ and define the metric ρ by $\rho (x,y):=\min \{\mu _{-}([x,y]),\mu _{-}([y,x])\}$. By Proposition 1, the IFS $G=\{g_{j}\}_{j}$ defined by $g_{j}:=\varPhi _{-}\circ f_{j}\circ {\varPhi _{-}^{-1}}$, where $\varPhi _{-}(x)=\mu _{-}([0,x])$, with probabilities p is non-expansive on average with respect to d and we have $L:=L(G,d)=L(F,\rho )$.
By Theorem 1, with $Z_{n}$ as in (1) and $W_{n}:=\varPhi _{-}\circ Z_{n}\circ {\varPhi _{-}^{-1}}$, we have that $d({W_{n}^{x}},{W_{n}^{y}})$ converges almost surely to an L-valued random variable as $n\to \infty $, for any $x,y\in {\mathbb{S}}^{1}$.
We are now going to show that there is a unique invariant probability measure $\nu _{+}$ for the IFS $(G,p)$. This will imply that $\mu _{+}:=({\varPhi _{-}^{-1}})_{\ast }\nu _{+}$ is the unique invariant probability measure for $(F,p)$.
Let us divide the proof into cases:
Case $L=\{0\}$. Consider first the (generic) case $L=\{0\}$. Thus, $d({W_{n}^{x}},{W_{n}^{y}})\to 0$ as $n\to \infty $ a.s. for any $x,y\in {\mathbb{S}}^{1}$. Let $\nu _{+}$ be an invariant probability measure for $(G,p)$, that is, a stationary distribution for $({W_{n}^{x}})_{n\ge 0}$ (recall Remark 1). For any $x,y\in {\mathbb{S}}^{1}$ and for any continuous $h:{\mathbb{S}}^{1}\to \mathbb{R}$, by Lebesgue’s dominated convergence theorem
\[{T}^{n}h(x)-{T}^{n}h(y)=\int _{\varSigma }h\big(W_{n}(x,\omega )\big)dP(\omega )-\int _{\varSigma }h\big(W_{n}(y,\omega )\big)dP(\omega )\to 0\]
as $n\to \infty $, and thus by invariance of $\nu _{+}$ we have
\[\bigg|{T}^{n}h(x)-\int h\hspace{0.1667em}d\nu _{+}\bigg|=\bigg|{T}^{n}h(x)-\int {T}^{n}h\hspace{0.1667em}d\nu _{+}\bigg|\le \int \big|{T}^{n}h(x)-{T}^{n}h(y)\big|\hspace{0.1667em}d\nu _{+}(y)\]
and by Lebesgue’s dominated convergence theorem the latter tends to 0 as $n\to \infty $. This implies that $\nu _{+}$ must be attractive and thus unique (recall Remark 3).
Case $L=\{0,1/k,\dots ,\lfloor k/2\rfloor /k\}$ for some $k\ge 2$. By Lemma 3 all invariant probability measures for $(G,p)$ are $1/k$-invariant. By contradiction, suppose that there are two distinct invariant probability measures ${\nu _{+}^{1}}$ and ${\nu _{+}^{2}}$ for $(G,p)$. Hence, if X and Y are two random variables with distributions ${\nu _{+}^{1}}$ and ${\nu _{+}^{2}}$, respectively, independent of $\{I_{n}\}$, then ${W_{n}^{X}}\hspace{0.3em}\mathrm{mod} \hspace{0.3em}1/k$ and ${W_{n}^{Y}}\hspace{0.3em}\mathrm{mod} \hspace{0.3em}1/k$ also have distinct distributions for any fixed $n\ge 0$, by $1/k$-invariance of ${\nu _{+}^{1}}$ and ${\nu _{+}^{2}}$. The latter is however impossible since the IFS $\widetilde{G}=\{\tilde{g}_{j}\}$ defined by $\tilde{g}_{j}(x)=k(g_{j}(x/k)\hspace{0.3em}\mathrm{mod} \hspace{0.3em}1/k)$, $j=1,\dots ,N$, satisfies $L(\widetilde{G},d)=\{0\}$ (recall Remark 6), and therefore, by the previous case, the distributions of ${W_{n}^{X}}\hspace{0.3em}\mathrm{mod} \hspace{0.3em}1/k$ and ${W_{n}^{Y}}\hspace{0.3em}\mathrm{mod} \hspace{0.3em}1/k$ converge to the same limit as $n\to \infty $. The invariant probability measure, $\nu _{+}$, is therefore unique.
Case L infinite. Then $L(G,d)=[0,1/2]$ and all maps in G are isometries with respect to d. By Lemma 3, any invariant probability measure for $(G,p)$ is s-invariant for every $s\in [0,1/2]$ and hence equals the Lebesgue measure, so $\nu _{+}$ is unique also in this case. □
By applying Breiman’s ergodic theorem for Feller chains with a unique stationary distribution starting at a point (see, for example, [6] or [18]), we get the following result. Let $\delta _{x}$ denote the Dirac measure concentrated in the point $x\in {\mathbb{S}}^{1}$, and let
\[{\mu _{n}^{x}}(\omega ):=\frac{1}{n}\sum \limits_{i=0}^{n-1}\delta _{Z_{i}(x,\omega )}\]
denote the empirical distribution along the trajectory starting at $x\in {\mathbb{S}}^{1}$ determined by $\omega \in \varSigma $ at time $n-1$.
Corollary 3.
Let $(F,p)$ be an IFS with probabilities of homeomorphisms on ${\mathbb{S}}^{1}$ which is forward and backward minimal and let $\mu _{+}$ denote its unique invariant probability measure. Then ${\mu _{n}^{x}}(\omega )$ converges to $\mu _{+}$ (in the weak∗ sense) P a.s. for any $x\in {\mathbb{S}}^{1}$.
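Corollary 3 can be illustrated numerically by computing the empirical distribution ${\mu _{n}^{x}}(\omega )$ along a single long trajectory for two different starting points; for the illustrative IFS used in the earlier sketches (again our own choice) the two histograms should be close to each other, approximating the same $\mu _{+}$.

```python
import numpy as np

rng = np.random.default_rng(6)
maps = [lambda x: (x + np.sqrt(2)) % 1.0,
        lambda x: (x + 0.15 * np.sin(2 * np.pi * x)) % 1.0]
p = [0.5, 0.5]

def empirical_histogram(x0, n_steps, bins=20):
    """Histogram of mu_n^x(omega) = (1/n) * sum_{i<n} delta_{Z_i(x, omega)} for one realization."""
    js = rng.choice(2, size=n_steps, p=p)     # one realization omega
    x, visits = x0, np.empty(n_steps)
    for i, j in enumerate(js):
        visits[i] = x                          # record Z_i(x, omega)
        x = maps[j](x)
    hist, _ = np.histogram(visits, bins=bins, range=(0.0, 1.0))
    return hist / n_steps

print(np.round(empirical_histogram(0.1, 100_000), 3))
print(np.round(empirical_histogram(0.9, 100_000), 3))
```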
Let $\stackrel{\text{d}}{\to }$ denote convergence in distribution. We are now ready to state our first result about invariant measures/stationary distributions for the IFS with probabilities generated by the inverse maps.
Proposition 2.
Let $(F,p)$ be an IFS with probabilities of homeomorphisms on ${\mathbb{S}}^{1}$ which is forward minimal and non-expansive on average with respect to d. Assume that some map $f_{j}$ is not an isometry (with respect to d). Then $L(F,d)=\{0,1/k,\dots ,\lfloor k/2\rfloor /k\}$ for some $k\ge 1$ and for any $1/k$-invariant nonatomic and fully supported random variable X on ${\mathbb{S}}^{1}$, independent of $(I_{n})_{n\ge 0}$, we have
\[{\widehat{Z}_{n}^{-}}(X,\omega )\stackrel{\text{d}}{\to }{\widehat{Z}}^{-}(\omega )\]
as $n\to \infty $ for P a.a. $\omega \in \varSigma $, where ${\widehat{Z}}^{-}(\omega )$ is a random variable with distribution
\[{\mu _{\omega }^{-}}=\frac{1}{k}\sum \limits_{i=0}^{k-1}\delta _{\frac{1}{k}(i+{\widehat{Z}}^{-}(\omega ))}\]
for some random variable ${\widehat{Z}}^{-}:\varSigma \to {\mathbb{S}}^{1}$ and ${\mu _{\omega }^{-}}=({f_{\omega _{1}}^{-1}})_{\ast }{\mu _{\sigma (\omega )}^{-}}$ for P a.a. $\omega \in \varSigma $.
Proof.
Let $L=L(F,d)$. By Lemma 1 together with our hypotheses, we have $L=\{0,1/k,\dots ,\lfloor k/2\rfloor /k\}$ for some $k\ge 1$. Hence, if $d(x,y)=s\in L$, then
\[d\big(f_{j}(x),f_{j}(y)\big)=s\]
for all $j=1,\dots ,N$ and thus
\[d\big(Z_{n}(x,\omega ),Z_{n}(y,\omega )\big)=s\]
for any $\omega \in \varSigma $ and $n\ge 0$.
Let us denote by $Z_{n}$ and ${\widehat{Z}_{n}^{-}}$ the sequences defined in (1) and (3), respectively. By Theorem 1 we have that $d({Z_{n}^{x}},{Z_{n}^{y}})$ converges almost surely to an L-valued random variable as $n\to \infty $.
Given $\omega \in \varSigma $, let
\[{\widehat{Z}}^{-}(\omega ):=k\sup \big\{y:\big|Z_{n}\big([0,y],\omega \big)\big|\to 0,\text{ as }n\to \infty \big\},\]
where $|\cdot |$ denotes the length of an interval and where we use the notation
\[Z_{n}\big([0,y],\omega \big):=\big\{Z_{n}(z,\omega ):z\in [0,y]\big\}.\]
Note that $y\mapsto |Z_{n}([0,y],\omega )|$ is an increasing function, for each fixed n and ω. Further, $|Z_{n}([0,y],\omega )|$ converges to an element in $\{0,1/k,\dots ,1\}$ as $n\to \infty $, for any $y\in {\mathbb{S}}^{1}$ for P a.a. $\omega \in \varSigma $. Indeed, this follows from the fact that $d({Z_{n}^{x}},{Z_{n}^{y}})$ converges to an element of L and the fact that $x\mapsto {Z_{n}^{x}}$ is a random homeomorphism. So ${\widehat{Z}}^{-}:\varSigma \to {\mathbb{S}}^{1}$ is a well-defined random variable.

Let m be an arbitrary $1/k$-invariant nonatomic probability measure fully supported on ${\mathbb{S}}^{1}$. Note that if I is an interval of length $i/k$, then $m(I)=i/k$ for any $0\le i\le k$. If $x\notin \{{\widehat{Z}}^{-}(\omega )/k,({\widehat{Z}}^{-}(\omega )+1)/k,\dots ,({\widehat{Z}}^{-}(\omega )+(k-1))/k\}$, then
\[\begin{array}{r@{\hskip0pt}l}\displaystyle m& \displaystyle \big(\big\{y\in {\mathbb{S}}^{1}:{\widehat{Z}_{n}^{-}}(y,\omega )\le x\big\}\big)=m\big(\big\{y\in {\mathbb{S}}^{1}:{\widehat{Z}_{n}^{-}}(y,\omega )\in [0,x]\big\}\big)\\{} & \displaystyle =m\big(\big\{y\in {\mathbb{S}}^{1}:Z_{n}\big({\widehat{Z}_{n}^{-}}(y,\omega ),\omega \big)\in Z_{n}\big([0,x],\omega \big)\big\}\big)\\{} & \displaystyle =m\big(\big\{y\in {\mathbb{S}}^{1}:y\in Z_{n}\big([0,x],\omega \big)\big\}\big)\\{} & \displaystyle \to \left\{\begin{array}{l@{\hskip10.0pt}l}0\hspace{1em}& \text{ if }x<\displaystyle \frac{{\widehat{Z}}^{-}(\omega )}{k},\\{} \displaystyle \frac{i}{k}\hspace{1em}& \text{ if }\displaystyle \frac{{\widehat{Z}}^{-}(\omega )+(i-1)}{k}<x<\displaystyle \frac{{\widehat{Z}}^{-}(\omega )+i}{k},\hspace{0.1667em}\hspace{0.1667em}1\le i\le k-1,\\{} 1\hspace{1em}& \text{ if }x>\displaystyle \frac{{\widehat{Z}}^{-}(\omega )+(k-1)}{k}\end{array}\right.\end{array}\]
as $n\to \infty $ for P a.a. $\omega \in \varSigma $. Thus, if X is an m-distributed random variable on ${\mathbb{S}}^{1}$, independent of $(I_{n})_{n\ge 0}$ and ${\widehat{Z}}^{-}(\omega )$ has distribution
\[{\mu _{\omega }^{-}}=\frac{1}{k}\sum \limits_{i=0}^{k-1}\delta _{(i+{\widehat{Z}}^{-}(\omega ))/k}\hspace{1em}\text{ for }P\text{ a.a. }\omega \in \varSigma \]
then $m\big({\widehat{Z}_{n}^{-}}(X,\omega )\le x\big)\to P\big({\widehat{Z}}^{-}(\omega )\le x\big)$ as $n\to \infty $ if x is a continuity point of the cumulative distribution function of ${\widehat{Z}}^{-}(\omega )$ (for P a.a. $\omega \in \varSigma $). Thus, ${\widehat{Z}_{n}^{-}}(X,\omega )$ converges in distribution to ${\widehat{Z}}^{-}(\omega )$ as $n\to \infty $ for P a.a. $\omega \in \varSigma $. By taking limits in the equality ${\widehat{Z}_{n}^{-}}(X,\omega )={f_{\omega _{1}}^{-1}}({\widehat{Z}_{n-1}^{-}}(X,\sigma (\omega )))$, it therefore follows that
\[{\widehat{Z}}^{-}(\omega )\stackrel{\text{d}}{=}{f_{\omega _{1}}^{-1}}\big({\widehat{Z}}^{-}\big(\sigma (\omega )\big)\big)\]
for P a.a. ω. Thus if ${\mu _{\omega }^{-}}$ denotes the distribution of ${\widehat{Z}}^{-}(\omega )$, then
\[{\mu _{\omega }^{-}}=\big({f_{\omega _{1}}^{-1}}\big)_{\ast }{\mu _{\sigma (\omega )}^{-}}\]
for P a.a. ω.

By integrating both sides of this equality with respect to P (recall that P is a Bernoulli measure determined by a probability vector $p=(p_{1},\dots ,p_{N})$) we thus obtain that
\[\begin{array}{r@{\hskip10.0pt}c@{\hskip10.0pt}l}\displaystyle \mu _{-}& \displaystyle :=& \displaystyle \int {\mu _{\omega }^{-}}\hspace{0.1667em}dP(\omega )=\sum \limits_{j=1}^{N}\int _{\omega :\omega _{1}=j}\big({f_{\omega _{1}}^{-1}}\big)_{\ast }{\mu _{\sigma (\omega )}^{-}}\hspace{0.1667em}dP(\omega )\\{} & \displaystyle =& \displaystyle \sum \limits_{j=1}^{N}\int _{\omega :\omega _{1}=j}\big({f_{j}^{-1}}\big)_{\ast }{\mu _{\sigma (\omega )}^{-}}\hspace{0.1667em}dP(\omega )\\{} & \displaystyle =& \displaystyle \sum \limits_{j=1}^{N}\int _{\omega :\omega _{1}=j}\big({f_{j}^{-1}}\big)_{\ast }\mu _{-}\hspace{0.1667em}dP(\omega )\\{} & \displaystyle =& \displaystyle \sum \limits_{j=1}^{N}p_{j}\big({f_{j}^{-1}}\big)_{\ast }\mu _{-}\hspace{0.1667em}\end{array}\]
and $\mu _{-}$ is therefore invariant for $({F}^{-1},p)$. Since by construction ${\mu _{\omega }^{-}}$ is independent of X, it follows that $\mu _{-}$ is indeed uniquely invariant. □

Corollary 4.
Let $(F,p)$ be an IFS with probabilities of homeomorphisms on ${\mathbb{S}}^{1}$ which is forward and backward minimal. Assume that not all maps in F are conjugate (with a common conjugation map) to an isometry (with respect to d). Let $\mu _{-}$ be an invariant probability measure for $({F}^{-1},p)$, and let k be the largest integer such that $\mu _{-}$ is $1/k$-invariant. Then, if X is a $\mu _{-}$-distributed random variable, independent of $(I_{n})_{n\ge 0}$, we have
(9)
\[{\widehat{Z}_{n}^{-}}(X,\omega )\stackrel{\text{d}}{\to }{\widehat{Z}}^{-}(\omega )\]
as $n\to \infty $ for P a.a. $\omega \in \varSigma $, where ${\widehat{Z}}^{-}(\omega )$ is a random variable with distribution ${\mu _{\omega }^{-}}$, uniformly distributed on k distinct points, and satisfying ${\mu _{\omega }^{-}}=({f_{\omega _{1}}^{-1}})_{\ast }{\mu _{\sigma (\omega )}^{-}}$ for P a.a. $\omega \in \varSigma $. It therefore follows that $\mu _{-}$ is unique and given by $\mu _{-}=\int {\mu _{\omega }^{-}}\hspace{0.1667em}dP(\omega )$.
Remark 10.
Convergence in (9) also follows from Furstenberg’s martingale argument [11], but here we say more about the limit: the limit is $1/k$-invariant and independent of X (this implies that $\mu _{-}$ is uniquely invariant) and the limiting fiber measures ${\mu _{\omega }^{-}}$ are uniform and supported on sets of size k.
Proof.
Let $\mu _{-}$ be an invariant probability measure for $({F}^{-1},p)$. By Proposition 1, the IFS $G=\{g_{j}\}_{j}$ defined by $g_{j}:=\varPhi _{-}\circ f_{j}\circ {\varPhi _{-}^{-1}}$, where $\varPhi _{-}(x)=\mu _{-}([0,x])$, with probabilities p satisfies the hypotheses of Proposition 2. Note that F is forward minimal if, and only if, G is. Let $L(G,d)$ be the corresponding set of simultaneously preserved distances. By Lemma 1, we have $L(G,d)=\{0,1/k,\dots ,\lfloor k/2\rfloor /k\}$ for some $k\ge 1$.
Let us denote by $({\widehat{W}_{n}^{-}})_{n\ge 0}$ the sequence for the IFS G defined analogously to (3) for the IFS F. Since F and G are conjugate by means of $\varPhi _{-}$, it is easy to check that
\[{\widehat{W}_{n}^{-}}\big(\varPhi _{-}(x),\omega \big)=\varPhi _{-}\big({\widehat{Z}_{n}^{-}}(x,\omega )\big)\hspace{1em}\text{for every}\hspace{2.5pt}x\in {\mathbb{S}}^{1},\ \omega \in \varSigma \ \text{and}\ n\ge 0.\]
Note that if X is a $\mu _{-}$-distributed random variable, independent of $(I_{n})_{n\ge 0}$, then $Y:=\varPhi _{-}(X)$ is distributed according to the Lebesgue measure on ${\mathbb{S}}^{1}$. Hence, in particular, it follows that Y is a $1/k$-invariant, nonatomic, and fully supported random variable on ${\mathbb{S}}^{1}$.
By Proposition 2, it therefore follows that
\[{\widehat{W}_{n}^{-}}\big(\varPhi _{-}(X),\omega \big)=\varPhi _{-}\big({\widehat{Z}_{n}^{-}}(X,\omega )\big)\stackrel{\text{d}}{\to }{\widehat{W}}^{-}(\omega )\]
as $n\to \infty $ for P a.a. $\omega \in \varSigma $, where ${\widehat{W}}^{-}(\omega )$ is a random variable with distribution
\[{\nu _{\omega }^{-}}=\frac{1}{k}\sum \limits_{i=0}^{k-1}\delta _{\frac{1}{k}(i+{\widehat{W}}^{-}(\omega ))}\]
for some random variable ${\widehat{W}}^{-}:\varSigma \to {\mathbb{S}}^{1}$, that is, we have
\[{\widehat{Z}_{n}^{-}}(X,\omega )\stackrel{\text{d}}{\to }{\varPhi _{-}^{-1}}\big({\widehat{W}}^{-}(\omega )\big)\]
as $n\to \infty $ for P a.a. $\omega \in \varSigma $. Thus, if we define ${\widehat{Z}}^{-}(\omega ):={\varPhi _{-}^{-1}}({\widehat{W}}^{-}(\omega ))$, then this random variable has distribution ${\mu _{\omega }^{-}}(\cdot )={\nu _{\omega }^{-}}(\varPhi _{-}(\cdot ))$. Moreover, the measure $\mu _{-}$ given by
\[\mu _{-}=\int {\mu _{\omega }^{-}}\hspace{0.1667em}dP(\omega )\]
is the unique invariant probability measure for $({F}^{-1},p)$. □
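Finally, the fiberwise picture of Proposition 2 and Corollary 4 can be explored numerically: fixing one realization ω and pushing many independently drawn starting points through the reversed-order inverse iterates ${\widehat{Z}_{n}^{-}}(\cdot ,\omega )$ concentrates the empirical distribution on k points (k = 1 for the illustrative synchronizing IFS below, which is again our own choice). The inverse maps are computed by bisection, since no closed form is assumed.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative orientation-preserving circle homeomorphisms (our own toy example).
maps = [lambda x: (x + np.sqrt(2)) % 1.0,
        lambda x: (x + 0.15 * np.sin(2 * np.pi * x)) % 1.0]

def lift(f):
    """Increasing lift F on [0, 1) with F(x) congruent to f(x) mod 1 and F(0) = f(0)."""
    f0 = float(f(0.0))
    def F(x):
        y = float(f(x))
        return y if y >= f0 - 1e-12 else y + 1.0
    return F

def inverse(f, y, tol=1e-10):
    """Invert the circle homeomorphism f at y by bisection on its lift."""
    F = lift(f)
    target = F(0.0) + ((y - F(0.0)) % 1.0)
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if F(mid) < target:
            lo = mid
        else:
            hi = mid
    return lo % 1.0

n = 100
omega = rng.choice(2, size=n)          # one fixed realization omega_1, ..., omega_n
starts = rng.random(500)               # starting points (any nonatomic fully supported law works; here uniform)

# hatZ_n^-(x, omega) = f_{omega_1}^{-1} o ... o f_{omega_n}^{-1}(x): apply the inverses in reverse order.
pts = starts.copy()
for j in omega[::-1]:
    pts = np.array([inverse(maps[j], float(y)) for y in pts])

print(np.histogram(pts, bins=20, range=(0.0, 1.0))[0])   # mass concentrates in one (or k) bins
```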