## 1 Introduction and results

Independently introduced in [33] and [30], *Ξ*-coalescents are exchangeable Markovian processes $\varPi =(\varPi _{t})_{t\ge 0}$ on the set of partitions of $\mathbb{N}:=\{1,2,\dots \}$ whose transitions are due to mergers of partition blocks. The distribution of *Π* is characterised by a finite measure *Ξ* on the infinite simplex

\[ \varDelta :=\big\{x=(x_{1},x_{2},\dots ):x_{1}\ge x_{2}\ge \cdots \ge 0,\hspace{2.5pt}|x|\le 1\big\},\]

where $|x|:=\sum _{i\in \mathbb{N}}x_{i}$. We exclude $\varXi =0$, since it leads to a coalescent without coalescence events. *Ξ*-coalescents allow that disjoint subsets of blocks merge into distinct new blocks; hence they are also called coalescents with simultaneous multiple mergers. If *Ξ* is concentrated on $[0,1]\times \{0\}\times \{0\}\times \cdots $, only a single set of blocks is allowed to merge. Such a coalescent is a *Λ*-coalescent, see [32]. In this case, *Λ* is a finite measure on $[0,1]$, the restriction of *Ξ* to the first coordinate of *Δ*. The restriction ${\varPi }^{(n)}$ of *Π* to $[n]:=\{1,\dots ,n\}$ is called the *Ξ*-*n*-coalescent. Denote the blocks of $\varPi _{t}$ by $(B_{i}(t))_{i\in \mathbb{N}}$, where *i* is the least element of the block (we set $B_{i}(t)=\varnothing $ if *i* is not a least element of a block). Clearly, $1\in B_{1}(t)$. We call $B_{1}(t)$ the block of 1 at time *t*. Due to the exchangeability of the *Ξ*-coalescent, Kingman’s correspondence ensures that, for every $t\ge 0$, the asymptotic frequencies

##### (1)

\[ f_{i}(t):=\underset{n\to \infty }{\lim }\frac{|B_{i}(t)\cap [n]|}{n},\hspace{1em}i\in \mathbb{N},\]

exist almost surely, where $|A|$ denotes the cardinality of the set *A*.

The family of *Ξ*-coalescents is a diverse class of processes with very different properties, see e.g. the review [15] for *Λ*-coalescents. We will focus on *Ξ*-coalescents with dust, i.e. *Ξ* fulfils (see [33])

##### (2)

\[ \mu _{-1}:=\int _{\varDelta }|x|\nu _{0}(dx)<\infty ,\]

where $\nu _{0}(dx)=\varXi (dx)/(x,x)$ with $(x,x):=\sum _{i\in \mathbb{N}}{x_{i}^{2}}$ for $x=(x_{1},x_{2},\dots )\in \varDelta $. These coalescents are characterised by a non-zero probability that, at any time *t*, there is a positive fraction of $\mathbb{N}$, the dust, that has not yet merged. Note that $i\in \mathbb{N}$ is part of the dust at time *t* if and only if $\{i\}$ is a block at time *t*, which is called a singleton block. The asymptotic frequency of the dust component is $S_{t}:=1-\sum _{i\in \mathbb{N}}f_{i}(t)$. Having dust is equivalent to $P(S_{t}>0)>0$ for all $t>0$. We are interested in *Ξ*-coalescents which stay infinite, i.e. which almost surely have an infinite number of blocks for each $t>0$. We will put some further emphasis on simple *Λ*-coalescents satisfying

##### (3)

\[ \mu _{-2}:={\int _{0}^{1}}{x}^{-2}\varLambda (dx)<\infty .\]

This class includes Dirac coalescents with $\varLambda =\delta _{p}$, the Dirac measure in $p\in (0,1]$. Consider the frequency process $f_{1}:=(f_{1}(t))_{t\ge 0}$ of the block of 1. For *Λ*-coalescents, Pitman characterises $f_{1}$ as follows (reproduced from [32], adjusted to our notation).

##### Proposition 1.

*[32, Proposition 30] No matter what Λ, the process* $f_{1}$ *is an increasing pure jump process with càdlàg paths,* $f_{1}(0)=0$ *and* $\lim _{t\to \infty }f_{1}(t)=1$*. If* $\mu _{-1}=\infty $ *then almost surely* $f_{1}(t)>0$ *for all* $t>0$ *and* $\lim _{t\searrow 0}f_{1}(t)=0$*. If* $\mu _{-1}<\infty $ *then* $f_{1}$ *starts by holding at zero until an exponential time with rate* $\mu _{-1}$*, when it enters* $(0,1]$ *by a jump, and proceeds thereafter by a succession of holds and jumps, with holding rates bounded above by* $\mu _{-1}$*.*

Moreover, in [32, Section 3.9], a general formula for the moments of $f_{1}(t)$ for fixed $t>0$ is provided.

For two particular coalescents without dust, further properties of $f_{1}$ are known. For Kingman’s *n*-coalescent ($\varLambda =\delta _{0}$), the complete distribution of block sizes is explicitly known, see [24, Theorem 1], from which one can derive some properties of the block of 1 due to exchangeability. For the Bolthausen–Sznitman coalescent (*Λ* the uniform distribution on $[0,1]$), the block of 1 can be characterised as in [32, Corollary 16]. For instance, $f_{1}$ is Markovian for the Bolthausen–Sznitman coalescent.

Different specific aspects of the block of 1 have been analysed for different *Λ*/*Ξ*-*n*-coalescents, including their asymptotics for $n\to \infty $.

- Minimal clade size: the size $M_{n}$ of the block of 1 in the *n*-coalescent at its first jump. For Kingman’s *n*-coalescent and for *Λ* beta-distributed with parameters $(2-\alpha ,\alpha )$ with $\alpha \in (1,2)$, $M_{n}$ converges in distribution for $n\to \infty $, see [6] and [34]. For the Bolthausen–Sznitman *n*-coalescent, $\log (M_{n})/\log (n)$ converges in distribution [14]. These results do not cover *Λ*/*Ξ*-coalescents with dust.
- The number of blocks involved in the first merger of the block of 1, see [34]. The results cover *Λ*-coalescents with dust.

Due to the exchangeability of the *Ξ*-coalescent, any result for the distribution of the block of 1 holds true for the block containing any other $i\in \mathbb{N}$. We want to further describe $f_{1}$ for *Ξ*-coalescents with dust. For any finite measure *Ξ* on *Δ* which fulfils (2), we introduce

##### (4)

\[ \gamma :=\frac{\varXi (\varDelta )}{\mu _{-1}}=\frac{1}{\mu _{-1}}\int _{\varDelta }(x,x)\nu _{0}(dx).\]

We see that $\gamma \in (0,1]$, since

\[ 0<\varXi (\varDelta )=\int _{\varDelta }(x,x)\nu _{0}(dx)\le \int _{\varDelta }|x|\nu _{0}(dx)=\mu _{-1}<\infty .\]

Define $\varDelta _{f}:=\bigcup _{k\in \mathbb{N}}\{x\in \varDelta :x_{1}+\cdots +x_{k}=1\}$. We extend Proposition 1 to *Ξ*-coalescents with dust which stay infinite, i.e. have almost surely infinitely many blocks for each $t\ge 0$ (equivalent to $\varXi (\varDelta _{f})=0$, see Lemma 4). While the extension to *Ξ*-coalescents and the explicit waiting time distributions are a direct follow-up from Pitman’s proof, we provide a more detailed description of the jump heights of $f_{1}$. Since Proposition 1 ensures that the jumps of $f_{1}$ are separated by (almost surely) positive waiting times, we denote the value of $f_{1}$ at its *k*th jump by $f_{1}[k]$ for $k\in \mathbb{N}$.

##### Theorem 1.

*In any Ξ-coalescent Π with dust and* $\varXi (\varDelta _{f})=0$*, the asymptotic frequency process* $f_{1}:=(f_{1}(t))_{t\ge 0}$ *of the block of* 1*, defined by Eq.* (1)*, is an increasing pure jump process with càdlàg paths,* $f_{1}(0)=0$ *and* $\lim _{t\to \infty }f_{1}(t)=1$*, but* $f_{1}(t)<1$ *for* $t>0$ *almost surely. The waiting times between the almost surely infinitely many jumps are distributed as independent* $\mathrm{Exp}(\mu _{-1})$ *random variables. Its jump chain* $(f_{1}[k])_{k\in \mathbb{N}}$ *can be expressed via stick-breaking*

##### (5)

\[ f_{1}[k]={\sum \limits_{i=1}^{k}}X_{i}\prod \limits_{j=1}^{i-1}(1-X_{j}),\]

*where* $(X_{j})_{j\in \mathbb{N}}$ *are pairwise uncorrelated,* $X_{j}>0$ *almost surely and* $E(X_{j})=\gamma $ *for all* $j\in \mathbb{N}$*. In particular,* $E(f_{1}[k])=1-{(1-\gamma )}^{k}$*. In general,* $(X_{j})_{j\in \mathbb{N}}$ *are neither independent nor identically distributed.*
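As a purely numerical illustration (ours, not part of the proofs): if the stick-breaking factors $X_{j}$ are taken i.i.d. with mean $\gamma $, a special case of the hypotheses of Theorem 1, then $E(f_{1}[k])=1-{(1-\gamma )}^{k}$ holds exactly and can be checked by Monte Carlo. The choice of Beta$(3,7)$ factors (so $\gamma =0.3$) is hypothetical and only serves the illustration.

```python
import random

def jump_chain_value(k, sample_x):
    # Stick-breaking of Eq. (5): f_1[k] = sum_{i<=k} X_i * prod_{j<i} (1 - X_j)
    value, remaining = 0.0, 1.0
    for _ in range(k):
        x = sample_x()
        value += remaining * x
        remaining *= 1.0 - x
    return value

rng = random.Random(42)
k, gamma, trials = 5, 0.3, 20000
# Toy choice: X_j i.i.d. Beta(3, 7), so E(X_j) = 3/(3+7) = 0.3 = gamma.
mean = sum(jump_chain_value(k, lambda: rng.betavariate(3, 7))
           for _ in range(trials)) / trials
# mean should be close to 1 - (1 - gamma)^k
```

In an actual coalescent the $X_{j}$ are in general neither independent nor identically distributed, so this sketch only illustrates the expectation identity, not the joint law.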

##### Remark 1.

*From Theorem* 1*, the dependence between* $f_{1}$ *and its jump times is readily seen as follows. Recall [32, Eq. (51)] that* $E(f_{1}(t))=1-{e}^{-t}$ *for any Λ-coalescent with* $\varLambda ([0,1])=1$*. If* $f_{1}$ *were independent of its jump times, integrating* $E(f_{1}(t))$ *over the waiting time distribution* $\mathrm{Exp}(\mu _{-1})$ *for the first jump of* $f_{1}$ *would yield* $E(f_{1}[1])={(1+\mu _{-1})}^{-1}$*, in contradiction to* $E(f_{1}[1])=1/\mu _{-1}$ *by Theorem* 1*.*

Dirac coalescents ($\varLambda =\delta _{p}$ for some $p\in (0,1]$) are a family of *Λ*-coalescents with dust. They have been introduced as simplified models for populations in species with skewed offspring distributions (reproduction sweepstakes), see [9]. Their jump chains (discrete time Dirac coalescents) can also arise as large population size limits in conditional branching process models [21, Theorem 2.5]. We further characterise $f_{1}$ as follows, including an explicit formula for its distribution at its first jump.

##### Proposition 2.

*Let* $\varLambda =\delta _{p}$*,* $p\in [\frac{1}{2},1)$ *and* $q:=1-p$*. Then* $f_{1}$ *takes values in the set*

##### (6)

\[ \mathcal{M}_{p}:=\Bigg\{\sum \limits_{i\in \mathbb{N}}b_{i}p{q}^{i-1}:b_{i}\in \{0,1\},1\le \sum \limits_{i\in \mathbb{N}}b_{i}<\infty \Bigg\}.\]

*For* $x=\sum _{i\in \mathbb{N}}b_{i}p{q}^{i-1}\in \mathcal{M}_{p}$*, we have*

##### (7)

\[ P\big(f_{1}[1]=x\big)=p{q}^{j-1}\prod \limits_{i\in J\setminus \{j\}}P(Y+i\in J)\prod \limits_{i\in [j-1]\setminus J}P(Y+i\notin J)>0,\]

*where* $Y\stackrel{d}{=}Geo(p)$*,* $J:=\{i\in \mathbb{N}|b_{i}=1\}$ *and* $j:=\max J$*. The process* $f_{1}$ *is not Markovian whereas its jump chain* $(f_{1}[k])_{k\in \mathbb{N}}$ *is Markovian.*

##### Remarks 2.

- *The law of* $f_{1}[1]$ *is a discrete measure on* $[0,1]$ *for Dirac coalescents. Surprisingly different properties arise for different values of p. For instance,* $\mathcal{M}_{2/3}=\{\sum _{i\in \mathbb{N}}b_{i}{3}^{-i}:b_{i}\in \{0,2\},1\le \sum _{i\in \mathbb{N}}b_{i}<\infty \}$ *is a subset of the ternary Cantor set which is nowhere dense in* $[0,1]$*, whereas* $\mathcal{M}_{1/2}$*, the set of all* $x\in [0,1]$ *with finite 2-adic expansion, is dense in* $[0,1]$*.*
- *We omitted* $f_{1}[1]$ *for the star-shaped coalescent (*$\varLambda =\delta _{1}$*), since it just jumps from* 0 *to* 1 *at time* $T\stackrel{d}{=}\mathrm{Exp}(1)$*.*
- *Recall that* $f_{1}$ *is Markovian for the Bolthausen–Sznitman coalescent, in contrast to* $f_{1}$ *for the Dirac coalescents specified above.*
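Formula (7) admits a small numerical sanity check (ours, not from the proofs), reading $Geo(p)$ as supported on $\{1,2,\dots \}$ with $P(Y=k)=p{q}^{k-1}$: summing the right-hand side of (7) over all representations $J\subseteq [N]$ gives the total mass of values whose representation uses only the first *N* indices, and empirically this total equals $1-{q}^{N}$, consistent with (7) defining a probability distribution on $\mathcal{M}_{p}$.

```python
from itertools import combinations

def geo_pmf(p, k):
    # P(Y = k) for Y ~ Geo(p) on {1, 2, ...}
    return p * (1.0 - p) ** (k - 1) if k >= 1 else 0.0

def prob_first_jump(J, p):
    # Right-hand side of Eq. (7) for the representation J = {i : b_i = 1}
    j = max(J)
    prob = p * (1.0 - p) ** (j - 1)
    for i in J - {j}:
        prob *= sum(geo_pmf(p, l - i) for l in J if l > i)   # P(Y + i in J)
    for i in set(range(1, j)) - J:
        prob *= 1.0 - sum(geo_pmf(p, l - i) for l in J if l > i)
    return prob

p, N = 0.5, 12
total = sum(prob_first_jump(set(J), p)
            for r in range(1, N + 1)
            for J in combinations(range(1, N + 1), r))
# total mass of all representations with max(J) <= N; observed to be 1 - q^N
```

The deficit ${q}^{N}$ matches the chance that the block of 1 first merges only after the *N*th merger event of the construction.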

Our key motivation was to provide a more detailed description of the jump chain of $f_{1}$, especially properties of the value $f_{1}[1]$ at the first jump, which is the asymptotic frequency of the minimal clade. Theorem 1 provides a first-order limit result for all *Ξ*-coalescents with dust.

##### Corollary 1.

*Let Π be a Ξ-coalescent with dust and* ${\varPi }^{(n)}$ *its restriction on* $[n]$*. Let* $M_{n}$ *be the minimal clade size, i.e. the size of the block of 1 at its first merger in* ${\varPi }^{(n)}$*. Then,* $M_{n}/n\to f_{1}[1]$ *almost surely,* $f_{1}[1]>0$ *almost surely and* $E(f_{1}[1])=\gamma $*.*

Compared to the results listed above for the minimal clade size in dust-free coalescents, the minimal clade size is asymptotically much larger for $n\to \infty $: of order $O(n)$ instead of $o(n)$.
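Corollary 1 can be illustrated by simulation. The sketch below (ours, not part of the paper) uses the Dirac choice $\varLambda =\delta _{p}$, for which every merger event lets each block participate independently with probability *p* and all participating blocks merge into one; for $\delta _{p}$ we have $\mu _{-1}=1/p$ and hence $\gamma =p$, so the empirical mean of $M_{n}/n$ should be close to *p*.

```python
import random

def minimal_clade_fraction(n, p, rng):
    # Sizes of the blocks other than {1}; the block of 1 stays the singleton
    # {1} until the first merger event in which its own coin comes up heads.
    others = [1] * (n - 1)
    while True:
        one_in = rng.random() < p                  # coin of the block of 1
        coins = [rng.random() < p for _ in others]
        heads = [s for s, c in zip(others, coins) if c]
        if one_in and heads:                       # first merger of the block of 1
            return (1 + sum(heads)) / n            # M_n / n
        if not one_in and len(heads) >= 2:         # merger not involving 1
            others = [s for s, c in zip(others, coins) if not c] + [sum(heads)]

rng = random.Random(7)
n, p, trials = 1000, 0.75, 2000
est = sum(minimal_clade_fraction(n, p, rng) for _ in range(trials)) / trials
# est should be close to gamma = p = 0.75
```

Events in which fewer than two blocks participate are no-ops, matching the dynamics of a *Λ*-*n*-coalescent.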

The law of $f_{1}[1]$ in (7) follows from the following more general description of $f_{1}[1]$ for simple *Λ*-coalescents. We introduce, for a finite measure *Λ* on $[0,1]$ with $\mu _{-1}={\int _{0}^{1}}{x}^{-1}\varLambda (dx)<\infty $,

##### (8)

\[ \alpha :=\frac{\mu _{-1}}{\mu _{-2}}=\frac{{\textstyle\int _{0}^{1}}{x}^{-1}\varLambda (dx)}{{\textstyle\int _{0}^{1}}{x}^{-2}\varLambda (dx)}.\]

We have $\alpha \in [0,1]$ since ${x}^{-1}\le {x}^{-2}$ on $(0,1]$ (if $\mu _{-1}<\infty $, we have $\varLambda (\{0\})=0$). Additionally, $\alpha >0$ if and only if $\mu _{-2}<\infty $, so if and only if *Λ* characterises a simple coalescent (recall that $\mu _{-2}\ge \mu _{-1}>0$ since we exclude $\varLambda =0$).

##### Proposition 3.

*Let Λ fulfil* (3)*. Then,*

##### (9)

\[ f_{1}[1]={\sum \limits_{i=1}^{C}}{B_{i}^{(C)}}P_{i}\prod \limits_{j\in [i-1]}(1-P_{j})=\sum \limits_{i\in \mathbb{N}}P_{i}\prod \limits_{j\in [i-1]}(1-P_{j})\sum \limits_{k\ge i}{B_{i}^{(k)}}1_{\{C=k\}},\]

*where* $(P_{i})_{i\in \mathbb{N}}$ *are i.i.d. with* $P_{i}\stackrel{d}{=}{\mu _{-2}^{-1}}{x}^{-2}\varLambda (dx)$*. We have*

\[ P\big(C=k|(P_{i})_{i\in \mathbb{N}}\big)=P_{k}\prod \limits_{j\in [k-1]}(1-P_{j}),\hspace{1em}C\hspace{2.5pt}\textit{is}\hspace{2.5pt}Geo(\alpha )\textit{-distributed.}\]

*Given* $(P_{i})_{i\in \mathbb{N}}$*, C and* $({B_{i}^{(k)}})_{k\in \mathbb{N},i\in [k]}$ *are independent and* $({B_{i}^{(k)}})_{k\in \mathbb{N},i\in [k]}$ *is defined as*

##### (10)

\[\begin{array}{r@{\hskip0pt}l}& \displaystyle P\big(\big({B_{1}^{(j)}},\dots ,{B_{j}^{(j)}}\big)=b|(P_{i})_{i\in \mathbb{N}}\big)\\{} & \displaystyle \hspace{1em}=\prod \limits_{i\in J\setminus \{j\}}P\big(I(i)\in J|(P_{i})_{i\in \mathbb{N}}\big)\prod \limits_{i\in [j-1]\setminus J}P\big(I(i)\notin J|(P_{i})_{i\in \mathbb{N}}\big)\hspace{2.5pt}\hspace{2.5pt}\textit{almost surely},\end{array}\]

*where* $b=(b_{1},\dots ,b_{j})\in {\{0,1\}}^{j-1}\times \{1\}$*,* $J:=\{i\in [j]|b_{i}=1\}$ *and, for each* $i\in \mathbb{N}$*,* $I(i):=\min \{j\ge i+1:{B_{i}^{(j)}}=1\}$*. We have, for* $l\ge i+1$*,*

\[ P\big(I(i)=l|(P_{i})_{i\in \mathbb{N}}\big)=P_{l}\prod \limits_{m=i+1}^{l-1}(1-P_{m})\hspace{1em}\textit{almost surely.}\]

##### Remarks 3.

- *The distribution of* $f_{1}[1]$ *for Dirac coalescents with* $p>\frac{1}{2}$ *has a structure somewhat similar to the Cantor distribution, see e.g. [26] and [18]. The Cantor distribution is the law of* $\sum _{i\in \mathbb{N}}B_{i}p{q}^{i-1}$ *for* $p\in (0,1)$*, where* $(B_{i})_{i\in \mathbb{N}}$ *are i.i.d. Bernoulli variables with success probability* $\frac{1}{2}$*, whereas in our case* $(B_{i})_{i\in \mathbb{N}}$ *are dependent Bernoulli variables with success probabilities* $P(B_{i}=1)=P(\sum _{k\ge i}{B_{i}^{(k)}}1_{\{C=k\}}=1)=p{q}^{i-1}+\sum _{k>i}{p}^{2}{q}^{k-1}=p{q}^{i-1}(1+q)$*, see Eq.* (9)*. The Cantor distribution is a shifted infinite Bernoulli convolution. Infinite Bernoulli convolutions are the set of distributions of* $\sum _{i\in \mathbb{N}}\omega _{i}{(-1)}^{B_{i}}$ *with* $\omega _{i}\in \mathbb{R}$ *for* $i\in \mathbb{N}$ *satisfying* $\sum _{i\in \mathbb{N}}{\omega _{i}^{2}}<\infty $*, see [31, Section 2]. They have been an active field of research since the 1930s, see e.g. [10, 35] and the survey [31].*
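The mixed geometric structure of *C* in Proposition 3 can be simulated directly (our illustration, using a hypothetical concrete measure): for $\varLambda (dx)=3{x}^{2}dx$ on $[0,1]$, the size-biased law ${\mu _{-2}^{-1}}{x}^{-2}\varLambda (dx)$ of the $P_{i}$ is uniform on $[0,1]$, $\mu _{-1}=3/2$, $\mu _{-2}=3$ and hence $\alpha =1/2$, so *C* should be $Geo(1/2)$-distributed.

```python
import random

def sample_C(rng):
    # P(C = k | (P_i)) = P_k * prod_{j<k} (1 - P_j); here P_k ~ Uniform(0, 1),
    # which is mu_{-2}^{-1} x^{-2} Λ(dx) for the toy choice Λ(dx) = 3x² dx.
    k = 1
    while True:
        p_k = rng.random()
        if rng.random() < p_k:   # success at step k with probability P_k
            return k
        k += 1

rng = random.Random(123)
trials = 20000
samples = [sample_C(rng) for _ in range(trials)]
mean_C = sum(samples) / trials          # Geo(1/2) has mean 2
frac_one = samples.count(1) / trials    # P(C = 1) = alpha = 1/2
```

The empirical mean and the empirical mass at 1 match the $Geo(\alpha )$ marginal claimed in Proposition 3.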

Our main tool for the proofs is Schweinsberg’s Poisson construction of the *Ξ*-coalescent. The article is organised as follows. We recall (properties of) the Poisson construction in Section 2. Section 3 characterises staying infinite for *Ξ*-coalescents with dust. These prerequisites are then used to prove the results for *Ξ*-coalescents with dust in Section 4 and for simple *Λ*-coalescents in Section 5.

## 2 Poisson construction of a *Ξ*-coalescent and the block of 1

We recall the construction of a *Ξ*-coalescent *Π* from [33]. We are only interested in constructing a *Ξ*-coalescent with dust, which implies $\varXi (\{0\})=0$, see Eq. (2). Let $\mathcal{P}$ be a Poisson point process on $A=[0,\infty )\times {\mathbb{N}_{0}^{\infty }}$ with intensity measure

##### (11)

\[ \nu :=dt\otimes \int _{\varDelta }\bigotimes \limits_{i\in \mathbb{N}}{P}^{(x)}\nu _{0}(dx),\]

where, for $x\in \varDelta $, ${P}^{(x)}$ is a probability measure on $\mathbb{N}_{0}$ with ${P}^{(x)}(\{k\})=x_{k}$ and ${P}^{(x)}(\{0\})=1-|x|$ (Kingman’s paintbox) and $\nu _{0}$ is defined as in Eq. (2). For $n\in \mathbb{N}$, the restriction ${\varPi }^{(n)}$ of *Π* to $[n]$ can be constructed by starting at $t=0$ with each $i\in [n]$ in its own block. Then, for each subsequent time *t* with a Poisson point $(t,(K_{i})_{i\in \mathbb{N}})$, merge all present blocks *i* (at most *n*) with identical $K_{i}>0$, where *i* is the least element of the block (there are only finitely many points of $\mathcal{P}$ that lead to a merger of blocks in $[n]$). *Π* is then pathwise defined by its restrictions $({\varPi }^{(n)})_{n\in \mathbb{N}}$. From now on we will assume without loss of generality that the *Ξ*-coalescent with dust is constructed via the Poisson process $\mathcal{P}$.

The block of 1 can only merge at Poisson points $P=(T,(K_{i})_{i\in \mathbb{N}})$ with $K_{1}>0$. We take a closer look at these Poisson points. We introduce $\text{exchangeable}(Q)$ indicators following [32, p. 1884]: these are exchangeable Bernoulli variables which are conditionally i.i.d. given a random variable *X* with distribution *Q* on $[0,1]$ which gives their success probability. Alternatively, we denote these as $\text{exchangeable}(X)$ indicators if we can specify *X*.

##### Lemma 1.

*For any finite measure Ξ on Δ fulfilling* (2)*,* $\mathcal{P}$ *splits into two independent Poisson processes*

\[ \mathcal{P}_{1}:=\big\{\big(T,(K_{i})_{i\in \mathbb{N}}\big)\in \mathcal{P}:K_{1}>0\big\}\hspace{1em}\textit{and}\hspace{1em}\mathcal{P}_{2}:=\big\{\big(T,(K_{i})_{i\in \mathbb{N}}\big)\in \mathcal{P}:K_{1}=0\big\}.\]

$\mathcal{P}_{1}$ *has almost surely finitely many points on any set* $[0,t]\times {\mathbb{N}_{0}^{\infty }}$*, thus we can order*

\[ \mathcal{P}_{1}=\big(\big(T_{j},\big({K_{i}^{(j)}}\big)_{i\in \mathbb{N}}\big)\big)_{j\in \mathbb{N}},\]

*where* $T_{j}<T_{j+1}$ *almost surely for* $j\in \mathbb{N}$*. The sequence* $(T_{j})_{j\in \mathbb{N}}$ *is a homogeneous Poisson process on* $[0,\infty )$ *with intensity* $\mu _{-1}$*. Moreover,* $(({K_{i}^{(j)}})_{i\in \mathbb{N}})_{j\in \mathbb{N}}$ *is an i.i.d. sequence in j and* $(1_{\{{K_{1}^{(1)}}={K_{i}^{(1)}}\}})_{i\ge 2}$ *are exchangeable(Q) indicators with*

\[ Q:=\frac{1}{\mu _{-1}}\int _{\varDelta }\sum \limits_{i\in \mathbb{N}}x_{i}\delta _{x_{i}}\nu _{0}(dx),\]

*which is a probability measure on* $[0,1]$*. For* $X\stackrel{d}{=}Q$*, we have* $X>0$ *almost surely and* $E(X)=\gamma $*.*

##### Proof.

$\mathcal{P}_{1}$ and $\mathcal{P}_{2}$ are obtained by restricting $\mathcal{P}$ on the disjoint subsets $A_{1}:=[0,\infty )\times \mathbb{N}\times {\mathbb{N}_{0}^{\infty }}$ and $A_{2}:=[0,\infty )\times \{0\}\times {\mathbb{N}_{0}^{\infty }}$ of *A*. Thus, $\mathcal{P}_{1}$ and $\mathcal{P}_{2}$ are independent Poisson processes (restriction theorem [25, p. 17]) with intensity measures $\nu _{1}=\nu (\cdot \cap A_{1})$ and $\nu _{2}=\nu (\cdot \cap A_{2})$. For any Borel set $B\subseteq [0,\infty )$ and *λ* being the Lebesgue measure,

##### (12)

\[ \nu _{1}\big(B\times {\mathbb{N}_{0}^{\infty }}\big)=\lambda (B)\int _{\varDelta }\underset{=|x|}{\underbrace{{P}^{(x)}(\mathbb{N})}}\prod \limits_{i\ge 2}\underset{=1}{\underbrace{{P}^{(x)}(\mathbb{N}_{0})}}\nu _{0}(dx)=\lambda (B)\mu _{-1}.\]

Thus, on any bounded set *B*, $\mathcal{P}_{1}$ has almost surely finitely many points, which can be ordered as described. Projecting $\mathcal{P}_{1}$ on the first coordinate *t* of *A* yields a Poisson process with intensity measure $\mu _{-1}dt$ (mapping theorem [25, p. 18]).

Now, we project the points of $\mathcal{P}_{1}$ on the coordinate of $({K_{i}^{(j)}})_{i\in \mathbb{N}}$. Recall the construction of a Poisson process as a collection of i.i.d. variables with distribution ${(\mu (C))}^{-1}\mu $ on sets *C* of finite mass of the intensity measure *μ*, e.g. [25, p. 23]. It shows that we can treat the collection of $(T_{j},({K_{i}^{(j)}})_{i\in \mathbb{N}})$ with, for instance, $T_{j}\in [k,k+1)$ for $k\in \mathbb{N}$ as a random number of i.i.d. random variables with distribution ${(1\cdot \mu _{-1})}^{-1}\nu _{1}$. Since $\nu _{1}$ has a product structure on $A_{1}$, we have that $(({K_{i}^{(j)}})_{i\in \mathbb{N}})_{j\in \mathbb{N}}$ are i.i.d. in *j* and have distribution, for $m\in \mathbb{N}$,

##### (13)

\[ P\big(\big({K_{i}^{(1)}}=l_{i}\big)_{i\in [m]}\big)=\frac{1}{\mu _{-1}}\int _{\varDelta }\prod \limits_{i\in [m]}{P}^{(x)}(l_{i})\nu _{0}(dx)=\frac{1}{\mu _{-1}}\int _{\varDelta }\prod \limits_{i\in [m]}x_{l_{i}}\nu _{0}(dx)\]

for $l_{1}\in \mathbb{N}$ and $l_{2},\dots ,l_{m}\in \mathbb{N}_{0}$, where we set $x_{0}:=1-|x|$. Consider $(1_{\{{K_{1}^{(1)}}={K_{i}^{(1)}}\}})_{i\ge 2}$. To show that they are $\text{exchangeable}(Q)$ indicators, [32, Eq. (27)] has to be fulfilled, i.e. we need to show $P(\{i\in [m]:{K_{i}^{(1)}}={K_{1}^{(1)}}\}=M)=E({X}^{|M|-1}{(1-X)}^{m-|M|})$ for $X\stackrel{d}{=}Q$ and any $M\subseteq [m]$ with $1\in M$. Using Eq. (13) we compute

\[\begin{array}{r@{\hskip0pt}l}\displaystyle P\big(\big\{i\in [m]:{K_{i}^{(1)}}={K_{1}^{(1)}}\big\}=M\big)& \displaystyle =\sum \limits_{j\in \mathbb{N}}P\big(\big\{i\in [m]:{K_{i}^{(1)}}=j\big\}=M,{K_{1}^{(1)}}=j\big)\\{} & \displaystyle =\frac{1}{\mu _{-1}}\int _{\varDelta }\sum \limits_{j\in \mathbb{N}}{x_{j}^{|M|}}{(1-x_{j})}^{m-|M|}\nu _{0}(dx)\\{} & \displaystyle =E\big({X}^{|M|-1}{(1-X)}^{m-|M|}\big).\end{array}\]

Clearly, $P(X>0)=1$ since $\varXi (\{0\})=0$ and $E(X)={\mu _{-1}^{-1}}\int _{\varDelta }(x,x)\nu _{0}(dx)=\gamma $. □

##### Remarks 4.

- *Q can be seen as the expected value of the random probability measure* $Q_{x}:=|x{|}^{-1}\sum _{i\in \mathbb{N}}x_{i}\delta _{x_{i}}$ *for* $x\in \varDelta $ *with x drawn from* ${\mu _{-1}^{-1}}|x|\nu _{0}(dx)$*. In the Poisson construction, this means we draw a “paintbox”* $x\in \varDelta $ *and then record in which box the ball of* 1 *falls, if we only allow it to fall in boxes* $1,2,\dots $
- *Consider a simple Λ-coalescent. Projecting* $\mathcal{P}_{2}$ *on its first component, so* $(T,(K_{i})_{i\in \mathbb{N}})\mapsto T$*, yields a homogeneous Poisson process with intensity* $\mu _{-2}-\mu _{-1}<\infty $*. To see this, proceed analogously as for* $\mathcal{P}_{1}$*. Then, Eq.* (12) *for* $\nu _{2}$ *reads the same except for replacing* ${P}^{(x)}(\mathbb{N})$ *by* ${P}^{(x)}(\{0\})=1-|x|$*.*
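The splitting of rates, $\mu _{-2}=\mu _{-1}+(\mu _{-2}-\mu _{-1})$, can be illustrated by thinning (our sketch, again with the hypothetical choice $\varLambda (dx)=3{x}^{2}dx$, so $\mu _{-2}=3$, $\mu _{-1}=3/2$ and $P_{j}$ uniform on $[0,1]$): marking each rate-$\mu _{-2}$ event independently with probability $P_{j}$ (the event $K_{1}=1$) should leave a process of rate $\mu _{-2}\cdot E(P_{j})=\mu _{-1}$.

```python
import random

rng = random.Random(5)
mu2 = 3.0                        # mu_{-2} for the toy choice Λ(dx) = 3x² dx
t, n_events, n_marked = 0.0, 50000, 0
for _ in range(n_events):
    t += rng.expovariate(mu2)    # inter-event times of 𝒫, rate mu_{-2}
    p_j = rng.random()           # P_j ~ mu_{-2}^{-1} x^{-2} Λ(dx) = Uniform(0, 1)
    if rng.random() < p_j:       # K_1 = 1 with probability P_j
        n_marked += 1
rate_P1 = n_marked / t           # empirical rate of 𝒫_1, should be near mu_{-1} = 1.5
```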

For a *Λ*-coalescent (with $\varLambda (\{0\})=0$) the Poisson construction simplifies, since *Ξ* only has mass on $\{x\in \varDelta :x_{2}=x_{3}=\cdots =0\}$ and thus $\mathcal{P}$ can be seen as a Poisson process on $[0,\infty )\times {\{0,1\}}^{\infty }$ with intensity measure $dt\otimes \int _{[0,1]}\otimes _{n\in \mathbb{N}}{P}^{(x)}{x}^{-2}\varLambda (dx)$, where ${P}^{(x)}$ is the Bernoulli distribution with success probability $x\in (0,1]$.

When constructing simple *Λ*-coalescents, even the process $\mathcal{P}$ itself has almost surely finitely many points $(T_{j},({K_{i}^{(j)}})_{i\in \mathbb{N}})$ on any set $[0,t]\times {\{0,1\}}^{\infty }$ (which we can again order in the first coordinate). As described in [32, Example 19] and analogously to Lemma 1, we can construct each (potential) merger at point $(T_{j},({K_{i}^{(j)}})_{i\in \mathbb{N}})$ of a simple *Λ*-coalescent as follows (while between jumps, we wait independent $\mathrm{Exp}(\mu _{-2})$ times). First choose $P_{j}\in (0,1]$ from ${\mu _{-2}^{-1}}{x}^{-2}\varLambda (dx)$; we have $E(P_{j})={\mu _{-2}^{-1}}\int _{[0,1]}{x}^{-1}\varLambda (dx)=\alpha $. Then, throw independent coins $({K_{i}^{(j)}})_{i\in \mathbb{N}}$ with probability $P_{j}$ for ‘heads’ (=1) for each block present and merge all blocks whose coins came up ‘heads’. Again, $(P_{j})_{j\in \mathbb{N}}$ are i.i.d. and the ‘coins’ ${K_{i}^{(j)}}$ are $\text{exchangeable}(P_{j})$ indicators. Analogously to above, we thus have

##### Lemma 2.

*Let Λ be a finite measure on* $[0,1]$ *fulfilling* (3)*. For the Poisson process* $\mathcal{P}=(T_{j},({K_{i}^{(j)}})_{i\in \mathbb{N}})_{j\in \mathbb{N}}$*,* $(({K_{i}^{(j)}})_{i\in \mathbb{N}})_{j\in \mathbb{N}}$ *is an i.i.d. sequence (in j) of sequences of* $\textit{exchangeable}(P_{j})$ *indicators, where* $(P_{j})_{j\in \mathbb{N}}$ *are i.i.d. with* $P_{1}\stackrel{d}{=}{\mu _{-2}^{-1}}{x}^{-2}\varLambda (dx)$*. In particular,* $E(P_{j})=\alpha $*.*

Since many proofs build on the properties of different sets of exchangeable indicators, we collect some well-known properties in the following lemma.

##### Lemma 3.

*Let* $(K_{i})_{i\in \mathbb{N}}$ *be* $\textit{exchangeable}(X)$ *indicators.*

- a) *We have* $\lim _{n\to \infty }\frac{1}{n}{\sum _{i=1}^{n}}K_{i}=X$ *almost surely. X is almost surely unique.*
- b) *If* $(K_{i})_{i\in \mathbb{N}}$ *is independent of a σ-field* $\mathcal{F}$*, X is, too.*
- c) *Let* $(L_{i})_{i\in \mathbb{N}}$ *be* $\textit{exchangeable}(Y)$ *indicators, independent of* $(K_{i})_{i\in \mathbb{N}}$*. Then,* $(K_{i}L_{i})_{i\in \mathbb{N}}$ *are* $\textit{exchangeable}(XY)$ *indicators and X, Y are independent.*

##### Proof.

These properties essentially follow from the de Finetti representation of an infinite sequence of exchangeable variables as conditionally i.i.d. variables. The lemma is a collection of well-known properties as described e.g. in [3, Sections 2 and 3], arguments of which we use in the following.

An infinite exchangeable sequence is conditionally i.i.d. given an almost surely unique random measure *α*. This measure is the weak limit of the empirical measures, in our case ${n}^{-1}{\sum _{i=1}^{n}}\delta _{K_{i}}$, which has limit ${X^{\prime }}\delta _{1}+(1-{X^{\prime }})\delta _{0}$ for some random variable ${X^{\prime }}$ with values in $[0,1]$. Given *α*, the indicators are i.i.d. *α*-distributed. However, since *X* gives the success probability of each Bernoulli coin, we have $X={X^{\prime }}$ almost surely, so *X* is almost surely unique. The rest of a) is just the strong law of large numbers, e.g. from [3, 2.24] ($E(K_{1})\le 1$); the limit is ${X^{\prime }}$. Part b) follows from measure theory since the limit is measurable in the *σ*-field generated by the summed variables. For c), we again check Pitman’s condition [32, Eq. 27]. Let $M\subseteq [m]$. We have that *X*, *Y* are independent from b). With $P(K_{i}=L_{i}=1|X,Y)=XY$ almost surely,

\[\begin{array}{r@{\hskip0pt}l}\displaystyle P\big(\big\{i\in [m]:K_{i}L_{i}=1\big\}=M\big)& \displaystyle =E\big(P\big(\big\{i\in [m]:K_{i}L_{i}=1\big\}=M|X,Y\big)\big)\\{} & \displaystyle =E\big({(XY)}^{|M|}{(1-XY)}^{m-|M|}\big),\end{array}\]

since given *X*, *Y*, both $(K_{i})_{i\in \mathbb{N}}$ and $(L_{i})_{i\in \mathbb{N}}$ are independent. This shows c). □

## 3 When does a *Ξ*-coalescent with dust stay infinite?

A crucial assumption for our results is that the *Ξ*-coalescent *Π* has almost surely infinitely many blocks that may merge in the mergers in which 1 participates. The property

\[ P(\varPi _{t}\hspace{2.5pt}\text{has infinitely many blocks}\hspace{2.5pt}\forall \hspace{2.5pt}t>0)=1\]

is called staying infinite, while $P(\varPi _{t}\hspace{2.5pt}\text{has finitely many blocks}\hspace{2.5pt}\forall \hspace{2.5pt}t>0)=1$ is the property of coming down from infinity. These properties have been thoroughly discussed for *Ξ*-coalescents, see e.g. [33, 27] and [20]. We recall the condition for *Ξ*-coalescents with dust to stay infinite.

##### Lemma 4.

*Let* $\varDelta _{f}:=\{x\in \varDelta :x_{1}+\cdots +x_{k}=1\hspace{2.5pt}\textit{for some}\hspace{2.5pt}k\in \mathbb{N}\}$ *and Ξ be a finite measure on Δ fulfilling Eq.* (2)*. The Ξ-coalescent stays infinite if and only if* $\varXi (\varDelta _{f})=0$*. If* $\varXi (\varDelta _{f})>0$*, then the Ξ-coalescent has infinitely many blocks until the first jump of* $f_{1}$ *almost surely and the Ξ-coalescent neither comes down from infinity nor stays infinite.*

##### Proof.

Let ${\varDelta }^{\ast }:=\{x\in \varDelta :|x|=1\}$. All *Ξ*-coalescents considered are constructed via the Poisson construction with Poisson point process $\mathcal{P}$.

First, assume $\varXi ({\varDelta }^{\ast })=0$. We recall the (well-known) property that for a *Ξ*-coalescent with dust, $\varXi ({\varDelta }^{\ast })=0$ is equivalent to $P(S_{t}>0\hspace{2.5pt}\forall t)=1$, where $S_{t}$ is the asymptotic frequency of the dust component. We use the remark on [12, p. 1091]: for *Ξ*-coalescents with dust, $(-\log S_{t})_{t\ge 0}$ is a subordinator. The subordinator jumps to *∞* (corresponds to $S_{t}=0$) if and only if for its Laplace exponent *Φ*, we have $\lim _{\eta \searrow 0}\varPhi (\eta )>0$. For a *Ξ*-coalescent with dust we have $\lim _{\eta \searrow 0}\varPhi (\eta )=\int _{{\varDelta }^{\ast }}\nu _{0}(dx)$. Hence, if $\varXi ({\varDelta }^{\ast })=0$, there are almost surely infinitely many singleton blocks for all $t\ge 0$, so the corresponding *Ξ*-coalescent stays infinite.

Now assume $\varXi ({\varDelta }^{\ast })>0$. The subordinator $(-\log S_{t})_{t\ge 0}$ jumps from finite values ($S_{t}>0$) to *∞* ($S_{t}=0$) after an exponential time with rate $\nu _{0}({\varDelta }^{\ast })$. This shows that the *Ξ*-coalescent does not come down from infinity. Assume further that $\varXi (\varDelta _{f})=0$. Then, [33, Lemma 31] shows that the *Ξ*-coalescent either comes down from infinity or stays infinite, so it stays infinite.

Finally, assume $\varXi (\varDelta _{f})>0$. Split $\mathcal{P}$ into independent Poisson processes ${\mathcal{P}^{\prime }_{1}}:=\{(T,(K_{i})_{i\in \mathbb{N}})\in \mathcal{P}:\kappa \in \varDelta _{f}\}$ and ${\mathcal{P}^{\prime }_{2}}:=\{(T,(K_{i})_{i\in \mathbb{N}})\in \mathcal{P}:\kappa \notin \varDelta _{f}\}$, where $\kappa :=(\lim _{n\to \infty }{n}^{-1}\sum _{i\in [n]}1_{\{K_{i}=j\}})_{j\in \mathbb{N}}$ (again the restriction theorem [25, p. 17]; Lemma 3 shows *κ* exists almost surely). Their intensity measures are defined as in Eq. (11), but using ${\nu ^{\prime }_{1}}(\cdot ):=\nu _{0}(\cdot \cap \varDelta _{f})$ and ${\nu ^{\prime }_{2}}:=\nu _{0}-{\nu ^{\prime }_{1}}$ instead of $\nu _{0}$. Since ${\nu ^{\prime }_{1}}(\varDelta _{f})\le \mu _{-1}<\infty $, for any $t>0$ there are almost surely finitely many $P\in {\mathcal{P}^{\prime }_{1}}$ with $T<t$. Consider such $P=(T,(K_{i})_{i\in \mathbb{N}})$ with *T* smallest. Observe that until *T*, we can construct the *Ξ*-coalescent using only the points of ${\mathcal{P}^{\prime }_{2}}$, which is the construction of a ${\varXi ^{\prime }}$-coalescent with ${\varXi ^{\prime }}(dx):=(x,x){\nu ^{\prime }_{2}}(dx)$. Since $\int _{\varDelta }|x|{\nu ^{\prime }_{2}}(dx)<\mu _{-1}<\infty $ and ${\varXi ^{\prime }}(\varDelta _{f})=0$, the proof steps above show that the *Ξ*-coalescent has infinitely many blocks until *T*. Now consider the merger at time *T*. The form of ${\nu ^{\prime }_{1}}$ ensures that $(K_{i})_{i\in \mathbb{N}}$ can only take finitely many values, and Lemma 3 a) ensures that infinitely many $K_{i}$’s show each value. Thus, all blocks present before time *T* are merged at *T* into a finite number of blocks (given by which $K_{i}$’s show the same number). This shows that if $\varXi (\varDelta _{f})>0$, the *Ξ*-coalescent neither stays infinite nor comes down from infinity. Additionally, this shows that either the block of 1 already merged at least once before *T* or it merges at *T*; thus there are infinitely many blocks before the first merger of 1. □

## 4 The block of 1 in *Ξ*-coalescents with dust – proofs and remarks

##### Proof of Theorem 1.

As in Lemma 1, split the Poisson point process $\mathcal{P}$ used to construct the

*Ξ*-coalescent in $\mathcal{P}_{1}$ and $\mathcal{P}_{2}$. We also use the notation from Lemma 1 and its proof. The block of 1 in the*Ξ*-*n*-coalescent for any $n\in \mathbb{N}$ can only merge at times*t*for which there exists a Poisson point $(T,(K_{i})_{i\in \mathbb{N}})\in \mathcal{P}_{1}$. Lemma 1 states that the set of times*T*forms a homogeneous Poisson process with rate $\mu _{-1}$. This shows that potential jump times are separated by countably many independent $\mathrm{Exp}(\mu _{-1})$ random variables. Kingman’s correspondence yields that $f_{1}$ exists almost surely at each potential jump time. To see this, observe that even though the partition of $\mathbb{N}$ induced by the Poisson construction is not exchangeable, the partition on $\mathbb{N}\setminus \{1\}$ is, and the asymptotic frequencies of the former and the latter coincide. Since $f_{1}$ is by definition constant between these jump points, $f_{1}$ has càdlàg paths almost surely. Since any blocks change by mergers, $f_{1}$ is increasing.The value of $f_{1}$ at 0 follows by definition. Since

*Π* stays infinite (see Lemma 4), at each $P\in \mathcal{P}_{1}$ infinitely many blocks can potentially merge. Lemma 1 shows that the indicators of whether blocks present immediately before *P* merge with the block of 1 are $\text{exchangeable}(X)$ indicators with $X>0$ almost surely. Then, Lemma 3 ensures that almost surely a positive fraction of them does, causing $f_{1}$ to jump (since a positive fraction of the merging blocks has positive frequency). Thus, every Poisson point leads to a merger almost surely, which shows that $f_{1}$ jumps at all potential jump times described above. Since, for all *t*, either $S_{t}>0$ or non-dust blocks not including 1 exist (having asymptotic frequency $>0$), we have $f_{1}(t)<1$ for all $t\ge 0$.

We consider the jump chain of $f_{1}$. Set $X_{1}:=f_{1}[1]$. Since $f_{1}[k]\in (0,1)$ for all $k\in \mathbb{N}$ and $f_{1}$ increases, $f_{1}[k+1]=f_{1}[k]+(1-f_{1}[k])X_{k+1}$ for some $X_{k+1}\in (0,1)$. Iterating this yields $f_{1}[k]={\sum _{i=1}^{k}}X_{i}{\prod _{j=1}^{i-1}}(1-X_{j})$ for $k\ge 2$. The properties of $(X_{k})_{k\in \mathbb{N}}$ follow from the Poisson construction and Lemma 1. Consider the blocks present at time $T_{k}-$, where the *k*th Poisson point of $\mathcal{P}_{1}$ is $P_{k}=(T_{k},({K_{i}^{(k)}})_{i\in \mathbb{N}})$. The block with least element *i* merges with the block of 1 if ${K_{i}^{(k)}}={K_{1}^{(k)}}$. $X_{k}$ gives the fraction of the asymptotic frequency of non-singleton blocks and singleton blocks at time $T_{k}-$, i.e. the fraction of $1-f_{1}(T_{k}-)$, that is merged with the block of 1 at $T_{k}$. Denote ${L_{i}^{(k-)}}:=1_{\left\{\{i\}\hspace{2.5pt}\text{is a block at}\hspace{2.5pt}T_{k}-\right\}}$. Then, recording the asymptotic frequencies of merged non-singleton and singleton blocks, \[ X_{k}=\frac{1}{1-f_{1}(T_{k}-)}\Bigg(\sum \limits_{i\ge 2}1_{\{{K_{1}^{(k)}}={K_{i}^{(k)}}\}}f_{i}(T_{k}-)+\underset{n\to \infty }{\lim }\frac{1}{n}{\sum \limits_{i=2}^{n}}1_{\{{K_{1}^{(k)}}={K_{i}^{(k)}}\}}{L_{i}^{(k-)}}\Bigg).\]

Since by construction $\varPi _{T_{k}-}\setminus \{1\}$ is an exchangeable partition of $\mathbb{N}\setminus \{1\}$, the $({L_{i}^{(k-)}})_{i\in \mathbb{N}}$ are $\text{exchangeable}(S_{T_{k}-})$ indicators with $S_{T_{k}-}=1-{\sum _{i=1}^{\infty }}f_{i}(T_{k}-)$. Recall that Lemma 1 tells us that $(1_{\{{K_{1}^{(k)}}={K_{i}^{(k)}}\}})_{i\ge 2}$ are $\text{exchangeable}({X^{\prime }})$ indicators with ${X^{\prime }}\stackrel{d}{=}Q$. $({K_{i}^{(k)}})_{i\in \mathbb{N}}$ is independent from $(\varPi _{t})_{t<T_{k}}$, since the Poisson points of $\mathcal{P}_{1}$ are i.i.d., so Lemma 3 c) and a) show

##### (14)

\[ X_{k}\stackrel{a.s.}{=}\sum \limits_{i\ge 2}1_{\{{K_{1}^{(k)}}={K_{i}^{(k)}}\}}\frac{f_{i}(T_{k}-)}{1-f_{1}(T_{k}-)}+{X^{\prime }}\frac{1-{\textstyle\sum _{i=1}^{\infty }}f_{i}(T_{k}-)}{1-f_{1}(T_{k}-)}.\]

The independence of $({K_{i}^{(k)}})_{i\in \mathbb{N}}$ from $(\varPi _{t})_{t<T_{k}}$ is also crucial for the next two equations. Compute, with $P({K_{1}^{(k)}}={K_{i}^{(k)}})=E({X^{\prime }})=\gamma $ for $i\in \mathbb{N}$,

\[\begin{array}{r@{\hskip0pt}l}\displaystyle E(X_{k})& \displaystyle =\sum \limits_{i\ge 2}P\big({K_{1}^{(k)}}={K_{i}^{(k)}}\big)E\bigg(\frac{f_{i}(T_{k}-)}{1-f_{1}(T_{k}-)}\bigg)+E\big({X^{\prime }}\big)E\bigg(\frac{1-{\textstyle\sum _{i=1}^{\infty }}f_{i}(T_{k}-)}{1-f_{1}(T_{k}-)}\bigg)\\{} & \displaystyle =\gamma E\bigg(\frac{1-f_{1}(T_{k}-)}{1-f_{1}(T_{k}-)}\bigg)=\gamma .\end{array}\]

Analogously, for $l<k$, $X_{l}$ only depends on Poisson points $P_{1},\dots ,P_{l}$, so \[\begin{array}{r@{\hskip0pt}l}\displaystyle E(X_{k}X_{l})& \displaystyle =E\bigg(\sum \limits_{i\ge 2}1_{\{{K_{1}^{(k)}}={K_{i}^{(k)}}\}}\frac{f_{i}(T_{k}-)}{1-f_{1}(T_{k}-)}X_{l}+{X^{\prime }}\frac{1-{\textstyle\sum _{i=1}^{\infty }}f_{i}(T_{k}-)}{1-f_{1}(T_{k}-)}X_{l}\bigg)\\{} & \displaystyle =\gamma E\bigg(\frac{1-f_{1}(T_{k}-)}{1-f_{1}(T_{k}-)}X_{l}\bigg)={\gamma }^{2},\end{array}\]

showing that $X_{k},X_{l}$ are uncorrelated. An analogous computation shows that $E(\prod _{i\in \{l_{1},\dots ,l_{m}\}}X_{l_{i}})=\prod _{i\in \{l_{1},\dots ,l_{m}\}}E(X_{l_{i}})$ for distinct $l_{1},\dots ,l_{m}\in \mathbb{N}$. With this, \[ E\big(f_{1}[k]\big)={\sum \limits_{i=1}^{k}}E(X_{i}){\prod \limits_{j=1}^{i-1}}\big(1-E(X_{j})\big)={\sum \limits_{i=1}^{k}}\gamma {(1-\gamma )}^{i-1}=1-{(1-\gamma )}^{k}.\]
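The identity $E(f_{1}[k])=1-{(1-\gamma )}^{k}$ can be illustrated numerically. The sketch below (not part of the proof) simulates the jump chain of $f_{1}$ for the Dirac coalescent $\varLambda =\delta _{p}$, for which $\gamma =p$, using the coin-flipping Poisson construction of Section 2; the asymptotic frequencies of the singleton sets and merged blocks are tracked exactly.

```python
import random

def dirac_f1_jumps(p, njumps, rng):
    """Jump chain values f_1[1], ..., f_1[njumps] for a Dirac(p) coalescent,
    simulated via the coin-flipping Poisson construction."""
    q = 1.0 - p
    dust = 1.0      # asymptotic frequency of singletons other than {1}
    blocks = []     # frequencies of the blocks not containing 1
    f1 = 0.0        # current frequency of the block of 1
    jumps = []
    while len(jumps) < njumps:
        new = p * dust              # singleton set S_k: fraction p of the dust
        dust *= q
        keep = []
        for b in blocks:
            if rng.random() < p:    # 'heads': b joins the newly formed block
                new += b
            else:
                keep.append(b)
        blocks = keep
        if rng.random() < p:        # coin of (the block of) 1 comes up 'heads'
            f1 += new
            jumps.append(f1)
        else:
            blocks.append(new)
    return jumps

rng = random.Random(1)
p = 0.5
runs = [dirac_f1_jumps(p, 3, rng) for _ in range(20000)]
means = [sum(r[k] for r in runs) / len(runs) for k in range(3)]
# expected: 1 - (1-p)^k, i.e. 1/2, 3/4, 7/8 for p = 1/2
```

With $p=\frac{1}{2}$ and 20000 runs, the empirical means of $f_{1}[1],f_{1}[2],f_{1}[3]$ are close to $\frac{1}{2},\frac{3}{4},\frac{7}{8}$.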

To prove $\lim _{t\to \infty }f_{1}(t)=1$ almost surely, observe that $f_{1}$ is bounded and increasing, so $\lim _{t\to \infty }f_{1}(t)$ exists almost surely. Monotone convergence and $\lim _{t\to \infty }E(f_{1}(t))=\lim _{k\to \infty }E(f_{1}[k])=1$ then yield the claim. Note that $(X_{k})_{k\in \mathbb{N}}$ is in general neither independent nor identically distributed, see Section 6. □

##### Proof of Corollary 1.

By the Poisson construction the block of 1 for ${\varPi }^{(n)}$ can only merge at times given by Poisson points in $\mathcal{P}_{1}$. Consider $(T_{1},({K_{i}^{(1)}})_{i\in \mathbb{N}})\in \mathcal{P}_{1}$. While $T_{1}$ is the time of the first jump of $f_{1}$ (see the proof of Theorem 1), there does not necessarily need to be a merger of $\{1\}$ in the

*n*-coalescent ${\varPi }^{(n)}$ if we have ${K_{1}^{(1)}}\ne {K_{i}^{(1)}}$ for the least elements *i* of all other blocks of ${\varPi }^{(n)}$ immediately before $T_{1}$. However, Lemma 1 shows that $(1_{\{{K_{1}^{(1)}}={K_{i}^{(1)}}\}})_{i\ge 2}$ are exchangeable indicators whose mean ${n}^{-1}{\sum _{i=2}^{n}}1_{\{{K_{1}^{(1)}}={K_{i}^{(1)}}\}}$, as argued in the proof of Theorem 1, converges for $n\to \infty $ to an almost surely positive random variable. As shown in Lemma 4, any *Ξ*-coalescent with dust has infinitely many blocks almost surely before $T_{1}$. Thus, there exists a random variable *N* on $\mathbb{N}$ such that 1 also merges at time $T_{1}$ in ${\varPi }^{(n)}$ for all $n\ge N$, almost surely. This yields $\lim _{n\to \infty }{n}^{-1}M_{n}=\lim _{n\to \infty }{n}^{-1}|B_{1}(T_{1})\cap [n]|=f_{1}(T_{1})=f_{1}[1]$ almost surely. All further claims follow from Theorem 1. □

##### Remark 5.

*Let* ${Q}^{(n)}$ *be the number of blocks merged at the first collision of the block of 1 in a Λ-n-coalescent with dust. [34, 1.4] shows that* ${n}^{-1}{Q}^{(n)}$ *converges in distribution. We argue that this convergence also holds in* ${L}^{p}$ *for all* $p>0$ *and, for simple Λ-n-coalescents, almost surely. The proof of Corollary 1 shows that* $(T_{1},({K_{i}^{(1)}})_{i\in \mathbb{N}})\in \mathcal{P}_{1}$ *causes the first merger in the n-coalescent for n large enough (almost surely; since* ${n}^{-1}{Q}^{(n)}\in [0,1]$ *for all n, convergence in* ${L}^{p}$ *is not affected by the excluded null set). Split* ${Q}^{(n)}$ *into* ${Q_{0}^{(n)}}$*, the number of non-singleton blocks, and* ${Q_{1}^{(n)}}$*, the number of singleton blocks merged at* $T_{1}$*. For the limit, we can ignore the merged non-singleton blocks. To see this, recall* ${Q_{0}^{(n)}}\le K_{n}$*, where* $K_{n}$ *is the total number of mergers of the Λ-n-coalescent, since a non-singleton block has to be the result of a merger. [12, Lemma 4.1] tells us that* ${n}^{-1}K_{n}\to 0$ *in* ${L}^{1}$ *for* $n\to \infty $ *for Ξ-coalescents with dust. This shows that the* ${L}^{1}$*-limit of* ${n}^{-1}{Q}^{(n)}$ *is the same as that of* ${n}^{-1}{Q_{1}^{(n)}}$*. The variable* ${n}^{-1}{Q_{1}^{(n)}}$ *already appeared in the part of the proof of Theorem 1 leading to Eq.* (14)*; its limit exists almost surely and equals* ${X^{\prime }}\frac{1-{\sum _{i=1}^{\infty }}f_{i}(T_{1}-)}{1-f_{1}(T_{1}-)}$*. So* ${n}^{-1}{Q}^{(n)}$ *converges in* ${L}^{1}$ *and, being bounded in* $[0,1]$*, also in* ${L}^{p}$ *for all* $p>0$*. For simple Ξ-n-coalescents, [11, Lemma 4.2] shows* ${n}^{-1}K_{n}\to 0$ *almost surely, so in this case the steps above also ensure almost sure convergence of* ${n}^{-1}{Q}^{(n)}$*.*
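For the Dirac coalescent $\varLambda =\delta _{p}$, the limit variable of ${n}^{-1}{Q}^{(n)}$ simplifies: ${X^{\prime }}\equiv p$, $f_{1}(T_{1}-)=0$ and the dust frequency at $T_{1}-$ equals ${q}^{C-1}$ with $q=1-p$, where $C\stackrel{d}{=}\mathrm{Geo}(p)$ indexes the Poisson point at which the coin of 1 first comes up ‘heads’; the limit is then $p{q}^{C-1}$. A Monte Carlo sketch (the choices of $n$, $p$ and the run count are arbitrary) compares the mean of ${n}^{-1}{Q}^{(n)}$ with the mean of $p{q}^{C-1}$:

```python
import random

def first_merger_fraction(n, p, rng):
    """n^{-1} Q^{(n)}: fraction of blocks merged at the first collision
    of the block of 1 in a Dirac(p) n-coalescent (coin construction)."""
    m = n - 1                               # blocks other than {1}
    while True:
        heads = sum(1 for _ in range(m) if rng.random() < p)
        if rng.random() < p and heads >= 1:
            return (heads + 1) / n          # merged blocks, including that of 1
        if heads >= 2:                      # a merger not involving 1
            m = m - heads + 1

rng = random.Random(2)
n, p = 2000, 0.5
q = 1 - p
est = sum(first_merger_fraction(n, p, rng) for _ in range(400)) / 400
# mean of the limit p * q^{C-1} with C ~ Geo(p), truncated numerically
lim = sum((p * q ** (k - 1)) ** 2 for k in range(1, 60))
```

For $p=\frac{1}{2}$ the truncated series gives $\approx \frac{1}{3}$, and the simulated mean agrees up to Monte Carlo error.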

## 5 The block of 1 in simple *Λ*-coalescents – proofs and remarks

##### Proof of Proposition 3.

Let $\mathcal{P}:=(P_{i})_{i\in \mathbb{N}}$ be the coin probabilities coming from the Poisson process used to construct the simple

*Λ*-coalescent *Π* as described in Section 2. As shown in the proof of Theorem 1, the Poisson point belonging to $P_{C}$, where 1 first throws ‘heads’ in the Poisson construction, is the Poisson point where $f_{1}$ jumps for the first time. We have $P(C=k|\mathcal{P})=P_{k}{\prod _{i=1}^{k-1}}(1-P_{i})$. Integrating out the conditioning and using the independence of $(P_{i})_{i\in \mathbb{N}}$ as well as $E(P_{1})=\alpha $ (see Lemma 2), we see that *C* is geometrically distributed with parameter *α*. To describe $f_{1}[1]$ at the

*C*th merger (Poisson point), recall that the restriction $\varPi _{-1}$ of*Π*to $\mathbb{N}\setminus \{1\}$ has the same asymptotic frequencies as*Π*. Thus, we can see $f_{1}[1]$ as the asymptotic frequency of the newly formed block of $\varPi _{-1}$ at the time of the Poisson point $P_{C}$. This follows since $\varPi _{-1}$ has infinitely many blocks before (see Lemma 4) and then, as in the proof of Theorem 1, there will be a newly formed block of $\varPi _{-1}$ at the*C*th Poisson point (and the unrestricted block in*Π*includes 1).We consider $\varPi _{-1}$ at the

*k*th Poisson point with coin probability $P_{k}$. For $\{i\}$ with $i\in \mathbb{N}\setminus \{1\}$ to remain a (singleton) block and not be merged for the first $k-1$ mergers and then to be merged at the *k*th, we need $\prod _{j\in [k-1]}(1-{K_{i}^{(j)}})=1$ and ${K_{i}^{(k)}}=1$. $(1_{\{\prod _{j\in [k-1]}(1-{K_{i}^{(j)}})=1,{K_{i}^{(k)}}=1\}})_{i\in \mathbb{N}}$ are $\text{exchangeable}(P_{k}\prod _{j\in [k-1]}(1-P_{j}))$ indicators. Let \[ \mathcal{S}_{k}=\bigg\{i\in \mathbb{N}\setminus \{1\}:\prod \limits_{j\in [k-1]}\big(1-{K_{i}^{(j)}}\big)=1,{K_{i}^{(k)}}=1\bigg\}\]

be the set of $i\in \mathbb{N}\setminus \{1\}$ whose first merger is the *k*th overall merger. We call $\mathcal{S}_{k}$ the*k*th singleton set (of $\varPi _{-1}$). From the strong law of large numbers for exchangeable indicators, see Lemma 3a), we directly have that $\mathcal{S}_{k}$ has asymptotic frequency $P_{k}\prod _{j\in [k-1]}(1-P_{j})$ almost surely.Now, consider the asymptotic frequency ${f}^{\ast }[k]$ of the newly formed block at the

*k*th merger of $\varPi _{-1}$. By construction, there is only one newly formed block at each merger, and $\mathcal{S}_{k}$ is a part of it. Any other present block with at least two elements (non-singleton block) is merged if and only if its indicator ${K_{i}^{(k)}}=1$ (we order by least elements). For $k=1$, the newly formed block is $\mathcal{S}_{1}$. For $k=2$, it is either $\mathcal{S}_{2}$ or $\mathcal{S}_{1}\cup \mathcal{S}_{2}$, the latter if the coin of the block $\mathcal{S}_{1}$ formed in the first merger comes up ‘heads’. Applied successively, this shows that the newly formed block at the *k*th merger consists of a union of a subset of the singleton sets $(\mathcal{S}_{{k^{\prime }}})_{{k^{\prime }}<k}$ and the set $\mathcal{S}_{k}$. For its asymptotic frequency, we have

##### (15)

\[ {f}^{\ast }[k]={\sum \limits_{i=1}^{k}}{B_{i}^{(k)}}P_{i}\prod \limits_{j\in [i-1]}(1-P_{j})>0,\]

where the ${B_{i}^{(k)}}$, $i\in [k]$, are non-independent Bernoulli variables which are 1 if the *i*th singleton set $\mathcal{S}_{i}$ is a part of the newly formed block at the *k*th merger of $\varPi _{-1}$. If $\varLambda (\{1\})>0$, $P_{k}=1$ is possible. In this case, at the

*k*th Poisson point all remaining singletons form $\mathcal{S}_{k}$ and all blocks present at merger $k-1$ merge with $\mathcal{S}_{k}$. There are no mergers at Poisson points $P_{l}$, $l>k$, so we do not consider Eq. (15) for $l>k$.We have $f_{1}[1]={f}^{\ast }[C]$. Given $\mathcal{P}$, $({f}^{\ast }[k])_{k\in \mathbb{N}}$ is independent of

*C*. Thus, Eq. (9) is implied by Eq. (15).Assume $\varLambda (\{1\})=0$. For $({B_{i}^{(k)}})_{k\in \mathbb{N},i\in [k]}$, we have ${B_{k}^{(k)}}=1$ for all $k\in \mathbb{N}$ since the

*k*th singleton set is formed at the *k*th Poisson point and is a part of the newly formed block. The coins thrown at the *k*th Poisson point to decide whether other singleton sets $\mathcal{S}_{i}$, $\mathcal{S}_{j}$ with $i,j<k$ are also parts of the newly formed block are either independent given $\mathcal{P}$, when the sets lie in different blocks, or identical, when they lie in the same block. The set $\mathcal{S}_{i}$ uses the coin of the block newly formed at the *i*th merger. Let $I(i)$ be the Poisson point at which this block merges again (and $\mathcal{S}_{i}$ with it). At the $I(i)$th Poisson point and for all further Poisson points indexed with $j\ge I(i)$, we have ${B_{i}^{(j)}}={B_{I(i)}^{(j)}}$, since the singleton sets $\mathcal{S}_{i}$ and $\mathcal{S}_{I(i)}$ are in the same block for mergers $j\ge I(i)$. Property (i) of $I(i)$ in the proposition follows directly from its definition as the minimal number of coin tosses until the first comes up ‘heads’. Property (ii) follows by integrating (i) and using that $(P_{i})_{i\in \mathbb{N}}$ are i.i.d. with $E(P_{1})=\alpha $ (see Lemma 2); the conditional independence is that of the coin tosses of distinct blocks in the Poisson construction. To see Eq. (10), observe that $\mathcal{S}_{i}$ for $i<j$ is a part of the newly formed block at the

*j*th merger of the *Λ*-coalescent ($i\in J$) if and only if $I(i)\in J$. If $I(i)\in J$, either $I(i)=j$, so $\mathcal{S}_{i}$ merges at the *j*th merger for the first time after it has been formed, or $I(i)<j$, which means that $\mathcal{S}_{i}$ has already merged with at least one other singleton set and that, as parts of the same block, they both merged again at the *j*th merger. If $I(i)\notin J$, the singleton set $\mathcal{S}_{i}$ neither merges at the *j*th merger for the first time after being formed, nor does it merge earlier with other singleton sets into a block that then merges at the *j*th merger; so $\mathcal{S}_{i}$ is not a part of the newly formed block at the *j*th merger. If $\varLambda (\{1\})=0$, these arguments hold true for all $j\in \mathbb{N}$. If $\varLambda (\{1\})>0$, they hold true for all $i,j\le K:=\min \{k\in \mathbb{N}:P_{k}=1\}$ ($<\infty $ almost surely), where at the *K*th merger all singleton sets merge and ${f}^{\ast }[K]=1$. However, in this case $C\le K$, so we can still establish Eq. (9). □
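The fact that *C* is geometric with parameter $\alpha =E(P_{1})$, whatever the law of the i.i.d. coin probabilities, can be checked quickly by simulation; the uniform law below is only a hypothetical example with $\alpha =0.5$.

```python
import random

def sample_C(draw_p, rng):
    """Index of the first Poisson point at which the coin of 1 comes up
    'heads', for i.i.d. coin probabilities P_1, P_2, ... drawn by draw_p."""
    k = 1
    while True:
        if rng.random() < draw_p(rng):
            return k
        k += 1

rng = random.Random(4)
draw_unif = lambda r: r.uniform(0.2, 0.8)   # E(P_1) = alpha = 0.5 (example law)
samples = [sample_C(draw_unif, rng) for _ in range(50000)]
# P(C=k) = E(P_k prod_{i<k} (1-P_i)) = alpha (1-alpha)^{k-1} by independence
emp = [sum(1 for c in samples if c == k) / len(samples) for k in (1, 2, 3)]
```

The empirical frequencies of $\{C=1\},\{C=2\},\{C=3\}$ come out close to $0.5$, $0.25$, $0.125$, although the individual $P_{k}$ are far from constant.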

##### Remarks 6.

- • $(I(i))_{i\in \mathbb{N}}$ *is useful to construct the asymptotic frequencies of the Λ-coalescent. Given* $\mathcal{P}$*, at the kth merger there are the singleton sets* $(\mathcal{S}_{j})_{j\in [k]}$ *with almost sure frequencies* $P_{j}\prod _{i\in [j-1]}(1-P_{i})$*, which were already formed in the first k collisions, and unmerged singleton blocks with frequency* $\prod _{i\in [k]}(1-P_{i})$*. Using* $(I(i))_{i\in [k]}$*, we can indicate which singleton sets form a block:* $\mathcal{S}_{i}$ *is a block of its own if* $I(i)>k$*; if* $I(i)\le k$*, it is a part of the block that* $\mathcal{S}_{I(i)}$ *is also a part of. This can be seen as a discrete version of the construction of the Λ-coalescent from the process of singletons as described in [**15**, Section 6.1].*
- • *The variables* $(I(i))_{i\in \mathbb{N}}$ *are useful to express other quantities of the Λ-coalescent. For instance, the number of non-singleton blocks in a simple Λ-coalescent at the kth merger is given by* $k-\sum _{i\in [k-1]}1_{\{I(i)\le k\}}$*.*
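The count formula in the second bullet can be checked against a direct simulation of the merger structure. The sketch below assumes $\varLambda =\delta _{p}$, so that at each merger every newly formed block that is still ‘alive’ is absorbed independently with probability *p*; this is also how the $I(i)$ are generated.

```python
import random

def check_block_count(p, steps, rng):
    """Simulate the merger structure of a Dirac(p) coalescent, record I(i)
    for blocks that merge again, and compare the direct count of
    non-singleton blocks with k - sum_{i in [k-1]} 1{I(i) <= k}."""
    I = {}        # I[i]: merger at which the block formed at merger i merges again
    alive = []    # newly formed blocks that have not merged again
    for k in range(1, steps + 1):
        survivors = []
        for i in alive:
            if rng.random() < p:    # 'heads': block i joins the new block
                I[i] = k
            else:
                survivors.append(i)
        alive = survivors + [k]     # the k-th merger forms one new block
        formula = k - sum(1 for i in range(1, k) if i in I and I[i] <= k)
        assert len(alive) == formula
    return I

I = check_block_count(0.5, 200, random.Random(3))
```

The internal assertion compares both counts at every merger; by definition $I(i)>i$ for every recorded index.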

To prove Proposition 2, we need the following result.

##### Lemma 5.

*For $p\in [\frac{1}{2},1)$, each $x\in \mathcal{M}_{p}$ from Eq. (6) has a unique representation in $\mathcal{M}_{p}$.*

##### Proof.

We adjust the proof of [4, Theorem 7.11]. Assume that $x\in \mathcal{M}_{p}$ has two representations $x=\sum _{i\in \mathbb{N}}b_{i}p{q}^{i-1}=\sum _{i\in \mathbb{N}}{b^{\prime }_{i}}p{q}^{i-1}$ with $b_{i}\ne {b^{\prime }_{i}}$ for at least one

*i*. Let $i_{0}$ be the smallest integer with $b_{i_{0}}\ne {b^{\prime }_{i_{0}}}$. Without restriction, assume $b_{i_{0}}-{b^{\prime }_{i_{0}}}=1$. Then, \[ 0=\sum \limits_{i\in \mathbb{N}}b_{i}p{q}^{i-1}-\sum \limits_{i\in \mathbb{N}}{b^{\prime }_{i}}p{q}^{i-1}=p{q}^{i_{0}-1}+\sum \limits_{i>i_{0}}\big(b_{i}-{b^{\prime }_{i}}\big)p{q}^{i-1}.\]

Thus, $p{q}^{i_{0}-1}=\sum _{i>i_{0}}({b^{\prime }_{i}}-b_{i})p{q}^{i-1}<\sum _{i>i_{0}}p{q}^{i-1}=p{q}^{i_{0}}/(1-q)={q}^{i_{0}}$, simplifying to $p<q$, in contradiction to the assumption $p\ge \frac{1}{2}$. □

##### Proof of Proposition 2.

From Eq. (15) we see that $f_{1}$ only takes values in $\mathcal{M}_{p}$, since $P_{k}=p$ for all $k\in \mathbb{N}$ and $C<\infty $ almost surely. Recall the definition of the singleton sets $\mathcal{S}_{i}$ and their properties from the proof of Proposition 3. The asymptotic frequency of $\mathcal{S}_{i}$ is $p{q}^{i-1}$ almost surely. Lemma 5 ensures that there is a unique representation $f_{1}[l]=x={\sum _{i=1}^{\infty }}b_{i}p{q}^{i-1}$ in $\mathcal{M}_{p}$; let $J:=\{i\in \mathbb{N}:b_{i}=1\}$ and $j:=\max J$. This means that $f_{1}[l]=x$ is equivalent to the statement that the block of 1 at its

*l*th jump consists of the union of all $\mathcal{S}_{i}$ with $i\in J$ and 1. This also shows that the *l*th jump of $f_{1}$ takes place at the *j*th jump of the Dirac coalescent, since if $f_{1}$ jumps at the *k*th merger of the Dirac coalescent, the newly formed block includes $\mathcal{S}_{k}$.

Since $P_{i}=p$ for all $i\in \mathbb{N}$, we have $\alpha =p$ and Eq. (9) simplifies to $f_{1}[1]={\sum _{i=1}^{C}}{B_{i}^{(C)}}p{q}^{i-1}$, where $C\stackrel{d}{=}Geo(p)$ is independent from $({B_{i}^{(k)}})_{k\in \mathbb{N},i\in [k]}$. The latter fulfil

##### (16)

\[ P\big({B_{i}^{(j)}}=b_{i}\hspace{2.5pt}\forall \hspace{2.5pt}i\in [j-1]\big)=\prod \limits_{i\in J\setminus \{j\}}P(Y+i\in J)\prod \limits_{i\in [j]\setminus J}P(Y+i\notin J)\]

with $Y\stackrel{d}{=}Geo(p)$, since the joint distribution in Proposition 3 again simplifies: we can ignore the conditioning, and $I(i)-i\stackrel{d}{=}Geo(p)$ for all $i\in \mathbb{N}$. Since $f_{1}[1]=x$ uniquely determines the values of *C* and $({B_{i}^{(C)}})_{i\in [C]}$, we have

##### (17)

\[ P\big(f_{1}[1]=x\big)=P(C=j)P\big({B_{1}^{(j)}}=b_{1},\dots ,{B_{j-1}^{(j)}}=b_{j-1}\big),\]

which shows Eq. (7) when we insert the distributions expressed in terms of their geometric distributions.

In order to verify that the jump chain $(f_{1}[i])_{i\in \mathbb{N}}$ is Markovian, we show that $f_{1}[1],\dots ,f_{1}[l]$ does not contain more information on $f_{1}[l+1]$ than $f_{1}[l]$ does. Without restriction, assume that the

*l*th jump $f_{1}[l]$ of $f_{1}$ takes place at the *k*th jump of the Dirac coalescent. Then, $f_{1}[l+1]$ is constructed from the blocks present after the *k*th merger. For each subsequent Poisson point $P_{k+1},\dots $, the blocks present are merged if their respective coins come up ‘heads’, until (and including) the point $P_{{k^{\prime }}}$ at which the coin of the block of 1 comes up ‘heads’ for the first time since $P_{k}$. Thus, only information about the block partition at merger *k* can change the law of the next jump. $f_{1}[l]=x$ tells us which singleton sets $\mathcal{S}_{1},\dots ,\mathcal{S}_{k}$ are parts of the block of 1 at merger *k* of *Π* and which are not. $f_{1}[l]=x$ contains no information about how the other singleton sets, $\mathcal{S}_{i}$ with $b_{i}=0$, are merged into blocks at collisions before *k*, apart from telling us that ${B_{i}^{(j)}}=0$ for $j\in J$ and $i\notin J$, i.e. that no $\mathcal{S}_{i}$ with $i\notin J$ merged at the *j*th collision for any $j\in J$. This is because any $\mathcal{S}_{i}$ with ${B_{i}^{(j)}}=1$ would merge with the newly formed block at merger *j* and thus would be in a block with $\mathcal{S}_{j}$, and hence in the block of 1 at merger *k*. However, analogously we see that knowing $f_{1}[1],\dots ,f_{1}[l]$ does not give any additional information about the block structure at the *k*th merger, but only about how the singleton sets $\mathcal{S}_{i}$ which are in the block of 1 at merger *k* behaved at the earlier mergers *J*. Thus, $(f_{1}[l])_{l\in \mathbb{N}}$ is Markovian. However, $(f_{1}(t))_{t\ge 0}$ is not Markovian. In order to see this, consider, for $0<t_{0}<t_{1}<t_{2}$, \[\begin{array}{r@{\hskip0pt}l}\displaystyle p(t_{2},t_{1},t_{0}):=& \displaystyle P\big(f_{1}(t_{2})=p+p{q}^{2}|f_{1}(t_{1})=p,f_{1}(t_{0})=0\big)\\{} \displaystyle =& \displaystyle \frac{P(f_{1}(t_{2})=p+p{q}^{2},f_{1}(t_{1})=p,f_{1}(t_{0})=0)}{P(f_{1}(t_{1})=p,f_{1}(t_{0})=0)}.\end{array}\]

We will show that $p(t_{2},t_{1},t_{0})$ depends on $t_{0}$, which shows that $f_{1}$ is not Markovian. We can express all events in terms of the independent waiting times between Poisson points, i.e. the successive differences of the first components *T* of the Poisson points $(T,(K_{i})_{i\in \mathbb{N}})\in \mathcal{P}$. Here, we use the split of the Poisson points into the independent Poisson point processes $\mathcal{P}_{1}$ and $\mathcal{P}_{2}$ from Lemma 1. The waiting times between points of $\mathcal{P}_{1}$ are $\mathrm{Exp}(\mu _{-1})$-distributed, the waiting times between points of $\mathcal{P}_{2}$ are $\mathrm{Exp}(\mu _{-2}-\mu _{-1})$-distributed, see Lemma 1 and Remark 4. We relabel $\tau =\mu _{-1}$ and $\rho =\mu _{-2}-\mu _{-1}$ for cleaner notation. Let $T_{1},T_{2},\dots $ be the waiting times between points of $\mathcal{P}_{1}$ and ${T^{\prime }_{1}},{T^{\prime }_{2}},\dots $ the waiting times between points of $\mathcal{P}_{2}$; all waiting times are mutually independent. We recall that for $T\stackrel{d}{=}\mathrm{Exp}(\lambda )$, $P(T>a)={e}^{-\lambda a}$ and $P(T\in (a,a+b])={e}^{-\lambda a}(1-{e}^{-\lambda b})$ for $a,b\ge 0$. The event $\{f_{1}(t_{1})=p,f_{1}(t_{0})=0\}$ means that the first jump of $f_{1}$ adds the singleton set $\mathcal{S}_{1}$ at a time in $(t_{0},t_{1}]$. Thus, there has to be exactly one point of $\mathcal{P}_{1}$ with first component $T_{1}\le t_{1}$, and the smallest time ${T^{\prime }_{1}}$ of points of $\mathcal{P}_{2}$ has to be greater than $T_{1}$. We compute, conditioning on $T_{1}$ for the third equation,

\[\begin{array}{r@{\hskip0pt}l}\displaystyle P\big(f_{1}(t_{1})=p,f_{1}(t_{0})=0\big)& \displaystyle =P\big(t_{0}<T_{1}\le t_{1}<T_{1}+T_{2},T_{1}<{T^{\prime }_{1}}\big)\\{} & \displaystyle ={\int _{t_{0}}^{t_{1}}}P(T_{2}>t_{1}-x)P\big({T^{\prime }_{1}}>x\big)\tau {e}^{-\tau x}dx\\{} & \displaystyle ={\int _{t_{0}}^{t_{1}}}{e}^{-\tau (t_{1}-x)}{e}^{-\rho x}\tau {e}^{-\tau x}dx\\{} & \displaystyle =\frac{\tau }{\rho }{e}^{-\tau t_{1}}{\int _{t_{0}}^{t_{1}}}\rho {e}^{-\rho x}dx=\frac{\tau }{\rho }{e}^{-\tau t_{1}}\big({e}^{-\rho t_{0}}-{e}^{-\rho t_{1}}\big).\end{array}\]
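This probability involves only three independent exponential variables, so it is easy to double-check by simulation (the rates $\tau ,\rho $ and times below are arbitrary example values):

```python
import math
import random

rng = random.Random(5)
tau, rho = 1.3, 0.7        # example values for mu_{-1} and mu_{-2} - mu_{-1}
t0, t1 = 0.2, 0.6
trials = 200000
hits = 0
for _ in range(trials):
    T1 = rng.expovariate(tau)   # first point of P_1
    T2 = rng.expovariate(tau)   # waiting time to the second point of P_1
    U1 = rng.expovariate(rho)   # first point of P_2
    if t0 < T1 <= t1 < T1 + T2 and T1 < U1:
        hits += 1
est = hits / trials
formula = (tau / rho) * math.exp(-tau * t1) * (math.exp(-rho * t0) - math.exp(-rho * t1))
```

The empirical frequency matches the closed form up to Monte Carlo error.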

Analogously, we compute (by conditioning on $T_{1},T_{2}$ for the second equality) \[\begin{array}{r@{\hskip0pt}l}& \displaystyle P\big(f_{1}(t_{2})=p+p{q}^{2},f_{1}(t_{1})=p,f_{1}(t_{0})=0\big)\\{} & \displaystyle \hspace{1em}=P\big(t_{0}<T_{1}\le t_{1}<T_{1}+T_{2}\le t_{2}<T_{1}+T_{2}+T_{3},T_{1}<{T^{\prime }_{1}}\le T_{1}+T_{2}<{T^{\prime }_{2}}\big)\\{} & \displaystyle \hspace{1em}={\int _{t_{0}}^{t_{1}}}{\int _{t_{1}-x}^{t_{2}-x}}P\big(T_{3}\hspace{0.1667em}>\hspace{0.1667em}t_{2}\hspace{0.1667em}-\hspace{0.1667em}x\hspace{0.1667em}-\hspace{0.1667em}y,{T^{\prime }_{1}}\hspace{0.1667em}\in \hspace{0.1667em}\big(x,x\hspace{0.1667em}+\hspace{0.1667em}y],{T^{\prime }_{2}}\hspace{0.1667em}>\hspace{0.1667em}x\hspace{0.1667em}+\hspace{0.1667em}y\big){\tau }^{2}{e}^{-\tau x}{e}^{-\tau y}dy\hspace{2.5pt}dx\\{} & \displaystyle \hspace{1em}={\tau }^{2}{\int _{t_{0}}^{t_{1}}}{\int _{t_{1}-x}^{t_{2}-x}}{e}^{-\tau (t_{2}-x-y)}{e}^{-\rho x}\big(1-{e}^{-\rho y}\big){e}^{-\rho (x+y)}{e}^{-\tau x}{e}^{-\tau y}dy\hspace{2.5pt}dx\\{} & \displaystyle \hspace{1em}=\frac{{\tau }^{2}}{\rho }{e}^{-\tau t_{2}}\bigg[{\rho }^{-1}\big({e}^{-\rho t_{1}}\hspace{0.1667em}-\hspace{0.1667em}{e}^{-\rho t_{2}}\big)\big({e}^{-\rho t_{0}}\hspace{0.1667em}-\hspace{0.1667em}{e}^{-\rho t_{1}}\big)\hspace{0.1667em}-\hspace{0.1667em}\frac{1}{2}\big({e}^{-2\rho t_{1}}\hspace{0.1667em}-\hspace{0.1667em}{e}^{-2\rho t_{2}}\big)(t_{1}-t_{0})\bigg].\end{array}\]

Taking the ratio shows that

\[ p(t_{2},t_{1},t_{0})=\frac{\tau }{\rho }{e}^{-\tau (t_{2}-t_{1})}\big({e}^{-\rho t_{1}}-{e}^{-\rho t_{2}}\big)-\underset{\ne 0}{\underbrace{\frac{\tau }{2}{e}^{-\tau (t_{2}-t_{1})}\frac{{e}^{-2\rho t_{1}}-{e}^{-2\rho t_{2}}}{{e}^{-\rho t_{0}}-{e}^{-\rho t_{1}}}}}(t_{1}-t_{0})\]

depends on $t_{0}$, so $f_{1}$ is not Markovian. □

##### Remark 7.

*Our proof of Proposition 2 relies on the unique representation in* $\mathcal{M}_{p}$*. This means that it also holds true for all* $p\in (0,{2}^{-1})$ *for which each* $x\in \mathcal{M}_{p}$ *has a unique representation in* $\mathcal{M}_{p}$*, e.g. for all transcendental p. If the representation is not unique, Eq.* (16) *is still correct, but the right side of Eq.* (17) *does not equal* $P(f_{1}[1]=x)$*. Instead, it gives the contribution to* $P(f_{1}[1]=x)$ *from the paths of* $f_{1}$ *which fulfil* $C=j,{B_{1}^{(j)}}=b_{1},\dots ,{B_{j-1}^{(j)}}=b_{j-1}$ *(recall that* $j,b_{1},\dots ,b_{j-1}$ *depend on the representation of x). Moreover,* $P(f_{1}[1]=x)$ *is then the sum of* $P(C=j)P({B_{1}^{(j)}}=b_{1},\dots ,{B_{j-1}^{(j)}}=b_{j-1})$ *over the tuples* $(j,b_{1},\dots ,b_{j-1})$ *coming from the different representations of x (the sets of paths are disjoint if the parameter sets* $(j,b_{1},\dots ,b_{j-1})$ *differ). Since the proofs of our results on the Markov property of both* $f_{1}$ *and its jump chain also rely on the unique representation of x (to read off which blocks merged when), they do not extend to p which do not allow a unique representation of x.*
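The dichotomy of Remark 7 can be illustrated numerically: for $p\ge \frac{1}{2}$, distinct finite index sets give distinct values of ${\sum _{i\in J}}p{q}^{i-1}$, while for $p=(3-\sqrt{5})/2<\frac{1}{2}$, where $q$ solves ${q}^{2}+q=1$, the index sets $\{1\}$ and $\{2,3\}$ represent the same $x$. A small check (the choices $p=0.6$ and $n=12$ are arbitrary):

```python
from itertools import combinations
from math import sqrt

def subset_sums(p, n):
    """Values sum_{i in J} p q^{i-1} over all subsets J of {1, ..., n}."""
    q = 1.0 - p
    terms = [p * q ** i for i in range(n)]
    return sorted(sum(terms[i] for i in J)
                  for r in range(n + 1) for J in combinations(range(n), r))

s = subset_sums(0.6, 12)                      # p >= 1/2: all sums distinct
min_gap = min(b - a for a, b in zip(s, s[1:]))
p = (3 - sqrt(5)) / 2                         # here q^2 + q = 1 holds
q = 1 - p
collision = abs(p - (p * q + p * q ** 2))     # {1} vs {2,3}: same value
```

For $p=0.6$ the minimal gap between the $2^{12}$ subset sums stays safely above floating-point noise, while for the golden-ratio-type $p$ the two representations agree to machine precision.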

## 6 Example

We provide a concrete example showing that the random variables $(X_{k})_{k\in \mathbb{N}}$ from Theorem 1 are, in general, neither independent nor identically distributed.

Choose $\varLambda =\delta _{\frac{1}{2}}$ and consider $f_{1}$ in the corresponding

*Λ*-coalescent. Recall that $f_{1}[l]=x\in \mathcal{M}_{\frac{1}{2}}$ already fixes which singleton sets $\mathcal{S}_{k}$ are parts of the block of 1 at its*l*th merger and which are not. First, assume $f_{1}[1]=X_{1}=\frac{5}{8}=\frac{1}{2}+\frac{1}{{2}^{3}}\in \mathcal{M}_{\frac{1}{2}}$, which means that the coin of 1 comes up ‘heads’ for the first time at the third Poisson point and the block of 1 is $\mathcal{S}_{1}\cup \mathcal{S}_{3}$, while $\mathcal{S}_{2}$ is a block of its own (an event happening with probability $>0$). Assume further $f_{1}[2]=\frac{11}{16}=\frac{5}{8}+\frac{1}{16}$. This sets $X_{2}=(f_{1}[2]-f_{1}[1])/(1-X_{1})=\frac{1}{6}\notin \mathcal{M}_{\frac{1}{2}}$. We read off that the coin of the block of 1 also comes up ‘heads’ at the fourth collision, where the block of 1 merges with $\mathcal{S}_{4}$. We also see that the coin of the only other block $\mathcal{S}_{2}$ comes up ‘tails’. We thus have, since we throw fair coins, $P(X_{2}=\frac{1}{6}|X_{1}=\frac{5}{8})=P(f_{1}[2]=\frac{11}{16}|f_{1}[1]=\frac{5}{8})=\frac{1}{4}$. Since $X_{1}=f_{1}[1]\in \mathcal{M}_{\frac{1}{2}}$ for any realisation, $X_{1}$ and $X_{2}$ have different distributions. To see also non-independence, consider $f_{1}[1]=X_{1}=\frac{1}{2}$ (coin of 1 comes up ‘heads’ at first coin toss, block of 1 is $\mathcal{S}_{1}$, occurs with probability $\frac{1}{2}$). In this case $P(X_{2}=\frac{1}{6}|X_{1}=\frac{1}{2})=0$, since $f_{1}[2]=X_{1}+(1-X_{1})X_{2}=\frac{7}{12}\notin \mathcal{M}_{\frac{1}{2}}$.
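The two conditional probabilities of this example can be reproduced by simulating which singleton sets compose the block of 1 (a sketch for $\varLambda =\delta _{1/2}$; recall that $f_{1}[1]=\frac{5}{8}$ corresponds to the block $\mathcal{S}_{1}\cup \mathcal{S}_{3}\cup \{1\}$ and $f_{1}[2]=\frac{11}{16}$ to additionally merging $\mathcal{S}_{4}$):

```python
import random

def block_of_1_sets(p, njumps, rng):
    """Indices of the singleton sets composing the block of 1 at its first
    njumps jumps, in a Dirac(p) coalescent (coin-flipping construction)."""
    blocks = []                 # each block: set of singleton-set indices
    mine = set()                # singleton sets in the block of 1
    history = []
    k = 0
    while len(history) < njumps:
        k += 1
        new = {k}               # the k-th Poisson point forms S_k
        keep = []
        for b in blocks:
            if rng.random() < p:        # 'heads': b joins the new block
                new |= b
            else:
                keep.append(b)
        blocks = keep
        if rng.random() < p:            # coin of (the block of) 1
            mine |= new
            history.append(frozenset(mine))
        else:
            blocks.append(new)
    return history

rng = random.Random(6)
runs = [block_of_1_sets(0.5, 2, rng) for _ in range(100000)]
base = sum(1 for h in runs if h[0] == {1, 3}) / len(runs)     # P(f_1[1] = 5/8)
cond = [h[1] for h in runs if h[0] == {1, 3}]
frac = sum(1 for c in cond if c == {1, 3, 4}) / len(cond)     # estimate of 1/4
```

Counting the required coin tosses (1 throws ‘tails, tails, heads’, $\mathcal{S}_{1}$ throws ‘tails’ then ‘heads’, $\mathcal{S}_{2}$ throws ‘tails’) gives $P(f_{1}[1]=\frac{5}{8})=\frac{1}{64}$ under this construction, and the estimated conditional probability is close to $\frac{1}{4}$, as computed above.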