We prove a quantitative functional central limit theorem for one-hidden-layer neural networks with generic activation function. Our rates of convergence depend heavily on the smoothness of the activation function, and they range from logarithmic for nondifferentiable nonlinearities such as the ReLU to $\sqrt{n}$ for highly regular activations. Our main tools are functional versions of the Stein–Malliavin method; in particular, we rely on a quantitative functional central limit theorem recently established by Bourguin and Campese [Electron. J. Probab. 25 (2020), 150].
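For orientation, here is a minimal sketch of the kind of object such a theorem concerns, under assumptions of our own: the $1/\sqrt{n}$ scaling, the i.i.d. Gaussian weights $v_j, w_j$, and the symbol $\sigma$ for the activation are illustrative and are not notation taken from the abstract.
\[
  f_n(x) \;=\; \frac{1}{\sqrt{n}} \sum_{j=1}^{n} v_j\,\sigma\big(\langle w_j, x\rangle\big),
  \qquad v_j,\, w_j \ \text{i.i.d. Gaussian}.
\]
Under this (assumed) parametrization, $f_n$ converges as $n \to \infty$ to a Gaussian process, and a quantitative functional CLT bounds a suitable distance between $f_n$ and its Gaussian limit by a rate that improves with the regularity of $\sigma$, as described above.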
The so-called multi-mixed fractional Brownian motions (mmfBm) and multi-mixed fractional Ornstein–Uhlenbeck (mmfOU) processes are studied. These processes are constructed by superimposing, or mixing, (infinitely many) independent fractional Brownian motions (fBm) and fractional Ornstein–Uhlenbeck (fOU) processes, respectively. Their existence as $L^2$ processes is proved, and their path properties, viz. long-range and short-range dependence, Hölder continuity, $p$-variation, and conditional full support, are studied.
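As an illustration of the mixing construction, here is a minimal sketch under assumptions of our own: the weights $\sigma_k$ and Hurst indices $H_k$ below are generic symbols introduced for this example, not notation from the abstract.
\[
  X_t \;=\; \sum_{k=1}^{\infty} \sigma_k\, B^{H_k}_t, \qquad t \ge 0,
\]
where the $B^{H_k}$ are independent fBms. By independence, $\mathbb{E}[X_t^2] = \sum_{k=1}^{\infty} \sigma_k^2\, t^{2H_k}$, so $X_t$ is well defined in $L^2$ exactly when this series converges for every $t$; an mmfOU process is obtained analogously by mixing independent fOU components.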