We consider the problem of compression of stochastic processes via random Gaussian matrices, which can be seen as an asymptotic framework for the theory of compressed sensing. Jalali and Poor showed that any compression rate strictly greater than the mean information dimension is sufficient for (universal) almost lossless recovery of psi*-mixing processes, in the sense of vanishing mean square error. We show that this result is optimal, i.e., any rate below the mean information dimension is insufficient for almost lossless recovery. To this end, we introduce a new mean-dimension-type quantity, related to techniques from geometric measure theory: the correlation dimension rate, which is shown to be a lower bound for the compressed sensing rate of an arbitrary stationary stochastic process.
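To make the finite-dimensional setup concrete, the following is a minimal sketch of compressed sensing with a random Gaussian measurement matrix: a sparse vector is compressed by a Gaussian matrix with far fewer rows than columns and then recovered from the measurements. The decoder used here (orthogonal matching pursuit) and all parameter choices are illustrative assumptions, not the estimator or the asymptotic framework analyzed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 200, 60, 5  # ambient dimension, number of measurements, sparsity

# A k-sparse signal; for such models the (finite-dimensional) information
# dimension per coordinate is of order k/n.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

# Random Gaussian measurement matrix, m < n, as in the compressed sensing setup.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x  # compressed measurements

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then refit by least squares."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)
err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
print(f"relative recovery error: {err:.2e}")
```

With m well above the sparsity level, recovery here is exact up to numerical precision; the talk concerns the analogous rate threshold in the asymptotic, process-level setting.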
The talk is based on joint work with Yonatan Gutman: https://arxiv.org/abs/2507.23175