This work considers the low-rank approximation of a matrix A(t) depending on a parameter t in a compact set D. Application areas that give rise to such problems include computational statistics and dynamical systems. Randomized algorithms are an increasingly popular approach for performing low-rank approximation; they usually proceed by multiplying the matrix with random dimension reduction matrices (DRMs). Applying such algorithms directly to A(t) would involve different, independent DRMs for every t, which is not only expensive but also leads to inherently non-smooth approximations. In this work, we propose to use constant DRMs, that is, A(t) is multiplied with the same DRM for every t. The resulting parameter-dependent extensions of two popular randomized algorithms, the randomized singular value decomposition and the generalized Nyström method, are computationally attractive, especially when A(t) admits an affine linear decomposition with respect to t. We perform a probabilistic analysis of both algorithms, deriving bounds on the expected value of the approximation error as well as failure probabilities when Gaussian random DRMs are used. Both the theoretical results and the numerical experiments show that the use of constant DRMs does not impair effectiveness; our methods reliably return quasi-best low-rank approximations.
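The constant-DRM idea described above can be illustrated with a short sketch. The following is a minimal NumPy rendition of the randomized SVD applied to a parameter-dependent matrix, drawing a single Gaussian DRM and reusing it for every parameter value; the function and argument names (`A_of_t`, `ts`, `oversample`) are illustrative choices, not names from the paper, and the sketch omits refinements such as power iterations.

```python
import numpy as np

def randomized_svd_constant_drm(A_of_t, ts, n, rank, oversample=5, seed=0):
    """Return rank-(rank + oversample) approximations of A(t) for each t in ts.

    A_of_t : callable mapping a parameter t to an (m, n) ndarray A(t)
    n      : number of columns of A(t)
    A single Gaussian DRM Omega is drawn once and reused for all t,
    so the approximation depends smoothly on t whenever A(t) does.
    """
    rng = np.random.default_rng(seed)
    # Constant DRM: one Gaussian sketch matrix shared across all parameters.
    Omega = rng.standard_normal((n, rank + oversample))
    approximations = []
    for t in ts:
        At = A_of_t(t)
        # Range sketch with the constant DRM, then orthonormalize.
        Q, _ = np.linalg.qr(At @ Omega)
        # Project A(t) onto the captured range: Q Q^T A(t).
        approximations.append(Q @ (Q.T @ At))
    return approximations
```

As a quick check, for an affine linear family A(t) = A0 + t*A1 with low-rank terms, the sketch recovers A(t) essentially exactly once rank + oversample exceeds the rank of A(t).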