Reconstruction of field quantities from sparse measurements is a problem arising in a broad spectrum of applications. This task is particularly challenging when the mapping between sparse measurements and field quantities must be performed in an unsupervised manner. Further complexity is added by moving sensors and/or sensors with random on/off status. Under such conditions, the most straightforward solution is to interpolate the scattered data onto a regular grid. However, the spatial resolution achieved with this approach is ultimately limited by the mean spacing between the sparse measurements. In this work, we propose a super-resolution generative adversarial network framework to estimate field quantities from random sparse sensors. The algorithm exploits random sampling to provide incomplete views of the underlying high-resolution distributions and is referred to hereafter as the randomly seeded super-resolution generative adversarial network (RaSeedGAN). The proposed technique is tested on synthetic databases of fluid-flow simulations, ocean surface temperature distribution measurements and particle-image velocimetry data of a zero-pressure-gradient turbulent boundary layer. The results show excellent performance even in cases with high sparsity or noise levels. This generative adversarial network algorithm provides full-field high-resolution estimates from randomly seeded fields without the need for full-field high-resolution representations during training.
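As a minimal sketch of the random-seeding idea described above (not the authors' implementation; grid sizes, bin-averaging and the masked loss are illustrative assumptions), the snippet below bins scattered sensor readings onto a coarse grid to form a low-resolution input and onto a fine grid to form a sparsely seeded high-resolution target with an occupancy mask, so a reconstruction can be scored only where measurements exist:

```python
import numpy as np

# Minimal sketch (not the authors' code): prepare training pairs for a
# randomly seeded super-resolution setup from scattered sensor readings.
rng = np.random.default_rng(0)

# Synthetic "true" field sampled only at random sensor locations.
n_sensors = 500
xs, ys = rng.random(n_sensors), rng.random(n_sensors)
values = np.sin(4 * np.pi * xs) * np.cos(4 * np.pi * ys)   # stand-in field

def bin_average(x, y, v, n):
    """Average scattered samples onto an n x n grid; empty bins stay zero."""
    ix = np.minimum((x * n).astype(int), n - 1)
    iy = np.minimum((y * n).astype(int), n - 1)
    acc = np.zeros((n, n))
    cnt = np.zeros((n, n))
    np.add.at(acc, (iy, ix), v)
    np.add.at(cnt, (iy, ix), 1)
    mask = cnt > 0
    out = np.zeros((n, n))
    out[mask] = acc[mask] / cnt[mask]
    return out, mask

# Low-resolution input: coarse bins are mostly filled.
lr_input, _ = bin_average(xs, ys, values, n=16)

# High-resolution target: fine bins are only sparsely seeded; the mask
# records which pixels actually carry a measurement.
hr_target, hr_mask = bin_average(xs, ys, values, n=64)

def masked_mse(prediction, target, mask):
    """Loss evaluated only where measurements exist, so no full-field
    high-resolution ground truth is required."""
    return np.mean((prediction[mask] - target[mask]) ** 2)

# Example: score a trivial upsampled guess against the seeded target.
guess = np.kron(lr_input, np.ones((4, 4)))   # nearest-neighbour upsampling
print("masked MSE of upsampled guess:", masked_mse(guess, hr_target, hr_mask))
```

In the full framework, the generator of the adversarial network would replace the trivial upsampling above, and the masked loss would be combined with the adversarial objective during training.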