Per pixel. That is not the relevant measure; what matters is saturation at the image level.
Sensors do not exist in a vacuum. They provide information which is processed to produce the desired result. Having more pixels tends to increase image-level read noise slightly (roughly in proportion to the square root of the ratio of pixel counts, assuming the same per-pixel read noise and depending on the ADC contribution), so that component of the information is reduced. However, sampling the image with more pixels captures more information, information that the lower pixel count doesn't have. The question then is which set of input information can be used to create the better-quality output. That has no trivial answer, especially since different sets of information should be processed in different ways for optimal results. The evidence seems to point in the direction that it is usually better to have more pixels.
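The square-root scaling mentioned above can be sketched in a few lines. This is only a back-of-the-envelope illustration; the pixel counts and the per-pixel read noise figure are made up, and real sensors differ in per-pixel read noise and ADC design:

```python
import math

def image_read_noise(pixel_count, per_pixel_read_noise):
    # Read noise from independent pixels adds in quadrature,
    # so image-level read noise grows with the square root of
    # the pixel count (for equal per-pixel read noise).
    return per_pixel_read_noise * math.sqrt(pixel_count)

# Hypothetical sensors: 24 MP vs 96 MP, both with 3 e- read noise per pixel.
low = image_read_noise(24e6, 3.0)
high = image_read_noise(96e6, 3.0)
print(high / low)  # sqrt(96/24), i.e. about a 2x increase at the image level
```

Note the modest size of the penalty: quadrupling the pixel count only doubles the image-level read noise contribution, which is why the extra sampling information can outweigh it.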
Also, more pixels help to reduce aliasing.
More pixels improve resolution even with poor lenses. Today's lenses paired with even the smallest pixels (in interchangeable-lens cameras) still produce massive aliasing, so there is a lot of room left for finer sampling.
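The aliasing point can be illustrated with a one-dimensional sketch (the frequencies and sample counts here are arbitrary, chosen only to make the effect visible): a detail sampled below the Nyquist rate produces exactly the same samples as a much coarser pattern, so the two are indistinguishable after capture.

```python
import math

def sample(freq_cycles, num_samples):
    # Sample a sinusoid of freq_cycles cycles at num_samples evenly
    # spaced points over one unit interval.
    return [math.sin(2 * math.pi * freq_cycles * i / num_samples)
            for i in range(num_samples)]

# A 9-cycle detail captured with only 10 samples (below Nyquist)
# yields the same values as a 1-cycle pattern of opposite phase:
# the fine detail masquerades as coarse detail, i.e. aliasing.
coarse = sample(9, 10)
alias = sample(-1, 10)
print(all(math.isclose(a, b, abs_tol=1e-9)
          for a, b in zip(coarse, alias)))
```

Sampling the same 9-cycle detail with, say, 40 samples would separate the two signals, which is the sense in which finer pixel pitches leave "room" to reduce aliasing.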
Pixel counts have grown very slowly compared to the capacity of storage media over the last 10-20 years. The same goes for computer processing power.
I doubt anyone has made such claim.