@raja . Wow... That made me research the hell out of how the whole sensor works.
First of all... you guys are still tagging the wrong Raja.
Will give it a shot sometime. I think the contrast thing is true, since any internal reflection, light leakage, or reflection off the glass will muddle the image. Imagine a set of pixels that were supposed to get pure red light. The filter would allow this through only to the red sensors; that is the perfect condition. However, the reflections cause some amount of blue and green to creep in, giving these pixels a mixed color and screwing up what the camera interprets this color as, giving you a not-so-true red in the final image.
a) If any green or blue light creeps into a photosite that is supposed to receive only red, the colour filter in front of the photosite is supposed to block the green and blue light and allow only the red light through. If it doesn't, that is a case of a defective colour filter array. You can imagine what havoc such a defective array would cause in the vast majority of cases, where a mixture of R, G, and B light correctly reaches the array and it then filters them incorrectly.
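To make (a) concrete, here is a minimal sketch of how a Bayer colour filter array samples a scene. It is only an illustration (the RGGB layout, the 4x4 patch, and the photon counts are assumptions of mine, not any real camera's pipeline): each photosite records only the channel its own filter passes, so a perfect filter leaves nothing for green or blue to contaminate.

```python
# A minimal sketch of Bayer CFA sampling, not any camera's actual pipeline.
import numpy as np

def bayer_mosaic(scene_rgb):
    """Sample an H x W x 3 scene through an assumed RGGB Bayer pattern.

    Returns an H x W array of per-photosite counts: each photosite keeps
    only the channel its filter transmits.
    """
    h, w, _ = scene_rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = scene_rgb[0::2, 0::2, 0]  # R photosites
    mosaic[0::2, 1::2] = scene_rgb[0::2, 1::2, 1]  # G photosites
    mosaic[1::2, 0::2] = scene_rgb[1::2, 0::2, 1]  # G photosites
    mosaic[1::2, 1::2] = scene_rgb[1::2, 1::2, 2]  # B photosites
    return mosaic

# A pure red patch: with a perfect filter, R photosites read full counts
# and every G and B photosite reads zero.
scene = np.zeros((4, 4, 3))
scene[..., 0] = 1.0
print(bayer_mosaic(scene))
```

For the pure red patch, the R photosites read full counts and every G and B photosite reads zero, which is the "perfect condition" described above.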
b) Even if the filter does allow green and blue light through, the sensor still doesn't know that it has received green and blue light. It still only counts photons, and the demosaicing algorithm behind it still thinks that what it has received is pure red. What happens is that an incorrect number of photons is counted, which I think registers as a wrong Value, not a wrong Hue, i.e., the red will be brighter but still red. I admit that the issue gets very muddled here, as changes to brightness, contrast, and saturation will cause colour shifts in standard RGB colour space, which is why RawTherapee allows us to adjust these parameters in L*a*b and CIECAM02 colour space to avoid such shifts. Of course demosaicing algorithms do cause some smearing of detail (which is why the Leica M Monochrom, which skips demosaicing entirely, is supposed to be so sharp) because they are creating individual pixels from multiple photosites, but I don't know if the same applies to colour. Even if it does, that is still a demosaicing issue. And there could be a difference here between camera-generated JPEGs and raws that are separately processed in alternate colour spaces.
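As a quick sanity check on (b), here is a tiny standard-library snippet (the channel values are made up for illustration) comparing the two cases: extra photons miscounted on a red photosite raise Value but leave Hue untouched, while genuine green leakage past the filter shifts Hue.

```python
# A small arithmetic check of the claim above, using illustrative values:
# miscounted photons on a red photosite change Value, not Hue, whereas
# actual green contamination reaching the photosite shifts Hue.
import colorsys

def hsv(r, g, b):
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return f"H={h:.3f} S={s:.3f} V={v:.3f}"

print(hsv(0.6, 0.0, 0.0))  # correct red exposure: H=0
print(hsv(0.9, 0.0, 0.0))  # over-counted photons: V rises, H stays 0
print(hsv(0.6, 0.2, 0.0))  # green leakage past the filter: H shifts
```

On my reading, only the third case, i.e. light actually getting past the filter, can change Hue; miscounting alone cannot.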
But I'm getting in way over my head here. My argument assumes that colour filters are near perfect, which may not be true in reality. On the other hand, I don't know if demosaicing algorithms compensate for imperfect arrays using some tolerance level; I don't see why they shouldn't, especially as each camera model has its own colour profile, and the user can create colour profiles for each individual camera/lens/light combination if they wish to.
So the TL;DR version is: I don't know, but I'm still not convinced that the lens plays any significant role in Hue when the sensor does not see Hue. And I don't even want to touch the topic of screen and printer calibration, which may play a much bigger role in colour rendering.
The same would be true of dark spots in the image as well. You would lose out on how much difference there is between the light and dark areas, and thus get a washed-out image.
Yeah, but that is not a Hue issue. The difference between light and dark is a Contrast issue. It is the amount of light, not the wavelength, that makes the difference. That is why B&W photography is so dependent on the contrast of the images.
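A toy calculation (the luminance numbers are made up) of what I mean: a uniform veil of stray light adds the same amount to highlights and shadows, so the contrast ratio collapses without any change in wavelength, and therefore without any change in Hue.

```python
# A toy calculation with illustrative numbers of why stray light is a
# contrast problem rather than a hue problem: a uniform veil adds the
# same luminance to highlights and shadows, compressing their ratio.
highlight, shadow = 100.0, 1.0  # relative luminances
veil = 5.0                      # uniform flare / glass reflection

before = highlight / shadow                   # 100:1
after = (highlight + veil) / (shadow + veil)  # 17.5:1

print(f"contrast before veil: {before:.1f}:1")
print(f"contrast after veil:  {after:.1f}:1")
```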
And I will repeat that I'm going way beyond my knowledge zone here and trying to figure this out purely on logic, which may not be correct.