Signal-to-noise ratio vs. brightness
I have a question about the linearity of SNR; perhaps it is a dumb question. Is SNR directly proportional to the brightness of a star? For example, if I measure a mag 9.0 star with an SNR of 100, should I get an SNR of 251 for a mag 8.0 star, assuming I am working within the linearity range of the camera and under the same conditions (say, for instance, measuring a series of stars in the same frame)? Would any deviations here indicate a lack of linearity in the camera?
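For what it's worth, the 251 figure follows from the Pogson magnitude relation (a 1 mag difference is a flux ratio of 10^0.4 ≈ 2.512) combined with the assumption that SNR scales linearly with flux. A small sketch of that arithmetic, with, for comparison, the square-root scaling one would expect if photon (shot) noise dominated instead (the function name and numbers here are just illustrative):

```python
import math

def flux_ratio(delta_mag):
    """Flux ratio corresponding to a magnitude difference (Pogson relation)."""
    return 10 ** (0.4 * delta_mag)

snr_ref = 100.0              # measured SNR of the mag 9.0 star
r = flux_ratio(9.0 - 8.0)    # mag 8.0 is 1 mag brighter -> ~2.512x the flux

# Under the question's assumption that SNR is directly proportional to flux:
snr_linear = snr_ref * r              # ~251

# For comparison: in a photon-(shot-)noise-limited regime, SNR ~ sqrt(flux),
# so the same 1 mag step would give a smaller SNR gain:
snr_shot = snr_ref * math.sqrt(r)     # ~158

print(f"flux ratio per magnitude:  {r:.3f}")
print(f"SNR if linear in flux:     {snr_linear:.1f}")
print(f"SNR if shot-noise limited: {snr_shot:.1f}")
```

So which number you should expect depends on which noise source dominates, not only on whether the detector response is linear.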