One of the reasons you may be having a problem with radon sensors is that most radon sensors report results with a resolution in tenths of a pCi/L (in the US), while the statistical uncertainty in the range below 4.0 pCi/L (where most houses should be) is +/-25% or more for many common sensors. So you can have two readings of, say, 2.4 pCi/L and 1.9 pCi/L, and from the standpoint of a radon measurement specialist, these readings agree!
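To make that concrete, here's a minimal sketch of the "two readings agree" logic, assuming a flat +/-25% relative uncertainty on each reading (the function names are my own, just for illustration):

```python
# Two radon readings "agree" if their +/-25% uncertainty intervals
# overlap. The 25% figure is illustrative of consumer-grade sensors
# at low radon levels, not from any specific device's spec sheet.

def interval(reading, rel_err=0.25):
    """Return the (low, high) uncertainty interval for a reading."""
    return (reading * (1 - rel_err), reading * (1 + rel_err))

def readings_agree(a, b, rel_err=0.25):
    """True if the two readings' uncertainty intervals overlap."""
    lo_a, hi_a = interval(a, rel_err)
    lo_b, hi_b = interval(b, rel_err)
    return lo_a <= hi_b and lo_b <= hi_a

print(interval(2.4))             # (1.8, 3.0)
print(interval(1.9))             # (1.425, 2.375)
print(readings_agree(2.4, 1.9))  # True: the intervals overlap
```

2.4 pCi/L really means "somewhere between 1.8 and 3.0," and 1.9 pCi/L means "somewhere between roughly 1.4 and 2.4," so there's no contradiction between them.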
You might say, well, +/-25% sucks. You wouldn't accept a thermometer or pressure sensor which read +/-25%. And you are right, it does suck, but then try to design a radon sensor with a sensitivity of more than a few counts/hour per pCi/L but which is also affordable... you could spend $10k+ and get an Alphaguard which has a huge sensitivity and thus very little statistical uncertainty, or spend $25 on a charcoal kit which is +/-50% if you are lucky, or any of the options in between. If you buy a detector which does not explicitly list the sensitivity in the specs, you can be sure it absolutely sucks.
Because many detectors register only single counts per hour, you have to measure over many hours (usually 48) just to accumulate enough statistics to make a decent estimate of the radon level, decent in this case meaning +/-10 to 25%. Then you add other sources of error like temperature, humidity, air speed, etc. If the sensitivity is absolutely abysmal, you may need to measure for weeks or months to get enough counts to know what your radon level is to +/-10%.
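The back-of-envelope math here is just Poisson counting statistics: the relative uncertainty on N total counts is roughly 1/sqrt(N). A sketch, using assumed illustrative numbers (1 count/hour per pCi/L at a true level of 2 pCi/L, not any particular device):

```python
import math

# Poisson counting statistics for a radon detector. With N total
# counts recorded, the statistical relative uncertainty is ~1/sqrt(N),
# so low-sensitivity detectors need long integration times.

def relative_uncertainty(radon_pci_l, sensitivity_cph_per_pci_l, hours):
    """Approximate statistical relative error after `hours` of counting."""
    n_counts = radon_pci_l * sensitivity_cph_per_pci_l * hours
    return 1 / math.sqrt(n_counts)

# 48-hour test at 2 pCi/L with 1 count/hr/pCi/L: ~96 counts
print(relative_uncertainty(2.0, 1.0, 48))  # ~0.10, i.e. +/-10%

# A 4-hour test with the same detector: ~8 counts, +/-35%, useless
print(relative_uncertainty(2.0, 1.0, 4))   # ~0.35
```

This is why the usual 48-hour minimum exists, and why a tenfold-more-sensitive (and more expensive) detector can get the same uncertainty in a fraction of the time.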
Often the best bang for your buck in radon measurements is to hire someone who has a really nice radon measurement system for a short-term measurement. However, if your radon level changes over time, then you only get a good snapshot of what's going on right then. For most homes this isn't a problem, but occasionally you run into a situation like one guy I remember whose well system had radon-saturated water, so his radon was fine except when the shower was on.
It is an interesting thought to contemplate how one would disclose lightly radioactive water to a potential future purchaser of the home. I got curious and looked it up, apparently you can purchase an aeration system for $3k-6k to take the radon out of the water.