Not sure what you mean by calibration here. The above is not so much about calibration as about correlation: basically comparing the monitors to the reference data.
Sorry, should've been "how". My understanding is that the optical sensors work by recalibrating every week or so, assuming that the lowest CO2 level they've seen is equivalent to the baseline "outdoor" ppm that was set in the factory (which can be adjusted over time to account for climate change).
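Roughly, the automatic baseline behaviour I'm picturing is something like this (the weekly window and 400 ppm baseline are my assumptions, not any particular vendor's numbers):

```python
OUTDOOR_BASELINE_PPM = 400   # assumed factory-set "fresh air" value

def abc_offset(readings_last_week: list[float]) -> float:
    """Offset recomputed after each ~weekly window.

    The sensor assumes the lowest reading it saw during the window was
    really outdoor air, and shifts everything so that minimum maps to
    the factory baseline.
    """
    window_min = min(readings_last_week)
    return window_min - OUTDOOR_BASELINE_PPM

def corrected(raw_ppm: float, offset: float) -> float:
    return raw_ppm - offset

# If the room never drops below ~700 ppm, min(window) is ~700, so the
# sensor "corrects" a true 700 ppm down to 400 ppm -- the failure mode
# I'm asking about below.
```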
If the sensors are in an environment where the CO2 is always elevated, how do you keep them properly calibrated? E.g. if the CO2 never goes under 700 ppm, how do you stop them from recalibrating so that they return "400" when they should actually return "700"? I know you can just turn automatic calibration off, but won't the sensors' accuracy drift over time?
If you’re using reference-grade instruments, I would assume they’re calibrated in labs under well-controlled conditions.
I.e. put the sensor in a vacuum chamber, remove all gases, then introduce known amounts of gases to produce a known target environment.
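For example, a simple two-point (zero/span) fit against such known concentrations might look like this; the numbers are purely illustrative:

```python
def fit_zero_span(raw_zero: float, raw_span: float,
                  true_zero: float, true_span: float) -> tuple[float, float]:
    """Fit gain/offset so that corrected = gain * raw + offset."""
    gain = (true_span - true_zero) / (raw_span - raw_zero)
    offset = true_zero - gain * raw_zero
    return gain, offset

# Illustrative numbers: sensor reads 12 ppm in CO2-free gas and 1015 ppm
# in a certified 1000 ppm mixture.
gain, offset = fit_zero_span(12.0, 1015.0, 0.0, 1000.0)
later_reading_corrected = gain * 712.0 + offset
```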
Given that they’re reference instruments, I assume they’re also capable of holding their calibration for a long, known period of time before they need to be re-calibrated. They would never rely on something as inaccurate as “20 mins outdoors” to calibrate themselves.