The naming is indeed a bit confusing now that I look at it.
sysData[M]['OD0']['raw'] – the raw data that comes from the sensor.
sysData[M]['OD0']['target'] – the OD blank value.
sysData[M]['OD']['current'] – the calibrated value of OD.
The confusion is that the 'OD0' entry in the dict holds all the "raw" values, e.g. the blank value and also the measurement in raw form, while the 'OD' entry holds values in OD units (i.e. no raw values).
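To make the layout concrete, here is a minimal sketch of the structure described above. The field names follow the post; the module key `'M0'` and the numeric values are made up for illustration.

```python
M = 'M0'  # device/module key (hypothetical)

sysData = {
    M: {
        'OD0': {
            'raw': 35000.0,     # latest raw sensor reading
            'target': 40000.0,  # raw reading of the blank (OD zero point)
        },
        'OD': {
            'current': 0.058,   # calibrated OD, in OD units
        },
    },
}
```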
The best way to understand it might be to put a bunch of print() statements in the code at the points you want to look at and watch how the values change; this should make it easier to work backward.
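For example, a small helper like this (hypothetical, not part of the codebase) could be called wherever a measurement is processed, to see how the three values change together:

```python
def debug_od(sysData, M):
    # Print the raw reading, the blank, and the calibrated OD side by side
    # so you can watch how they change between measurements.
    print('raw    =', sysData[M]['OD0']['raw'])
    print('target =', sysData[M]['OD0']['target'])
    print('OD     =', sysData[M]['OD']['current'])
```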
Currently the system doesn't record the raw OD values (sysData[M]['OD0']['raw']) in the database over time, so you need to back-calculate them from your readings, shift the sysData[M]['OD0']['target'] value, and then calculate your new adjusted OD values.
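The steps above could be sketched as follows. Note the assumption: this uses a simple absorbance-style relation, OD = log10(target / raw), as a stand-in; the actual calibration in the code may differ, so check against the real conversion before relying on it.

```python
import math

def raw_from_od(od, target):
    # Invert the assumed relation OD = log10(target / raw).
    # This is a sketch; the real calibration may be different.
    return target / (10 ** od)

def od_from_raw(raw, target):
    # Assumed forward relation: OD = log10(target / raw).
    return math.log10(target / raw)

# Example values (made up): back-calculate a raw reading from a
# recorded OD, then recompute OD against a shifted blank value.
old_target = 40000.0
recorded_od = 0.30
raw = raw_from_od(recorded_od, old_target)   # back-calculated raw value
new_target = 42000.0                         # shifted blank
adjusted_od = od_from_raw(raw, new_target)   # new adjusted OD
```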