OD Calibration – “Normalising” between experiments

    #1687
    Kate
    Participant

    Hi Guys,

    Our Chi.Bio sadly crashed partway through an experiment. We were able to get it up and running again quite quickly; however, we had to ‘reset’ the UI and seem to have made an error entering the values for OD calibration. In particular, for M0 our OD zero setpoint was 13,098 for the first experiment and 8,312 for the second, and the recorded OD values for experiment 2 do seem lower than we would expect.

    I read in your manuscript's supplementary materials about the absorption equations for converting between a raw measured value (R) and an initial (logarithmic) OD value (ODi). The equation involves an OD0 value and an R value. Which of these (if either) corresponds to the 'od_zero_setpoint' recorded in the output files? And what does the other correspond to?

    The reason I ask is that I’m trying to think of a way to ‘scale’ our second experiment’s OD by the first experiment’s zero_setpoint. Any thoughts you have on this would be greatly appreciated!

    #1688
    harrison
    Keymaster

    Hello Kate,
    It should be OD0. You can look inside the code to see how this is done; what you need is lines 1572 to 1576: https://github.com/HarrisonSteel/ChiBio/blob/master/app.py
    And yes, I think it would be fine to just re-scale the final readings (i.e. as if a different OD set point had been used) to join up the growth curves. Ultimately the device has made the same raw measurements, just scaled them in a different way.
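
    For instance, if the relation is the simple logarithmic one from the supplementary, OD = log10(OD0/R), then changing the zero setpoint just shifts every OD reading by a constant. Here is a rough sketch with your two setpoints (this assumes that relation holds exactly; if your version of app.py applies a further calibration curve on top of it, the offset is only approximate):

        import math

        OD0_EXP1 = 13098.0  # zero setpoint used in experiment 1
        OD0_EXP2 = 8312.0   # zero setpoint mistakenly used in experiment 2

        # log10(OD0_EXP1 / R) = log10(OD0_EXP2 / R) + log10(OD0_EXP1 / OD0_EXP2),
        # so the correction is a constant additive offset (~0.198 here):
        offset = math.log10(OD0_EXP1 / OD0_EXP2)

        def rescale_od(od_exp2):
            """Shift an experiment-2 OD reading onto experiment 1's scale."""
            return od_exp2 + offset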

    #1689
    Kate
    Participant

    Ok awesome, thanks. I’m still unclear on one thing: once the calibration has been done to provide an initial OD (which seems to be lines 1572–76), which parameter in the code captures subsequent raw measurements, and how are these scaled using the calibration to create the final OD readings that are saved in the output file?

    #1690
    harrison
    Keymaster

    The naming is indeed a bit confusing now that I look at it.
    sysData[M]['OD0']['raw'] – the raw data that comes from the sensor
    sysData[M]['OD0']['target'] – the OD blank value (the zero setpoint)
    sysData[M]['OD']['current'] – the calibrated OD value

    The confusing part is that the 'OD0' entry in the dict holds all the values in raw units, e.g. the blank value and also the measurement in raw form. The 'OD' entry in the dict then holds values in OD units (i.e. no raw values).
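
    To make that concrete, here is a rough illustration (my own sketch, not the actual code from app.py) of how I'd expect the three fields to fit together, again assuming the simple log10 relation:

        import math

        sysData = {'M0': {'OD0': {'raw': 0.0, 'target': 13098.0},
                          'OD':  {'current': 0.0}}}

        def update_od(M, raw_reading):
            sysData[M]['OD0']['raw'] = raw_reading   # latest raw sensor value
            blank = sysData[M]['OD0']['target']      # the OD blank / zero setpoint
            # Calibrated OD, assuming OD = log10(blank / raw):
            sysData[M]['OD']['current'] = math.log10(blank / raw_reading)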

    The best way to understand it might be to put a bunch of print() statements in the code at the values you want to look at and watch how they change; this should make it easier to work backward.
    Currently the system doesn't record the raw OD values (sysData[M]['OD0']['raw']) in the database over time, so you need to back-calculate these from your readings, then shift the sysData[M]['OD0']['target'] value, and then calculate your new adjusted OD values.
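
    Something like the following sketch, again assuming the log10 relation (adapt it to whatever form lines 1572-1576 of your app.py actually use; experiment2_ods below just stands for the list of OD values you've read out of your experiment-2 output file):

        import math

        OLD_TARGET = 8312.0   # sysData[M]['OD0']['target'] used in experiment 2
        NEW_TARGET = 13098.0  # the target from experiment 1 that you want to use

        def readjust_od(od_recorded):
            # 1) Back-calculate the raw reading by inverting OD = log10(target / raw):
            raw = OLD_TARGET / (10.0 ** od_recorded)
            # 2) Recompute the OD with the corrected target:
            return math.log10(NEW_TARGET / raw)

        # adjusted = [readjust_od(od) for od in experiment2_ods]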
