I would like to be able to determine the configured scale range of an I/O point programmatically, so I can compare the current reading against it and make sure we are not maxing out an I/O point. For example, if I have a 4-20 mA input scaled 0 - 100 psi, I want to know when the reading is approaching the top of the scale (100) so I can set a flag that it is nearing over-range. For that matter, I could do the same comparison against the raw mA value and the 20 mA limit instead; however, I don't know whether either of those values can be retrieved programmatically.
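For concreteness, the comparison I'm describing would look roughly like this (a sketch in Python; the hard-coded scale limits and the threshold are placeholders, since reading the limits programmatically is exactly the part I'm missing):

```python
# Sketch of the near-over-range check. The scale limits (0-100 psi here)
# are hard-coded placeholders; the goal is to read them from the I/O
# point's configuration instead.

NEAR_OVERRANGE_FRACTION = 0.95  # flag when within the top 5% of the span

def near_overrange(value, scale_min, scale_max,
                   fraction=NEAR_OVERRANGE_FRACTION):
    """Return True when 'value' is in the top (1 - fraction) of the span."""
    span = scale_max - scale_min
    return (value - scale_min) >= fraction * span

# Example: a 0-100 psi input reading 96 psi trips the flag; 80 psi does not.
print(near_overrange(96.0, 0.0, 100.0))  # True
print(near_overrange(80.0, 0.0, 100.0))  # False
```

The same function would work on the raw signal by passing 4.0 and 20.0 as the limits.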
I know this could be done with hard-coded values, but if we ever change the scaled range of the I/O, we would then have to go manually update that value everywhere it is used, and so on.
Is there some way to do this that I’m just not seeing at this point?