When you’re given a function as observed, the data will exhibit discontinuity, and so the question becomes: is the discontinuity the result of observation, or of the underlying function itself? And in either case, how can you measure that, given your observed data, which is likely all you have to work with? It just dawned on me that my paper, “Sorting, Information, and Recursion”, seems to address exactly this topic. Specifically, Equation (2) increases as the distance between the terms in a sequence increases. As a result, what you can do is first test the data using Equation (2) as is, without sorting it: take the differences between adjacent range values in their observed order, producing a vector of gaps, and calculate the value of Equation (2) using that vector. Then sort the range values, and repeat the calculation. The degree to which the value of Equation (2) changes in the latter case is a measure of how continuous your data is, because continuous data will have small gaps in its range values, and will, as a consequence, be locally sorted in some order. Note that you should use the variant of Equation (2) I presented in Footnote 5, because for a continuous function, the distance between adjacent range values will likely be less than 1, and if you do that, then tighter and tighter observations will cause the value of Equation (2) to converge.
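Here’s a minimal sketch of the test in Python. Since Equation (2) and its Footnote 5 variant are defined in the paper rather than reproduced here, the `information_measure` function below is only a hypothetical stand-in (a sum of log-scaled gap sizes, chosen because it also grows with the distance between adjacent terms and stays well-behaved for gaps below 1); in practice you would substitute the actual equations. The example inputs (a densely sampled sine wave and uniform noise) are likewise just illustrative.

```python
import numpy as np

def information_measure(gaps):
    # Hypothetical stand-in for Equation (2) (and its Footnote 5 variant):
    # any measure that increases with the distance between adjacent terms
    # works for illustrating the test. Here: sum of log(1 + |gap|), which
    # remains well-behaved when gaps are smaller than 1.
    return np.sum(np.log(1.0 + np.abs(gaps)))

def continuity_test(y):
    # y: observed range values, in the order they were observed.
    y = np.asarray(y, dtype=float)

    # Step 1: differences between adjacent range values, as observed.
    unsorted_score = information_measure(np.diff(y))

    # Step 2: sort the range values and repeat the calculation.
    sorted_score = information_measure(np.diff(np.sort(y)))

    # The change between the two scores is the proposed measure of
    # continuity: continuous data is already locally sorted with small
    # gaps, so sorting should change the score very little.
    return unsorted_score, sorted_score, unsorted_score - sorted_score

# Illustrative inputs: a densely sampled continuous function vs. noise.
x = np.linspace(0, 2 * np.pi, 1000)
print(continuity_test(np.sin(x)))            # small change after sorting
print(continuity_test(np.random.rand(1000))) # much larger change
```

Continuous, finely sampled data should produce only a small change between the unsorted and sorted scores, whereas noisy or genuinely discontinuous data should produce a much larger one.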