# Information, Length, and Volume


I’ve written about this topic a few times, and on review the articulation was a bit sloppy, so I thought I’d restate it more formally. The basic idea is that each unit of length has some capacity for storage, and this is physically true: you can, for example, take a string, subdivide its length into equal intervals with markings, and then simply place an object upon one of the markings.

This is a unique state of the system, and placing the object upon any other such marking defines a different unique state of the system.

If there are $N$ such markings, then the system can be in $N$ states, and therefore store $\log(N)$ bits of information. This system is equivalent to a binary string of length $N$ in which exactly one bit can be flipped on at a time (i.e., a one-hot encoding). We can generalize the connection between length and information by assuming that the length is divided into $N$ segments, each of which can be in $K$ states. To continue with the physical intuition, this could be done by assigning up to $K$ objects to each of the $N$ segments along the length, where the number of objects placed upon a given segment determines its state. For example, if you have 2 pebbles upon a given segment along the string, that would be the second state of that segment. This generalization associates a given length with a $K$-ary string of length $N$, which can be in $K^N$ states, and so store $N\log(K)$ bits.
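To make the counting concrete, here’s a small Python sketch of the pebble setup above, using hypothetical values for $N$ and $K$, that computes the state count and the resulting bit capacity:

```python
from math import log2

N = 8  # number of segments along the length (hypothetical value)
K = 3  # states per segment, e.g. 0, 1, or 2 pebbles (hypothetical value)

# A K-ary string of length N can be in K**N distinct states,
# so it stores log2(K**N) = N * log2(K) bits.
num_states = K ** N
bits = N * log2(K)

print(num_states)      # 6561
print(round(bits, 3))  # 12.68
```

Setting $K = 2$ recovers the familiar case of a plain binary string storing $N$ bits.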

We can set a variable $n$ with units of $K$-ary switches per unit length, so that given a length $l$, we have $N = nl$. The number of bits that can be stored along the length is therefore $I = \log(K^N) = \log(K^{nl}) = nl\log(K)$. Note that $K$ and $n$ are constants independent of $l$, and as a result, the information content associated with a given length $l$ is $O(l)$. We can generalize this to volume, where $n$ would instead have units of $K$-ary switches per unit volume, from which it follows that the information content associated with a given volume $V$ is $O(V)$.
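As a sketch of this scaling, assuming hypothetical values for the switch density $n$ and the state count $K$, the bit capacity $I = nl\log(K)$ grows in direct proportion to $l$:

```python
from math import log2

K = 4     # states per switch (hypothetical value)
n = 10.0  # switch density: K-ary switches per unit length (hypothetical value)

def bits_for_length(l):
    # N = n * l switches, each storing log2(K) bits, so I = n * l * log2(K).
    return n * l * log2(K)

# Doubling the length doubles the capacity: I is linear in l.
print(bits_for_length(1.0))  # 20.0
print(bits_for_length(2.0))  # 40.0
```

The same code models the volume case by reading `n` as switches per unit volume and `l` as a volume $V$.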

This demonstrates a proportional relationship between substance and information. For a more exhaustive treatment of this topic, my first real paper on physics implies an actual equation relating energy and information (see Equation 10), and the two are again proportional in that case. What the work above shows is that, as a practical matter, the same proportional relationship holds at the macroscopic scale, since my paper implies that the information content of a system is $O(E)$, where $E$ is the total energy of the system.