Does anyone know of any good resources on Kolmogorov complexity and fractals such as the Mandelbrot set? Or even on information theory and fractals?
For some reason reading this article is making me wonder about the difference between the information required to generate something like the Mandelbrot set, given the underlying rule, and the information required to represent it as-is, without following the rule. Or, e.g., the difference between the information in the generating rule and the information implicitly represented by the time or number of operations needed to generate it.
It seems like there's some analogy between potential and kinetic energy, and Kolmogorov complexity and something else, that I'm having trouble putting my finger on. Even if you have a simple generating algorithm that might be small in a Kolmogorov complexity sense, if that algorithm entails repeating something over a large number of operations, the resulting object would be complex, so there's an implied total complexity as well as a "generating" one.
Maybe this is some basic computational complexity concept, but if so I'm not recalling it, or am being dense. E.g., I'm used to discussions of "compressibility", but not of "generating-representation information cost" versus "execution cost".
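To make the contrast concrete, here's a minimal sketch in plain Python (the grid size, viewport, and iteration cap are my arbitrary choices, nothing canonical): the generating rule is a few hundred bytes of source, while both the output and the total operation count grow with resolution and iteration depth.

```python
# Minimal Mandelbrot membership test: the "generating rule" is tiny,
# but the work to materialize the set grows with resolution and depth.

def mandelbrot(width=200, height=200, max_iter=100):
    ops = 0
    bitmap = []
    for j in range(height):
        row = []
        for i in range(width):
            # Map pixel (i, j) to a point c in the complex plane.
            c = complex(-2.0 + 3.0 * i / width, -1.5 + 3.0 * j / height)
            z = 0j
            n = 0
            while abs(z) <= 2 and n < max_iter:
                z = z * z + c  # the entire "rule" is this one line
                n += 1
                ops += 1
            row.append(n == max_iter)  # True if c appears to be in the set
        bitmap.append(row)
    return bitmap, ops

bitmap, ops = mandelbrot()
print("output bits:        ", len(bitmap) * len(bitmap[0]))  # the object described
print("iterations executed:", ops)                           # the "execution cost"
```

The source stays constant while the output and the iteration count scale with the parameters, which is roughly the "small generating representation, large execution cost" split being asked about.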
I think you're forgetting that there's no definitive way to represent something, compressed or raw. So while it's interesting that we can compress data, sometimes rather efficiently, I'm not sure it maps to anything physical beyond the fact that decompressing data creates entropy.
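To illustrate that point, a small sketch (standard-library Python; zlib and the toy string are arbitrary choices on my part): the "size" of the same object differs by orders of magnitude depending on which representation you pick, and none of them is the definitive one.

```python
import zlib

# Serialize a highly regular string three ways; none of these sizes is
# "the" information content, each is relative to a chosen representation.
data = ("0" * 999 + "1") * 1000  # very regular, so it compresses well
raw = data.encode()

print("raw bytes:       ", len(raw))
print("zlib-compressed: ", len(zlib.compress(raw, 9)))

# A generating program is a third representation, far shorter than both:
program = '("0" * 999 + "1") * 1000'
print("generator source:", len(program))
```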
>A great number of methods and of accounts of rationality consider at their foundations some form of Bayesian inference. Yet, Bayes’ rule, because it relies upon probability theory, requires specific axioms to hold (e.g. a measurable space of events). This short document hypothesizes that Bayes’ rule can be seen as a specific instance of a more general inferential template, that can be expressed also in terms of algorithmic complexities, namely through the measure of unexpectedness proposed by Simplicity Theory.
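For context, the unexpectedness measure referenced there is, if I'm recalling Dessalles' definition correctly (this is from memory, not from the quoted paper), the gap between two complexities:

```latex
% Simplicity Theory's unexpectedness, as I recall it:
% generation complexity C_w minus description complexity C.
\[
  U(s) = C_w(s) - C(s)
\]
```

which lines up with the generation-cost-versus-description-cost distinction being asked about above: an event is unexpected when it is much easier to describe than it was to generate.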