This is very interesting. I've been chasing novel universal Turing machine substrates. Collecting them like Pokémon for genetic programming experiments. I've played around with CAs before - rule 30/110/etc. - but this is a much more compelling take. I never thought to model the kernel like a digital logic circuit.
The constraints of Boolean logic, gates, and circuits seem to give the fitness landscape an interesting grain to build on. The resulting parameters can be mapped directly to hardware implementations, or passed through further optimization phases and compiled into trivial programs. That seems better than wrestling with magic floating-point values inside billion-parameter black boxes.
Yeah, this paper feels profoundly important to me. Making the automata differentiable means you can use backpropagation to optimize Boolean circuit designs and learn complex discrete system behaviors. That's phenomenal.
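To make the idea concrete, here's a minimal from-scratch sketch (my own toy construction, not the paper's method): relax each gate to a softmax mixture over a few candidate Boolean ops, use the standard real-valued relaxations (AND → a·b, OR → a+b−ab, etc.), and let gradient descent pick the gate that matches an XOR truth table. The op list, loss, and learning rate are all illustrative choices.

```python
import numpy as np

# Soft (real-valued) relaxations of Boolean gates: with inputs in [0,1]
# read as probabilities, these agree with the hard gates at {0,1}.
OPS = [
    ("AND",  lambda a, b: a * b),
    ("OR",   lambda a, b: a + b - a * b),
    ("XOR",  lambda a, b: a + b - 2 * a * b),
    ("NAND", lambda a, b: 1 - a * b),
]

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# One learnable gate: a softmax mixture over the candidate ops.
# Training shifts probability mass; inference hardens to the argmax.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=len(OPS))

A = np.array([0., 0., 1., 1.])
B = np.array([0., 1., 0., 1.])
target = np.array([0., 1., 1., 0.])   # XOR truth table

lr = 1.0
for step in range(2000):
    p = softmax(w)
    outs = np.stack([f(A, B) for _, f in OPS])  # (n_ops, 4 rows)
    y = p @ outs                                # mixture output
    err = y - target                            # dMSE/dy up to a constant
    g_p = outs @ err                            # gradient w.r.t. p
    grad_w = p * (g_p - p @ g_p)                # chain rule through softmax
    w -= lr * grad_w

print(OPS[int(np.argmax(w))][0])  # → XOR
```

The same trick scales to layered circuits: every gate carries its own logits, and the whole network stays end-to-end differentiable until you harden it.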
Check out difflogic: differentiable neural-net logic circuits that can be compiled to CUDA or C code. Their prototypical demo is an MNIST classifier that runs at >1M images/sec on CPU!
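The "compile to C" step is conceptually simple once training is done: each soft gate hardens to its argmax op, and the network collapses into pure Boolean expressions. A hypothetical sketch of that emission step (my own toy emitter, not difflogic's actual API; the gate names and circuit format are invented for illustration):

```python
# Map hardened gate names to C-style bitwise expressions over {0,1}.
C_OPS = {"AND": "({a} & {b})", "OR": "({a} | {b})",
         "XOR": "({a} ^ {b})", "NAND": "(~({a} & {b}) & 1)"}

def emit_c(circuit):
    """circuit: list of (op_name, input_a, input_b) triples, where inputs
    name either a variable ("x", "y") or an earlier gate ("g0", "g1", ...).
    Returns a single C expression for the last gate's output."""
    exprs = {}
    for i, (op, a, b) in enumerate(circuit):
        exprs[f"g{i}"] = C_OPS[op].format(a=exprs.get(a, a), b=exprs.get(b, b))
    return exprs[f"g{len(circuit) - 1}"]

# XOR built from the universal NAND basis, as a trained net might emit it:
circuit = [("NAND", "x", "y"), ("NAND", "x", "g0"),
           ("NAND", "y", "g0"), ("NAND", "g1", "g2")]
print(emit_c(circuit))
```

Everything after hardening is branch-free integer bitwise ops, which is exactly why a C compiler (or an FPGA toolchain) can chew through it so fast.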