Perfect unusable code: or how to model code and distributivity

So let's talk about what deterministic and non-deterministic code really are.

I am gonna prove that you can achieve a nearly chaotic series of states with deterministic code \o/

Definitions:

Deterministic: code is deterministic if the same input always yields the same output.

Chaotic: a time series of values is considered chaotic if knowing the last n samples (up to time t) does not enable you to predict the term at t+1.

Turing Machine: a computer that is worth no more than a cassette player.

Complex system: a set of simple deterministic objects connected together that can result in non-deterministic behavior.

Lambda function: a stateless function, with no internal state.

FSM (finite state machine): something necessary in electronics because time is relativistic (Einstein).

Mapping: a mathematical/computing operation that describes a projection from a discrete input dimension A to a discrete output dimension B.


Now let's play real-life Turing machine.

Imagine I give you an old K7 (cassette) player with a 30-minute tape, and at every minute mark I recorded the result of n x 3.
If you go to minute 3 the tape says 9.
If you go to minute 5 you will hear 15.

This is the most stupid computer you can have. 
My tape is a program. The index (in minutes) is the input, and the output is what is said.

So let's do a Python version (a sketch follows). Basically we did a mapping from the index on the tape (in minutes) to the integers, yielding index (in minutes) x 3.
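Something like this minimal sketch (the names `tape` and `play` are mine, just for illustration):

```python
# The "program" is the tape itself: every position holds a precomputed answer.
tape = [minute * 3 for minute in range(30)]   # a 30-minute tape of n x 3

def play(minute):
    """Rewind to a position and read what is recorded there."""
    return tape[minute]

print(play(3))   # 9
print(play(5))   # 15
```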



So what do we learn with this?

That I can turn code into a Turing machine (a tape), and use the tape as code, with a 1:1 relationship: I have a ... mapping \o/

What does compile do?
It evaluates, for every possible input (an integer in [0:255]), the output of a boolean function. It is a projection of 2^8 inputs => 2 outputs.
I have projected a discrete space of inputs onto a discrete space of outputs.
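The compile snippet itself is not reproduced here, so here is a minimal sketch of what such a function could look like (`compile_to_int` and `lookup` are names I made up):

```python
def compile_to_int(f, size=256):
    """Evaluate the boolean function f on every input in [0, size) and
    pack the 0/1 results into a single integer: bit i holds f(i)."""
    return sum(int(bool(f(i))) << i for i in range(size))

def lookup(code, x):
    """Running the 'compiled' function is just reading bit x of the integer."""
    return (code >> x) & 1

div2 = compile_to_int(lambda n: n % 2 == 0)
print(lookup(div2, 8))   # 1: 8 is divisible by 2
print(lookup(div2, 9))   # 0
```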

Let's see why it is great

My code is fully deterministic and threadsafe, because it is stateless.

It is an index of all 256 values of f(x), one for every possible input.

If I encode a function that tells if a number can be divided by X, and another one for Y, then to get the function that tells if a number can be divided by (X * Y) I just have to apply & (the bitwise and operator) to the ints representing the two codes (this works when X and Y are coprime, as 2 and 3 are).

An int is a very cool storage for a function.
With div2 / div3 I can, by applying all the «common bitwise operators», create a lot of interesting functions (see the sketch after this list):

div2xor3: a code that indicates numbers that can be divided by 2 or 3 but not by 6
not div2: every odd number
div2or3: multiples of 2 or 3 (which includes the multiples of 6)
div2and3: multiples of 6 only
....
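Assuming the compile_to_int/lookup sketch from above, those combinations could read like this:

```python
div2 = compile_to_int(lambda n: n % 2 == 0)
div3 = compile_to_int(lambda n: n % 3 == 0)

mask = (1 << 256) - 1        # keeps the NOT within our 256 possible inputs

div2and3 = div2 & div3       # multiples of 6 only
div2or3  = div2 | div3       # multiples of 2 or 3 (hence also of 6)
div2xor3 = div2 ^ div3       # divisible by 2 or 3 but not by 6
notdiv2  = ~div2 & mask      # odd numbers

print(lookup(div2and3, 12))  # 1
print(lookup(div2xor3, 6))   # 0
print(lookup(notdiv2, 7))    # 1
```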

I can combine the 16 blitter operations (the 16 possible two-input boolean operators) to directly obtain new functions.

In functional programming you build partial functions that you apply in a pipe of execution; here you can directly combine the code at the «implementation level».


My evaluation always takes the same number of cycles, I don't have to worry about the worst case, and my code will never suffer from indeterminism (neither in execution time nor in results). My code is ultimately threadsafe as long as my code storage and my inputs are immutable.


My functions are commutative, thus I can distribute them.

div2(div3(val)) == div3(div2(val)) (== div6(val))

=> combining functions is a simple AND of the codes

Why don't we use that in real life?

First there is a big problem of size.

To store all the results for all the possible inputs, I have to allocate (number of possible inputs) x (size of one output).

A simple multiplication-by-3 table for all the 32-bit integers would be an array of 4 billion words of 32 bits: 2^32 entries x 4 bytes = 16 GB!
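The back-of-the-envelope arithmetic, if you want to check it:

```python
entries = 2 ** 32                            # one slot per possible 32-bit input
bytes_per_entry = 4                          # each result is itself a 32-bit word
print(entries * bytes_per_entry / 2 ** 30)   # 16.0 -> 16 GB of storage
```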

Not very efficient.

But if we work on a torus of discrete values, it can work :)

Imagine my FPU is slow and I need cos(x) with an error margin such that a resolution of 1/256 of the circle is enough? I can store my results as an array of 256 precomputed cosine values, indexed modulo 256 :)
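A sketch of that lookup table, assuming the circle is cut into 256 steps (the names are mine):

```python
import math

# Precompute cos over a circle split into 256 steps: index i stands for
# the angle 2*pi * i / 256.  Indexing modulo 256 wraps around the torus.
COS_TABLE = [math.cos(2 * math.pi * i / 256) for i in range(256)]

def fast_cos(i):
    """cos for an angle expressed as i/256th of a turn."""
    return COS_TABLE[i % 256]

print(fast_cos(64))    # ~0.0, a quarter turn
print(fast_cos(320))   # same as fast_cos(64): the input space wraps around
```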

A cache with memoization also uses the same principle:
you replace long-running code by a lookup in a table.

It might be a little more evolved than reading a bit in an integer, but it is globally the same principle.
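In Python, for instance, memoization is one decorator away (a toy example, not from the original post):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def slow_divisible_by_3(n):
    # Pretend this is expensive: the first call computes the answer,
    # every later call with the same n is a plain table lookup.
    return n % 3 == 0

slow_divisible_by_3(42)   # computed
slow_divisible_by_3(42)   # served from the cache
```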

So actually, that is one of the biggest uses of the Turing machine: efficient caching of precomputed values.

Another flaw is that the mapping makes you lose information about what the developer meant.

If you just have the integer representing your code, more than one function can yield the same code. The mapping from the space of possible functions to the space of solutions is a surjection.

Thus if you have a bug in this code, you cannot revert back to the algorithm and fix it.

If I consider that my input is not a number of n bits but n inputs of 1 bit constituting my state input vector, and that the output is my internal state, then I am modeling a node of a parallel computer. This «code» can be wired (at a cost of a few clock cycles) as a muxer that is deterministic in its execution time and dazzlingly fast.
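A toy version of such a node, reusing the bit-packed truth-table idea (the 3-input majority rule below is just an example I picked):

```python
def next_state(code, input_bits):
    """Use the n 1-bit inputs as an address into the truth table 'code':
    bit k of 'code' holds the output for the input vector spelling k."""
    address = 0
    for bit in input_bits:          # pack the 1-bit inputs into an index
        address = (address << 1) | bit
    return (code >> address) & 1

# "Compile" a 3-input majority rule by enumerating all 8 input vectors.
majority = sum((((k >> 2) & 1) + ((k >> 1) & 1) + (k & 1) >= 2) << k
               for k in range(8))

print(next_state(majority, [1, 0, 1]))   # 1: two of the three inputs are set
```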


What is the use of this anyway?

Well, it models deterministic code.

I can generate random codes and see how they interact.

Conway's Game of Life is a set of such Turing machines interconnected in a massively parallel fashion.

So my next step is to prove I can generate pure random numbers with totally deterministic code.

And I can tell you I can prove that the condition for my modified Game of Life to yield chaotic-like results is that the level of similarity between the codes of the automata is low (the entropy of the patterns is high) AND that roughly 50% of the bits of each code are set to 0/1 (maximizing the entropy of the code in terms of bit ratio).