We evaluated the following measures for their correlation with the rate of evolution
measured in experimental runs. From top to bottom: multivariate mutual information
(slope 0.169 ± 0.011, intercept 0.302 ± 0.031), multivariate conditional mutual information
(slope 0.181 ± 0.012, intercept 0.120 ± 0.037), entropy, mutual information of time-shifted
environments, average pairwise Pearson correlation, and average Kullback-Leibler divergence.
Information-theoretic metrics were calculated from the probability distributions (using
64 bins) of the possible values of the two signals S_{1} and S_{2} and the nutrient signal. The nutrient signal was taken either as a time-delayed function
of the input signals, N (see Figure 3), or as the nutrient signal shifted by 500 time steps, N_{shifted} (i.e., with the time delay relative to the input signals eliminated). Information-theoretic
metrics were calculated as: MI(S_{1}; S_{2}; N) = H(S_{1}) + H(S_{2}) + H(N) - H(S_{1}, S_{2}) - H(S_{1}, N) - H(S_{2}, N) + H(S_{1}, S_{2}, N) and MI(S_{1}; S_{2} | N) = MI(S_{1}; S_{2}) - MI(S_{1}; S_{2}; N); joint entropy is defined as H(X_{1},...,X_{n}) = -\sum_{x_{1}} \cdots \sum_{x_{n}} p(x_{1},...,x_{n}) \log p(x_{1},...,x_{n}) and mutual information is defined as MI(X; Y) = \sum_{x} \sum_{y} p(x, y) \log \frac{p(x, y)}{p(x) p(y)}, where p(x_{1},...,x_{n}) and p_{i}(x) are the joint and marginal probability distribution functions, respectively. Kullback-Leibler
divergence is calculated as D_{KL}(S \| N) = \sum_{x} S(x) \log \frac{S(x)}{N(x)} for probability distributions S and N.
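The binned estimators above can be sketched as follows. This is a minimal illustration, not the authors' code: the signal names (s1, s2, n), the synthetic data, and the zero-bin handling are assumptions; only the 64-bin discretization and the formulas come from the caption.

```python
# Sketch of the caption's information-theoretic metrics from sampled
# signals, using 64-bin histogram estimates of the distributions.
import numpy as np

BINS = 64  # bin count stated in the caption

def entropy(*signals):
    """Joint Shannon entropy H(X1,...,Xk) from a k-D histogram."""
    hist, _ = np.histogramdd(np.column_stack(signals), bins=BINS)
    p = hist / hist.sum()
    p = p[p > 0]                      # empty bins contribute 0 * log 0 = 0
    return -np.sum(p * np.log(p))

def multivariate_mi(s1, s2, n):
    """MI(S1; S2; N) via the inclusion-exclusion formula in the caption."""
    return (entropy(s1) + entropy(s2) + entropy(n)
            - entropy(s1, s2) - entropy(s1, n) - entropy(s2, n)
            + entropy(s1, s2, n))

def conditional_mi(s1, s2, n):
    """MI(S1; S2 | N) = MI(S1; S2) - MI(S1; S2; N)."""
    mi_s1_s2 = entropy(s1) + entropy(s2) - entropy(s1, s2)
    return mi_s1_s2 - multivariate_mi(s1, s2, n)

def kl_divergence(s, n):
    """D_KL(S || N) between the signals' binned marginal distributions."""
    lo, hi = min(s.min(), n.min()), max(s.max(), n.max())
    ps, _ = np.histogram(s, bins=BINS, range=(lo, hi))
    pn, _ = np.histogram(n, bins=BINS, range=(lo, hi))
    ps = ps / ps.sum()
    pn = np.maximum(pn / pn.sum(), 1e-12)  # floor so the log ratio is defined
    mask = ps > 0                          # terms with S(x) = 0 vanish
    return np.sum(ps[mask] * np.log(ps[mask] / pn[mask]))

# Illustrative correlated signals (not the experimental data).
rng = np.random.default_rng(0)
s1 = rng.normal(size=10_000)
s2 = 0.5 * s1 + rng.normal(size=10_000)
n = 0.5 * (s1 + s2) + rng.normal(size=10_000)
print(multivariate_mi(s1, s2, n), conditional_mi(s1, s2, n), kl_divergence(s1, s2))
```

Note that plug-in histogram estimates of high-dimensional entropies are biased at this sample size; the sketch only shows how the caption's formulas combine.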
