From Correlation to Causality
Causality is often confused with correlation. Human intuition has evolved to infer causality from correlation. However, the fact that two quantities are correlated does not necessarily mean that one is the cause or the effect of the other: correlation, as a statistical measure, does not capture the direction of information flow. Going beyond statistics is therefore a necessary step towards reliable and resilient prediction and decision making.
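The point that correlation is blind to direction can be made concrete with a small sketch. In the hypothetical data below, `x` drives `y` by construction, yet the Pearson correlation coefficient is exactly symmetric in its two arguments, so it cannot reveal which variable is the cause:

```python
# Illustrative only: y is generated FROM x, but correlation cannot see this.
import random

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = sum((u - ma) ** 2 for u in a) ** 0.5
    sb = sum((v - mb) ** 2 for v in b) ** 0.5
    return cov / (sa * sb)

random.seed(0)
x = [random.gauss(0, 1) for _ in range(1000)]
y = [2 * xi + random.gauss(0, 0.5) for xi in x]  # y is caused by x

print(pearson(x, y))                    # strong positive correlation
print(pearson(x, y) == pearson(y, x))   # True: the measure is symmetric
```

Swapping the arguments changes nothing: the statistic carries no arrow from cause to effect.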
From prediction to decision making
To generate relevant predictive signals and guide decision making, the best AI systems need to integrate a further dimension: they must go beyond statistical measures and causality by introducing the concepts of time, evolution, and equilibrium.
Bringing all disciplines together: causality, prediction, and decision making
Only a few AI systems today are able to bring together all the following dimensions:
· Using NLP to encode language and build a knowledge base in which words and themes are scored according to their relevance and interdependencies
· Creating causal diagrams, which are extracted automatically and in which the intensity of causation is quantified
· Performing stability analysis of the dynamic systems that model the human decision-making process
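The last bullet can be illustrated with a standard result from dynamical-systems theory. This is a minimal sketch, not the method of any particular system: for a linear (or linearised) system dx/dt = A·x, the equilibrium at the origin is asymptotically stable exactly when every eigenvalue of A has a strictly negative real part.

```python
# Minimal stability check for a linear dynamical system dx/dt = A x.
# Stable equilibrium <=> all eigenvalues of A have negative real parts.
import numpy as np

def is_stable(A):
    """Return True if every eigenvalue of A has a negative real part."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

damped = np.array([[0.0, 1.0],
                   [-1.0, -0.5]])   # damped oscillator: decays to equilibrium
unstable = np.array([[0.1, 1.0],
                     [-1.0, 0.0]])  # slight positive feedback: spirals outward

print(is_stable(damped))    # True
print(is_stable(unstable))  # False
```

For nonlinear models of decision making, the same test is applied to the Jacobian at each equilibrium point.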
Read more about Causality
The British economist Clive Granger (Nobel Prize 2003) introduced a measure of “causal influence” in the late sixties (Granger 1969) that enabled a machine to pinpoint some directional influences automatically. Since then, various definitions and methods for quantifying cause-and-effect relationships have been introduced and refined to enable learning and equilibrium analysis. More recently, the computer scientist and philosopher Judea Pearl has “revolutionised the understanding of causality in statistics, psychology, medicine and the social sciences”, according to the Association for Computing Machinery. Pearl uses causal diagrams (Pearl 1995) to represent and store directional interdependencies between variables, i.e. the flow of information.
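Granger's idea can be sketched in a few lines: x is said to “Granger-cause” y if past values of x improve the prediction of y beyond what past values of y alone achieve. The toy below (a single lag, no F-statistic) is an illustration of that idea, not a full statistical test:

```python
# Toy illustration of Granger's idea (Granger 1969), single lag only.
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()  # x drives y with a one-step lag

def residual_var(target, predictors):
    """Least-squares fit of target on predictors; variance of the residuals."""
    coef, *_ = np.linalg.lstsq(predictors, target, rcond=None)
    return np.var(target - predictors @ coef)

# Predict y[t] from y[t-1] alone, then from (y[t-1], x[t-1]).
restricted = residual_var(y[1:], y[:-1, None])
full = residual_var(y[1:], np.column_stack([y[:-1], x[:-1]]))

print(full < restricted)  # True: lagged x sharply improves the prediction of y
```

A real Granger test would compare the two residual variances with an F-statistic over several lags; here the drop in residual variance already exposes the direction of influence that plain correlation cannot.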
Read more about Prediction
In 1873, the Scottish scientist James Maxwell delivered an address at Cambridge University concerning the debate between determinism and free will, in which he said (Atmanspacher and Bishop 2014, Ekeland 1990):
“It is a metaphysical doctrine that from the same antecedents follow the same consequences. No one can gainsay this. But, it is not of much use in a world like this, in which the same antecedents never again concur, and nothing ever happens twice … The physical axiom which has a somewhat similar aspect is ‘that from like antecedents follow like consequences.’ But here we have passed … from absolute accuracy to a more or less rough approximation. There are certain classes of phenomena … in which a small error in the data only introduces a small error in the result … There are other classes of phenomena which are more complicated, and in which cases of instability may occur …”
Maxwell underlines the fact that a complex physical system can be fully deterministic, with each of its components reacting to its environment according to well-defined deterministic laws, and yet be absolutely unpredictable. This is due to the instabilities and nonlinearities of the interactions between the various components of a dynamic system: a slight perturbation (the beat of a butterfly’s wing) can bring a complex deterministic system into a completely unpredictable state.
Causality analysis and AI
The introduction of causality into AI has been a major breakthrough compared with simple statistical measures of correlation. But is it enough to predict and to guide decision makers?
The French mathematician Henri Poincaré (1854-1912) was one of the first to point out that many deterministic systems display a “sensitive dependence on initial conditions.” Poincaré described this concept in the following way: “It may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible.”
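Poincaré's “sensitive dependence on initial conditions” is easy to demonstrate with the logistic map, a textbook example of a fully deterministic yet chaotic system. Two trajectories started a mere 1e-10 apart quickly become completely different:

```python
# Sensitive dependence on initial conditions: the logistic map
# x_{t+1} = r * x_t * (1 - x_t) at r = 4 is deterministic but chaotic.
def logistic_orbit(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-10)  # a "butterfly-wing" perturbation

gap = max(abs(u - v) for u, v in zip(a, b))
print(gap)  # of order 1, despite the 1e-10 initial difference
```

Every step is computed by the same exact rule, yet the tiny initial error is amplified until the two orbits bear no resemblance to each other: exactly the “enormous error in the latter” that Poincaré describes.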
- Atmanspacher, Harald, and Robert Bishop. Between Chance and Choice: Interdisciplinary Perspectives on Determinism. Andrews UK Limited, 2014.
- Ekeland, Ivar. Mathematics and the Unexpected. University of Chicago Press, 1990.
- Granger, Clive W. J. “Investigating causal relations by econometric models and cross-spectral methods.” Econometrica: Journal of the Econometric Society (1969): 424-438.
- Pearl, Judea. “Causal diagrams for empirical research.” Biometrika 82.4 (1995): 669-688.