Coherence, Randomness, and the Hidden Architecture of Decision-Making
Randomness as a Word
When I think about what it means for something to be truly random in our world today, I associate the idea with an apparent chaos: an experience brought forth by a mechanism which is either unknown or disregarded, because it is deemed to be just that - random, causeless.
Randomness, though, is often less a property of reality than a measure of our resolution in perception. Our ability to perceive, model, and respond to life circumstances.
I have a fascination with the nature of luck, chance, randomness, and chaos - so I thought: why not look into where all of this came from? Chance, fate, and fortune are ancient ideas, so there may be an element of semantics here, as words can be ambiguous depending on the definitions a given culture or era ascribes to them.
The nature and concept of randomness is something which has evolved over time. Taking a look at the etymology of the word, random, shows us something intriguing. Naturally, etymology doesn’t prove what randomness is, but it does show us how humans first named a certain kind of experience.
Random comes from the Old French root randon, meaning “speed, force, or impetuosity”.
Randon comes from the verb randir, meaning “to run fast”. The irony of this given a poker player's propensity to chalk up wins to “running good” is very much not lost on me!
It’s worth noting that poker itself is a living laboratory for epistemic randomness, where deterministic processes, incomplete information, and feedback loops co-exist - seemingly a model petri dish from which to sample data in order to study the interplay of such phenomena more deeply.
In the late 14th century, when the term passed from Old French into English, it was used primarily to ascribe qualities of uncontrolled motion, reckless momentum, or lack of deliberate aim to a particular event or unfolding. Examples may have been along the lines of an arrow loosed in haste, a bolting horse, or a blow struck without plan.
My takeaway is that the idea of randomness is meant to ascribe a quality of movement without visible intention, as opposed to an absence of structure within a system. This distinction, while perhaps not immediately evident, is important because it draws a line between epistemic and ontological randomness.
An example of this, as previously noted, would be a shuffled deck of cards as epistemic randomness. The shuffle of the cards is deterministic based on the nuances of speed, grip, placement, etc. - but functionally unpredictable to participants because we cannot (or at the very least, do not) consciously track these stimuli.
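A minimal sketch of this point, using Python's standard library: the myriad physical variables of a shuffle collapse here into a single hidden seed (the seed value is hypothetical, standing in for grip, speed, placement, and so on). The process is fully deterministic, yet without knowledge of the seed the resulting order is functionally unpredictable.

```python
import random

# A full deck represented as card indices 0-51.
deck = list(range(52))

# All the fine-grained physical variables (grip, speed, placement)
# are stood in for by one hidden seed value.
HIDDEN_SEED = 1729  # hypothetical; any fixed seed behaves the same way

first = deck.copy()
random.Random(HIDDEN_SEED).shuffle(first)

second = deck.copy()
random.Random(HIDDEN_SEED).shuffle(second)

# Identical hidden state yields the identical "random" order every time.
print(first == second)  # True
```

To an observer who cannot see the seed, the order is random; to one who can, it is merely complicated.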
To me, it raises the question as to whether or not certain ‘big picture variables’ which account for the myriad of more granular variables, can be identified.
On the flip side, exploring ontology, some interpretations of quantum measurement treat certain outcomes as intrinsically indeterminate, irrespective of how many variables are known going into it.
Epistemic randomness is the felt experience of randomness, whereas ontological randomness implies that randomness is built into the fundamental structure of a system. The former is what we all know from playing games of chance; the latter is a claim which has not been conclusively proven. There is broad agreement on the math and the predictions, and far less agreement on what it all means.
Randomness as a Concept
In the 4th century BCE, Aristotle distinguished between necessity (lawful causes), chance (intersecting causal chains), and spontaneity. His claim was that chance events still have causes - they are simply ones which are often considered purposeless, accidental, or at the very least unknown.
In the more modern era, around the 17th century, probability theory was born. Key figures such as Pascal, Fermat, Huygens, and later, Laplace, studied dice, gambling, cards, insurance, and risk as a means to more deeply understand chance, odds, and outcomes.
In the early 1800s, Laplace famously argued that an intellect which knew all forces and positions would find the future perfectly predictable.
This strongly deterministic ideal considers randomness as ignorance and probability as a bookkeeping of uncertainty.
Randomness as a Frontier
By the 19th century randomness continued to evolve in its societal place and perception within scientific communities. As statistics became a more commonplace method used to seek clarity in complex systems, randomness still meant practical unpredictability and an averaging of many unknown variables.
Then, in the 20th century, quantum mechanics began to explore reality by way of the many-worlds interpretation and the Copenhagen interpretation: the former offering the perspective that all possible outcomes co-exist, with our subjective experience corresponding to one branch among many; the latter, the idea that outcomes are probabilistic, with limits on what can be simultaneously defined or predicted.
Einstein, who famously claimed that “God does not play dice”, was met with criticism from contemporaries who proposed randomness as a fundamental quality of existence, not just an ignorance of structure in the sense of Laplace.
Others held strongly to the belief that randomness is a felt experience arising from the irreducibility of complex systems - an epistemological reality rather than an ontological one, the latter of which, if true, would indicate that reality is probabilistic at its core. Even today, this remains philosophically unresolved.
Regardless of interpretation, quantum mechanics forced a sharper question: is randomness fundamental or contextual? Epistemological or ontological?
As computer science evolved in the 20th century, randomness experienced yet another twist in its perception and conceptual application. Deterministic algorithms can now create outputs statistically indistinguishable from “true” randomness, yet which have a structural, mathematical backbone.
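One classic sketch of such an algorithm is a linear congruential generator - a purely arithmetic recurrence whose scaled outputs pass many statistical tests for randomness. The constants below are the commonly published Numerical Recipes parameters; this is an illustration of the principle, not a generator anyone should use for serious work.

```python
def lcg(seed: int, n: int, a: int = 1664525, c: int = 1013904223, m: int = 2**32) -> list[float]:
    """Linear congruential generator: the deterministic recurrence
    x -> (a*x + c) mod m, with outputs scaled into [0, 1)."""
    values = []
    x = seed
    for _ in range(n):
        x = (a * x + c) % m
        values.append(x / m)
    return values

# The same seed reproduces the identical "random" stream, every time:
print(lcg(seed=42, n=3) == lcg(seed=42, n=3))  # True
```

The stream looks random to a statistical observer, yet its entire structure is a one-line formula - epistemic randomness with an ontologically deterministic backbone.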
Andrey Kolmogorov, a Soviet mathematician, provided the notion of “Kolmogorov incompressibility” - one formal lens on randomness in information theory, distinct from statistical randomness tests. Essentially, he proposed that a string is random, or pattern-free, when the shortest program that can produce it is almost as long as the string itself.
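Kolmogorov complexity itself is uncomputable, but ordinary compression gives a crude, computable stand-in for “shortest description”. In this rough sketch, a patterned string shrinks dramatically while a pseudo-random string of the same length barely compresses at all:

```python
import random
import zlib

# A highly patterned string: its shortest description is tiny ("ab" x 500).
patterned = b"ab" * 500

# A pseudo-random string of the same length (seeded, so reproducible).
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(1000))

# Compressed size acts as a rough upper bound on description length.
print(len(zlib.compress(patterned)))  # far smaller than 1000
print(len(zlib.compress(noisy)))      # close to (or above) 1000
```

A string that resists compression carries no pattern the compressor can find - which is exactly the incompressibility Kolmogorov used to formalize randomness.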
Randomness Today
So, in our world today, when we hear the word random - how much of this history do we associate with it, given the evolution of the concept? It’s all well and good for terms to evolve as the culture and understanding surrounding them does - but evolving usage doesn’t necessitate a change in understanding or meaning.
Randomness holds a connotation in our current world which is based on meaninglessness, lack of pattern, accidental cause, or absurdity. And while these may be true, it’s not how the term came into being.
It initially meant fast, unguided motion, later meant lack of visible intention, later yet meant lack of knowable cause, and now is often associated with lack of cause at all.
Khemset as an applied philosophy, focused on scientific inquiry into such topics, simply seeks to underscore the reality that even when randomness is fundamental at one level, our lived experience of randomness is still shaped by limits of our perception and model.
One need not deny quantum mechanics, assert mystical control, or overthrow probability in order to explore and seek a deeper understanding of reality. In essence, what appears random may simply conceal structure relative to state, scale, and observer.
In summary, randomness entered human thought not as a property of reality, but as a name for motion, speed, and outcomes whose governing structure may lie beyond intention or comprehension.
Khemset Cosmology: Coherence Improves Interface
Where does coherence fit into the picture? If randomness is a human construct and commonly held description or perspective of a reality experience, what does coherence imply or indicate?
Coherence, defined as an organized physiological state where breath, heart rate variability (HRV), and attention align, gives way to an improved signal-to-noise ratio. This allows for greater levels of information and order to be felt, grasped, interfaced with, and understood by the participant.
One mechanism for such a connection is commonly referred to as intuition, of which there are distinctions made between different types, as described by the HeartMath Institute.
Local intuition is the fast integration of subtle cues that you did perceive, whereas non-local intuition is hypothesized to be an integration of information not mediated by well-known and accepted sensory channels.
The best way to outline my thoughts on this topic, and desire to explore non-local intuition, is to outline a cosmology for Khemset and share some axioms.
Khemset: A Working Cosmology
Embedded Participation: Human beings are not observers outside reality; we are embedded participants within it. Our perceptions, decisions, and actions are part of the system being perceived. A human being is not apart from nature; we are nature.
Structure Precedes Outcomes: Reality exhibits deep structure such as constraints, symmetries, archetypes, and conservation laws. What may appear chaotic can be an orderly unfolding within these structures, seen from a different point of view.
Chance Is Intersection, Not Absence: Uncertainty arises when independent causal streams intersect. Chance does not imply a lack of cause, only a lack of intention or foreknowledge.
Randomness Is Resolution-Dependent: Events appear random when complexity, speed, or feedback exceed the observer’s resolution. Randomness may simply be a function of limits in perception, modeling, or state.
Coherence Improves Interface, Not Control: Coherence (physiological, psychological, and attentional alignment) does not override reality’s structural laws. It improves how clearly structure is perceived and how skillfully uncertainty is navigated.
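The resolution-dependence axiom above has a classic illustration in deterministic chaos. In this minimal sketch (not part of the original text), the logistic map is a fully lawful, one-line rule - yet two starting points closer together than any realistic instrument could distinguish soon diverge completely, so a finite-resolution observer experiences the outcome as random:

```python
def logistic_map(x0: float, r: float = 4.0, steps: int = 40) -> list[float]:
    """Iterate the logistic map x -> r*x*(1-x), a deterministic
    rule that is chaotic at r = 4."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points indistinguishable at any practical "resolution":
a = logistic_map(0.2)
b = logistic_map(0.2 + 1e-9)

print(abs(a[1] - b[1]))                        # still tiny after one step
print(max(abs(x - y) for x, y in zip(a, b)))   # no longer small by the end
```

Nothing about the rule is random; what changed is only the observer's resolution relative to the system's sensitivity.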
A Speculative Horizon
As coherence increases, apparent randomness will likely not disappear, though it may become more navigable.
The universe lawfully responds to the quality of participation we bring to it, in the only way we can truly understand - by virtue of our experience. When we meet life with coherence, we experience a reflection of just that back to us.
In that sense, randomness is comforting when seen not as an enemy to be conquered, but as a signal of a higher resolution yet to be reached - an invitation to refine perception, deepen alignment, and engage the mystery with a clarity that overcomes fear, uncertainty, and distortion.
Coherence helps detect more signal, generate cleaner models, and act with less internal noise.
Whether you interpret that as psychology, complex-systems dynamics, or a many-worlds metaphor - the practical claim stays grounded: as coherence rises, the felt experience of randomness often falls.
Not because the universe becomes a slave to your desires, but because you’re perceiving it more clearly and navigating uncertainty more tactically - using free will to engage in processes with potentially deterministic outputs, all of which are unknown, mysterious, and deliciously exciting.
I will say - I find the idea of free will, traditionally seen as a mechanism of agency - potentially being the catalyst by which one engages in deterministic processes, to be cosmically ironic, hilarious, and wonderful.
All realities co-existing in the moment is one way in which we can perceive a field of apparent randomness - one way in which we can contextualize the navigation of our individual and shared realities.
The less conscious and coherent I am, the more I experience an apparent chaos and disorder.
The more conscious and coherent I am of this process, the less I experience a feeling of randomness.
And perhaps, this cycle is one that goes on forever. Chaos prompting us to find order, and then that order giving way to an apparent chaos - leading us to ever greater levels of clarity and understanding.

