Neural Constraints
Dec 21, 2024 · 30 min read · neuroscience


A new way of thinking about brainwaves

A seeming paradox

During my PhD, I was puzzled to find that the brain’s rhythmic activity often behaves in counterintuitive ways. When we close our eyes, the visual areas show a surge in alpha rhythm (8-12 Hz) (Berger, 1929). Similarly, during visual attention tasks, brain regions relevant to the task show decreased alpha activity, while irrelevant areas show an increase (Worden et al., 2000).

This phenomenon is not exclusive to the alpha rhythm. Consider beta oscillations (13-30 Hz), traditionally associated with motor function. When we deliver a magnetic pulse to the motor cortex, the resulting muscle response is significantly dampened during periods of high beta power (Zarkowski et al., 2006). This suggests that beta oscillations, like alpha, implement a form of phasic inhibition - but one operating at a different temporal scale and serving different functional roles.

For the alpha rhythm, Pfurtscheller and Aranibar dubbed this mechanism "focal ERD/surround ERS" (Pfurtscheller & Aranibar, 1977) - a pattern where task-relevant regions show decreased oscillatory power (event-related desynchronisation, ERD) while surrounding areas show increased power (event-related synchronisation, ERS). They hypothesised that this stems from interaction between thalamo-cortical nodes, where ERD in an active node removes inhibition on the inhibitory network of neighbouring nodes, resulting in surround ERS.

This arrangement serves a crucial function: it optimises the distribution of computational resources by simultaneously facilitating task-relevant processing while actively suppressing potential interference from task-irrelevant areas. It's a bit like a spotlight of neural processing, but one that works by darkening the surrounding area rather than brightening the centre.

Recent conversations with friends and colleagues have led me to a provocative insight: What if this observation is not just true for alpha or beta, but generalises across all "brainwaves" we measure? What if oscillations represent something far more fundamental than just neural activation patterns? What if they reveal how brains harness constraint itself as an organising principle?

What if we have been looking at neural oscillations all wrong, by asking ourselves what they enable instead of what they preclude? While neuroscience traditionally maps specific frequency bands to particular mental states - alpha waves with relaxation, beta waves with focus - this framework misses something crucial about how complex organisation emerges in biological systems.

At its core, complex biological systems don't emerge from adding mechanisms; they arise through the progressive layering and preservation of constraints. As neuroanthropologist Terrence Deacon argues, living systems differ fundamentally from machines because their constraints aren't imposed externally but generated and maintained from within (Deacon, 2011). A cell doesn't maintain itself by constantly adding new components, but by carefully restricting what chemical processes can occur. An organism doesn't develop through the simple accumulation of parts, but through the progressive restriction of possible forms. Deacon terms these processes "morphodynamic": situations where constraints compound and create new patterns. Think of how water molecules spontaneously organise into snowflakes, or how heated fluid forms into regular convection cells.

The brain might use oscillations to achieve something similar but more sophisticated: patterns of constraint that can actively maintain and modify themselves - neural oscillations as temporary boundaries that channel neural resources where they're needed by constraining activity elsewhere.

This represents a fundamental shift in how we think about brain function. Instead of viewing neural activity purely in terms of what's present - firing neurons, active circuits, information flow - we need to pay equal attention to what's restricted or constrained. Just as a cup's function emerges from the empty space it contains, the brain's ability to generate coherent behaviour might depend crucially on its ability to maintain and modify patterns of constraint.

The implications go beyond just alpha or beta rhythms. Different frequency bands of oscillation might represent different scales of constraint that work together to shape neural activity in increasingly sophisticated ways. Higher-order cognitive functions might emerge not from adding new neural mechanisms, but from the progressive layering and preservation of these oscillatory constraints. The brain's ability to generate complex behaviour might depend not just on what it activates, but on what it excludes - the constraints it maintains on its own activity.

Before we explore these ideas in detail, we need to look more carefully at what the evidence tells us about neural oscillations and why simple mappings between frequency bands and functions don't tell the whole story. As we'll see, understanding the brain through the lens of constraint opens up new ways of thinking about everything from basic perception to consciousness itself.

The Limits of Simple Mappings

The desire to map a measured brain signal to a definite mental state or cognitive function is easy to understand. Why else would we engage in brain imaging, if not in the hope that whatever signal we record directly relates to a specific state or function? However, the assumption of a one-to-one mapping between signal and function or state easily leads us astray.

Let's look at what actually happens when we try to establish these mappings. Consider alpha oscillations. When measuring alpha power levels, we find that prefrontal increases correlate with an unexpectedly diverse array of states: working memory engagement during mental arithmetic, states of intoxication or dissociation, mild to moderate depression, the onset of sleep, states of "calm focus." These states share no obvious functional similarities - in fact, some seem almost contradictory. How could the same oscillatory pattern be associated with both focused mental arithmetic and drowsy pre-sleep states?

The puzzle deepens when we look at pathological conditions. In depression, for example, we see excessive alpha power in moderate cases but deficient alpha power in severe cases. This kind of non-linear relationship defies any simple mapping between oscillation and function. More importantly, it suggests something crucial about how biological systems work: what matters isn't the presence or absence of a particular pattern, but how that pattern contributes to the system's overall organisation. The alpha rhythm isn't causing or representing these various mental states in any simple sense; it's participating in different configurations of constraint that allow these states to emerge and maintain themselves.

The evidence for this becomes particularly clear when we examine how oscillations interact across different frequency bands. We find that changes in alpha power aren't isolated events but are always accompanied by shifts in other frequency bands. More importantly, these interactions aren't random - they show specific patterns of nesting and coordination that suggest a hierarchical organisation of constraints. Faster beta oscillations nest within alpha rhythms, which are themselves modulated by slower theta waves, creating patterns that are stable despite the chaos of underlying neural activity.

This hierarchical organisation helps explain why simple interventions often fail. Attempts to influence mental states through direct manipulation of brain rhythms - whether through binaural beats, simple neurofeedback, or other techniques targeting specific frequencies - typically produce weak or inconsistent results. These approaches assume a linear relationship between oscillation and function that doesn't reflect how biological systems actually work. You can't just push the brain into a desired state by forcing its rhythms, because the rhythms themselves aren't the cause - they're part of how the brain maintains its own organisation.

Understanding this requires us to rethink what we mean by "information" in biological systems. The traditional view treats neural signals like computer bits - discrete units that carry specific information. But as Deacon argues, biological information isn't about pattern alone - it's about constraints that are preserved and reproduced because of their contribution to the system's self-maintenance. When we observe increased alpha power during both relaxation and mental arithmetic, we're not seeing the same "signal" being used for different purposes. We're seeing how similar patterns of constraint can participate in different organisational states.

This perspective helps resolve several paradoxes in neuroscience. Why do we see the same oscillatory patterns in seemingly unrelated mental states? Because what matters isn't the pattern itself but how it's integrated into larger configurations of constraint. Why do interventions targeting specific frequencies often fail? Because you can't manipulate one level of organisation without affecting the others. Why do pathological conditions show non-linear relationships with oscillatory power? Because what's breaking down isn't a simple mechanism but the brain's ability to maintain appropriate patterns of constraint.

Dynamic Inhibition

The perspective outlined here might seem puzzling at first, but the underlying logic is simple. Imagine trying to build a system that must remain perpetually responsive to unpredictable inputs while also maintaining precise control over its internal states. The solution nature has evolved is remarkably elegant: rather than building a system that activates selective pathways from a resting state, the brain maintains high baseline activity and achieves specificity through targeted inhibition.

This maintains stability while enabling flexibility. A brain with maximum constraints would be effectively dead - crystallised and unable to respond to environmental demands. So the rest-state of the brain is closer to a high-entropy state with maximal potential for different possible configurations - helped, it has been proposed, by keeping oscillatory frequencies at irrational ratios near the golden ratio (φ ≈ 1.618) to minimise the chance of spurious phase synchronisation and cross-talk. It is a minimum-constraint regime with maximal responsiveness to environmental demands. As these arise, selective inhibition shrinks this possibility space, channeling resources toward task-relevant pathways while actively constraining alternatives.
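Why an irrational ratio like φ would help can be made concrete with a toy numerical sketch (entirely illustrative; the function name and frequencies are mine, not from the literature). Two oscillations at a harmonic ratio lock perfectly for some small-integer n:m phase relation, while a golden-ratio pair never settles into any small-integer locking:

```python
import numpy as np

def nm_phase_locking(f1, f2, n, m, duration=60.0, fs=1000.0):
    """|<exp(i(n*phi1 - m*phi2))>|: 1 means perfect n:m phase locking."""
    t = np.arange(0, duration, 1.0 / fs)
    phi1 = 2 * np.pi * f1 * t
    phi2 = 2 * np.pi * f2 * t
    return abs(np.mean(np.exp(1j * (n * phi1 - m * phi2))))

phi = (1 + np.sqrt(5)) / 2  # golden ratio, ~1.618

# Harmonic pair (10 Hz and 20 Hz): 2:1 locking is perfect.
harmonic = nm_phase_locking(10.0, 20.0, n=2, m=1)

# Golden-ratio pair (10 Hz and 10*phi Hz): no small-integer n:m
# combination achieves appreciable locking.
golden = max(nm_phase_locking(10.0, 10.0 * phi, n, m)
             for n in range(1, 6) for m in range(1, 6))
```

On this measure the harmonic pair locks completely while the φ-related pair stays near zero for every small-integer combination, which is the intuition behind the cross-talk-minimisation hypothesis mentioned above.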

All of this is meant to highlight that the brain is not like a car that a conscious driver can kick into gear when needed. There is no agent independent of the brain who could assess situational demands, turn the ignition, and press the gas pedal. If the brain isn't already in motion, it simply won't be able to respond to inputs - there is no lucid driver behind the wheel. To speak in plainer terms, steady throttle at rest is the only viable strategy for a system that must respond to an imperfectly predictable environment. All the "driver" can do is use the brakes and turn the wheel.

This necessity for constant readiness is reflected in the brain's fundamental organisation: its energy consumption and blood supply remain remarkably stable across different mental states and activities. Rather than increasing total power output when engaging in conscious processing, the brain redistributes its existing resources within its fixed energy budget of about 20% of the body's total energy consumption. This is reflected in the power distribution across frequency bands: lower frequencies consistently maintain higher amplitudes and dominate the overall activity pattern, following a characteristic 1/f power-law distribution. When conscious processing increases, we see a shift in the distribution of neural activity that includes increased high-frequency oscillations, particularly in the gamma range, though cognitive work emerges from complex interactions across multiple frequency bands rather than from high-frequency activity alone. As we move up the hierarchy of cognitive function, from basic sensory processing to conscious awareness, we generally see greater involvement of higher frequencies, always in coordination with slower oscillations whose amplitudes are reduced to keep net activity roughly constant.

Even during intense cognitive tasks or artificial neural stimulation, the brain on the whole preserves this fundamental balance. Cognitive work emerges not from "turning up the volume" but from dynamically redistributing resources across frequency bands through top-down constraints, while overall energy consumption stays stable.
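The 1/f shape of the power distribution is easy to illustrate synthetically (a sketch with NumPy; the signal is simulated, not recorded EEG): construct a signal whose power falls off as 1/f, then recover a spectral slope of roughly −1 on log-log axes, the signature of the power law.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 1000, 2 ** 16
freqs = np.fft.rfftfreq(n, 1 / fs)

# Build a signal with a 1/f power spectrum: random phases, amplitude
# proportional to 1/sqrt(f) so that power (amplitude squared) ~ 1/f.
amp = np.zeros_like(freqs)
amp[1:] = 1.0 / np.sqrt(freqs[1:])
spectrum = amp * np.exp(1j * rng.uniform(0, 2 * np.pi, freqs.size))
signal = np.fft.irfft(spectrum, n)

# Estimate the spectral slope on log-log axes between 1 and 100 Hz;
# a 1/f power law gives a slope of -1.
power = np.abs(np.fft.rfft(signal)) ** 2
band = (freqs >= 1) & (freqs <= 100)
slope, _ = np.polyfit(np.log10(freqs[band]), np.log10(power[band]), 1)
```

In real recordings the slope varies somewhat around −1 across regions and states, but the qualitative picture is the same: slow fluctuations carry most of the power, and faster activity rides on top with progressively smaller amplitude.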

This principle becomes clearer through an analogy to steam power. Just as a steam engine doesn't create energy but rather harnesses pre-existing thermal energy through careful constraint, the brain doesn't generate conscious or cognitive work through the neural activity itself, but rather by how the neural activity channels its constant metabolic supply. In a steam engine, the raw expansive force of steam becomes useful work only through a complex architecture of pipes, valves, and chambers that constrain its flow. Without these constraints, steam would simply dissipate its energy randomly in all directions. Similarly, the brain's baseline metabolic activity - like steam under pressure - represents pure potential that must be carefully constrained to generate meaningful cognition. Neural oscillations serve as dynamic "pipeworks" that temporarily constrain this metabolic potential, channeling it into specific computational pathways while preventing its dissipation into task-irrelevant circuits. This suggests a profound parallel between mechanical and neural systems: in both cases, useful work emerges not from the raw power source itself, but from the progressive application of constraints that channel that power into increasingly sophisticated patterns of activity.

The Dialectics of Constraint

This framework of neural constraint naturally leads us to a deeper question: why constraints at all? The answer may lie in one of physics' most fundamental principles - the principle of least action. Just as physical systems tend toward paths that minimise energy expenditure, the brain appears to operate on a similar economy of effort. It achieves this efficiency through prediction - continuously generating models of the environment and updating them through error signals, much like modern machine learning systems. But what is prediction if not a constraint on future states? When the brain predicts, it isn't generating all possible futures but rather restricting the space of what it considers possible, creating a focused beam of attention that illuminates some possibilities while necessarily casting others into shadow.

This perspective offers new insight into how the brain maintains coherent behaviour despite the chaos of its underlying neural activity. Rather than trying to specify exact patterns of activation - a strategy that would be highly vulnerable to noise - the brain guides behaviour by constraining what cannot happen. This negative selection proves more robust than positive specification, particularly evident in motor control where smooth movement emerges not from precise muscular activation but from the selective inhibition of competing motor programs. The brain doesn't so much tell the body what to do as what not to do, creating a channel through which coherent action can flow.

This inhibitory framework also helps explain a puzzling aspect of brain development: why the brain initially produces an overabundance of neural connections and then systematically eliminates many of them - a pruning process that constitutes much of brain maturation. Rather than viewing this pruning as wasteful, we can understand it as the brain gradually sculpting specific patterns of constraint from a space of many possibilities. The pruning process isn't just removing unnecessary connections - it's establishing the inhibitory architecture that will enable sophisticated cognitive function.

Huxley's Doors of Perception

In his seminal work "The Doors of Perception" (1954), Aldous Huxley described his experiences with mescaline as revealing the overwhelming richness of unfiltered perception that normally lies behind our brain's "reducing valve" of consciousness. Modern neuroscience offers fascinating support for this metaphor.

Psychedelic substances appear to work by disrupting the brain's normal constraint mechanisms. Neuroimaging studies show reduced neural activity in specific frequency bands but increased connectivity across broader neural networks. This is akin to removing the pipes and valves from our steam engine analogy - the same energy is present, but instead of being channeled into specific pathways, it disperses chaotically.

Without proper neural constraints, prediction mechanisms break down. This explains both the overwhelming sensory experiences and the feeling that ordinary objects contain profound significance - the brain's normal filters for relevance and meaning are disrupted, allowing unusual connections and interpretations to emerge. The commonly reported feelings of insight or revelation can be understood as the temporary lifting of habitual cognitive constraints, enabling novel patterns of thought.

This mechanism also suggests why psychedelics might have therapeutic potential. Just as a stuck machine might need its parts loosened before being reassembled, individuals trapped in rigid patterns of thought or emotion might benefit from temporary disruption of these patterns. Under proper therapeutic guidance, this disruption could enable the establishment of healthier neural constraints.

This potential is driving renewed scientific interest in psychedelic-assisted therapy for conditions like treatment-resistant depression and PTSD, where breaking out of pathological patterns of neural activity might be therapeutic - though always in controlled clinical settings with proper medical oversight.

The same principles that govern neural development extend into the emergence of consciousness itself. When we encounter novel situations, consciousness floods our experience with heightened awareness - a state where prediction errors are high and existing constraints prove insufficient. Yet as we master these situations and our predictions improve, conscious oversight fades into automated responses. This pattern reveals consciousness as an energy-intensive process that emerges precisely where our predictive models fail, working to refine constraints until automation becomes possible.

This view suggests a profound paradox: consciousness works toward its own obsolescence. Following the principle of least action, the brain seeks to minimise both prediction errors and energy expenditure. Perfect prediction would eliminate the need for conscious oversight entirely - but the complexity and unpredictability of the real world make this impossible. Instead, consciousness serves as a temporary scaffold, continuously engaged in refining predictive models until they can handle situations automatically. Like the oscillatory constraints we examined earlier, consciousness itself acts as a dynamic constraint system, channeling cognitive resources toward regions of high prediction error while inhibiting processing of the predictable.

This perspective helps us understand why consciousness cannot be reduced to simply building up complex representations. Rather, it emerges through the progressive refinement of constraints in response to prediction errors - always working toward better automation, yet perpetually renewed by the inherent uncertainties of existence. The fade-out of conscious attention during mastery isn't a failure but a success - it reflects the achievement of efficient prediction that no longer requires energy-intensive conscious oversight. Yet this process is never complete, as new challenges continuously emerge that require consciousness to engage once again in its endless work of refining constraints.

Why Oscillatory Inhibition?

Consider first the temporal architecture that oscillatory inhibition creates. By generating regular rhythmic patterns, neural oscillations establish precise temporal windows where responses are more or less likely. This creates what we might call "temporal scaffolding"—predictable structure in time that allows distant brain regions to coordinate their activity without direct communication. When visual stimuli arrive during specific phases of ongoing alpha oscillations in the visual cortex, they're more likely to be perceived. This isn't just efficient resource management—it's the brain using time itself as a computational dimension.
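The phase-dependence of perception can be caricatured with a toy detection model (entirely illustrative: the `p_detect` function and its 0.3 modulation depth are invented for the sketch, not empirical values, and which alpha phase is "good" is itself debated). Stimuli arriving in the permissive phase window are detected far more often than those arriving in the inhibited one:

```python
import numpy as np

rng = np.random.default_rng(1)
f_alpha = 10.0  # Hz, ongoing alpha rhythm

# Toy model: detection probability depends on the alpha phase at
# stimulus onset; best near phase = pi, worst near phase = 0.
def p_detect(phase):
    return 0.5 - 0.3 * np.cos(phase)

onsets = rng.uniform(0, 100, 10_000)                  # stimulus times (s)
phases = (2 * np.pi * f_alpha * onsets) % (2 * np.pi)
detected = rng.random(phases.size) < p_detect(phases)

# Compare hit rates for stimuli landing in opposite phase windows.
near_trough = np.cos(phases) < -0.7   # permissive window
near_peak = np.cos(phases) > 0.7      # inhibited window
trough_rate = detected[near_trough].mean()
peak_rate = detected[near_peak].mean()
```

The point of the sketch is structural, not quantitative: identical stimuli yield systematically different outcomes purely as a function of where they fall within the ongoing rhythm, which is what "time as a computational dimension" means in practice.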

But the true sophistication of oscillatory inhibition emerges when we consider it through the lens of physics. Static inhibition would face a fundamental limitation: it would make the system either too rigid to adapt or too loose to maintain stability. Oscillatory inhibition solves this through "protected states"—patterns that remain stable despite underlying fluctuations while still maintaining sensitivity to relevant perturbations. Think of a spinning top—its very motion creates stability, but it can still respond to significant forces without shattering. Similarly, even strongly inhibited neural circuits have brief windows of potential change due to the rhythmic nature of their constraint.

This also allows the brain to solve the gradient problem. Think of a gradient like a difference in height that makes water flow downhill, or a difference in temperature that makes heat move from hot to cold. In the brain, these gradients involve differences in electrical charge or chemical concentrations that allow neurons to fire and communicate. The challenge is that these processes naturally tend to eliminate the very differences that make them work—like how flowing water eventually levels out, or how heat spreads until everything is the same temperature. This self-destructive tendency is what we call the gradient problem: left unchecked, the brain's activity would quickly eliminate the conditions it needs to function. By cycling between periods of greater and lesser constraint—like a rhythm of tightening and loosening—it prevents any single process from running to completion and eliminating its gradient. This rhythmic variation maintains the essential differences that enable information processing, much like how a dam controls water flow to maintain a useful height difference instead of letting all the water rush down at once.

In the same way, the dynamic architecture of the brain also solves the problem of inertia in coupled systems. When two or more systems interact repeatedly, they tend to fall into the simplest possible pattern of interaction. It's like what happens when you hang two pendulum clocks on the same wall—over time, the stronger pendulum will force the weaker one to swing in exactly the same rhythm, eliminating any complex patterns. We see this same tendency toward simplification throughout nature, from synchronised fireflies to applauding audiences. The brain circumvents this by maintaining multiple distinct frequency bands of oscillation, creating complex but stable patterns of interaction between neural populations. No single frequency can dominate because the entire system preserves these multiple scales of oscillatory constraint.

This framework helps explain why the brain requires persistent low-level excitation to maintain coherent function. Just as Bénard cells only form in fluid systems with sufficient constant heat flow, neural oscillations require a baseline of activity to maintain their organisational patterns. This isn't just about keeping neurons "ready to fire"—it's about maintaining the conditions necessary for complex temporal constraints to emerge and persist.

The multi-frequency nature of neural oscillations implements what we might call a "constraint ratchet" mechanism. Like a mechanical ratchet that allows motion in one direction while preventing it in another, oscillatory inhibition can selectively preserve certain temporal patterns while allowing others to dissipate. This selective preservation is crucial for learning and memory, allowing the brain to maintain stable patterns while remaining flexible enough to incorporate new information.

This oscillatory architecture enables the brain to achieve something remarkable: it can match its rate of constraint dissipation to the rate of incoming perturbations. The result is a sophisticated form of dynamic equilibrium that goes far beyond simple homeostasis. When faced with novel situations, the system can temporarily relax specific constraints while maintaining overall stability. This creates a kind of "neural annealing" that allows exploration of new patterns without risking system-wide destabilisation—a crucial capability for complex cognitive tasks.

The brain's oscillatory organisation reveals itself in a hierarchical temporal structure. It maintains oscillations across multiple frequency bands, not just for different functions, but to create nested temporal hierarchies. The fastest oscillations occur in the gamma band (30–100 Hz), followed by beta (13–30 Hz), alpha (8–12 Hz), and theta (4–7 Hz) rhythms. Each slower rhythm contains and organises the faster ones, creating a fractal temporal structure where each level constrains but doesn't completely determine the levels below it.
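Nesting of this kind is usually quantified as phase-amplitude coupling. A minimal synthetic sketch (simulated signal, not recorded data; the mean-vector measure here is a simplified relative of standard modulation-index analyses, and for clarity it uses the known components rather than filtering the mixed signal): when gamma amplitude waxes and wanes with theta phase, the amplitude-weighted mean phase vector is long; with constant gamma amplitude it vanishes.

```python
import numpy as np

fs, dur = 1000.0, 10.0
t = np.arange(0, dur, 1 / fs)

# A 6 Hz "theta" carrier whose phase gates the amplitude of a 40 Hz
# "gamma" oscillation: gamma is strongest near the theta peak.
theta_phase = 2 * np.pi * 6 * t
theta = np.sin(theta_phase)
gamma = (0.5 + 0.5 * theta) * np.sin(2 * np.pi * 40 * t)
signal = theta + gamma  # the nested composite, as it would be recorded

# Coupling measure: length of the mean theta-phase vector weighted by
# gamma amplitude. Nonzero only if amplitude co-varies with phase.
gamma_amp = 0.5 + 0.5 * np.sin(theta_phase)   # modulated envelope
flat_amp = np.ones_like(t)                    # constant envelope
coupled = np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase)))
uncoupled = np.abs(np.mean(flat_amp * np.exp(1j * theta_phase)))
```

The slower rhythm constrains when the faster one can express itself without dictating its fine structure, which is the "constrains but doesn't completely determine" relationship described above.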

This temporal organisation explains a crucial aspect of brain function: the necessity of distinct timescales. Individual neurons operate in milliseconds, while the supporting metabolic changes occur over seconds. This separation isn't coincidental - it's essential for stable pattern formation. If the brain's energy supply systems operated at the same lightning-fast speed as neural firing, it would be like trying to drive a car where the fuel injection responded as quickly as the spark plugs: the whole system would become unstable. Instead, the slower metabolic processes act as a stabilising buffer, allowing complex patterns to emerge despite rapid neuronal fluctuations.

A single global oscillation would be catastrophic, no matter how well-tuned. Any perturbation at the right frequency could destabilise the entire system. By maintaining multiple distinct but interacting frequencies, the brain creates "temporal shock absorbers" that can dissipate perturbations across different scales without allowing system-wide disruption.

This multi-scale architecture helps explain why certain patterns of activity resist disruption, even when they're no longer adaptive. During habit formation, we observe "chunking"—sequences of actions become bound together into unified patterns that resist separation. Through the lens of oscillatory constraint, we can understand this as stable temporal scaffolds that synchronise activity across multiple neural populations. These patterns show remarkable stability because they're maintained not by any single neural population but by the coordinated timing relationships between many populations.

The relationship between oscillatory inhibition and metabolic constraints adds another layer of sophistication to brain function. The brain's resting state maintains complex oscillatory patterns that require significant energy. Yet this baseline activity proves remarkably efficient. By maintaining these oscillatory constraints, the brain can rapidly switch between different functional states without building each from scratch. It's like keeping factory machinery running at low power between production runs—more expensive initially, but far more efficient than repeatedly starting and stopping the entire system.

This perspective offers new insights into persistent pathological states. In conditions like chronic pain or treatment-resistant depression, we observe altered patterns of neural oscillation that become self-sustaining through their temporal structure. These aren't simply strong activation patterns—they're altered temporal constraints that self-reinforce through interaction with the brain's intrinsic oscillatory architecture. Understanding this temporal aspect suggests new therapeutic approaches focused on restructuring temporal relationships between neural populations rather than directly changing activation patterns.

The implications of this temporal framework extend beyond individual brain function. When we observe synchronised behaviour between individuals—from conversation to coordinated movement—we're seeing the alignment of multiple individual temporal hierarchies. This suggests that social coordination isn't just about matching actions, but about creating compatible temporal constraints across multiple nervous systems. The ability to establish and maintain such cross-brain temporal alignment might be fundamental to social cognition and cultural learning. However, this is, perhaps, the topic for a future article.

Conclusion: The Architecture of Possibility

When we began this exploration, we confronted a seeming paradox: why would increased neural oscillations signal reduced processing? Our investigation has revealed that this apparent contradiction actually illuminates a fundamental principle of biological organisation. Rather than viewing the brain as a system that builds up activity to create function, we now understand it as one that achieves specificity through precisely orchestrated constraint.

This shift in perspective reveals the sophisticated nature of neural organisation. The brain's oscillatory patterns represent more than just efficient resource management—they demonstrate how complex systems achieve stability through dynamic constraint. By maintaining multiple scales of inhibitory rhythms, the brain creates an "architecture of possibility," remaining poised between order and chaos, capable of both stable function and rapid adaptation.

The implications extend beyond neuroscience into broader biological principles. Just as physical systems find stability through balanced forces, the brain achieves coherent function through preserved patterns of limitation. This framework illuminates several persistent questions in neuroscience: the purpose of high baseline activity, the prevalence of inhibitory control, and even the nature of consciousness itself. Each phenomenon reflects an aspect of a system organised around dynamic constraint.

Modern artificial intelligence systems have evolved to mirror these principles, though through different mechanisms. Attention mechanisms in transformer models act as dynamic filters, while deep reinforcement learning systems like AlphaZero achieve superhuman performance through intelligent pruning of possibilities. Consider how GPT models selectively constrain their output based on context—a direct parallel to the brain's use of temporal hierarchies to process information across multiple timescales.

This understanding suggests novel approaches to treating neurological conditions. Rather than focusing solely on neurotransmitter levels, treatments might target the restoration of healthy oscillatory patterns across multiple temporal scales. The success of transcranial magnetic stimulation in treating depression, for instance, might work by disrupting pathological constraint patterns, allowing more adaptive ones to emerge.

Looking forward, this framework opens new research directions in neurodevelopment, abstract thinking, and creativity. How do constraint patterns emerge during brain development? How do they enable abstract reasoning? How might they contribute to creative insights? These questions suggest promising avenues for investigation into the fundamental principles of biological organisation.

The resolution of our initial paradox reveals something profound: the brain's sophistication lies not in accumulated complexity but in refined constraint. This insight reshapes our understanding of biological systems and points toward new approaches in both therapeutic intervention and artificial intelligence design. Just as a sculptor reveals form by removing material, the brain achieves its remarkable capabilities through the precise maintenance of limitations—a principle that may guide us toward deeper insights into both biological and artificial intelligence.

Sources

Berger, H., 1929. Über das Elektroenkephalogramm des Menschen. Archiv für Psychiatrie und Nervenkrankheiten, 87(1), pp.527-570.

Worden, M.S., Foxe, J.J., Wang, N. and Simpson, G.V., 2000. Anticipatory biasing of visuospatial attention indexed by retinotopically specific alpha-band electroencephalography increases over occipital cortex. Journal of Neuroscience, 20(6), pp.RC63-RC63.

Zarkowski, P., Shin, C.J., Dang, T., Russo, J. and Avery, D., 2006. EEG and the variance of motor evoked potential amplitude. Clinical Neurophysiology, 117(6), pp.1309-1315.

Pfurtscheller, G. and Aranibar, A., 1977. Event-related cortical desynchronization detected by power measurements of scalp EEG. Electroencephalography and Clinical Neurophysiology, 42(6), pp.817-826.

Deacon, T.W., 2011. Incomplete Nature: How Mind Emerged from Matter. W.W. Norton & Company, New York.