Whole Brain Emulation Achieved: Scientists Run a Fruit Fly Brain in Simulation


Scientists have copied an entire biological brain, neuron by neuron and synapse by synapse, and made it control a simulated body that can walk, groom, and behave. This achievement is known as whole brain emulation: the process of recreating a biological brain inside a computer simulation so its neural circuits can produce real behavior.

History’s First Whole-Brain Upload Just Took Its First Steps on March 7, 2026

There are moments in science that do not announce themselves loudly. The idea for the polymerase chain reaction came to Kary Mullis while driving on a California highway. The first bacterial transformation was confirmed on a quiet morning in a university laboratory. And on March 7, 2026, a small San Francisco company called Eon Systems released a video online that most of the world scrolled past without a second glance.

In that video, a fruit fly walked.

The world’s first whole-brain emulation that actually moves.
A complete biological fruit fly brain, all 127,400 neurons and 50 million synapses from the FlyWire connectome, now controls a real physics-simulated body in real time.
No training. No scripts. Pure connectome-driven behavior.
Video credit: Eon Systems (March 2026). Announcement: Dr. Alex Wissner-Gross.

Not a real fruit fly. A simulated one rendered in a physics engine, running on servers, processing inputs through mathematical equations. But the brain driving that body was not artificial. It was not trained by reinforcement learning. It was not scripted or animated or approximated. It was a precise digital copy of a real biological brain, taken from a real animal, reconstructed neuron by neuron and synapse by synapse from electron microscope images, and run exactly as nature built it.

Every one of the fly’s 127,400 neurons was present. All 50 million of its synaptic connections were intact, carrying the exact weights and chemical signs that the original brain carried. The neurotransmitters (acetylcholine, GABA, glutamate) were predicted by a machine-learning classifier trained on the ultrastructure of synaptic clefts. When a sensory signal entered the system, it traveled through biological circuits the same way it would travel through a living fly’s brain. When motor neurons fired, they drove joints in a physically accurate body simulated by MuJoCo, DeepMind’s open-source physics engine. And when the body moved, updated proprioceptive signals fed back into the brain, closing the loop.

For the first time in history, a complete biological brain inhabited a machine. Not a simplified model. Not a statistical approximation. A literal copy.

This is what whole-brain emulation looks like in its first proof of concept. And it began, as so many profound scientific achievements do, with years of painstaking, unglamorous work that preceded the moment of triumph by nearly a decade.

Why Scientists Chose the Fruit Fly Brain First

Stunning 3D reconstructions of the FlyWire adult Drosophila brain connectome (~140,000 neurons, 50 million synapses), the foundational dataset for all subsequent modeling. Color-coded by cell type and hemisphere; these visualizations reveal the dense, brain-wide wiring that enables precise computational emulation.

The choice of Drosophila melanogaster, the common fruit fly, as the target for the world’s first whole-brain emulation was not arbitrary, and it was not a consolation prize. It was the result of careful scientific reasoning about what was achievable, what was meaningful, and what would teach us the most.

A human brain contains approximately 86 billion neurons and an estimated 100 trillion synaptic connections. A mouse brain contains roughly 70 million neurons. Both are utterly beyond the reach of current connectome mapping technology at synapse resolution. But the adult Drosophila central brain, containing between 127,000 and 140,000 neurons depending on how you count, sits at a remarkable sweet spot. It is complex enough to support genuine behavioral diversity: vision, olfaction, taste, mechanosensation, associative learning, memory, navigation, courtship, aggression, feeding, and grooming. It is organized into dozens of anatomically distinct neuropils, each processing specific information streams, with long-range interneurons integrating signals across regions. It is, in every meaningful sense, a real brain. And as of 2024, it became the first brain of this complexity to be mapped completely at synapse resolution.

The significance of that complexity cannot be overstated. The only previously complete connectome belonged to Caenorhabditis elegans, the soil nematode, whose 302-neuron nervous system was first fully described by White, Southgate, Thomson, and Brenner in their landmark 1986 paper in Philosophical Transactions of the Royal Society B. The OpenWorm project subsequently built computational models and even embodied simulations of the C. elegans connectome, a genuine scientific achievement. But 302 neurons produce only the most rudimentary behavioral repertoire. The gap between a worm and a fly is not merely quantitative. It represents qualitatively different levels of neural organization: the fly brain has multi-layered sensory hierarchies, recurrent feedback loops, a central complex for navigation and spatial memory, mushroom bodies for associative learning, and a motor system sophisticated enough to coordinate six limbs simultaneously during walking and grooming. Copying the fly brain and making it work is a genuinely different class of problem than copying a worm.

The Decade-Long Mission to Map Every Synapse: The FlyWire Project

From raw electron microscopy to proofread connectome: the FlyWire pipeline (Nature, 2024). Every neuron and synapse was traced and validated by a global community of 127 institutions. Credit: Dorkenwald, Matsliah et al., Nature.

Before a brain can be emulated, it must be mapped. And mapping the fly brain completely at the resolution of individual synapses required one of the most extraordinary collaborative scientific efforts of the early 21st century.

The FlyWire project was built on a foundational dataset acquired in 2018 by Zheng, Lauritzen, Perlman, and colleagues, published in Cell. They used serial-section electron microscopy to image the complete brain of a single adult female Drosophila melanogaster, producing a dataset of approximately 100 teravoxels: 100 trillion individual volume elements, each capturing a cubic nanometer of brain tissue. The brain was cut into roughly 7,000 ultrathin sections, each imaged at nanometer resolution to capture the finest structural details of every neurite, every dendritic spine, every synaptic vesicle cluster. The raw data volume was so immense that simply storing it required infrastructure that had not previously existed for a project of this kind.

The challenge of converting that raw imaging data into a complete wiring diagram, a connectome, was, however, far greater than the challenge of acquiring it. Each neuron must be traced through hundreds or thousands of image slices, its branches identified and distinguished from those of neighboring neurons that may run in parallel for long distances before diverging. Synapses must be identified at the contact points between neurons. And errors in automated segmentation, where the algorithm incorrectly merges two neurons or incorrectly splits a single neuron, must be detected and corrected. A naive estimate suggested that the proofreading work alone, done entirely by humans, would require approximately 10,000 years of continuous effort. This was, obviously, infeasible.

The FlyWire team, co-led by H. Sebastian Seung’s and Mala Murthy’s groups at Princeton University, along with collaborators at the MRC Laboratory of Molecular Biology in Cambridge and the University of Vermont, solved this through an innovative combination of deep-learning segmentation and coordinated global crowdsourcing. Their automated segmentation system used convolutional neural networks trained on manually annotated sections to trace neuron boundaries through the full volume. A web-based platform called FlyWire then allowed trained proofreaders and expert neuroscientists worldwide to identify and correct segmentation errors, flagging merge errors (where two neurons were incorrectly joined) and split errors (where a single neuron was incorrectly divided) through an accessible browser interface. Over the course of the project, the global community contributed the equivalent of approximately 33 years of continuous human proofreading work, compressed into a manageable timeline through parallelization across 127 institutions worldwide.

The flagship result of this effort, published by Dorkenwald, Matsliah, Sterling, Schlegel, and colleagues as the lead paper of a coordinated suite of publications in Nature in October 2024, described the complete proofread connectome at materialization version 630: 139,255 neurons and over 50 million chemical synapses. Simultaneously, companion papers released the comprehensive cell-type annotations of all proofread neurons (Schlegel et al., Nature), a full catalog of the visual system cell types (Matsliah and Yu et al.), an analysis of whole-brain network topology and statistics (Lin et al., Nature), and the computational brain model by Shiu et al. that lies at the heart of the Eon demonstration. The entire collection represented the most comprehensive description of any brain ever assembled.

A critical parallel contribution came from Eckstein, Buhmann, Cook, and colleagues, who published in Cell a machine-learning classifier capable of predicting the neurotransmitter identity of every neuron in the FlyWire connectome directly from the ultrastructural morphology of its synaptic clefts in the EM images. This classifier distinguished among six neurotransmitter types (acetylcholine, GABA, glutamate, dopamine, octopamine, and serotonin) based solely on the shape, density, and arrangement of synaptic vesicles and postsynaptic densities visible in the electron micrographs. Its predictions achieved high accuracy when validated against ground-truth immunohistochemical labeling for known neurotransmitter types, and its outputs provided the chemical signs, excitatory or inhibitory, that the Shiu computational model needed to assign to each of the 50 million connections.

Building the Thinking Machine: Shiu et al.’s Whole-Brain Computational Model

Shiu et al. (Nature, 2024) LIF model validation: sensorimotor predictions matched real optogenetic and calcium-imaging experiments with 91–95% accuracy. The biological wiring (not shuffled connections) is what encodes behavior.

The step from a complete connectome to a working computational brain model is not trivial. A connectome is a static graph: neurons are nodes, synapses are weighted directed edges, and neurotransmitter identities assign signs to those edges. It describes the architecture of the brain but says nothing, by itself, about how that architecture produces dynamics. Converting it into a system that generates time-varying neural activity in response to inputs required both the right mathematical framework and the right parameter choices, and enough experimental validation to know whether the framework was actually capturing biological reality.

Philip K. Shiu, working as a postdoctoral researcher in Kristin Scott’s laboratory at the University of California, Berkeley, led this effort with Gabriella R. Sterne, Nico Spiller, Romain Franconville, and 21 additional co-authors. Their paper, "A Drosophila computational brain model reveals sensorimotor processing," published in Nature (volume 634, pages 210–219) on October 2, 2024, describes the model in complete detail. Shiu subsequently joined Eon Systems as senior scientist, bringing the model directly into the embodiment project.

Shiu’s model is built on the leaky integrate-and-fire (LIF) framework, the simplest biophysically grounded neuron model that captures the core integrate-and-threshold behavior of real neurons. Each of the 127,400 neurons is governed by two coupled first-order ordinary differential equations.

The membrane potential \(v_i\) evolves as:

\[\frac{dv_i}{dt} = \frac{g_i - (v_i - V_\text{rest})}{\tau_\text{mbr}}\]

where \(g_i\) is the total synaptic input conductance and \(\tau_\text{mbr}\) is the membrane time constant. Simultaneously, the conductance decays as:

\[\frac{dg_i}{dt} = -\frac{g_i}{\tau_\text{syn}}, \quad \tau_\text{syn} = 5\text{ ms}\]

When \(v_i\) reaches the firing threshold \(V_\text{thresh} = -45\text{ mV}\), the neuron emits a spike, its potential resets to \(V_\text{reset} = V_\text{rest} = -52\text{ mV}\), and it enters a 2.2 ms refractory period during which it cannot fire again.

Synaptic transmission is modeled with a 1.8 ms biological delay. Each arriving presynaptic spike increments \(g_i\) by a synaptic weight:

\[w_{j,i} = \text{FlyWire\_weight} \times \text{sign} \times W_\text{syn}\]

where sign is \(+1\) for excitatory and \(-1\) for inhibitory neurons, and \(W_\text{syn} = 0.275\text{ mV}\) is a global scaling factor.

V_thresh = -45 mV (firing threshold)
V_reset = V_rest = -52 mV (reset and resting potential)
τ_syn = 5 ms (synaptic decay time constant)
t_refract = 2.2 ms (refractory period)
t_delay = 1.8 ms (synaptic transmission delay)
W_syn = 0.275 mV (global synaptic scaling factor)

Note: All parameters were grounded in the known electrophysiology of Drosophila neurons, with \(W_\text{syn}\) specifically tuned to reproduce realistic population-level firing rates. The full model was implemented in Brian2, an open-source Python-based spiking neural network simulator built for biological fidelity and large-scale simulations.
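As a concrete illustration, the two update equations and the published constants can be sketched in a few lines of plain Python. This is a toy Euler integration of a single neuron, not Eon's Brian2 implementation, and the membrane time constant is an assumed illustrative value, since the article does not state it:

```python
import numpy as np

# Constants from the article (Shiu et al., Nature 2024).
# TAU_MBR is an ASSUMED illustrative value; the article does not state it.
V_REST = -52.0      # mV, resting and reset potential
V_THRESH = -45.0    # mV, firing threshold
TAU_MBR = 20.0      # ms, membrane time constant (assumption)
TAU_SYN = 5.0       # ms, synaptic conductance decay
T_REFRACT = 2.2     # ms, refractory period
DT = 0.1            # ms, Euler integration step

def simulate_lif(events, t_max=50.0):
    """Integrate one LIF neuron. `events` is a list of (time_ms, weight_mV)
    presynaptic arrivals; returns the output spike times in ms."""
    n_steps = int(round(t_max / DT))
    drive = np.zeros(n_steps)
    for t_ms, w in events:
        drive[int(round(t_ms / DT))] += w   # each arriving spike bumps g
    v, g = V_REST, 0.0
    refract_until = -1.0
    spikes = []
    for step in range(n_steps):
        t = step * DT
        g += drive[step]
        g -= DT * g / TAU_SYN                         # dg/dt = -g / tau_syn
        if t >= refract_until:
            v += DT * (g - (v - V_REST)) / TAU_MBR    # membrane equation
            if v >= V_THRESH:
                spikes.append(t)                      # spike emitted
                v = V_REST                            # reset potential
                refract_until = t + T_REFRACT         # 2.2 ms silence
    return spikes
```

A single strong presynaptic volley depolarizes the neuron past the -45 mV threshold, while one event at the unitary weight of 0.275 mV does not, which is why convergent input from many synapses, not single ones, drives spiking in the full network.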

A neuron was classified as inhibitory if more than 50% of its presynaptic output sites were predicted by the Eckstein classifier to release GABA or glutamate. This threshold-based classification reflects an important biological fact about the fly brain: unlike in vertebrates, where glutamate is the primary excitatory neurotransmitter, glutamate in the Drosophila central brain acts predominantly as an inhibitory neurotransmitter, activating glutamate-gated chloride channels that hyperpolarize postsynaptic neurons. Applying this rule across all 127,400 neurons yielded a distribution of approximately 55% cholinergic (excitatory), 24% glutamatergic (inhibitory), 14% GABAergic (inhibitory), and roughly 7% monoaminergic (dopaminergic, octopaminergic, or serotonergic neurons with predominantly modulatory functions).
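The 50% rule is simple enough to write down directly. A minimal sketch, assuming a hypothetical data layout of one predicted transmitter string per output site (the real pipeline reads the Eckstein classifier's outputs):

```python
from collections import Counter

# Fly-brain sign convention: GABA and glutamate act as inhibitory
# transmitters in the central brain; acetylcholine is excitatory.
INHIBITORY = {"gaba", "glutamate"}

def neuron_sign(output_site_transmitters):
    """Return -1 (inhibitory) if strictly more than 50% of a neuron's
    presynaptic output sites are predicted to release GABA or glutamate,
    else +1 (excitatory)."""
    counts = Counter(t.lower() for t in output_site_transmitters)
    total = sum(counts.values())
    inhib = sum(counts[t] for t in INHIBITORY)
    return -1 if total and inhib / total > 0.5 else +1
```

Note the strict inequality: a neuron whose output sites are exactly half GABAergic would still be scored excitatory under this reading of the rule.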

The network topology of the full connectome, analyzed in the companion paper by Lin and colleagues in Nature, revealed a rich-club organization: approximately 30% of all neurons are highly interconnected hub neurons that preferentially connect to one another. Within this rich-club, the analysis identified 676 broadcaster neurons (high out-degree, relatively low in-degree; neurons that widely distribute signals), 638 integrator neurons (high in-degree, relatively low out-degree; neurons that collect and compress signals from many sources), and over 37,000 neurons with balanced reciprocal connectivity. This rich-club architecture has direct functional implications for the LIF model: hub neurons serve as long-range amplifiers and synchronizers of activity, enabling sensory signals to propagate reliably through multiple processing layers from sensory input to motor output. Because the Shiu model uses the exact biological connectome rather than a statistical approximation, it preserves this rich-club topology completely.
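A degree-based classification in the spirit of the Lin et al. analysis can be sketched as follows; the 95th-percentile cutoff is invented for illustration and is not the published criterion:

```python
import numpy as np

def classify_hubs(edges, n_neurons, hi=0.95):
    """Label each neuron 'broadcaster' (high out-degree, modest in-degree),
    'integrator' (the reverse), or 'other'. The percentile cutoff `hi` is
    illustrative only; Lin et al. define their own criteria."""
    out_deg = np.zeros(n_neurons, dtype=int)
    in_deg = np.zeros(n_neurons, dtype=int)
    for src, dst in edges:                 # directed synaptic edges
        out_deg[src] += 1
        in_deg[dst] += 1
    out_hi = np.quantile(out_deg, hi)
    in_hi = np.quantile(in_deg, hi)
    labels = []
    for o, i in zip(out_deg, in_deg):
        if o >= out_hi and i < in_hi:
            labels.append("broadcaster")   # widely distributes signals
        elif i >= in_hi and o < out_hi:
            labels.append("integrator")    # collects from many sources
        else:
            labels.append("other")
    return labels
```

On a toy graph where neuron 0 fans out to everyone and neuron 3 receives from everyone, the function labels them broadcaster and integrator respectively.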

The model was validated primarily through two behavioral circuits: gustatory-driven feeding initiation and mechanosensory-driven antennal grooming. For the feeding circuit, Poisson-distributed spike trains at frequencies ranging from 10 to 260 Hz were injected into gustatory receptor neuron (GRN) populations, either sugar-sensing GRNs or water-sensing GRNs, replicating the natural range of GRN responses to tastant concentration in the living fly. The model’s predicted downstream neural activity was compared against calcium-imaging recordings and optogenetic perturbation experiments from the published literature and from the authors’ own new experiments. The results were striking. For sugar GRN stimulation at 100 Hz, the model identified 47 neurons collectively sufficient and 14 neurons individually required for reliable activation of MN9, the motor neuron driving proboscis extension (the fly’s feeding reflex). When these predicted neurons were targeted in the living fly using optogenetic stimulation, 10 out of 11 predicted activations correctly elicited rostrum extension, an accuracy of 91%. Across a confusion-matrix validation of 106 SEZ (subesophageal zone) cell types encompassing 164 testable predictions, the model achieved 84 to 91% accuracy. When the model predicted that combined sugar and water stimulation would be synergistic due to 250 to 391 shared downstream neurons, this was confirmed experimentally, a non-obvious prediction that the authors state they would not have thought to test without computational guidance. When bitter-sensing GRNs or Ir94e neurons were activated, the model correctly predicted multi-step inhibition of appetitive pathways.

One result deserves particular emphasis because it addresses the most fundamental question about whether the connectome itself, rather than some more general statistical property, encodes behavior. When the full biological connectome was used to simulate sugar GRN stimulation at 100 Hz, MN9 activated correctly in 100% of simulation runs. When synaptic weights were randomly shuffled across the network, preserving all the same neurons and all the same total connectivity statistics but destroying the specific pattern of who connects to whom, MN9 activated correctly in only 1 out of 100 shuffled simulations. This result is unambiguous: it is not the existence of connections that encodes the feeding reflex, nor their total number, nor their statistical distribution. It is the specific, precise, evolutionarily shaped arrangement of 50 million connections, exactly as the animal has them, that encodes the behavior. The wiring is the computation.
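The control can be sketched as a permutation of weights over a fixed edge list: every neuron, every connection, and the overall weight distribution survive, but the specific assignment of weights to neuron pairs is destroyed. This is one plausible reading of the shuffle; the paper specifies the exact protocol.

```python
import numpy as np

def shuffle_synapses(pre, post, weight, seed=0):
    """Shuffled-network control: keep every directed (pre, post) edge,
    but randomly permute which signed weight sits on which edge. Degree
    sequences and the weight distribution are preserved; the specific
    wiring pattern is not."""
    rng = np.random.default_rng(seed)
    return pre, post, rng.permutation(weight)
```

Running the same stimulation protocol on the original and shuffled weight vectors isolates the contribution of the specific wiring, since everything else about the two networks is statistically identical.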

For the mechanosensory validation, the model demonstrated that activating Johnston’s organ neuron subtypes JO-CE (sensitive to sound and vibration) versus JO-F (sensitive to static deflection and touch) produced distinct downstream activity patterns: specifically, JO-CE activation selectively engaged the premotor interneuron aBN1, a key node in the antennal grooming circuit, while JO-F activation selectively engaged aDN1, a descending neuron projecting to foreleg motor circuits. Both predictions were validated by two-photon calcium imaging experiments. This demonstrated that the model generalizes correctly beyond taste circuits to mechanosensory processing, the very circuits that Eon Systems subsequently connected to the physical fly body.

Building the Body: NeuroMechFly v2

NeuroMechFly v2 (Nature Methods, 2024): the anatomically accurate Drosophila body in MuJoCo with vision, olfaction, proprioception, and tarsal adhesion, exactly the physics body now controlled by the Shiu connectome model. Credit: Wang-Chen, Özdil, Ramdya et al.

A brain model that correctly predicts neural activity patterns is scientifically valuable. But to know whether it can actually drive naturalistic behavior, whether the computation it performs is genuinely sufficient to control a body in the physical world, you need to give it a body to control.

That body was provided by NeuroMechFly v2, developed by Sibo Wang-Chen, Pembe Gizem Özdil, Pavan Ramdya, and colleagues at EPFL’s Neuroengineering Laboratory, published in Nature Methods in 2024. NeuroMechFly v2 embeds an anatomically detailed Drosophila body model within MuJoCo, DeepMind’s open-source physics engine that uses generalized coordinates and constraint-based contact dynamics to simulate rigid-body systems with high fidelity. The v2 model introduced several critical anatomical upgrades over the original NeuroMechFly: updated joint models for both antennae with biologically accurate ranges of motion for pitch, roll, and yaw; a revised neck model that correctly captures head-to-thorax kinematics for head stabilization reflexes; thorax segment angles recalibrated against micro-CT volumetric reconstructions of adult Drosophila anatomy; and a tarsal adhesion model that toggles between stance phase (where van der Waals-like adhesive forces at tarsal pads maintain contact) and swing phase (where adhesion is released), enabling locomotion on sloped, vertical, and inverted surfaces.

The sensory apparatus of the simulated body is equally detailed. The compound eye is modeled as a fisheye ommatidial array with approximately 270 degrees of field of view, implementing a 7:3 ratio of yellow-type to pale-type photoreceptors that matches the known Drosophila photoreceptor distribution; the model thus produces spectrally differentiated visual input, not merely luminance signals. Bilateral olfactory sensors on the antennae produce concentration-gradient signals for odor-taxis. Joint angle encoders on every leg and body joint provide continuous proprioceptive feedback, and Boolean contact sensors on each tarsus provide ground-contact state information. These ascending signals, proprioception and contact, are the feedback channel that closes the sensorimotor loop, informing the brain model continuously about the current state of the body.
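The step from a continuous sensor reading to spikes can be sketched as rate-coded Poisson sampling. The linear value-to-rate mapping below is an illustrative assumption; only the 260 Hz ceiling echoes the top of the GRN input range used in the Shiu validation experiments:

```python
import numpy as np

def poisson_encode(value, v_min, v_max, rate_max=260.0,
                   dt=1e-4, duration=0.1, seed=0):
    """Turn a scalar sensor reading (e.g. a joint angle) into a binary
    spike train whose Poisson rate grows linearly from 0 Hz to rate_max
    Hz as the reading sweeps its range. The linear mapping is an
    illustrative assumption, not the published encoding."""
    rng = np.random.default_rng(seed)
    frac = np.clip((value - v_min) / (v_max - v_min), 0.0, 1.0)
    rate = rate_max * frac                               # spikes per second
    n_bins = int(round(duration / dt))
    return (rng.random(n_bins) < rate * dt).astype(int)  # 1 = spike in bin
```

A reading at the bottom of its range produces silence; a reading at the top produces a dense train, so the downstream network sees stimulus intensity as firing rate.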

The framework was validated in the v2 paper across a suite of demanding behavioral tasks: closed-loop visual object tracking in which head movements and steering maintained a small visual target in the center of the visual field; odor-taxis navigation toward an odor source using bilateral antennal concentration differences in a realistically simulated chemical plume; path integration using only idiothetic (self-motion) cues to return to a starting location after complex outbound trajectories; and connectome-constrained visual following in which T1 through T5 lamina neurons and Tm transmedullary neurons from the FlyWire connectome drove LC9 and LC10 lobula complex neurons to generate corrective steering commands. This last demonstration was directly relevant to the Eon integration: it showed that real connectome-derived circuit motifs, when inserted into the NeuroMechFly control architecture, could drive coherent and stable behavior in the physical simulation.

The Coordination Problem: How the Fly Brain Synchronizes Its Body Parts

Centralized grooming coordination in NeuroMechFly: head, antenna, and foreleg move in precise synchrony; amputation experiments proved the coordination is brain-driven, not peripheral. Credit: Özdil et al. / Ramdya Lab.

One of the less obvious challenges in building an embodied brain emulation is the problem of multi-limb coordination. A fly grooming its antennae does not simply move its forelegs independently toward its face. It simultaneously rotates its head toward the antenna being groomed, adjusts the angle of the antenna to meet the foreleg, and times the grooming stroke precisely with the collision between foreleg tibia and antenna. These three movements (head rotation, antennal adjustment, and foreleg stroke) must be synchronized with sub-millisecond precision. Understanding where in the brain this coordination is encoded was essential to making the embodied emulation work correctly.

This question was addressed directly by Pembe Gizem Özdil, Sibo Wang-Chen, Pavan Ramdya, and colleagues in a 2024 preprint deposited on bioRxiv. Using high-speed videography of real Drosophila antennal grooming, they extracted precise kinematics and discovered the tripartite synchronization pattern described above. When this kinematic pattern was replayed in NeuroMechFly v2 physics simulation, the biomechanically coordinated grooming produced significantly more effective cleaning (the foreleg stroke contacted a larger surface area of the antenna) compared to uncoordinated movement. This validated that the synchronization serves a genuine functional purpose rather than being a passive consequence of body mechanics.

To determine the neural mechanism, the team performed amputation and immobilization experiments in the living fly, removing or fixing individual body parts and recording the movements of the remaining intact parts during spontaneous grooming. The result was decisive: coordination of the remaining body parts persisted even after the removal of any single part from the coordinated movement. A fly with an immobilized antenna still showed appropriate head rotation timing. A fly with an immobilized foreleg still showed coordinated antennal movement. This rules out purely peripheral proprioceptive chaining as the mechanism: the coordination cannot be driven by each movement triggering a sensory signal that initiates the next movement, because the next movement still occurs even when the preceding one cannot happen. The coordination must therefore be centrally organized, encoded in interneurons that command multiple body-part motor modules simultaneously.

Mining the FlyWire connectome for the antennal grooming subnetwork, the authors identified a set of centralized interneurons connecting neck, antennal, and foreleg motor modules, and through a simulated activation screen in NeuroMechFly (systematically activating candidate interneurons and observing the resulting movements) confirmed two coupled circuit motifs that are sufficient to explain the observed coordination. The first is a recurrent excitatory subnetwork that drives contralateral antennal pitch during grooming: when the right antenna is being groomed, excitatory neurons activate the left-side antennal motor neurons to create the appropriate counter-movement. The second is a broadcast inhibitory neuron that suppresses ipsilateral antennal pitch during the grooming stroke, preventing the groomed antenna from moving in a direction that would deflect the foreleg. Together, these two motifs produce robust unilateral and bilateral grooming coordination without requiring peripheral sensory feedback, consistent with the amputation results. These same motifs were directly incorporated into the Eon embodied emulation, providing the centralized coordination that allows the brain model’s motor outputs to produce coherent, naturalistic multi-limb grooming sequences.
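The logic of the two motifs can be caricatured in a few lines. This is a toy rate model with invented weights, nothing like the spiking implementation, meant only to show how the pair of motifs biases the two antennae in opposite directions:

```python
def antennal_pitch_drive(groom_side, base=1.0, w_exc=0.6, w_inh=0.8):
    """Toy rate model of the two coordination motifs (weights invented):
    the recurrent excitatory motif boosts the CONTRAlateral antennal
    pitch drive, while the broadcast inhibitory neuron suppresses the
    IPSIlateral one during the grooming stroke."""
    drive = {"left": base, "right": base}
    contra = "left" if groom_side == "right" else "right"
    drive[contra] += w_exc                                   # excitation
    drive[groom_side] = max(0.0, drive[groom_side] - w_inh)  # inhibition
    return drive
```

Grooming the right antenna raises the left pitch drive and suppresses the right one, producing exactly the counter-movement and hold described above.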

March 7, 2026: The Loop Closes

The embodied fly in action: motor commands from the biological connectome drive joint torques in MuJoCo; proprioception and contact forces flow back. This is the first time a complete biological brain has closed the perception–action loop in simulation.

Eon Systems PBC, a San Francisco public benefit corporation co-founded by Alexander D. Wissner-Gross, with Philip K. Shiu as senior scientist, announced its demonstration on March 7, 2026, with Wissner-Gross posting on X and writing a detailed Substack piece titled “The First Multi-Behavior Brain Upload.” The integration brought three independently validated components into a single running system: the Shiu LIF brain model with all 127,400 neurons, 50 million synaptic connections, and ML-predicted neurotransmitter identities running in Brian2; the NeuroMechFly v2 anatomically accurate fly body in MuJoCo with full sensory apparatus; and the Özdil grooming coordination motifs governing multi-limb synchronization.

The data flow of the integrated system operates as a continuous real-time loop. Sensory signals from the MuJoCo simulation (structured visual input from the compound eye model, olfactory concentration signals from the antennal sensors, joint angles from every leg and body joint, and contact states from each tarsus) are encoded as Poisson spike trains and injected into the appropriate sensory neuron populations in the LIF brain model. These spikes then propagate neuron by neuron through the full 127,400-neuron network, following the exact synaptic weights and excitatory or inhibitory signs of the biological connectome, through sensory processing layers, interneuron populations, premotor circuits, and finally to motor neuron populations, without any external control signal, reward function, or behavioral policy intervening at any stage. The firing rates of motor neuron populations are decoded into joint torque commands and transmitted to the MuJoCo actuators controlling each leg joint, antennal joint, and neck joint. MuJoCo advances the physical simulation one timestep, computing contact forces, body dynamics, friction, and adhesion based on the commanded torques. Updated proprioceptive signals are read from the simulation and re-encoded as ascending feedback spike trains, completing the cycle.
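The shape of that loop, independent of any particular simulator, can be sketched with stub components standing in for the Brian2 network and the MuJoCo body. Everything here is scaffolding invented for illustration; the real system exchanges spike trains and torques through those libraries' own interfaces:

```python
import numpy as np

class StubBrain:
    """Stand-in for the Brian2 LIF network: a fixed random linear map
    from sensory spike counts to non-negative motor firing rates."""
    def __init__(self, n_sensory=4, n_motor=4, seed=0):
        self.w = np.random.default_rng(seed).normal(0.0, 0.3,
                                                    (n_motor, n_sensory))
    def step(self, sensory_spikes):
        return np.maximum(0.0, self.w @ sensory_spikes)

class StubBody:
    """Stand-in for the MuJoCo body: torques nudge joint angles, and the
    resulting angles are the proprioceptive reading fed back out."""
    def __init__(self, n_joints=4):
        self.q = np.zeros(n_joints)
    def step(self, torques, dt=0.01):
        self.q += dt * torques          # crude single-integrator physics
        return self.q

def run_loop(brain, body, n_steps=100):
    """One encode -> brain -> decode -> physics -> feedback cycle per step."""
    obs = body.q
    for _ in range(n_steps):
        spikes = np.clip(obs * 10.0, 0.0, 50.0) + 1.0  # toy rate encoding
        rates = brain.step(spikes)                     # "neural" processing
        torques = 0.1 * rates                          # population decoding
        obs = body.step(torques)                       # physics + feedback
    return obs
```

The point of the sketch is the dataflow: nothing outside the encode-process-decode-simulate cycle injects control, which is the property the article emphasizes about the Eon system.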

The behavioral repertoire observed in the demonstration included coordinated hexapod locomotion with both tripod and metachronal walking gaits, spontaneous postural correction in response to perturbation, initiation and execution of full antennal grooming sequences with the tripartite synchronization described by Özdil et al., and natural transitions between walking and stationary states. Every behavior arose from the same running brain model; there was no switching between different neural circuits or controllers. This is precisely what happens in a living fly: walking, grooming, and balance are different motor programs that coexist in the same brain and are selected and executed by the same biological circuits depending on the moment-to-moment state of the animal and its environment.

What Makes This Categorically Different From Everything That Came Before

Contrast: DeepMind/Janelia RL-controlled fly (Nature, 2025). Beautiful movement, but learned from video data, not driven by the real biological connectome. Eon’s fly is the actual ghost in the machine.

To appreciate the significance of the Eon demonstration, it is worth placing it precisely against the landscape of prior work.

In 2025, Vaxenburg, Siwanowicz, Merel, and colleagues at DeepMind and the Janelia Research Campus published in Nature a whole-body physics simulation of fruit fly locomotion and flight in MuJoCo that is visually spectacular: the simulated fly walks, runs, stumbles, and recovers with kinematic statistics that closely match those of real flies. But the controller driving that simulation is a Soft Actor-Critic reinforcement learning agent trained on kinematic reference data extracted from high-speed video recordings of real fly behavior. The weights of the RL neural network have no relationship whatsoever to the anatomy of the fly nervous system. The fly body is biological; the controller is not. It is, in essence, a very sophisticated motion-capture replay system with learned interpolation. It produces biological-looking behavior by learning to imitate biological movement, not by running the biological computation that originally produced that movement.

The OpenWorm project, which has been running since 2011, does use the actual C. elegans connectome (all 302 neurons) and has produced embodied simulations of nematode locomotion and chemotaxis that represent genuine scientific contributions. But as noted above, 302 neurons support only the most rudimentary behavioral diversity. The C. elegans nervous system lacks the multi-layered sensory hierarchies, the segregated memory and learning circuits, and the multi-limb coordination architecture present in the fly. The OpenWorm project proved the concept that a biological connectome can drive a simulated body. It could not demonstrate that this concept scales to a brain of genuine complexity.

Eon’s system transcends both limitations simultaneously. It uses a brain of genuine complexity (127,400 neurons, 50 million connections, real biological circuitry validated against optogenetics and calcium imaging) and it produces not a learned imitation of biological behavior but the biological computation itself. As Wissner-Gross described it, the distinction is between a ghost and a recording of a ghost: DeepMind’s fly is a recording; Eon’s fly is the ghost.

What This Brain Simulation Still Can't Do

The roadmap to whole-brain emulation (2025 onward). Fly (140k neurons) → Mouse (70 million) → Human. Eon Systems is already building the mouse datasets. Credit: adapted from NotebookLM / whole-brain emulation pathway visuals.

Shiu and his colleagues have been candid about the simplifications built into the current system. The LIF model treats all neurons of the same neurotransmitter type as computationally equivalent, ignoring the enormous morphological diversity of real neurons: the branching patterns of dendrites, the distribution of synaptic inputs across dendritic compartments, the specialized properties of axon initial segments. Gabriella Sterne, co-author of the Shiu paper, noted at the time of the connectome's publication that "we're not there yet because one thing this connectome lacks is information about how the motor neurons connect to physical features of the body like the muscles," a gap that the current Eon implementation bridges through a population-decoding step that maps motor neuron firing rates to joint torques, a biologically plausible but not biologically exact interface. The current model also does not capture volume transmission by monoaminergic modulators, the effects of neuropeptides, glial contributions to neural dynamics, short-term synaptic plasticity, or any form of learning-related plasticity.
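To make the description above concrete, here is a minimal, hypothetical sketch of such a motor interface: a leaky integrate-and-fire (LIF) update followed by a linear population decoder that turns motor-neuron firing rates into joint torques. Every name, parameter value, and the random decoder are illustrative assumptions, not the Eon implementation.

```python
import numpy as np

def lif_step(v, i_syn, dt=1e-3, tau=0.02, v_rest=-0.07, v_thresh=-0.05, v_reset=-0.07):
    """Advance LIF membrane voltages one time step; return (new_v, spike_mask)."""
    v = v + (-(v - v_rest) + i_syn) * (dt / tau)   # leak toward rest plus synaptic drive
    spikes = v >= v_thresh                          # which neurons crossed threshold
    v = np.where(spikes, v_reset, v)                # reset the neurons that spiked
    return v, spikes

def decode_torques(rates, weights):
    """Population decoding: each joint torque is a weighted sum of motor rates."""
    return weights @ rates                          # (n_joints, n_motor) @ (n_motor,)

rng = np.random.default_rng(0)
n_motor, n_joints = 100, 6
v = np.full(n_motor, -0.07)                         # start all neurons at rest
spike_counts = np.zeros(n_motor)
for _ in range(200):                                # 200 ms of 1 ms steps
    v, s = lif_step(v, i_syn=rng.normal(0.025, 0.01, n_motor))
    spike_counts += s
rates = spike_counts / 0.2                          # spikes/s over the 200 ms window
torques = decode_torques(rates, rng.normal(0.0, 1e-4, (n_joints, n_motor)))
print(torques.shape)                                # one torque per simulated joint
```

The decoder here is a single random linear map; the point is only the shape of the interface, with spiking activity reduced to rates and rates mapped onto the body's actuated degrees of freedom.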

These are not trivial omissions. But the model's greater-than-90% accuracy on sensorimotor prediction, and its ability to produce naturalistic behavior in the embodied simulation, together validate that the biological connectome alone, without morphological detail, without modulators, without plasticity, encodes a remarkable fraction of the brain's behavioral computation. Each additional layer of biological detail that can be incorporated will increase fidelity further. The current system is not the final word; it is the opening sentence of a much longer conversation.

The path forward is clear in its direction if daunting in its scale. A mouse brain, with approximately 70 million neurons, is roughly 560 times larger than the fly brain. Organizations including E11 Bio, a focused research organization dedicated to mouse connectome mapping, are developing next-generation approaches combining expansion microscopy (which physically swells brain tissue by factors of 4 to 20, making features that would otherwise require electron microscopy visible under conventional light microscopes) with multiplexed fluorescence labeling, high-throughput light-sheet imaging, and barcoded neuron identity systems. Eon Systems has publicly committed to assembling the foundational datasets for a mouse brain emulation, combining large-scale connectome mapping with tens of thousands of hours of calcium and voltage imaging data to build functional constraints on the connectivity model.
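The expansion-microscopy arithmetic is worth making explicit. A quick back-of-envelope, assuming a typical ~300 nm light-microscope diffraction limit (an illustrative figure, not E11 Bio's specification), shows how swelling factors of 4 to 20 bring synapse-scale features into optical reach:

```python
# Physically swelling the tissue divides the feature size a light microscope
# must resolve. The 300 nm diffraction limit below is an assumed typical value.
diffraction_limit_nm = 300
effective = {f: diffraction_limit_nm / f for f in (4, 10, 20)}
for factor, res in effective.items():
    print(f"{factor}x expansion -> ~{res:.0f} nm effective resolution")
# At 20x expansion, ~15 nm features become resolvable, approaching the scale
# needed to trace fine neurites and identify synapses.
```
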

Shiu put his own expectation plainly: "This really suggests that getting a mouse connectome, and eventually a human connectome, will be incredibly valuable. We can imagine a world where we can simulate a mouse brain, or eventually a human brain, and really get fundamental insights into the causes of various mental health disorders and about how the brain works."

Why This Changes Neuroscience Forever

The immediate scientific applications of a validated, embodied fly brain emulation are considerable. The model can serve as a platform for systematic virtual perturbation experiments (silencing specific neuron populations, adding or removing specific connections, modifying neurotransmitter parameters) that generate testable predictions for subsequent experiments in living animals. This could dramatically accelerate the pace of circuit-level neuroscience by allowing researchers to narrow the candidate mechanisms for a behavior from thousands of possibilities to a handful of high-confidence targets before touching a single animal.
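A virtual perturbation experiment of this kind can be illustrated on a toy network. The sketch below runs a simple firing-rate model on a random signed "connectome" weight matrix, with and without a clamped-silent population, and measures which downstream neurons the perturbation reaches. The network, dynamics, and parameters are illustrative assumptions, not the published fly model.

```python
import numpy as np

def rate_dynamics(W, ext, silenced=(), steps=300, dt=0.1):
    """Relaxed firing-rate dynamics: dr/dt = -r + relu(W @ r + ext)."""
    r = np.zeros(W.shape[0])
    ids = np.asarray(list(silenced), dtype=int)
    for _ in range(steps):
        r = r + dt * (-r + np.maximum(0.0, W @ r + ext))
        if ids.size:
            r[ids] = 0.0   # clamp the silenced population, as optogenetic inhibition would
    return r

rng = np.random.default_rng(1)
n = 50
W = rng.normal(0.0, 0.05, (n, n))      # weak signed random connectivity (toy "connectome")
ext = rng.uniform(0.5, 1.0, n)         # tonic external drive to every neuron
baseline = rate_dynamics(W, ext)
perturbed = rate_dynamics(W, ext, silenced=range(5))
effect = np.abs(baseline - perturbed)  # how far the silencing propagated downstream
print(effect.argmax())                 # index of the most-affected neuron
```

In a connectome-scale version the same comparison, run over many candidate populations, is what would let a researcher rank circuit hypotheses before any in-vivo experiment.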

Next target: mouse brain connectome using expansion microscopy (E11 Bio roadmap). The same pipeline that worked for the fly is already being scaled 560× larger, but the principle is proven.

In medicine, the implications extend to drug and intervention screening at the circuit level. Because the model correctly captures sensorimotor transformations, modifications that mimic the effects of a pharmacological compound (altering the excitability of specific neuron types, changing synaptic transmission kinetics) can be tested for their effects on behavior in the simulation before any animal or clinical experiment. This is particularly valuable for neurological and psychiatric conditions where the circuit-level mechanisms are poorly understood. A model of dopaminergic neuron degeneration could reveal which downstream circuits are most disrupted and which behavioral outputs are most sensitive, generating specific, testable predictions for intervention strategies.

For artificial intelligence, the demonstration carries a different kind of implication. The fly brain model produces stable, multi-behavioral locomotion and grooming from zero training data, zero reward signal, and zero motion-capture reference. It does this because the biological connectome encodes, through millions of years of evolutionary optimization, extraordinarily efficient solutions to the problems of sensorimotor control, multi-limb coordination, and behavioral state management. Current deep reinforcement learning systems require millions of training steps and massive reward engineering to approximate the behavioral competence that the fly achieves from its first second of existence. Understanding the organizational principles responsible (rich-club topology, centralized coordination motifs, the specific balance of recurrent excitation and broadcast inhibition) may provide fundamentally new architectural principles for AI systems that are more sample-efficient, more robust, and more capable of behavioral generalization than current approaches.
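One of these organizational principles, rich-club topology, is directly measurable from a connectivity matrix. The sketch below computes an unnormalized rich-club coefficient (edge density among high-degree nodes) on a toy directed network with a densely wired hub core; the network and thresholds are illustrative assumptions, not fly data.

```python
import numpy as np

def rich_club(adj, k):
    """Edge density among nodes whose total (in + out) degree exceeds k."""
    deg = adj.sum(axis=0) + adj.sum(axis=1)
    rich = np.flatnonzero(deg > k)
    if rich.size < 2:
        return 0.0
    sub = adj[np.ix_(rich, rich)]
    return (sub.sum() - np.trace(sub)) / (rich.size * (rich.size - 1))

rng = np.random.default_rng(2)
n, core = 60, 10
adj = (rng.random((n, n)) < 0.05).astype(int)          # sparse background wiring
adj[:core, :core] = rng.random((core, core)) < 0.7     # densely wired hub core
np.fill_diagonal(adj, 0)                               # no self-connections

overall = adj.sum() / (n * (n - 1))                    # whole-network edge density
phi = rich_club(adj, k=10)                             # density among high-degree nodes
print(round(overall, 3), round(phi, 3))                # the hub core is far denser
```

A rich-club coefficient well above the overall density, as here, is the signature of a small set of hubs that are wired to each other far more tightly than chance, the kind of centralized coordination motif the text describes.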

And beyond all of these near-term applications lies the longer horizon. If the pipeline that worked for a fly scales to a mouse, from a mouse to a primate, and from a primate to a human; if the same logic of "map every connection, build a computational model, validate against biology, connect to a physical substrate" holds across six orders of magnitude of neural complexity; then the questions that follow are among the most profound that humanity has ever faced. Questions about the nature of consciousness, the continuity of personal identity, the moral status of digital entities, and the meaning of death in a world where the physical substrate of a mind can be preserved and run indefinitely. These questions deserve serious, open, and rigorous engagement from ethicists, philosophers, legal scholars, and the public now, while the technology is in its earliest stages and while there is still time to build the frameworks that will govern what comes next.

The Moment Brain Simulation Became Real

It started with a 100-teravoxel electron microscope dataset of a brain smaller than a grain of sand. It required a decade of international collaboration, 33 years of equivalent human proofreading work, machine-learning classifiers trained on synaptic ultrastructure, a 127,400-neuron mathematical model validated against optogenetics and calcium imaging, a physically accurate simulated body with compound eyes and adhesive feet and proprioceptive joints, and a closed-loop architecture that lets sensation drive computation and computation drive movement and movement drive sensation again, without end.

On March 7, 2026, it walked.

Not because it was programmed to walk. Not because it was trained to walk. Because the biological circuits that evolution built to make a fruit fly walk were copied faithfully enough, and connected to a body accurately enough, that walking emerged on its own as it has emerged, in living flies, for tens of millions of years.

The ghost has found its machine. And the machine is just getting started.


References

  1. Shiu, P. K. et al. A Drosophila computational brain model reveals sensorimotor processing. Nature 634, 210–219 (2024). (Primary source for the 127,400-neuron LIF model, 91–95% motor prediction accuracy, full methods and validation.)
  2. Wang-Chen, S. et al. NeuroMechFly v2: simulating embodied sensorimotor control in adult Drosophila. Nature Methods (2024). (Full biomechanical MuJoCo framework, vision/olfaction/ascending feedback, connectome-constrained visual subnetworks).
  3. Özdil, P. G. et al. Centralized brain networks underlie body part coordination during grooming. bioRxiv (2024). (Connectome-derived grooming motifs; efficiency validated in NeuroMechFly biomechanics.)
  4. Dorkenwald, S. et al. (FlyWire consortium papers). Multiple companion Nature articles (2024) detailing the core connectome assembly.
  5. Vaxenburg, R. et al. Whole-body physics simulation of fruit fly locomotion. Nature (2025). (DeepMind/Janelia RL baseline for contrast).