Open Source Research Project

Embodied Drosophila

The first whole-brain connectome-driven simulation of a fruit fly in a physics-based biomechanical body. 138,639 spiking neurons. Compound-eye vision. Olfaction. Gustation. Flight. All from the real connectome.

138,639 LIF Neurons · 5M+ Synapses · 6 Sensory Systems · 0.1ms Timestep
View on GitHub · Quick Start

What is Embodied Drosophila?

A complete virtual fruit fly where every behavior emerges from the connectome — no hardcoded rules, no if-else chains for actions.

🧠

Whole-Brain Simulation

138,639 Leaky Integrate-and-Fire neurons running on GPU (PyTorch) at 0.1ms resolution. Connectivity from FlyWire v783 — the most complete Drosophila connectome.
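The core update for such a network can be sketched in a few lines of PyTorch: one sparse matrix-vector product for synaptic input, a leak term, a threshold, and a reset. The time constant, threshold, and reset values below are illustrative placeholders, not the project's actual parameters:

```python
import torch

def lif_step(v, spikes_in, w, dt=1e-4, tau=0.02,
             v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """One Euler step of a leaky integrate-and-fire population.

    v: (N,) membrane potentials; spikes_in: (N,) 0/1 spike indicators
    from the previous step; w: sparse (N, N) synaptic weight matrix.
    All parameter values are illustrative, not the project's.
    """
    # Synaptic input: one sparse matrix-vector product per step
    i_syn = torch.sparse.mm(w, spikes_in.unsqueeze(1)).squeeze(1)
    # Leak toward rest, then integrate the synaptic current
    v = v + (dt / tau) * (v_rest - v) + i_syn
    # Threshold crossing emits a spike; crossed neurons reset
    spikes = (v >= v_thresh).float()
    v = torch.where(spikes.bool(), torch.full_like(v, v_reset), v)
    return v, spikes
```

Stepping this 10,000 times covers one simulated second at the 0.1 ms resolution quoted above.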

🐞

Biomechanical Body

NeuroMechFly v2 with 6 articulated legs, contact sensors, adhesion pads, and compound eyes. Physics simulated by MuJoCo at 10kHz.

👁

Compound-Eye Vision

750 ommatidia per eye with a full motion detection cascade (T1-T5 neurons). Looming detection via LC4 triggers escape through the Giant Fiber.
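A minimal sketch of the looming computation LC4 performs: an object on a collision course produces a retinal image whose relative expansion rate (size change divided by current size) blows up, while a merely translating object's does not. The threshold and timestep here are arbitrary illustrations, not the simulation's values:

```python
def looming_eta(sizes, dt):
    """Relative expansion rate of a retinal image.

    sizes: angular size of the object over consecutive frames (radians).
    Returns d(size)/dt divided by current size; large positive values
    indicate an approaching (looming) object.
    """
    expansion = (sizes[-1] - sizes[-2]) / dt
    return expansion / max(sizes[-1], 1e-9)

def lc4_escape(sizes, dt=1e-3, eta_thresh=5.0):
    """True when relative expansion exceeds an illustrative threshold,
    standing in for LC4 driving the Giant Fiber."""
    return looming_eta(sizes, dt) > eta_thresh
```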

👃

Olfaction & Chemotaxis

~2,600 olfactory receptor neurons with bilateral gradient sensing. The fly navigates toward food and away from danger using real antenna geometry.
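Bilateral gradient sensing reduces to comparing the concentration at the two antennae and steering toward (or away from) the stronger side. A sketch, with invented gain and sign conventions:

```python
def turn_bias(c_left, c_right, valence=+1.0, gain=1.0):
    """Steering command from a bilateral antennal comparison.

    c_left / c_right: odor concentration sampled at each antenna.
    valence: +1 for attractive odors, -1 for aversive ones.
    Returns a signed bias: positive = turn left, negative = turn right.
    Normalizing by total concentration makes the bias contrast-based,
    so it works across absolute odor intensities.
    """
    total = c_left + c_right
    if total <= 0.0:
        return 0.0  # no odor, no bias
    return valence * gain * (c_left - c_right) / total
```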

👅

Gustation & Feeding

Tarsal taste detection through leg contact. Sugar triggers proboscis extension; bitter compounds trigger avoidance. Proboscis extends via a dynamic hinge joint.
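The sugar/bitter arbitration described above can be sketched as a simple priority rule. The assumption that bitter vetoes sugar, and the firing-rate thresholds, are illustrative choices for this sketch, not the project's circuit:

```python
def proboscis_command(sugar_rate, bitter_rate,
                      sugar_thresh=10.0, bitter_thresh=5.0):
    """Map tarsal GRN firing rates (Hz) to a proboscis action.

    Bitter input vetoes extension (an assumption of this sketch;
    the real SEZ circuitry integrates both channels).
    """
    if bitter_rate > bitter_thresh:
        return "retract"   # aversion: withdraw from bitter contact
    if sugar_rate > sugar_thresh:
        return "extend"    # proboscis extension reflex
    return "rest"
```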

✈️

Virtual Flight

Giant Fiber activation triggers takeoff. The fly lifts off, hovers, steers via descending neurons, and lands with a controlled descent. 3D forces applied to the thorax.
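A hedged sketch of how such a thorax force might be computed: gravity compensation plus a PD altitude controller, producing the 6D vector that would be written into MuJoCo's `xfrc_applied` (a real MuJoCo field holding per-body Cartesian forces and torques). The mass and gains below are placeholders:

```python
import numpy as np

def thorax_wrench(z, vz, z_target, mass=1e-6, g=9.81,
                  kp=5e-5, kd=2e-5):
    """6D force/torque for the thorax body.

    Gravity compensation plus PD control on altitude. In a simulation
    loop this vector would be assigned to
    data.xfrc_applied[thorax_body_id]; mass and gains are placeholders.
    """
    lift = mass * g + kp * (z_target - z) - kd * vz
    return np.array([0.0, 0.0, lift, 0.0, 0.0, 0.0])
```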

System Architecture

Brain and body run at different timescales and communicate through descending neurons, just like the real fly.

```
BRAIN (GPU / PyTorch)                      BODY (MuJoCo / flygym)
┌─────────────────────────────────────┐    ┌───────────────────────────┐
│ 138,639 LIF neurons                 │    │ NeuroMechFly v2           │
│ 5M+ synapses (sparse GPU)           │    │ 6 legs x 3 joints         │
│                                     │    │ compound eyes (2 x 750)   │
│ Visual: T1─T2─T3─T4/T5─LC4─GF       │    │ contact sensors           │
│ Olfactory: ORN─PN─KC─MBON           │ DN │ adhesion pads             │
│ Gustatory: GRN─SEZ─MN               │───►│ proboscis (hinge joint)   │
│ Somatosensory: mechano─IN─MN        │    │ looming arena             │
│ Flight: GF ─► xfrc_applied          │    │ taste/odor zones          │
└─────────────────────────────────────┘    └───────────────────────────┘
```
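This coupling amounts to a sense-think-act loop: sensors are encoded into spike drive, the brain is stepped, descending-neuron activity is decoded into motor commands, and physics advances. The function names below are hypothetical stand-ins, not the project's API:

```python
def run_coupled(n_steps, brain_step, physics_step, encode, decode,
                state, obs):
    """Generic brain-body loop with pluggable components.

    encode: sensory observation -> spike drive
    brain_step: (brain state, spike drive) -> (new state, DN rates)
    decode: DN rates -> motor commands
    physics_step: motor commands -> next observation (one MuJoCo step)
    All callables are stand-ins for the real modules.
    """
    for _ in range(n_steps):
        spikes_in = encode(obs)
        state, dn_rates = brain_step(state, spikes_in)
        action = decode(dn_rates)
        obs = physics_step(action)
    return state, obs
```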

Emergent Behaviors

All behaviors arise from connectome-driven neural activity. The brain-body bridge translates descending neuron firing rates into locomotion modes.

| Behavior | Neural Pathway | Trigger |
| --- | --- | --- |
| Walking | DN rates → CPG modulation | Default locomotion (tripod/tetrapod gait) |
| Escape Flight | LC4 → Giant Fiber → DNs → xfrc_applied | Looming object (expanding retinal image) |
| Chemotaxis | ORN → PN → KC → MBON → DN turn | Odor gradient (bilateral antenna comparison) |
| Feeding | GRN → SEZ → MN9 → proboscis extension | Tarsal contact with sugar zone |
| Aversion | Bitter GRN → SEZ → avoidance motor | Tarsal contact with bitter zone |
| Grooming | AMMC → antennal MN → leg sweep | Antennal mechanosensory activation |
| Courtship Song | P1 → pIP10 → vPR → wing MN | P1 neuron activation (male → female signal) |
| Tactile Escape | Mechanoreceptor → ascending IN → DN | Sudden high contact force (>35 N) |
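The translation from descending-neuron firing rates to a locomotion mode can be sketched as thresholded priority arbitration, with higher-priority behaviors preempting lower ones and walking as the default. The rate names and thresholds below are invented for illustration:

```python
def locomotion_mode(dn_rates):
    """Pick a behavior from descending-neuron rates (Hz).

    Higher-priority behaviors preempt lower ones; walking is the
    default. Keys and thresholds are illustrative, not the project's.
    """
    priority = [
        ("escape_flight", dn_rates.get("GF", 0.0) > 50.0),
        ("feeding",       dn_rates.get("MN9", 0.0) > 20.0),
        ("turn",          abs(dn_rates.get("DN_turn", 0.0)) > 10.0),
    ]
    for mode, active in priority:
        if active:
            return mode
    return "walking"
```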

Visual Processing Pipeline

The compound eye vision system implements the biological motion detection cascade found in the Drosophila optic lobe.

1. Retina: 750 ommatidia per eye
2. T1 (Lamina): luminance → contrast
3. T2 (Medulla): temporal derivative
4. T4/T5: directional motion energy
5. LC4 → GF: looming → escape
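The T4/T5 stage is classically modeled as a Hassenstein-Reichardt correlator: each ommatidium's signal is delayed and multiplied with its neighbor's, and the mirror-image product is subtracted. A minimal NumPy version over a 1-D strip of ommatidia, using a one-frame delay as a stand-in for the biological low-pass filter:

```python
import numpy as np

def motion_energy(frames):
    """Directional motion energy from a (T, N) luminance array:
    T frames over N neighboring ommatidia. Positive output means
    rightward motion across the strip, negative means leftward."""
    frames = np.asarray(frames, dtype=float)
    delayed, current = frames[:-1], frames[1:]
    rightward = delayed[:, :-1] * current[:, 1:]  # left pixel fired earlier
    leftward = delayed[:, 1:] * current[:, :-1]   # right pixel fired earlier
    return (rightward - leftward).sum()
```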

Quick Start

Get the simulation running in under 5 minutes.

```shell
# Clone the repository
git clone https://github.com/erojasoficial-byte/fly-brain.git
cd fly-brain

# Create conda environment
conda env create -f environment.yml
conda activate brain-fly
pip install flygym mujoco

# Run the embodied simulation with visuals and brain monitor
python fly_embodied.py --visual --monitor

# Enable virtual flight
python fly_embodied.py --visual --monitor --flight

# Add olfactory and gustatory stimuli
python fly_embodied.py --visual --monitor --flight --sugar 10,0 --odor attractive,15,5
```

Neural Simulation Benchmarks

The project includes benchmarks comparing four GPU/CPU frameworks for simulating the same 138K-neuron network.

| Framework | Backend | Status |
| --- | --- | --- |
| Brian2 | C++ standalone (multi-core CPU) | Ready |
| Brian2CUDA | CUDA standalone (GPU) | Ready |
| PyTorch | Sparse CUDA (GPU) | Ready (used in the embodied simulation) |
| NEST GPU | Custom CUDA kernel (user_m1) | Ready |
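A framework-agnostic way to compare backends is to time the same number of network steps after a warm-up, which matters on GPU where the first calls include compilation and host-device transfers. This harness is a generic sketch, not the project's benchmark code:

```python
import time

def time_per_step(step_fn, n_steps=1000, warmup=100, sync=None):
    """Mean wall-clock seconds per simulation step.

    step_fn: callable advancing the network by one timestep.
    sync: optional barrier (e.g. torch.cuda.synchronize) so that
    asynchronous GPU work is included in the measurement.
    """
    for _ in range(warmup):
        step_fn()          # warm up: JIT, caches, GPU transfers
    if sync:
        sync()
    t0 = time.perf_counter()
    for _ in range(n_steps):
        step_fn()
    if sync:
        sync()
    return (time.perf_counter() - t0) / n_steps
```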

Contribute to Embodied Drosophila

This is an open research project. Whether you're a neuroscientist, roboticist, ML engineer, or student — you're welcome to contribute.

New Sensory Modalities

Auditory (Johnston's organ), hygrosensation, thermosensation, gravity sensing

Circuit Analysis

Identify and validate specific neural circuits in the FlyWire connectome

Behavioral Validation

Compare simulated behaviors against real Drosophila experimental data

Performance & Scale

GPU optimization, multi-GPU support, neuromorphic hardware ports (Loihi, SpiNNaker)

Learning & Memory

Implement mushroom body plasticity, associative learning, habituation

Visualization

Better real-time monitors, VR integration, data dashboards, 3D neural activity maps

Open an Issue · Submit a PR

Technical Papers

Detailed documentation of the complete system, from neural model to biomechanics.

English Paper

Full technical paper covering architecture, sensory systems, virtual flight, and emergent behaviors.


Read on GitHub →

Paper in Spanish

Full technical paper, in Spanish, covering the architecture, sensory systems, virtual flight, and emergent behaviors.


Read on GitHub →