Computational neuroscience sits at the intersection of two massive data gathering efforts. The BRAIN Initiative in the US and the Human Brain Project in Europe have poured billions of dollars into electrophysiology, connectomics, and imaging over the past decade. The Allen Institute publishes gene expression and single-cell models openly. Connectomics projects like MICrONS and FlyWire have released millions of neurons and tens of millions of synapses at nanometer resolution.
All of that data eventually needs to meet a simulator. You cannot understand what a circuit computes by looking at its wiring diagram alone: you have to put it into motion and watch what it does. For decades that has meant installing NEURON or Brian2, configuring MPI or OpenMP backends, and running jobs on whatever cluster you could access. SciRouter's neuroscience API removes that infrastructure step entirely.
You describe the circuit; the API runs it on the right backend (CPU for small jobs, GPU for large ones) and returns the spiking activity, voltage traces, or summary statistics you asked for.
NEURON: biophysical detail and compartmental models
NEURON has been the standard tool for detailed neuron modeling since the early 1990s. It lets you build compartmental models where each neuron is divided into hundreds or thousands of electrical compartments, each with its own mix of ion channels governed by Hodgkin-Huxley-style kinetics. You can reproduce experimental voltage traces with high fidelity if you have the right parameters.
The downside is that NEURON is not easy to learn. It has its own scripting language (hoc), a Python interface that wraps it, and a large collection of mechanisms specified in a domain-specific language called NMODL. Setting up a serious simulation takes weeks to months for a newcomer. The API hides all of that: you pick a pretrained model from the library (say, an Allen Institute layer 5 pyramidal cell) and run simulations against it with a single call.
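To make the single-call claim concrete, here is a minimal sketch of what such a request might look like. The endpoint name (`/neuro/simulate-cell`) and the model ID are illustrative assumptions, not confirmed API names; the payload mirrors the network-simulation schema shown later in this piece.

```python
# Sketch of a single-cell simulation request against a pretrained
# Allen Institute model. The endpoint ("/neuro/simulate-cell") and the
# model ID below are assumptions for illustration.

def build_cell_request(model_id, amplitude_pa, duration_ms):
    """Assemble the JSON payload for a current-clamp run on a library model."""
    return {
        "simulator": "neuron",
        "model_id": model_id,
        "stimulus": {
            "type": "current_clamp",
            "amplitude_pa": amplitude_pa,
            "duration_ms": duration_ms,
        },
        "record": ["voltage"],
    }

payload = build_cell_request("allen-l5-pyramidal-v1", 150.0, 1000)
# httpx.post(f"{BASE}/neuro/simulate-cell",
#            headers={"Authorization": f"Bearer {API_KEY}"}, json=payload)
```

The point is that all of the hoc and NMODL machinery stays behind the endpoint; the caller only names a model and a stimulus.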
Brian2: Python-first network simulation
Brian2 took a different philosophy. Instead of providing a fixed library of mechanisms, it lets you write model equations directly in Python using a symbolic syntax close to how you would write them on paper. Brian2 then compiles the equations to C++ for performance, or to CUDA for GPU execution.
The result is a simulator that feels like a scientific programming tool rather than a specialized simulation package. It is the preferred choice for population-level and network-level modeling where the emphasis is on dynamics rather than single-cell biophysics. The API wraps Brian2 with the same schema as NEURON so you can swap between them and compare.
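To give a feel for the equations-on-paper style, here is a leaky integrate-and-fire neuron integrated by forward Euler in plain Python. This is not Brian2 itself, just an illustration of the kind of model you would hand to Brian2 as a string such as `"dv/dt = (v_rest - v + R*I)/tau : volt"`.

```python
# Forward-Euler integration of a leaky integrate-and-fire neuron,
#   dv/dt = (v_rest - v + R*I) / tau,
# written out by hand to show the dynamics Brian2 compiles for you.

def simulate_lif(i_ext=2.0, dt=0.1, t_stop=100.0):
    v_rest, v_thresh, v_reset = -70.0, -54.0, -70.0  # mV
    tau, r_m = 10.0, 10.0                            # ms, MOhm
    v, spikes, t = v_rest, [], 0.0
    while t < t_stop:
        v += dt * (v_rest - v + r_m * i_ext) / tau   # Euler step
        if v >= v_thresh:                            # threshold crossing
            spikes.append(t)
            v = v_reset                              # reset after a spike
        t += dt
    return spikes

spikes = simulate_lif()  # steady-state drive sits above threshold, so it fires
```

In Brian2 the same model is a `NeuronGroup` with that equation string, and the compilation to C++ or CUDA happens behind the scenes.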
The Allen Brain Atlas library
The Allen Institute for Brain Science publishes a rich library of pretrained biophysical neuron models, cortical network models, and optimized channel kinetics. These are high-quality starting points for many research questions. The API bundles the Allen Institute library and exposes each model by ID, so you do not need to download and install their Python SDK yourself.
- Single-cell biophysical models. Layer-specific pyramidal cells, interneurons, thalamic cells, and striatal cells, each fit to patch-clamp data from the Allen Cell Types Database.

- Cortical network models. Layered microcircuit models that can be driven with naturalistic inputs.
- Whole-brain-scale models. Mesoscale models based on the Allen Mouse Connectivity Atlas.
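Browsing that library might look like the query below. The endpoint (`/neuro/models`) and the filter names are assumptions based on the categories above, not documented API parameters.

```python
# Hypothetical query for browsing the bundled Allen Institute library.
# Endpoint path and filter names are illustrative assumptions.
import urllib.parse

def model_query(category, region=None):
    """Build the query string for listing library models by category."""
    params = {"source": "allen", "category": category}
    if region is not None:
        params["region"] = region
    return "/neuro/models?" + urllib.parse.urlencode(params)

url = model_query("single-cell", region="VISp")
# httpx.get(f"{BASE}{url}", headers={"Authorization": f"Bearer {API_KEY}"})
```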
A simulation call
import httpx
API_KEY = "sk-sci-..."
BASE = "https://scirouter.ai/v1"
response = httpx.post(
    f"{BASE}/neuro/simulate-circuit",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "simulator": "brian2",
        "network": {
            "template": "cortical-microcircuit-v2",
            "n_neurons": 10000,
            "params": {
                "excitatory_fraction": 0.8,
                "connection_probability": 0.1,
            },
        },
        "stimulus": {
            "type": "poisson",
            "rate_hz": 5,
            "duration_ms": 1000,
        },
        "record": ["spikes", "population_rate"],
        "duration_ms": 2000,
        "backend": "gpu",
    },
    timeout=600,
)
result = response.json()
print(f"Total spikes: {result['n_spikes']}")
print(f"Mean firing rate: {result['mean_rate_hz']:.2f} Hz")
print(f"Results NWB file: {result['nwb_url']}")

The call returns spike times, voltage traces (if requested), and population statistics. Results are stored as NWB (Neurodata Without Borders) files for compatibility with standard neuroscience analysis tools.
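Beyond the summary fields, you will often want statistics the server did not precompute. The sketch below assumes the response also carries a `"spikes"` list of `[neuron_id, time_ms]` pairs; that schema is an assumption for illustration, and the NWB file remains the authoritative record.

```python
# Post-processing sketch: derive a population mean rate from raw spike
# times. The (neuron_id, time_ms) pair schema is assumed, not documented.
from collections import Counter

def mean_rate_hz(spikes, n_neurons, duration_ms):
    """Population mean firing rate across all neurons, in Hz."""
    counts = Counter(neuron_id for neuron_id, _ in spikes)
    total = sum(counts.values())
    return total / n_neurons / (duration_ms / 1000.0)

# Toy data: 4 spikes across 4 neurons over 1 second -> 1 Hz mean rate.
example = [(0, 12.5), (0, 48.0), (1, 30.1), (2, 75.9)]
rate = mean_rate_hz(example, n_neurons=4, duration_ms=1000)
```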
GPU backends and why they matter
A 10,000-neuron network simulation can take hours on a CPU. GPU-accelerated backends (Brian2CUDA, GeNN, CoreNEURON) compile the same model to CUDA and often achieve 10x to 100x speedups. The API selects the backend automatically based on network size: small simulations run on CPU, large ones on GPU. You do not need to change your code to take advantage of either.
The distinction matters most for parameter sweeps and fitting workflows where you need to run hundreds or thousands of simulations. A GPU backend turns an overnight run into a 15-minute job, which changes how ambitious you can be about the questions you ask.
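A sweep in this setting is just a family of request payloads that differ in one parameter. The sketch below reuses the schema from the example above to generate one payload per value of `connection_probability`; submitting them concurrently (for instance with httpx and asyncio) is where the GPU backend pays off.

```python
# Parameter-sweep sketch: one request payload per connection probability,
# built from the same schema as the simulate-circuit example above.

def sweep_payloads(probabilities):
    base = {
        "simulator": "brian2",
        "network": {
            "template": "cortical-microcircuit-v2",
            "n_neurons": 10000,
            "params": {"excitatory_fraction": 0.8},
        },
        "record": ["spikes", "population_rate"],
        "duration_ms": 2000,
        "backend": "gpu",
    }
    payloads = []
    for p in probabilities:
        # Copy the nested dicts so each payload is independent.
        network = {**base["network"],
                   "params": {**base["network"]["params"],
                              "connection_probability": p}}
        payloads.append({**base, "network": network})
    return payloads

payloads = sweep_payloads([0.05, 0.1, 0.2])
```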
Connectomics-driven simulations
Connectomics datasets (MICrONS, FlyWire, Janelia hemibrain, OpenWorm) provide nanometer-resolution wiring diagrams that used to be the stuff of science fiction. The challenge is turning those static diagrams into dynamic simulations. The API supports this workflow:
- Load a connectome subgraph by brain region or cell-type criteria.
- Assign cell-type-specific parameters from a curated library (Allen Institute, Blue Brain Project, or user-provided).
- Run the resulting simulation with Brian2 or NEURON.
- Analyze the output in NWB format with standard tools.
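The four steps above might collapse into a single request body along the following lines. The `connectome` block and its field names are assumptions about how a subgraph selection could be expressed, not documented parameters.

```python
# Hedged sketch of a connectome-driven simulation request. Field names
# inside the "connectome" block are illustrative assumptions.

def connectome_request(dataset, region, cell_types):
    return {
        "simulator": "brian2",
        "network": {
            "connectome": {
                "dataset": dataset,        # e.g. "microns" or "flywire"
                "region": region,          # subgraph by brain region
                "cell_types": cell_types,  # filter by cell-type criteria
            },
            "parameter_library": "allen",  # cell-type-specific parameters
        },
        "record": ["spikes"],
        "duration_ms": 1000,
    }

req = connectome_request("microns", "V1", ["pyramidal", "pv-interneuron"])
```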
This is one of the most exciting research directions in computational neuroscience. Being able to simulate an actual brain region, built from an actual wiring diagram, with realistic cell types, is a new kind of experiment that was not practical a few years ago.
Parameter fitting and model optimization
Fitting a biophysical neuron model to electrophysiology data is one of the classical hard problems in computational neuroscience. You have a target voltage trace, a model with dozens of parameters, and a nonconvex loss landscape. Tools like BluePyOpt and SNOOPY use evolutionary algorithms or Bayesian optimization to search the parameter space.
The API exposes these fitting workflows as a single call. You supply the target data, pick a model template, and the service runs the optimization in parallel. Because the optimizer itself is compute-heavy, the GPU backend is usually the right choice.
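A fitting request might be assembled as below. The endpoint (`/neuro/fit-model`), the field names, and the toy target trace are all illustrative assumptions; in practice the target would be a recorded voltage trace.

```python
# Fitting-workflow sketch. Endpoint path and field names are assumptions;
# the target trace here is a toy stand-in for patch-clamp data.

def fit_request(template, target_trace_mv, dt_ms):
    return {
        "template": template,
        "target": {"voltage_mv": target_trace_mv, "dt_ms": dt_ms},
        "optimizer": {"method": "evolutionary", "max_generations": 100},
        "backend": "gpu",
    }

trace = [-70.0 + 0.1 * i for i in range(5)]  # toy target trace, mV
req = fit_request("allen-l5-pyramidal-v1", trace, dt_ms=0.1)
# httpx.post(f"{BASE}/neuro/fit-model",
#            headers={"Authorization": f"Bearer {API_KEY}"},
#            json=req, timeout=3600)
```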
Use cases in current research
- Pharmacology and ion channel studies. Simulate the effect of a new channel modulator on single-cell excitability before running wet-lab experiments.
- Neural prosthetics. Test electrode placement and stimulation protocols in detailed cortical models before human trials.
- Disease modeling. Simulate the effect of ion channel mutations on network excitability in epilepsy research.
- Computational modeling of behavior. Link neural circuits to behavioral output through closed-loop simulations.
- Education. Provide undergraduate neuroscience courses with reproducible simulation environments without requiring local installs.
How this fits into the broader SciRouter stack
Neuroscience workflows often span multiple domains. You might start with a molecular target (protein folding, drug screening), move to a cellular model (channel kinetics fit), and end at a network simulation (circuit-level effects). Having all of those under one API key removes friction when building integrated studies.
For instance, a calcium channel pharmacology study might run in three phases: dock candidate molecules to the channel protein using the ligand API, fit a channel kinetic model to patch-clamp data using the neuroscience API, and simulate the network-level effect in a cortical microcircuit. Each phase is one API call.
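The three phases can be sketched as a short pipeline. Every endpoint path here except `/neuro/simulate-circuit` is an assumption, as are the payload fields; the point is only that each phase is one authenticated POST against the same base URL.

```python
# Three-phase pipeline sketch for the calcium channel study described
# above. Endpoint paths other than /neuro/simulate-circuit, and all
# payload fields, are illustrative assumptions.

PHASES = [
    ("/ligand/dock", {"target": "cav1.2",
                      "ligands": ["mol-001", "mol-002"]}),
    ("/neuro/fit-model", {"template": "ca-channel-kinetics",
                          "target": {"source": "patch-clamp-upload"}}),
    ("/neuro/simulate-circuit", {"simulator": "brian2",
                                 "network": {"template": "cortical-microcircuit-v2"}}),
]

def run_pipeline(post):
    """Run each phase in order; `post` is any callable like httpx.post."""
    results = []
    for path, payload in PHASES:
        results.append(post(path, payload))
    return results

# With a stub in place of httpx, the pipeline just threads payloads through:
results = run_pipeline(lambda path, payload: {"path": path, "ok": True})
```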
Getting started
Start with Neuro Lab, the web interface for building, running, and visualizing neural simulations. It supports both single-cell and network-level models and exposes a visual circuit editor for quick prototyping.
For production research, the Python SDK gives you the full API surface plus helpers for common tasks like parameter sweeps and fitting loops. Neuroscience labs increasingly run simulations as part of automated pipelines, and an API-first tool fits that model naturally.