May 2020 
The Scientist








PODCAST BY SCIENTISTS FOR SCIENTISTS

The Scientist Speaks is a new podcast produced by The Scientist's Creative Services Team. Once a month, we bring you the stories behind newsworthy molecular biology research.

1. Birds on the Brain: The Neuroscience Behind Songbird Communication and Human Speech

2. The Buzz About Genetically Modified Mosquitoes

3. Tackling Antibiotic Resistance: Viruses to the Rescue

4. Hidden Hitchhikers: Lessons Learned from The Human Microbiome Project

FOLLOW US ON INSTAGRAM the_scientist_magazine

It's a matter of expression. For over 40 years, New England Biolabs has been developing and using recombinant protein technologies in E. coli for our own manufacturing processes. Protein expression can be a very complex, multifactorial process. Each protein requires a specific environment to correctly and efficiently achieve its secondary and tertiary structures. Proteins may also require post-translational modifications or insertion into a cellular membrane for proper function. Other proteins, once expressed, may be toxic to the host. Thus, no single solution exists for the successful production of all recombinant proteins, and a broad range of expression tools is often required to ensure the successful expression of your target protein.

Our NEBExpress™ portfolio of products includes solutions for the expression and purification of a wide range of proteins, and is supported by access to scientists with over 40 years of experience in developing and using recombinant protein technologies in E. coli. We use these solutions in our own research and manufacturing processes, and know that quality and performance are critical: all of our products are stringently tested so that you can be sure they will work optimally for your solution, just as we rely on them to work in ours.

Featured products include:
• Cell-free expression systems – express analytical amounts of protein in approximately two hours
• E. coli expression and purification kits – generate and purify high yields of recombinant proteins
• Competent cells – express a variety of proteins in E. coli, including difficult targets, proteins with multiple disulfide bonds, and His-tagged proteins
• Purification beads, columns and resins – available for CBD-, MBP- and His-tagged proteins

Access our entire portfolio and request your sample at

One or more of these products are covered by patents, trademarks and/or copyrights owned or controlled by New England Biolabs, Inc. For more information, please email us at [email protected]. The use of these products may require you to obtain additional third party intellectual property rights for certain applications. © Copyright 2019, New England Biolabs, Inc.; all rights reserved.











Manipulating Memory

Strategies to make lab animals forget, remember, or experience false recollections probe how memory works, and may inspire treatments for neurological diseases.

Memories of Time

Rats and equations help researchers develop a theory of how the human brain keeps track of past experiences.



Putting New Neurons in Their Place Adult neurogenesis, already known to play a role in learning and memory, also figures into mental health and possibly even attention, new research suggests. BY ASHLEY YEAGER

05.2020 | THE SCIENTIST


Looking for a new career in the life sciences? Search for a job that will spark an innovation. Visit The Scientist’s careers portal to find the best postdoc positions, explore alternative career opportunities, or simply keep up to date on the postings in your area.


Department Contents




Transcending Biology

Our memories, rooted in the very real cells and molecules that make up our brains, create a universe entirely separate from reality.

Daniel Colón-Ramos: C. elegans Psychologist BY CLAUDIA LOPEZ-LLOREDA


Humans generate terabytes of behavioral data while using their smart devices. Crunching those numbers could help identify the very beginnings of cognitive decline.


Memory in the Digital Age

Some people worry that the more we embrace external technologies, the more our memory faculties deteriorate. But the reality is more nuanced.




A new book explores how research through the ages has tried to map the intricacies of the human brain, including pinpointing the seat of memory.


A Memory of Trauma; Hazy Recollections; Brain Boost; Memory Munchers





Wrist-Mounted Air Pollution Detector

A novel sampler records data on a broad range of environmental contaminants.




Bees remember after just one lesson; fly memory impaired under constant darkness; mice forget fear by learning something new

PROFILE: Unravelling Memory's Mysteries

Studying nonhuman primates, University of Washington neuroscientist Elizabeth Buffalo has identified important features of the neural underpinnings of learning and memory. BY DIANA KWON


From Whence Memories?




Digital Detection of Dementia




Savant in the Limelight, 1988–2009 BY SUKANYA CHARUCHANDRA

IN EVERY ISSUE



The April article "Breaking Away" stated that in 2006, the protein L1 was known only for its role in brain development. In fact, L1 had already been reported to be present in cancer cells by that time. The Scientist regrets the error.








Online Contents




Remember This Lecture

Brain Power

Probing Memory

Yale University researcher Daniel Colón-Ramos, this month's Scientist to Watch, presents his research on C. elegans and memory.

Jared Cooney Horvath discusses research on how the human brain interacts with the world in this TED Talk.

Sheena Josselyn, a neuroscientist at the Hospital for Sick Children in Toronto, gives a brief introduction to her research into how the brain remembers.



• Left-handed DNA contributes to a dynamic genetic code.
• Crucial protein synthesis enzymes have evolved additional jobs in angiogenesis, fat metabolism, and more.
• What to do when your supervisor is accused of research misconduct
AND MUCH MORE




• Many probiotics fail, but clues from newborn babies could reveal the recipe for success.

Read The Scientist on your iPad!

1000 N West Street Suite 1200 Wilmington, Delaware 19801 E-mail: [email protected]


Bob Grant [email protected] MANAGING EDITOR

Jef Akst [email protected] SENIOR EDITORS

Kerry Grens [email protected] Shawna Williams [email protected]





Greg Brewer [email protected] ART DIRECTOR

Erin Lemieux [email protected] VIDEO PRODUCTION COORDINATOR

Roger Blanchard [email protected]



Catherine Offord [email protected]


Ashley Yeager [email protected] COPY EDITOR


Abby Olena Ruth Williams INTERN



Bob Kafato [email protected]

Kristie Nybo [email protected] ASSOCIATE SCIENCE EDITORS

Kathryn Loydall [email protected] Nathan Ni [email protected] ASSISTANT SCIENCE EDITORS

Tiffany Garbutt [email protected] Niki Spahich [email protected] MULTIMEDIA COORDINATOR

Meaghan Brownley [email protected] MARKETING COORDINATOR

Katie Prud’homme-Aitken [email protected]

Key Accounts

Ashley Haire [email protected] SENIOR ACCOUNT EXECUTIVES Western US, Western Canada, ROW

Karen Evans [email protected] Northeast US, Eastern Canada, Europe

Dana Sizing [email protected] ACCOUNT EXECUTIVE Midwest and Southeast US

Anita Bell [email protected] DIRECTOR OF MARKETING

Alex Maranduik [email protected] AUDIENCE DEVELOPMENT SPECIALIST

Matthew Gale [email protected]

Joseph L. Graves, Jr., Joint School for Nanoscience and Nanoengineering
Erich Jarvis, Rockefeller University
Ellen Jorgensen, Biotech Without Borders
Mary-Claire King, University of Washington
Elaine Mardis, Nationwide Children's Hospital
Joseph Takahashi, University of Texas Southwestern Medical Center
H. Steven Wiley, Pacific Northwest National Laboratory


Amanda Purvis [email protected] CUSTOMER SERVICE

[email protected]

SUBSCRIPTION RATES & SERVICES In the United States & Canada individual subscriptions: $39.95. Rest of the world: air cargo add $25. For assistance with a new or existing subscription please contact us at: Phone: 847.513.6029 Fax: 847.291.4816 E-mail: [email protected] Mail: The Scientist, PO Box 2015, Skokie, Illinois 60076


Robert S. D'Angelo
EXECUTIVE VICE PRESIDENT
[email protected]

For institutional subscription rates and services, visit or e-mail [email protected].

POSTMASTER: Send address changes to The Scientist, PO Box 2015, Skokie, Illinois 60076.

Canada Publications Agreement #40641071. The Scientist is indexed in Current Contents, Science Citation Index, BasicBIOSIS, and other databases. Articles published in The Scientist reflect the views of their authors and are not the official views of the publication, its editorial staff, or its ownership. The Scientist is a registered trademark of LabX Media Group Inc. The Scientist® (ISSN 0890-3670) is published monthly.

Advertising Office: The Scientist, 1000 N West Street, Suite 1200, Wilmington, Delaware, 19801

Mail to our New York address will be forwarded to the new address for a period of time, but going forward, please use the new address above.


Jack Gilbert University of California, San Diego




Deborah Blum Knight Science Journalism Program at MIT

Cayley Thomas [email protected]

Mario Di Ubaldi [email protected]

Ken Piech [email protected]

James Allison University of Texas MD Anderson Cancer Center





LIST RENTALS Contact Statlistics, Jennifer Felling at 203-778-8700 or [email protected].
REPRINTS Contact Katie Prud'homme at [email protected].
PERMISSIONS For photocopy and reprint permissions, contact Copyright Clearance Center at


Contributors

In the 1970s, evolutionary neurobiologist Matthew Cobb decided to study psychology because he wanted to understand "how and why animals and people behave the way they do." When he stumbled upon an article in New Scientist about the memory-impaired Drosophila mutant named "dunce," he was struck by "the idea that you could use genes to get at fundamental behaviors like learning and memory." Cobb completed his PhD in psychology and genetics at the University of Sheffield in the UK in 1984, and has dedicated the last 40 years to studying how genes and evolution shape nervous systems and behavior. His main focus has been olfaction in Drosophila larvae. Cobb's research has shown that different odors activate the same neuron in different ways, eliciting a range of responses even in a genetically altered larva with only a single neuron in its nose. Cobb is also a science historian whose new book, The Idea of the Brain, explores how metaphors frame much of our understanding of science and especially the brain, which has been likened to telegraph systems, telephone exchanges, and more recently, computers. Cobb says that neuroscientists are now admitting that the computer metaphor may no longer be very useful. Read about how the brain's memory machinery is not computer-like, and how researchers are still seeking to understand where in the brain memory resides, on page 58.


Jared Cooney Horvath found his way to science through the art of filmmaking. A decade of writing scripts in Los Angeles taught him that every character he created represented some aspect of himself, and that's when he started looking at neuroscience and psychology as vehicles for expanding self-knowledge. "If I learn about the brain," he wondered, "will I become a better screenwriter?" This question pushed him into realms he never could have imagined: the more he studied the brain, he says, the more he wanted to understand the process of learning. First he became a teacher, then he pursued a master's degree in education, and he went on to earn a PhD in cognitive neuroscience. Now, as a research fellow at the Melbourne Graduate School of Education and St. Vincent's Hospital Melbourne, Horvath studies the science of learning and explores how to translate those insights into the classroom. He's especially interested in our relationship with technology and its impact on memory retention given the digitalization of education. Horvath is out to challenge the common assumption that tech is ruining our memories. "Computers aren't doing much of anything," he says. "We're turning our brains into mush because of the way we're choosing to use computers." On page 12, Horvath delves into the complexities of how memory works in the digital age.

In her Denver, Colorado, high school, Lucy Conklin was drawn to both science—physics in particular—and art. So she studied both at Bucknell University in Pennsylvania. After she graduated in 2008, she followed some friends to New York City, where she took her first science illustration class at the American Museum of Natural History, followed by an after-hours animal drawing course at the museum. She used those experiences to build her portfolio and apply to the Science Illustration Certificate Program at California State University, Monterey Bay. After 10 months of coursework, she returned to Colorado in 2012 for an internship at the Denver Museum of Nature and Science, and then did an internship at the Exploratorium in San Francisco, where she drew illustrations of various critters, plankton collections, egg fertilization, and more. She also began freelancing for magazines and other clients and eventually moved back to Monterey to teach science illustration at California State University and the University of California, Santa Cruz. She did her first infographic for The Scientist back in 2015 and has been a regular contributor since. In this issue, working from Denver, where she moved last year, Conklin depicts how researchers are using optogenetics to manipulate the thoughts of mice (see page 28). "One of my favorite parts of science illustration is I get to learn little bits and pieces about whatever science it is," she says. "The one I'm working on now, implantations of memories in mice—that just seems really cool."



Transcending Biology Our memories, rooted in the very real cells and molecules of our brains, create a universe separate from reality. BY BOB GRANT


The past couple of months have been heavy for us at The Scientist. Heavy for everyone. From our home offices, we've been tirelessly reporting on the global pandemic that continues to grip the world in its stranglehold. We are trying to stay atop a flood of information and stories that need telling as we also contend with challenges that most of us have never confronted, and none of us will likely soon forget. At the same time, we continue to search across the life sciences for other nuggets of research worth sharing. This month, our issue is focused on the science of memory.

Our memories make us who we are, subconsciously driving our behaviors and dictating how we view the world. One of the most interesting things about memory is its imperfection. Rather than serving as a precise record of past events, our memories are more like concocted reflections, filtered and distilled from pure reality into a personal brew that is formulated by our own unique physiologies and emotional backgrounds. The wholly unique universe we each create—separate from but still tethered to the actual universe—is the product of electrical signals zapping through the lump of fatty flesh inside our skulls. Biology gives birth to something that exists outside the boundaries of biology.

The neural machinery involved in the formation, storage, and retrieval of memories is coming to light in labs across the world, but science has not yet solved this particular puzzle. In this issue, you'll read about talented researchers who use modern tools such as optogenetics and genome editing to probe the biology underlying memory. In lab animals, these scientists can force the recall of memories at the flick of a molecular switch, implant false memories, and erode a real memory to the point of vanishing. Through these studies and others, perhaps science will one day robustly characterize memory's biological nuts and bolts. But will we ever truly understand, and perhaps directly manipulate, the personal reality created by each individual's brain?

What scares me most at this juncture in world history is how the COVID-19 pandemic will live in the memories of those affected by it. The patchiness of the current global predicament will dictate our individual familiarity with the ravages of SARS-CoV-2. Some will remain largely unscathed by illness; many will feel the economic pinch of societal lockdowns; many will also lose friends or loved ones to the virus; others will succumb to it themselves. No one will emerge unchanged. Memories living within the survivors will mirror the array of individual experiences. For many, traumatic memories of the pandemic—whether that be illness from the virus or any of the hardships that come along with social isolation and the global economic downturn—will become uninvited guests, intruding on the daily business of living. On the opposite end of the spectrum, with any luck, many young people living through this reality will recall this period of their lives with a hazy bemusement. "Remember when we were kids, and we got to stay home from school with Mom and Dad for months on end?" Again, the mountain of memories that will accrue in this complicated time will not faithfully record the events now unfurling. Rather, they will form smudged reproductions of the difficulties we are all grappling with. Those memories, and the behaviors they drive, will linger, perhaps for generations.

The scientific enterprise is currently front and center. Millions around the world are counting on researchers and clinicians to pull us from the darkness of this pandemic, as dozens of drugs and vaccines make their way through development and organizations around the globe work to distribute accurate tests that can track the spread of the disease. And this is only the first battle. In the months and years to come, those of us who survive this episode will again call on healthcare providers and scientists to rescue us from the mental and physical aftereffects of the pandemic. For the foreseeable future, the world will need science and medicine more than ever before in our history. And we will need humanity in equal measure. No matter the complexion of our memories of this time, it is my sincere hope that we can treat one another and the researchers striving to corral and vanquish this viral foe with understanding, compassion, and respect.

This issue of The Scientist is dedicated to the late Nicola F. Morabito (1923–2020), my grandfather and an inspiration to many. Memories of him, imperfect reflections of reality though they may be, will live on in me and in the others whose lives he touched.

Editor-in-Chief
[email protected]


Speaking of Science








Note: The answer grid will include every letter of the alphabet.


Our memory is a more perfect world than the universe: it gives back life to those who no longer exist.
—Author Guy de Maupassant in his 1880 short story "Suicides"






This is going to be imprinted on the personality of our nation for a very long time.
—Anthony Fauci, director of the US National Institute of Allergy and Infectious Diseases, speaking with CNN's Sanjay Gupta about the effects of the COVID-19 pandemic on the psyches of his daughters and other Americans (April 1)







Attribute notable in elephants
Sub-Saharan vector of trypanosomes
Like the heart of a fish (hyph.)
Member of a civilization that revered coca
11. Butterfly named for King Ahab's wife
13. Aquatic mammal's forelimb
14. Nickname of an australopithecine female
16. Dominance hierarchy (2 wds.)
19. Where to find the sacrum and coccyx
20. One sort of xerophyte

Physicist who wrote Opticks
Volcano of Ecuador
Unscientific belief
Maker of an aerosol
Good name for a 23andMe analyst?
All about bones
Type of anemone or beetle
Gastrointestinal tube (2 wds.)
Antenna or tentacle, at times
Head of a valley shaped like an amphitheater
17. Ceremonial room for Pueblo people
18. A single time, without replication
Answer key on page 5



Memory in the Digital Age Some people worry that the more we embrace external technologies, the more our memory faculties deteriorate. But the reality is more nuanced.


Here's a question that will only make sense to readers of a certain age: What was your childhood telephone number? I'm guessing you had no problem rattling that off despite not having dialed or recited those digits in decades. If technology were truly killing our memory, then surely this useless bit of information would have faded away long ago. But I submit that modern human beings have the same memory capabilities we've always had; technology is merely redefining how we choose to employ them.

To understand what's going on, we must first become acquainted with the structure of memory. In its simplest form, memory can be understood as a three-step process: first we encode information in the brain; then we store that information in the brain; and finally, we retrieve that information from the brain. From each of these steps, we can learn something interesting about memory in the modern world.

With regard to memory encoding, more than a century ago psychologist Hermann Ebbinghaus demonstrated that the manner in which we expose ourselves to information has a big impact on how memories are formed. More specifically, Ebbinghaus recognized that when we endeavor to ingest massive amounts of information in a single sitting, we ultimately remember less than when we expose ourselves to that same information over a series of shorter periods—ideally, interspersed with several bouts of sleep. If you've ever pulled an all-night cram session for an exam only to forget everything you studied a week later, you've experienced this principle in action. Amidst the current attention economy, many modern technologies have been designed to continuously pump out information so as to keep users engaged for longer periods of time. Netflix urges us to watch one more episode, hyperlinks compel us to open one more tab, intermittent rewards drive us to play one more game.
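Ebbinghaus's spacing effect can be caricatured in a few lines of code. The sketch below rests on loose assumptions that are not from the article: retention follows a simple exponential forgetting curve, and each re-exposure multiplies the memory's "stability" by a factor that grows with the gap since the last exposure. The constants are invented for illustration.

```python
import math

def simulate(schedule_hours, s0=5.0):
    """Toy spacing-effect model: each re-exposure multiplies memory
    stability by (1 + gap/24) -- an illustrative assumption, not a
    fitted value. Longer gaps between exposures yield a sturdier trace."""
    s, prev = s0, schedule_hours[0]
    for t in schedule_hours[1:]:
        s *= 1.0 + (t - prev) / 24.0
        prev = t
    return s

def retention(hours_since_last, stability):
    """Ebbinghaus-style forgetting curve: R = exp(-t / S)."""
    return math.exp(-hours_since_last / stability)

massed = simulate([0, 1, 2])    # three exposures crammed into one evening
spaced = simulate([0, 24, 48])  # the same three exposures, one per day
print(retention(72, massed), retention(72, spaced))  # spaced wins
```

Measured 72 hours after the final exposure, the spaced schedule retains far more than the massed one, mirroring the cram-session experience described above.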

Unfortunately, when information exposure is constant and ceaseless, our ability to hold onto information naturally diminishes. In fact, as colleagues and I demonstrated in a recent study, individuals asked to binge-watch an entire season of a television series remembered significantly less about the plot and characters than individuals who watched the same series on a nightly or weekly schedule. Human beings have always had a limit to the amount of information they could meaningfully encode in any given day. Modern technologies have not changed this; they simply push us beyond this limit more frequently than media of the past.

In a highly cited study from 2011, researchers found that individuals remember significantly fewer facts when they're told that those facts will be externally stored and easily accessible in the future. Termed the "Google Effect," this is the reason why we so often don't remember phone numbers, email addresses, or meeting schedules—technology has allowed us to outsource memory storage. Here's the problem: in order to meaningfully interact with offloaded information, we must remember where that information is located—which keystrokes are required to access it, how to sift through it, etc. These processes are all internally stored memories. Accordingly, rather than killing our ability to create memories, technology is simply changing what information we choose to remember.

Human thinking and cognition depend largely on those memories we have internally stored. In fact, the higher-order skills most people clamor for, such as critical thinking and creativity, emerge from and can only meaningfully be applied to facts held within our long-term memory.
As educational psychologist Paul Kirschner of the Open University of the Netherlands states in a 2006 review paper, “Everything we see, hear, and think about is critically dependent on and influenced by our long-term memory.”

Some researchers have hypothesized that the secret to forming deep, lasting memories resides in the primary encoding phase. More specifically, if an idea or event elicits strong emotions during encoding, then people will form a deeper memory. Although this is true, it can’t be the whole story. Otherwise, why do we all remember completely emotionless TV commercial jingles from our childhood? Other researchers have suggested that the secret to forming deep, lasting memories resides in the storage phase. That is, if an experience is repeatedly encountered, there will have been multiple storage opportunities, leading to a deep memory. Again, although this is true, it can’t be the whole story. If it were, more people would be able to draw an accurate Apple Macintosh logo from memory. (Try it yourself.) It turns out that the secret to forming deep, lasting memories resides in the final retrieval phase. Put simply, memory is constructive: the more you retrieve a memory, dredging it up from the depths using your own cognitive faculties, the easier it becomes to recall in the future. This is likely why we remember so many TV jingles—we retrieve these songs each time we sing them—and why we don’t remember so many ubiquitous logos—very few of us have ever retrieved these images. Modern technology by and large is geared toward information broadcasting. It specializes in organizing and presenting ideas to people in a highly engaging format. Unfortunately, outside of usernames and passwords, technology is very bad at forcing us to retrieve information. This is the final reason why it may seem technology is killing our memories: when we need never recall information, relevant memories become weak and fleeting. Rest assured, there is no reason to assume human beings are losing the capacity to form deep memories. We are simply using this faculty to access and create deep memories for things such as usernames, passwords, and URLs. 
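The retrieval principle described above is the logic behind flashcard systems such as the Leitner box, which can be sketched in a few lines. The interval values and the promotion rule here are invented for illustration; the point is only that each successful retrieval makes the memory more durable, while neglect sends it back to square one.

```python
from dataclasses import dataclass

@dataclass
class Card:
    prompt: str
    box: int = 0  # higher box = longer gap before the next retrieval

# Days until next review for each box -- illustrative values only
INTERVALS = [1, 2, 4, 8, 16]

def review(card: Card, recalled: bool) -> int:
    """Each successful retrieval promotes the card (the memory gets
    easier to recall next time); a failure demotes it back to box 0.
    Returns the number of days until the next review."""
    card.box = min(card.box + 1, len(INTERVALS) - 1) if recalled else 0
    return INTERVALS[card.box]

jingle = Card("TV jingle from childhood")
for _ in range(4):             # retrieved again and again over the years
    gap = review(jingle, recalled=True)
print(jingle.box, gap)         # 4 16 -- a deep, durable memory
```

A ubiquitous logo, by contrast, would sit forever in box 0: seen constantly, retrieved never.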
Although technology may be changing what information we encode, store, and retrieve, it does not appear to be altering our memory machinery. The fact that you can remember the name of the folder that holds a specific document, even if you don't remember the contents of that document, shows memory is still chugging along. We are merely employing it differently than previous generations. This leads to the truly important questions: Do we like how we are currently using our memory? Do we like how this may be altering our learning, our discourse, our evolution? If the answer is "no," then we need to re-evaluate how we are employing modern technologies. That our tools may not be killing memory does not mean they are innocuous.

Jared Cooney Horvath is an educational neuroscientist at the University of Melbourne in Australia. He also serves as director of LME Global, a company dedicated to bringing the latest in brain and behavioral research to education and business alike. Follow him on Twitter @JCHorvath.


Highly Multiplexed Proteomics from IsoPlexis® is Accelerating Insights from Infectious Diseases to Cancer Immunology


Understanding immune interactions is critical when it comes to immunotherapies for all types of diseases. Immune interaction is driven by the way cells communicate with each other. Immune cells use cytokines and chemokines to communicate within the immune system. Either as part of normal cellular function, or when challenged, immune cells release cytokines and chemokines which influence cell proliferation, migration, identity, and more; however, "modeling an immunological process can be challenging because the process may…contain multiple cell types, and use multiple cytokines."1 One critical step in understanding this cell communication, and thereby better understanding immune interaction, is identifying the cytokines secreted by the immune cells. To help address this integral challenge, IsoPlexis' technology offers powerful, accessible, high-throughput functional proteomics to help identify the functional mechanisms and drivers of cancer and infectious diseases.

In infectious disease, it is crucial to identify protective T cell responses for vaccine development, establish cellular immune monitoring for protective responses early in patients, and enable cellular prediction and cytokine-level monitoring for toxicities related to cytokine storm. IsoPlexis' IsoLight® system provides a solution for the single-cell and accelerated population-level functional proteomics required to overcome these challenges. By detecting the range of cellular proteomics involved in the immune response, researchers can better understand functional mechanisms for the development of vaccines and novel therapies for cancer and infectious diseases.

Functional Cellular Proteomics Powered by the IsoPlexis Platform

The IsoLight system runs both the IsoCode and CodePlex chips. IsoCode chips capture individual cells using microfluidic microchambers. The cytokines secreted by a given cell are then captured by an antibody barcode, creating a single-cell cytokine profile that scientists can use to identify functional correlates of persistence, resistance, and suppression. Additionally, run on the same system, CodePlex chips allow for accelerated insight, measuring the cytokines present within a sample using a rapid automated method with small sample volumes and only five minutes of hands-on time. IsoCode and CodePlex chips target over 30 cytokines in a highly multiplexed manner in single cells and population serum samples, respectively.

The IsoLight is a hub for comprehensive functional profiling of each cell type across a large assay menu of single-cell and population proteomic chips spanning many high-impact applications. It fills the gap left by current technologies by making it possible to look at the extracellular phenotype of each cell, helping scientists directly identify the functional drivers of persistence, resistance, and suppression, as well as cytokine storm and toxicity.

From infectious disease to cancer immunology, CodePlex can identify immune signatures in a completely automated, highly multiplexed fashion with small sample volumes and only five minutes of hands-on time. The Human Innate Immune panel is coming soon. The currently available CodePlex panels are:

• Human Adaptive Immune: IL-17A, TGF-β1, MIP-1α, IL-9, MIP-1β, IL-6, IL-7, IL-8, IFN-γ, IP-10, GM-CSF, IL-4, IL-5, IL-10, TNF-α, MCP-1, IL-13, IL-2, Perforin, sCD40L, sCD137, TNF-β, Granzyme B, IL-15

• Human Cytokine Storm: IL-1β, IL-2, IL-4, IL-6, IL-7, IL-10, IL-12, IL-13, IL-17, MCP-1, GM-CSF, MIP-1α, TNF-α, IFN-γ

• Panel Menu: Granzyme B, IFN-γ, MIP-1α, Perforin, TNF-α, TNF-β, GM-CSF, IL-2, IL-5, IL-7, IL-8, IL-9, IL-12, IL-15, IL-21, CCL11, IP-10, MIP-1β, RANTES, IL-4, IL-10, IL-13, IL-22, TGF-β1, sCD137, sCD40L, IL-1β, IL-6, IL-17A, IL-17F, MCP-1, MCP-4, IL-18, TGF-α, BCA-1, IL-12-p40, MIF, EGF, PDGF-BB, VEGF

Finding What Makes Cancer Cells Move

Cytokines play a critical role from infectious disease to cancer immunology. The IsoPlexis platform has proven instrumental in helping researchers establish a link between cytokine activity and cancer cell behavior. Published in Nature Communications, Jayatilaka et al.2 at Johns Hopkins University School of Medicine describe how they used functional cellular proteomics to explore cancer cell metastasis.

To identify the functional drivers promoting migration, Jayatilaka et al. cultured cancer cells within 3D collagen matrices at high and low densities. They found that cells moved much faster at high densities. Transferring culture medium conditioned by high-density cells into low-density matrices recapitulated this phenomenon in the absence of increased physical density.

The team then identified the responsible factors by characterizing and comparing the secretomic profiles of sarcoma and carcinoma cells grown at high and low densities. To do this, they determined the concentrations of 24 soluble molecules using IsoPlexis' highly multiplexed CodePlex technology. They discovered that IL-6 and IL-8 were both secreted at high concentrations that increased in abundance as cell density increased. IL-6 and IL-8 were the only assayed secreted proteins with elevated concentrations at higher cell densities; the researchers saw no change in growth factors such as FGF and VEGF, chemokines such as RANTES and MIP-1α, or other pro-inflammatory cytokines such as IFN-γ and TNF-α. Jayatilaka et al. further discovered that the effects of IL-6 and IL-8 occurred only when both interleukins were active. The presence of either IL-6 or IL-8 alone failed to elicit any increase in cell velocity, and blocking either IL-6 or IL-8 function using RNA interference or receptor antagonists eliminated pro-migratory effects.

New Understandings, New Therapeutic Potential

Functional cellular proteomics helps scientists better understand cancer cell behavior in order to identify potential treatment approaches, as well as understand immune interactions to better treat cancer and infectious diseases. Using IsoPlexis' platform, Jayatilaka et al. discovered a novel synergistic paracrine signaling pathway centered on IL-6 and IL-8 secretion that plays a critical role in cancer cell migration. This pathway may represent a new therapeutic target for decreasing the metastatic capabilities of cancer cells and thereby improving patient outcomes.

References

1. H. Daneshpour et al., “Modeling cell–cell communication for immune systems across space and time,” Curr Opin Syst Biol, 18: 44-52, 2019. 2. H. Jayatilaka et al., “Synergistic IL-6 and IL-8 paracrine signalling pathway infers a strategy to inhibit tumour cell migration,” Nat Commun, 8:15584, 2017.
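The high- versus low-density comparison described in the advertorial amounts to screening each assayed analyte for a density-dependent rise in concentration. The sketch below uses invented concentrations (not CodePlex data or values from Jayatilaka et al.); only the IL-6/IL-8 outcome is arranged to mirror the reported result.

```python
# Flag analytes whose secretion rises with cell density (hypothetical pg/mL
# values; only the IL-6/IL-8 pattern mirrors the published finding).

low_density  = {"IL-6": 40, "IL-8": 55, "VEGF": 120, "RANTES": 30, "IFN-g": 15}
high_density = {"IL-6": 310, "IL-8": 420, "VEGF": 118, "RANTES": 29, "IFN-g": 16}

def density_elevated(low, high, fold_cutoff=2.0):
    """Analytes whose concentration rises at least fold_cutoff-fold."""
    return sorted(a for a in low if high[a] >= fold_cutoff * low[a])

print(density_elevated(low_density, high_density))  # ['IL-6', 'IL-8']
```

With real data, the cutoff and the per-analyte statistics would come from the assay's quantitation, but the screening logic is the same.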

01.2018 | THE SCIENTIST 45



A Memory of Trauma



November 13, 2015, will be remembered in France and around the world as the day that Paris fell under attack by a group of terrorists. More than 130 people died. Many others survived to carry on lives in the wake of this categorically traumatic experience. “When you listen to the reports . . . of the survivors, it was shocking,” says Karen Ersche, a neuroscientist at the University of Cambridge. “It was gruesome. It was worse than a horror film.”

The head of the French National Centre for Scientific Research (CNRS) at the time, Alain Fuchs, wrote a letter imploring

researchers to respond to the attacks with science. After reading the letter, cognitive neuroscientist Pierre Gagnepain felt moved to do something. Although he didn’t study post-traumatic stress disorder (PTSD), his work was “extremely connected to the question of how the brain [controls] unwanted memory experience,” says Gagnepain, a researcher in Francis Eustache’s lab in the Neuropsychology and Imaging of Human Memory department at the French National Institute of Health and Medical Research (INSERM). “I wanted to focus . . . on what could protect and increase resilience” in people who suffered a trauma.

In January 2016, Eustache and CNRS research director Denis Peschanski met with Fuchs to discuss launching a national program to study the survivors

MAY 2020

IN MEMORY: People in Paris placed flowers to commemorate victims of the terror attacks in November 2015.

of the November 13 attacks. Gagnepain soon joined in the conversations, as did Yves Lévy, head of INSERM at the time, and the 13-Novembre Programme was born. Researchers involved in the project, which is ongoing, aim to film testimonies from 1,000 volunteers—including those who survived the attacks, people from targeted neighborhoods, and inhabitants of surrounding areas and other cities—about their experiences that night. They have also begun using fMRI to look at the brains of some of those participants.

One question the researchers want to explore through the lens of the


Paris attacks is why some people who experience trauma develop PTSD, while others do not. Gagnepain, Eustache, Peschanski, and their colleagues worked with a subset of those individuals who were affected by the attacks and were part of the 13-Novembre Programme—including 55 who had developed PTSD following the incident and 47 who had not. They also recruited 73 people who were not in Paris that day to participate in the study.

The researchers designed a behavioral experiment in which the volunteers memorized word-image pairs, to the point where, upon seeing one of the words on a screen, the participants would automatically think of the associated image. The researchers then had the participants lie down in an fMRI machine and try to avoid recalling some of the images they’d just been trained to think of: if a cue word appeared in red, they were to try to prevent the associated image from entering their awareness, or to push it out of their thoughts if it did so. These word-image pairs were designed to represent neutral memories—there was nothing emotionally charged about seeing the word “Favor” and thinking about a picture of an elephant. But the idea the researchers wanted to test was whether individuals with PTSD were somehow less able to suppress neutral memories, just as they struggled to suppress the traumatic ones that underlie their disorder.

Once participants had completed such a “no think” trial, the researchers asked them whether they’d involuntarily thought of the image, what the researchers considered an intrusive memory. There was no difference between individuals with PTSD and those without the disorder, or between those who had been exposed to the Paris attacks and those who had not, in terms of how often the intrusions happened—everyone was so well trained on the word-image pairs that the “intrusive images pop automatically,” says Gagnepain. All participants got better at blocking out intrusive memories over the course of the trials.
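The behavioral side of the no-think task reduces to tallying intrusion reports per group and per block of trials. The sketch below uses invented 0/1 reports (not the study's data); the toy numbers are arranged only so that the group rates match and the per-block rate declines, mirroring the two behavioral results described above.

```python
# Tallying "no-think" intrusions (hypothetical 0/1 reports, not study data).
# After each red-cued trial, a participant reports whether the suppressed
# image intruded (1) or stayed out of awareness (0).

trials_ptsd    = [1, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0]
trials_control = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0]

def intrusion_rate(reports):
    """Fraction of trials on which the image intruded."""
    return sum(reports) / len(reports)

def learning_curve(reports, block_size=4):
    """Intrusion rate per consecutive block of trials."""
    return [intrusion_rate(reports[i:i + block_size])
            for i in range(0, len(reports), block_size)]

# In this toy data, both groups intrude at the same overall rate,
# and the rate falls across blocks as suppression improves.
print(intrusion_rate(trials_ptsd) == intrusion_rate(trials_control))  # True
print(learning_curve(trials_ptsd))  # [0.75, 0.5, 0.0]
```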
But the fMRI data suggested that the mechanism underlying that suppression isn’t the same for people with PTSD and controls (Science, 367:eaay8477, 2020).

The researchers found that participants differed in the amount of connectivity between certain brain regions—specifically, between inhibitory control regions of the prefrontal cortex and structures involved in memory such as the hippocampus. People with PTSD showed much weaker connections between these regions than did people without the disorder. “It seems that vulnerable individuals have a problem with control,” says Ersche, who wrote a perspective article that accompanied the study but was not involved in the research. “What I really liked about this paper is it says it’s not the severity of the trauma that leads to PTSD; it’s the ability to control it. And that is key.”

When you listen to the reports of the survivors, it was shocking. —Karen Ersche, University of Cambridge

After being asked to suppress certain images, the participants completed a different sort of test. They were shown scrambled versions of those same images, which then slowly came into focus. Participants with PTSD were quicker to discern the familiar but suppressed images, suggesting that their suppression was somehow disrupted. It seems that not thinking about something is more difficult for individuals with PTSD than it is for other people. This was known to be true of explicit, emotional memories such as a traumatic event that may have triggered the PTSD, and the new study demonstrates a similar effect for nonemotional, implicit memories that aren’t consciously recalled but influence behavior or emotions.

Gagnepain compares the brain’s ability to suppress memories to the brake in a car: “There is one brake, to be used should a dog run out into the road or simply because the light turned red. We believe it is the same in the brain.”

Kerry Ressler, chief scientific officer at McLean Hospital and a professor of psychiatry at Harvard Medical School, says the findings fit well with a “longstanding expectation from the literature—

the psychological literature and the neuroimaging literature—that people with PTSD have deficits in top-down regulation of emotion.”

Ersche notes that the findings are very much in line with her own research on drug addiction. She has found that those who struggle with addiction have less inhibitory control over the striatum, a brain region with links to impulsivity and drug-related compulsions. Other researchers have noted similar neural signatures in cases of attention deficit hyperactivity disorder and obsessive-compulsive disorder. Scanning the literature, Ersche also found evidence of weak inhibitory control over the amygdala, the brain’s emotional processing center, in cases of depression. “It seems there’s a general pattern here,” she says, though she emphasizes that the idea of a common mechanism for such diverse disorders is just speculation.

The idea that PTSD involves loss of regulatory control over all of one’s memories, not just the traumatic ones, could signal a new avenue to explore for potential therapies, Ressler says. He explains that PTSD patients are often treated with methods that encourage the repeated recollection of the traumatic experiences in an attempt to desensitize the individual—“essentially retraining the brain to no longer have emotional over-responses to the triggers or the intrusive memories.” The new study suggests that patients could potentially be trained to control their memories using neutral stimuli, as opposed to having to relive their own traumatic experiences. “That could be a different way of approaching PTSD that could be powerful,” Ressler says.

As for the INSERM study, Gagnepain and his colleagues recently saw their participants a second time, and are using the latest data they collected to explore how brain activity changes over time. “We are interested in evolution of these [patterns],” he says. “We are extremely grateful to all the participants who volunteered. . . . 
I think most of them were happy to participate and feel they were doing something right. It’s important to make positive things out of a traumatic event.” —Jef Akst


Hazy Recollections

When Lilian Kloft stumbled across a 2015 study showing a connection between cannabis use and susceptibility to false memories, she found herself wondering about the legal implications of the results. The study had discovered that heavy users of cannabis were more likely than controls to form false memories—recollections of events that never occurred, for example, or warped memories of events that did—even when they were not at the moment “high.” This kind of false remembering can pose difficulties for people gathering reliable testimony in the event of a crime, says Kloft, a PhD student in psychopharmacology and forensic psychology at Maastricht University in the Netherlands. Consequently, the growing acceptance of cannabis worldwide raises questions not only about how the drug affects memory, but also about how law enforcement officials should conduct interviews with suspects, victims, and witnesses who may be under the influence or regular users of the drug.

In order to further investigate the connection between cannabis and false memory formation, Kloft and collaborators recruited 64 volunteers for a series of experiments. Participants, who were occasional cannabis users, were given a vaporizer containing either cannabis or a hemp placebo and then told to inhale deeply and hold their breath for 10 seconds. After that, the researchers tested them in three different tasks designed to induce false memories.

In the first task, the team asked the volunteers to memorize lists of words, and then to pick out those words from test lists that also included dummy words. As expected, both the sober and the intoxicated participants falsely remembered some of the dummy words. But while the sober participants mostly falsely remembered words that were strongly associated with words on the original lists, the intoxicated participants also selected less-related and completely unrelated terms.
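The word-list result can be made concrete with a small scoring routine. Everything below is hypothetical (the response counts are invented, and this is not the analysis from the paper); it only illustrates how "old" responses to related versus unrelated lures separate the two patterns described above.

```python
# Scoring a recognition test (hypothetical responses, not study data).
# Each item is (category, said_old); a false memory is an "old" response
# to a lure that was never on the studied list.

responses_sober = (
    [("studied", True)] * 9 + [("studied", False)] * 1
    + [("related_lure", True)] * 4 + [("related_lure", False)] * 6
    + [("unrelated_lure", False)] * 10
)
responses_high = (
    [("studied", True)] * 8 + [("studied", False)] * 2
    + [("related_lure", True)] * 6 + [("related_lure", False)] * 4
    + [("unrelated_lure", True)] * 3 + [("unrelated_lure", False)] * 7
)

def old_rate(responses, category):
    """Fraction of items in a category endorsed as 'old' (seen before)."""
    hits = [said_old for cat, said_old in responses if cat == category]
    return sum(hits) / len(hits)

print(old_rate(responses_sober, "unrelated_lure"))  # 0.0
print(old_rate(responses_high, "unrelated_lure"))   # 0.3
```

In this toy data, only the intoxicated group endorses unrelated lures, matching the qualitative pattern the article reports.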
In the next two tasks, the researchers wanted to see if they could induce false

memories by providing misinformation to the participants. Hoping to imbue these tests with more real-world relevance than a list of words, Kloft and colleagues designed two immersive virtual reality scenarios involving common crimes. In the first, the “eyewitness scenario,” participants observed a fight on a train platform, after which a virtual cowitness recounted the incident but with several errors, including falsely recalling a police dog that wasn’t part of the altercation. In the “perpetrator scenario,” participants entered a crowded bar and were instructed to commit a crime themselves—to steal a purse.

The researchers observed a range of effects associated with cannabis as the intoxicated subjects interacted in these virtual environments. Some participants laughed and talked to the virtual characters in the scenarios, Kloft reports, while others became paranoid and required assistance in stealing the purse. “One person even ran away so quickly that they ripped out the whole VR setup and it fell to the ground,” she says. When researchers interviewed the participants afterward using a combination of leading and nonleading questions, those who were intoxicated showed higher rates of false memory for both the eyewitness and perpetrator scenarios compared with controls.

To look for longer-term effects of cannabis, the experimenters called the subjects back a week later and tested them again on the word lists, this time with a few different dummy words thrown in. They also re-interviewed the subjects about the VR scenarios using a combination of old and new questions. As before, they found lower memory accuracy in the word-association test in those who had been intoxicated compared with sober participants. There were no statistically significant differences between the groups for the virtual reality scenarios, a result that Kloft says may indicate memory decay over time in all participants (PNAS, 117:4585–89, 2020).

Cognitive neuropsychopharmacologist Manoj Doss, a postdoc at Johns Hopkins University who was not involved in the study, has used word association and other tasks in his own research to demonstrate that tetrahydrocannabinol (THC), the main psychoactive ingredient in cannabis, increases false memories when participants attempt to retrieve information they’d previously learned. Doss says that the study by Kloft and collaborators is novel not only because it employs virtual reality, but because it shows that both the real-world scenarios and the word association task can induce false memories.

For the tests administered after one week, however, Doss notes that it’s difficult to determine if the researchers were observing actual false memories, because participants might remember both the accurate and the dummy information they encountered in the original experiment. In the follow-up test, “people might say yes to the things they’re not supposed to just because they saw them in that first test,” says Doss.
He suggests that increasing the number of items tested, as well as separately analyzing the new and previously used word tests and interview questions, could reveal a higher incidence of false memories in the delayed test for the participants who took cannabis.

Giovanni Marsicano, a neuroscientist at the University of Bordeaux in France who did not participate in the research, says that the new results match up with findings he’s made in mice: animals that receive injections of THC are more likely than controls to associate unrelated stimuli—itself a sort of false memory. His work has also shown that a cannabinoid receptor known as CB1 that is highly abundant in the hippocampus and prefrontal cortex probably plays a key role in the formation of these incidental associations. One of this receptor’s main jobs is to decrease the release of neurotransmitters. Marsicano hypothesizes that when the CB1 receptor is activated, neural signaling is inhibited in such a way that the brain is less able to separate correct from incorrect information.

Roger Pertwee, a pharmacologist at the University of Aberdeen in the UK who was not involved in the research, says that the Dutch team’s results aren’t surprising given what’s known about how cannabinoids affect memory. Unlike endogenous cannabinoids, which tend to selectively activate some CB1 receptors and not others, compounds in cannabis activate all CB1 receptors at once; this indiscriminate activation may also somehow contribute to the formation of false memories, explains Pertwee, who works with GW Pharmaceuticals, a company that makes prescription medicines derived from cannabis.

Some participants laughed and talked to the virtual characters. Others became paranoid.

In the future, Kloft says she’s interested in looking at how people regard the memories they form when high in order to find out whether they “trust” those memories. “Are they confident in them, and is there any strategy they pursue to correct for their probably impaired memory?”

Study coauthor Elizabeth Loftus, a cognitive psychologist and human memory expert at the University of California, Irvine, says that the team’s study should prompt people to think about best practices when it comes to intoxicated witnesses. “The law recognizes that there are vulnerable witnesses who need extra special care and attention when you’re interviewing them: young children, people with mental disabilities, sometimes the elderly are included in that category,” Loftus says. “Might not [people who are high] be another example of . . . vulnerable witnesses where you’ve got to be extra careful?” —Amy Schleunes

Brain Boost

What if you could boost your brain’s processing capabilities simply by sticking electrodes onto your head and flipping a switch? Berkeley, California–based neurotechnology company Humm has developed a device that it claims serves that purpose. Their “bioelectric memory patch” is designed to enhance working memory—the type of short-term memory required to temporarily hold and process information—by noninvasively stimulating the brain.

In recent years, neurotechnology companies have unveiled direct-to-consumer (DTC) brain stimulation devices that promise a range of benefits, including enhancing athletic performance, increasing concentration, and reducing depression. Humm’s memory patch, which resembles a large, rectangular Band-Aid, is one such product. Users can stick the device to their forehead and toggle a switch to activate it. Electrodes within the patch generate transcranial alternating current stimulation (tACS), a method of noninvasively zapping the brain with oscillating waves of electricity. The company recommends 15 minutes of stimulation to give users up to “90 minutes of boosted learning” immediately after using the device. The product is set for public release in 2021.

Over the last year or so, Humm has generated much excitement among investors, consumers, and some members of the scientific community. In addition to raising several million dollars in venture capital funding, the company has drawn interest both from academic research labs and from the United States military. According to Humm cofounder and CEO Iain McIntyre, the US Air Force has ordered approximately 1,000 patches to use in a study at their training academy that is set to start later this year. Despite the hype, however, some scientists say that the jury is still out on whether

noninvasively stimulating the brain with electricity can have a meaningful effect on cognition. Unlike the more powerful method of transcranial magnetic stimulation, electrical stimulation doesn’t generate a clear sign that the stimulation is influencing the brain, such as an immediately observable muscle twitch.

Humm’s device builds on years of research suggesting that methods of transcranial electrical stimulation (tES)—including tACS and transcranial direct current stimulation (tDCS), which uses a constant rather than an alternating current—can boost memory, attention, and other cognitive processes by influencing natural rhythms in brain activity. In one of the most recent studies, for example, a pair of scientists at Boston University demonstrated that it was possible to reverse age-related working memory deficits in older adults by modulating brain rhythms with tACS.

A few years ago, the Humm team decided to test the efficacy of their product in a randomized, placebo-controlled trial. After recruiting 36 healthy adults between 17 and 56 years old, they split the participants into two groups. One group was given tACS at 6 hertz (previous studies suggested that brain rhythms important for cognitive tasks oscillate at this frequency) for 15 minutes, and the other received a sham stimulation, in which electricity was applied for a brief, 60-second period, then turned off for the rest of the session.

All subjects engaged in the Corsi block-tapping test, a working-memory task that involves recalling the order in which a series of squares appeared on a screen, while they underwent tACS or sham stimulation, as well as before and after. Participants were also asked to rate how likely the stimulation was to improve their performance, both prior to and following the experiment. This study found that tACS-treated participants performed approximately 20

percent better than their sham-treated counterparts—and that expectations regarding their performance did not influence results. “I was surprised that they got such a large effect size,” says Ted Zanto, a University of California, San Francisco, neuroscientist and a scientific advisor to Humm who helped design the study. “In a lot of research labs using different devices, they oftentimes don’t see quite such a large effect.” The findings from the study, which was completed in 2018, are available in a white paper on the company’s website.

“It was a properly designed study . . . and what they found was a statistically significant difference between the two experimental groups,” says Anli Liu, a neurologist at NYU School of Medicine who isn’t involved with the company. “That being said, I think statistical significance doesn’t necessarily mean clinical significance.” Liu points out that the 20 percent difference in performance between the two groups was driven by members of the tACS group being able to recall six items, while the control group remembered only five. In addition, she adds that it remains to be seen if similar improvements can be seen with other working memory tasks.

Other scientists share Liu’s skepticism. Because the company didn’t report whether or not their memory patch generated any changes in the brain, it is unclear whether it actually alters brain rhythms, says Joel Voss, a neuroscientist at Northwestern University who has no affiliation with the company. “It’s important if you want to validate a device to show that it’s actually engaging the hypothesized neural structure and changing its function in the way that you predicted.” Otherwise, Voss adds, the effects could be due to “extraneous, nonspecific influences,” such as placebo or indirect influences on the peripheral nervous system. For example, stimulating the nerves on your skin might make you alert, which may in turn make you more attentive to the task at hand.

On top of that, it remains unclear how much electricity actually enters the brain with tES, according to Voss. He adds that results from several studies indicate that not enough electricity gets into the brain with this technique to have a meaningful impact on function. For example, in one 2019 Nature Communications paper, a group of researchers reported that tACS stimulation could increase a tremor in individuals whose hands already shook and trigger it in healthy individuals asked to hold a tiny weight up with one finger. But this effect disappeared when they applied a topical anesthetic to the skin on participants’ heads before placing the electrodes, hinting that the original effect may have depended on stimulating peripheral nerves, rather than on electricity reaching the brain. Liu and her team have generated similar findings by measuring neural activity directly from the brains of epilepsy patients undergoing surgery, observing that tACS failed to influence memory-related brain waves.

ASSIST PLUS: Automating Multichannel Pipettes. Making hands-free serial dilutions, reagent additions and sample reformatting very affordable for every lab. Compatible with all Integra’s electronic pipettes from 4 to 16 channels for consistent results and unbeatable ergonomics. VIAFLO – Electronic Pipettes. VOYAGER – Adjustable Tip Spacing Pipettes.

I don’t think you can completely explain away the huge literature showing the effects of neural stimulation. —Ted Zanto, University of California San Francisco

Still, Zanto and McIntyre are optimistic about the future of this technology. McIntyre points out that in another 2019 study, which was published on the preprint server bioRxiv, a different group of researchers demonstrated that, even after using a local anesthetic to block skin sensations during tACS in non-human primates, electrical stimulation was able to entrain the firing of neurons in the animals’ hippocampuses, a memory-associated region deep within the brain. While the mechanism of action of tACS remains an open question, “I don’t think you can completely explain away the huge literature showing the effects of neural stimulation,” says Zanto. “If it were a placebo effect, why do people show time and time again changes in cognitive function when paired to a group that gets a placebo control?”

POWER STRIP: This electrode-containing patch could boost working memory, according to Humm, the California-based company that makes it.

The Humm team plans to conduct further studies of the memory patch in order to answer questions such as whether a tACS session will improve performance on other tasks. “We have a laundry list of scientific questions that are of interest,” says Zanto. For now, the company is accepting online registrations for early access to its patch—currently a single-use device that costs around $5 a pop. According to McIntyre, 1,000 people have already paid between $40 and $500 for subscriptions of up to several months’ worth of the product. Those customers will receive the products this year, as part of a beta test. Anyone else who is interested will need to wait until the company officially launches its device in 2021. —Diana Kwon
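Liu's point about the headline effect size reduces to simple arithmetic: the roughly 20 percent advantage comes down to recalling about six Corsi items instead of five. A short check (the item counts are taken from the article; nothing else is assumed):

```python
# The ~20 percent tACS advantage reported in the white paper comes from
# the tACS group recalling about six items versus five for sham.
tacs_items, sham_items = 6, 5
improvement = (tacs_items - sham_items) / sham_items
print(f"{improvement:.0%}")  # prints 20%
```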

Memory Munchers

Nearly seven years ago, Sheena Josselyn and her husband Paul Frankland were talking with their two-year-old daughter and started to wonder why she could easily remember what happened over the last day or two but couldn’t recall events that had happened a few months before. Josselyn and Frankland, both neuroscientists at the Hospital for Sick Children Research Institute in Toronto, suspected that maybe neurogenesis, the creation of new neurons, could be involved in this sort of forgetfulness.

In humans and other mammals, neurogenesis happens in the hippocampus, a region of the brain involved in learning and memory, tying the generation of new neurons to the process of making memories. Josselyn and Frankland knew that in infancy, the brain makes a lot of new neurons, but that neurogenesis slows with age. Yet youngsters have more trouble making long-term memories than adults do, a notion that doesn’t quite jibe with the idea that the principal function of neurogenesis is memory formation.

To test the connection between neurogenesis and forgetting, the researchers put mice in a box and shocked their feet with an electric current, then returned the animals to their home cages and either let them stay sedentary or had them run on a wheel, an activity that boosts neurogenesis. Six weeks

later, the researchers put the mice back in the box where they had received the shocks. There, the sedentary mice froze in fear, anticipating a shock, but the mice that had run on a wheel didn’t show signs of anxiety. It was as if the wheel-running mice had forgotten they’d been shocked before (Science, 344:598–602, 2014).

Frankland says that this sort of active erasure of memories makes sense, because remembering everything that happens can overload the brain; some memories, such as what exactly we did last week, need to be cleared out to make room for new information. While scientists don’t know yet exactly how the brain maintains memories, some suggest that neuronal connections play a role. Neurogenesis may help erase memories, then, if new neurons make their way into established brain circuits and tweak the existing network of synapses. (See “Putting New Neurons in Their Place” on page 38.)

Quantifying RNA or DNA? Get better results with AccuBlue®, AccuGreen®, and AccuClear® Fluorescent Quantitation Kits
• Ultrasensitive: Accurate and sensitive detection ideal for NGS and digital PCR
• Broad Range: Kit options for detecting 1 pg–2000 ng DNA or 5–1000 ng RNA
• Versatile: Options for fluorescent plate reader or Qubit® fluorometer
• Selective: Specialized kits for high selectivity of DNA or RNA
• Best Value: Economical kits with exceptional quality
Learn more at

But neurogenesis might not be the only way memories are removed from the brain, notes Yan Gu, a neuroscientist at Zhejiang University School of Medicine in Hangzhou, China. Research that he and others have carried out since Josselyn and Frankland’s study shows that immune cells called microglia also aid in forgetting, eating away at the connections among nerve cells where memories may reside.

Gu and colleagues made this finding in 2019 in a series of experiments in which they manipulated the numbers of microglia and new neurons in the brains of mice. From past studies, the team knew that microglia remove extra synapses early in life. Given synapses’ suspected role in memory storage, the team wondered whether there was a connection between microglia and memory. To test the idea, the team repeated the experiment Josselyn and Frankland had run, putting adult mice in an unfamiliar cage, then shocking their feet with electricity. The team then returned the mice to their home cages and used a drug to wipe out microglia in some of the animals’ brains, particularly in the dentate gyrus, the specific region of the hippocampus where new neurons are made. When put back into the cage where they’d been shocked, the drug-treated mice froze while the untreated mice didn’t, suggesting that mice with reduced microglia had held onto their memory of the foot shocks better than mice with normal levels of the immune cells.

MEMORY POLICE: Brain cells called microglia (red) snip connections between nerve cells (blue) in the mouse hippocampus, in a process that may influence forgetting.

“After we found depleting microglia prevented forgetting, we then started to investigate why and how microglia regulate forgetting in the brain,” Gu writes in an email to The Scientist. One of the first tests the team did was to determine how mice behaved if they had normal levels of microglia but the immune cells weren’t able to consume synapses as they normally do. Mice given a drug to block microglia’s eating behaviors remembered the foot shock better than mice not given the drug, the team found, confirming that their propensity for forgetting had to do with the immune cells’ ability to manipulate connections among nerve cells. The team then gave another set of mice a drug to boost hippocampal neurogenesis, followed by the drug that blocked microglia from manipulating synapses. Again, the mice froze in the shock box more than untreated mice did, showing that even boosting neurogenesis—which Josselyn and Frankland had found promotes forgetting—couldn’t counteract the memory-protecting effects of knocking out the microglia.

“This connection between microglia and forgetting is fascinating,” says Jorge Valero, a neuroscientist at the Achucarro Basque Center for Neuroscience in Leioa, Spain, who wasn’t involved in Gu’s work but also studies microglia’s role in this phenomenon. Valero and his colleagues recently reported in the Journal of Neuroscience that the immune cells gobble down newly made neurons tagged for cell death (40:1453–82, 2020). When they ingest those new neurons, the microglia begin to secrete chemicals that reduce neurogenesis.

Curious whether microglia’s memory-destroying activity is dependent on neurogenesis, or whether it can still occur even when neurogenesis is absent, Gu’s team tried their experiments again, this time wiping out microglia in a region of the hippocampus where there’s no neurogenesis. Again, the mice without microglia froze longer when put back into the box where they’d been shocked than mice that still had normal numbers of microglia, indicating that microglia aid in the forgetting of memories that are not tied to newly made neurons at all, the team reported in Science in February (367:688–94, 2020).

“It looks like a very careful study,” says the Stony Brook University School of Medicine’s Stella Tsirka, who was not involved in the research. She’s studied microglia for several decades and has long suspected that the cells have a function not only in immune responses during disease but in the normal, healthy brain. Gu’s work provides solid evidence for microglia’s role in forgetting in the hippocampus, she says, though it’s not yet clear if the immune cells also munch on memories in other regions of the brain. —Ashley Yeager





Wrist-Mounted Air Pollution Detector
A novel sampler records data on a broad range of environmental contaminants.
BY CLAIRE JARVIS


Linking air pollutant exposure to health outcomes—which can include impaired learning and memory as well as increased risk for respiratory, cardiovascular, and other diseases in people—poses a data-collection challenge for researchers. Sensors capable of detecting different classes of pollutants exist, but few devices can collect a broad spectrum of environmental contaminants. Combining sensors into a single device often requires that participants carry a backpack, one that may be too large and heavy for members of vulnerable populations, such as children and the elderly, to conveniently tote around.

In search of a better solution, Yale University environmental health scientist Krystal Pollitt and her team designed a cheap, wearable sensor that could be used for large-scale studies of vulnerable populations. The Fresh Air wristband consists of a triethanolamine-coated foam that samples nitrogen dioxide and a polydimethylsiloxane (PDMS) sorbent bar that absorbs airborne organic pollutants. Pollitt’s design avoided the batteries and air pumps that made other sensors heavy. Instead, pollutants passively absorb into the PDMS bar over several days. When their concentrations are high enough, researchers use gas chromatography–mass spectrometry to analyze the samples.

FASHIONABLE DETECTOR: To make an air pollution detector that study subjects would want to wear, researchers used components that can fit into a wristwatch-style device. Inside the wristband, air pollutants (including nitrogen dioxide, volatile organic compounds, and polycyclic aromatic hydrocarbons) diffuse into two types of sorbent material over time: triethanolamine-coated foam and a polydimethylsiloxane (PDMS) sorbent bar. The sorbent material is later removed from the wristband and the amount of pollutant collected is quantified.

The lack of real-time pollutant tracking data is one drawback of this passive sampling approach. Without a battery or pump, “you don’t have anything that’s driving the air through the sample to concentrate [pollutants] faster,” explains Pollitt. However, real-time pollutant sensors aren’t able to detect all the pollutants researchers are interested in. For example, polycyclic aromatic hydrocarbons, a class of compounds emitted by burning fuel that has been linked with liver damage, are present at levels too low to be detected by currently available real-time detectors. “Some esoteric, rarefied pollutants can’t be detected in miniaturized wearable sensor technology at the moment,” says Benjamin Barratt, an environmental researcher at King’s College London who was not involved in this study. “So for them we have to move to passive sampling.”

Pollitt’s team tested the wearability of the wristband on adult volunteers over 24 hours, then recruited 33 children aged 12–13 years to wear the wristband for 4–5 days. The researchers obtained sensitive and reproducible pollutant readings for nitrogen dioxide as well as airborne volatile organic compounds and polycyclic aromatic hydrocarbons. They are now trialing the wristband in India and China to track maternal pollutant exposure during pregnancy.

The researchers have demonstrated “supreme wearability” with their wristband, says Ellison Carter, a civil and environmental engineer at Colorado State University. Carter, who did not participate in the research, notes that for a wearable pollutant detector to be successful, people must want to keep wearing it. (Environ Sci Technol Lett, doi:10.1021/acs.estlett.9b00800)






Active air pollutant sensor
How it works: Air is pulled through the detector by a battery-operated pump. A software algorithm converts pollutant signals into concentrations.
Pros: Minute-by-minute data collection is possible.
Cons: Limited class of pollutants detectable. Device is large (1 kg). Algorithms can be altered by software developers, possibly reducing accuracy.

Passive air pollutant sensor
How it works: Pollutants in air that come into contact with the detector are absorbed and passively diffuse into the sampler. Scientists in the lab vaporize the sample and analyze the particles via chromatography and spectrometry.
Pros: Can detect minuscule quantities of rare organic compounds. Cheap to manufacture. Light and small enough to go on a wristband.
Cons: Requires 2–5 days to collect sufficient quantities of pollutants for accurate analysis. Data aggregated over time.

05.2020 | THE SCIENTIST 23

Manipulating Memory
Strategies to make lab animals forget, remember, or experience false recollections probe how memory works and may inspire treatments for neurological diseases.
BY AMBER DANCE


A mouse finds itself in a box it’s never seen before. The walls are striped on one side, dotted on the other. The orange-like odor of acetophenone wafts from one end of the box, the spiced smell of carvone from the other. The mouse remembers that the orange smell is associated with something good. Although it may not recall the exact nature of the reward, the mouse heads toward the scent.

Except this mouse has never smelled acetophenone in its life. Rather, the animal is responding to a false memory, implanted in its brain by neuroscientists at the Hospital for Sick Children in Toronto.¹ Sheena Josselyn, a coauthor on a 2019 Nature Neuroscience study reporting the results of the project, says the goal was not to confuse the rodent, but to confirm the scientists’ understanding of mouse memory. “If we really understand memory, we should be able to trick the brain into remembering something that never happened at all,” she explains. By simultaneously activating the neurons that sense acetophenone and those associated with reward, the researchers created the “memory” that the orange-y scent heralded good things.

Thanks to optogenetics, which uses a pulse of light to activate or deactivate neurons, Josselyn and other scientists are manipulating animal memories in all kinds of ways. Even before the Toronto team implanted false memories into mice, researchers were making rodents forget or recall an event with the flick of a molecular light switch. With every flash of light, they test their hypotheses about how these animals—and by extension, people—collect, store, and access past experiences. Scientists are also examining how memory formation and retrieval change with age, how those processes are altered in animal models of Alzheimer’s disease, and how accessing memories can influence an animal’s emotional state.

“Almost every neurological disease or psychiatric disease—everything from autism to stress to PTSD to Alzheimer’s to epilepsy—they all affect the memory system,” says Denise Cai, a neuroscientist at the Icahn School of Medicine at Mount Sinai.

A little more than a decade ago, such memory manipulations might have seemed like science fiction—and in terms of applying them to people, they mostly still are. But in two watershed papers, published in 2009² and 2012,³ researchers blew open the doors to memory control in lab animals. In addition to optogenetic control over neuronal firing, transgenic techniques allowed scientists to prime or modify the specific cells that were activated when the animals first stored a new memory. That collection of neurons, called a memory trace, will fire again during recollection. In the first of these two seminal papers, Josselyn’s team showed they could control, ahead of time, which neurons would join a trace, and then kill those cells to eliminate the memory. Shortly thereafter, Susumu Tonegawa’s group at MIT presented techniques to identify and reactivate memory traces. Since then, “it’s reached a fever pitch,” says Boston University neuroscientist Steve Ramirez, a Tonegawa lab alum and coauthor on the 2012 paper.

Forgetting and remembering
In Toronto, Josselyn is one-half of a memory-manipulating duo with her husband, Paul Frankland. In their 2009 Science study,² the pair aimed to make mice forget a specific memory: that in a particular chamber, the sound of a tone preceded a foot shock. Fear conditioning is a common technique in the field because the memory forms quickly, within a few trials. Rodents that recall the experience freeze in fear upon hearing the tone again, while those that forget are more likely to carry on exploring their cage as normal.

Memory traces incorporate neurons throughout the brain, touching parts that process sights, sounds, smells, and emotions. For simplicity, researchers usually focus their studies on one brain area of interest. In this case, the Toronto team zeroed in on the amygdala, which processes emotions such as fear. The amygdala also takes part in storing memories of events and emotions.

Rather than wait to see which neurons would join the memory trace and identify those, the researchers primed certain neurons to join that trace. To do so, they took advantage of the fact that the production of the transcription factor CREB makes neurons excitable, and that excitability makes them more likely to take part in memory storage. Using a virus-delivered genetic construct, the researchers randomly overexpressed CREB in a subset of neurons in the amygdala. When the team trained the mice to link the tone with a shock, those high-CREB neurons were three times more likely to join the trace than unaltered cells.

The genetic construct Josselyn, Frankland, and their colleagues used also made the engineered neurons—many of which joined the memory trace—vulnerable to a toxin produced by the bacterium that causes diphtheria. Once the team injected the toxin to kill those trace neurons, the mice didn’t freeze in response to the tone. “The memory was gone,” says Josselyn. That study showed that memory scientists could sit in the driver’s seat, bending traces and their associated memories to their will. Meanwhile, the MIT team set out to force mice to recollect a fearful experience whenever the scientists wanted them to. Doing so would prove that the cells in the trace they manipulated were indeed behind the memory.

Life, in the real world, is an accumulation of an almost infinite number of memories across a lifetime. —Denise Cai, Icahn School of Medicine at Mount Sinai

Their work depended on the optogenetic tool channelrhodopsin, a protein that spans cell membranes and responds to light. Blue light, delivered via an implanted optic fiber, causes the channel to open, letting in positive ions and making neurons fire. The researchers developed a system to express channelrhodopsin in mouse neurons involved in a particular memory trace in the dentate gyrus, a brain structure closely associated with the hippocampus. The hippocampus is involved in learning and memory as well as emotion and motivation; the dentate gyrus integrates sensory inputs during memory formation.

Like Josselyn and Frankland, the MIT group used fear conditioning, training mice to associate a specific place and tone with a shock. The team designed their transgenic mice so that memory traces would not be labeled if the animals’ diets included the antibiotic doxycycline. The researchers removed doxycycline from the mice’s meals for a short time to create a memory with channelrhodopsin expressed in the relevant trace cells, then reinstated the doxycycline treatment to avoid labeling any other memory traces. When they later shone blue light into the brain to activate the target trace while the mice were in a different place, the animals would recall the foot-shock situation and freeze. The team concluded in their 2012 Nature study³ that those cells truly represented the memory in the brain. (See illustration on page 28.)

Those early days of memory manipulation were “incredibly exciting,” recalls neuroscientist Tomás Ryan, who trained at MIT with Tonegawa during that time before starting his own lab at Trinity College Dublin. The new techniques “entirely changed how we can do things.” Since then, many researchers have adopted Tonegawa’s system, or created variations, to ask their own questions about memory. For example, rather than force a recollection, a parallel technique can make a mouse forget. Christine Ann Denny, a neuroscientist at Columbia University Irving Medical Center, and colleagues bred mice to produce archaerhodopsin, a protein that pumps protons out of the cell in response to yellow light, silencing neurons. The researchers manipulated a group of hippocampal neurons that linked a lemon-scented chamber to a foot shock. When they silenced those neurons with yellow light, the mice forgot their fear.⁴


False memories
Perhaps the ultimate test for scientists’ understanding of memory is to make one, from scratch, as Frankland, Josselyn, and colleagues did last year. To succeed, they needed to do two things: First, fake some cue—the neural equivalent to the real-life sensation, such as a tone, that mice are normally exposed to in conditioning experiments. And second, falsify the mouse’s associated expectations—the good or bad outcome that the animal would anticipate when it sensed that cue.

For the cue, the team chose smell because the neurons in the olfactory system are understood in detail. Olfactory neurons with a receptor called M72 are activated by orange-scented acetophenone. Using mice that produce channelrhodopsin in every M72 sensory neuron, the team could shine blue light in this part of the brain to trigger the sensation of a whiff of orange. To set the animals’ expectations, the researchers tapped into one of two known pathways into the midbrain’s ventral tegmental area, which is involved in behavior reinforcement. One of the pathways is linked to reward, the other to aversion. By pairing the optogenetic stimulation of one of those pathways with optogenetic stimulation of M72, the team could link the scent cue to a good or bad “memory.”

Control mice didn’t particularly prefer one side or the other of the striped and dotted box, despite the different wallpaper and scents. But mice that had been optogenetically trained to associate M72 activation with a reward spent more time near the end smelling like oranges. If they were conditioned to link the orange smell with an unpleasant sensation, they avoided it. The mice showed no preference for or aversion to carvone, the control scent. It was exactly as the team predicted, demonstrating that the researchers understood the rudiments of the underlying memory systems.
When the scientists examined the neurons activated in the animals’ brains, there was significant overlap between the memory traces of mice with the artificial aversion memory and those of mice that had actually experienced a foot shock while smelling acetophenone, further validating the results. “This shows that we are beginning to have a much deeper understanding of how memories are made,” says Josselyn, “so much so that we can mimic the process and create an artificial memory using only optogenetics.”

REMEMBER WHEN: In a slice of hippocampus from a mouse brain, green fluorescent protein is present in the neurons associated with a fear memory of mild foot shocks (top) or a positive memory of a male-female social interaction (bottom). (Blue and red are counterstains for all the cells in the area.)

Most neuroscientists in this field work in rodents and study episodic memories—memories of experiences that an animal has lived through. In contrast, University of Texas Southwestern Medical Center neuroscientist Todd Roberts has successfully implanted procedural memories, which encode how to do something, in the brains of birds. Young male songbirds must learn their father’s song in order to woo mates when they grow up. For zebra finches, Roberts says, just a few seconds of dad’s tune—a repetition of three to six unique elements, about 100 milliseconds each—is enough to seed the young bird’s memory. Male chicks then spend months practicing until their songs match the melodies they remember. “They will develop a perfect copy,” says Roberts.

In 2014, Roberts and then-graduate student Wenchan Zhao set out to do something much simpler than implant a memory: they wanted to disrupt the song-learning circuit in the bird’s nidopallium, a brain region that serves similar top-level functions to those of the mammalian cortex. They assumed that if they did so while the bird was listening to an adult’s song, the memory of the song would be scrambled. But a control experiment yielded unexpected results. Zhao used channelrhodopsin to stimulate the learning circuit of a young bird raised without a father figure, before the animal was transferred to the company of an adult male tutor. Zhao expected that when the baby interacted with the male tutor, it would learn that male’s song. It didn’t. “This bird, when it grew up, had a really weird song,”

METHODS OF MEMORY MANIPULATION
As a memory forms, certain neurons are incorporated into a memory trace, a neural network associated with a particular experience that is active when the memory is recalled. By permanently altering those neurons in mice, researchers can control their activity. Neurons are engineered to produce channelrhodopsin (Chr), a light-sensitive ion channel, once they’re recruited into a specific memory trace. From memory formation onward, blue light can activate them, triggering the animal to act as if it is recalling the previous experience.

Over the past decade, researchers have developed and refined techniques to activate channelrhodopsin to make a mouse behave as if it’s recalling a specific experience (Nature, 484:381–85, 2012).

1. Scientists engineer mice such that neurons will produce channelrhodopsin once recruited into a memory trace. The mouse’s diet determines when the neurons are vulnerable to this effect.
2. As the mouse experiences a foot shock, delivered in a specific enclosure and accompanied by a tone, the neurons recruited to that memory trace are altered and begin to make channelrhodopsin.
3. Later, scientists can use blue light to activate the trace neurons, causing the cells to fire and the mouse to freeze in fear, as it learned to do when presented with the tone that heralded a foot shock.

Last year, researchers used channelrhodopsin to implant a completely false memory in a mouse’s brain: the idea that it had experienced something negative associated with the orange-like smell of acetophenone (Nat Neurosci, 22:933–40, 2019).

1. Scientists engineer M72 olfactory receptor neurons, which sense orange-scented acetophenone, to respond to blue light.
2. They do the same with neurons that control aversion to unpleasant stimuli such as a foot shock.
3. Stimulation of both areas simultaneously results in the formation of a false memory, linking the acetophenone odor to unpleasantness.
4. A mouse that has never experienced the smell of acetophenone will avoid the orange-like odor.


Roberts says. Zhao’s brief pulse of light had set the bird’s song memory, implanting an artificial sense of how the tune should sound. The bird then spent its youth striving to measure up to that fake memory. Intrigued, Zhao experimented by exposing young, untutored birds with channelrhodopsin-carrying neurons in their learning circuit to the blue light for different periods of time, then raising them without tutors. If Zhao flashed the light for 50 milliseconds, the chicks grew up to sing songs with shorter-than-normal elements, producing a melody more like the quick trills of a canary. If she lit up the brain for 300 milliseconds, the sound elements were too long.⁵ “It sounds like they’re just yelling one pulse,” says Roberts. “It’s really quite bizarre.”

Good memories, bad memories
Many memories aren’t neutral, but are charged with emotion. Ramirez recalls enduring a breakup that took place over a large iced coffee at Crema Café in Harvard Square in 2012, when he was in graduate school. “In the immediate aftermath, going past Crema was a painful reminder . . . an emotional kick to the gut,” he recalls. The cafe became linked to the unpleasant memory, he says. But with time, as Ramirez came to terms with the breakup and continued to frequent Harvard Square, that emotional tinge faded. He says he could visit the cafe for his favorite peanut butter–banana sandwich without distress by the time Crema closed last year, seven years later.

His experience illustrates the theory behind exposure therapy for negative memories, which involves re-experiencing the real situation linked to trauma or anxiety, or an imagined or virtual reality version of it. For example, therapists have used virtual reality scenes featuring jungles and helicopters to treat PTSD in Vietnam veterans. The hypothesis holds that repeated exposure to a memory can drain it of emotional power.

Ramirez wondered if he could do the same thing, optogenetically, in mice—if repeatedly activating the memory of something scary could diminish the associated freezing behavior. His team focused their attention, and light beams, on the top of the dentate gyrus, where contextual information such as place and time is recorded. (See “Memories of Time” on page 32.) They trained mice to associate a particular chamber with a shock, and tagged the corresponding memory trace in the dentate gyrus with channelrhodopsin. Then they reactivated that trace with light for 10 minutes, twice daily for five days, forcing the mice to recall the experience while in a novel, shock-free zone. Mice that were returned to the original, shock-linked chamber were less likely to freeze than mice that had not been subjected to memory reactivation. In the treated mice, fewer neurons from the original trace were active the second time in the chamber.⁶ “We think, in this case, it’s that particular fearful memory that we were able to turn the volume down on,” says Ramirez.

The researchers discovered that the location of the stimulation mattered. When they reactivated neurons from the same trace, but in the bottom of the dentate gyrus—associated with responses to stress and anxiety—they got the opposite results. Mice activated more of the neurons associated with the original fear memory when returned to the original enclosure, and the animals were more likely to freeze, as if the volume of their fearful memory had been turned up.

In other experiments, Ramirez has examined the power of positive memories to alleviate depression-like symptoms in mice. To create that positive memory, the researchers let the mice spend time with a mouse of the opposite sex, while labeling active trace neurons in the dentate gyrus with channelrhodopsin. Then, they stressed the mice by immobilizing them in a cone-shaped device to produce a depression-like state. When the animals were then lifted by their tails, these mice spent less time struggling than non-stressed mice, and they showed little preference for sugar water, normally a desirable treat. But when the team stimulated the recollection of the earlier romantic interlude, the mice immediately acted like they felt better, choosing sugar water over plain water and spending more time trying to escape when dangled.⁷

“It seemed to be a very effective way of reversing these depression-related behaviors,” says Ramirez. In contrast, mice that experienced reactivation of a neutral or negative memory didn’t show improvement of symptoms.

These crazy things we can do in the lab are really important to back up our understanding of what the brain is doing. —Sheena Josselyn, Hospital for Sick Children

People with depression have difficulty recalling positive experiences, Ramirez notes, but if there were some way to promote those recollections, it might help. He wonders: “Could we almost view memory as a drug?”

Memories lost and found
Everyone’s memories can naturally fade. Memory loss can also be pathological, as in the case of Alzheimer’s disease or amnesia. But when memories disappear, are they gone for good? Or has the brain merely lost access to the trace?

At Columbia, Denny and colleagues tested whether they could give mice better access to lost memories, jump-starting the recollection with optogenetics. The researchers crossed mice modeling Alzheimer’s disease (AD) with ones that would allow them to label a memory trace in the dentate gyrus with channelrhodopsin. The team let the animals age until they started to show deficits in memory tests at six months (roughly equivalent to age 30 in human years), then activated channelrhodopsin in the memory trace as the mice learned to anticipate a shock in a particular chamber. Five days later, when the animals were returned to that same chamber, the researchers stimulated the channelrhodopsin-labeled trace cells with light. With their memories reactivated, six-month-old AD mice froze as often as non-AD animals, indicating that the memory was still there.⁸ The effects wore off within a day of stimulation, though, suggesting more stimulation would be necessary to produce ongoing memory improvements.

Ryan and Tonegawa saw similar results in tests of mice with amnesia.⁹ With stimulation of a trace, “the memory comes back,” Ryan says. “Even severe kinds of memory loss can be because the memory is locked in your brain, not destroyed.” That matches the tendency for most people with amnesia to recover.

Could such faded memories be restored in humans? Denny thinks that somehow stimulating the dentate gyrus in people with Alzheimer’s might help with memory loss. Of course, that’s easier said than done. “We’re not going to be sticking optic fibers into the human brain anytime soon,” says Ramirez. Clinical applications will require different tools, such as medications or psychotherapy.

In some cases, it’s simpler to stimulate a memory in a person than in a mouse. Psychotherapists can bring up a past experience in conversation, or show a patient a picture. Just recalling a memory makes it malleable, vulnerable to being overwritten with a different emotional load. In other words, “face your fears,” says Johannes Gräff, a neuroscientist at the École Polytechnique Fédérale de Lausanne in Switzerland. Researchers are experimenting in clinical trials with drugs such as ketamine and MDMA (dubbed “ecstasy” by recreational users) that may help people change the emotional charge of certain memories as they reflect upon those episodes.

But a person who has experienced trauma or forgetfulness in a complex natural environment is hardly the same as a cloistered lab mouse worried about a foot shock. “Life, in the real world, is an accumulation of an almost infinite number of memories across a lifetime,” says Cai. And complete memory traces are not limited to the few thousand cells that scientists can access in a mouse brain using an optic fiber.

As a result, researchers are moving toward more-realistic interrogations of memory. Denny and Ramirez are building whole-mouse-brain, 3-D memory maps. The pair and others are investigating multiple memories, their interactions, and how the system changes with age. Experiments of this variety will provide deeper insights into the neuroscience of memory, which might eventually support the clinical use of memory manipulation. While direct manipulations of human memory traces are a long way off, many neuroscientists remain in awe of what’s been achieved in animals after just a decade of using optogenetics to delete memories or implant false ones. “These crazy things we can do in the lab are really important to back up our understanding of what the brain is doing,” says Josselyn.
Plus, she admits, “doing the science-fiction type things is really fun.”

Amber Dance is a freelance science journalist living in the Los Angeles area. Read her work or reach out at

References
1. G. Vetere et al., “Memory formation in the absence of experience,” Nat Neurosci, 22:933–40, 2019.
2. J.-H. Han et al., “Selective erasure of a fear memory,” Science, 323:1492–96, 2009.
3. X. Liu et al., “Optogenetic stimulation of a hippocampal engram activates fear memory recall,” Nature, 484:381–85, 2012.
4. C.A. Denny et al., “Hippocampal memory traces are differentially modulated by experience, time, and adult neurogenesis,” Neuron, 83:189–201, 2014.
5. W. Zhao et al., “Inception of memories that guide vocal learning in the songbird,” Science, 366:83–89, 2019.
6. B.K. Chen et al., “Artificially enhancing and suppressing hippocampus-mediated memories,” Curr Biol, 29:1885–94.e4, 2019.
7. S. Ramirez et al., “Activating positive memory engrams suppresses depression-like behavior,” Nature, 522:335–39, 2015.
8. J.N. Perusini et al., “Optogenetic stimulation of dentate gyrus engrams restores memory in Alzheimer’s disease mice,” Hippocampus, 27:1110–22, 2017.
9. T.J. Ryan et al., “Engram cells retain memory under retrograde amnesia,” Science, 348:1007–13, 2015.



Memories of Time
Rats and equations help researchers develop a theory of how the human brain keeps track of past experiences.
BY CATHERINE OFFORD



No matter how he looked at the data, Albert Tsao couldn’t see a pattern. Over several weeks in 2007 and again in 2008, the 19-year-old undergrad trained rats to explore a small trial arena, chucking them pieces of tasty chocolate cereal by way of encouragement. He then recorded the activity of individual neurons in the animals’ brains as they scampered, one at a time, about that same arena. He hoped that the experiment would offer clues as to how the rats’ brains were forming memories, but “the data that it gave us was confusing,” he says. There wasn’t any obvious pattern to the animals’ neural output at all.

Then enrolled at Harvey Mudd College in California, Tsao was doing the project as part of a summer internship at the Kavli Institute for Systems Neuroscience in Norway, in a lab that focused on episodic memory—the type of long-term memory that allows humans and other mammals to recall personal experiences (or episodes), such as going on a first date or spending several minutes searching for chocolate. Neuroscientists suspected that the brain organizes these millions of episodes partly according to where they took place. The Kavli Institute’s Edvard Moser and May-Britt Moser had recently made a breakthrough with the discovery of “grid cells,” neurons that generate a virtual spatial map of an area, firing whenever the animal crosses the part of the map that that cell represents.¹ These cells, the Mosers reported, were situated in a region of rats’ brains called the medial entorhinal cortex (MEC) that projects many of its neurons into the hippocampus, the center of episodic memory formation.

Inspired by the findings, Tsao had opted to study a region right next to the MEC called the lateral entorhinal cortex (LEC), which also feeds into the hippocampus. If the MEC provided spatial information during memory formation, he and others had reasoned, maybe the LEC provided something else, such as information about the content of the experience itself.
Tsao had been alternating the color of the arena’s walls between trials, from black to white and back again, to see if LEC neurons showed consistently different firing patterns in each case. But he was coming up empty-handed.

While Tsao struggled to make sense of his data, a researcher on the other side of the Atlantic Ocean was tackling a seemingly unrelated problem. Marc Howard, a theoretical and computational neuroscientist then at Syracuse University, had filled a chalkboard with equations describing how the brain might achieve the complex task of organizing memories, according not to where they were formed, but to when. His mathematical model showed that if the passing of time was represented in a certain way in neural circuits, then that time signal could be converted into a series of mental “time stamps” during memory formation to help the brain organize past experiences in chronological order. Without data to confirm his model, however, the idea remained just that: an idea. It would be several years before the two researchers became aware of each other’s work. By the time they did, neuroscientists had started thinking in new ways about how the brain keeps track of when experiences occurred. Today, the theoretical and experimental advances made by Howard, Tsao, and others in this field are helping to reshape researchers’ understanding of how episodic memories are formed, and how they might influence our perception of the past and future. Back in 2008, however, Tsao was focused on finishing college. When his second summer in Norway came to an end, he left the Kavli Institute and his confusing dataset behind, and returned to California.

Another dimension

When the cognitive neuroscientist Endel Tulving coined the term “episodic memory” in a book chapter in 1972, he observed that recalling the content of memories was linked to a strong subjective sense of where and when an episode took place. The where component has been a focus of neuroscientific research for decades. In 1971, University College London neuroscientist John O’Keefe discovered place cells, neurons in the hippocampus that fire in response to an animal being in specific locations.2 He shared the Nobel Prize in Physiology or Medicine with the Mosers in 2014 for their discovery of grid cells in the MEC, and several studies published since suggest that grid cells help the hippocampus generate place cells during memory formation.

How the brain encodes the when of memories has received far less attention, notes Andy Lee, a cognitive neuroscientist at the University of Toronto. “Space is something we see, it’s easy to manipulate. . . . It’s somewhat easier for us to grasp intuitively,” he says. “Time is much harder to study.” Despite the thorniness of the subject, researchers have established in the last decade or so “that the brain has multiple ways to tell time,” says Dean Buonomano, a behavioral neuroscientist at the University of California, Los Angeles, and author of the 2017 book Your Brain Is a Time Machine. Time is integral to many biological phenomena, from circadian rhythms to speech perception to motor control and any other task involving prediction, Buonomano adds.

One of the biggest breakthroughs in understanding time as it relates to episodic memory came a few years after Tsao completed his internship, when the late Boston University neuroscientist Howard Eichenbaum and colleagues published evidence of “time cells” in the hippocampus of rats.3 Hints of time-sensitive cells in the hippocampus had been trickling out of labs for a couple of years, but Eichenbaum’s study showed definitively that certain cells fire in sequence at specific timepoints during behavioral tasks: a rat trained to associate a stimulus with a subsequent reward would have one hippocampal neuron that peaked in activity a couple hundred milliseconds after the stimulus was presented, another that peaked in activity a few hundred milliseconds after that, and so on—as if the hippocampus were somehow marking the passage of time. The findings, which are beginning to be extended to humans thanks to work by Lee’s group and a separate team at the University of Texas Southwestern,4,5 among others, generated interest in the representation of time alongside space in episodic memories. Yet it was unclear what was telling these cells when to fire, or what role, if any, they played in the representation of time passing within and between individual episodic memories.

For Marc Howard, long fascinated by questions about the physical nature of time and the brain’s perception of it, the puzzle was a captivating one. In the years leading up to Eichenbaum’s paper, Howard and his postdoc Karthik Shankar had been developing a mathematical model based on the idea that the brain could create a proxy for the passage of time using a population of “temporal context cells” that gradually changes its activity.6 According to this model, all neurons in this population become active following some input (a sensory stimulus, for example), and then relax, one by one, creating a gradually decaying signal that is unique from moment to moment. Then, during memory formation, the brain converts this signal into a series of sequentially firing “timing cells,” which log moments within a memory. The same framework could also work to tag entire episodes according to the order in which they took place. The specific mathematical details of the model—in particular, the use of an operation called a Laplace transform to describe how temporal context cells compute time, and the inversion of that transform to describe the behavior of the hypothesized timing cells—nicely recapitulated several known features of episodic memory, such as the fact that it’s easier to remember things that happened more recently than things that happened a long time ago. And after hippocampal time cells, with their sequential firing patterns, were described in 2011, Howard, by then at Boston University, was gratified to see that they seemed to possess many of the properties he and Shankar had predicted for their so-called timing cells. But the first piece of the puzzle was still missing. No one had identified the gradually evolving set of temporal context neurons needed to produce the time signal in the first place, Howard says.

[Infographic: panels contrast gradually decaying temporal context cell activity in the LEC with sequential time cell activity in the hippocampus.] It’s unclear how the brain keeps track of the timing of events within a memory. One theory posits that, as memories are formed, temporal information about the experience is represented by gradual changes in activity in a particular population of neurons situated in the brain’s lateral entorhinal cortex (LEC). These neurons, called temporal context cells, become active at the beginning of an experience—as a rat explores an arena, for example—and then relax gradually, at different rates. Other brain cells (not shown) may also become more active throughout an experience, or change their activity on a slower time scale, spanning multiple experiences. This information is fed into the hippocampus, which generates time cells. These cells become active sequentially at specific moments during an experience to mark the passage of time.
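The core of the Laplace-transform idea can be sketched in a few lines of code. In the toy simulation below (Python; the readout order k, the decay rates, and all other parameter choices are illustrative assumptions, and the published model is considerably richer), each temporal context cell decays exponentially after an impulse stimulus, so the population carries a Laplace transform of the stimulus history; a standard approximation to the inverse transform (the Post formula) then yields readout units that peak one after another, like time cells.

```python
import numpy as np
from math import factorial

# Toy sketch of the temporal-context idea (hypothetical parameters, not the
# published model). After an impulse stimulus at T = 0, a temporal context
# cell with decay rate s has activity F(s, T) = exp(-s * T); across many
# values of s, the population encodes the Laplace transform of the
# stimulus history.

def time_cell_activity(T, tau_star, k=8):
    """Hypothesized time cell tuned to lag tau_star, read out by inverting
    the transform with the Post approximation:
        f(tau*) ~ ((-1)^k / k!) * s^(k+1) * d^k F / d s^k,  with s = k / tau*.
    For F = exp(-s*T), the k-th derivative in s is (-T)^k * exp(-s*T),
    which gives the closed form below."""
    s = k / tau_star
    return ((-1) ** k / factorial(k)) * s ** (k + 1) * (-T) ** k * np.exp(-s * T)

T = np.linspace(0.01, 10, 2000)      # elapsed time since the stimulus (seconds)
for tau_star in (1.0, 2.0, 4.0):     # preferred lags of three model cells
    peak = T[np.argmax(time_cell_activity(T, tau_star))]
    print(f"cell tuned to {tau_star:.0f} s peaks ~{peak:.2f} s after the stimulus")
```

Running the sketch shows each readout unit peaking at its preferred lag, with broader peaks at longer lags, echoing the model's feature that memory for the recent past is sharper than memory for the distant past.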

“We waited a long time for somebody to do the experiment—really just moving the electrodes over to the LEC and looking for it.”

Finding a signal

After graduating from Harvey Mudd in 2009, Tsao returned to the Kavli Institute for a PhD. Although he mostly worked on other projects, by the end of his program he’d convinced himself, and the Mosers, that the rat experiments from his summer internship were worth another look. Tsao was “an exceptional student,” May-Britt Moser says, and the Kavli team trusted that his data were correct, but “we didn’t know what we were seeing.” The neurons in the LEC seemed to be behaving so unpredictably.

Digging back into his old work after he graduated from his PhD program, Tsao began thinking about better ways to analyze the dataset. “We had always looked at activity at the level of individual neurons,” he says. “At some point, we decided to look at it at the entire population level.” In doing so, Tsao revealed that LEC activity was, in fact, changing—gradually, within and between trials. Data from further experiments, carried out by Kavli researchers after Tsao moved to Stanford University for a postdoc in 2015, showed that a whole cluster of cells within the LEC became active at the beginning of trials, and then that activity decayed as individual neurons relaxed at various rates. Other cells in the LEC, meanwhile, seemed to become gradually less (or sometimes more) active over the course of the entire experiment. Looking at the data this way, the team was able to distinguish individual trials not just according to wall color but, far more intriguingly, by the order in which the rat had done them, explains May-Britt Moser. “Together, [these cells] coded for time.”

Publishing the findings in late 2018, the team cited Howard’s and Shankar’s work, highlighting how the sort of activity patterns Tsao had seen in the LEC neuronal population matched up with the pair’s theoretical predictions.7 The Norwegian group also noted that this evolving signal seemed able to track passing time over multiple timescales—changing fast enough to distinguish between individual moments on the scale of seconds within a single episode, as well as to distinguish whole episodes from one another over the scale of minutes or hours. On reading the team’s findings, “I was ecstatic,” Howard says. “It was really a big deal for me.” The paper was exciting for many in the neuroscience community, and its publication was followed by a burst of theoretical work from several groups, not just Howard’s. Edmund Rolls, a computational neuroscientist at the University of Warwick, incorporated the findings from the Kavli group’s 2018 paper into a model that explored how interacting networks in the brain might convert gradually changing LEC activity into a sequence of hippocampal time cells,8 based on a framework he’d developed more than a decade earlier to explain how grid cells might lead to the generation of place cells.

[Infographic, continued.] Some researchers hypothesize that, because the signal provided by the LEC is unique at any one time point, activity in this brain area could help timestamp memories themselves to allow temporal organization of individual episodes, in addition to marking time within experiences. Together, these records of time may help create the brain’s sense of when and in what order events happened, and could potentially aid the recall of memories later on by reinstating past patterns of activity.

Together, these cells coded for time. —May-Britt Moser, Kavli Institute

Additional experimental data started flowing in, too. Howard and colleagues, for example, analyzed recordings from monkeys’ entorhinal cortex—an area containing the MEC and LEC—and found activity similar to that observed in Tsao’s rats, according to a preprint published last summer on bioRxiv. Specifically, a cluster of neurons in the entorhinal cortex spiked after a monkey was presented with an image, and then returned to baseline, with different neurons relaxing at different rates.9 Just a couple of months later, researchers in Germany reported that activity recorded from the human LEC could be used to reconstruct the timeline of events people experienced during a learning task.10 The gradual change in LEC activity wasn’t the only novel result from Tsao’s paper. Several groups picked up on a related finding that the rate of change in the LEC—and indeed in many areas of the brain—may depend on the sort of experience an animal is having. That phenomenon might help explain why the passage of time within episodic memories seems so subjective.

Personal time

As a follow-up to his original experiments with the rat arena, Tsao had done a couple of additional trials during his Kavli internship with a figure-eight maze. In each of those trials, instead of freely exploring an arena, the rat would run around the maze, following the track left, then right, then left, and so on. After discovering patterns in rats’ LEC neuronal firing during arena trials, Tsao hoped to see something similar in data from the figure-eight mazes—something that would distinguish trials from one another according to when they took place. “But . . . it turned out we couldn’t tell them apart very well,” says Tsao. “For a while this was very disappointing—this was basically the opposite conclusion that we had reached from the [arena] experiment.” It wasn’t until Tsao dug into the literature on episodic memory that he came to realize what might be going on. “Maybe it’s not so much about physical time, as you measure in clocks, but more about subjective time, as you perceive it,” he says. Running in a twisted loop was a repetitive, boring task compared to exploring an arena, and the rat’s LEC seemed to reflect that by changing its activity less substantially during the figure-eight experiment than it had during the arena experiment. It seemed as though the rat’s brain wasn’t really experiencing individual figure-eight trials as distinct events, at least not to the extent it had for arena trials, Tsao says.

This link between the type of experience and the way time is represented in neurons touches on a well-known quirk of episodic memory. It’s easier to pick out memories from a week of exciting and varied activities than from a week filled with normal, uninteresting tasks, and the former feels much longer than the latter when it’s recalled. (This is different from the sensation of time dragging when doing something boring—an effect of consciously counting time as it passes rather than representing it in a memory of the event, notes Buonomano.)
Tsao’s study hinted that part of this subjective effect might arise because the LEC, which receives neural input from areas involved in processing sensory information, changes its activity to a greater degree during more complex experiences than during ones that require little processing. It implies, Tsao speculates, that time in memory might be entirely “drawn from the content of your experiences, as opposed to being coded as an explicit thing.”

Although neural recordings are challenging to carry out in humans, functional MRI (fMRI) data from other research groups has helped flesh out the link between the rate of activity changes in the cortex and the representation of time in memory. Kareem Zaghloul, a neurosurgeon and neuroscientist at the National Institute of Neurological Disorders and Stroke and a “big fan of Marc Howard’s work and his model,” had been running an experiment on the effects of brain stimulation on human memory around the time Tsao’s paper came out. As part of their project, Zaghloul and his colleagues decided to use their dataset to look at how temporal context might influence memory formation. “We hypothesized that perhaps the extent to which these signals of time change, maybe that affects your ability to distinguish memories from one another,” Zaghloul says. Participants in his group’s study had been asked to learn pairs of words, such as “pencil” and “barn,” and then remember these pairs later while avoiding confusing them with other pairs they’d learned, such as “orange” and “horse.” Measuring activity using electroencephalography across broad regions of participants’ brains while they learned the word pairs, the researchers found that the faster a person’s neural activity changed during the learning task, the better they performed on memory recall later on.11 Electrical stimulation of participants’ brains during the learning task didn’t have a consistent effect on the rate of activity change, Zaghloul adds, “but when it made it faster, people tended to do better at remembering the word pairs, and when it made it slower . . . people tended to do worse,” he says. The findings, published last year, suggest “that this representation of time does play a role in your ability to lump or distinguish memories,” he says.

That the sense of time in episodic memory might be dependent on neural activity rather than on a traditional clock reinforces some researchers’ belief that the brain perceives time rather differently from how people imagine it to. Buonomano and New York University neuroscientist György Buzsáki have independently argued before and since Tsao’s work that neuroscientists should rely less on preconceived notions of time and instead think more about how time-related information might be used by the brain. “The sole function of memory is to allow animals to better prepare for the future,” says Buonomano. “Sometimes the field forgets that detail.”
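The logic of relating how fast "signals of time" change to later recall can be illustrated with a toy analysis. In the sketch below (Python; the data, the drift measure, and every parameter are synthetic inventions for illustration, not the study's actual EEG features or methods), each subject's neural state is a sequence of population vectors, the drift rate is the average distance between successive states, and drift is then correlated with simulated recall scores.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative, fully synthetic version of the analysis logic: quantify how
# quickly a population activity pattern drifts during learning, then ask
# whether faster drift goes with better recall across subjects.

def drift_rate(states):
    """Mean Euclidean distance between population vectors at successive
    timepoints; larger = temporal context changing faster."""
    return float(np.mean(np.linalg.norm(np.diff(states, axis=0), axis=1)))

n_subjects, n_timepoints, n_channels = 20, 50, 8
drifts, recall = [], []
for i in range(n_subjects):
    speed = 0.1 + 0.9 * i / (n_subjects - 1)        # how fast context drifts
    steps = rng.normal(scale=speed, size=(n_timepoints, n_channels))
    states = np.cumsum(steps, axis=0)               # random-walk "activity"
    drifts.append(drift_rate(states))
    # Synthetic behavior: recall improves with drift speed, plus noise.
    recall.append(0.5 + 0.4 * (speed - 0.5) + rng.normal(scale=0.05))

r = np.corrcoef(drifts, recall)[0, 1]
print(f"drift-recall correlation across subjects: r = {r:.2f}")
```

Because the synthetic behavior is built to improve with drift, the correlation comes out strongly positive; the point of the sketch is only to show how a single drift number per subject can be extracted and compared against performance.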

Thinking ahead

Tsao is still studying the brains of rats as a postdoc at Stanford University, although his focus has shifted to other topics in neuroscience. But for other researchers studying the representation of time in episodic memory, the work has only just begun. May-Britt Moser says her group is continuing the line of research Tsao started, exploring how the hippocampus in rats integrates temporal and spatial information from the LEC and MEC during memory formation.12 The idea’s been around for a while. Several years ago, Eichenbaum and colleagues reported that rats’ time cells seem sensitive to spatial as well as temporal information. More-recent research has complicated the story further, identifying time cells outside the hippocampus, and finding that some place cells seem to respond to time-related signals from the LEC, leading some neuroscientists to propose that the hippocampus possesses different time-tracking systems for different timescales. To Howard, one of several theoreticians who have modeled how the brain might combine signals encoding the when and where of episodic memories, the blurred boundary between space and time is intuitive. Having originally trained in physics, he says, “I was pretty sure that the brain’s representation of space and the brain’s representation of time ought to obey the same equations,” whatever the scale. He and many other neuroscientists are now working under the assumption that the brain uses a unified representation of space and time in remembered experiences. And at least for some aspects of memory, Howard says, “I think that’s the story that’s starting to unfold now.” While Tsao’s work focused on how time is encoded during memory formation, some groups are working on the other side of the coin: what happens during the process of memory retrieval.
Researchers at the University of California, Irvine, recently reported that people who showed higher LEC activity during a memory retrieval task were better at recalling when specific events in a sequence happened, supporting a role for the LEC in a sense of time during memory retrieval as well as formation.13 Zaghloul, Howard, and others, meanwhile, have independently published work showing that when people successfully recall memories, they seem to reinstate the activity patterns in the medial temporal lobe—a region that includes the hippocampus and the entorhinal cortex—that were present when that memory was formed. It’s an effect, notes Zaghloul, that’s thought to allow a sort of “jump back in time” on recalling a memory.14,15 Such an ability to reinstate past activity patterns could have applications to the brain’s representation of events that haven’t yet happened, too, Howard says. “It occurred to us quite a while ago that if the brain has equations of the past, you could construct an estimate of the future with the same types of properties.” Empirical data to test the idea are lacking for now. One of the first things to do will be to figure out how the brain could skip back or forward to different activity states, because “we don’t currently have algorithms that can do that,” Howard notes. “We’re actively working on . . . figuring out a set of equations to describe the how of jumping back in time. Actually, I’m looking at my chalkboard right now, and I’m pretty optimistic.”

References
1. T. Hafting et al., “Microstructure of a spatial map in the entorhinal cortex,” Nature, 436:801–806, 2005.
2. J. O’Keefe, J. Dostrovsky, “The hippocampus as a spatial map. Preliminary evidence from unit activity in the freely-moving rat,” Brain Res, 34:171–75, 1971.
3. C.J. MacDonald et al., “Hippocampal ‘time cells’ bridge the gap in memory for discontiguous events,” Neuron, 71:737–49, 2011.
4. S. Thavabalasingam et al., “Evidence for the incorporation of temporal duration information in human hippocampal long-term memory sequence representations,” PNAS, 116:6407–14, 2019.
5. G. Umbach et al., “Time cells in the hippocampus and entorhinal cortex support episodic memory,” bioRxiv, doi:10.1101/2020.02.03.932749, 2020.
6. K.H. Shankar, M.W. Howard, “Timing using temporal context,” Brain Res, 1365:3–17, 2010.
7. A. Tsao et al., “Integrating time from experience in the lateral entorhinal cortex,” Nature, 561:57–62, 2018.
8. E.T. Rolls, P. Mills, “The generation of time in the hippocampal memory system,” Cell Rep, 28:1649–58.e6, 2019.
9. I.M. Bright et al., “A temporal record of the past with a spectrum of time constants in the monkey entorhinal cortex,” bioRxiv, doi:10.1101/688341, 2019.
10. J.L.S. Bellmund et al., “Mapping sequence structure in the human lateral entorhinal cortex,” eLife, 8:e45333, 2019.
11. M.M. El-Kalliny et al., “Changing temporal context in human temporal lobe promotes memory of distinct episodes,” Nat Commun, 10:203, 2019.
12. J. Sugar, M.-B. Moser, “Neuronal codes for what, where, and when,” Hippocampus, 29:1190–205, 2019.
13. M.E. Montchal et al., “Precise temporal memories are supported by the lateral entorhinal cortex in humans,” Nat Neurosci, 22:284–88, 2019.
14. R.B. Yaffe et al., “Reinstatement of distributed cortical oscillations occurs with precise spatiotemporal dynamics during successful memory retrieval,” PNAS, 111:18727–32, 2014.
15. S. Folkerts et al., “Human episodic memory retrieval is accompanied by a neural contiguity effect,” J Neurosci, 38:4200–11, 2018.


Putting New Neurons in Their Place

Adult neurogenesis, already known to play a role in learning and memory, also figures into mental health and possibly even attention, new research suggests.




In the spring of 2019, neuroscientist Heather Cameron set up a simple experiment. She and her colleagues put an adult rat in the middle of a plastic box with a water bottle at one end. They waited until the rat started drinking and then made a startling noise to see how the animal would respond. The team did this repeatedly with regular rats and with animals that were genetically altered so that they couldn’t make new neurons in their hippocampuses, a brain region involved in learning and memory. When the animals heard the noise, those that could make new hippocampal neurons immediately stopped slurping water and looked around, but the animals lacking hippocampal neurogenesis kept drinking. When the team ran the experiment without the water bottle, both sets of rats looked around right away to figure out where the sound was coming from. Rats that couldn’t make new neurons seemed to have trouble shifting their attention from one task to another, the researchers concluded.1

“It’s a very surprising result,” says Cameron, who works at the National Institute of Mental Health (NIMH) in Bethesda, Maryland. Researchers studying neurogenesis in the adult hippocampus typically conduct experiments in which animals have had extensive training in a task, such as in a water maze, or have experienced repetitive foot shocks, she explains. In her experiments, the rats were just drinking water. “It seemed like there would be no reason that the hippocampus should have any role,” she says. Yet in animals engineered to lack hippocampal neurogenesis, “the effects are pretty big.”

The study joins a growing body of work that challenges the decades-old notion that the primary role of new neurons within the adult hippocampus is in learning and memory. More recently, experiments have tied neurogenesis to forgetting, one possible way to ensure the brain doesn’t become overloaded with information it doesn’t need, and to anxiety, depression, stress, and, as Cameron’s work suggests, attention.
Now, neuroscientists are rethinking the role that new neurons, and the hippocampus as a whole, play in the brain.

The memory link

The first hint that adult animal brains may make new neurons appeared in the early 1960s, when MIT neurobiologist Joseph Altman used radioactive labeling to track the proliferation of nerve cells in adult rats’ brains.2 Other data published in the 1970s and 1980s supported the conclusion, and in the 1990s, Fred “Rusty” Gage and his colleagues at the Salk Institute in La Jolla, California, used an artificial nucleotide called bromodeoxyuridine (BrdU) to tag new neurons born in the brains of adult rats and humans.3 Around the same time, Elizabeth Gould of Princeton University and her collaborators showed that adult marmoset monkeys made new neurons in their hippocampuses, specifically in an area called the dentate gyrus.4 While some researchers questioned the strength of the evidence supporting the existence of adult neurogenesis, most of the field began to shift from studying whether adult animal brains make new neurons to what role those cells might play.

In 2011, René Hen at Columbia University and colleagues created a line of transgenic mice in which neurons generated by neurogenesis survived longer than in wildtype mice. This boosted the overall numbers of new neurons in the animals’ brains. The team then tested the modified mice’s cognitive abilities. Boosting numbers of newly born neurons didn’t improve the mice’s performances in water mazes or avoidance tasks compared with control mice. But it did seem to help them distinguish between two events that were extremely similar. Mice with more new neurons didn’t freeze as long as normal mice when put into a box that was similar to but not exactly the same as one in which they’d experienced a foot shock in earlier training runs.5

Neurogenesis appears to play a role in both remembering and forgetting. These results dovetailed with others coming out at the time, particularly those showing that aging humans, in whom neurogenesis is thought to decline, often have trouble remembering details that distinguish similar experiences, what researchers call pattern separation.6,7 “The line of thinking is that the memories that are most likely to be impacted by neurogenesis are memories that are really similar to each other,” says Sarah Parylak, a staff scientist in Gage’s lab at the Salk Institute. As insights into pattern separation emerged, scientists were beginning to track the integration of new rodent neurons into existing neural networks. This research showed that new neurons born in the dentate gyrus had to compete with mature neurons for connections to neurons in the entorhinal cortex (EC), a region of the brain with widespread neural networks that play roles in memory, navigation, and the perception of time.8 (See “Memories of Time” on page 32.) Based on detailed anatomical images, new dentate gyrus neurons in rodents appeared to tap into preexisting synapses between dentate gyrus neurons and EC neurons before creating their own links to EC neurons. To continue exploring the relationship between old and new neurons, a group led by the Harvard Stem Cell Institute’s Amar Sahay, who had worked with Hen on the team’s 2011 study, wiped out synapses in the dentate gyruses of mice. The researchers overexpressed the cell death–inducing protein Krüppel-like factor 9 in young adult, middle-aged, and old mice to destroy neuronal dendritic spines, tiny protrusions that link up to protrusions of other neurons, in the brain region. 
Those lost connections led to increased integration of newly made neurons, especially in the two older groups, which outperformed age-matched, untreated mice in pattern-separation tasks.9 Adult-born dentate gyrus neurons decrease the likelihood of reactivation of those old neurons, Sahay and colleagues concluded, preventing the memories from being confused.

Parylak compares this situation to going to the same restaurant after it has changed ownership. In her neighborhood in San Diego, there’s one location where she’s dined a few times when the restaurant was serving different cuisine. It’s the same location, and the building retains many of the same features, “so the experiences would be easy to mix up,” she says, but she can tell them apart, possibly because of neurogenesis’s role in pattern separation. This might even hold true for going to the same restaurant on different occasions, even if it served the same food. That’s still speculative at this point. Researchers haven’t been able to watch neurogenesis in action in a living human brain, and it’s not at all clear if the same thing is going on there as in the mouse brains they have observed. While many scientists now agree that neurogenesis does occur in adult human brains, there is little consensus about what it actually does. In addition to the work supporting a role for new neurons in pattern separation, researchers have accumulated evidence that it may be more important for forgetting than it is for remembering.

[Sidebar: selected findings on adult hippocampal neurogenesis and conditions they may be relevant to.]

• Most of the research into neurogenesis involves boosting or inhibiting animals’ generation of new neurons, then training animals on a complex memory task such as finding a treat in a maze, and later retesting the animals. Decreasing neurogenesis tends to hamper the animals’ ability to remember.1 Related conditions: Alzheimer’s disease, Parkinson’s disease.

• Training mice or rats on a memory task before manipulating neurogenesis has also been found to affect the strength of the trained memory. Boosting neurogenesis reduced the memory’s strength, perhaps an extreme form of forgetting that at normal levels avoids the remembering of unnecessary details.2 Related conditions: Alzheimer’s disease and other forms of dementia.

• Research has linked decreased neurogenesis with more anxious and depressive behaviors in mice. Stress can reduce neurogenesis, ultimately leading mice to be more anxious in future stressful situations.3 Related conditions: PTSD, anxiety, depression.

• Research has linked decreased neurogenesis with trouble switching focus.4

Sidebar references: 1. Neuron, 91:1356–73, 2016; 2. J Neurosci, 38:3190–98, 2018; 3. Hippocampus, 29:848–61, 2019; 4. Behav Brain Res, 376:112152, 2019.

HOW ADULT-BORN NEURONS INTEGRATE INTO THE BRAIN

In recent years, images and videos taken with state-of-the-art microscopy techniques have shown that new neurons in the dentate gyrus of the hippocampus go through a series of changes as they link up to existing networks in the brain.

1. A neural stem cell divides to generate a new neuron.

2. As the new neuron grows, it rotates from a horizontal to a vertical position and connects to an interneuron in a space called the hilus that sits within the curve of the dentate gyrus. The young neuron also starts making connections with well-established dentate gyrus neurons as well as neurons in the hippocampus.

3. Once connections are formed, mature neurons send excitatory signals into the new neuron, and the cell starts firing off more of its own signals. At around four weeks of age, the adult-born neuron gets hyperexcited, sending electrical signals much more often than its well-established neuronal neighbors do.

4. As the new neuron connects with still more neurons, interneurons in the hilus start to send it inhibitory signals to tamp down its activity.

The importance of forgetting

It seems counterintuitive for neurogenesis to play a role in both remembering and forgetting, but work by Paul Frankland of the Hospital for Sick Children Research Institute in Toronto suggests it is possible. In 2014, his team showed that when mice made more new neurons than normal, they were more forgetful.10 He and his colleagues had mice run on wheels to boost levels of neurogenesis, then trained the animals on a learning task. As expected, they did better than control mice that hadn’t exercised. (See “How Exercise Reprograms the Brain,” The Scientist, October 2018.) In other animals, the researchers boosted neurogenesis after the mice learned information thought to be stored, at least in the short term, in the hippocampus. “When we did that, what we found was quite surprising,” Frankland says. “We found a big reduction in memory strength.”

His team was puzzled by the result. Adding to the confusion, the researchers had observed a larger effect in memory impairment with mice that learned, then exercised, than they had seen in memory improvement when the mice ran first and then learned. As he dug into the literature, Frankland realized the effect was what other neuroscientists had called forgetting. He found many theoretical papers based on computational modeling that argued that as new neurons integrate into a circuit, the patterns of connections in the circuit change, and if information is stored in those patterns of connections, that information may be lost. (See “Memory Munchers” on page 21.)

The notion surprised other neuroscientists, mainly because up to that point they’d had two assumptions related to neurogenesis and forgetting. The first was that generating new neurons in a normal animal should be good for memory. The second was that forgetting was bad. The first assumption is still true, Frankland says, but the second is not. “Many people think of forgetting as some sort of failure in our memory systems,” he explains. Yet in healthy brains there’s tons of forgetting happening all of the time. “And, in fact, it’s important for memory function,” Frankland says. “It would actually be disadvantageous to remember everything we do.”

Parylak says this idea of forgetting “certainly has provoked a lot of discussion.” It’s unclear, for example, whether the mice in Frankland’s experiments are forgetting, or if they are identifying a repeat event as something novel. This is the point, she explains, where doing neurogenesis research in humans would be beneficial.
"You could ask a person if they'd actually forgotten or if they are making some kind of extreme discrimination."

Despite the questions regarding the results, Frankland and his colleagues continued their work, testing mice's forgetfulness with all types of memories, and more recently they asked whether the forgetting effect jeopardized old and new memories alike. In experiments, his team gave mice a foot shock, then boosted hippocampal neurogenesis (with exercise or a genetic tweak to neural progenitor cells), and put the mice in the same container they'd been shocked in. With another group of mice, the researchers waited nearly a month after the foot shock before boosting neurogenesis and putting the mice back in the container. Boosting the number of new neurons, the team found, weakened the newly made memory but not one that had been around for a while.11 "This makes a lot of sense," Frankland says. "As our memories of everyday events gradually get consolidated, they become less and less dependent on the hippocampus," and more dependent on another brain region: the cortex. This suggests that remote memories are less sensitive to changes in hippocampal neurogenesis levels.

The hippocampus tracks what’s happened to you, Frankland says. “Much of that’s forgotten because much of it is inconsequential. But every now and then something interesting seems to happen,” and it’s these eventful memories that seem to get “backed up” in other areas of the brain.

Beyond memory

At NIMH, one of Cameron's first studies looking at the effects of neurogenesis tested the relationship between new neuronal growth and stress. She uncovered the connection by studying mice that couldn't make new neurons and recording how they behaved in an open environment with food at the center. Just like mice that could still make new neurons, the neurogenesis-deficient mice were hesitant to go get the food in the open space, but eventually they did. However, when the animals that couldn't make new neurons were stressed before being put into the open space, they were extremely cautious and anxious, whereas normal mice didn't behave any differently when stressed.

Experiments have tied neurogenesis to forgetting, anxiety, depression, stress, and attention.

Cameron realized that the generation of new neurons also plays a role in the brain separate from the learning and memory functions for which there was growing evidence. In her experiments, "we were looking for memory effects and looked for quite a while without finding anything and then stumbled onto this stress effect," she says.

The cells in the hippocampus are densely packed with receptors for stress hormones. One class of hormones in particular, glucocorticoids, is thought to inhibit neurogenesis, and decreased neurogenesis has been associated with depression and anxiety behaviors in rodents. But there was no direct evidence linking the experience of stress to the development of these behaviors. So Cameron and her colleagues set up an experiment to test the connection. When the team blocked neurogenesis in adult mice and then restrained the animals to moderately stress them, the animals' elevated glucocorticoid levels were slow to recover compared with those of mice that had normal neurogenesis. The stressed mice that could not generate new neurons also acted oddly in behavioral tests: they avoided food when put in a new environment, became immobile and increasingly distressed when forced to swim, and drank less sugary water than normal mice when it was offered to them, suggesting they don't work as hard as normal mice to experience pleasure.12 Impaired adult neurogenesis, the experiments showed, played a direct role in developing symptoms of depression, Cameron says.

HOW ADULT-BORN NEURONS FUNCTION IN A CIRCUIT

Researchers think neurogenesis helps the brain distinguish between two very similar objects or events, a phenomenon called pattern separation. According to one hypothesis, new neurons' excitability in response to novel objects diminishes the response of established neurons in the dentate gyrus to incoming stimuli, helping to create a separate circuit for the new, but similar, memory.

The notion that neurogenesis and stress might be tied directly to our mental states led Cameron to look back into the literature, where she found many suggestions that the hippocampus plays a role in emotion, in addition to learning and memory. Even Altman, who had unexpectedly identified neurogenesis in adult rodents in the 1960s, suggested as much with colleagues in the 1970s. Yet the argument has appeared only sporadically in the literature since then. "Stress is complicated," Cameron says; it's hard to know exactly how stressful experiences affect neurogenesis or how the generation of new neurons will influence an animal's response to stress. Some types of stress can decrease neurogenesis while others, such as certain forms of intermittent stress, can increase new neuronal growth. Last year, Cameron and colleagues found that generating new neurons helps rats used to model post-traumatic stress disorder recover from acute and prolonged periods of stress.13

Her work has also linked neurogenesis to other characteristics of rodent behavior, including attention and sociability. In 2016, with Gould at Princeton and a few other collaborators, she published work suggesting that new neurons are indeed tied to social behavior. The team created a hierarchy among rats, and then deconstructed those social ranks by removing the dominant male. When the researchers sacrificed the animals and counted new neurons in their brains, the rats from deconstructed hierarchies had fewer new neurons than those from control cages with stable ranks. Rats with uncertain hierarchies and fewer new neurons didn’t show any signs of anxiety or reduced cognition, but they weren’t as inclined as control animals to spend time with new rats put into their quarters, preferring to stick with the animals they knew. When given a drug—oxytocin—to boost neurogenesis, they once again began exploring and spending time with new rats that entered their cages.14



Aging humans, in whom neurogenesis is thought to decline, often have trouble remembering details that distinguish similar experiences.

It's becoming clear, Cameron continues, that neurogenesis has many functions in the adult brain, some that are very distinct from learning and memory. In tasks requiring attention, though, there is a tie to memory, she notes. "If you're not paying attention to things, you will not remember them."


The study from Cameron's lab on rats' ability to shift their attention grew out of the researchers' work on stress, in which they observed that rodents sometimes couldn't switch from one task to the next. Turning again to the literature, Cameron found a study from 1969 that seemed to suggest that neurogenesis might affect this task-switching behavior. Her team set up water bottle experiments to see how well rats shifted attention. Inhibiting neurogenesis in the adult rats led to a 50 percent decrease in their ability to switch their focus from drinking to searching for the source of a sound.

"This paper is very interesting," says J. Tiago Gonçalves, a neuroscientist at Albert Einstein College of Medicine in New York who studies neurogenesis but was not involved in the study. It could explain the findings seen in some behavioral tasks and the inconsistencies between findings from different behavioral tasks, he writes in an email to The Scientist. Of course, follow-up work is needed, he adds.

Cameron argues that shifting attention may be yet another behavior in which the hippocampus plays an essential role but that researchers have been overlooking. And there may be an unexplored link between making new neurons and autism or other attention disorders, she says. Children with autism often have trouble shifting their attention from one image to the next in behavioral tests unless the original image is removed.

1. C.S.S. Weeden et al., "A role for hippocampal adult neurogenesis in shifting attention toward novel stimuli," Behav Brain Res, 376:112152, 2019.
2. J. Altman, "Are new neurons formed in the brains of adult mammals?" Science, 135:1127–28, 1962.
3. P.S. Eriksson et al., "Neurogenesis in the adult human hippocampus," Nat Med, 4:1313–17, 1998.
4. E. Gould et al., "Proliferation of granule cell precursors in the dentate gyrus of adult monkeys is diminished by stress," PNAS, 95:3168–71, 1998.
5. A. Sahay et al., "Increasing adult hippocampal neurogenesis is sufficient to improve pattern separation," Nature, 472:466–70, 2011.
6. C.K. Toner et al., "Visual object pattern separation deficits in nondemented older adults," Learn Mem, 16:338–42, 2009.
7. M.A. Yassa et al., "Pattern separation deficits associated with increased hippocampal CA3 and dentate gyrus activity in nondemented older adults," Hippocampus, 21:968–79, 2011.
8. N. Toni et al., "Synapse formation on neurons born in the adult hippocampus," Nat Neurosci, 10:727–34, 2007.
9. K.M. McAvoy et al., "Modulating neuronal competition dynamics in the dentate gyrus to rejuvenate aging memory circuits," Neuron, 91:1356–73, 2016.
10. K.G. Akers et al., "Hippocampal neurogenesis regulates forgetting during adulthood and infancy," Science, 344:598–602, 2014.
11. A. Gao et al., "Elevation of hippocampal neurogenesis induces a temporally graded pattern of forgetting of contextual fear memories," J Neurosci, 38:3190–98, 2018.
12. J.S. Snyder et al., "Adult hippocampal neurogenesis buffers stress responses and depressive behavior," Nature, 476:458–61, 2011.
13. T.J. Schoenfeld et al., "New neurons restore structural and behavioral abnormalities in a rat model of PTSD," Hippocampus, 29:848–61, 2019.
14. M. Opendak et al., "Lasting adaptations in social behavior produced by social disruption and inhibition of adult neurogenesis," J Neurosci, 36:7027–38, 2016.


DO NEW NEURONS APPEAR ANYWHERE ELSE IN THE BRAIN?

Many, though not all, neuroscientists agree that there's ongoing neurogenesis in the hippocampus of most mammals, including humans. In rodents and many other animals, neurogenesis has also been observed in the olfactory bulbs. Whether newly generated neurons show up anywhere else in the brain is more controversial.

There had been hints of new neurons showing up in the striatum of primates in the early 2000s. In 2005, Heather Cameron of the National Institute of Mental Health and colleagues corroborated those findings, showing evidence of newly made neurons in the rat neocortex, a region of the brain involved in spatial reasoning, language, movement, and cognition, and in the striatum, a region involved in planning movements and reacting to rewards, as well as self-control and flexible thinking (J Cell Biol, 168:415–27, 2005). Nearly a decade later, using nuclear-bomb-test-derived carbon-14 isotopes to determine when nerve cells were born, Jonas Frisén of the Karolinska Institute in Stockholm and colleagues examined postmortem adult human brains and confirmed that new neurons existed in the striatum (Cell, 156:1072–83, 2014).

"Those results are great," Cameron says. They support her idea that there are different types of neurons being born in the brain throughout life. "The problem is they're very small cells, they're very scattered, and there're very few of them. So they're very tough to see and very tough to study."



One and Done

THE PAPER

M.E. Villar et al., “Redefining single-trial memories in the honeybee,” Cell Rep, 30:2603–13.e3, 2020.





QUICK LEARNERS: Forager honeybees are exposed once to an odor while simultaneously receiving sucrose via a cocktail stick. The insects extend their proboscises to drink the sugary treat (1). At 1 hour, 4 hours, 24 hours, or 72 hours after this experience, the bees are exposed to the same odor or to a control odor. At times up to 24 hours, most bees correctly extend their proboscises in response to the paired odor (2) and not the control one (3). Even at 72 hours, approximately one-third of the trained bees do the same.

With their tiny brains and renowned ability to memorize nectar locations, honeybees are a favorite model organism for studying learning and memory. Such research has indicated that to form long-term memories, ones that last a day or more, the insects need to repeat a training experience at least three times. By contrast, short- and mid-term memories, which last seconds to minutes and minutes to hours, respectively, need only a single learning experience.

Exceptions to this rule have been observed, however. For example, in some studies, bees formed long-lasting memories after a single learning event. Such results are often regarded as circumstantial anomalies, and the memories formed are not thought to require protein synthesis, a molecular feature of long-term memories encoded by repeated training, says Martin Giurfa of the University of Toulouse. But the anomalous findings, together with research showing that fruit flies and ants can form long-term memories after single experiences, piqued Giurfa's curiosity. Was it possible that honeybees could reliably do the same, and if so, what molecular mechanisms were required?

Giurfa reasoned that the ability to form robust memories might depend on the particular type of bee and the experience. Within a honeybee colony, there are nurses, who clean the hive and feed the young; guards, who patrol and protect the hive; and foragers, who search for nectar. Whereas previous studies have tested bees en masse, Giurfa and his colleagues focused on foragers, tasking them with remembering an experience relevant to their role: an odor associated with a sugary reward.

The researchers observed that a single exposure to a reward-paired odor was enough for most forager bees to remember that specific odor the following day: they extended their proboscises when exposed to the odor but not when exposed to an unrelated scent. Many foragers could even remember the odor three days later. Giurfa's team went on to examine the molecular requirements of short-, mid-, and long-term memories in the brains of the bees by inhibiting gene transcription, protein synthesis, or both during the learning period. They showed that short-term memory (one hour after training) required neither, mid-term memory (four hours after training) required the ability to make new proteins but not new gene transcription, and long-term memory (over 24 hours after training) required both.

It is possible that nurse and guard bees differ in their learning capacities and molecular makeups, and that this explains the differences from prior studies, says Giurfa, but this is untested. The results do not mean that all prior research was wrong, says André Fiala of the University of Göttingen, who studies fruit fly memory and was not involved in the project. "People have done the experiments in a different way." Still, the new results do show that "the commonly held belief that one needs multiple training trials . . . to achieve long-term memory is not always true," he says, and this "really advances the field." —Ruth Williams


Lasting Memories

S. Inami et al., "Environmental light is required for maintenance of long-term memory in Drosophila," J Neurosci, 40:1427–39, 2020.

IN THE DARK: Male fruit flies kept in constant darkness forget that their attempted courtship was rejected.

As Earth rotates around its axis, the organisms that inhabit its surface are exposed to daily cycles of darkness and light. In animals, light has a powerful influence on sleep, hormone release, and metabolism. Work by Takaomi Sakai, a neuroscientist at Tokyo Metropolitan University, and his team suggests that light may also be crucial for forming and maintaining long-term memories.

The puzzle of how memories persist in the brain has long been of interest to Sakai. Researchers had previously demonstrated, in both rodents and flies, that the production of new proteins is necessary for maintaining long-term memories, but Sakai wondered how this process persisted over several days given cells' molecular turnover. Maybe, he thought, an environmental stimulus, such as light-dark cycles, periodically triggered protein production to enable memory formation and storage.

Sakai and his colleagues conducted a series of experiments to see how constant darkness would affect the ability of Drosophila melanogaster to form long-term memories. Male flies exposed to light after interacting with an unreceptive female showed reduced courtship behaviors toward new female mates several days later, indicating they had remembered the initial rejection. Flies kept in constant darkness, however, continued their attempts to copulate. The team then probed the molecular mechanisms of these behaviors and discovered a pathway by which light activates cAMP response element-binding protein (CREB), a transcription factor previously identified as important for forming long-term memories, within certain neurons found in the mushroom bodies, the memory center in fly brains.

"The fact that light is essential for long-term memory maintenance is fundamentally interesting," says Seth Tomchik, a neuroscientist at the Scripps Research Institute in Florida who wasn't involved in the study. However, he adds, "more work will be necessary" to fully characterize the molecular mechanisms underlying these effects. —Diana Kwon

Fear Extinction

X. Zhang et al., "Amygdala reward neurons form and store fear extinction memory," Neuron, 105:1077–93, 2020.

REVERSING FEAR: In the basolateral amygdala of a mouse brain, newly formed fear-extinction memory cells (orange) can override the animal's past memory of a foot shock.

Fear conditioning, which connects a neutral stimulus with a painful experience in an animal's brain, can be undone. Put a mouse in a cage where it experienced foot shocks the day before, and its initial response of freezing in place will eventually dissipate once the shock stimulus ceases. While scientists have known about such fear extinction for a long time, they haven't understood how it happens in the brain. One hypothesis, says Susumu Tonegawa of MIT, is that a new memory takes the place of the fearful one: the original memory remains intact, but it's inhibited by the new one. Under certain circumstances, the conditioned fear response can come back, Tonegawa explains. "This suggests the fear is still there, but it is dormant."

To get to the cellular bottom of this phenomenon, Tonegawa's team focused on neurons in the murine basolateral amygdala (BLA), a brain area important for fear conditioning. The researchers identified certain neurons that were active during fear extinction and appeared to encode a new memory that suppressed the old fear memory. Using optogenetics, the scientists selectively turned on these Ppp1r1b+ neurons, and the mice quickly extinguished the fear memory in the cage where they had previously received foot shocks. When those neurons were turned off, the animals froze more and exhibited more fear. The team also found that Ppp1r1b+ neurons suppressed other BLA neurons, called Rspo2+, that are responsible for the fear memory. Tonegawa says the findings support the idea that Ppp1r1b+ neurons are linked with pleasurable emotions, while Rspo2+ neurons encode negative ones.

"This study provides a nice extension on our current models for how the amygdala serves to process positive and negative [emotions]," Kay Tye, a neuroscientist at the Salk Institute for Biological Studies who was not involved in the research, writes in an email to The Scientist. —Kerry Grens


Unravelling Memory's Mysteries

Studying nonhuman primates, University of Washington neuroscientist Elizabeth Buffalo has identified important features of the neural underpinnings of learning and memory.

BY DIANA KWON

FROM THE BASEMENT TO THE LAB BENCH

Buffalo conducted her first scientific experiment as a teenager in Little Rock, Arkansas. Encouraged by her high school science teacher, she embarked on an ambitious science fair project, which involved investigating the behavioral effects of a chemical called para-chlorophenylalanine (PCPA) on rats. Inspired by a TV news segment noting that extreme risk takers had increased aggressiveness associated with higher-than-average levels of PCPA, Buffalo decided to put the behavioral effects of the chemical to the test.

To carry out the experiment, Buffalo set up a makeshift lab in the basement of her house. Her science teacher helped her gather the necessary materials and expertise by connecting her with a professor at the nearby University of Arkansas Medical School. That professor provided guidance for her project and got her in touch with the Arkansas-based National Center for Toxicological Research, which donated rats for her experiment. Buffalo housed the rodents in cages with water bottles provided by the university. She injected the animals with different concentrations of PCPA, then examined changes in aggression levels by administering a small electric foot shock and documenting how much of a small wooden rod the animals would chew away in response. As she'd hypothesized, higher doses of PCPA made the rodents more aggressive. "It ended up being a pretty involved project," Buffalo recalls. "At one point, we had 20 rats in the basement."


The experience solidified Buffalo's interest in science and jumpstarted a series of summer jobs doing research. Most of those were spent in a behavioral toxicology lab led by Merle Paule at the National Center for Toxicological Research. Buffalo spent several summers during high school and college in Paule's lab, working on experiments involving monkeys, such as assessing the behavioral effects of caffeine and other drugs on the animals. She coauthored a handful of papers about their research.

Although Buffalo maintained an interest in science, she majored in philosophy at Wellesley College, where she started her undergraduate studies in 1988. She traces this decision back to a book, Scientific Realism and the Plasticity of Mind, by Canadian philosopher Paul Churchland, which she'd read while doing research for her high school science fair project. "I probably didn't understand half of it, but it just was super interesting," Buffalo says.

Although science wasn't her major, the subject remained a dominant force in Buffalo's life. Intrigued by the brain, she concentrated her studies on the philosophy of mind, choosing psychobiology, which dealt with the biological basis of psychological processes, as her minor. Buffalo's first neuroscience professor, the late Howard Eichenbaum, sparked her interest in learning and memory. "He would get super excited and enthusiastic about what he was talking about," Buffalo says. "I had these moments through almost every class where I thought, gosh, the most exciting thing I could ever try to figure out is exactly what happens in the brain when we learn something."

As rodents scuttle through a maze, scientists can observe the activity of their brains' "inner GPS," neurons that manage spatial orientation and navigation. This positioning system was revealed through two different discoveries, decades apart. In 1971, neuroscientist John O'Keefe found place cells, neurons that are consistently activated when rats are in a specific location, while observing the animals as they ran around an enclosure. More than thirty years later, neuroscientists May-Britt and Edvard Moser used a similar method to identify grid cells, neurons that fire at regular intervals as animals move, enabling them to keep track of navigational cues.

It was the early 2010s when neuroscientist Elizabeth Buffalo and her team at Emory University's Yerkes National Primate Research Center in Atlanta started investigating what the brain's GPS looks like in primates. While conducting memory tests by tracking the eye movements of primates viewing either familiar or unfamiliar images, the researchers began to wonder: Was this system also active in stationary animals? "They were moving their eyes as they were forming a memory of these pictures," Buffalo says. "So we thought that maybe this eye movement exploration was something that primates do in an analogous way to how rodents explore as they move around a physical environment."

One of Buffalo's graduate students, Nathaniel Killian, put this hypothesis to the test. Working with monkeys, he placed electrodes into the entorhinal cortex, the brain region where grid cells are found in rodents, and recorded brain activity while the animals viewed images on a screen. One day, Killian came into a lab meeting with an announcement: he had found grid cells in the primate brain (Nature, 491:761–64, 2012). Although it took many more months to complete additional experiments to validate the results, Buffalo remembers thinking during that meeting, "Wow, we're seeing something really new."


CAREER TITLES AND AWARDS
• Professor, Physiology and Biophysics, University of Washington; Chief, Division of Neuroscience, Washington National Primate Research Center (2015–present)
• Associate Professor of Neurology, Emory University School of Medicine (2012–2013)
• Assistant Professor of Neurology, Emory University School of Medicine (2005–2012)
• McKnight Endowment Fund for Neuroscience Memory and Cognitive Disorders Award, 2018
• National Academy of Sciences Troland Research Award, 2011

Greatest Hits
• Found that grid cells are active in the primate brain while a stationary animal visually explores a space
• Identified differences in the synchronization of electrical signals across the layers of cortex within the neural circuit involved in processing attention
• Revealed that visual memory and visual perception are processed in different regions of the brain

After completing her undergraduate degree in 1992, Buffalo went to graduate school with the goal of becoming a philosophy professor. She enrolled in a doctoral program at the University of California, San Diego, to work with philosopher Patricia Churchland, wife of the very Paul Churchland who had authored the book that first piqued Buffalo's interest in philosophy. "I was seeking out a philosopher who really cared about neuroscience," Buffalo says. "She focused on philosophy of mind with the idea that, in order to really understand the mind, what we need to understand is the brain."

Churchland led Buffalo on an interdisciplinary route, encouraging her to take classes with the neuroscience PhD students and to join a lab. Buffalo ended up in the joint lab of neuroscientists Larry Squire and Stuart Zola, two of Churchland's colleagues who studied memory. "It became clear that [Buffalo] really loved this idea of neuroscience, and that she was bringing to neuroscience a kind of philosophical framework," says Zola, who is now at the Yerkes National Primate Research Center. "She asked a lot of questions, making us think more about where we were headed and why, and how to interpret findings in ways that we might not have thought."

With Squire and Zola, Buffalo worked on identifying the borders between brain structures involved in memory, particularly episodic memories of everyday events and those associated with visual perception. Buffalo recalls wondering: "Is it just a continuum or is there really kind of this packaging, where you could say that one area really is involved in memory and the other area was involved in visual perception?" By examining the behavior of human patients and monkeys with brain lesions, Buffalo and her colleagues found evidence for packaging.
Through a series of investigations that involved going back to data from old experiments, they revealed that damage to the medial temporal lobe, a region that contains key memory-related structures such as the hippocampus, impaired memory, while injury to the adjacent anterior inferotemporal cortex led to deficits in visual perception but left memory intact (Behav Neurosci, 112:3–14, 1998).

"She realized, I think, before many people did, that if you were going to understand what the heck the hippocampus did and what it really had to do with episodic memories, you need to understand the cortical structures that feed into the hippocampus," Churchland says.

Buffalo originally planned to obtain two doctorates: one in philosophy and the other in neuroscience. After completing her first dissertation in neuroscience, however, she decided she was eager to continue doing research. So in 1998 she left her philosophy PhD behind and headed east to start a postdoc with Robert Desimone, a neuroscientist who was then at the National Institute of Mental Health. Buffalo was drawn to Desimone's lab because his group was investigating brain function at the level of individual cells rather than brain regions, which had been the focus of Zola and Squire's lab. "What I was interested in at the time was how the activity of neurons would contribute to behavior," Buffalo says.


In Desimone’s lab, Buffalo examined neuronal activity in primates by recording both individual neurons and local field potentials, electrical signals produced by groups of neurons. Using this method, Buffalo helped unveil important features of the neural circuitry of attention, which was the focus of Desimone’s team. Among other things, her work revealed that attention to a stimulus enhanced the synchronization of high-frequency signals in the superficial layers of a brain region called the visual cortex, while it reduced the synchronization of low-frequency signals in deeper layers of the same brain area (PNAS, 108:11262–67, 2011). The result suggests the different patterns of synchronization across brain layers play an important role in how the signals are processed and sent onward in the brain.

Eight years into her postdoc, as her projects were beginning to wind down, Buffalo started to think about her options. “I really wanted to stay in academia, but I wasn’t convinced that I was going to be able to find a job,” Buffalo says. “It was, as it is now, a really hard time—and there were very few good tenure track jobs for primate neurophysiologists.” But just as Buffalo was considering making the switch to science policy or science writing, she got a call from Zola, her former PhD advisor. He had just been appointed the director of the Yerkes National Primate Research Center at Emory University and wanted to recruit Buffalo for an open faculty position. “Sometimes you just get really lucky,” Buffalo says. “It was a great environment, so I was really excited to have the chance to set up my lab.”

Before getting that serendipitous call, Buffalo had applied at other universities. But there were personal reasons that went into her decision to accept Zola’s offer. “I won’t name names, but there were a couple of job interviews that I went on, where . . . [I got] the feeling that as soon as I said I had a female partner, the tone of the conversation really kind of changed,” she explains. “I decided early on that I’m not going to hide anything because I don’t want to move my family somewhere where it would be an issue. But I do think that has limited our choices.”

MEMORIES AND MOVES

As Buffalo was building her lab at Emory, her wife was also starting a new job as director of a nonprofit organization in Atlanta, and the couple had a four-month-old son. “Thinking back, it was a crazy time,” Buffalo recalls. Luckily, even before she had finished setting up her lab, Buffalo already had a graduate student on board. Michael Jutras had done research on learning and memory in rodents as an undergraduate at Brown University, and was passionate about continuing his work on this topic. “He and I really built the lab together,” Buffalo says.

Once her lab was ready, Buffalo immediately knew what experiments to pursue. She wanted to shift away from studying attention back to memory. During grad school, Buffalo had focused on examining the neural structures associated with memory. Now, she was equipped with the techniques she’d learned from Desimone’s lab to investigate brain waves and other physiological signatures of memory formation and retrieval. It was using these electrical recording tools that Buffalo and her team demonstrated the existence of grid cells in the primate brain. “She really is an exemplary mentor for graduate students and postdocs,” says Killian, who led the work on primate grid cells. “She created an environment where people were able to really conduct great science.” Buffalo was also thoughtful—each Christmas, she’d give every lab member a book “tailored to everyone’s unique interests,” Killian recalls.

As the years passed, Buffalo was recruited to join other universities, so she and her family began to consider where else they might want to live. They liked Atlanta, but after careful consideration, they decided Seattle might be an ideal choice: Buffalo’s wife had family on the West Coast, and there was a primate center at the University of Washington (UW).
When Buffalo reached out to the university to ask about open faculty positions, she was in luck: the chair of the physiology and biophysics department informed her that a search for a new faculty member would start in a couple of months and encouraged her to apply. She submitted an application and landed the job. Buffalo moved her lab to UW in 2013 to continue her work on memory. Now, she and colleagues are using virtual reality to more closely investigate how place cells and grid cells are behaving in the primate brain. A burning question for her is how the brain’s representation of space aligns with its recollection of the time spent there, Buffalo says. “Why is it that we have these spatial representations right in the structure that we know is important for memory?”


Daniel Colón-Ramos: C. elegans Psychologist
Professor of cellular neuroscience, Yale University, Age: 44
BY CLAUDIA LOPEZ-LLOREDA



As a Harvard undergraduate, Daniel Colón-Ramos explored the forests of Panama and Honduras, listening closely as indigenous people described how they use medicinal plants to treat ill individuals. The interactions, he says, left him with many more questions than answers. “The questions that kept coming to my mind were molecular questions about what the bioactive agents were and how they worked,” he says. Sitting there in the forest, he realized he wanted to contribute knowledge to science, instead of just learning facts.

After earning his bachelor’s degree in biology in 1998, he moved to Duke University, where he began a post-baccalaureate program that gave him his first experiences at the lab bench. “That was transformative in my ability to imagine myself as a scientist,” he says. He then applied to and was accepted as a PhD student at Duke, where he joined the lab of Sally Kornbluth, who studies cell suicide, a process called apoptosis. Colón-Ramos identified viral peptides that inhibited translation of RNA into host cell proteins that would otherwise induce apoptosis, revealing a potential mechanism that viruses use to continue their cell-to-cell spread. Those experiments, together with attending talks and reading papers outside of his comfort zone, helped Colón-Ramos pinpoint what sparked his scientific curiosity: how cellular organization shapes the way an organism behaves.

Colón-Ramos wanted to explore this connection in an animal model that could easily be modified genetically, which led him to the worm Caenorhabditis elegans. As a postdoctoral researcher, he also shifted his research focus from cell death to developmental neuroscience, joining Kang Shen’s lab at Stanford University. There, Colón-Ramos showed that non-neuronal cells called glia guide synapse formation and coordinate neural connectivity (Science, 318:103–106, 2007). Less than a year after publishing the discovery, Colón-Ramos opened his lab at Yale University to further explore how synapses form, persist, and govern behavior. He’s using C. elegans to dissect cell biology, “but also he has stayed true to himself and expanded his interest in behavior,” Shen says. In 2018, Colón-Ramos’s postdoc Josh Hawk reported that a single cell in the worm’s nervous system serves as a logic system controlling how the animal senses temperature and responds to it based on a memory it made before (Neuron, 97:356–367.e4, 2018). “We’re showing that these neurons can be molecular computers,” Colón-Ramos says. “They’re capable of pretty sophisticated integration.”

For Colón-Ramos, though, studying a single neuron is not enough. He wants to understand all of the neuronal connections of the brain, what’s called the connectome. In a multi-institutional effort, he and colleagues traced C. elegans connectomes to see which cells interact during development. “From that work emerged all sorts of circuits, some of which we knew, but [also] others which we had overlooked,” Colón-Ramos says. Delving into the developmental connectomics of C. elegans is not an easy feat, says Hari Shroff, a biophysicist at the National Institute of Biomedical Imaging and Bioengineering who is part of the connectome collaboration. But Colón-Ramos has this “willingness to be fearless and try new technology, while also being critical and honest about limitations,” Shroff says.

That honesty has paid off in Colón-Ramos’s research as well as in his advocacy for representation of minorities in the sciences. Born and raised in Puerto Rico, Colón-Ramos became acutely aware of the lack of diversity in science as an undergraduate. “Not only did I not see myself represented in the sciences, but a lot of people had not met people from my background,” he says. Colón-Ramos founded CienciaPR, a nonprofit organization that brings Hispanic and Latinx communities together to promote scientific research and education. “He actually does want to make science a diverse place,” Shroff says. Shen adds that Colón-Ramos’s incredible passion for outreach is part of what sets him apart, along with his ability to establish collaborations and his outstanding science. “He seems to have this sort of energy to get it all done,” Shen says. “[His] success already speaks for itself.”


Digital Detection of Dementia Humans generate terabytes of behavioral data while using their smart devices. Crunching those numbers could help identify the very start of cognitive decline. BY RACHAEL MOELLER GORMAN



KEEPING TABS: Massachusetts-based digital health company Linus Health has developed tools for smartphones and tablets that aim to measure cognitive function.

Three years ago, Eli Lilly, Apple, and California-based health and measurement company Evidation Health came together to ask a new kind of research question: Can we identify cognitive impairment by analyzing the many types of digital data people inadvertently generate in their everyday lives? For 12 weeks in 2018, more than 100 people with varying states of cognitive decline—or none at all—used an iPhone, an Apple Watch, an iPad Pro with a smart keyboard, and a Beddit sleep monitoring device. Each of these devices contains various sensors such as gyroscopes, pedometers, accelerometers, heart rate monitors, and sleep sensors. The iPad also administered language and motor control tests on a biweekly basis. Throughout the study period, participants talked, slept, worked, cleaned, and socialized as their digital biomarker data streams flowed to a cloud-based server viewed by researchers at the study headquarters at Evidation in San Mateo.

The project, which ultimately aims to improve diagnosis of cognitive decline and the diseases it often accompanies, addresses a pressing need. More than 5.8 million Americans live with dementia, costing the healthcare system and patients’ families about $305 billion per year. One in 10 Americans over the age of 65 now suffers from Alzheimer’s disease, deaths from which have increased 146 percent since 2000—it’s now the sixth leading cause of mortality in the US. There are no drug treatments on the market to improve the cognitive function of people who already have dementia.

But diagnosing dementia, especially early on, can be difficult. Typically, doctors assess patients in their offices with tests that only effectively diagnose people who have already noticeably started losing cognitive ability. These tests must be administered by a health professional, and provide just a snapshot of the patient’s experience. Thus, current diagnostic tools usually cannot identify people in the early stages of disease, nor determine whether someone will develop dementia years down the road. By the time patients receive a diagnosis, neurons have died and brain anatomy has changed. “The periods beforehand have been unknown territory,” says Arlene Astell, a dementia and technology researcher at the University of Reading in the UK and the University of Toronto. “We haven’t been able to collect those sorts of data in the past.” Carol Routledge, director of research at the nonprofit Alzheimer’s Research UK, agrees: “We know nothing, or very, very little, about early-stage disease.”

Early diagnosis could not only aid intervention for people at high risk of developing dementia, but provide better opportunities to design new therapies, says Routledge. “If we were able to diagnose early, then that might start telling us what goes wrong in the very early stages,” she says, “which in turn might help us get insight into more appropriate targets that we could eventually develop treatments for.” Pharmaceutical companies might be able to work with populations of at-risk individuals to re-examine failed drugs that didn’t work in people with later stages of the disease. “We likely might even already have some of those treatments, it’s just that we are not stratifying [the patient population] at the precision level that allows them to work as effectively as they could,” says Rhoda Au, a neuropsychologist at Boston University School of Medicine. That’s why researchers at Evidation and its collaborators, who published the findings of their smart-device study last year, are so interested in harnessing the power of wearable or mobile technology. By passively gathering data from people not yet showing obvious clinical symptoms of cognitive decline, these devices could be used to create a digital phenotype that helps clinicians diagnose dementia early, before neuronal death. Evidation isn’t the only company interested in harnessing these 21st-century communications technologies to try to address neurodegeneration. Pharma companies are incorporating digital technologies in their own research, trying to develop in-house solutions, says Au, and many venture firms and other funders are now backing such efforts. Last April, the Alzheimer’s Drug Discovery Foundation put out a call for proposals on digital biomarkers of the disease and related dementias, and the US National Institute on Aging is also funding research in this area. 
Earlier this year, Alzheimer’s Research UK launched EDoN (Early Detection of Neurodegenerative diseases), a global initiative that will develop “digital fingerprints” of conditions such as Alzheimer’s to “revolutionise the early detection of neurodegenerative diseases,” according to a press release.

With all the investment in digital biomarkers of early cognitive decline, says Au, “I think . . . collectively, we are going to start to have these solutions emerge.”

Early days

Fortunately for the research community, members of the public are keen to monitor their health, as evidenced by the more than 300,000 health-related apps and 340 wearable devices already available as of 2017, according to a report by health-focused data science company IQVIA. Many apps purport to detect cognitive decline using data on a user’s movement, cognition, and other factors that may begin to slide years before that person would fail a clinical test for dementia.


There’s science to support the idea that subtle changes can precede dementia. Studies have found, for example, that around 12 years before a clinical diagnosis of mild cognitive impairment, a person’s gait begins to slow dramatically. Other research has shown that, compared with healthy controls, patients suffering from mild cognitive impairment have a higher blink rate and lower heart rate variability. Circadian rhythm disruptions also seem to occur in the very early stages of cognitive decline. But by themselves, these small changes are unreliable markers of neurodegenerative disease. Few of the apps and devices on the market have been validated by rigorous research; none are FDA-approved.

The study conducted by Evidation and its collaborators aimed to provide real predictive ability by aggregating data from the sensors in multiple devices, as well as basic device usage metrics—how often phones were locked and unlocked, and numbers of calls and texts—to evaluate cognitive status. Researchers looked at gross motor function using accelerometers, pedometers, and gyroscopes; heart rate using the heart rate monitor in a smartwatch; circadian rhythms using Beddit sleep sensors; various behavioral, social, and cognitive characteristics measured by app usage, phone use behavior, and text message and phone call metadata; fine motor control using an iPad assessment app for typing and dragging tasks; and language skills using the iPad app. Through these devices, the researchers monitored 113 people between the ages of 60 and 75—31 people with cognitive impairment (as determined by standard criteria) and 82 without. Once generated, participants’ data arrived encrypted at Evidation’s Study Platform, where they were time-stamped, stored, and analyzed.

The team found some important differences between the groups of subjects. For example, participants with cognitive decline typed more slowly and had more pauses during typing, perhaps because of fine motor problems or language difficulties or both, the researchers reported last summer. Those with cognitive impairment also walked in a less regular pattern, and their first steps came later in the day. They sent fewer text messages, had a greater reliance on helper apps such as the Clock app, which tells the time and sets alarms, and were more likely to use Siri’s app suggestions. The researchers used machine learning on the dataset to develop a model to distinguish which people had cognitive impairment and which were healthy, based solely on the pattern of digital data received from the participants’ devices and their responses to iPad tasks. The resulting model was able to distinguish between healthy individuals in that dataset, those who had mild cognitive impairment, and those with mild Alzheimer’s disease, with an accuracy similar to that of computerized cognitive tests administered in clinical settings.
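The modeling step described above can be illustrated with a toy sketch. This is not the study’s actual pipeline: the feature names (typing speed, gait regularity, texts per day) and every value below are invented for illustration, and the classifier is a minimal logistic regression trained by plain gradient descent, just to show the overall shape of the approach: per-participant digital features in, an impairment probability out.

```python
import math

def predict(weights, bias, x):
    """Logistic model: probability that a participant is cognitively impaired."""
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=2000):
    """Plain stochastic gradient descent on the log-loss; fine for toy data."""
    n = len(samples[0])
    weights, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            err = predict(weights, bias, x) - y  # gradient of log-loss w.r.t. z
            bias -= lr * err
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
    return weights, bias

# Invented features: [keystrokes/min, gait regularity 0-1, texts/day]
healthy = [[220, 0.90, 14], [200, 0.85, 10], [240, 0.95, 18]]
impaired = [[140, 0.60, 4], [120, 0.55, 2], [150, 0.65, 5]]
X = healthy + impaired
y = [0, 0, 0, 1, 1, 1]  # 1 = cognitively impaired

# Normalize each feature column to [0, 1] so one scale doesn't dominate.
maxes = [max(col) for col in zip(*X)]
X = [[v / m for v, m in zip(row, maxes)] for row in X]

w, b = train(X, y)
preds = [1 if predict(w, b, row) > 0.5 else 0 for row in X]
```

On cleanly separable toy data like this, the model learns to label all six synthetic participants correctly; the real study’s accuracy came from far richer features and clinically validated labels, not from anything this simple.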
“[Eli] Lilly [working with Evidation] has done a lot of good work there,” says Graham Jones, director of innovation at Novartis Technical Research and Development, who has researched digital biomarkers and wearable devices for Alzheimer’s disease but wasn’t involved in the study.

BIO BUSINESS

Another group, venture capital–funded Linus Health, is working with Au and other neurocognitive experts to develop a brain health monitoring platform that is independent of any particular tech brand or company. The platform will analyze several aspects of a person’s behavior to glean insights into their brain, and merge them with medical health records, says David Bates, one of the company’s cofounders. Linus has already built its monitoring tool: a smartphone app that reminds users to do certain tasks and measures reaction time, voice and speech, gait, and other potential biomarkers. One task might be to assess changes in gait under a cognitive load: a user walks normally with a smartphone in her pocket; then, she is asked to count backwards by threes from 300 while still walking. Analysis of data generated by the device’s built-in sensors can assess heel strike, toe lift, balance, and walking speed, and differences seen during counting could reveal a declining mind.
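One concrete piece of the gait analysis Bates describes, estimating step count and cadence from a phone’s accelerometer, can be sketched as simple peak counting. The trace below is synthetic, and the threshold and debounce values are invented placeholders; a real pipeline would filter the raw signal, calibrate for device orientation, and validate against ground truth before drawing any conclusion.

```python
import math

def count_steps(accel, threshold=1.2, min_gap=20):
    """Count peaks in acceleration magnitude (in g) above `threshold`,
    ignoring peaks closer together than `min_gap` samples (debounce)."""
    steps, last = 0, -min_gap
    for i in range(1, len(accel) - 1):
        is_peak = accel[i] >= accel[i - 1] and accel[i] >= accel[i + 1]
        if is_peak and accel[i] > threshold and i - last >= min_gap:
            steps += 1
            last = i
    return steps

# Synthetic 10-second trace at 50 Hz: 1 g baseline plus half-sine "step"
# impulses at a walking rhythm of 1.8 steps per second.
rate, seconds, step_hz = 50, 10, 1.8
trace = [1.0 + 0.5 * max(0.0, math.sin(2 * math.pi * step_hz * t / rate))
         for t in range(rate * seconds)]

steps = count_steps(trace)
cadence = steps / seconds  # steps per second
```

Comparing the cadence measured while walking normally against the cadence measured while counting backwards by threes is one way to quantify the dual-task cost the article describes.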


Researchers at Linus Health are also developing an in-clinic assessment on a tablet. For example, in one task a person might be asked to describe what is going on in a picture while their voice is recorded and speech transcribed, “so it can be analyzed for neuromuscular and cognitive health,” says Bates. Other tasks could involve subjects playing games while the device tracks accuracy, response times, and eye movements. “All these different things combined give a holistic picture of what’s going on in someone’s brain,” he says.

Another initiative called the GameChanger project, led by digital phenotyping researcher Chris Hinds at the University of Oxford’s Big Data Institute in the UK, uses Mezurio, a phone app that administers game-like tasks to measure executive function, paired-association learning, and speech production, for 5 minutes per day for 30 days. Users also self-report mood, sleep, and any word-finding difficulties or disorientation. In this first phase of the project, anyone who hasn’t been diagnosed with cognitive impairment can download the app and take part. After a year, they are invited to take part in GameChanger again. Users’ responses provide information for researchers about brain function in people without dementia, and how the brain changes over time, according to the website of the UK-based Alzheimer’s Society, which partially funds the project. Since 2018, more than 16,500 people across the UK have used the app and contributed data. Hinds and colleagues note in a 2019 preprint about the app that they plan to study people with diagnosed cognitive decline, too.

Hurdles ahead

For the digital detection of cognitive decline to work, researchers need huge amounts of personal data to be transmitted, stored, and analyzed, making the privacy of participating users an obvious concern. There are some regulations protecting consumer data, such as the General Data Protection Regulation in the European Union, which requires companies storing data over long periods to implement the “right of erasure,” allowing participants to delete their personal data. Several US states, including California, Massachusetts, and New York, have data privacy laws, and the US Health Insurance Portability and Accountability Act of 1996 (HIPAA) led to federal standards to protect health information in electronic form. For studies such as the one by Evidation and collaborators, this could mean setting up the personal devices in such a way as to limit the data sent to the computing facility. Another solution could be setting up local data storage instead of a centralized computing center. “The ethical implications of any further development run deep,” writes Nikki Marinsek, a data scientist at Evidation Health, in an email to The Scientist. “The data collected is very sensitive, and privacy must be the first consideration when dealing with this kind of data.”

Some people with cognitive impairment may also have difficulty understanding the consent they need to give to share their personal data with a company. Many studies enroll a partner or family member for each person with cognitive decline, to help them use the technology appropriately and ensure they’re well cared for. Another concern is how to provide access to the technologies needed to do the monitoring: personal devices such as iPhones and Apple Watches aren’t cheap and may be difficult to use for some people, even with assistance. “The question is, can you deploy them at scale, economically?” says Novartis’s Jones. He recommends researchers use something economical such as a smart speaker—some of which cost as little as $30—to collect speech and other data on participants, rather than a several-hundred-dollar iPhone or Apple Watch. Researchers at Dartmouth-Hitchcock Medical Center in New Hampshire and the University of Massachusetts, Boston, recently won a grant worth more than $1 million from the National Institutes of Health to study whether voice assistants such as Alexa (used by Amazon Echo) or Google Assistant (Google Home) could be used to detect early cognitive impairment. (See “Listening for Your Health,” The Scientist, May 2019.)

Overall, researchers are enthusiastic about the potential for digital technology to improve early detection of dementia. Au estimates that it’ll be less than five years before there’s a well-validated digital phenotype that will be able to identify people who are at a higher risk of developing dementia over the following decade or so. “We have technologies that allow us to now track behaviors in much more continuous, granular ways, so we can sort people out into various subgroups with much greater precision,” Au says. “On top of that, we have more advanced analytic capabilities that are allowing us to look at multidimensional sources of information. These are all advances that are happening simultaneously . . . we are getting closer, faster. I’m quite optimistic.”

Rachael Moeller Gorman is a Boston-based science journalist. Find her at or on Instagram @rachaelmoellergorman.

EARLY TREATMENTS?

Early detection is an effective tool in slowing disease progression when treatments are available. But currently, there are no cures available for dementia, and pharmaceutical companies are increasingly reluctant to invest because so many trials have failed, says Arlene Astell, who researches neurodegenerative diseases at the University of Reading in the UK. There’s some hope in aducanumab, Biogen’s amyloid-β clearance drug. The company’s clinical trial of the therapy in patients with early Alzheimer’s disease was halted last year when it seemed patients weren’t improving, but a later analysis of patients who had taken higher doses of the drug did show clearance of amyloid-β plaques and improvement of cognitive function. If aducanumab is eventually approved, it will be the first drug to both reduce clinical decline in Alzheimer’s disease and show that removing amyloid-β leads to a better outcome.

But if the disease is caught early enough, lifestyle interventions may help. The FINGER (Finnish Geriatric Intervention Study to Prevent Cognitive Impairment and Disability) trial, for example, recruited more than 1,000 people deemed at risk of cognitive decline on the basis of educational attainment, physical activity, cardiovascular health, and other factors known to influence risk. The study found that people who followed a two-year regime of exercise classes, diet plans, computer work, puzzles and games, and social activity, plus monitoring of metabolic and cardiovascular risk factors, scored 25 percent higher on neuropsychiatric tests than control participants, 83 percent higher on executive functioning, and 150 percent higher on information processing speed. “You’re using your brain circuitry slightly differently, and quite aggressively,” says Graham Jones, director of innovation at Novartis Technical Research and Development, of the program’s participants.
Studies in several other countries, including the United States, Singapore, and Australia, are assessing the effectiveness of the FINGER model in different populations as well. Dubbed World Wide FINGERS (WW-FINGERS), this collaboration hopes to harmonize research and share data. Smart devices that can detect early signs of dementia may motivate their owners to actively engage in these interventions. With the population aging in the US and many other countries, “a lot of people are gonna have Alzheimer’s,” says Jones. “So it’s a real, real issue that’s got to be dealt with, and I think you’ve got to start very early on.”


Advancing Cancer Vaccines

Each individual tumor is unique, carrying its own distinguishing antigens, which can complicate the cancer vaccine development process. Cancer vaccines target unique tumor antigens, including overexpressed healthy proteins or virus-derived proteins. To better characterize, identify, and understand the dynamic interactions between the immune system and cancer cell antigens, researchers can turn to advanced flow cytometry and live-cell imaging technologies. This webinar, brought to you by The Scientist and sponsored by Sartorius, will present new research and developments in the cancer vaccine field and delve into the intricacies of target discovery and vaccine development.

DAVID E. AVIGAN, MD Professor of Medicine Beth Israel Deaconess Medical Center

MARY L. (NORA) DISIS, MD Helen B. Slonaker Endowed Professor for Cancer Research American Cancer Society Clinical Research Professor Professor, Medicine, Adjunct Professor Pathology and Obstetrics and Gynecology, University of Washington Director, UW Medicine Cancer Vaccine Institute



Developing breast cancer vaccines with dendritic cell tumor fusions

Boosting immunity with cancer vaccines: A focus on breast and ovarian cancer immunology

Leveraging Advances in Predesigned Synthetic sgRNAs for Highly Functional and Specific CRISPR-Cas9 Gene Knockout

CRISPR-based genome editing has accelerated biological research and introduced great potential for studying and treating human diseases. The CRISPR-Cas9 system requires a Cas9 nuclease and a guide RNA, which may consist of either a CRISPR RNA (crRNA) coupled with a trans-activating crRNA (tracrRNA), or a single guide RNA (sgRNA) that combines the crRNA and tracrRNA in a single molecule. Both guide RNA formats can be chemically synthesized, offering advantages over expression systems. This webinar, sponsored by Horizon Discovery, will explain how synthetic guide RNAs are amenable to chemical modifications for increased stability, eliminate time-consuming steps of cloning and sequencing, and do not evoke the inherent immune response and cytotoxicity that accompany in vitro transcribed guide RNAs. They can be readily delivered into cells for high-throughput arrayed screening applications to expand the types of phenotypic readouts to high-content and morphology-based assays.

THURSDAY, MAY 21, 2020, 2:30 - 4:00 PM EST
REGISTER NOW!
KURT MARSHALL, PhD, R&D Scientist, Horizon Discovery

TOPICS TO BE COVERED • How to use Edit-R Synthetic sgRNA for robust CRISPR gene knockout • Functionality of sgRNA in different cell models, including primary T cells • Simplifying high-throughput loss-of-function screening with sgRNAs


Crucial Applications of Single-Cell Gene Expression and Immune Profiling for Infectious Disease Research

The ongoing coronavirus (COVID-19) outbreak has taken thousands of lives, and the number of infections is growing daily. In this webinar, sponsored by 10x Genomics, Brian Fritz will discuss the utility of single-cell technologies to advance infectious disease research, highlighting how the scientific community can respond to such events.


BRIAN FRITZ, PhD Associate Director, Strategic Market Development & Programs 10x Genomics

TOPICS COVERED • How to study adaptive immune response using 10x Genomics Single-Cell Immune Profiling • Combining single-cell multiomic readouts of immune cell biology • How to uncover hidden insights with single-cell immune profiling



Using 3-D Organoids to Answer Questions About Human Health

Studying layers of cells grown on flat surfaces leaves a lot to be desired, as cellular responses and gene expression change when cells are not in their native, 3-D arrangements. Researchers develop organoids from primary cell lines or stem cells, and these structures are similar in architecture to primary tissue, which makes them relevant models of in vivo conditions. Scientists use organoids to study many areas of human biology, including toxicology, infection, and cancer. Join this webinar, brought to you by The Scientist and sponsored by Bio-Techne and 10x Genomics, to learn how researchers use human cerebral and tumor-derived organoids to better mimic the state of living tissue for drug development and infection studies.

CATHRYN HAIGH, PhD, Chief, Prion Cell Biology Unit, Laboratory of Persistent Viral Diseases, National Institute of Allergy and Infectious Diseases, Division of Intramural Research, Rocky Mountain Laboratories, National Institutes of Health

DANIELA S. GERHARD, PhD, Director, the Human Cancer Model Initiative; Director, Office of Cancer Genomics, National Cancer Institute, National Institutes of Health

ORIGINALLY AIRED THURSDAY, APRIL 30, 2020
WATCH NOW!
TOPICS COVERED • Human cerebral organoid applications for prion disease • Insights from the Human Cancer Models Initiative



From Whence Memories? A new book explores how research through the ages has tried to map the intricacies of the human brain, including pinpointing the seat of memory. BY MATTHEW COBB


For centuries, scientists have been arguing about where memory resides in the brain. I explore the fascinating history of this quest to characterize the machinery of memory in my latest book, The Idea of the Brain.

Our modern understanding of the nature of memory can be traced back to the 1940s. The Canadian psychologist Donald Hebb argued that “memory must be structural,” and based on networks of cells. These networks, Hebb claimed, became better connected with repeated experience—the presentation of food after the sound of a bell, for example. This idea is often summarized as “cells that fire together wire together.”

At around the same time, McGill University neurosurgeon Wilder Penfield showed that it was possible to evoke very precise, eerie memories by stimulating a particular part of the human brain. Often Penfield’s patients heard sounds—a piano being played, someone singing a well-known song, or a telephone conversation between two family members. Memory, or at least access to it, was clearly highly localized.

Another profound discovery was made by Penfield’s colleague Brenda Milner. In 1953, a young man called Henry Molaison underwent extensive psychosurgery to relieve his debilitating epilepsy. As a result of the operation, Molaison was unable to form any new memories. He lived in a perpetual present, with no knowledge of anything that had happened after that fateful surgery. Milner and her colleagues suggested that the decisive damage had been done to Henry’s hippocampuses, two structures, one on either side of the base of the brain. But the hippocampuses are not the site of memory storage. Rather, these brain regions are the encoders and the routes through which memory formation

58 | THE SCIENTIST

seems to pass. The memories that are processed by the hippocampuses seem to be distributed across distant regions of the brain.

As for how those complex networks of cells change their activity during learning, this was shown in large part through the work of a brilliant young physician who was inspired in 1957 to study memory by reading Brenda Milner’s first paper on Molaison. This physician, Columbia University neuroscientist Eric Kandel, investigated the nature of memory in a simpler form—in the neurons of a large sea slug known as Aplysia. By studying changes in the electrical and chemical activity of nerve cells in this mollusc, Kandel was able to provide support for Hebb’s suggestion that the links between neurons, known as synapses, become stronger with learning. Once researchers tapped into the power of molecular genetics in the late 1980s, Kandel and his colleagues were able to reveal the genes and chemicals involved in learning, and to show that the same processes that underlie learning in the sea slug are also taking place, right now, in your head.

Although the mechanisms underlying memory have given up a few of their most basic secrets, we are still far from understanding what is happening when we learn and remember. Despite Penfield’s unnerving findings, we do not appear to be perpetually recording our whole lives, and the link between normal memory retrieval and experimentally triggered recall remains unclear.

The search for memory’s seat in the human brain seems even more complex today. Researchers now know that memories may not be found in a single place, but precise cells and structures play a key role in memory formation and recall.

Basic Books, April 2020

Memories are often multimodal, involving place, time, smell, light, and so on, and they are distributed across the cortex through intricate neural networks. Our brains might be like computers in terms of how they sometimes process information, but the way we store and recall our memories is completely different. We are not machines, nor are we like any machine we can currently envisage. g Matthew Cobb is a professor in the School of Biological Sciences at the University of Manchester in the UK, where he studies olfaction, insect behavior, and the history of science. Read an excerpt of The Idea of the Brain: The Past and Future of Neuroscience at Follow him on Twitter @matthewcobb.

Monitoring Viral Cytopathic Effects

Native & Recombinant DNase I for Rapid & Complete DNA Digestion and RNA Purification
• Partially and highly purified grades to remove DNA
• Available in versatile lyophilized and liquid forms
• Convenient and stable, ready-to-use liquid forms
• Native and recombinant animal-free grades
• Reduces cell clumping for flow cytometry and analysis

The CP96 instrument follows virus-induced cytopathic effects using ECIS® impedance measurements. • Ideal for studying the effects of viral infections and their response to therapeutic compounds • Cell morphology changes and loss of viability reported in real-time graphical format • Measurements are continuous and label-free

The Guide

Removing DNA? Need Clean RNA?

Free evaluation includes 96 well microplates. Complete system under $20,000.

WORTHINGTON BIOCHEMICAL CORPORATION Phone: 800.445.9603, Fax: 800.368.3108


Customizable NGS Panels SureSeq™ myPanel™

Imaging Software for Microscopy cellSens

• Now include accurate detection capabilities for translocations and difficult-to-sequence partial tandem duplications • Beneficial to researchers investigating myeloid disorders like chronic myeloid leukemia, myeloproliferative neoplasms, and acute myeloid leukemia, now enabled with BCR-ABL fusion gene and KMT2A-PTD detection • Expanded content enables OGT SureSeq myPanel™ custom panels to be customized to include the BCR-ABL gene fusion

• Offers significantly improved segmentation analysis, such as label-free nucleus detection and cell counting, for more accurate data and efficient experiments • Can identify and segment nuclei from simple transmission images so that fluorescent labeling is not required • Deep-learning technology provides accurate analysis data from low signal-to-noise ratio images • Saves time by identifying and counting mitotic cells automatically



Cell Culture Products ProCulture®

One-Step RT-qPCR RT-qPCR Kits QuanTASE & QuanTASE PLUS

• Cover multiple steps of the cell culture process from isolation to harvesting • Products include an array of shaker flasks, spinner flasks with a unique impeller that increases aeration and eliminates dead spots, and an orbital shaker platform that converts an existing magnetic stir plate into an orbital shaker • Simplify researchers’ cell culture experiments

• Have been successfully tested in assays with EUA-authorized qPCR detection kits, following the established Centers for Disease Control and Prevention testing protocol • Test amplifies and identifies genetic material from SARS-CoV-2, the virus that causes COVID-19, in one step • Operate on most existing laboratory testing equipment in the US, and include all required RT-qPCR reaction reagents needed to run a COVID-19 test





Savant in the Limelight, 1988–2009 BY SUKANYA CHARUCHANDRA


MEMORY MASTER: Kim Peek in 2006. Among Peek’s many remarkable feats was memorizing the index of a set of encyclopedias, as well as a number of passages from other books, at age six. Over the years, he would come to recall from memory zip codes for certain areas, call letters for all the regional television stations, and telephone area codes as well as facts drawn from world history, geography, literature, popular culture, and more.

Even for Darold Treffert, an expert in the study of savants who has met around 300 people with conditions such as autism who possess extraordinary mental abilities, Kim Peek stood out from the pack. Treffert first spoke with Peek on the phone in the 1980s. Peek asked Treffert for his date of birth and then proceeded to recount historical events that had taken place on that day and during that week, Treffert says. This display of recall left Treffert with no doubt that Peek was a savant.

Peek’s abilities dazzled screenwriter Barry Morrow when the two men met in 1984 at a committee meeting of the Association for Retarded Citizens. Morrow went on to pen the script for the 1988 film Rain Man, basing Dustin Hoffman’s character on Peek.

The concept of savant syndrome dates back to 1887, when physician J. Langdon Down coined the term “idiot savant” for persons who showed low IQ but superlative artistic, musical, mathematical, or other skills. (At the time, the word “idiot” denoted low IQ and was not considered insulting.)

Nine months after Peek was born in 1951, a doctor told his family “that Kim was retarded, and they should put him in an institution and forget about him,” says Treffert. “Another doctor suggested a lobotomy, which fortunately they didn’t carry out.” Instead, his parents raised him at home in Utah, where he raced through books, memorizing them. Despite his feats of memory and other abilities, such as performing impressive calculations in his head, Peek never learned to carry out many everyday tasks, such as dressing himself.

MRIs would later reveal that Peek had abnormalities in the left hemisphere of his brain and was missing a corpus callosum, the structure that controls communication between the two cerebral hemispheres. Peek was diagnosed at one point with autism and later thought to have a genetic condition called FG syndrome, which affects both the brain and body.

Pamela Heaton, a professor of psychology at Goldsmiths, University of London, notes that exceptional abilities like Peek’s are seen more often in people with autism than in those with other conditions. An affinity for structure, an ability to recognize patterns in data, and elevated perception seem to play a role in the abilities of autistic savants, according to Laurent Mottron, a professor of psychiatry at the University of Montreal.

Up until his death in 2009, Peek served as an advocate for people with disabilities, showcasing his incredible memorization skills to people he met at speaking engagements. Treffert, who proposed that savants fall into three categories based on the types of skills they possess, believes that Peek was a rare “prodigious savant,” meaning that his abilities stood out even compared to neurotypical individuals.

Research on, and interventions for, people with savant syndrome, autism, and other intellectual conditions have progressed considerably since Peek was born. “There was a very different view then than now,” due in large part to Rain Man, says Treffert. Mottron, though, suspects the neurodiversity movement—which advocates for respect, equality, civil rights, and inclusion for neurodivergent individuals—has done more to change public perception of such conditions. While Treffert’s categories are just one of the many ways researchers seek to understand savant syndrome, no framework has yet emerged that can account for all cases of the condition. He suggests that more research on these individuals could help disentangle the mechanisms not just of autism and savant syndrome, but of human memory more generally. g



WITH MORE OPTIMIZATION FOR HIGH-COMPLEXITY IMMUNO-ONCOLOGY RESEARCH WITH BD INSTRUMENTS, REAGENTS AND PANELS. BD is your partner in immuno-oncology from discovery research to clinical applications, with a range of high-performance solutions designed to give you high-quality, reliable research data for even the most complex experiments. More solutions. More answers. More data you can trust. Discover more with BD at your side. Discover the new BD.

Learn more at For Research Use Only. Not for use in diagnostic or therapeutic procedures. BD, San Jose, CA, 95131, U.S. BD and the BD Logo are trademarks of Becton, Dickinson and Company or its affiliates. © 2019 BD. All rights reserved. 0619-2569