GPS to brain scans: the remarkable LEGACY OF THE PENDULUM
The puzzling persistence of IMMUNE CELL MEMORY
AMERICAN
Scientist March–April 2021
www.americanscientist.org
Portrait of a Pandemic Examining the virus through an artistic lens brings its details into focus
PLUS Why can't some people make images in their minds?
Bad to the Bone Full tang stainless steel blade with natural bone handle —now ONLY $79!
The very best hunting knives possess a perfect balance of form and function. They're carefully constructed from fine materials, but also have that little something extra to connect the owner with nature. If you're on the hunt for a knife that combines impeccable craftsmanship with a sense of wonder, the $79 Huntsman Blade is the trophy you're looking for.

The blade is full tang, meaning it doesn't stop at the handle but extends to the length of the grip for the ultimate in strength. The blade is made from 420 surgical steel, famed for its sharpness and its resistance to corrosion. The handle is made from genuine natural bone, and features decorative wood spacers and a hand-carved motif of two overlapping feathers—a reminder for you to respect and connect with the natural world.

This fusion of substance and style can garner a high price tag out in the marketplace. In fact, we found full tang, stainless steel blades with bone handles in excess of $2,000. Well, that won't cut it around here. We have mastered the hunt for the best deal, and in turn pass the spoils on to our customers. But we don't stop there. While supplies last, we'll include a pair of $99 8x21 power compact binoculars and a genuine leather sheath FREE when you purchase the Huntsman Blade.

Your satisfaction is 100% guaranteed. Feel the knife in your hands, wear it on your hip, inspect the impeccable craftsmanship. If you don't feel like we cut you a fair deal, send it back within 30 days for a complete refund of the item price.

Limited Reserves. A deal like this won't last long. We have only 1120 Huntsman Blades for this ad only. Don't let this beauty slip through your fingers. Call today!
EXCLUSIVE
FREE
Stauer® 8x21 Compact Binoculars, a $99 value, with purchase of Huntsman Blade
Huntsman Blade $249*
Offer Code Price Only $79 + S&P Save $170
1-800-333-2045
BONUS! Call today and you’ll also receive this genuine leather sheath!
Your Insider Offer Code: HUK-01
You must use the insider offer code to get our special price.
Stauer®
Rating of A+
14101 Southcross Drive W., Ste 155, Dept. HUK-01 Burnsville, Minnesota 55337 www.stauer.com
*Discount is only for customers who use the offer code versus the listed original Stauer.com price.
California residents please call 1-800-333-2045 regarding Proposition 65 regulations before purchasing this product. Not shown actual size.
• 12" overall length • Includes genuine leather sheath
Stauer… Afford the Extraordinary.®
What Stauer Clients Are Saying About Our Knives
"The feel of this knife is unbelievable...this is an incredibly fine instrument." — H., Arvada, CO
Volume 109 • Number 2 • March–April 2021

Departments
66 From the Editors
67 Letters to the Editors
70 Spotlight  Unequal burden of urban heat • Cat chemistry • Bringing clarity to COVID-19 testing • Briefings
78 Sightings  A walk to remember
80 Technologue  Generating a greener future  Lee S. Langston
84 Engineering  Quick is beautiful, slow less so  Henry Petroski
88 Arts Lab  Painting a portrait of SARS-CoV-2  David S. Goodsell
94 Perspective  Why do virtual meetings feel so weird?  Elizabeth Keating

Scientists' Nightstand
118 Book Reviews  Plutonium legacies • How best to foster healthy behaviors • Abiding darkness

Feature Articles
98 Remembrance of Germs Past  How do our immune cells remember invaders for so long, and what could this ability mean for COVID-19 vaccines?  Marc Hellerstein
106 From a Swinging Chandelier to Global Positioning Systems  Calculus has unraveled mysteries that puzzled scientists for centuries, and it has led to technologies they never would have imagined.  Steven Strogatz
110 Blind Mind's Eye  People with aphantasia cannot visualize imagery, a trait that highlights the complexities of imagination and mental representation.  Adam Zeman

From Sigma Xi
125 Sigma Xi Today  Sigma Xi election results • Support student researchers • Virtual STEM Art and Film Festival showcased art–science collaborations
The Cover
A respiratory droplet harbors SARS-CoV-2, the virus that causes COVID-19, in this painting by biologist and artist David S. Goodsell. Respiratory droplets are expelled when we breathe, which is why face masks that catch those droplets can slow the spread of the pandemic. The surface of SARS-CoV-2 is covered with spike proteins, giving it a distinctive, crownlike silhouette (magenta). The droplet consists primarily of water, but it also includes molecules that are normally found in the respiratory tract, including mucus molecules (green), pulmonary surfactant molecules from the surfaces of respiratory cells (blue), and protective molecules from the immune system (tan). Goodsell describes his process of researching and creating molecular art in "Painting a Portrait of SARS-CoV-2" (pages 88–93). (Cover image courtesy of RCSB Protein Data Bank, doi:10.2210/rcsb_pdb/goodsell-gallery-024)
From the Editors
Switch Up Your Lenses
66
American Scientist, Volume 109
VOLUME 109, NUMBER 2 Editor-in-Chief Fenella Saunders Managing Editor Stacey Lutkoski Senior Consulting Editor Corey S. Powell Digital Features Editor Katie L. Burke Senior Contributing Editor Sarah Webb Contributing Editors Sandra J. Ackerman, Emily Buehler, Christa Evans, Jeremy Hawkins, Efraín E. Rivera-Serrano, Diana Robinson, Heather Saunders Editorial Associate Mia Evans Intern Reporter Madeleine Feola Art Director Barbara J. Aulicino SCIENTISTS’ NIGHTSTAND Book Review Editor Flora Taylor AMERICAN SCIENTIST ONLINE Digital Managing Editor Robert Frederick Acting Digital Media Specialist Kindra Thomas Acting Social Media Specialist Efraín E. Rivera-Serrano Publisher Jamie L. Vernon CIRCULATION AND MARKETING NPS Media Group • Beth Ulman, account director ADVERTISING SALES [email protected] • 800-282-0444
Courtesy of RCSB Protein Data Bank
The first step to finding an answer is to realize you need to ask the question, and that can be tricky. As we move along within our frameworks and routines, familiarity might prevent us from seeing outside our bubbles. It's often the case that if you break out, take a step back, or otherwise find a different angle, inspiration will strike. We all see the world through our personal lenses, to borrow a phrase common to psychologists and ethicists, who use the idea to remind us that people interpret the world differently. Indeed, Destinée-Charisse Royal, a senior staff editor at the New York Times, frequently recommends "loaning your lenses" as an effective tool in conflict resolution. But learning to switch out your standard lenses, and maybe see the world through someone else's, can be a fruitful skill to develop for many reasons.

Scientists know that artwork has explanatory power when used to illustrate discoveries, but some of them may not realize that art is valuable as a tool for making breakthroughs. David S. Goodsell would like to make that clear, and he describes in this issue's Arts Lab column ("Painting a Portrait of SARS-CoV-2," pages 88–93) how he uses his artwork to get new perspectives on science that interests him. Goodsell is a biologist, and he's also a renowned illustrator of the internal workings of cells and microorganisms. Goodsell emphasizes that when he's making a new art piece—one featuring SARS-CoV-2, for instance—he needs to ask a lot of questions about the structure and the function of the cells he is attempting to depict in detail. His searches for all the "bits of information," he says, often lead him to ask more questions and do more literature research across fields, to find the molecular-level specificity he requires. And he hopes that his work will induce experts in those fields to ask more questions as well.
In his SARS-CoV-2 Fusion painting (right), he notes that there’s still a lack of information about how the structure of a spike protein changes while it’s in the process of fusing with a cell membrane. He would be pleased if scientists in these fields are inspired to confirm or disprove his educated guesses. Marc Hellerstein, in “Remembrance of Germs Past” (pages 98–105), also discusses how a broad range of experience, and some serendipity, moved along his research regarding cell proliferation, which was a first step in tracking the life spans of T-cells. Hellerstein’s background working with glucose metabolism allowed him to see new ways of getting safe tracers into DNA at the time it is synthesized, and his connections with vaccine researchers let him work on the ways T-cells could mount a response to invaders not encountered for decades. Most of our analogies for gaining a new perspective are visual—a different point of view, a change of scenery. But as Adam Zeman points out in “Blind Mind’s Eye” (pages 110–117), not every brain can visualize. Some people simply can’t create an image in their mind’s eye based on a description. Zeman points out that people with this condition, called aphantasia, lead rich mental lives nonetheless, even though their sense of perception is perhaps more abstract than experiential. It’s a good reminder that there are lots of different ways to think about things. Has using a different mental lens (visual or not) ever resulted in a breakthrough for you? Let us know about it. You’re always welcome to reach out to us via our website form or on social media. —Fenella Saunders (@FenellaSaunders)
EDITORIAL AND SUBSCRIPTION CORRESPONDENCE American Scientist P.O. Box 13975 Research Triangle Park, NC 27709 919-549-0097 • 919-549-0090 fax [email protected] • [email protected] PUBLISHED BY SIGMA XI, THE SCIENTIFIC RESEARCH HONOR SOCIETY President Sonya T. Smith Treasurer David Baker President-Elect Robert T. Pennock Immediate Past President Geraldine L. Richmond Executive Director Jamie L. Vernon American Scientist gratefully acknowledges support for “Engineering” through the Leroy Record Fund. Sigma Xi, The Scientific Research Honor Society is a society of scientists and engineers, founded in 1886 to recognize scientific achievement. A diverse organization of members and chapters, the Society fosters interaction among science, technology, and society; encourages appreciation and support of original work in science and technology; and promotes ethics and excellence in scientific and engineering research. Printed in USA
Letters Wetland Carbon Storage To the Editors: I found the interview with Ariana Sutton-Grier, “Sinking Carbon in Coastal Wetlands” (First Person, November–December 2020), both interesting and timely, given the recent catastrophes of climate change and sea level rise the world is facing. As Dr. Sutton-Grier points out, coastal wetlands can be very effective at removing carbon dioxide from the air. But aren’t many of them already full? Most coastal wetlands have developed during the past few thousand years of the Holocene epoch when sea level has been fairly static. In each salt marsh, the vegetative layer has an upper and a lower boundary; the vegetation may continue to enrich the soil beneath it with carbon, but there is a limit to how high the vegetation can grow. So the vegetation doesn’t absorb above-ground carbon the way a forest does unless it actually converts to a forest—which may happen as inorganic sedimentation raises the soil level and the marsh is eventually lost. On a tectonically stable seacoast, a sea level increase of a few centimeters or even a few meters should have the effect of raising the top of a salt marsh
by that same amount, thereby increasing the thickness and volume of that organic layer. The marsh would thus become a small element of a natural self-regulating system for CO2, thereby mitigating global warming and sea level rise. As Dr. Sutton-Grier points out, salt marshes are effective carbon sinks only as long as we don’t mess with them. They provide a more permanent solution than reforestation for that purpose. In a world with ever-increasing demand for lumber and agricultural land, many forests unfortunately have to be regarded as repositories for temporary carbon storage rather than as permanent carbon sinks. David C. Bushnell Alamo, CA Dr. Sutton-Grier responds: This letter brings up an interesting topic: natural carbon sinks and whether they saturate or become “full.” We are still learning about whether even old-growth forest ecosystems are still sequestering carbon or whether they do reach a point where they might be considered full. However, where forests have been destroyed or degraded, we can restore
them. There are plenty of opportunities for natural climate mitigation in restored forests. Coastal wetland ecosystems have been experiencing very slow sea level rise for millennia, which has enabled them to continue to accumulate carbon over long periods of time. Coastal wetlands have been accumulating sediment and carbon to keep pace with sea level rise. Their location at the intersection of land and sea sets coastal wetlands apart from other natural climate mitigation opportunities because they continue to develop deep, organic-rich soils over time and are not a saturating sink. It is important not to become overly concerned with comparing and contrasting natural climate solutions to one another. We need rapid climate action in order to avoid very serious environmental impacts, which means we need to act on all possible climate solutions, including restoring and protecting all our carbon-absorbing ecosystems (including forests, grasslands, and wetlands) as well as decreasing energy use, increasing energy efficiency, and increasing renewable energy sources. We need immediate action on every solution to help us slow climate change.
American Scientist (ISSN 0003-0996) is published bimonthly by Sigma Xi, The Scientific Research Honor Society, P.O. Box 13975, Research Triangle Park, NC 27709 (919-549-0097). Newsstand single copy $5.95. Back issues $7.95 per copy for 1st class mailing. U.S. subscriptions: one year print or digital $30, print and digital $36. Canadian subscriptions: add $8 for shipping; other foreign subscriptions: add $16 for shipping. U.S. institutional rate: $75; Canadian $83; other foreign $91. Copyright © 2021 by Sigma Xi, The Scientific Research Honor Society, Inc. All rights reserved. No part of this publication may be reproduced by any mechanical, photographic, or electronic process, nor may it be stored in a retrieval system, transmitted, or otherwise copied, except for onetime noncommercial, personal use, without written permission of the publisher. Second-class postage paid at Durham, NC, and additional mailing offices. Postmaster: Send change of address form 3579 to Sigma Xi, P.O. Box 13975, Research Triangle Park, NC 27709. Canadian publications mail agreement no. 40040263. Return undeliverable Canadian addresses to P. O. Box 503, RPO West Beaver Creek, Richmond Hill, Ontario L4B 4R6.
Travel Adventures in Spring & Summer 2021
For information about Covid and travel, please call our office.

Arizona in Spring • April 18 - 25, 2021 • From Astronomy to Desert plants and wildlife!
Alaska Cruise: Whales and Glaciers • On Nat Geo Sea Lion • June 15 - 20, 2021 • With optional 7-day Denali extension
Grand Canyon, Zion, and Bryce Canyon • May 28 - June 5, 2021
Botanical Alaska • June 27 - July 7, 2021 • With local experts
Mystique of Morocco • April 9 - 21, 2021 • Rabat, the Atlas Mountains & Sahara
Greenland Annular Eclipse and Geomagnetic North Pole • June 3 - 13, 2021 • An extraordinary adventure
We invite you to travel the World with Sigma Xi!
SIGMA XI Expeditions THE SCIENTIFIC RESEARCH HONOR SOCIETY
Phone: (800) 252-4910
Columbia River Cruise On Nat Geo Sea Lion October 3 - 10, 2021 Discover the fascinating historical and geological heritage
For information please contact: Betchart Expeditions Inc. 17050 Montebello Rd, Cupertino, CA 95014-5435 Email: [email protected]
2021
March–April
67
Online | americanscientist.org

Science and Hip Hop
A new podcast series from Sigma Xi, "STEM+Art and Inclusive Science Communication," examines the role of music in creating more reflexive, equitable, and engaging science. https://bit.ly/3o2JdLE

Check out AmSci Blogs
http://www.amsci.org/blog/

Vaccinated People Still Need Masks
Two coronavirus vaccines have received stamps of approval from the U.S. government, but masks will continue to be necessary for a few more seasons. https://bit.ly/38WK3oT

Vaccines and Antiscience Beliefs
Vaccines prevent 2 million to 3 million deaths per year, but they are facing a formidable enemy: rampant misinformation. https://bit.ly/3bUeePJ
Australian Mammals To the Editors: In the article "The Elusive Dingo" (September–October 2020) author Pat Shipman writes: "The only placental mammals on the continent were humans, bats that presumably had flown there, and perhaps rats that had fled explorers' ships. Every indigenous Australian mammal was a marsupial and raised its young in a pouch." In fact, Australia has a very rich and exotic collection of indigenous rodents that long predate European colonization, and the Australian mammalian fauna includes some much loved egg-layers that have use for neither a placenta nor a pouch. Richard Gogerly, Melbourne, Australia

Editors' note: The online version of the article has been updated to more accurately describe the variety of indigenous Australian mammals.
Advice from Loyola To the Editors: The article "The Dangers of Divided Attention" by Stefan Van der Stigchel (January–February) confirms what Ignatius of Loyola and the Jesuit educators have taught in their schools and
Find American Scientist on Facebook facebook.com/AmericanScientist
Follow us on Twitter twitter.com/AmSciMag
Join us on LinkedIn linkedin.com/company /american-scientist
Find us on Instagram instagram.com/american_scientist/
Read American Scientist using the iPad app Available through Apple’s App Store (digital subscription required)
universities since the 16th century: Age quod agis (translated from Latin as “Do what you are doing”). Ed Janosko Wilmington, NC
How Are Our Readers?
Editors' note: In the From the Editors column, American Scientist editor-in-chief Fenella Saunders has asked readers to check in about how they are doing during the pandemic and period of social distancing. We received letters from many of you and are including a selection of those responses here.

To the Editors: I'm hopeful, thanks to Sigma Xi Communities. The online forum has enabled me to act on the deep sense of responsibility I feel for being among the luckiest people on Earth. As a full-time resident of Maui, I don't know anyone who has died of—or even contracted—COVID-19. I've been spending more time in my garden, where I look to nature as my teacher. I also struggle with a kind of survivor's guilt: How can I enjoy this paradise when my fellow humans are suffering? In my anguish, I turned to Sigma Xi Communities, and the roots of my sense of responsibility were watered by the wisdom that circulates in our conversations. Because many of the participants
are retired, I wonder: Can their wisdom be put to use beyond this forum? Kupuna science is my name for what I see sprouting in Sigma Xi Communities. Kupuna is the Hawaiian word for an elder who has acquired enough life experience to assume the role of community leadership. The term also refers to the starting point of the process of growth. In other words, wisdom is not just a crowning glory; it's also the start of new opportunities. Sigma Xi Communities is helping me to envision the key question of Kupuna Science 101: How can we take the knowledge we gained while pursuing our careers and repurpose it for our (planet's) health? Harriet Witt, Ha'iku, Maui, HI

To the Editors: I am a retired marine biologist and environmental scientist who likes to keep busy and to champion good environmental stewardship. Prior to the pandemic, I was busy with the local public schools leading career day classes on marine biology and environmental science and making presentations about backyard science. Since the lockdown and social distancing, I have been trying to be a good citizen by following all of the medical advice and am staying at home most of the time. The schools have not reopened, so my classroom time is in abeyance for now. I have used this interim to catch up on my backyard science and nature studies. I have been observing cicadas for many years in my yard and neighborhood, and am anxious about the return of the 17-year periodicals during the spring of 2021. I have pulled together all of my backyard data, and have even added a bit of new data to my records in preparation for 2021. Clarence Hickey, Rockville, MD

How to Write to American Scientist
Brief letters commenting on articles appearing in the magazine are welcomed. The editors reserve the right to edit submissions. Please include an email address if possible. Address: Letters to the Editors, P.O. Box 13975, Research Triangle Park, NC 27709 or [email protected].
Simplicity. Savings. Stauer®SMART
Best value for a Smartwatch...only $99! 3Xs the Battery Life of the top-selling Smartwatch
Smarten up
Some smartwatches out there require a PhD to operate. Why complicate things? Do you really need your watch to pay for your coffee? We say keep your money in your pocket, not on your wrist. Stauer®SMART gives you everything you need and cuts out the stuff you don't, including a zero in the price.
Keep an eye on your health with heart rate, blood pressure** and sleep monitoring capabilities. Track your steps and calories burned. Set reminders for medicine and appointments. StauerSMART uses Bluetooth® technology to connect to your phone. When a notification or alert arrives, a gentle buzz lets you know right away. When it comes to battery life, StauerSMART has one of the most efficient batteries available, giving you up to 72 hours of power. Most Smartwatches need to be charged every 24 hours. StauerSMART can get you through a three-day weekend without needing a charge. This is the smarter Smartwatch. And, at only $99, the price is pretty smart too. Satisfaction guaranteed or your money back. Try StauerSMART risk-free for 30 days. If you aren't perfectly happy, send it back for a full refund of the item price.
Stauer®SMART
• Track steps and calorie burn
• Monitor heart rate, blood pressure & sleep
• Set reminders for medicine & appointments
• Get notified of emails & text messages
• Personalize the dial with your favorite pic
• Up to 72 hours of battery life per charge
• Supports Android 4.4+, iOS 8.2 & Bluetooth 4.0+
Stauer®SMART gives you everything you want for only $99... and nothing you don’t.
Stauer®SMART $299† Offer Code Price
$99 + S&P Save $200
You must use the offer code to get our special price.
1-800-333-2045
Your Offer Code: STW-01
Rating of A+
Please use this code when you order to receive your discount.
• Supports Android 4.4+, iOS 8.2 & Bluetooth 4.0+
• Silicone strap
• Touchscreen with digital timekeeping
• Stopwatch timer
• Heart rate, blood pressure & sleep monitor
• Fitness tracker
• Notifications: text, email, social media, & calendar alerts
• Alarm clock
• Water resistant to 3 ATM
• USB charger included
Stauer®
Email and text alerts
Find my phone
Monitor heart rate
Track steps and calories
14101 Southcross Drive W., Ste 155, Dept. STW-01, Burnsville, Minnesota 55337 www.stauer.com
* Please consult your doctor before starting a new sport activity. A Smartwatch can monitor real-time dynamic heart rates, but it can’t be used for any medical purpose. † Special price only for customers using the offer code versus the price on Stauer.com without your offer code.
Stauer… Afford the Extraordinary.®
Spotlight | Health effects of redlining
Unequal Burden of Urban Heat Historically redlined areas are disproportionately affected by rising temperatures—a disparity that has significant health implications. Jeremy Hoffman was examining temperature maps of the city of Richmond, Virginia, when he noticed a disturbing pattern. Hoffman, a climate scientist at the Science Museum of Virginia, had been working with local nonprofits to measure the intensity of the summertime heat island effect, in which urban areas are hotter than the surrounding areas. The effect was pronounced in Richmond, but the city’s excess heat was not at all evenly distributed. “The hottest spots in the city tended to be areas with lower resources and minoritized communities,” Hoffman says. The heat island effect is caused in large part by brick, asphalt, and other urban building materials that absorb solar energy during the day and release it as heat in the afternoon and evening. Hoffman compares the heat island effect to walking across a parking lot on a hot day. You can feel the sweltering heat radiating up from the asphalt, and the rush of warm air as you pass by a running car. Tall structures block wind that could cool the parking lot, leaving you sweating. On the scale of a city,
these factors—the hard surfaces, the waste heat from vehicles and air conditioners, the absence of tree canopies and breezes—combine to make U.S. cities up to 4 degrees Celsius hotter than surrounding areas during the day. "Extreme heat is the leading cause of weather-related fatalities in this country," Hoffman says. "It's this silent killer." Heat increases physical stress and exertion; exacerbates preexisting respiratory, kidney, and liver problems; and has been shown to cause premature birth. Heat also increases economic burdens: A few degrees may be the difference between leaving your air conditioning off and turning it on, Hoffman notes. Within a city, trees and parkland can reduce the heat island effect, but such amenities are most often found in wealthier, predominantly white neighborhoods. Lack of green space could account for the hot spots Hoffman found in lower-income communities of color. The problem of urban heat and the disparities it exacerbates is only growing, as climate change increases the frequency and intensity of summer heat waves and urbanization makes cities even warmer. When "there are particular places in our cities that then turn up the heat even further," Hoffman adds, "that has a disproportionate impact on people's health." Hoffman's data tell a story of climate injustice: systemic, institutionalized racism that places the burden of climate change—including heat waves—disproportionately on poor Black and Brown communities. To better understand this phenomenon, Hoffman and his colleagues Vivek Shandas and Nicholas Pendleton decided to investigate whether historically redlined neighborhoods are hotter than those that weren't redlined.

Redlining began in the 1930s with the federally funded Home Owners' Loan Corporation (HOLC), which offered low-interest mortgage loans to help people keep their homes. However, these loans were not offered to everybody. To determine which areas were desirable for lending, the corporation drew up maps of about 240 U.S. cities, classifying neighborhoods in four categories from A ("best" investment) to D ("hazardous" investment). On these maps, "hazardous" areas were outlined in red, hence the term redlining; they consisted primarily of working-class Black and immigrant neighborhoods. Redlining was made illegal with the passage of the Fair Housing Act in 1968, but the damage was done. Historically redlined areas have suffered an ongoing legacy of disinvestment, resulting in lower home ownership rates, lower property values, and reduced credit access today. People living in historically redlined areas are exposed to more pollution than those in neighborhoods that were greenlined, and are at greater risk for a number of health conditions including cardiovascular disease and diabetes. However, no previous studies had explored the relationship between historically redlined areas and present-day temperature. Hoffman and his team set out to do just that by examining summer temperature data for 108 urban areas.

David Grossman/Alamy
Children in Bedford–Stuyvesant, a formerly redlined Brooklyn neighborhood, cool down in the water from a fire hydrant. Urban heat waves are becoming more intense because of climate change, leading to disproportionate health implications for those in hotter, historically redlined areas.
Science Museum of Virginia
HOLC Residential Security Map, 1937, Richmond, VA. Grades: A - best; B - still desirable; C - definitely declining; D - hazardous.
From J. S. Hoffman, et al., 2020, Climate, doi:10.3390/cli8010012

Within each city, they calculated the surface temperature anomaly, or how the summertime surface temperatures of each HOLC tier from A to D compared with the city's summertime surface temperature overall. The researchers also calculated the average percentage of impervious land surface cover (heat-absorbing materials, such as asphalt) and the percentage of tree cover for each HOLC category, because those factors have been shown to influence local temperature. The results were clear: In 94 percent of the cities studied, summer temperature anomalies followed a stepwise function, with each successive HOLC category from A to D significantly warmer than the category above. Historically D-rated areas were an average of 2.6 degrees warmer than A-rated areas, but the difference was as high as 7 degrees in some cities. Cities with large temperature anomalies demonstrated corresponding disparities in land use: The warmer neighborhoods had more impervious land surface cover and less tree cover than their cooler counterparts. Hoffman's results line up with previous studies, which show that historically redlined areas have higher rates of health complications, including asthma, kidney disease, and preterm birth, all of which are associated with extreme heat.

HOLC maps were typically accompanied by notes that influenced which neighborhoods were flagged in A-rated green and which ones in D-rated red. Greenlined neighborhoods were described as shady and leafy, whereas redlined areas were considered built-up and smelly—and were already recognized as hotter. "Think of the trajectory of those two different categories, where wealth and power is concentrated in one and completely deprived in another," Hoffman says. Accessing the amenities that keep a neighborhood cool, such as parks and tree-lined streets, and pushing back against freeways, strip malls, and factories that produce heat, takes economic and political power. According to Hoffman's hypothesis, the practice of redlining codified this longstanding power imbalance—already visible in the temperature differences between neighborhoods—and paved the way for decades of urban renewal and infrastructure projects that further reinforced it.
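The anomaly calculation described above is simple to sketch: group temperatures by HOLC grade and subtract the citywide mean from each grade's mean. The Python snippet below uses invented temperature readings for a single hypothetical city; the actual study drew on satellite-derived land surface temperatures for 108 urban areas.

```python
from statistics import mean

# Invented readings: (HOLC grade, summer surface temperature in degrees C)
# for one hypothetical city.
readings = [
    ("A", 30.1), ("A", 30.5),
    ("B", 31.2), ("B", 31.6),
    ("C", 32.4), ("C", 32.8),
    ("D", 33.5), ("D", 34.1),
]

# Citywide mean summer surface temperature.
citywide = mean(t for _, t in readings)

# Anomaly per HOLC tier: tier mean minus citywide mean.
anomalies = {
    grade: round(mean(t for g, t in readings if g == grade) - citywide, 2)
    for grade in "ABCD"
}

print(anomalies)  # anomalies increase stepwise from A to D
```

With these made-up numbers the A tier comes out coolest and the D tier warmest, mirroring the stepwise pattern the study reports; a real analysis would average over many surface pixels per tier and repeat the comparison for each city.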
Although overt forms of redlining are now illegal, Hoffman's research highlights the degree to which redlining continues to affect people's health and lives. The most important corrective step, he says, is boosting public investment in historically redlined communities. Multiple cities, such as Boston and Baltimore, are already using redlining maps to identify areas for tree planting; in the future, prioritizing historically redlined neighborhoods for urban greening and park building projects could help mitigate the disproportionate effects of heat waves. According to Hoffman, the inequalities we see today are the direct result of communities of color being left out of decision-making processes. If historically redlined communities are empowered to develop and implement their own responses to extreme heat waves, Hoffman expects to see better, more just results in the future. There are also steps that anyone can take to help, such as pushing local governments to build more bus shelters so that people who rely on public transportation have shade from the heat. "Climate change can feel very far away in time, in space, and in power," Hoffman says, "but there are things that we can do today in our own backyards that will keep people safe and healthy." —Madeleine Feola

Urban Heat Islands in Richmond, VA. Temperature difference from city average (degrees Celsius): –3.5 to –2.2; –2.2 to –0.9; –0.9 to 0.5; 0.5 to 1.8; 1.8 to 3.1.
This digitized version of the 1937 Home Owners' Loan Corporation Residential Security Map of Richmond, Virginia, (top map) indicates areas ranging from "best" (green) to "hazardous" (red). Decades of housing and development policies restricted communities of color to those red areas, which had fewer parks and other amenities. Today, the average temperatures of Richmond's neighborhoods correlate with the historic redlining (bottom map).
March–April
71
Infographic | Andy Brunning
CAT CHEMISTRY: ALLERGIES, CATNIP, AND URINE

WHY DO CATS GO CRAZY FOR CATNIP?
About 70 percent of cats are affected. The compound that causes catnip's effect on cats is nepetalactone. It binds to protein receptors in the cat's nasal passages. This triggers responses in the cat's brain that make the cat exhibit behavior similar to that triggered by cat pheromones, including rolling, rubbing, and salivating. The effects last for around 10 minutes before wearing off.

WHAT CAUSES ALLERGIES TO CATS?
An estimated 10–30 percent of people are allergic to cats. Eight different cat allergens are currently recognized by the World Health Organization. They're designated as Fel d 1–8. Of these, Fel d 1, found in cats' saliva, fur, and skin, is the primary cat allergen. This protein is found in over 99 percent of US homes; its biological function is unknown. (Note: Neutered males produce Fel d 1 at levels similar to those of females.) Inhalation of Fel d 1 causes an immune response in people who are allergic. As a defense mechanism, the body produces antibodies. These antibodies trigger the release of histamine from mast cells. Histamine causes the symptoms of allergies.

WHY DOES CAT URINE SMELL SO BAD?
Fresh cat urine doesn't have a strong odor. However, over time, it develops a strong smell. Felinine, an amino acid in cat urine, gets broken down by enzymes into 3-methyl-3-sulfanylbutan-1-ol (MMB). This, along with the ammonia produced by the breakdown, gives cat urine its pungent smell. (Note: Neutered males excrete felinine at levels similar to those of females.) Cat ketone (4-methyl-4-sulfanylpentan-2-one) is another key cat urine odorant. It also occurs naturally in Sauvignon grapes and is a key contributor to the odor of blackcurrants.

CAT LITTER COMPOSITION
Most litters use clays, such as calcium bentonite; some use biodegradable materials or silica gel. Cat litter absorbs urine and odors. In cases where cats urinate elsewhere around your house, enzymatic cleaners should be used. These break down odorous compounds.

© Andy Brunning/Compound Interest 2020 - www.compoundchem.com | @compoundchem Shared under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 license.
American Scientist, Volume 109
First Person | Michael Mina
Bringing Clarity to COVID-19 Testing Michael Mina is an epidemiologist at the Harvard T. H. Chan School of Public Health and the director of molecular diagnostics at Brigham and Women’s Hospital at Harvard Medical School. He started the Broad Institute’s COVID-19 testing program, which spearheaded automated PCR tests—the most commonly used tests for diagnosing the viral infection so far during the pandemic. PCR tests (PCR is short for polymerase chain reaction) for SARS-CoV-2 use a process that copies viral genetic material to make small amounts more detectable. Mina has been a leading voice advocating for the implementation of new testing strategies focused on screening, meaning identifying infectious asymptomatic individuals before they spread the virus, and of more widespread use of rapid antigen tests, which detect proteins that make up the virus, such as receptor proteins or components of the viral envelope. Mina spoke to digital features editor Katie L. Burke. How have past outbreaks affected the approach to the COVID-19 pandemic?
This virus has forced a reimagination of how testing can be used. As we’ve been developing vaccines, our past experiences led to an immense effort to use PCR-based testing and contact tracing with surveillance testing as our primary approach toward controlling this virus. Back in January and February [2020], every public health person pretty much was on board with saying that contact tracing is good when there’s not a lot of cases. Now, contact tracing is just sapping resources. Staying tethered to the lessons of the past has hindered us from dealing with this pandemic. It is the first time in the modern era we’ve seen a virus like this one: something truly aerosolizing, spreading very quickly before symptoms. What testing approach do you think is necessary instead?
What I’ve been pushing for is to entirely rethink what the purpose of testing is and what the downstream actions of testing must be. In this case, making the testing approach not so much a top-down public health surveillance, but rather a bottom-up personalized empowerment screening tool—meaning that I know my status, much like HIV—changes the whole game. We can have many people using simple rapid tests in their home, twice a week. Without that frequent testing, people never know they’re positive until it’s too late and they’ve already transmitted. You don’t have to contact trace anyone, because you have more people who would be testing themselves regularly than you’d ever get with a contact tracing program. This screening approach requires a new way to think and a different type of test. The responsibility and the power lie in the individual. In a pandemic like this one, which is so out of control, we can’t expect a small department of health that’s underfunded in a state somewhere to tackle this. If we don’t get the public’s help dealing with this public health problem, we will not succeed. We would need our best marketing agents to help educate the public about the role of testing. There are so many tools at our disposal to deal with the pandemic. We just haven’t been deploying them.
Some studies of rapid tests have reported low sensitivity, the measure of how well a test detects positive cases, which conflicts with much higher sensitivities reported in other studies. How do you explain this disparity?
PCR’s sensitivity is why we keep getting these metrics of 40 percent sensitivity among the rapid tests. You get this bimodal distribution [a graph with two peaks] in rapid test sensitivity, either 95 percent sensitivity on some papers or 40 percent or even worse. The latter papers are all comparing rapid tests to PCR positivity. Only about 20 percent of somebody’s time when they’re PCR positive is spent being infectious. That’s why the U.S. Centers for Disease Control and Prevention (CDC) says not to test again after you get out of isolation, because you’ll remain PCR positive. These antigen tests only tell you you’re positive when you’re infectious. If you’re just doing random samples of people who are asymptomatic, a person might not have ever known they were infected. You’re more likely to catch somebody during the 70 percent to 80 percent of the time that they’re PCR positive, but no longer transmissible.
Many people have heard that rapid tests may be less accurate. Why do you say they are essential to a better strategy?
The major misconception about these rapid tests is that they’re not as sensitive. That needs to stop. They are extremely sensitive to catch infectious people. We’ve run over 3,000 Innova tests [a paper strip rapid antigen test with a readout that looks much like a pregnancy test’s, one line indicating a negative result and two lines a positive one] at one school and have had zero false positives so far. What we’re finding is the Innova test is detecting the false positive PCRs. We’re running PCRs along with the antigen tests to try to understand the antigen tests. Antigen tests such as BinaxNOW are getting around a 1 in 1,500 false positive rate. Another point about sensitivity and rapid testing is if you’re not getting the test, or if a result is not available in the appropriate amount of time, then the most sensitive test in the world is useless. Right now, we have 100 people infect 140 new people on average. That means over a three- to four-week period, 100 people become 500. But what if we can just have those 100 infect not zero, but 90? That sounds like a bad testing program, but it’s a great one, because at a national scale after three or four weeks, those 100 people become 20 instead of 500. It just keeps going down from there. That’s the power of thinking about testing
through an epidemiological lens instead of a medical lens. When did you start seeing rapid antigen tests as an important part of an effective approach to the pandemic?
The first time I held a paper strip test was back in March or April [2020]. One of the companies told me they were not quite at the sensitivity we needed, but they could scale a lot. I started doing epidemiological modeling with my laboratory to think of new ways to approach a pandemic. At the Broad Institute, we run around 150,000 tests per day now. It’s probably the highest efficiency lab in the world. But even with that efficiency, I recognized it wasn’t going to be enough tests for what this country needs. I started recognizing, as we built more models, that it wasn’t even close. Frequency of testing and the speed to get results trump everything else. You could have sensitivity that’s a thousandfold lower [than that of PCR tests], but because the virus in everyone grows from 10 viral particles to a trillion, missing the first thousand is not a big problem. Within a 24-hour period, the SARS-CoV-2 virus will grow from just detectable on PCR to a billion viral copies. PCR is likely to miss the peak of people’s transmission. If you do a rapid test every two or three days, you’re likely to catch someone early in their course of infection before they go on to transmit to others. That’s what we learned from all the modeling [published on medRxiv.org in July 2020]. I started talking about this widely back in May. What could such a deployment of rapid tests look like?
There are different schools of thought about where they’re best used—what I call public health screening versus entrance screening. Both approaches use rapid tests, but one has an epidemiological goal in mind, and it’s the more powerful one, but one is maybe more politically palatable. Entrance screening is what you might expect—detecting infectious people before they access facilities. The first place that President Joe Biden will probably use these rapid tests is in schools by testing people on a frequent basis. I think we could use that for businesses, too, to keep the economy running, much more safely than we’re doing now. On the other hand, what I’d like to see, instead of entrance screening, is
public health screening. It’s mass distribution of rapid tests to people’s homes, a tool from the government, no strings attached, that you can use twice a week to test yourself with a simple paper strip. You swab your nose, put the swab on the paper, drop some contact solution on it, and get a result in five minutes. People get immediate feedback that they are safe to go about their day. But if I’m positive, I can see the [second] line [on the test result]. It’s not some weird phone call I’m getting that I may not trust. That empowers me to know I shouldn’t go visit my mom tonight, or go to church, because I don’t want to get my church sick. And you don’t have to tell anyone about it. There’s stigma that comes with getting COVID now. A lot of people don’t get tested because they don’t want their friends who they hung out with last night to be told they can’t go to work for 14 days. People are actively not getting tested because they don’t want to get contact traced. For those who want to report their testing results, you could have one-click reporting. On the other hand, if someone wants their privacy, they can have that too. Maybe they need to go to work. But if they know they’re positive, then they can go to work knowing that, and maybe they don’t eat lunch in the break room, or maybe they really wear their mask that day more than they normally would. Everyone at the aggregate level can take small mitigating steps and combined that can quickly get the R value [the average number of people infected by one infected person] of this virus below one. Public health testing is designed so that in weeks, truly, we could get R below one [an R value below one means the viral spread is declining]. Your and several economists’ modeling, published in October 2020 on the preprint site medRxiv.org, shows the economic benefit of national screening testing. What did you find?
A $5 billion or $20 billion program is less than 0.1 percent of what this virus will cost Americans. If we could use $20 billion to get these tests out to most households in America for five months to use twice a week, that gets R down below one. The return on investment is in the hundreds of billions of dollars. Right now, every single day, this virus costs America $16 billion. If we can get the economy open one week earlier, that much more than pays for the whole
program. But this public health screening approach could get the economy going months earlier. This program would ameliorate the pandemic, not just put a Band-Aid over its symptoms. Stopping the viral transmission has gains we can’t even appreciate. For example, if we can stop transmission as much as possible before we roll out vaccines, then we cut down on the potential for mutations that escape vaccination. With vaccine rollout already underway, why would screening testing still be worthwhile?
Screening testing can get the virus under control much quicker and should be seen as a support to the vaccine program. If we can get R below one without even getting the vaccine out, then we’re not in a crisis mode as we roll out the vaccine. It buys us a lot of time. If we can scale up the Innova test’s production to 20 million to 30 million per day, which is possible given their current production, we could get R below one by mid-March. That’s months earlier. We probably wouldn’t hit herd immunity through vaccines alone until the end of the summer or so. The other reason is, as we’re seeing in South Africa, Brazil, and many other places, mutant strains may well arise. We’ve created four vaccines for most of the Western world: Pfizer, Moderna, AstraZeneca, and Johnson & Johnson. All four target one spike protein [a receptor on the virus used to gain entry into host cells]. We have quadrillions of viral particles floating around today. It just takes one of them to find some new way to latch onto the cell, just by chance. The more we can stop spread before the vaccine, the lower the likelihood [that the virus will adapt to escape the vaccine]. But this is a big world. If the virus does escape immunity, these tests also can be our support system at that point. We’ll have another control mechanism that works. What do you do if you suspect your test result is a false positive?
We’ve found that when there is a false positive on one antigen test and you run the same test again, the false result doesn’t repeat. That’s important. One of the most powerful aspects of these tests is that you’re not waiting days to get a result. You can repeat every single positive you get immediately. The other way to check and reduce false positives is to use two different
antigen tests. They have different molecules on them, so if there’s discordance, you know the positive one was probably a false positive. In the case of two discordant results, people get confused about which one to believe. But that doesn’t take into account the correct likelihoods. If one is positive, it’s likely that another test that’s 95 percent sensitive or more will also turn positive. It’s very unlikely that it would happen to be the 0.2 percent that results in another false positive. There are easy ways to use the benefits of a rapid test. For various reasons, people say you need to confirm these antigen tests with a PCR test. That’s just wrong. When you have ready access to multiple antigen tests, you could take another one. The reason we’re using these antigen tests is because we don’t have enough PCR tests. We can’t confirm every negative if we use antigen tests frequently. We know that PCR tests can stay positive for weeks or months after the virus infected someone. It just takes a few days to kill the virus so that you’re no longer able to transmit it to others. Accessible over-the-counter tests are needed for effective screening of
asymptomatic cases. Why hasn’t the U.S. Food and Drug Administration (FDA) authorized more tests for this use?
The FDA only has a charge to evaluate medical diagnostic products. An over-the-counter product is already stretching what their interest is. That means it’s not a medical device. It’s a consumer device. This is a bad catch-22 that the government is in right now. We don’t have a regulatory landscape or an office that is charged with evaluating a test besides one that a physician would prescribe. The FDA is only in a position to evaluate a test the way that the companies want. Abbott can make only 1.2 million tests a day and doesn’t necessarily want to get an over-the-counter claim, because the test will have to be cheap. The FDA is not in a position to tell Abbott, “We’re giving you an over-the-counter claim for this test.” Our regulatory landscape has been designed to work with for-profit companies and commercialization. The Innova test turns out to be one of the best tests we have. I’ve been evaluating them all. We’re exporting them to the rest of the world. We need the U.S. Department of Health and Human Services (HHS) to authorize these differently. When
BinaxNOW was first purchased by the government [in September 2020], Abbott only had a prescription claim on it. HHS said you can use these tests off-label in congregate settings without a prescription as part of your public health surveillance within nursing homes or schools. People criticized Brett Giroir, the coordinator of U.S. testing at HHS, for going around the FDA. Our country is just not set up to deal with public health crises. How have the new variants of SARS-CoV-2 affected your approach?
People are realizing that we need to act quickly, or there could be a whole new cycle of this epidemic. It’s causing people to act how I think we should have been acting since February [2020]. I feel even more of a crisis—to make sure that these rapid tests are built in numbers that can serve not just the United States, but also central Africa. Some other countries don’t have any PCR testing. They have no way to monitor the virus and stop transmission. These rapid tests are accessible tools and can be made for 50 cents each. An extended interview is available online.
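The arithmetic behind Mina's "100 become 500" example is simple compounding of the R value over successive infection generations. A minimal sketch; the five-generation horizon is an assumption standing in for "three or four weeks," and his figure of 20 presumably reflects the additional suppression a test-and-isolate program adds on top of R = 0.9:

```python
def infections_after(r, start=100, generations=5):
    """Size of an infection cohort after compounding R over generations."""
    cases = start
    for _ in range(generations):
        cases *= r
    return round(cases)

print(infections_after(1.4))  # 538: roughly the "100 become 500" of unchecked spread
print(infections_after(0.9))  # 59: pushing R below one turns growth into decay
```

The point of the sketch is the asymmetry: a program that "only" cuts transmission from 140 to 90 secondary cases per 100 looks weak in a single generation, but it flips exponential growth into exponential decline.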
Grants in Aid of Research Spring Application Deadline: March 15, 2021 The Sigma Xi Grants in Aid of Research (GIAR) program has awarded funding to undergraduate and graduate students across disciplines since 1922. Grants of up to $1,000 are available in most disciplines. Designated funds from the National Academy of Sciences provide grants of up to $5,000 for astronomy research and $2,500 for vision related research.
Learn more: www.sigmaxi.org/giar [email protected] 800-243-6534
Open to qualified science and engineering students worldwide
Support GIAR: www.sigmaxi.org/give #myGIAR
Briefings
In this roundup, managing editor Stacey Lutkoski summarizes notable recent developments in scientific research, selected from reports compiled in the free electronic newsletter Sigma Xi SmartBrief. www.smartbrief.com/sigmaxi/index.jsp

The Grammar of Viruses
Natural language processing may help predict patterns of viral evolution. Computer scientist Brian Hie at the Massachusetts Institute of Technology hypothesized that the gene sequences of viruses have a structure analogous to that of languages, and that learning their structures could help predict viral escape—mutations that enable viruses to slip past the body’s defenses. He and his colleagues turned to natural language processing, a machine-learning technology that can teach computers the rules of a language so that, for example, a word processor can flag grammatical mistakes. They fed the genetic data of three viruses—influenza A, HIV, and SARS-CoV-2—into natural language processing algorithms, including thousands of variations for each viral sequence. The algorithms analyzed the sequences for structures similar to the syntax (grammar) and semantics (meaning) of languages, and learned which variations were associated with viral fitness. Hie and his colleagues found that for a mutation to have the potential for viral escape, it must maintain the syntax but change the semantics of the virus’s genetic sequence. Their innovation may help with the identification of dangerous viruses that could lead to future pandemics as well as the development of new vaccines.
Hie, B., E. D. Zhong, B. Berger, and B. Bryson. Learning the language of viral evolution and escape. Science doi: 10.1126/science.abd7331 (January 15).

U.S. Rivers Are Changing Color
A study of satellite images found that one-third of U.S. rivers have undergone significant color changes over the past few decades. Geologist John R. Gardner of the University of North Carolina at Chapel Hill and his colleagues evaluated 234,727 satellite images spanning the years 1984 to 2018 and covering more than 100,000 kilometers of rivers in the United States. The waterways had seasonal patterns of color shift, ranging from yellow to green, but the baseline colors of 33 percent of the rivers displayed an appreciable change in hue. The color variations are related to the levels of algae and sediment in the water, and they differed by location—the western rivers tended to become greener, whereas the eastern rivers trended more yellow. The changes were most pronounced near dams and urban areas, indicating that human development is a driver of the shift. Studying river color can help pinpoint areas that have experienced environmental stresses. The team’s novel method of examining waterways at a continental scale can help ecologists identify macrolevel trends.
Gardner, J. R., X. Yang, S. N. Topp, M. R. V. Ross, E. H. Altenau, and T. M. Pavelsky. The color of rivers. Geophysical Research Letters doi: 10.1029/2020GL088946 (January 7).

Brighter-than-Expected Universe
As anyone who has bought paint knows, there are many shades of black. Astronomers studying the expanses between stars in deep space are finding that the dark swaths of sky are a few shades brighter than expected. A team led by astronomers Tod R. Lauer of the National Science Foundation’s National Optical-Infrared Astronomy Research Laboratory and Marc Postman of the Space Telescope Science Institute examined images from the New Horizons spacecraft, which flew past Pluto in 2015 and is now more than 7.5 billion kilometers from Earth. At that distance from the Sun and other reflecting objects—especially the space dust that clouds the inner Solar System—the sky is much darker than any sky ever before seen; however, there is more light than the astronomers had anticipated. The team processed the images from New Horizons to eliminate all known sources of light, but they still found two times more light than predicted. Possible explanations for the additional glow include dwarf galaxies just beyond current detection capabilities or free-floating stars in the intergalactic void. Another explanation could be that there are many more galaxies than previous theories suggested. Ultra-deep field observations from the James Webb Space Telescope, set to launch this year, may help solve this mystery.
Lauer, T. R., et al. New Horizons observations of the cosmic optical background. Astrophysical Journal doi: 10.3847/1538-4357/abc881 (January 11).

Seals Vocalize Ultrasonically
Weddell seals chatter at frequencies beyond human hearing, possibly indicating a complex form of communication or even the use of echolocation. Their diverse vocal calls have been well documented for decades, but only in the sonic range. These seals, which live all around the Antarctic coast, can dive at least 600 meters deep for more than 80 minutes at a time. University of Oregon biologist Paul A. Cziko and his colleagues analyzed recordings from the McMurdo Oceanographic Observatory, a remote Antarctic station 21 meters underwater that operated from 2017 through 2019. They found nine distinct types of ultrasonic vocalizations—including chirps, whistles, and trills—ranging as high as 50 kilohertz. Some of these vocalizations dip in and out of the ultrasonic spectrum, which suggests that previously identified sonic calls might be part of more complex mixed-element calls. The function of ultrasonic calls is not known. One area for further exploration is whether the calls are used for echolocation. Previous studies have asserted that seals do not echolocate, but these new vocalizations suggest they might have that ability. The team’s findings also indicate that the calls of other seal species should be revisited to see whether ultrasonic vocalizations are more common than previously indicated.
Cziko, P. A., L. M. Munger, N. R. Santos, and J. M. Terhune. Weddell seals produce ultrasonic vocalizations. Journal of the Acoustical Society of America doi: 10.1121/10.0002867 (December 18, 2020).
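The escape criterion from the viral-grammar briefing above (keep the syntax, change the semantics) can be sketched over invented scores. The published method scores grammaticality and semantic change with a trained language model and rank-combines them; here both the scores and the mutation labels other than the well-known E484K are made up for illustration:

```python
# Toy ranking by the escape criterion: a mutation is a plausible escape
# candidate only when it is both "grammatical" (the mutated sequence still
# looks fit to the model) and "semantically" changed (its appearance to the
# immune system shifts). All scores are invented; X999Z is a fake mutation.
candidates = {
    # mutation: (grammaticality, semantic_change)
    "E484K": (0.90, 0.80),  # fit and antigenically shifted: escape-like
    "X999Z": (0.05, 0.95),  # big change but unfit: breaks the "syntax"
    "A222V": (0.85, 0.05),  # fit but unchanged "meaning": still neutralized
}

def escape_rank(cands):
    """Rank-combine both scores; the highest combined rank is the top candidate."""
    by_gram = sorted(cands, key=lambda m: cands[m][0])
    by_sem = sorted(cands, key=lambda m: cands[m][1])
    combined = {m: by_gram.index(m) + by_sem.index(m) for m in cands}
    return max(combined, key=combined.get)

print(escape_rank(candidates))  # E484K scores high on both axes
```

The toy makes the briefing's point concrete: a mutation that maximizes only one of the two scores is either unfit or still recognizable to antibodies; escape requires both.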
Flashback, 1886
During the second half of the nineteenth century, Americans engaged in commerce and the professions emulated their European counterparts by forming affinity groups to promote their disciplines. What seemed to be missing from the mix at the time was an honor society that promoted more than just a narrowly defined scientific discipline, but rather the study and encouragement of all scientific realms. It took a mechanical engineer to come up with the idea. In 1886, Frank Van Vleck and his student, William A. Day, suggested the need for an honor society geared toward the various scientific disciplines. Together with a few historian friends, they chose the Greek letters (sigma and xi) and developed a motto for the Society based on the two letters, “Spoudon Xynones,” which translates into “Companions in Zealous Research.”
“Companions in Zealous Research”
Nominate a new member today. Email: [email protected]
www.sigmaxi.org
Robert R. Morris. 2011. Sigma Xi, The Scientific Research Society 1886–2011. Marengo, Illinois: Walsworth Publishing Company.
Sightings
A Walk to Remember
A unique set of footprints crosses what is now White Sands National Park in New Mexico. The tracks of one person headed in a straight line are closely paralleled by prints pointing the opposite way, indicating a return journey. The tracks suggest a quick pace, and they slide around on the surface, as if it had been raining. And these tracks are at least 13,000 years old. Establishing an age for footprints is notoriously difficult. Those that are discovered are usually at the surface, where there is little material to use in radiocarbon dating. But these prints present a vignette that may exist nowhere else in the world: The first set of prints is crossed by those of a mammoth and a giant ground sloth—and then the same human’s prints cross over the animal prints. Those creatures died out some 13,000 years ago, so the prints are at least that old. “At that point, we were justifiably excited, because that co-association let us build up a picture of humans living at a really great antiquity in the landscape along with extinct fauna,” explained paleontologist Sally Reynolds of Bournemouth University in the United Kingdom. “That sort of insight into how ancient humans felt on their landscape in relation to the other fauna is something I don’t think you get from any other site in the world.” The track of more than 400 footprints was discovered in 2016 by Reynolds’s coauthor, David Bustos of White Sands National Park. The research team has since documented the prints using a method called structure from motion photogrammetry. Overlapping photographs taken around the item at oblique angles are used to create a three-dimensional model that can also determine the depth and curvature of each footprint measured. “It’s very important that we have a permanent record of these tracks before they disappear forever, because once you excavate them,
it’s a race against time, and they are essentially eroding in situ,” says Reynolds. “The photogrammetry is so much better than the more traditional ways of preserving the footprints, such as casting. The big, bulky casts sit in the museum gathering dust. We can share the 3D models digitally.” The models show an immense amount of variation among the prints, because of the slippery surface, but there’s little doubt that the prints all belong to one person, and the data show that they are the same coming and going. “I can’t say for sure that it wasn’t the twin of the person coming back,” quips Reynolds. “But the attributes of the person, the size of the foot and the height we can infer from that, and the walking style, are the same.” The measurements indicate the person was of the stature of a woman or an adolescent male. Along the path, a few child prints appear out of nowhere, so the person was carrying a toddler. But only in one direction—the variation in the load bearing on the footprints confirms that. “The little, tiny footprints are absent, and also there is slightly less marked asymmetry on the return journey than there is on the outward journey,” says Reynolds. The site is also a rare opportunity to quantify how much prints from a single trackmaker can vary, but Reynolds laments that there are not sufficient standards for comparison. “Going forward, I do hope that we’ll get one or two more students who are interested in going out and mucking around on the beach or in the mud with wet feet,” she says, “in order to generate some interesting prints for us to look at.”—Fenella Saunders
All images courtesy NPS/Bournemouth University
Preserved traces of a purposeful trip some 13,000 years ago say much about human and megafauna interactions.
A track consisting of more than 400 human footprints in White Sands National Park (left) is at least 13,000 years old, a date established by the crisscrossing tracks of a giant ground sloth (above, lower left image) and a mammoth (above, right, red arrows). Three-dimensional models of the prints (right and above) were created using multiple overlapping photographs, and the model is color-coded to show the depth of each print (blue is deeper). The track provides a rare opportunity to quantify the variation in prints produced by a single person on the same real-world, messy surface. At points along the track, small child prints appear (inset, left, and above, upper image), indicating the person was carrying a toddler. But the child footprints only appear in one direction, and the weight-bearing asymmetry in the prints shows the person was not carrying the child on the return journey. The animal prints also show that the mammoth paid no attention to the human prints, but the sloth reared up, perhaps indicating its wariness of possible human predators.
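The stature inference Reynolds describes typically rests on the anthropometric rule of thumb that foot length is roughly 15 percent of standing height. A minimal sketch; the ratio is the textbook approximation, and the sample print length is invented, not a White Sands measurement:

```python
def stature_cm(foot_length_cm, ratio=0.15):
    """Estimate standing height from foot (print) length.

    Uses the common forensic rule of thumb that foot length is about
    15 percent of stature. The default ratio is a rough approximation,
    not a value reported for the White Sands track.
    """
    return foot_length_cm / ratio

# An invented 23-cm print would suggest a person about 153 cm tall, the
# stature of a small woman or an adolescent, as the article notes.
print(round(stature_cm(23.0)))
```

In practice, researchers refine such estimates with population-specific regressions and with stride length, which also constrains walking speed.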
www.americanscientist.org
2021
March–April
79
Technologue
Generating a Greener Future
Combined cycle gas turbines are advancing electrical energy production.
Lee S. Langston
Powering the world while preserving the planet is a mounting concern. In an effort to decarbonize the production of electrical power, nations and large regions have recently added substantial amounts of wind, solar, geothermal, and biomass power. But these forms of renewable power take time to scale and likely cannot do so quickly enough for rapid decarbonization. Fortunately, the U.S. electrical system has still been significantly reducing carbon dioxide emissions over the past decade, thanks to the gas turbine. Gas turbine power plants have been generating electrical power in the United States for decades, providing reliable, consistent energy production. When paired with steam turbines in combined cycle power plants and powered by cleaner-burning fuels, they become dynamos of efficiency and CO2 reduction.

Powering the United States
In 2019, the United States was the second-largest producer of electrical energy, responsible for 16 percent of the world's 27,005 terawatt-hours. That year also marked a significant shift in energy production: Natural gas, fueling mostly gas turbine power plants, took the lead as the means by which electrical power was generated, amounting to 39 percent of the national total of 4,401 terawatt-hours. Coal fell into second place at 24 percent, followed by nuclear at 19 percent, renewables
(mostly wind and solar) at 11 percent, hydro (water turbines) at 6 percent, and other (mostly oil) at 1 percent. Ten years earlier, coal led as the dominant fuel, accounting for about 44 percent of annual U.S. electrical power, which was generated in steam turbine (Rankine cycle) power plants. In 2014, these coal-fired plants accounted for 76 percent of CO2 emissions for the U.S. electric power sector. Since 2011, more than 100 U.S. coal-fired power plants have been replaced with gas turbine (Brayton cycle) plants or converted to the use of natural gas as a fuel, reducing carbon emissions and ending coal's century-old dominance as the nation's leader in electricity production. Indeed, annual U.S. CO2 emissions from energy consumption in the electric power sector fell 32 percent (673 million metric tons) from the 2005 level. What is even more striking is that this reduction occurred in a period when total annual electrical net generation was close to level: 3,902 billion kilowatt-hours in 2005 and 3,878 billion kilowatt-hours in 2017. This annual CO2 reduction is significant—almost equal to the total 2017 CO2 emissions of Germany, the world's fourth-largest economy (763.8 million metric tons). So how did this happen? Between 2005 and 2017, coal use was replaced by a 19-percent increase in the use of renewables (mostly wind) and a 31-percent increase in the use of natural gas as a fuel in gas turbine power plants.
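As a back-of-envelope check on these figures, the sketch below recovers the implied 2005 emissions baseline and the change in carbon intensity per kilowatt-hour. The inputs are the numbers quoted above; the derived values are rough estimates, not official EIA statistics.

```python
# Back-of-envelope check of the U.S. power-sector CO2 figures quoted above.
# Inputs come from the article; derived values are estimates.

reduction_mt = 673.0    # CO2 reduction, million metric tons (2005 to 2017)
reduction_frac = 0.32   # the stated 32 percent drop from the 2005 level

# Implied 2005 baseline and 2017 level (million metric tons)
baseline_2005 = reduction_mt / reduction_frac    # about 2,103 Mt
level_2017 = baseline_2005 - reduction_mt        # about 1,430 Mt

# Net generation was nearly flat, so the drop is almost entirely a
# change in carbon intensity (kilograms of CO2 per kilowatt-hour).
gen_2005_bkwh = 3902.0   # billion kilowatt-hours
gen_2017_bkwh = 3878.0   # billion kilowatt-hours

intensity_2005 = baseline_2005 * 1e9 / (gen_2005_bkwh * 1e9)  # kg/kWh
intensity_2017 = level_2017 * 1e9 / (gen_2017_bkwh * 1e9)     # kg/kWh

print(f"implied 2005 baseline: {baseline_2005:.0f} million metric tons")
print(f"intensity: {intensity_2005:.2f} -> {intensity_2017:.2f} kg CO2/kWh")
```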
QUICK TAKE
A fuel's carbon dioxide emissions are determined by the ratio of the mass of CO2 produced to its heat content. Lower emissions can help slow global warming.
Coal-fired plants accounted for 76 percent of the U.S. electric power sector's CO2 emissions in 2014. Natural gas produces significantly lower emissions.
Combined cycle gas turbines driven by natural gas can reduce carbon emissions today, and they could be adapted to burn emission-free hydrogen in the future.

The Adaptable Gas Turbine
The gas turbine can be used to provide thrust power as a jet engine or shaft power to drive energy conversion devices such as mechanical drives, marine propulsion systems, and, primarily, electrical power generators. Nonaviation, land-based gas turbines are now a dominant means of electrical power production in the United States. Built of heavier construction than their aviation counterparts, they are closely integrated into the conversion devices they power. Because of their design, gas turbines have a number of advantages over steam power: They are compact, require less start-up time to reach peak output operation, have a lower initial cost, and produce large amounts of power considering their weight and size.

In 1939, the public power station in Neuchâtel, Switzerland, became the world's first industrial commercial plant to use the simple cycle gas turbine to generate electricity. Rated at 4 megawatts of output, that power plant reportedly had an overall thermal efficiency of 17.38 percent. Impressively, today's simple cycle gas turbines can reach thermal efficiencies above 40 percent in units that generate up to 500 megawatts.

When Two Are Better Than One
In the 1990s, gas turbine combustor and hot turbine technology advanced so that in electric power gas turbines, exhaust gas exit temperatures reached
538 degrees Celsius (1,000 degrees Fahrenheit). These exhaust gases were high enough in temperature to pass through a heat recovery steam generator and drive a steam turbine, generating more electrical power. By combining a gas turbine (a Brayton cycle heat engine) and a steam turbine (a Rankine cycle heat engine) in a single power plant, these two prime movers generate electrical power using one unit of fuel, with one thermodynamic cycle feeding the other (see the sidebar "Combining Turbines and Thermodynamic Cycles" for cycle and turbine details). These combined cycle gas turbine plants offer many of the same advantages as simple cycle gas turbine plants. They have low capital costs, ranging between $700 and $1,000 per kilowatt, compared to $3,000 and $6,000 per kilowatt for coal and nuclear, respectively. And because combined cycle gas turbine plants can rapidly start up and shut down as needed, they can provide reliable backup power for emergencies and for intermittent renewable power facilities.

Because of recent shale gas developments in the United States, many coal-fired Rankine cycle plants are switching to cheaper natural gas. But even when natural gas—a more efficient fuel source—is used in an existing steam power plant, combined cycle gas turbine plants still produce less CO2. Compared to gas-fired Rankine cycle power plants, combined cycle plants yield a 45-percent reduction in CO2 production. For example, the Lake Charles Power Station in Westlake, Louisiana, which began commercial operation in March 2020, is expected to emit 40 percent less CO2 than the previous, older gas-fired steam power plant.

But one of the most significant advantages is that combining these two cycles into one power-generating operation yields greater thermal efficiency. Using conservation of energy and the definition of thermodynamic thermal efficiency, the combined cycle thermal efficiency can be derived fairly simply as the sum of the two cycles' efficiencies minus their product. Operating alone, the thermal efficiencies of the Brayton and Rankine cycles can be estimated to be about 40 percent (a good value for modern gas turbines) and 30 percent (a reasonable value at typical conditions), respectively. Together in a combined cycle gas turbine plant, they achieve an estimated average 58 percent thermal efficiency—a remarkable increase.

Today, these combined cycle gas turbine power plants are approaching thermal efficiencies as high as 65 percent—double that of most existing single cycle coal-fired plants—with power outputs greater than 1,000 megawatts, making them the most efficient heat engines yet designed by humankind.

Fueled by Natural Gas
Supplied by a network of nearly 500,000 kilometers of interstate and intrastate pipelines—the world's most extensive natural gas pipeline system—gas turbine power plants in the United States use natural gas, composed mostly of methane, as hydrocarbon fuel. Methane is the most environmentally benign fossil fuel, with impurities such as sulfur (in the form of hydrogen sulfide) removed before the fuel enters pipelines. But its biggest advantage as a fuel choice is that it has the highest heating value of any hydrocarbon fuel (a category that includes butane, diesel fuel, gasoline, and coal), meaning that relative to other fuels it has higher energy content and produces less CO2.

A fuel's CO2 emissions are determined by the ratio of the mass of CO2 produced to its heat content. The U.S. Energy Information Administration provides data on CO2 emissions for common fuels: Subbituminous coal, the class of coal primarily used as fuel in steam-electric power generation, produces 92.13 kilograms of CO2 per gigajoule of energy. Natural gas, however, produces only 50.3 kilograms per gigajoule. Thus, on an energy input basis, subbituminous coal produces more CO2 mass than natural gas by a factor of 1.832 (rounded off to a factor of 2.0 for the following discussion).

A technician inspects a gas turbine rotor in a Siemens Energy turbine factory in Berlin. This H-class gas turbine acts as the primary power generator when used in combined cycle power plants and drives an overall cycle efficiency of 61 percent. Gas turbines are now being developed to operate using hydrogen as fuel. The Mitsubishi Hitachi Power Systems M501JAC, a J-series gas turbine, is currently capable of operating on a mix of natural gas and up to 30 percent hydrogen, with the ultimate goal of being fueled by 100 percent renewable hydrogen. Image courtesy Siemens Energy AG.

But the use of natural gas is not without downsides. Collecting, storing, and transporting natural gas has revealed other issues. Methane is a more potent greenhouse gas than CO2, so methane leakage—from tanks, pipelines, fittings, compressors, and valves—is one of the primary concerns with its widespread use as a fuel. Another concern is the use of fracking, which has been tied to increased water pollution and waste and
potentially even minor earthquakes. (See "Hydraulic Fracturing and Water Quality," September–October 2015.) Natural gas supply in the United States has been supplemented recently by the development of fracking—short for hydraulic fracturing—and the shale gas industry. Fracking uses high-pressure water to create fissures in rock to reach the oil or gas inside, which then escapes through these fissures and is collected and stored. Even considering these ecological concerns, natural gas achieves significant environmental benefits over coal and other fossil fuels.

Evolution of the Turbine
The first patent for a basic turbine was filed in 1791. Nearly 150 years later, in 1939, the first commercial use of a gas turbine for power generation was achieved when the simple cycle gas turbine was tested under full power in Switzerland. Since then, gas turbines have been rapidly improving: Worldwide engineering research and development have led to improvements in base materials, component design, and thermal efficiency, bringing about more efficient combustion, higher turbine inlet temperatures, and higher compression ratios. Today, gas turbines are being refined to deliver "greener" operation: to integrate renewable, emission-free hydrogen as a fuel (by better controlling air mixing) and to meet increasingly strict emissions regulations (by emitting less nitrogen oxide).

[Diagram: a simple cycle gas turbine, showing the starting motor, compressor, combustor, turbine, and generator. Courtesy ASME.]

Emissions and Thermal Efficiencies
Greenhouse gases released into the Earth's atmosphere would be significantly reduced if a substantial portion of coal-fired Rankine cycle power plants were replaced with natural gas–fired combined cycle gas turbine plants, because the latter produce less CO2 and have higher thermal efficiency. As discussed, the carbon content of a hydrocarbon fuel affects CO2 production. Using methane instead of coal reduces CO2 emissions by half, given natural gas's higher energy content per unit of carbon. Mathematically, this is represented by a CO2 emission coefficient ratio of two (that is, coal produces more CO2 than natural gas by a factor of two).

Then consider the differences in power plant efficiencies. More efficient power plants burn less fuel per unit of energy produced. The average thermal efficiency of coal-fired Rankine cycle steam plants is 30 percent, whereas the latest combined cycle gas turbine power plants are double that, 60 percent or higher, represented mathematically as a thermal efficiency ratio of two. When paired together, the rate of CO2 production for each fuel and the increase in plant thermal efficiency show just how substantial a reduction in greenhouse gas production can be achieved: CO2 production is reduced by a factor of 4. Using natural gas as the fuel reduces greenhouse gas production by lowering CO2 emissions during energy input. Then, using both Brayton and Rankine cycle turbines in the power plant lowers emissions during energy output by converting what was previously a waste product into even more electricity. Thus, replacing coal-powered Rankine cycle power plants with modern combined cycle gas turbine power plants and fueling the combined cycle plants with natural gas results in a substantial 75 percent reduction in CO2 production per unit of electricity, and it nearly doubles power plant thermal efficiency.

Natural gas–fueled combined cycle gas turbine power plants can provide a 75 percent reduction in carbon dioxide production per unit of electricity.

Not only can combined cycle gas turbines reduce greenhouse gas emissions, they can also be adapted to burn emission-free hydrogen, either mixed with natural gas or pure, to push these reductions even further.
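The efficiency and emissions arithmetic above can be checked in a few lines. This sketch uses the representative values quoted in the article (40 and 30 percent cycle efficiencies, and the EIA emission factors for subbituminous coal and natural gas); with the unrounded factors, the computed reduction lands near, rather than exactly at, the rounded 75 percent figure.

```python
# Combined cycle efficiency and the coal-versus-gas CO2 comparison,
# using the article's representative numbers.

def combined_cycle_efficiency(eta_brayton, eta_rankine):
    """Thermal efficiency of a combined cycle plant. The bottoming
    (Rankine) cycle recovers the heat the topping (Brayton) cycle
    rejects, so eta_cc = eta_B + (1 - eta_B) * eta_R,
    which equals eta_B + eta_R - eta_B * eta_R."""
    return eta_brayton + eta_rankine - eta_brayton * eta_rankine

eta_cc = combined_cycle_efficiency(0.40, 0.30)   # about 0.58, as in the text

# CO2 emitted per kWh of electricity =
#   (fuel emission factor, kg/GJ) * (0.0036 GJ per kWh) / (plant efficiency)
GJ_PER_KWH = 0.0036

coal_intensity = 92.13 * GJ_PER_KWH / 0.30    # coal-fired Rankine plant
gas_cc_intensity = 50.3 * GJ_PER_KWH / 0.60   # gas-fired combined cycle

reduction = 1 - gas_cc_intensity / coal_intensity   # roughly the 75 percent cited

print(f"combined cycle efficiency: {eta_cc:.2f}")
print(f"coal: {coal_intensity:.2f} kg/kWh, gas CC: {gas_cc_intensity:.2f} kg/kWh")
print(f"reduction: {reduction:.0%}")
```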
Integrating Green Hydrogen
A colorless, tasteless gas, hydrogen is the most abundant element in the universe. As a fuel, it is lightweight and energetic, and it has the highest specific energy of any fuel—about three times that of jet fuel. Hydrogen can be blended with natural gas, use existing natural gas pipeline infrastructure, and be stored for later use in natural or constructed salt caverns. And one of its most promising qualities is that it is nonpolluting: When combusted in gas turbine power plants, it would emit only water vapor. But unlike fossil fuels, free hydrogen isn't readily available. As a highly reactive element, it must be separated from its compounds. Currently, most of the hydrogen used commercially comes from reforming coal or natural gas—hardly a "green" process. However, hydrogen can also be obtained from the electrolysis of water—using electricity to split water into oxygen and hydrogen. If the electrical power used for electrolysis is obtained from renewables, such as wind or solar, we could call this hydrogen "green." As many nations pursue wind, solar, geothermal, biomass, and other renewables, it becomes more likely that green, emission-free hydrogen—created from a surplus of renewable energy—will be available to integrate into energy production. Companies and countries are currently researching hydrogen injection into gas pipelines and networks already in use by
power plants, and a number of pilot programs are in progress. Southern California Gas Company is field-testing the injection of green hydrogen blends (from 1 percent to 20 percent) into its gas pipelines, and utilities in the United Kingdom are blending hydrogen (up to 20 percent in one case) to fuel power plants. And more efficient, environmentally friendly "hydrogen-ready" plants are already in development: The Long Ridge Energy Terminal, a 485-megawatt plant being built along the Ohio River and scheduled to begin production in fall 2021, will use a blend of natural gas and 5 percent hydrogen, with the goal of using 100 percent hydrogen by 2030.

A Sustainable Future
Although renewable energy is growing in popularity, in 2019 coal still drove 36 percent of the world's electrical power (and a very high 65 percent of China's), and roughly 40 percent of the world's electricity is still generated in Rankine cycle coal-fired power plants. By contrast, renewable sources accounted for 35 percent of electric power generation in Germany and more than 25 percent in California. Renewable fuel sources have the potential to replace fossil fuels, perhaps eventually supplying up to 90 percent of energy production. But at the current rate of deployment, it might take more than a century to decarbonize fully, rather than by 2030 or 2040 as many have hoped. And renewable fuel sources are intermittent—cloud cover, nighttime, windless days, and other factors mean they are not always available or predictable. When the Sun doesn't shine or the wind doesn't blow (or blows too hard and exceeds wind turbine operational limits), reliable, on-demand electrical power is still needed at a reasonable cost. Nuclear power plants are a possible alternative, because they can provide consistent, reliable power without direct CO2 emissions.
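One caveat worth quantifying about the pipeline blends mentioned above: those percentages are by volume, and hydrogen carries far less energy per cubic meter than methane, so a 20 percent blend by volume displaces much less than 20 percent of the fossil energy. The sketch below uses typical lower heating values per cubic meter (roughly 10.8 MJ/m³ for hydrogen and 35.8 MJ/m³ for methane); these values are my assumption, not figures from the article.

```python
# Energy share of hydrogen in a hydrogen/natural gas blend.
# Lower heating values per cubic meter are typical textbook values
# (assumed here, not from the article).

LHV_H2 = 10.8    # MJ per cubic meter of hydrogen
LHV_CH4 = 35.8   # MJ per cubic meter of methane

def hydrogen_energy_share(vol_frac_h2):
    """Fraction of the blend's energy supplied by hydrogen,
    given hydrogen's volume fraction in the blend."""
    h2_energy = vol_frac_h2 * LHV_H2
    ch4_energy = (1 - vol_frac_h2) * LHV_CH4
    return h2_energy / (h2_energy + ch4_energy)

for pct in (1, 5, 20):
    share = hydrogen_energy_share(pct / 100)
    print(f"{pct:>2d}% hydrogen by volume -> {share:.1%} of blend energy")
```

Under these assumptions, a 20 percent blend by volume supplies only about 7 percent of the energy, which is why full decarbonization ultimately requires turbines that can burn much richer hydrogen mixtures.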
However, the economics of replacing the coal power-generating capacity with newly built nuclear power plants in the time needed is daunting—especially at $6,000 per kilowatt. And the questions of how to expand the nuclear supply chain and train the necessary nuclear engineers in such short order remain unanswered. At one-sixth the cost of nuclear, combined cycle gas turbine plants now have the highest thermal efficiencies ever recorded and have already proven capable of serving as reliable sources of dispatchable power generation, filling on-demand and surge energy output needs and gaps left by other electrical production sources.

Combining Turbines and Thermodynamic Cycles
Turbines are component parts of heat engines that use a working fluid to convert heat into mechanical energy: Gas turbines use air, and steam engines use water. Within a gas turbine, air and fuel are pressurized, heated, expanded, and expended. This thermodynamic cycle of using high-temperature gases from combustion to create energy is called the Brayton cycle. American mechanical engineer George Brayton patented the cycle in 1867 and was the first to successfully implement it, in 1872. However, because it was originally theorized by James Prescott Joule in 1851, it is also known as the Joule cycle.

With steam engines, fuel combusted in a boiler heats and converts water to steam, which can then expand in a steam turbine to produce mechanical power. The steam is then cooled at constant pressure, condensed into a liquid, and pumped back into the boiler. This process—the cyclical changes of temperature and pressure of the working fluid to produce energy—is named the Rankine cycle after Scottish engineer William Rankine, who developed it in 1859.

Each turbine uses shaft power to generate electricity. The combined cycle power plant uses one unit of fuel to drive both turbines and thermodynamic cycles. Using the exhaust gases that would normally be a waste product from the Brayton cycle of the gas turbine to power the Rankine cycle of the steam engine makes combined cycle plants the most efficient energy producers yet created.

[Diagram: In a combined cycle plant, air and natural gas (or oil) burn in the combustion chamber to drive the gas turbine and its electric generator; the exhaust gas then passes through a heat recovery steam generator (HRSG) to raise steam for the steam turbine and its electric generator, and a condenser cooled by water or air completes the loop. Illustration by Barbara Aulicino.]

Looking to the future of electrical power, gas turbine power plants—thanks to their robustness and adaptiveness—have a long, bright future as one of society's major energy converters. And when fueled by natural gas, or even potentially emission-free hydrogen in the future, combined cycle gas turbine power plants can meet these needs sustainably and maximize other renewable energy efforts. (References are available online.)

Lee S. Langston is a professor emeritus of mechanical engineering at the University of Connecticut. He regularly writes articles on gas turbine technology, aided by an early engineering career in the gas turbine industry. Email: [email protected]
Engineering
Quick Is Beautiful, Slow Less So
How can science better anticipate innovation and invest more in success?
Henry Petroski
Recently I came across my copy of Freeman Dyson's 1981 lecture, "Quick Is Beautiful," in which the mathematical physicist spoke about "ways in which the scientific community may help solve some of the urgent practical problems" facing humankind. Among the problems he described in his talk was the development of nuclear reactors, and how, in the mid-1950s, the fledgling General Atomics division of General Dynamics assembled a group of consultants (including Dyson) to come up with ideas for viable concepts that the company might pursue to commercialization. The team, under the leadership of physicist Edward Teller (a lead developer of the first thermonuclear bomb), proposed three novel systems. One was "built, tested, licensed, and sold within less than three years," and called the TRIGA reactor, an acronym for Training, Research, Isotopes, General Atomics. According to the company, the TRIGA reactor became "the most widely used non-power nuclear reactor in the world." To Dyson, it exemplified what he meant by "Quick Is Beautiful," a title that may have been inspired by economist E. F. Schumacher's 1973 book, Small Is Beautiful, which was read as an argument for the use of appropriate technologies.

Henry Petroski is the Distinguished Professor Emeritus of Civil Engineering at Duke University. Address: Box 90287, Durham, NC 27708.

Józef Tykociner demonstrated his system for combining sound with moving pictures in 1922, and the development was widely reported in newspapers. But patent battles delayed the technology's development for long enough that another technology superseded it.

Dyson's talk was the fifth Tykociner Memorial Lecture held at the University of Illinois at Urbana-Champaign. The eponymous series was established in 1972, having been made possible when engineer Józef T. Tykociner left his entire estate to the institution with which he had been associated for decades. In keeping with Tykociner's own broad interests, the lectures treated a wide range of topics selected by scholars in the humanities, physical sciences, and biology. The first memorial lecture, on "Talking Pictures," was delivered by Dennis Gabor, the Hungarian-British electrical engineer who invented holography, for which he received the Nobel Prize in Physics. Gabor and his topic were ideally suited for the inaugural lecture because 1972 marked the 50th anniversary of Tykociner's public demonstration at the University of Illinois of the first motion picture with synchronized sound. Other early Tykociner lecturers included Sir Isaiah Berlin, speaking on the divorce between the sciences and the humanities, and Sol Spiegelman, the molecular biologist whose technique of nucleic acid hybridization was preliminary to advances in recombinant DNA technology. Dyson was at the Institute
for Advanced Study in Princeton when he delivered his lecture, which was shortly after his popular book Disturbing the Universe was published. Tykociner lecturers following Dyson included the economist, computer scientist, and cognitive psychologist Herbert Simon, who spoke on progress in the science of research; and the chemist, novelist, and contributor to the development of an oral contraceptive, Carl Djerassi, who spoke on science in fiction. The list of Tykociner lecturers was as wide-ranging as their namesake's career.

Tykociner's Curiosity
Józef Tykociński-Tykociner was born in Włocławek, Poland, in 1877. His father was a grain broker and wanted Józef to follow in his footsteps, but the young man was drawn to more technical pursuits, although not to formal education. Instead of doing homework and laboratory assignments, he preferred to conduct his own experiments, especially in the developing field of shortwave radio technology. When he had the opportunity to ask fellow Pole Marie Curie for suggestions of what contemporary scientific questions needed answering, she mentioned the ability to send electronic messages around the world.

On his first trip to America, in 1896, Tykociner went to New York City, where he may have met Nikola Tesla, who had worked for the Edison Machine Works before striking out on his own. With financial support from partners, Tesla had set up laboratories in which he developed alternating-current induction and polyphase motors, whose patents were profitably licensed to the Westinghouse Electric Company. He was also pursuing ideas for wireless means of lighting and power distribution employing high-voltage, high-frequency electrical sources. He held out the hope of developing wireless communication systems by such means, but ran out of money for his independent experiments before realizing his goal.

Wireless telegraphy was something the Italian inventor and engineer (and Tykociner's close contemporary) Guglielmo Marconi was working on at the time. By the mid-1890s, Marconi had developed a communication system based on radio waves. Not finding much support in Italy for his work, he went to England, where the chief electrical engineer of the post office expressed interest in the apparatus, and it soon received a British patent, said to be the first in the field. Tykociner was working for the Marconi Wireless Telegraph Company in London in 1901, when Marconi demonstrated the transatlantic transmission of radio waves. When Tykociner returned to continental Europe with a wealth of experience and ideas, he worked for the German wireless telegraph company Telefunken. In 1904, he was assigned the task of developing a system of radio communication for the Russian navy, which took him to Russia. In recognition of his success in completing this task, Czar Nicholas II bestowed upon him honors variously described as a medallion or a jeweled gold watch. Following the Russian Revolution of 1917, Tykociner worked on radio problems for the government of Poland, which after more than a century of foreign rule had regained its independence.

Images courtesy of the University of Illinois at Urbana-Champaign Archives.
The Dream of Talkies
Concurrently with his other work, Tykociner had begun exploring the idea of adding sound to silent movies, a problem that really captured his imagination. At the time, motion pictures relied upon full-screen captions where context was needed, and on live piano or organ music to set the appropriate mood for the silent images being projected. It became Tykociner's quest to incorporate a synchronous soundtrack directly on the picture film. In this way, a "movie" could become a "talkie."

In 1920, he returned to America and worked in Pittsburgh as a research engineer for Westinghouse, but he was unsuccessful in convincing the research laboratory to support his work in sound on film. Five years earlier, Westinghouse had hired away from the University of Illinois an electrical engineering professor, who had taken his patents with him. Perhaps in recognition of that poaching, the company recommended that the electrical engineering department hire Tykociner. Thus he became an assistant research professor and worked on the design of antennas, a precursor to the development of radar, something Marconi had also been working on at the beginning of the 20th century.
Tykociner was able to record images of sound waves that could be converted back into sound as the projector's light passed through the film.

Tykociner still wished to pursue his ideas for sound on film, but he had "no proper laboratory, no money, and no help" to do so, as he noted in a 1967 interview. However, he did have drive, and he was able to find working space in the physics department, which was (and remains) part of the college of engineering at Illinois. The agriculture department was his source for a projector. There was nowhere at the university where he could secure a camera, so he used his own money for that. His early attempts employed the varying brightness of a sonically disturbed gas flame to capture sound in a visual form.

One of his colleagues in Urbana was Jakob Kunz, a theoretical physicist who pioneered the development of photoelectric cells used in astronomical photometry. Kunz's photoelectric cell, which converted light into electricity, used potassium and silver and was superior in sensitivity to anything Tykociner had been using. By employing a Kunz cell, Tykociner was able to record optically onto the film images of sound waves that, with a similar cell incorporated into the projection equipment, could be converted back into sound as the projector's light passed through the film being run. In order to amplify the sound sufficiently, he borrowed a device from the campus radio station, but was required to return it to the station every day in time for its evening broadcasts.

The year following his arrival at the University of Illinois, Tykociner was ready to produce films with sound. The head of electrical engineering must have been flattered when asked to be filmed reading the Gettysburg Address. Accounts vary as to what kind of audience witnessed the first demonstration of a talking motion picture. According to one story, it was the university trustees. Another version says Tykociner wished to give it before a joint meeting of the American Institute of Electrical Engineers and the Electrical Engineering Society. A more colorful account says the screening took place in an auditorium large enough to hold all engineering faculty members. During a dry run, a colleague noted that the noise made by the projection equipment was drowning out the sound of the movie. To obviate this problem, Tykociner positioned the equipment just outside the auditorium door, into which he bored a hole, thus creating the first projection booth. The dean of engineering reportedly did not appreciate the hole left in the wooden door and had it plugged with a piece of bronze.

For the demonstration, Tykociner produced a short film featuring his wife, Helena, holding a bell in her hand and saying, "I will ring the bell." Then she did so, and afterwards asked, "Did you hear the bell ringing?" The successful demonstration took place on June 9, 1922 (the same year Tykociner was inducted into Sigma Xi), and was covered by the influential New York World (owned by the Pulitzer family), the New York Times, and other newspapers from around the world.

The Need for Speed
Not everyone was ecstatic about talkies superseding silent films.
George Eastman, founder of Kodak and an expert on technical aspects of filmmaking, said, “I wouldn’t give a dime for all the possibilities of that invention. The public will never accept it.” The silent movie industry was also unenthusiastic about the new technology, considering it a toy. Changing from silent to talking films would mean investing in new 2021
March–April
85
William E. Sauro/The New York Times/Redux Pictures
Image courtesy of USGS
equipment, new writers, new actors, new editors, and who knew what else. However, the University of Illinois must have seen the new technology as a potential golden goose. According to a New York Times story published a month after the public demonstration, the president of the board of trustees declared that the invention belonged to the university and that “patents had been applied for by the school and the institution would develop the scheme, and if it were successful it would be turned over to the public at a nominal profit.” Three weeks later, the newspaper reported on a press release about the invention “sent out through the official news channels of the university” and quoted it at length. The news release described how Tykocin´er had worked for 20 years on the problem that he finally solved by optically recording action and sound simultaneously on a single strip of film. The visual component was by then old hat, but photographing sound was something new. Tykocin´er accomplished this feat by using sound waves to alter, via a photoelectric cell, the intensity of light from a mercury arc lamp focused on a narrow band of film beside the corresponding action. The varying light intensity created a film strip of varying transparency, which could then be decoded as a soundtrack when the film was projected for viewing. At the same time, the university also announced having “secured the second of its patents on apparatus fundamental in the art of talking motion pictures.” The first patent appears to have been on Jacob Kunz’s durable selenium cell. But Tykocin´er wished to retain patent rights on his invention, which the university was not about to give up without a fight. 86
American Scientist, Volume 109
Freeman Dyson (left) gave a Tykociner Memorial Lecture titled “Quick Is Beautiful,” in which he discussed the fast and successful development of the TRIGA nuclear reactor (above). However, Dyson noted that a safer reactor’s development was neglected, leaving the field behind the curve once the Three Mile Island nuclear reactor accident caused a scramble for safer technologies. (Photograph of Dyson: William E. Sauro/The New York Times/Redux Pictures; reactor image courtesy of USGS.)
It threatened to withhold funding for his continuing experiments, and the physics department demanded its space back. Tykociner reached out to Westinghouse, General Electric, and Western Electric for support, but to no avail. He was finally able to patent part of his invention, but by 1926 it was superseded by a process invented by Lee de Forest. Not being quick can be a showstopper. In the meantime, as most of Hollywood remained stubbornly opposed to advancing, Warner Brothers produced the first feature-length motion picture with sound synchronized not only to the gross action but also to the lips of the actors. The 1927 movie was the musical drama The Jazz Singer, starring Al Jolson, and it proved that the public not only accepted talkies, but would demand them.

Nonstop Innovation

Like a true inventor, Tykociner did not sulk over being left behind by the motion picture industry. He was already onto a new project involving microwaves, which required considerable experimental space. Fortunately, the University of Illinois is a land grant school, and abutting its campus are the university farms. When my wife, Catherine, and I were graduate students at Urbana in the 1960s, we often drove among the fields, pens, and pastures, enjoying the animals. Tykociner set up his microwave experiments among the livestock, but for him the cows proved to be more annoyance than entertainment. Not only did
the curious animals roam around and disturb his equipment, they also interfered with the transmission of the microwaves. These disturbances proved serendipitous, however, because they led to the discovery that objects in the path of microwaves reflect them. But the powers that be at the University of Illinois felt that Tykociner and his crew disturbed the cows more than the cows disturbed the experiments, and so the field laboratory had to be abandoned. The study of microwaves, which may have led to developments in radar, was thus curtailed. The inventor next wanted to study the phenomenon of piezoelectricity, in which mechanical stress applied to an appropriate substance induces an electric charge in it. This field was later proved by others to be a fruitful area of research and development, and resulted in such practical inventions as guitar and gramophone pickups. However, the university chose not to support it. Upon retiring from his position as research professor in 1948, Tykociner said, “I should now like to concentrate my work on the problem I started to investigate in 1927—namely, what are the conditions helpful in research activities.” He worked at developing a system that synthesized all knowledge, and he called it zetetics. One hagiographical account of Tykociner’s life and career characterized the “new research science” as meaning “to detect the most important lacks in human knowledge by providing a structured
overview of all creative advancement: engineering, art, anthropology, and so forth.” Furthermore, zetetics was “designed to help researchers avoid the situation where their valid work was left underdeveloped due to a lack of insight.” On March 3, 1959, Tykociner wrote in his diary: “The most important date in my entire scientific activity. More important than sound motion pictures, antenna models, etc. The manuscript for the Outline of Zetetics is complete for publication.” In 1966, it was published by the College of Engineering, and for general sale by the Philadelphia publisher Dorrance. In the early 1960s, an 85-year-old Tykociner came out of retirement to teach a course based on his new science. Hank Slotnick, professor emeritus of neuroscience at the University of North Dakota, was a graduate student at the University of Illinois in the late 1960s. His advisor, who knew of his interest in science and how it worked, suggested that Slotnick take the course taught by his neighbor. Catherine and I were also graduate students in Urbana in the 1960s, she in English and I in theoretical and applied mechanics. It was through her that I learned Tykociner was to give a lecture on zetetics to an English department group. We attended the lecture and, along with the rest of those in attendance, were impressed with the man’s energy and enthusiasm. I do not remember much about the talk itself, other than that it was illustrated with busy diagrams and was not especially easy to follow. However, as recently as the mid-1990s, some librarians did find the zetetic method to be of “continuing relevance to library science, a discipline for which Tykociner had a particular attachment.”

Catching a Wave

After leaving Urbana, I did not think or hear much about zetetics until Freeman Dyson sent me a copy of his lecture. Dyson had visited Duke in 1983, and I met him then at a reception sponsored by the university’s science, technology, and human values program.
We talked about, among other things, nuclear reactors, and afterward I sent him an essay I had recently published on the opposing philosophies embodied in two competing reactor designs, which prompted him to send me “Quick Is Beautiful.” We also talked about book publishing, and he allowed that a neighbor
of his had a collection of correspondence between her late husband, the editor Saxe Commins, and the playwright Eugene O’Neill that she wished to see published. Did I know a press that might be interested? A couple of years later, the letters were published by Duke University Press. Diverse individuals are connected in many and often unanticipated ways. Chance and circumstance provide opportunities to exploit such connections. This situation was true for Tykociner throughout his career, and he suspected it was also true about the human endeavors we call fields of study, research, and creativity. The phenomenon seems to have inspired his zetetics. In the “Quick Is Beautiful” lecture where Dyson described the success of the TRIGA reactor, he also related a follow-up project at General Atomics: the development of a high-temperature, gas-cooled power reactor known by the acronym HTGR. Theoretically, for thermodynamic reasons, it was much more efficient than water-cooled reactors, and it was inherently safer. Unfortunately, only two HTGR demonstration plants were built, one of which proved uneconomic to operate, and the other exhibited some curious core behavior. On the occasion of the 25th anniversary of its initial assembly, the team of consultants who had conceived TRIGA reassembled at General Atomics and learned of the company’s contrasting experience with the HTGR. As Dyson understood it, even though the HTGR had been determined to be 1,000 times safer than a water-cooled reactor, General Atomics had taken no action to go ahead with full-scale production. In the meantime, the 1979 accident at Three Mile Island rocked the nuclear industry, and safety became the central issue. The HTGR should have been the no-contest choice, but because General Atomics had declined to continue to invest in its development and full-scale demonstration, an example could not be ready for a dozen years. Dyson saw this lack of foresight as paradigmatic of private industry and government-sponsored research, and of development generally. The ability and will of institutions to react quickly to changing conditions is essential to maintaining technological superiority. He saw the experiences of nuclear physicists with fission reactors as having lessons for genetic engineering. He advised against embracing research and development programs projected to take decades rather than those with more focused objectives that could be achieved in just years. He firmly believed that quick is beautiful. It seems that he and Tykociner would have had much to discuss.

Bibliography

Anderson, B. 2013. Joseph Tykociner and the “talking film.” https://archives.library.illinois.edu/blog/joseph-tykociner-and-the-talking-film1/
Davis, M. A., and H. O. Davis. 1996. Current relevance of zetetics to library research and library instruction. Illinois Libraries 78:230–233. https://www.lib.niu.edu/1996/il9604230.html
Doering, P. F. 2003. The Tykociner Memorial Lectures. https://web.archive.org/web/20070107042854/http://doer.com/JTT/Lectures/index.html
Dyson, F. J. 1981. Quick is beautiful. Fifth Tykociner Memorial Lecture, April 7, University of Illinois at Urbana-Champaign.
Kępa, M. 2017. The scientist behind the first talkie: A secret Polish history. https://culture.pl/en/article/the-scientist-behind-the-first-talkie-a-secret-polish-history
Lewis, J. R. 1981. J. T. Tykociner: A forgotten figure in the development of sound. Journal of the University Film Association 33:33–40.
New York Times editors. 1922. Invents talking movie. New York Times (July 9).
New York Times editors. 1922. Talking film device reported in West. New York Times (July 31).
New York Times editors. 1969. Joseph T. Tykociner is dead; invented sound track system.
New York Times (June 12).
Schumacher, E. F. 1973. Small Is Beautiful: A Study of Economics as if People Mattered. New York: Harper & Row.
Slotnick, H. 2020. The first projection booth. Bemidji Pioneer (June 24). https://www.bemidjipioneer.com/opinion/columns/6544926-COMMENTARY-The-first-projection-booth
Tykociner, J. T. 1959. Research as a Science: Zetetics. Urbana, IL: Electrical Engineering Research Laboratory.
Tykociner, J. T. 1966. Outline of Zetetics: A Study of Research and Artistic Activity. Urbana, IL: College of Engineering.
Artist and biologist David S. Goodsell’s painting SARS-CoV-2 and Antibodies depicts a cross section of the novel coronavirus that causes COVID-19. The coronavirus’s characteristic spike proteins (magenta) bind to cells, infecting them. The RNA genome inside the virus is safely packaged by many copies of the nucleocapsid protein (blue). The virus is surrounded by blood plasma molecules, including antibodies (bright yellow). A few of the antibodies are shown binding to spikes, neutralizing the virus and preventing infection.
Arts Lab
Painting a Portrait of SARS-CoV-2
Art can be a tool for understanding the inner workings of cells.
David S. Goodsell
All paintings courtesy of RCSB Protein Data Bank, https://pdb101.rcsb.org/sci-art/goodsell-gallery/
A main purpose of science is to seek out things that are not well understood, and then to find ways to strengthen our understanding of them. This process is part of what makes the work so much fun: We get to explore and build up, discovery by discovery, increasingly detailed knowledge of the natural world. Throughout my career as a biologist, I’ve explored many topics, including the structure of DNA, methods of designing new drugs, and the amazing molecular mechanisms used by viruses and bacteria. In every case, I have used art to strengthen and deepen these explorations. For the past 30 years, I’ve been creating illustrations that capture the current state of knowledge of the mesoscale realm of biology—the scale between individual protein and DNA molecules and whole cells—and show how these molecules are arranged in living cells. In these illustrations, I try to imagine what we would see if we could enlarge portions of cells and viruses a million times, so that the individual molecules were visible. Each new artwork begins with a treasure hunt in which I comb the literature for the bits of information that I need to support the illustration. What is the structure of each biological molecule? How many copies of each molecule do I need to include, and how do they fit into the higher structures of the cell? Who interacts with what? For each question that I answer, there are always a dozen more to chase down in my effort to bring these inner landscapes to life. As you can imagine, a healthy dose of artistic license is essential in this process. Although molecules and cells have been studied for decades, many scientific details are not known, at least not to the level of certainty that I need to draw them. The paintings shown here exemplify some of the creativity that I need to employ.
Early in 2020, just as the COVID-19 pandemic was gripping the world, I set the goal of creating a portrait of the novel coronavirus, to help put a “face” on it and reduce the fear of the unknown. At the time, there was little information available about the virus, but the sequence of the viral genome and early micrographs showed that it was quite similar to the 2003 SARS-CoV virus particle (or virion). So I took a huge leap of faith and created an illustration of SARS-CoV, in the hope that it would turn out to be similar enough to stand in for the new virus (see image on page 90). I titled the painting ambiguously as Coronavirus and shared it as part of pandemic-related outreach work at the Research Collaboratory for Structural Bioinformatics Protein Data Bank. The painting was presented for free use on their website and made its way into news articles and popular publications. I also presented a downloadable coloring-book version of the painting on the site and spent an enjoyable month viewing on social media many creative interpretations of the virus made by children and adults. There were limited data available for the molecules that I needed to include in my painting of the virus now known as SARS-CoV-2. For example, the detailed atomic structures of the “spike” protein on the virus’s surface had been determined and were freely available in the Protein Data Bank archive. Electron microscopists had extensively studied the virus, so I was able to use their beautiful micrographs to define its size and the characteristic arrangement of spikes on its surface. However, the RNA genome inside the virion was (and still is) the subject of ongoing study. I had to combine my own molecular modeling expertise with some personal intuition to craft a conception that was more or less consistent with what was known at the time.
Goodsell’s first attempt at painting SARS-CoV-2, simply titled Coronavirus, reflects the early research on the virus’s structure. The light pink proteins between the coronavirus’s magenta spikes are membrane proteins, which are thought to help package the genome when the virus emerges from cells. Inside the virus, the RNA genome is bundled by many copies of the nucleocapsid protein (purple). The virus is surrounded by molecules from the respiratory tract, including antibodies (yellow), mucus-forming mucin (green), and other molecules of the immune system (shades of brown).
Once I had these outlines in place, I used a traditional watercolor approach to create the painting. I have worked out a process over the years that allows me flexibility in designing paintings, and that also can be completed in a
reasonable amount of time (see images on facing page). I start with a complete pencil sketch of all the molecules in the foreground of an image, using a light box and liberal application of an eraser as I sort out the details of how things are
Courtesy of David S. Goodsell
Goodsell began the process of creating his Coronavirus painting (shown completed on the facing page) with a detailed pencil sketch of the virus (left), which he based on the most current research at the time. He transferred the sketch to watercolor paper and applied flat color washes to the spike proteins (center). Then he added the surrounding molecules, using darker colors to create depth. Ink outlines cleaned up and defined the entire image (right).
arranged and how they interact. I then transfer the sketch to heavy watercolor paper using carbon paper and start adding color, one molecule at a time. My watercolor approach is simple, with flat color washes for each molecule. I like the clean look that this approach produces, and it also makes it possible for me to complete a painting in about 12 hours. Once the entire foreground is filled with color, I add background molecules on the fly, using darker colors to give a feeling of depth. Finally, I draw outlines around molecules with a technical pen to clean everything up and give it a finished look. A few months after completing Coronavirus, I was invited to provide a cover for a special issue of Nature devoted to SARS-CoV-2 research. I used it as a welcome opportunity to update my conception of the virus. New cryo-electron micrographs created by leaders in the field, available at the time only through the preprint server bioRxiv, had revealed detailed images of SARS-CoV-2 spikes and the first higher-resolution views of the proteins that package the RNA genome inside the virion. Unlike my earlier SARS-CoV portrait, my updated SARS-CoV-2 has fewer spikes and they’re quite bendy; the genome also now looks like a string of chunky beads rather than a thick rope. Given my increased confidence in this painting, I titled it more specifically SARS-CoV-2 and Antibodies (see image on page 88). This illustration was intended as a hopeful image, with bright yellow human antibodies binding to the spikes and neutralizing the virus. One of the overarching motivations for my artwork is a personal goal: I started creating these illustrations during my postdoctoral work as a way to reconnect with my love of Biology (with a capital “B”). These paintings allow me
to take a break from my specific research topic and celebrate the larger picture of what makes life tick. I hope that the illustrations help other people experience the same “Wow!” moments that I have as I try to capture the complexity and diversity of this scale of life. I had a particularly intense moment like that while drafting the Respiratory Droplet painting, shown on the cover of this issue. To create it, I had to chase down information on the tiny droplets that we expel with every breath. This search led me to much reading about the many (often bizarrely shaped) molecules that line and protect our respiratory tract, and then to try to imagine what these molecules would do inside a droplet. Another impetus driving my artwork is the hope that I will motivate other scientists, as well as the next generation of scientists, to examine their own systems with this same lens of integrative synthesis. Whenever I start a new painting, I need to research answers to a hundred questions, which then lead to even more questions. For example, in the SARS-CoV-2 Fusion painting, I wanted to capture how the spike protein changes shape as the virus fuses with cell membranes and infects a cell (see image on page 92). Detailed structural information was available for the state of the spike in the free virus, and also for what it looks like after the fusion has occurred. But for all the interesting steps in between, the field was, and remains, still largely at the stage of “and then a miracle happens.” I hope that scientists look at my painting and say, “No, I think it happens this way,” and then find some way to test their ideas. The joy and misery of these illustrations is that they become outdated almost as soon as I finish them. But because I am a scientist, this feeling of chasing a moving target
SARS-CoV-2 Fusion depicts the virus in action as it fuses with an endosomal membrane (part of the cellular transport network, green), releasing its genome (purple) into the cytoplasm (blue) of the cell. The large molecules in turquoise are ribosomes, which are beginning to build new viral proteins following instructions from the viral RNA.
In this idealized conception, titled SARS-CoV-2 mRNA Vaccine, Goodsell includes a long RNA strand (magenta) that encodes the viral spike protein. The RNA is surrounded by an engineered envelope (blue) coated with molecules (green) that mask the vaccine particle from the immune system. When injected, the vaccine’s RNA will enter cells and cause them to build inactive pieces of SARS-CoV-2, which then stimulates the immune system to fight the virus.
only motivates me to keep painting, to refine what I’m depicting based on what is known. While I was writing this article, I finished a painting of the new mRNA vaccines as they were just starting to be approved for use (see image above). As you read this, those vaccines are becoming widely available. I’m sure I’ll do paintings of the other vaccine approaches as they come to fruition. That’s the amazing thing about science and “sciart,” the thing that
keeps me picking up the brushes: There’s always something new to explore.

David S. Goodsell is a professor of computational biology at the Scripps Research Institute and a research professor at Rutgers University. His paintings build on three decades of research into the molecular structures of cells and viruses. He creates outreach materials for the Research Collaboratory for Structural Bioinformatics Protein Data Bank, including the popular “Molecule of the Month” series. Website: ccsb.scripps.edu/goodsell
Perspective
Why Do Virtual Meetings Feel So Weird?
Even as online meetings become more common, they can’t always capture the nuances of nonverbal communication and in-person interactions.
Elizabeth Keating
It’s morning in Houston, Texas, for Jeremy and his team of engineers, and nearly evening in a small town north of Bucharest, Romania, for Costa and his team when they all sit down for a phone meeting in 2011. (All names of interviewees have been changed to protect their privacy.) They’ve gotten to the fourth item on their meeting agenda when Jeremy realizes the team in Houston has been operating with the wrong assumption about the number of pumps in the petroleum plant design they are working on. “Who else knows about this?” Jeremy asks Steve, who sits beside him. “Good question,” Steve says. As an anthropologist observing them, I realize it’s a big, amorphous question. There’s way too much ambiguity about who knows what and what information is located where. The gaps have led to at least one serious misunderstanding. Before 2008, everyone on this team shared the same office. Drawings were kept in the squad room, and employees came and went to check the same version of the design. Knowing the number of pumps would have been a no-brainer. Back then, information flowed easily around the office, and when new engineers joined the
team, they learned a lot simply from watching things in the normal course of a day. Now things are different. The COVID-19 pandemic has meant that many more people are, like these engineers, working on complex tasks through computer screens. In 2015, nearly a quarter of respondents to a
survey by the U.S. Bureau of Labor Statistics reported working from home at least some of the time. As of May 2020, that number nearly doubled, with 42 percent of U.S. employees working from home full time, according to a study led by economist Nicholas Bloom of the Stanford Institute for Economic Policy Research. In Europe, where strict lockdowns have closed
many offices and businesses, the European Commission Joint Research Centre found that half of those now working from home had no previous experience with teleworking. How-to guides for remote video communication tend to focus on technical issues, appearance (such as a person’s dress and hair), lighting, the backdrop, and how to manage interruptions and peculiarities within “home” space—the dog barking or a child knocking at the office door. But there is much more to say about this kind of communication. As a linguistic anthropologist, I’ve been interested in technologically mediated interaction ever since the webcam first became widely available in the late 1990s; I wrote a book about it in 2016. I can see why people today are remarking, with some irony, that in spite of the enormous promises of technology to keep people connected while staying safely out of harm’s way, they are experiencing the unexpected disappointments of isolation, exhaustion, feeling out of the loop, and a lack of community.

Adapting to Remote Collaboration

Jeremy and Costa were part of my most recent study on remote communication. I conducted research on their
QUICK TAKE
Remote work is challenging because the nuances of nonverbal communication that are present in face-to-face conversations are not conveyed through virtual meetings.
Informal knowledge that new employees learn quickly in an office setting through watercooler and hallway conversations is difficult to transmit in a virtual environment.
New technologies to virtually recreate casual interactions and the physical cues of nonverbal communication are in development and may make remote work feel more natural.
Switched Design/Shutterstock
Many businesses have transitioned their employees to remote work during the COVID-19 pandemic and have shifted meetings to online platforms such as Zoom, but important aspects of communication are not conveyed through a computer screen.
team while they designed state-of-the-art petroleum processing plants from remote locations between 2008 and 2011, and again in 2016 and 2017. Although some members of this project were in a shared office, the expertise on the team was distributed across 30 to 50 people on at least two continents. Like many people today, Jeremy and Costa depended on email to send documents and to critique and assign tasks. They had weekly conference calls—both audio calls to discuss action items and video calls to review and make changes on the model they were creating. Before I met the engineers and got to know their work lives, the managers of these companies told me that the financial gains they had anticipated from their online international collaboration had not come to fruition, in part because of the increased managerial costs of a far-flung team. But what else was at play? I had earlier heard many engineers describe culture as a factor that could make or break virtual engineering teams—vocabulary, along with unstated assumptions and procedures, can vary from place to place. This problem is all too often underestimated.
From things the engineers said in our introductory meetings, I assumed that these cultural differences, along with difficult logistics such as time zones, would be what made their work so challenging, and would be key to answering Jeremy’s somewhat exasperated question, “Why don’t they just understand what to do?” But I was off the mark. It turned out that a big contributor to frustration and having to do things over again was the loss of two forms of knowledge critical to any team and to any interaction: nonverbal communication and peripheral participation. Most people hardly give a thought to these kinds of communication—they have only recently become hot topics of investigation in online work worlds.

Nonverbal Communication

Those of us working from home now are experiencing quite drastic changes in what the sociologist Erving Goffman described so aptly in his books and articles of the 1960s as “focused interaction.” What Goffman meant by that was not only the exchange of words in an interaction but also the crucial role of
sight. Small glances were not insignificant details to him. As he put it, each of us notices how we are being experienced by other people. Further, each person can be observed noticing that they are being experienced in a certain way by others, and each person “can see that he has been seen seeing this.” With this tongue twister, Goffman is getting at the dynamic, reciprocal way humans keep each other in view. We use what we see to decide how to react and what to do next, and even to predict someone’s future behavior, and this view can include quite a few people. The way we focus on and acknowledge one another has a certain elegant economy to it; it happens while we are paying attention to other things. The ability to see people and to observe them observing us during work builds a framework for a lot of important parts of teamwork. It helps us establish trust, gain commitment, confirm understanding and consensus, and understand emotional states. Goffman had an anthropologist’s eye for the details of human interaction. Most people realize that in the transition from an office to an online remote setting, we are losing critical information about the very people we depend on to get the job done. This reduced form of interaction contributes to the perception of remote work
[Bar chart: preferred number of days working from home after the pandemic, percent of respondents: never, 10.2; rarely, 11.2; 1 day per week, 11.0; 2 days per week, 15.1; 3 days per week, 14.4; 4 days per week, 7.5; 5 days per week, 30.5.]
Nicholas Bloom, Stanford University
In a survey of people who have worked from home during the COVID-19 pandemic, nearly half of respondents said that when it is safe to return, they would prefer to be in the office more often than not. This reluctance to work from home full time might be related to the difficulty of virtual communication. The data is from 22,500 survey responses collected monthly from May 2020 through December 2020.
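The caption’s “nearly half” figure follows directly from the survey percentages in the chart. A quick check (reading “in the office more often than not” as at most two days at home per week, which is an interpretive assumption):

```python
# Percent of respondents by preferred number of days working from home,
# read from the Bloom survey chart above.
preference = {
    "never": 10.2, "rarely": 11.2, "1 day": 11.0, "2 days": 15.1,
    "3 days": 14.4, "4 days": 7.5, "5 days": 30.5,
}

# "In the office more often than not" taken to mean at most 2 days at home.
mostly_in_office = sum(preference[k] for k in ("never", "rarely", "1 day", "2 days"))
print(f"{mostly_in_office:.1f} percent")  # → 47.5 percent
```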
as less satisfying and more exhausting. Even though we often have good visual information about at least some people’s faces with today’s technological sophistication, we don’t have as much information as we are used to from in-person interactions. In Goffman’s terms, we lose the ability to observe others observing us, because we often don’t know where their gaze is directed. (They’re looking at their screens, not us.) The knowledge that is transmitted by our bodies—often referred to as nonverbal communication—has rarely been taken seriously. There’s the name of it, for one thing: How serious can we be about understanding something when we label it by what it is not? There are no departments in universities dedicated to nonverbal or embodied communication, and few books on communication consider it beyond a cursory discussion. We need a rich vocabulary for describing nonverbal communication akin to what linguists use when discussing verbal output. Yet anthropologist and scholar of body language Ray Birdwhistell, who in the 1940s and 1950s was one of the first to seriously study nonverbal behavior, estimated that “facial expression, gestures, posture and gait, and visible arm and other body
movements” make up about two-thirds of the social meaning of a conversation. Bodies talking and listening in conversation are highly expressive. We know this. They give off signals about emotional state, attitude, stance on a topic, whether people are confused or following along, whether they are excited or getting bored, how they are responding to our idea, and many other things that make it possible to work together. But, ironically, many online guides advise us to subdue our gestures or even recommend keeping the upper body motionless. In many cases, people turn their cameras off or use audio-only technologies. One day I was observing an audio conference call between the engineers in Houston and those in Romania. The topic was a change in design of a support for platforms. They had all come to an apparent agreement. “YES!” Jeremy had clearly voiced. “Maybe—” Costa started with an idea. “Yes, but—” Jeremy stepped in. Costa kept talking while Jeremy kept trying to finish. “He’s difficult to interrupt!” Bob in Houston said, laughing. He then intervened on Jeremy’s behalf. “Hold on just a second. Jeremy was trying to say something.”
Many online workers are now experiencing how people start to talk at the same time, then apologize, then start to talk at the same time again. People hesitate to speak because they don't want to get into this awkward dance. Most people have never thought about taking turns or how they use eye gaze and head nodding, their own or other people's, to facilitate easy conversations. Those who study the split-second timing and rhythmic coordination of turn taking, as conversation analysts do, describe how these split-second signals work—the ones Goffman was talking about, from gaze to the inclination of the body. People even notice if someone takes a certain kind of in-breath when they want to bid for a turn without interrupting someone else. These are all signals that, despite our high-resolution screens, are not yet easy to pick up online. The engineers, being engineers, told me they were so frustrated by their constant failure to coordinate the "simple" act of turn taking in their conversations, a failure they'd never experienced before, that they wanted to design a special turn-taking button—a kind of technological version of a baton.

Recreating the Watercooler
A second critical loss the engineers experienced was what linguistic anthropologists call peripheral participation. This term refers to the process by which newcomers become acclimated to existing communities of practice, an idea that was first developed by those studying theories of learning. The term peripheral indicates diminished status (like the "non" in nonverbal), but it's actually one of the most significant aspects of a professional environment. Peripheral participation is especially important in mentoring and in fostering inclusivity, as well as in ensuring members of a team are on the same page.
In one international institute where I was a researcher, the room where we sought out coffee, tea, and the odd slice of birthday cake was renamed the Tiny Conference Room (and a sign was put on the door announcing this change). The new name was an acknowledgment of the many important exchanges that went on there, including the one-on-one intense mentoring that characterizes professional learning, although people were ostensibly “just” getting coffee.
As newcomers to a profession move from being peripheral participants to becoming full participants, they develop "who knows what" directories of knowledge. But after moving to work environments dependent on online communication, the engineers had to "chase knowledge," as they put it. Engineers who worked remotely with the engineers in the United States said, "We miss the hallway stuff." Andrei told me that when he spent time in Houston, he was able to build a mental map of "who knows what." When he got back to Romania, the map quickly became outdated.

In the current pandemic, efforts at fostering worker interactions have to be reinvented. Many pre-COVID-19 offices were full of space for accidental meetings and rich not only in linguistic but also in sensory and contextual cues. Workers have long taken these for granted. It's time to recognize them, not just by their absence but by acknowledging their importance.

I'm an anthropologist who studies human interactions, not an IT developer, but I can see that there are academics and companies out there chasing technological solutions: Communication tools named after the Watercooler and Hallway are intended to facilitate peripheral participation, and enhanced video or virtual reality projections are designed to make people feel they are really "there" with one another. Some developers have proposed a pressure-sensitive chair to record how we move our bodies, linking these movements to signs of interest or disinterest. Other projects have shown that avatars that use gestures improve engagement and communication in work collaborations—even when those gestures are only accidentally triggered by the user. Many of these developments acknowledge that technologies won't achieve what we want them to achieve unless we keep the human lessons in mind. Communication, after all, is about people—not just our words but our gestures, feelings, and mental maps. But there's still a long way to go to incorporate all these developments into more fulfilling and effective virtual work.
Elizabeth Keating is a linguistic anthropologist at the University of Texas at Austin. This article was adapted from a version previously published in Sapiens, sapiens.org.

Bibliography

Bloom, N. 2020. How working from home works out. Stanford Institute for Economic Policy Research Policy Brief. Accessed January 23, 2021. https://siepr.stanford.edu/research/publications/how-working-home-works-out

Bureau of Labor Statistics, U.S. Department of Labor. 2016. 24 percent of employed people did some or all of their work at home in 2015. TED: The Economics Daily. Accessed January 23, 2021. https://www.bls.gov/opub/ted/2016/24-percent-of-employed-people-did-some-or-all-of-their-work-at-home-in-2015.htm

European Commission Joint Research Centre. 2020. Telework in the EU before and after COVID-19: Where we were, where we head to. Science for Policy Briefs. Accessed January 23, 2021. https://ec.europa.eu/jrc/sites/jrcsh/files/jrc120945_policy_brief_-_covid_and_telework_final.pdf

Goffman, E. 1961. Encounters: Two Studies in the Sociology of Interaction. Indianapolis: Bobbs-Merrill.

Goffman, E. 1963. Behavior in Public Places: Notes on the Social Organization of Gatherings. New York: Free Press of Glencoe.

Keating, E., and S. L. Jarvenpaa. 2016. Words Matter: Communicating Effectively in the New Global Office. Oakland, CA: University of California Press.

Lave, J., and E. Wenger. 1991. Situated Learning: Legitimate Peripheral Participation. Cambridge: Cambridge University Press.
Remembrance of Germs Past

How do our immune cells remember invaders for so long, and what could this ability mean for COVID-19 vaccines?

Marc Hellerstein

Both our brains and our immune cells have the capacity for recollection of previous experiences. Memory, whether cognitive or of germs, involves two encounters: an original event that is perceived and coded, and then its future recollection. But the body's ability to recall germs requires one more thing that is taken for granted by cognitive memory: The individual must survive to recall the event. Before the development of vaccination, immune memory came at a cost, like a battle scar from a past conflict. The brilliance of vaccination is that it safely induces protective memory against microorganisms that are potentially fatal. Like cherished childhood memories that never fade, vaccines allow us to recall alien invaders—but without ever having faced their risk in a natural infection.

There has been a long-standing mystery about immune memory, however. All our cells divide and die, so how can immune cells effectively remember germ encounters for 50 or more years? Maintenance of cognitive memory is understandable in view of the long life span of brain cells, but how does it work for immune memory?

I will tell two stories here, one societal and one biologic. The great plagues of the past two centuries—mostly viral—up through today's COVID-19 health and socioeconomic crisis, have been solved by vaccination, which is arguably the greatest discovery in medical history. Vaccines have been around for more than two centuries, but new vaccine approaches and technologies are currently being tried out for COVID-19. Their rationale intersects in interesting ways with what has been learned about the biology of immune protection against infections in general and coronavirus infections in particular.

B-cells and T-cells
The mammalian immune system relies on a collection of specialized cells produced from stem cells. The adaptive immune system—a germ-specific system that responds and adapts to disease exposures—can be divided into two arms (see graphic on page 104). The humoral immunity arm includes B-cells, which produce proteins known as antibodies that bind to molecules referred to as antigens. Antibodies have highly specific functions that include neutralizing a microorganism or facilitating its engulfment by other cells. Antibodies can be imagined as targeted missiles against microorganisms, shot out from a distance by B-cell missile launchers. The other arm, commonly known as cell-mediated immunity, is composed of another type of blood cell: T-cells. These cells also recognize specific antigens but carry out more complex and interactive functions to defend us.

Vaccines, just like natural infections, can induce both arms of the immune system. The importance of B-cells versus T-cells differs for each vaccine and for each infectious agent. Antibodies produced and secreted by B-cells attack foreign germs while they are circulating through the blood, but T-cells often play key roles in the control of infections that take place inside our cells. Antibodies cannot gain entry into cells to battle viruses or bacteria that cause trouble there. By contrast, certain T-cells have the remarkable ability to identify
cells in our body that contain foreign molecules and destroy these infected cells. This process includes recognition of viral proteins produced inside our cells, an ability that will be important when considering emerging vaccine approaches against COVID-19.

T-cells are complicated, and there are different subtypes with different functions. The so-called effector T-cells divide rapidly and attack an invader in several ways, but then they die. There are also longer-lived T-cells, called memory T-cells, that stay around and "remember" the invader for later encounters. Unveiling T-cell memory is what got me involved in this field, in the midst of a viral plague that had killed millions.

QUICK TAKE
Devastating diseases from previous centuries that ravaged children and families are currently of little concern, largely because of vaccination practices.
Tracking the fate of immune cells after an infection, either a mild form from vaccines or from natural encounters, can unveil the secrets of lifelong protection.
Different immune cells contribute to immunity, and understanding their various roles is key when developing vaccines, which can target different systems.

[Figure caption:] Vaccination had become popular in England in the early 1800s, but even then it was mistrusted. This 1802 caricature shows a satirical vaccination scene at the Smallpox and Inoculation Hospital at St. Pancras in London. Edward Jenner, who pioneered the smallpox vaccine using a related bovine virus, is shown with a knife injecting a frightened woman. Those who have been vaccinated are shown with deformations and cow parts emerging from their bodies. In 1803, Jenner coined the word "vaccine," which derives from the Latin word vacca, meaning cow. (Wellcome Collection/CC BY 4.0)

HIV/AIDS and T-cells
In the 1990s, the biggest health crisis in the world was HIV/AIDS. David Ho, a physician and virologist now at the Aaron Diamond AIDS Research Center in New York City, had proposed, based on indirect evidence from numbers of cells and viruses in the bloodstream, that the virus known as HIV (human immunodeficiency virus) causes rapid death of infected T-cells and, over time, this infection depletes immune reserves and leads to immune deficiency. The life and death of T-cells was in the headlines! Based on this idea, the focus of treatment was to suppress replication of the virus to prevent depletion of T-cell reserves. But there were no direct data about death rates or longevity of T-cells in humans, whether infected with HIV or otherwise healthy. Cell division and death rates could not be measured well at the time, because a safe and accurate approach to study this activity had not yet been developed.

This lack of data bothered me, and it is why I got involved. I had been working on the basic problem of how to measure division rates of cells in people. But I was also working on glucose metabolism in the human liver—specifically looking at how much of the glucose that the liver releases is produced in the body from noncarbohydrate sources. One day it struck me that the trick to tracking cell division and death could be through glucose.

To learn when a cell was born and how long it has lived, it is necessary to tag a molecule that is synthesized when every new cell is made and that stays around throughout its lifetime. The only molecule that fits this description is the DNA in each cell's nucleus. Proteins, lipids, carbohydrates, or other molecules in our cells come and go, but DNA is synthesized only immediately prior to the cell division that creates new cells. In other words, new DNA means a new cell. The opposite occurs when cells die—the old DNA is no more.

DNA is a double-stranded chain of subunits that contain deoxyribose, nucleic acid bases, and phosphate groups. The nucleic acid bases released from old DNA strands in dying cells are sometimes reused by cells to form new DNA, but not always. When scientists had tried previously to tag nucleic acids to follow cell birth and death rates, the results were a mess to interpret. But the deoxyribose portion reliably comes from glucose. Indeed, glucose is avidly converted into deoxyribose when a cell divides, which is the main reason why rapidly dividing cancer cells love to take up glucose and why certain cancer diagnostic tests called PET scans rely on glucose tagging. Put differently, the deoxyribose part of DNA is reliably newly made from glucose and not recycled, unlike the nucleic acid base portion. In addition, DNA is our most precious and protected molecule, and introducing
toxic or radioactive tags into its structure is not acceptable. For these reasons, we had the idea to use glucose tagged with a nonradioactive, stable tracer called deuterium to measure how much new deoxyribose there is in the DNA of a population of cells—and thereby measure how many of the cells have newly divided and how long these new cells have lived. Measuring the birth and death rates of T-cells was now possible in humans.

[Figure caption:] Tracking Immune Memory. DNA is made only prior to cell division and persists throughout the lifetime of the cell. Deuterated glucose ([2H]glucose) and heavy water (2H2O) incorporated into precursors of DNA components can be used to quantify dividing cells as a proxy of their replicating DNA. Only cell death results in loss of labeling; later cell division can also be monitored as dilution, but not loss, of label. The genetic contribution from the original labeled population to the surviving cell population is recorded to calculate the fraction of cells that persist over time. (Chemical diagrams tracing deuterium from glucose-6-phosphate and ribose-5-phosphate, via de novo and salvaged purines and pyrimidines, into ribonucleoside diphosphates, deoxyribonucleoside diphosphates, and deoxyribonucleoside triphosphates, are omitted here. Illustration by Barbara Aulicino.)

Tracking the Fate of Cells
Specific proteins on the surface membrane of T-cells allow us to identify them, but up to that point, tests could not separate the subpopulations of short-lived effector cells from long-lived memory cells. When my colleagues and I tagged DNA in healthy T-cells with glucose tracers, we were able to see two shapes in the test curves in cells identified by surface proteins as "memory–effector" T-cells—with short-lived (effector) cells clearly behaving differently from long-lived (memory) cells. This was an exciting confirmation of the power of our approach.

When we next tried testing the T-cells in people with HIV/AIDS, we found that they were less able to make long-lived cells. With Mike McCune of the University of California, San Francisco, and David Ho, we showed that antiviral therapy slowed down short-lived cell death and allowed more long-lived cells to be made.

This approach helped address a sticky clinical problem in AIDS treatment. At the individual level, many patients on antiviral therapy showed a mixed response: incomplete blockage of viral replication but improvement of T-cell counts. Half the doctors at scientific meetings would say, "You need to keep them on therapy, the T-cells are better!" whereas the other half would say, "These drugs are failing to control the virus, and cause side effects; you need to stop therapy!" Working with Steven Deeks and Robert Grant of the University of California, San Francisco, we were able to tag these cells and show that T-cells had almost normal birth and death rates in patients who failed to fully suppress the virus. These data meant that although HIV was still present, it was not affecting the birth and death rates of T-cells. The implication was that antiviral treatment had selected less destructive HIV strains: Resistant viruses make a trade and lose much of their destructive punch. This conclusion was confirmed by additional approaches and, indeed, continuing therapy in these "drug failures" has since been proven to be the correct clinical approach.

Twenty-five years later, the world is engulfed in the flames of another virus, and once again T-cells are front and center—this time, in the context of vaccines. To get a better grasp on the current situation, it is instructive to review how different vaccines work and remind ourselves of the remarkable effect that vaccines have had on human life.
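The die-away measurement at the heart of this labeling approach reduces to simple first-order kinetics: if a labeled population of cells disappears with a fixed half-life, the fraction still present after any interval follows an exponential curve. The sketch below is illustrative only, not the researchers' analysis code; the 460-day half-life it uses is the value the article reports for yellow fever–specific memory T-cells.

```python
def fraction_surviving(days_elapsed: float, half_life_days: float) -> float:
    """Fraction of an initially labeled cell population still present,
    assuming simple exponential (first-order) die-away."""
    return 0.5 ** (days_elapsed / half_life_days)

# Half-life reported in the article for yellow fever-specific memory T-cells.
HALF_LIFE_DAYS = 460.0

# After one half-life, half of the labeled cells remain.
print(fraction_surviving(460, HALF_LIFE_DAYS))  # 0.5

# After 7 years (about 2,555 days), roughly 2 percent of the originally
# labeled cells are still in circulation -- a small but detectable subset,
# consistent with protective memory lasting a decade or more.
print(round(fraction_surviving(7 * 365, HALF_LIFE_DAYS), 3))
```

Note that a real memory pool with slow ongoing replacement, as described later in the article, would decline even more gradually than this single-exponential picture suggests, which is one reason protective immunity can outlast any individual cell.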
Life Before Vaccination
Two kinds of old cemeteries can be appreciated in the United States: those with small headstones for children and those without. It is hard not to be moved by the little headstones, like ducklings at their parents' feet, in cemeteries from the 1700s and 1800s. In the 19th century, between 40 and 50 percent of children died before finishing puberty. The leading causes of death among American children were infectious diseases, well into the 20th century.

None of the common fatal childhood diseases of the 19th century are on our list of concerns today; their reduction by vaccines is almost too good to believe. There were zero deaths in the United States between 2004 and 2006 from diphtheria, measles, polio, rubella, mumps, and smallpox combined. After correcting for the increased United States population, vaccinations are currently saving 100,000 or more children's lives every year, a number that is five times today's total number of annual childhood deaths, most of which are related to injuries. Complete, 100-percent reductions are almost unheard of in medicine—like a reverse science fiction movie. Some dramatic historical examples can help illustrate the immense societal impact of vaccination. Some of these vaccines work mostly through antibodies; some more through T-cells.

Diphtheria
Modern American physicians do not see cases of diphtheria. Since 2004, there have been only two reported cases of diphtheria in the United States. Before vaccination, there were 206,000 cases of diphtheria in 1921 alone, the year that child deaths peaked from this disease. Unlike the current pandemic caused by a virus, diphtheria is caused by a bacterium, Corynebacterium diphtheriae. Diphtheria affected mostly children, especially those under 5 years old, and had a mortality rate of about 20 percent in this population.
This bacterium infects the nose and throat, is easily spread by coughing or direct contact, and produces a toxin that blocks the synthesis of proteins in the body, eventually killing the cells that line the throat. A thick layer of dead cells develops that can block the airway, resulting in a horrible death from suffocation—hence a "strangling death" by the "black membrane."

Horror stories of diphtheria dot the historical American landscape. In 1903, the O'Marra family attended a funeral in Hartford, Kansas. A cousin may have been a silent carrier of diphtheria. The O'Marras had nine children, but within eight days, six had died. A local doctor gave diphtheria antitoxin—an antibody cocktail that is still used to this day—to the remaining children and the mother, but two other children later died. The family of 11 was reduced to two parents and the middle child.

The diphtheria vaccine (the "D" in the DPT combination vaccine that children receive, which also protects against pertussis and tetanus) does not target the bacterium but its toxin. The vaccine stimulates antibodies that specifically bind to the toxin and neutralize it. In 1901, the first Nobel Prize in Physiology or Medicine was awarded to Emil von Behring for the discovery of serum therapy, based on antibodies harvested from horses in response to injected diphtheria toxin. This so-called antiserum or antitoxin therapy was effective, but the disease had to be diagnosed first.

[Figure caption:] The tracking of immune cell populations is used to follow the birth and death of helper and cytotoxic T-cells in HIV/AIDS patients before and after antiviral therapy. The fast-rising curves represent newly divided cells, whereas the long-term die-away curves show their survival and persistence. Treatment of patients with antiviral therapy slows the birth and death rates of short-lived cells—as seen by the rapid rise and fall of the curves—and allows long-lived cells to survive. As with vaccines, antiviral treatment against HIV ameliorates the fast turnover of short-lived cells (in the first week) and allows for long-lived cells to survive and establish immunological memory. (Plots, omitted here, show molar percent excess (MPE) versus time in days for helper and cytotoxic T-cells, pre- and post-antiviral therapy. Illustration by Efrain Rivera-Serrano.)

Smallpox and a Gift from Cows
With a mortality rate of 30 to 50 percent, smallpox was once considered to be the world's "deadliest infectious disease"—an airborne transmitted viral disease, spread by coughing, sneezing, body fluids, clothing, or bedding. Brought to the New World by colonizers, smallpox killed about 30 percent of Native Americans—sometimes whole villages or tribes—including 80 to 90 percent of the Incan population and 50 to 80 percent of the Aztecs. Almost every colonial town had a "pox cemetery." Whole cities would be quarantined; the xenophobia towards China in the current COVID-19 pandemic is mild in comparison.

Cows get a milder pox disease, and the cowpox virus can be transferred to humans as, for example, coronaviruses do from other species to humans. Milkmaids would get pustules on their hands and arms from the cowpox virus. Edward Jenner, a country doctor in Berkeley, England, overheard a milkmaid saying that she was glad she had cowpox because now she would not get those nasty marks all over her face from smallpox. In 1796, Jenner had the inspiration to take pus from a cowpox lesion and inject it in otherwise healthy humans who, when exposed to smallpox material, successfully did not get ill. The work was scientifically crude and astonishingly unethical by today's standards, but it showed that smallpox infection could be prevented by exposure to cowpox material. In 1801, Jenner published a paper "On the Origin of the Vaccine Inoculation." The word "vaccine" came from vacca, Latin for cow. He predicted that "the annihilation of the smallpox, the most dreadful scourge of the human species, must be the final result of this practice."

Jenner became famous for this breakthrough, but the politics of vaccination were harsh from the start. Ministers warned against interfering with "the Lord's Grand Design." The breach of the species barrier was criticized. George Bernard Shaw called vaccination "a filthy piece of witchcraft." But it sure saved lives! It is estimated that smallpox was responsible for 300 million deaths in the 20th century but, because of widespread vaccination, there has not been a case of smallpox in the world since 1979.

Vaccination Durability
Why has the seemingly crude practice of vaccination had such an effect? Are the same immune cells present decades later that were stimulated by the vaccination, or are memory cells constantly "reminded" as they are replaced by new cells? To understand how the immunologic memory of germs can last for decades, we needed to incorporate two additional technical advances. One required the development of technology that would capture the exquisite specificity of immune cells and isolate the T-cells that reacted to specific foreign molecules such as viral antigens. A laboratory reagent called tetramers made this possible.

T-cells look for antigens that originate from inside other cells and are now being exposed on the cell surface as beacons to immune cells. In this remarkable surveillance system, proteins inside of cells are partially digested into smaller fragments by the cell machinery and are then brought to the cell surface and "displayed" by a group of proteins known as the major histocompatibility complex (MHC). If the displayed fragments are foreign—such as viral proteins—and a circulating T-cell scout recognizes these fragments, it will become activated and initiate an immune response. A tetramer is constructed in the laboratory from four MHC proteins and a selected protein fragment. This technology presents the fragments in a test tube to T-cells isolated from a blood sample, identifying T-cells that recognize it so they can be collected.
[Figure caption:] In 1903 the O'Marra household lost eight of their nine children in a span of weeks after what seems to have been a single exposure event to "black diphtheria" from a visiting relative. The O'Marra graves remain at St. Mary's Cemetery in Hartford, Kansas (left). A number of common 18th- and 19th-century fatal childhood diseases (with years of peak child deaths, shown at right) are of little concern today because of vaccination. (Bar chart, omitted here, shows number of deaths in the peak year for smallpox, 1902; diphtheria, 1921; pertussis, 1934; and polio, 1949–1952. Photograph by Diana Staresinic-Deane; chart by Barbara Aulicino.)

We also needed a way to tag and follow long-lived cells, including memory T-cells, in human samples over long periods of time. Tagging with glucose was impractical and very expensive for more than a couple of days. We needed a new approach.

Looking back, I see that the luck of interdisciplinary work again played a big role in a solution. I still recall vividly my first few weeks in graduate school after I had finished my medical training. I had been entranced by the work of research scientists in the 1950s and 1960s such as Joe Katz, Bernie Landau, and Harlan Woods, who showed that the labeling pattern in a molecule could reveal its biochemical journey. These scientists had shown that hydrogen atoms exchange between glucose and body water at each step of glucose metabolism. I found it magical that an interrogation of a molecule's intimate tracer pattern can tell us where it came from and when it was made in the complexity of the body's metabolism.

It was therefore natural for me to appreciate that deoxyribose in DNA, which comes from glucose, could be tagged by hydrogens in body water. This idea held the answer to the limitations in our T-cell tagging system. So-called heavy water, containing the hydrogen isotope deuterium instead of the usual hydrogen atom, could be used to tag different types of T-cells and monitor their fate and their life span. Deuterium differs only in that it adds a neutron to the atom, making it heavier but not affecting charge or behavior. Heavy water can be safely taken by mouth as a daily drink for weeks or months at home and does not need an intravenous line. In combination, these laboratory advances allowed us to finally ask how vaccines create such long-lasting memories. And the results were even more interesting than we had hoped.

[Timeline figure:] With increased understanding of the interplay between microorganisms and long-term immune memory, the control of infectious diseases is requiring significantly less time from the moment a microorganism is linked to a disease to a vaccine being developed and approved for use.
Egyptian Empire: Smallpox-like rash found on mummies dating to the Egyptian Empire.
1613: Known in Spain as "El Año de los Garrotillos" (The Year of the Strangulations) for its epidemic of diphtheria.
1796: Jenner tests the hypothesis that injection with cowpox could protect a person from smallpox.
1803: Jenner coins the term vaccination.
1883: The bacterium that causes diphtheria is identified.
1894: First polio outbreak in the United States.
1906: The bacterium that causes pertussis is identified.
1907: Diphtheria antitoxin is first isolated.
1908: Poliovirus is shown to confer protection in humans.
1939: A pertussis vaccine is shown to be effective.
1948: The first combined vaccine (diphtheria, tetanus, and pertussis) becomes available in the United States.
1952: Salk tests the inactivated polio vaccine.
1959: Plans to eradicate smallpox worldwide begin.
1980: Smallpox is officially declared to be eradicated from the world.
1988: A global polio eradication initiative is launched.
1994: Polio is declared to be eliminated from the Americas.
2020: A new coronavirus is associated with a cluster of cases of pneumonia; the disease is named COVID-19, and vaccines are approved for emergency use.
(Timeline by Barbara Aulicino; background image by Fernando Zhiminaicela/pixabay.com.)

Testing Yellow Fever Vaccine
The only Nobel Prize for a viral vaccine was given to Max Theiler in 1951 for creating the yellow fever vaccine, which is still used today. The death toll from yellow fever was once about 200,000 people a year worldwide, with death rates between 20 and 50 percent, but the vaccine is known to create T-cells and protective immunity that are long-lasting. On a trip to Atlanta for a symposium on diabetes in 2014, I stopped by the laboratory of Rafi Ahmed of Emory University, who has made many fundamental contributions to the understanding of immunologic memory. We decided to combine forces and use the vaccine against the yellow fever virus as a test case to explore how memory from long-lived T-cells is maintained throughout life.

We gave people who had just been vaccinated vials containing a few tablespoons of heavy water to drink three times a day. We then followed virus-reactive T-cells in the blood after vaccination. Proliferation of effector T-cells that were specific to the yellow fever virus was rapid at first but then stopped cold four weeks after vaccination, resulting in labeled cells that did not die away but displayed a very long life span. The virus-specific T-cells that divided during the first four weeks after vaccination had a very long half-life, about 460 days on average. A small subset of the cells that were produced in the first two weeks after vaccination will thus remain 7 to 10 years later. Other than brain cells, which often last a lifetime, a life span this long is very rare for human cells. New yellow fever–specific T-cells are also produced at a slow rate after the original exposure and remain in circulation for an extended period, allowing protective immunity for 50 years or more.

[Figure:] The Birth and Maintenance of Memory T-Cells. Kinetics of yellow fever–specific cytotoxic T-cells, plotted as enrichment in tetramer+ cytotoxic T-cells (percent of labeling at day 28) against days after yellow fever virus vaccination. (Plot omitted here. Illustration by Efrain Rivera-Serrano.)

Modifications to the DNA in these T-cells—called the epigenetic landscape—contain marks of their previous encounters as effector cells. This fingerprint is maintained for years, or even decades, and keeps genes associated with a rapid recall response accessible for activation
Human donors were immunized with a yellow fever virus vaccine, which induces long-term immunity, and drank small amounts of deuterated water (2H2O) for the next 14 days. The blue highlighted area represents the labeling of newly produced yellow fever–specific cytotoxic T-cells obtained from blood samples during the first 28 days after vaccination. Plotted lines represent different patients and the survival of the labeled cells over time. The average half-life of label enrichment die-away was calculated to be approximately 460 days, and their rate of cell division to be less than once every year. These results indicate that the long-lived population of T-cells originated from the same T-cells that underwent extensive proliferation during the first two weeks after vaccination. when they are needed. All the while, the cells slowly evolve into a unique cell type. These cells display specific molecules that are usually present in neverdivided naive cells, but show a rapid proliferation in response to viral antigens and have unique gene expression patterns. Indeed, these cells were different from any cells that had been previously recognized in immunology. Like wizened old soldiers resting in the field under camouflage and accruing new skills but keeping the imprint of their early life history, these “stemlike” memory T-cells are poised to spring into action at the first sign of virus because they retain a personal history of having been in a battle. This brilliant combination allows rapid re-creation of attacking functions in a subset of long-lived T-cells. Moreover, it was now possible to study whether long life span and durability will exist after vaccination by tagging the cells and following their life span for a few months—a technique that adds a powerful tool to the field of human vaccinology. COVID-19 Vaccines Current vaccine efforts for SARS-CoV-2 have generally focused on antibody
response and on the coronavirus spike protein as the immunizing antigen. This approach raises important considerations for long-term protective immunity, however, given what we know about other members of this class of viruses. For one thing, compared with T-cells, antibody response has not been a sensitive or long-lasting indicator of natural coronavirus infections. T-cell response has proven to be a better marker than antibody response after SARS and MERS—both coronavirus diseases. In survivors of the 2003 SARS outbreak, only about 50 percent showed detectable antibodies after three years and none at six years. For MERS survivors, antibody response is low or absent in cases of mild disease. Antibodies found in COVID-19 patients also appear to be short-lived. In sharp contrast, virus-specific T-cells are essentially universally induced in coronavirus infections and were still present 17 years later in SARS survivors. Another consideration is that a strong antibody response is often correlated with more severe coronavirus disease, whereas T-cell responses are correlated with less severe disease. MERS survivors
The Arms of Our Immune System
[Diagram: In humoral immunity, an antigen binds the B-cell receptor of a naive B-cell, which—helped by cytokines from a helper T-cell, itself activated by an antigen-presenting cell—differentiates into antibody-secreting plasma cells and memory B-cells. In cell-mediated immunity, a naive cytotoxic T-cell recognizes an MHC-antigen complex on an infected cell through its T-cell receptor, becomes an effector T-cell that destroys infected cells, and carries epigenetic marks into its memory state. Memory B-cells, memory helper T-cells, and memory cytotoxic T-cells are reactivated by repeat exposure to the antigen and confer future immunity to the virus.]
Two different immune responses can occur, depending on the type of invading microorganism. The humoral response involves B-cells that recognize antigens or free-roaming microorganisms circulating in the body. The activation and differentiation of naive B-cells into plasma cells and memory B-cells is stimulated by helper T-cells and their cytokines. The plasma cells produce protective antibodies, whereas memory B-cells provide future immunity to encounters with the microorganism. In cell-mediated immunity, cytotoxic T-cells become activated when they recognize foreign antigens produced inside cells—such as viral fragments from infected cells—that are presented to their T-cell receptors through a major histocompatibility complex (MHC; inset). As with naive B-cells, helper T-cells costimulate the differentiation of cytotoxic T-cells into effector T-cells that can specifically recognize infected cells and promote their destruction. These cells can remain in circulation for years as memory T-cells that can be activated once again by exposure to the antigen.

with higher antibody levels had required longer critical care stays with more ventilator support than subjects with no detectable antibodies. Similar findings have been reported in COVID-19, suggesting that effective T-cells clear virus rapidly, which reduces disease severity and exposure to virus. Robust antibodies in natural coronavirus infection may reflect a failure of T-cell control.
American Scientist, Volume 109
In addition, antibodies can worsen disease in coronavirus infections. This antibody-dependent enhancement of disease may occur when antibodies are not strong enough to neutralize the virus or perhaps are present in low amounts. Under these conditions, antibodies may actually help a virus get into cells through alternative routes. For example, feline infectious peritonitis, a coronavirus infection in cats, is worsened either by prior administration of antibodies or by vaccination before a challenge infection with the virus. If antibodies have a short half-life, the possibility of enhancement when antibody levels fall to nonneutralizing levels is a concern (in the absence of T-cell protection). There has not yet been any evidence for this after COVID-19 vaccines, however.
Barbara Aulicino
The immune protection against COVID-19 that has so far been evaluated comes from clinical trials covering roughly three months after vaccination. This reflects the acute immune response, not the memory response. Ideally we want to induce long-lasting protection, which requires immune memory. Current COVID-19 vaccine candidates are interesting in this regard. The first two of these approved vaccines are based on mRNA technology and do not involve the injection of a virus or viral proteins, but instead give the necessary recipe to generate an antigenic viral protein inside our own cells. The rationale for such an RNA vaccine comes back to the basic biology of T-cells, which recognize foreign proteins present inside our cells. Other COVID-19 vaccines being tested include so-called adenovirus-based vaccines, which involve the use of a harmless cold virus that replicates in our cells. These vaccines also deliver the coronavirus antigen into our cells to allow our bodies to mount an immune reaction against it. It will be important to ask whether mRNA vaccines and adenovirus-based vaccines induce high-quality virus-specific T-cells against SARS-CoV-2 and how long these will last. Dan Barouch at Harvard University recently showed that depleting cytotoxic T-cells in nonhuman primates reduces vaccine protection against SARS-CoV-2, highlighting the importance of these cells in controlling coronavirus infection. Data from other viral vaccines such as those tested against HIV suggest weaker cytotoxic T-cell responses elicited by RNA vaccines compared with adenovirus-based candidates, but data on the durability of these responses in humans after COVID-19 vaccinations have not been reported. The T-cell tagging technology may help here.
Rachel Rutishauser at the University of California, San Francisco, recently analyzed blood from COVID-19–recovered patients using tetramers and detected coronavirus-specific T-cells at frequencies of more than 2,000 cells in 10 milliliters of blood, the equivalent of about 2 teaspoons. These numbers are promising because the cell tagging method requires only about 5,000 cells, or less than 50 milliliters of blood, to measure the life span of cells.

An Iconic Example of Eradication
To close on a hopeful note, we can look back at history for heartening cases where vaccines were completely successful.

Courtesy Municipal Archives, City of New York; RBM Vintage Images/Alamy Stock Photo

The polio vaccine was a turning point in history. In the 1950s, the U.S. government licensed the vaccine developed by Jonas Salk (right) the same day that results were announced showing it to be 80 to 90 percent effective against paralytic polio. American singer Elvis Presley set an example to ease the public's hesitance toward the vaccine (left).

The polio vaccine was a watershed moment in American history. Children who contracted polio would wake up with symptoms of a common cold but soon could not move a leg because of localized paralysis. Some children would progress to bulbar polio, which affected the nerves for breathing and swallowing, typically ending with death. Even in survivors, partial recovery meant atrophied limbs for life. The United States reported about 58,000 cases and 3,000 deaths from polio in 1952 alone. A year later, Jonas Salk, a young doctor in Pittsburgh, inactivated the virus chemically and used it as an immunization tool. He vaccinated himself and his family as part of the publicity effort, along with some 2,000,000 school-age children in a placebo-controlled trial. When the dramatic reduction in polio cases was reported, the American public celebrated the news like a war victory. In opinion polls of the era, Salk was ranked between Gandhi and Churchill as a figure of admiration in modern history. There have been no reported cases of poliomyelitis in the United States since 1979, and in the Western Hemisphere since 1991. It is hard to imagine a greater medical miracle than what occurred with paralytic polio.

Effective Vaccines
Free-flowing travel and commerce would collapse if pandemics and plagues—mostly viral and without effective treatment—ravaged the world every few years. Where would we have been for the past year without the hope of an effective COVID-19 vaccine that provides long-term immunity?
Discussions of the technological advances that led to industrial and postindustrial society usually invoke examples such as electricity, the internal combustion engine, telecommunications, sewage systems, airplane flight, the assembly line, computers, and the internet. Vaccination should be included prominently on this list, with a tip of the cap to T-cell memory. This advance, so widespread today, can be easily forgotten until we, once again, experience the devastation in real time of a lethal, untreatable infectious plague. Although we may as a people forget, the fact that our T-cells remember has been one of the great protectors of humanity.

Bibliography
Ahmed, R., and R. S. Akondy. 2011. Insights into human CD8+ T-cell memory using the yellow fever and smallpox vaccines. Immunology and Cell Biology 89:340–345.
Akondy, R. S., et al. 2017. Origin and differentiation of human memory CD8 T cells after vaccination. Nature 552:362–367.
Busch, R., et al. 2007. Measurement of cell proliferation by heavy water labeling. Nature Protocols 2:3045–3057.
McMahan, K., et al. 2021. Correlates of protection against SARS-CoV-2 in rhesus macaques. Nature. In press.
Mohri, H., et al. 2001. Increased turnover of T lymphocytes in HIV-1 infection and its reduction by antiretroviral therapy. Journal of Experimental Medicine 194:1277–1287.
Marc Hellerstein is a professor and the Dr. Robert C. and Veronica Atkins Chair in Human Nutrition at the University of California at Berkeley, and professor of endocrinology, metabolism, and nutrition in the department of medicine at the San Francisco General Hospital, University of California, San Francisco. Email: [email protected]
From a Swinging Chandelier to Global Positioning Systems Calculus has unraveled mysteries that puzzled scientists for centuries, and it has led to technologies they never would have imagined. Steven Strogatz
Legend has it that Galileo Galilei (1564–1642) made his first scientific discovery when he was a teenage medical student. One day, while attending Mass at Pisa Cathedral, he noticed a chandelier swaying overhead, moving to and fro like a pendulum. Air currents kept jostling it, and Galileo observed that it always took the same time to complete its swing whether it traversed a wide arc or a small one. That surprised him. How could a big swing and a little swing take the same amount of time? But the more he thought about it, the more it made sense. When the chandelier made a big swing, it traveled farther but it also moved faster. Maybe the two effects balanced out. To test this idea, Galileo timed the swinging chandelier with his pulse. Sure enough, every swing lasted the same number of heartbeats. This legend is wonderful, and I want to believe it, but many historians doubt it happened. It comes down to us from Galileo's first and most devoted biographer, Vincenzo Viviani (1622–1703). As a young man, he had been Galileo's assistant and disciple near the end of the older man's life, when Galileo was completely blind and under house arrest. In his understandable reverence for his old master, Viviani was known to have embellished a tale or two when he wrote Galileo's biography years after his death.
But even if the story is apocryphal (and it may not be!), we do know for sure that Galileo performed careful experiments with pendulums as early as 1602 and that he wrote about them
World History Archive/Alamy Stock Photo
This 19th-century engraving depicts the legendary story of Galileo Galilei at the moment of inspiration as he watches the swaying chandelier at Pisa Cathedral. Physicists and mathematicians have built upon Galileo’s early observations of pendulums and, with the application of calculus, have developed many of the technologies that shape the modern world.
in 1638 in Two New Sciences. In that book, which is structured as a Socratic dialogue, one of the characters sounds like he was right there in the cathedral with the dreamy young student: “Thousands of times I have observed vibrations especially in churches where lamps, suspended by long cords, had been inadvertently set into motion.” The rest of the dialogue expounds on the claim that a pendulum takes the same amount of time to traverse an arc of any size. So we know that Galileo was thoroughly familiar with the phenomenon described in Viviani’s story; whether he actually discovered it as a teenager is anybody’s guess. In any case, Galileo’s assertion that a pendulum’s swing always takes the same amount of time is not exactly true; bigger swings take a little longer. But if the arc is small enough— less than 20 degrees, say—it’s very nearly true. This invariance of tempo for small swings is known today as the pendulum’s isochronism, from the Greek words for “equal time.” It forms the theoretical basis for metronomes and pendulum clocks, from ordinary grandfather clocks to the towering clock used in London’s Big Ben. Galileo himself designed the world’s first pendulum clock in the last year of his life, but he died before it could be built. The first working pendulum clock appeared 15 years later, invented by the Dutch mathematician and physicist Christiaan Huygens.
QUICK TAKE Galileo Galilei studied pendulums in the 16th century, but their full potential would not be realized until the discovery of calculus a century later.
The sway of a pendulum regulates the timekeeping of metronomes and grandfather clocks, and the concept behind the movement extends to any vibrating object.
Oscillating ions operate on the same principle as pendulums. Their regularity provides atomic clocks with the precision required to operate the Global Positioning System.
Javier Larrea/agefotostock/Alamy Stock Photo
Foucault pendulums, such as this one at Eureka! Zientzia Museoa in San Sebastián, Spain, demonstrate the Earth’s rotation. The pendulum swings along a consistent plane of oscillation while the planet spins below it, moving the room around the pendulum. Lights that turn on as the bob passes by show the pendulum’s path. The consistency of a pendulum’s path also applies to other oscillating objects, from generators to electrons.
Galileo was particularly intrigued—and frustrated—by a curious fact he discovered about pendulums: the elegant relationship between a pendulum's length and its period (the time it takes the pendulum to swing once back and forth). As he explained, “If one wishes to make the vibration-time of one pendulum twice that of another, he must make its suspension four times as long.” Using the language of proportions, he stated the general rule. “For bodies suspended by threads of different lengths,” he wrote, “the lengths are to each other as the squares of the times.” Unfortunately, Galileo never managed to derive this rule mathematically. It was an empirical pattern crying out for a theoretical explanation. He worked at it for years but failed to solve it. In retrospect, he couldn't have. Its explanation required a new kind of mathematics beyond any that he or his contemporaries knew. The derivation would have to wait until the late 17th century, for Isaac Newton and his discovery of the language of differential equations: calculus.
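The square law Galileo found empirically follows from the small-angle formula that calculus eventually supplied: T = 2π√(L/g). A quick numerical check (a sketch, assuming standard gravity g = 9.81 m/s²):

```python
import math

def pendulum_period(length_m, g=9.81):
    """Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

T1 = pendulum_period(1.0)   # one-meter pendulum: about 2.0 seconds
T4 = pendulum_period(4.0)   # suspension four times as long

print(T4 / T1)  # ratio of 2: quadruple the length, double the period
```

Because the period depends only on the square root of the length, making the suspension four times as long doubles the vibration time, exactly as Galileo reported.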
Paradigms of Oscillation Galileo conceded that the study of pendulums “may appear to many exceedingly arid,” although it was anything but that, as later work showed. In mathematics, pendulums stimulated the development of calculus through the riddles they posed. In physics and engineering, pendulums became paradigms of oscillation. Like the line in William Blake’s poem, “Auguries of Innocence,” about seeing a world in a grain of sand, physicists and engineers learned to see the world in a pendulum’s swing. The same mathematics applied wherever oscillations occurred. The worrisome movements of a footbridge, the bouncing of a car with mushy shock absorbers, the thumping of a washing machine with an unbalanced load, the fluttering of venetian blinds in a gentle breeze, the rumbling of the Earth in the aftershock of an earthquake, the 60-cycle hum of fluorescent lights—every field of science and technology today has its own version of to-and-fro motion, of rhythmic return. The pendulum is the granddaddy of them all. Its patterns
are universal. “Arid” is not the right word for them. In some cases, the connections between pendulums and other phenomena are so exact that the same equations can be recycled without change. Only the symbols need to be reinterpreted; the syntax stays the same. It's as if nature keeps returning to the same motif again and again, a pendular repetition of a pendular theme. For example, the equations for the swinging of a pendulum carry over without change to those for the spinning of generators that produce alternating current and send it to our homes and offices. In honor of that pedigree, electrical engineers refer to their generator equations as swing equations. The same equations pop up yet again, Zelig-like, in the quantum oscillations of a high-tech device that's billions of times faster and millions of times smaller than any generator or grandfather clock. In 1962 Brian Josephson, then a 22-year-old graduate student at the University of Cambridge, predicted that at temperatures close to absolute zero, pairs of superconducting electrons could tunnel back and forth through an impenetrable insulating barrier, a nonsensical statement according to classical physics. Yet calculus and quantum mechanics summoned these pendulum-like
oscillations into existence—or, to put it less mystically, they revealed the possibility of their occurrence. Two years after Josephson predicted these ghostly oscillations, the conditions needed to conjure them were set up in the laboratory and, indeed, there they were. The resulting device is now called a Josephson junction. Its practical uses are legion. It can detect ultrafaint magnetic fields a hundred billion times weaker than that of the Earth, which helps geophysicists hunt for oil deep underground. Neurosurgeons use arrays of hundreds of Josephson
junctions to pinpoint the sites of brain tumors and locate the seizure-causing lesions in patients with epilepsy. The procedures are entirely noninvasive, unlike exploratory surgery. They work by mapping the subtle variations in magnetic field produced by abnormal electrical pathways in the brain. Josephson junctions could also provide the basis for extremely fast chips in the next generation of computers and might even play a role in quantum computation, which will revolutionize computer science if it ever comes to pass.

Keeping Time
Pendulums also gave humanity the first way to keep time accurately. Until pendulum clocks came along, the best clocks were pitiful. They would lose or gain 15 minutes a day, even under ideal conditions. Pendulum clocks could be made a hundred times more accurate than that. They offered the first real hope of solving the greatest technological challenge of Galileo's era: finding a way to determine longitude at sea. Unlike latitude, which can be ascertained by looking at the Sun or the stars, longitude has no counterpart in the physical environment. It is an artificial, arbitrary construct. But the problem of measuring it was real. In the age of exploration, sailors took to the oceans to wage war or conduct trade, but they often lost their way or ran aground because of confusion about where they were. The governments of Portugal, Spain, England, and Holland offered vast rewards to anyone who could solve the longitude problem. It was a challenge of the gravest concern. When Galileo was trying to devise a pendulum clock in his last year of life, he had the longitude problem firmly in mind. He knew, as scientists had known since the 1500s, that the longitude problem could be solved if one had a very accurate clock. A navigator
could set the clock at his port of departure and carry his home time out to sea. To determine the ship’s longitude as it traveled east or west, the navigator could consult the clock at the exact moment of local noon, when the Sun was highest in the sky. Because the Earth spins through 360 degrees of longitude in a 24-hour day, each hour of discrepancy between local time and home time corresponds to 15 degrees of longitude. In terms of distance, 15 degrees translates to a whopping 1,600 kilometers at the equator. So for this scheme to have any hope of guiding a ship to its desired destination, give or take a few kilometers of tolerable error, a clock had to run true to within a few seconds a day. And it had to maintain this unwavering accuracy in the face of heaving seas and violent fluctuations in air pressure, temperature, salinity, and humidity, factors that could rust a clock’s gears, stretch its springs, or thicken its lubricants, causing it to speed up, slow down, or stop. Galileo died before he could build his clock and use it to tackle the longitude problem. Huygens presented his pendulum clocks to the Royal Society of London as a possible solution, but they were judged unsatisfactory because they were too sensitive to disturbances in their environment. Huygens later invented a marine chronometer whose
tick-tock oscillations were regulated by a balance wheel and a spiral spring instead of a pendulum, an innovative design that paved the way for pocket watches and modern wristwatches. In the end, however, the longitude problem was solved by a new kind of clock that used a spring-based mechanism, which was developed in the mid-1700s by John Harrison, an Englishman with no formal education. When tested at sea in the 1760s, his H4 chronometer tracked longitude to an accuracy of 16 kilometers, sufficient to win the British Parliament's prize of £20,000 (equivalent to a few million dollars today).

Navigating via Pendulum
In our own era, the challenge of navigating on Earth still relies on the precise measurement of time. Consider the Global Positioning System (GPS). Just as mechanical clocks were the key to the longitude problem, atomic clocks are the key to pinpointing the location of anything on Earth to within a few meters. An atomic clock is a modern-day version of Galileo's pendulum clock. Like its forebear, it keeps time by counting oscillations, but instead of tracking the movements of a pendulum bob swinging back and forth, an atomic clock counts the vibrations of cesium atoms as they switch back and forth between two of their energy states, something they do when driven by microwaves that have a frequency of 9,192,631,770 cycles per second.

NASA/JPL
NASA's Jet Propulsion Laboratory has developed the Deep Space Atomic Clock, which will allow spacecraft in deep space to navigate independently. The clock will keep time by counting the oscillations of mercury ions, applying the same principle used to measure the pendulum swing of a chandelier. These oscillations are so reliable that the clock loses only one second every 10 million years.

Never Get Lost Again
It can be difficult to remember a time before smartphones, when people relied on unwieldy paper maps to find their way through unfamiliar cities. Today, your phone uses the Global Positioning System (GPS) to pinpoint your location and determine the best route from here to there, faster than you can open your glove compartment. A constellation of more than 30 GPS satellites orbits Earth, each carrying atomic clocks that count the pendulum-like oscillations of cesium or—in more advanced clocks—rubidium ions. A terrestrial network of antennae and monitor stations tracks the precise locations of the satellites, which are spread over six orbital planes—this configuration ensures that ground-based GPS devices can view at least four satellites from almost any location on the planet. Your phone calculates its distance from each satellite based on how long the satellites' signals take to reach it. GPS then uses trilateration to identify your location with meter-level accuracy nearly anywhere on Earth.

Though the mechanism is different, the principle is the same. Repetitive motion, back and forth, can be used to keep time. And time, in turn, can determine your location. When you use the maps application in your phone or car, your device receives wireless signals from at least four GPS satellites that are orbiting about 20,000 kilometers overhead. Each satellite carries four atomic clocks that are synchronized to within a billionth of a second of one another. The various satellites visible to your receiver send it a continuous stream of signals, each of which is time-stamped to the nanosecond. That's where the atomic clocks come in. Their tremendous temporal precision gets converted into the tremendous spatial precision we've come to expect from GPS. The calculation relies on trilateration, which works like this: When the signals from the four satellites arrive at the receiver, your GPS gadget compares the time they were received to the time they were transmitted. Those four times are all slightly different, because the satellites are at four different distances away from you. Your GPS device multiplies those four tiny time differences by the speed of light to calculate how far away you are from the four satellites overhead. Because the positions of the satellites are known and controlled extremely accurately, your GPS receiver can then calculate those four distances to determine where it is on the surface of the Earth. It can also figure out its elevation and speed. In essence, GPS converts very
precise measurements of time into very precise measurements of distance and thereby into very precise measurements of location and motion. GPS was developed by the U.S. military during the Cold War. The original intent was to keep track of U.S. submarines carrying nuclear missiles and give them precise estimates of their current locations so that if they needed to launch a nuclear strike, they could target their intercontinental ballistic missiles very accurately. Peacetime applications of GPS nowadays include precision farming, blind landings of airplanes in heavy fog, and enhanced 911 systems that automatically calculate the fastest routes for ambulances and fire trucks. But GPS is more than a location and guidance system. It allows time synchronization to within 100 nanoseconds, which is useful for coordinating bank transfers and other financial transactions. It also keeps wireless phone and data networks in sync, allowing them to share the frequencies in the electromagnetic spectrum more efficiently. I’ve gone into all this detail because GPS is a prime example of the hidden usefulness of calculus. As is so often the case, calculus operates quietly behind the scenes of our daily lives. In the case of GPS, almost every aspect of the functioning of the system depends on calculus. Think about the wireless communication between satellites and receivers; calculus predicted the electromagnetic waves that make the technology possible. Without calculus, there’d
be no wireless and no GPS. Likewise, the atomic clocks on the GPS satellites use the quantum mechanical vibrations of cesium atoms; calculus underpins the equations of quantum mechanics and the methods for solving them. I could go on—calculus underlies the mathematical methods for calculating the trajectories of the satellites and controlling their locations, and for incorporating Albert Einstein's relativistic corrections to the time measured by atomic clocks as they move at high speeds and in weak gravitational fields—but I hope the main point is clear. Calculus enabled the creation of much of what made GPS possible. Calculus didn't do it on its own, of course. It was a supporting player, but an important one. Along with electrical engineering, quantum physics, aerospace engineering, and all the rest, calculus was an indispensable part of the team. So let's return to young Galileo sitting in the Pisa Cathedral pondering that chandelier swinging back and forth. We can see now that his idle thoughts about pendulums and the equal times of their swings had an outsize impact on the course of civilization, not just in his own era but in our own. Steven Strogatz is the Jacob Gould Schurman Professor of Applied Mathematics at Cornell University. This article is adapted from his book Infinite Powers: How Calculus Reveals the Secrets of the Universe. Copyright © 2019 by Steven Strogatz. Reprinted by permission of Houghton Mifflin Harcourt Publishing Company. All rights reserved. Website: www.stevenstrogatz.com
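The time-to-distance conversion at the heart of the GPS trilateration described above is one line of arithmetic: distance equals the signal's travel time multiplied by the speed of light. A minimal sketch in Python (the delay values are illustrative, not real satellite data):

```python
C = 299_792_458  # speed of light in meters per second

def range_from_delay(delay_seconds):
    """Convert a signal's travel time into the distance to the satellite."""
    return C * delay_seconds

# A signal from a GPS satellite about 20,000 km overhead arrives
# after roughly 67 milliseconds.
print(range_from_delay(0.067) / 1000)   # about 20,086 km

# Why atomic clocks matter: a single nanosecond of timing error
# already translates into about 0.3 meters of range error.
print(range_from_delay(1e-9))
```

Repeating this calculation for four satellites gives four ranges; because the satellites' positions are known precisely, the receiver can then solve for its latitude, longitude, and elevation, as the article explains.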
Blind Mind’s Eye People with aphantasia cannot visualize imagery, a trait that highlights the complexities of imagination and mental representation. Adam Zeman
Which is darker: the green of grass or the green of a pine tree? Does a squirrel have a short or a long tail? Is a walnut larger than a hazelnut? Do Labradors have rounded ears? To answer questions such as these, you probably summoned up images of the mentioned items to inspect them in your “mind’s eye.” When you enjoy a novel, you likely come away with a visual impression of the characters and scenes described—which can lead to that familiar disappointment if the book is turned into a movie: “He looked nothing like I’d imagined him!” Most of us can conjure images to order: Visualize the Sun rising above the horizon into a misty sky—or your kitchen table as you left it this morning. But it turns out that 1 to 3 percent of the population entirely lack the ability to visualize—a condition called aphantasia—whereas others have hyperphantasia and experience imagery as vivid as actual sight. These imagery vividness extremes are prime examples of invisible differences that are easily overlooked but are salient features of the inner lives of those concerned. Understanding how such differences arise can help us learn about the many ways the mind can implement imagination and mental representation.

The Science of Imagery
Imagery involves the sensory experience of items in their absence: When we visualize a pine tree or the rising Sun, most of us have an experience that is a bit like seeing. But we can form imagery in other sense modalities too: We
“hear” the sound of distant thunder, “feel” the touch of velvet, or imagine running for a bus by engaging auditory, tactile, and motor imagery, respectively. Olfactory imagery is more elusive, but many of us can relish the scent of a rose or shrink from the smell of sewage. To some degree, we can evoke absent emotions, imagining a breath of sadness or a sudden jolt of surprise. Although this article focuses on visual imagery, the broad principles seem to apply to imagery of all types. Experiences of imagery are ubiquitous. They contribute to our recollection of the past (think of your last holiday) and our anticipation of the future (how will you spend next weekend?). They figure in our daydreams and our night dreams. They have been implicated in creative work in both the sciences and the arts. Albert Einstein wrote: “I very rarely think in words at all,” relying instead on “more or less clear images which can be voluntarily ‘reproduced’ and combined. . . .” The novelist Joseph Conrad emphasized the importance of imagery to his craft: “My task . . . is, by the power of the written word to make you hear, to make you feel—it is, before all, to make you see.” Research over the past century has taught us much about the psychology of imagery generally and its basis in the brain. An impressive series of experiments by Stanford University psychologist Roger Shepard, Harvard University neuroscientist Stephen Kosslyn, and others showed that imagery is indeed, as intuition might suggest, an echo of perception. If we are asked to shift our mental gaze between two objects on
a map that we have memorized, we answer more swiftly if they lie close together rather than far apart, as if we were scanning the map with our eyes before we respond; in deciding whether one object is a rotated version of the other, the timing of the decision depends on the extent of the rotation. A beautifully simple observation epitomizes work along these lines: If visualizing is really like seeing, visualizing something bright should cause a constriction of the pupil, as would occur when looking at something bright. Bruno Laeng at the University of Oslo has shown that, indeed, if we switch our mental image from a bright sky to a night sky or a cloudy one, the pupil duly dilates (see figure at the top of page 112). But there is more to imagery than it being simply “weak perception.” Let’s say that I ask you to imagine a tulip. If you succeed—what color was it, by the way?—you engage a whole team of more basic cognitive abilities: You must be awake and attentive, you require your command of the English language to decode the instruction, you need your memory to retrieve your knowledge of tulips and their appearance, you need to use your executive function to orchestrate the whole process, and you use your perceptual system to generate the sense of “looking at” a tulip. This description reminds us that, like any cognitive act, forming an image is a process rather than an instantaneous event. A measurable amount of time passes between receiving the instruction to “visualize a tulip” and becoming able to inspect and manipulate its image in the mind’s eye. On the basis
QUICK TAKE
A condition called aphantasia affects 1 to 3 percent of the population. Aphantasics lack the ability to visualize imagery—a term that includes all the senses, not just sight.
110
American Scientist, Volume 109
A survey about imagery vividness from 1880 was the first to document the condition, but it remained a little-studied phenomenon until the past few decades.
Aphantasia does not imply a lack of imagination, which indicates that the brain has a wide range of methods for cognitive representation, some more abstract than experiential.
Artefact/Alamy Stock Photo
Wonderland by Adelaide Claxton (1841–1927) depicts the mental imagery (here, a smoky imaginary figure) that our brains regularly conjure up while reading or while doing any other task where we are asked to visualize. People with aphantasia cannot create these mental images.
of a series of behavioral experiments like those described above that he and his team undertook in the 1980s, Kosslyn described four key processing steps in our engagement with images. First, images must be generated: This step involves mobilizing information about how things look and using it to create a representation of the visualized item in what he called the “visual buffer,” a broad description for relevant, visually oriented regions of the brain. These
images tend to fade rapidly, probably because the visual brain is designed to deal with rapidly changing scenes. Keeping an image in mind requires maintenance, Kosslyn’s second processing step. If we want to use an image to answer a specific question—does your tulip have a long stem?—we need to inspect it, which is the third step; if we want to manipulate the image, such as twirling our tulip, some transformation is called for, the final step.
It is now almost half a century since one other fascinating line of evidence began to illuminate the science of imagery. Functional brain imaging relies on the simple principle that the brain is like a muscle: When it becomes active in a task, the blood flow to activated regions ramps up. We can observe this change in several ways, most commonly using magnetic resonance imaging that is sensitive to local changes in oxygen concentrations. Two years ago my colleague Crawford Winlove identified 40 studies that had examined brain activation during imagery tasks. The regions he and others have identified (see figure at the bottom of page 112) are in keeping with the cognitive processes required to call a tulip to the mind’s eye—areas in the frontal and parietal lobes linked to cognitive control, attention, and eye movements; areas linked to language processing; areas involved with memory; and visual cortices in the occipital and temporal lobes. The leading edge of such research is now focused on “mind reading,” which is the effort to decode the contents of the mind’s eye using brain-imaging data. Studies examining the time course of acts of visualization in the brain highlight another, intuitively obvious, difference between imagery and perception. When we see, information streams in from the eyes to the brain, driving activity that spreads through the visual system and deep into the brain, allowing us, among other things, to recognize what we see. Visualization is “vision in reverse”: The brain begins with a decision or instruction—“imagine a tulip”—and uses its stored knowledge of appearances to drive activity within the visual system that leads to the experience of imagery. Imagery, in brief, allows us to simulate sensory experience “offline,” enabling at least a partial reenactment of our past encounters with the world. The usual explanation for why we have imagery is that it ultimately enhances our ability to predict the future and act effectively within it.
This purpose may be true, but recent findings somewhat complicate this story.

Rediscovering Aphantasia
Sir Francis Galton was a Victorian scientist with a passion for measurement, which was misapplied in his role in the development of eugenics. But his “breakfast table questionnaire,” published in 1880, was probably the first
[Figure: pupillary change (pixels) for each type of imagined scene: sunny day, night sky, cloudy sky, face in sunlight, face in shade, dark room.] Imagery studies show that imagination can cause physical responses, demonstrating that visualization is connected to vision. In this case, data show that people’s pupil dilation will change as they visualize brighter or darker imagery. Adapted from B. Laeng et al., 2014.
systematic attempt to measure the vividness of imagery. The questionnaire invited participants to “think of some definite object—suppose it is your breakfast table as you sat down
to it this morning—and consider carefully the picture that rises before your mind’s eye.” They were asked to comment on its degree of illumination, definition, and coloring. Galton initially
Adapted from C. Winlove et al., 2018.
Combined results from hundreds of individuals show the brain areas consistently activated while visualizing. Those in the frontal and parietal lobes are linked to cognitive control, attention, eye movements, language processing, and memory, whereas areas in the occipital and temporal lobes are visual. The mesh at lower left allows standardized mapping of brain regions. The arrow at top left indicates the insula, an area involved in sensation that normally would be obscured by other brain regions.
circulated his questionnaire to 100 colleagues, mostly scientists, classifying their responses into those where “the faculty is very high,” mediocre, or “at the lowest.” To his astonishment, many of these “men of science” protested that “mental imagery was unknown to them . . . they had no more notion of its true nature than a colour-blind man, who has not discerned his deficit, has of the true nature of colour.” When he began to sample persons “in general society,” however, he found “an entirely different disposition to prevail. Many men, and a yet larger number of women, and many boys and girls, declared that they habitually saw mental imagery, and that it was perfectly distinct to them and full of colour.” There were also some notable exceptions to the rule among his scientific friends. A certain Charles Darwin, Galton’s much esteemed cousin, responded that his image of the breakfast table included some objects “as distinct as if I had photos before me.” Galton’s questionnaire spawned many descendants. We have used psychologist David Marks’s Vividness of Visual Imagery Questionnaire (VVIQ) in our own work (see figure on page 113). This questionnaire asks for vividness judgments about images of 16 scenes that are rated from “no image at all, you only ‘know’ that you are thinking of the object,” scoring 1/5, to “perfectly clear and as vivid as real seeing,” scoring full marks. Galton’s intriguing observation that for some the “power of visualization was zero” was almost entirely neglected over the following century, despite a great flowering of research on imagery more generally. 
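The VVIQ’s scoring scheme described above is simple to make concrete. The sketch below is illustrative only (the function name and validation are mine, not part of the published instrument); it totals 16 ratings on the 1-to-5 scale, giving a possible range of 16 to 80.

```python
def vviq_total(ratings):
    """Total a set of VVIQ responses: 16 scenes, each rated from
    1 ("no image at all") to 5 ("perfectly clear and as vivid as
    real seeing"). Returns a score between 16 and 80."""
    if len(ratings) != 16:
        raise ValueError("the VVIQ has 16 items")
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("each rating must be between 1 and 5")
    return sum(ratings)

# A respondent reporting "no image at all" on every item scores the
# floor of 16; uniformly vivid imagery scores the ceiling of 80.
print(vviq_total([1] * 16))  # 16
print(vviq_total([5] * 16))  # 80
```

The thresholds used to classify aphantasia or hyperphantasia from such totals vary between studies, so the sketch deliberately reports only the raw total.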
A single American psychologist, Bill Faw, researched the topic in the past few decades, estimating that around 2 to 3 percent of his undergraduate students, like Faw himself, were “wakeful non-imagers.” Occasionally neurologists, starting in 1883 with Jean Martin Charcot, the father of French neurology, encountered patients who lost the ability to visualize following brain injuries or strokes, and a few psychiatrists, such as Jules Cotard in 1882, recognized that mood disorders could cause a dimming of imagery and sometimes its disappearance. But most research examining imagery vividness focused on people with mid-range vividness scores. It suggested that these scores were reasonably consistent over time, but they showed rather modest, unexciting correlations with other psychological abilities.
Barbara Aulicino
The first time I knowingly encountered a person without the ability to create imagery was in 2003. Identified only by the code MX for research purposes, he was a delightful retired surveyor in his mid-60s. Not long before I met him, he had undergone a cardiac procedure. Shortly afterward he realized that he could no longer visualize: He had previously relished his active mind’s eye, for example, calling to mind images of friends, family, and places he had visited as he settled down to sleep. His dreams became avisual after the procedure, and he found that when he read, the novel would no longer create a visual world. His vision, by contrast, appeared entirely unaffected. MX’s account of his unusual symptoms was so compelling that we ultimately studied his brain activation in a visualization task using functional magnetic resonance imaging (see figure on page 114). When MX looked at famous faces, his pattern of brain activity was normal, but when he tried to visualize them, he failed to activate visual brain regions that came into play in our control participants. This difference suggested a satisfying neural
correlate for the subtle but distinctive change in experience that he reported. I found MX’s case fascinating but did not anticipate what followed. The science journalist Carl Zimmer wrote an accessible account of our research in Discover magazine in 2010. Over the next few years, my colleagues Sergio Della Sala and Michaela Dewar and I were contacted by 21 people who recognized themselves in Zimmer’s
The Vividness of Visual Imagery Questionnaire (VVIQ) asks responders to visualize a number of unfolding scenes, such as: The Sun rises above the horizon into a hazy sky, the sky clears and surrounds the Sun with blueness, clouds form and a storm blows up with flashes of lightning, then a rainbow appears. Responders are then asked to rate their imagery from 1 to 5, with 5 being perfectly clear and as vivid as real seeing, 4 being clear and reasonably vivid, 3 being moderately clear and lively, 2 being vague and dim, and 1 being no image at all, just an awareness that you are thinking about this subject.
description of MX—with the key difference that they had never been able to visualize. Their accounts were quite consistent. They usually became aware of this idiosyncrasy in their psychological makeup in early adulthood. It intrigued rather than dismayed them. Most respondents described rather poor autobiographical memory. Most still dreamed visually despite their lack of wakeful imagery. About half of them told us that they lacked imagery in all sense modalities, not just the visual. Some described affected relatives. Oddly, all but two were men. I felt that this phenomenon deserved an appropriate name. The terms used in the neurological literature, such as defective revisualization and visual irreminiscence, were unwieldy. I consulted a colleague trained
in classical philosophy, David Mitchell of the New College of Humanities in London, who suggested that we borrow from Aristotle, one of the Greek fathers of philosophy. Aristotle’s name for the mind’s eye, in his work De Anima (Of the Soul) was φαντασία, or phantasia. We prefixed an a, denoting absence, to coin the term aphantasia, the lack of a mind’s eye. Words are
[Figure: fMRI activation maps comparing patient M.X. with control participants during face visualization; color scales show activation differences.] A patient who lost the ability to visualize after a surgery underwent functional magnetic resonance imaging of his brain activation during a visualization task. When he visualized faces, he activated visual brain regions less than normal, but he activated more strongly brain areas that likely broadly indicate more mental effort. Adapted from A. Zeman et al., 2010.
powerful tools. To my surprise, this simple coinage, published in a letter describing our 21 aphantasic contacts, triggered an avalanche of interest. Widespread press coverage of the word and the phenomenon that it describes has since led around 14,000 people to get in touch by email. The majority have described various forms of aphantasia. Fewer people reported experiences from the opposite end of the vividness spectrum, with exceptionally vivid imagery; we termed this hyperphantasia. We were struck by the strong emotion expressed in many of the messages: “This is unbelievable. I’ve gone my entire life attempting to explain that I cannot picture things in my head”; “. . . a phenomenon that feels like a secret I’ve been keeping my whole life”; “so much of the world now makes sense”; “the craziest thing is knowing that I’m not alone.” The cofounder of Mozilla Firefox, Blake Ross, posted a feisty account of his self-discovery as aphantasic that went viral: “I felt that
transcendent warmth I’ve only known once before, when a dorky high school outcast in Florida stumbled on a group of California programmers who just seemed to ‘get him.’ It’s the feeling of finding your people.” We had connected with an unmet need. Our flooded email inbox created a unique opportunity for further research. With the help of a team of student interns from the University of Exeter, I responded to the emails pouring in with a request to complete the VVIQ and another imagery questionnaire exploring a range of related topics. These questions asked, for instance, how and when people recognized their “difference,” whether they dream in images, and whether they have trouble recalling episodes from their personal past. This exercise has allowed us to give a preliminary description of the psychological significance of imagery extremes from an analysis of 2,000 questionnaire pairs from people with lifelong aphantasia and 200 with hyperphantasia.
Vividness Extremes
Our first finding echoed Galton’s observations about his scientific colleagues. Although there are many exceptions to this rule, aphantasia is associated with a bias toward mathematical and scientific occupations, whereas hyperphantasia is associated with more traditionally creative trades. Next, we identified two areas of difficulty for many people with aphantasia: Approximately one-third report poor autobiographical memory, whereas a (partially overlapping) third report a problem in recognizing faces; these complaints are rare among people with hyperphantasia (see figure at the top of page 115). These findings from the far extremes of imagery vividness harmonize with reports from other researchers that, in general, having more vivid imagery predicts richer, clearer, and less effortful recollection of autobiographical events. Similarly, a previous study of people with congenital prosopagnosia—a lifelong failure to recognize faces—had indicated that
[Figure: bar charts of percentage frequency of self-rated autobiographical memory (good, normal, bad, unsure) and face recognition ability (normal, poor) among people with aphantasia, hyperphantasia, and controls.] About a third of people with aphantasia report poor autobiographical memory, whereas a partially overlapping third report problems recognizing faces; these complaints are rare among people with hyperphantasia. Adapted from A. Zeman et al., 2020.
their visual imagery tends to be faint. A fourth association kept cropping up in our correspondence, although we had not specifically asked about this trait in our questionnaire: Many people with aphantasia reported they were on the autistic spectrum. At the opposite end of the spectrum, hyperphantasia appeared to be linked to synesthesia—the process by which some quality of experience, such as the sound of a vowel, is accompanied by an involuntary, unrelated, secondary experience, such as a color (see “Synesthesia’s Altered Senses” in the July–August 2020 issue). These associations were both intriguing, raising questions about the underlying mechanisms involved, and reassuring: They suggested that rather than being isolated oddities, visual imagery extremes are part of a bigger psychological picture. Our hundredfold larger sample gave us an opportunity to examine two other hints from our previous study of 21 participants. Around 60 percent of people with aphantasia reported visual dreams. This apparent discrepancy makes neurological sense, because the processes within the brain leading to dreaming and wakeful imagery are very different, so it is quite plausible that they should dissociate. People with aphantasia who dream avisually give fascinating descriptions of narrative, conceptual, and emotional dreams. Much as in our smaller study, around half of those with extreme imagery, both high and low, told us that all their senses were affected; for the remainder, some or all the other modalities of imagery were of normal vividness. This disparity suggests that both factors common to all sense modalities and factors specific to each influence the vividness of imagery. Our estimates for the rates of extreme imagery in the community are about 1 to 3 percent for aphantasia, and 3 to 11 percent for hyperphantasia, depending on the threshold chosen for diagnosis. Many of our participants with extreme imagery report that other family members are similarly affected, allowing us to calculate a roughly tenfold increase in risk compared with the general population. It is too soon to judge whether this increase has a genetic basis. We hope to find out, but this effort will probably be hampered by a complexity that may well have occurred to you. Aphantasia is almost certainly not a single entity: It is a variation in experience that can occur in a range of settings—for example, in association with face recognition difficulty—or with lack of imagery in other senses. Its subtypes have yet to be clearly defined; but if they exist, their genetic background may well vary.

[Figure: example drawings from perception and from memory by aphantasics and controls.] Wilma Bainbridge at the University of Chicago and her colleagues took the approach of quantifying aphantasia with drawing. Their study found that aphantasics lack object memory, but do not lack spatial memory. When the participants were shown a photograph of a real scene for 10 seconds and then asked to draw it from memory, aphantasics recalled fewer objects in the scene, but had lower incidence of mistakenly adding objects not in the photograph. However, when aphantasics were then asked to pick out the image of the scene they had been shown from a set of scenes, they did as well as controls, or people with average imagery vividness. When the aphantasics were asked to copy that scene while looking at the image, there was also no difference from controls. Adapted from W. Bainbridge et al., 2021.

The Task of Triangulation
The story I have told you so far has relied on first-person evidence: what our participants have told us about their imagery and other aspects of their mental lives. This evidence is consistent: At around the time that we published our description of the psychological features of imagery extremes, another research group, led by imagery researcher Joel Pearson of the University of New South Wales in Sydney, Australia, described very similar findings. But
you may be skeptical about first-person evidence altogether. People are not entirely reliable witnesses of their mental lives, but descriptions of experience seem a good point of departure for psychological research. If imagery extremes are significant, it should be possible to triangulate these first-person reports with more objective measurements, applying both behavioral tests and neural, brain-based approaches. This work is underway. Pearson had previously used the idea that imagery is like weak perception to develop an ingenious measure of imagery strength. Briefly, his method uses the finding that a visual image formed in the mind’s eye can influence subsequent perception in much the same way as a faint visual stimulus presented externally. The extent of this influence can be measured to provide a relatively objective estimate of imagery strength. In people with aphantasia, the influence is undetectable, suggesting that, indeed, they are failing to form visual images at all. A second elegant experiment from Pearson’s lab is also telling. His team asked people with and without aphantasia to read a series of scary descriptions, such as a swimmer’s view of an approaching shark, which would evoke vivid imagery in most of us. They found that people with aphantasia failed to show the marked change in skin conductance observed in control participants without aphantasia (see figure at right). This difference was not because of an overall reduction in emotion, as the aphantasic participants showed a normal reaction to photos of scary scenes. My team has recently used standard psychological tests to measure memory and imagination in people with aphantasia, average imagery, and hyperphantasia. Tests examining memory for verbal and visual material over intervals of half an hour did not distinguish the groups. But there were marked differences when we compared the richness of the description of personally significant past and imagined events. This result meshes well with the accounts given by some—though not all—people with aphantasia of relatively scant autobiographical memory (see figure on page 117). Neural studies of aphantasia are also at an early stage. We have preliminary evidence that neural connectivity between frontal and posterior visual regions of the brain is stronger in the resting brain in people with hyperphantasia than in people with aphantasia. Other candidate explanations include differences in the area of visual cortices, which Pearson has shown to be related to differences in the strength and accuracy of imagery using his weak-perception technique. There is also evidence that variation in the excitability of visual regions can influence imagery strength. These possibilities are not mutually exclusive, and more than one of these hypotheses may prove correct.

Imagery Versus Imagination
Imagination—defined as our ability to represent, reshape, and reconceive things in their absence—is one of the defining powers of the human mind. Its central importance contributes to the interest in imagery extremes, as imagery is, for most of us, a prominent ingredient of our imaginings. The fortunate opportunity to study large numbers of people with aphantasia and hyperphantasia prompts some general reflections. First, are aphantasia and hyperphantasia “disorders”? In general, I think not. They are intriguing variations in human experience, analogous to synesthesia, which, like aphantasia, affects around 2 percent of the population. Both extremes of imagery vividness have interesting psychological associations, but neither is a barrier to leading a rich, creative, and fulfilling life. I suspect that the two extremes of the vividness spectrum will prove to have balanced advantages and disadvantages. They are, however, occasionally symptomatic of disorder: Aphantasia, for example, can sometimes result from a stroke, a head injury, or an episode of depression. So if someone who has previously had imagery suddenly loses it, it’s reasonable to try to find out why.
[Figure: skin conductance level (SCL) shift from baseline, in microsiemens, plotted against time in seconds for the imagery and perception conditions, comparing control and aphantasic participants.] When viewing a progression of scary images, people with or without aphantasia showed a physiological fear response, measured as a change in their skin conductance level (SCL) that indicates autonomic nervous system arousal. But when read a description of a scary scene, only people with aphantasia lacked a physiological fear response. Adapted from M. Wicken et al., 2019.
[Figure: violin plots of composite scores on the imagination task (atemporal and future conditions) and of mean internal details on the autobiographical task (remote and recent episodes), for people with aphantasia, controls, and hyperphantasia.] Violin plots, named for their shape, show both the range and frequency of data (thick horizontal lines show median scores). For a task in which people are asked to describe imaginary scenes, either in an imagined future (for example, next New Year’s Eve) or without any specific temporal location, the plots show the richness of the narratives produced by people with aphantasia, hyperphantasia, and average imagery vividness. Differences also arise when the same groups are asked to recollect recent or remote episodes from their personal past. Adapted from F. Milton et al., 2020. https://psyarxiv.com/j2zpn
Second, does aphantasia imply an absence of imagination? The answer is a clear no. Among those people who contacted us because our description of aphantasia matched their own experience were the prolific neurologist Oliver Sacks, the pioneering geneticist Craig Venter, Pixar President Ed Catmull, and Mozilla Firefox cocreator Blake Ross. In an unexpected twist, over the past five years more than 100 aphantasic visual artists have been in touch with us, which has allowed my colleagues, artist Susan Aldworth and cultural historian Matthew MacKisack, to mount an exhibition of aphantasic and hyperphantasic art. Imagination is a much richer and more complex capacity than visualization. Aphantasia illustrates the wide variety of representation available to human minds and brains; visual imagery is by no means the only one.

Third, does aphantasia imply a verbal cognitive style? This connection seemed likely to me when I first began to think about this topic. If you lack a mind’s eye, I mused, presumably you will tend to be more interested in sounds and words than visual images. There may be some people with aphantasia for whom this description is true, but for several reasons I am now doubtful that this hypothesis about aphantasia is generally applicable. For one thing, many people with aphantasia love the visual world, and some of them, aphantasic artists, devote their lives to depicting it. For another, about 50 percent of people with extreme imagery report that all modalities of imagery, including imagery of sounds, are vivid in the case of hyperphantasia or dim to absent in the case of aphantasia. This result suggests that a more relevant distinction than verbal versus visual may be abstract versus experiential: For some of us, thought is closer to sensory experience, and for others, it’s more remote. But it’s possible that no single distinction is sufficient to capture the contrast between aphantasia and hyperphantasia, not least because it is unlikely that either is a single entity.

Finally, what is imagery for? Aristotle wrote, “The soul never thinks without a phantasm.” He was wrong; aphantasia contradicts this view. That is not to say that imagery does not play a part in the thinking of those of us who have it. But conscious imagery, at least, does not
seem to be essential. It seems that people with aphantasia, especially those lacking all forms of sensory imagery, must either use more abstract representations—such as those of language—in their thinking or unconsciously draw on imagery. We need more research to tease apart these alternatives. It has been a privilege to share so many insights from our participants’ inner lives. I keep a few favorites pinned to my board. “I’m in the dark here,” wrote one contributor, quoting a famous line from Scent of a Woman; another mused, “There are lots of ways of being human,” surely one of the key messages from this work; a third wrote poignantly, “I’m learning to love without images.”

Bibliography
Aldworth, S., and M. MacKisack, eds. 2018. Extreme Imagination: Inside the Mind’s Eye. Exhibition Catalogue. Exeter: University of Exeter Press.
Bainbridge, W., Z. Pounder, A. F. Eardley, and C. I. Baker. 2021. Quantifying aphantasia through drawing: Those without visual imagery show deficits in object but not spatial memory. Cortex 135:159–172.
Dawes, A. J., R. Keogh, T. Andrillon, and J. Pearson. 2020. A cognitive profile of multisensory imagery, memory, and dreaming in aphantasia. Scientific Reports 10:10022.
Kosslyn, S., W. Thompson, and G. Ganis. 2006. The Case for Mental Imagery. New York: Oxford University Press.
Laeng, B., and U. Sulutvedt. 2014. The eye pupil adjusts to imaginary light. Psychological Science 25:188–197.
Pearson, J. 2019. The human imagination: The cognitive neuroscience of visual mental imagery. Nature Reviews Neuroscience 20:624–634.
Wicken, M., R. Keogh, and J. Pearson. 2019. The critical role of mental imagery in human emotion: Insights from aphantasia. bioRxiv doi:10.1101/726844
Winlove, C., et al. 2018. The neural correlates of visual imagery: A co-ordinate-based meta-analysis. Cortex 105:4–25.
Zeman, A. 2020. Aphantasia. In The Cambridge Handbook of the Imagination, A. Abraham, ed., pp. 692–710. Cambridge: Cambridge University Press.
Zeman, A., et al. 2020. Phantasia: The psychological significance of lifelong visual imagery vividness extremes. Cortex 130:426–440.
Zeman, A., M. Dewar, and S. Della Sala. 2015. Lives without imagery: Congenital aphantasia. Cortex 73:378–380.
Zeman, A., S. Della Sala, L. Torrens, V. Gountouna, D. McGonigle, and R. Logie. 2010. Loss of imagery phenomenology with intact visual imagery performance: A case of “blind imagination.” Neuropsychologia 48:145–155.

Adam Zeman is a professor of cognitive and behavioral neurology at the University of Exeter College of Medicine and Health in the United Kingdom. Email: [email protected]
March–April
117
Scientists' Nightstand
The Scientists' Nightstand, American Scientist's books section, offers reviews, review essays, brief excerpts, and more. For additional books coverage, please see our Science Culture blog channel, which explores how science intersects with other areas of knowledge, entertainment, and society: americanscientist.org/blogs/science-culture.

ALSO IN THIS ISSUE

LAZY, CRAZY, AND DISGUSTING: Stigma and the Undoing of Global Health. By Alexandra Brewis and Amber Wutich. page 120
BLACK HOLE SURVIVAL GUIDE. By Janna Levin. page 122

Artwork by Lia Halloran, from Black Hole Survival Guide. © 2020 by Lia Halloran.
American Scientist, Volume 109
Plutonium Legacies

Pedro de la Torre III

THE APOCALYPSE FACTORY: Plutonium and the Making of the Atomic Age. Steve Olson. 336 pp. W. W. Norton, 2020. $27.95.
In The Apocalypse Factory: Plutonium and the Making of the Atomic Age, science writer Steve Olson walks readers through the development of nuclear weapons science, chronicles the construction and operation of the primary plutonium production complex for U.S. nuclear weapons, follows its product as it undergoes fission above cities and deserts, and gives us a small glimpse of ongoing efforts to deal with the domestic fallout of contaminated bodies, buildings, water tables, and soils. He does this from the perspectives of those involved, while describing the relevant science and engineering in engaging and approachable ways. This embodied perspective becomes particularly compelling as Olson describes the immediate aftermath of the Nagasaki bombing through the eyes of a surgeon who survived the blast.

Olson is attempting to address a tendency in the historiography of nuclear weapons to focus on laboratories such as Los Alamos and the eminent physicists who worked there, rather than on sites of production and the engineers, chemists, and others involved in ushering in the nuclear era in which we still find ourselves. The "factory" referred to in the book's title is the nuclear production complex known as the Hanford Site, which abuts the Columbia River in southeastern Washington State near Olson's childhood home. Although the bomb that was tested at the Trinity Site in New Mexico and the "Fat Man" bomb that destroyed Nagasaki
were designed and assembled at Los Alamos, the plutonium fueling those devices and most subsequent nuclear weapons made in the United States was produced on the banks of the Columbia in reactors cooled by its waters. Olson mostly focuses on Hanford's origins and early years, arguing that the site holds vital lessons for survival and repair in the nuclear era. Elucidating the book's title, he notes in an epilogue that in the Bible, the apocalypse

is not the final battle between good and evil—that's Armageddon. . . . An apocalypse is a revelation—literally an uncovering—about the future that is meant to provide hope in a time of uncertainty and fear.

In Part 1, "The Road to Hanford," Olson guides us through the scientific discoveries and early days of the Manhattan Project. We see events largely through the eyes of chemist Glenn Seaborg, who first isolated plutonium using the highly toxic chemical processes later employed on an industrial scale to separate plutonium from the spent fuel of Hanford's reactors. As Olson relates Seaborg's contributions, he weaves into that story descriptions of some of the most consequential years of nuclear chemistry, the creation in secret of the first nuclear reactor (Chicago Pile 1), and the sometimes-tense relationship between scientists and military officials, which continued into the postwar period.

In Part 2, "A Factory in the Desert," Olson describes the construction and early operation of the Hanford Site, the Trinity test of the first atomic bomb, nuclear weapons design at Los Alamos, and the contentious discussions about how and whether to use these new weapons. We see these events not just through the eyes of nuclear scientists but also from the perspective of workers and military officials. Olson does a particularly good job of introducing physicist Leona Woods, a neglected figure
in the historiography of the Manhattan Project. Woods was a young physicist from Illinois who worked alongside Enrico Fermi on the Manhattan Project and was often the only woman in the room. She was instrumental in solving the xenon poisoning problem that almost led to the failure of Hanford's B Reactor, which was the first industrial-scale reactor in the world. Finally, we also visit Tinian Island (the launching point for the atomic bomb attacks against Japan) and follow the crew of the Bockscar (the B-29 bomber carrying "Fat Man") as they fly toward Nagasaki.

Inside this 900-foot-long windowless concrete building, one of three such edifices at the Hanford Site, chemical processes developed by Glenn Seaborg and others were used to dissolve uranium from spent fuel elements; plutonium was then separated from the resulting liquid using centrifuges. All of this work had to be done by remote control. From The Apocalypse Factory.

Part 3, "Under the Mushroom Cloud," is the most powerful portion of the book. In it, Olson describes the bombing of Nagasaki and its aftermath as seen through the eyes of Raisuke Shirabe, who was a surgeon at the Nagasaki Medical College Hospital. We witness his initial confusion and terror as his office becomes a blinding whirlwind when the bomb is detonated, as well as his efforts to evacuate and care for survivors, and his reunion with the surviving members of his family. We are with him as patients, including his son, begin to die mysteriously of what turns out, of course, to be radiation sickness.

At times, the reader loses track of the fact that the scientists, engineers, workers, and others whom Olson so sympathetically describes are building weapons of mass destruction that will eventually be used in what many regard as war crimes. Olson doesn't quite take a firm stance on such matters, but by letting us see the consequences of their work through Shirabe's eyes, he eliminates the abstraction with which these events are so often spoken of. Seeing the bomb from the perspective of embodied survivors has a more powerful impact than do casualty figures, descriptions of airplane cockpits, and discussions of war strategy.

In Part 4, "Confronting Armageddon," Olson writes about Hanford's Cold War history, the struggles of the local communities that were exposed to Hanford's emissions, the beginning of environmental remediation efforts at the site, and the work of the B Reactor Museum Association and others who have made Hanford into a site of historical preservation and education. There is a tendency in the telling of Hanford's history—and the history of nuclear weapons more broadly—to focus on origins. The Manhattan Project is always rightfully treated as an important historical event, but the Cold War–era nuclear complexes and the era of remediation and tenuous survival in a world with nuclear weapons and nuclear wastelands are just as relevant and interesting. Indeed, Olson seems to indicate that the most significant "apocalypse" might be found in the much longer era of repair in which we now find ourselves. "We have many more things to clean up in this world," he says.
"Hanford's cleanup, if done persistently and well, could provide an object lesson in making the Earth whole again."

Like Olson, I also find hope and lessons for the future in the monumental effort to address the legacies of plutonium production on the Columbia Plateau. But I wish that he had devoted more pages to this complicated, contentious, and even heroic work of repair. Readers would have benefited, for example, from a deeper dive into the fascinating science and engineering of the remediation happening at Hanford now. However, remediation is nearly always accorded less prestige and attention than bomb making.

There are also many important lessons to be learned from the postwar Indigenous history of the site, but unfortunately, Olson says relatively little about that history. Hanford is the site of major violations of the treaty rights of the Yakama, Umatilla, and Nez Perce nations, who have long-lasting religious, cultural, and historical ties to the area. In the United States, the nuclear complex and nuclear history have impinged on Native American lands and bodies in a number of places, including at the proposed repository at Yucca Mountain in Nevada. Hanford's continuing violation of treaties, the politics of consultation with federal agencies, and the ways in which the arguments of tribal governments are too often ignored—these are topics that deserve a book of their own. Attention to this history would also have the beneficial effect of complicating the pervasive assumption that environmental remediation is a technical project that can be easily divorced from social, cultural, and political issues.

Despite these missed opportunities, The Apocalypse Factory is an excellent and engaging introduction to the history of Hanford and the Manhattan Project. Readers will appreciate Olson's ability to describe complicated scientific and technical matters while connecting them to both biography and history.
He also leaves readers with a vitally important imperative that is too often treated as hopeless idealism: the need to abolish nuclear weapons through the international control of the products of apocalypse factories such as Hanford.

Pedro de la Torre III is an adjunct assistant professor of anthropology at the City University of New York's John Jay College of Criminal Justice and a science and technology studies scholar focusing on Hanford and the politics of environmental remediation.
How Best to Foster Healthy Behaviors

Christopher Hamlin

LAZY, CRAZY, AND DISGUSTING: Stigma and the Undoing of Global Health. Alexandra Brewis and Amber Wutich. 270 pp. Johns Hopkins University Press, 2019. $34.95.
In settings ranging from global health projects to general medicine, well-intentioned practitioners often demean those they seek to help. Health workers may infantilize adults by telling them simple health truths they already know; they may project contempt toward a person simply for having been born into a setting or way of life that the individual is powerless to change; or they may use shame as a tactic for changing behavior. Lazy, Crazy, and Disgusting: Stigma and the Undoing of Global Health examines the effects of such practices in three quite different areas—obesity, mental illness, and domestic sanitation—devoting a section to each. Vignettes of victim-shaming are combined with discussions of health-shaping practices and social psychological research on health norms and the impact of stigmatization—labeling that can leave a person feeling unvalued, undesirable, or unwanted. For obesity, the label is "lazy," for mental illness it is "crazy," and for lack of sanitation it is "disgusting." The book provides an accessible, synthetic, and critical examination of the health effects of shame and stigma, one that was already long overdue when the book was published in 2019. That was before the onset of the current pandemic. The topic is of even more pressing concern now, when the public's health depends so much on the behavior of individuals.

The authors, medical anthropologists Alexandra Brewis and Amber Wutich, explain that they conceived the project that resulted in the book in response to practices used in Community-Led Total Sanitation programs in Bangladesh and elsewhere. Such programs discourage open defecation by representing it as "disgusting," hoping that this shaming will encourage communal investment in sanitary infrastructure such as toilet blocks and piped water. Brewis and Wutich note that these blame-laying approaches, which often ignore the constraints imposed by a lack of resources, have adverse effects on communal relations and the self-esteem of individuals. Such practices are ineffective, they argue, and may undermine the success of the projects.

In highlighting the counterproductive nature of some well-intended approaches, the book makes a legitimate point. But as I will try to show, that point is a limited one. It is impossible to avoid admonition in public health, and it is as important to explore ways of bringing about positive change as it is to avoid measures that are counterproductive.

Often Brewis and Wutich state their conclusions in strong terms, declaring, for example, that "Shame in all its forms needs to be removed from the public health tool kit, because it too easily misfires." But then will come a caveat—in this case, "at the very least until the social stigma and longer-term impacts . . . are adequately tracked and addressed." They recognize handwashing as "a pragmatic, central goal of global public health," but they want readers to appreciate that shaming people for not washing their hands has a "dark side," because it can result in the dehumanization of people unable to access soap and water. In addition to exploring the harm done by shame, the authors campaign for a research agenda that is too often neglected: the integration of cultural
social science into public health development projects and into health care institutions more generally. They call for fuller development of stigma epidemiology, which would lead to routine preparation of “stigma impact statements.” They recognize some of the reasons such integration seldom takes place. Funding agencies give preference to projects tackling well-focused problems that can be addressed quickly and have clear, near-term outcomes; idealistic researchers, committed to change, have those same priorities. But if they lack insight into the local culture, researchers and health workers will find it difficult to get a sense of what it might be like to walk in the shoes of the residents even for a day, much less a year or a lifetime. They may instead resort to a simplistic, demeaning response: “That’s bad; don’t be that way.” A book such as this, aimed at the general reader and focused on the harm done by health-shaming, cannot be expected to explore thoroughly the methodological problems involved in measuring that harm. The authors ask, “Is sanitation for all worth the painful, damaging humiliation and rejection of some?” The question is not merely rhetorical, and if stigma assessments are to be integrated into health planning, costs and benefits will need to be measured. As matters of social science methodology, the difficulties in performing such evaluations are formidable—they involve arriving at definitions, choosing means of measurement, agreeing on null hypotheses, controlling confounding factors, and so forth. Similar difficulties arise in assessing the effects of admonitions in individual lives. I am thinking about null hypotheses when I assess my diet and exercise regimen: Even if I have lost no weight, might I have gained more weight without the regimen? Professional ethics are implicated too. One expects truth-telling from a clinician. At my dental checkup, I want the hygienist to chastise me if I am flossing haphazardly. 
Do I floss more diligently after being chastised? Perhaps. Response to shaming is not an either/or matter. I may respond more positively to one messenger than another, even when the message being delivered is the same. For Brewis and Wutich, shaming and other forms of stigmatization reflect the "powerful moral overtones" that pervade public health, and that fact dismays them. But a moral commitment has been and remains central to a field committed to progressive change; their book is itself an example. They say surprisingly little about the differing moral situations in the three domains they have chosen to discuss. With regard to sanitation, an area in which behaviors affect the transmission of infectious disease, a case can be made for admonitions, but that is not so in the realm of mental illness—there can be no warrant for heaping shame on people who may already see their lives in terms of what they are not able to be. The authors imply that obesity is, in principle, subject to personal choice in the same way that hygienic behavior is; but they go far to represent obesity as intractable, and they emphasize the difficulty of losing weight. They maintain that antiobesity efforts that "aren't overtly blaming or shaming" may nevertheless be objectionable because those efforts "fully embrace the idea of individual responsibility in ways that likely reinforce stigmatizing beliefs." If, as that statement implies, "individual responsibility" does not affect health, then admonitions are clearly moot. However, in many cases (in lifestyle-related diabetes, for instance) individual responsibility clearly does play a role.

And if we jettison admonition, what's left? The authors tend to see health education (on the risks of tobacco, fast food, sugary drinks, and the like) as "soft stigma." In focusing on what to avoid, they overlook deeper questions of how health-related behaviors and sensibilities are shaped. The social science their book presents is mainly empirical, consisting of surveys relating feelings of shame to various variables. The chief theorists on whom Brewis and Wutich rely are the anthropologist Mary Douglas and sociologist Erving Goffman, who produced half a century ago work that remains seminal, particularly for shedding light on shame, a term often used in a way that conflates the expression of admonition with the dehumanizing abjection and hopelessness that may result from admonition. Those are two separate things, and it is the latter that is the real concern here.

The insights of both of these theorists are more ambiguous and unsettling than Brewis and Wutich acknowledge. They agree with Douglas's hypothesis that "hygiene stigma" is about maintaining "statuses and boundaries" rather than infection-avoidance, but they stop short of recognizing, as Douglas did, how important those statuses are. For Douglas, internalizing a sense of shame was concomitant with community life and helped communities avoid unresolvable conflicts. Brewis and Wutich cite Goffman's stark descriptions of the routine dehumanization of patients in asylums, but they ignore his exploration of the power roles that can transform an empathetic and stigma-deploring trainee attendant into someone whose professional identity is predicated on the dehumanization that results from the separation of "us" from "them."

Can and should we hope to escape shame? Even in a world where global media shape identity, shame remains powerful in community interactions. It can be a tool of liberation, sanctioning the claim for bodily autonomy in the #MeToo movement. And I believe that it can have value as a tool of public health: "You ought to be ashamed" is a correct response to people who refuse to wear a mask during a pandemic, asserting their right to exhale viruses into others' faces.

As Brewis and Wutich admit, it is difficult to destigmatize health practices. Nevertheless, they do offer some options for promoting health that avoid creating stigma. Instead of using stigma to keep people with contagious diseases at a distance, you can give them paid sick days that allow them to stay home from work, or you can put them in hospital isolation units. You can carefully craft the language of public health messaging to eliminate stigmatization. You can teach providers how to use empathy to improve provider-patient communication and rapport. And with input from anthropologists, you can individually tailor global health programming for each local community. Obesity can be addressed through structural changes, such as making cities more walkable, eliminating food deserts, and addressing poverty.

"Defecating in the open makes you the talk of the whole village. The stench is . . . Yuck! Ugh! Flies follow everywhere you go." So says this Indonesian poster that aims to improve sanitation by shaming people into building and using toilets. From Lazy, Crazy, and Disgusting.

Lazy, Crazy, and Disgusting often pits the individual against social forces. An alternative focus on humanization within communities might reveal that in a setting where autonomy is valued, admonition and judgment could act to transform complacency and fatalism into realistic aspiration and greater self-esteem.

As a historian of public health influenced by Goffman and Douglas, I am struck by some cases in which shaming has worked. Over time, it proved effective in antispitting campaigns for controlling tuberculosis. A "soft stigma" regarding diet and lifestyle followed publication of the results of the Framingham Heart Study, and it improved heart health. Shaming played an important role in the great sanitary revolution of the 19th century, when people came
to envy and want to emulate those who had running water and toilets. The success of that revolution required not just adequate finance, effective technology, and strong leadership, but also realistic aspirations to have better and healthier lives. Brewis and Wutich recognize, but in my view do not sufficiently emphasize, the great extent to which social and economic circumstances determine whether admonitions are demeaning or constructive. Social epidemiologists following in the footsteps of Richard G. Wilkinson and Ichiro Kawachi have
demonstrated that the stress and helplessness that accompany even the perception of inequality are themselves detrimental to health. When I follow my dog, plastic bag in hand, shame is my guide. I am following Immanuel Kant’s categorical imperative of acting as I wish all would act. I am reinforcing community. Yet the guarantor of my self-esteem and respectability is ultimately that bag. And it is only by accident of circumstance that I need not use it for my own wastes, as residents of many shantytowns around
the world must do. Like me, they are seeking cleanliness and embodying community by doing so.

Christopher Hamlin is a professor of history at the University of Notre Dame and Honorary Professor at the London School of Hygiene and Tropical Medicine. He is the author of "The History and Development of Public Health in Developed Countries," in the Oxford Textbook of Global Public Health, edited by Roger Detels, Martin Gulliford, Quarraisha Abdool Karim, and Chorh Chuan Tan (6th ed., 2015); More Than Hot: A Short History of Fever (Johns Hopkins University Press, 2014); and Cholera: The Biography (Oxford University Press, 2009).
Book excerpt from Black Hole Survival Guide by Janna Levin

Abiding Darkness

Janna Levin

In her new book, Black Hole Survival Guide, as a way of making the science of black holes more comprehensible, astrophysicist Janna Levin uses them as a "fantasy scape" for thought experiments, encouraging the reader to imagine viscerally the experience of a black hole encounter.

Black holes were unverified for decades, unaccepted for decades, absurd, maligned and denied by some great geniuses of the 20th century, until physical evidence of real black holes in the galaxy was discovered. Find one just a few thousand light years away—a light year being the distance light can travel in a year, nearly 10 trillion kilometers, the distance you would travel driving at the average highway speed limit for 10 million years. Take a left at that yellow star and veer toward that star cluster. Wandering at the base of the sky, we are under them. We are above them. Black holes in their abiding darkness are scattered plentifully among the stars, which themselves are scattered plentifully, like somber glitter infiltrating the void. We are in orbit around one in the center of our Milky Way galaxy. We are pulled toward another in the Andromeda galaxy.

I want to influence your perception of black holes, to shuck away the husk a bit, get closer to their darkest selves, to marvel at their peculiarity and their prodigious character. We can take a road less traveled, follow a series of simple observations that
culminate in an intuitive impression of the objects of our attention, which are not objects at all, not things in the conventional sense. . . .

Weightlessness and Free Fall

Black holes are much maligned, depicted unfairly as behemoths when they are often benign and actually by nature quite small. Still, before you travel, you should do your research and consider the hazards. The perils, in fairness, are exceptional and unmatched if you are not careful. As with nature untamed here on Earth, black holes demand respect for safe navigation. After all, you are a trespasser on their territory. Black holes were the unwanted product of the plasticity of space and time, grotesque and extreme deformations, grim instabilities. Honestly, they're not such an anathema to scientists now. Black holes are a gift, both physically and theoretically. They are detectable on the farthest reaches of the observable universe. They anchor galaxies, providing a center for our own galactic pinwheel and possibly every other island of stars. And theoretically, they provide a laboratory for the exploration
of the farthest reaches of the mind. Black holes are the ideal fantasy scape on which to play out thought experiments that target the core truths about the cosmos.

When in pursuit of a black hole, you are not looking for a material object. A black hole can masquerade as an object, but it is really a place, a place in space and time. Better: A black hole is a spacetime.

Imagine an empty universe. You have never seen or experienced such a pristine place, a vast nothingness that is the same everywhere—vast and bare. And flat—still three-dimensional, but everywhere flat. Here is the sense in which an empty universe is a flat space: If you were in flat space you would float on a straight line. Despite the fact that you are not falling in the colloquial sense, free motion is called free fall. You are in free fall as long as you do not fire any rockets, get pulled or pushed—essentially as long as there is only gravity. Just surrender to space. If free motion can be traced by straight lines, and lines that begin parallel never cross, then the geometry of space is flat.

Chances are surpassingly bad that you are in free fall right now. Chances are also surpassingly bad that you are in a flat space, since there is no such place anywhere in our galaxy. Sit on a chair and you are not in free fall. The chair pushes on you to stop your fall. Stand on a floor and you are not in free fall. The floor pushes against your feet to prevent your plummet to street level. Lying in bed we feel heavy. We say gravity pulls us down.
But we have it all wrong. Totally inverted. What you feel is not gravity but rather the atoms in the mattress pushing against your atoms. If only the bed would get out of your way, and the floor, and all the lower floors, you would fall, and falling is the purest uninterrupted experience of gravity. Only in the fight against gravity do you feel its pull, an inertia, a resistance, a heaviness. Give in to gravity, and the feeling of a force disappears. The classic setting for the idea of free fall is an elevator. You are high up in an apartment building in an elevator. You feel a force against your feet. That force is between your feet and the elevator floor and keeps you in the cab. It’s a force between matter. Now, if you are interested in pure gravity, without the interference of interactions between matter, you must get rid of the elevator cab somehow. So you enlist someone to cut the cable. The elevator falls and you with it. During the descent, since you and the floor fall at the same rate, you would float in the elevator. You don’t fall to the floor, because the floor is falling too. You can push off the walls and tumble in the air. You seem weightless, as though you were an astronaut in the space station. You can pour water out of your water bottle and drink the droplets from the air, like astronauts do. You can release a pen, a phone, a rock in front of you, and these float too. Einstein called this profoundly simple observation—that we experience weightlessness when we fall—the happiest thought of his life. The spoiler is that your atoms interact with the atoms in the Earth’s surface, and that would ensure an unhappy end to your free fall when you hit the ground. But that’s not gravity’s fault. The shattering of your bones would be due to other forces, like forces between atoms. (If you were made of dark matter, you could fall right through the Earth’s crust and sail on down.) We can’t run the elevator experiment for long before the Earth gets in the way. 
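Levin's cut-cable elevator can be put in rough numbers. A quick sketch (not from the book; the 100-meter height is an assumed figure, and air resistance is ignored):

```python
import math

g = 9.81  # m/s^2, acceleration of anything in free fall near Earth's surface

# Inside the falling cab, you and the floor share the same acceleration g,
# so your acceleration relative to the cab is zero: you float.
relative_acceleration = g - g  # = 0

# The unhappy end: a drop from an assumed 100 m (roughly 30 stories),
# neglecting air resistance.
h = 100.0                 # m, assumed height of the building
t = math.sqrt(2 * h / g)  # duration of weightlessness: about 4.5 seconds
v = g * t                 # speed at the bottom: about 44 m/s (~160 km/h)
```

The point of the arithmetic is Levin's: the weightlessness itself is perfect for the whole descent; only the arrival is a problem.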
So instead imagine you float far from the Earth, far from the Milky Way. Imagine a fictitious empty space, except for you and your space suit. If you sent tracers, threw a projectile in each of the three spatial directions, those projectiles would free-fall. Now imagine each projectile leaves a helpful trail illuminating its path; soon, a grid of straight lines would be visible. You could see plainly that space was flat—free-falling objects follow straight lines—and that space was empty, except for you and your space suit and your tracers and the luminous trails charting the grid. Gravity is so weak that none of these little pieces have a noticeable impact on the flat emptiness of it all.
The universe is not empty. We are very aware that we are bound to the Earth. The Earth is bound to the Sun and the Sun to the Milky Way galaxy. The Milky Way is bound to the neighboring galaxy Andromeda, both residing in the Virgo supercluster of galaxies. And the Virgo supercluster senses all the other galaxies and all the accumulated energy in our observable universe. So we don't live in a flat, empty spacetime.

Astronauts also don't float in empty space. They can see the Earth spin and the Sun roll along. They are falling and weightless, but on a path we've been accustomed to calling an orbit, an orbit around the Earth in orbit around the Sun in a glacially long orbit around the galaxy. Their paths aren't straight. Their paths are curved into a circle around the Earth sewn into the circle around the Sun sewn into the path around the galaxy, because free-fall paths are curved when the sky isn't empty. Because space is curved by the presence of matter and energy.

Curved Space

You can prove that you live in a curved space and not a flat space from the comfort of your couch by throwing things. Throw something and watch the arc it traces. The projectile will not travel in a straight line. The path paints a curve in space, an arc. All the objects we throw follow curved paths toward the Earth. We could travel around the globe throwing projectiles and all objects we throw from international couches will bow to the ground. We could document the results and draw a three-dimensional grid of the curved paths and thereby construct a map of the shape of the space around the Earth.
The lesson: The Earth deforms the shape of space. And you can map that shape by drawing free-fall paths. Free fall depends on the speed at which you toss something around the planet. Drop a wrench to the Earth and it follows a line straight down. Throw the wrench across the room, and the descent is along an arc, the same arc a car would descend along if thrown at the same speed and in the same direction as the wrench. Throw the wrench faster and the arc gets longer. Throw the wrench fast enough and it will clear the curve of the Earth and launch into orbit. Throw the wrench faster still and it will float away from the Earth forever until caught on the curve of another celestial object, like Jupiter or the Sun, and tumble on a different path. Planets fall around the Sun. No engine pushes them. They trace ovals, always clearing the atmosphere. The Earth falls freely around the Sun, and the Moon falls freely around the Earth. We put lots of human-made objects into orbits, which are just freefall paths. Once the launched spacecraft get where they need to be, the engines are turned off and they can fall forever in orbit around the Sun or, more commonly, the Earth. Some
mission specialists fight against the inclusion of thrusters in the spacecraft designs, in case a decline in funding encourages the space agency to maneuver the satellite out of orbit and incinerate the bounty in the atmosphere. Defunct satellites haunt space, ghostly litter in orbit for the lifetime of the Solar System. The International Space Station (ISS) falls freely around the Earth. The astronauts in the space station float because they are falling like the ill-fated elevator occupants, not because they don’t feel the gravitational effect of the Earth. They do.
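The station’s free fall can be checked with ordinary Newtonian mechanics: for a circular orbit, the speed is v = sqrt(GM/r), and the period is the circumference divided by that speed. A minimal sketch in Python, assuming an ISS-like altitude of about 420 kilometers (an assumption; the station’s actual altitude drifts):

```python
import math

# Standard values: Earth's gravitational parameter GM and mean radius.
GM_EARTH = 3.986004e14  # m^3 / s^2
R_EARTH = 6.371e6       # m

def circular_orbit(altitude_m):
    """Return (speed in m/s, period in s) for a circular orbit at a given altitude."""
    r = R_EARTH + altitude_m
    v = math.sqrt(GM_EARTH / r)  # v = sqrt(GM / r)
    t = 2 * math.pi * r / v      # time to trace one full circle at that speed
    return v, t

# Assumed ISS-like altitude of 420 km.
v, t = circular_orbit(420e3)
print(f"{v * 3.6:,.0f} km/h, {t / 60:.0f}-minute orbit")
```

The result, roughly 27,600 kilometers per hour and a 93-minute circuit, matches the figures quoted for the space station.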
The station is only a few hundred kilometers high and very much under the Earth’s influence. The gravitational effect is traced by curves in the shape of space, and the ISS follows one such natural curve, an ever-falling circular orbit. The astronauts and the space station travel nearly 28,000 kilometers per hour to complete a full circle every 90 minutes, out of the sunlight and into the Earth’s shadow and out again into the Sun’s radiance. They move fast enough that they always clear the Earth’s atmosphere while falling and thereby never crash into the surface.

From Einstein’s happiest thought (we fall in space weightlessly), we deduce with the most ordinary observations (tossed projectiles from the vantage of our Earth-covering couches) that free-fall paths are curves in space. Gravitation is curved spacetime. And that insight was Einstein’s greatest.

Einstein’s unabashed devotion to simplicity shows in the childlike wondrousness of his unembellished, spare thought experiments. He knew he did not understand standard-issue grade-school gravity—how can the Earth pull on the Moon, when they are not touching? Einstein did not understand gravity and neither did anyone else, but other scientists of his time either didn’t appreciate the severity of the failing or they didn’t pause to consider the implications. But because he did not understand gravity and he admitted as much, Einstein challenged the most accepted and elementary aspects of reality. Before Einstein, it was customary to consider gravity to be a force of one body acting on another, but a force that mysteriously did not require actual contact. After Einstein,
American Scientist, Volume 109
the language changed. Gravity was cast as a curved spacetime. How can the Earth pull on the Moon without touching it? It doesn’t. It doesn’t pull on the Moon at all. It exerts no force. Instead, the Earth bends space. And the Moon tumbles freely.

Black Holes Are a Space

Far from a black hole, the curves in space are the same species as those around the Sun or the Moon or the Earth. If the Sun were replaced by a black hole tomorrow, our orbit would be unchanged. The curve we would fall along around a black-hole sun is nearly identical to the curve we fall along around the actual Sun. Of course, the perpetual dusk would be apocalyptically cold and dark. But our orbit would be just fine. The Earth is on average about 150 million kilometers from the Sun, which is about 1.4 million kilometers across. In comparison, a black hole the mass of the Sun would be 6 kilometers across. You can approach much closer to a black hole and remain unharmed than you can to the Sun, despite a black hole’s reputation for voraciously consuming all and sundry. You only really notice a radical difference between the space around a black hole and the space around the Sun when you get within a few hundred kilometers of the center of each. And you can’t get that close to the Sun without incinerating.

Bore inside the solar plasma and the gravitational pull of the Sun eases off. As you approach the center, you leave behind some of the Sun’s mass. The curving of space inside the Sun’s atmosphere becomes more gradual as the mass beneath you diminishes. By contrast, no matter how close you approach to a black hole, the source never diminishes. The curving only gets sharper. Black holes are special because you leave none of the black hole’s mass behind you; it is as though all of the mass is concentrated ahead of you. Always. You can approach infinitesimally close to the center of a black hole and still feel all that mass in front of you.

Keep a safe distance from an unobtrusive black hole and you will neither be torn apart nor sucked up. Black holes just are not the catastrophic engines of destruction they’re portrayed to be, at least not until you veer recklessly close, not until you cross the point of no return, and then admittedly circumstances can get harrowing. Even if you approach boldly close, within several widths of the black hole, you could set up your space station, shut off the engines, fall freely in a stable orbit that takes mere hours to complete, and enjoy the scenery for as long as supplies last.

Excerpted from Black Hole Survival Guide, by Janna Levin. Copyright © 2020 by Janna Levin. Excerpted by permission of Alfred A. Knopf, a division of Penguin Random House LLC. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Janna Levin is a professor of physics and astronomy at Barnard College of Columbia University. She is also director of sciences at Pioneer Works, a center for arts and sciences in Brooklyn.
Her previous books include Black Hole Blues and Other Songs from Outer Space, How the Universe Got Its Spots, and a novel, A Madman Dreams of Turing Machines, which won the PEN/Robert W. Bingham Prize.
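The excerpt’s size comparison (a black hole with the Sun’s mass measuring about 6 kilometers across) follows from the Schwarzschild radius, r_s = 2GM/c². A minimal sketch of the arithmetic, using standard values for G, c, and the solar mass:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # mass of the Sun, kg

def schwarzschild_diameter_km(mass_kg):
    """Event-horizon diameter in km: twice the Schwarzschild radius 2GM/c^2."""
    r_s = 2 * G * mass_kg / C**2  # Schwarzschild radius in meters
    return 2 * r_s / 1e3

print(f"{schwarzschild_diameter_km(M_SUN):.1f} km")  # about 5.9 km
```

Because the radius scales linearly with mass, doubling the mass doubles the size of the horizon.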
March–April 2021
Volume 30 Number 02
Sigma Xi Today A NEWSLETTER OF SIGMA XI, THE SCIENTIFIC RESEARCH HONOR SOCIETY
Important Upcoming Dates for Sigma Xi
From the President Wonder Women
SPRING NOMINATIONS: Who will you nominate for Sigma Xi membership this spring? Membership requires two nominators. Sigma Xi staff can help you find a second nominator if you need one. www.sigmaxi.org/become-a-member

MARCH 1: The Society offers subsidies for chapters in need of financial assistance to host a virtual or in-person visit by a Sigma Xi Distinguished Lecturer. The deadline for Distinguished Lectureships subsidy applications is March 1 at 11:59 pm Pacific time for lecturer visits that will take place between July 1, 2021, and June 30, 2022. Visit www.sigmaxi.org/lectureships to see a list of the most recent cohort of distinguished lecturers.

MARCH 15: Undergraduate and graduate students may apply by March 15 for research funding from the Grants in Aid of Research (GIAR) program. For more information, visit www.sigmaxi.org/giar.

MARCH 26: Students in high school through graduate school may register and submit an abstract by 11:59 pm Pacific time on March 26 to compete in the online Student Research Showcase in April. For more information, visit www.sigmaxi.org/srs.

APRIL 26: Online judging begins for the Student Research Showcase, including a discussion period between students and judges. Judging concludes on May 10, 2021.
Sigma Xi Today is managed by Amia Butler and designed by Chao Hui Tu.
www.americanscientist.org
This issue of American Scientist appears during Women’s History Month. As I submit this letter near the end of December 2020, we have just elected the first female vice president of the United States. We will also soon benefit from the results of the work of two female scientists—we will have two vaccines for COVID-19 that are direct results of research conducted by Dr. Kizzmekia Corbett and Dr. Katalin Karikó. This does not diminish the contributions of all those involved in the vaccines, but during Women’s History Month, I think the contributions of these women are something to be celebrated. The story of both of these women is one of perseverance.

Dr. Corbett was born in rural North Carolina and attended college on a Meyerhoff scholarship at the University of Maryland, Baltimore County. Her research into virology resulted directly from summer internships at a variety of labs. Dr. Corbett’s scientific foundations benefited from the investment by the National Institutes of Health (NIH) and the National Science Foundation (NSF) in summer research programs for undergraduate and high school students. This training and her subsequent postbaccalaureate and postdoctoral training at the National Institute of Allergy and Infectious Diseases (NIAID) positioned her to participate on the team that developed the Moderna COVID-19 vaccine.

Foundational to the Moderna vaccine is the work of Dr. Karikó. Her interest in messenger RNA (mRNA) began when she was working in Hungary in the field of vaccines. After immigrating to the United States, she continued to work on this novel approach to vaccine development, though it was believed at the time to provoke an inflammatory response and was not considered a promising approach. In fact, it has been reported that her promotion to professor at a prominent university was denied for this reason. Nevertheless, she persisted.
She continued to conduct experiments, and later, with her collaborator Drew Weissman, she solved the problem with mRNA. Her mRNA research has now led to the development of the Pfizer vaccine and the foundations for the Moderna vaccine.

Here at Sigma Xi, we are committed to equity and inclusion in the scientific pursuit. We celebrate the national investment in the research enterprise, and we invest in research through our Grants in Aid of Research (GIAR) program. We will continue to showcase the contributions of all members of our scientific community, especially those who have been historically underrepresented. Kudos to these great women scientists, and to all the women whose shoulders they stand upon.

The end of 2020 saw a prominent newspaper columnist questioning Dr. Jill Biden’s right to be called “doctor” because she holds a doctorate, but not an MD. Suffice it to say that the research behind the COVID-19 vaccines was conducted by two female PhDs. Thank you, Dr. Corbett and Dr. Karikó!
Sonya T. Smith
LEADERSHIP
Sigma Xi Election Results

Members of Sigma Xi, The Scientific Research Honor Society, elected their peers to leadership roles in an online election held November 9–22, 2020. Sigma Xi thanks all members who voted or volunteered to run as candidates. Many of the newly elected volunteer leaders will begin their terms on July 1, 2021.

Nicholas A. Peppas was elected to serve a three-year term comprising a year each as Sigma Xi president-elect, president, and past-president. President-elect designee Peppas, a professor at The University of Texas at Austin, is a researcher in biomaterials, drug delivery, and chemical/biomedical engineering with 137,000 citations (H=173). Inducted into Sigma Xi in 1973, he was the Purdue University Chapter president and received the Sigma Xi Monie A. Ferst Award. He is a member of the National Academy of Engineering, National Academy of Medicine, National Academy of Inventors, and American Academy of Arts and Sciences, plus European, Chinese, Canadian, French, Spanish, and Greek academies. He is deputy editor of Science Advances and past president of groups such as the Society for Biomaterials.

Other newly elected directors, associate directors, and representatives are listed below with their Sigma Xi chapter affiliations. Directors and associate directors serve three-year terms, starting July 1, 2021. Committee on Nominations representatives serve three-year terms, starting immediately after the election.

Director: Baccalaureate Colleges Constituency
Pamela K. Kerrigan
College of Mount Saint Vincent Chapter

Director: Canadian/International Constituency
Jennifer Patterson
Swiss Chapter

Director: Northwest Region
HollyAnn Harris
Omaha Chapter

Director: Southeast Region
Dana A. Baum
Saint Louis University Chapter

Associate Director: North Central Region
Kelly E. Crowe
University of Cincinnati Chapter
Associate Director: Southwest Region
Alli M. Antar
Rice University-Texas Medical Center Chapter

Associate Director: Membership-at-Large Constituency
Heather A. Arnett
Membership-at-Large

Associate Director: Research & Doctoral Universities Constituency
Peizhen Kathy Lu
Virginia Tech Chapter

Committee on Nominations: Mid-Atlantic Region Representative
Francis C. Dane
Radford University Chapter
Nicholas A. Peppas
Committee on Nominations: Northeast Region Representative
Julie B. Ealy
Columbia University Chapter

Committee on Nominations: Area Groups, Industries, State & Federal Laboratories Constituency Representative
Bruce A. Fowler
Centers for Disease Control and Prevention Chapter

Committee on Nominations: Comprehensive Colleges & Universities Constituency Representative
Andrea Ashley-Oyewole
Prairie View A&M University Chapter
2019–2020 Sigma Xi Chapter Awards The 2019–2020 Sigma Xi Chapter Award winners were announced at the 2020 virtual Assembly of Delegates on November 5. Finalists are chosen by regional and constituency directors based on information in the chapters’ Annual Reports, and winners are selected by the Committee on Qualifications and Membership.
Chapter of Excellence Award Winners
The Chapter of Excellence Awards are bestowed on chapters for exceptional chapter activity, innovative programming, and true community leadership.

First Place
Boise State University

Second Place
Greenbrier Valley

Third Place (Tie)
University of Cincinnati
Woods Hole
Calgary
Chapter Program of Excellence Award Winners
The Chapter Program of Excellence Awards are bestowed on chapters that organized or hosted a single outstanding program.

First Place
Northeastern University, for Think Like a Scientist, a science outreach program for students in grades 4–8

Second Place
Kansas City, for Rendering the Invisible Visible: Student Success in Exclusive Excellence STEM Environments, a Distinguished Lectureship event with Dr. Robbin Chapman

Third Place
Eckerd College, for Psychedelics: Out of the Ancient and into 2020, a lecture by Dr. David Nichols

Honorable Mentions
University of Alabama at Birmingham, for Alabama’s Worst Natural Disaster, Wetumpka Impact Crater
University of New Mexico, for the Science & Society Distinguished Public Talks Series
Vassar College, for the Science in Your Life Lectures

Fiscal Year 2020 Top-Electing Chapters

Overall Top-Electing Chapters
Brown University
Fordham University
Princeton

Top-Electing Chapters by Constituency

Area Groups, Industries, State & Federal Laboratories Constituency Group
Delta
Vermont

Baccalaureate Colleges Constituency Group
Swarthmore College
Carleton College

Comprehensive Colleges & Universities Constituency Group
Saint Joseph’s University
Fairfield University

Research & Doctoral Universities Constituency Group
Brown University
Fordham University

Canadian/International Constituency Group
Calgary
University of Alberta and Avalon (Tie)
PROGRAMS
Grants in Aid of Research Recipient Profile: Deidra Jordan Grant: $750 (Spring 2019) Education level at time of the grant: PhD student Project description: Jordan’s project uses metagenomics to study microorganisms in the soil and surrounding plants— essentially the soil microbiome—and the effects of elevated salinity and nutrients on those communities. This project’s soil samples come from two-year mesocosm studies examining freshwater and brackish water sites within the Everglades. Saltwater intrusion is a significant issue in the Everglades due to the impacts of sea-level rise. Since microbes are the first responders to environmental changes and provide numerous benefits to the soil, it is essential to understand how microbes respond to these changing conditions. Monitoring shifts in microbial communities also provides insight into the
soil’s health. The second aspect of Jordan’s project applies the knowledge gained from forensic science to complement and enhance intelligence gathering for geographic provenance. How has the project influenced her as a scientist? Jordan says, “It’s essential to understand that every experiment may not come out as expected, which is part of the process.” Working through the research process and collaborating with other scientists on this project have influenced her to be a more astute scientist. Where is she now? Inducted into Sigma Xi in 2019, Jordan is currently a PhD candidate in biology at Florida International University. She is expected to graduate in spring 2022.
Support Student Researchers: Be a Judge for the Student Research Showcase What motivated you when you were a student? Who encouraged you to pursue your goals? How did your mentors inspire you?
You can support the research careers of current high school, undergraduate, and graduate students by volunteering as a judge for the 2021 Sigma Xi Student Research Showcase. This online showcase is a unique opportunity for students to develop effective science communication skills and tailor their presentations to different audiences. Scientists who can successfully communicate the importance of their research to a broad audience are better equipped to secure financial and social support for their work. Participating students build a website containing an abstract, a slideshow, and a short video about their research. From the comfort of their homes or offices, judges will evaluate students’ presentations and leave questions and comments on their websites, from April 26 to May 10, 2021. Judges will evaluate presentations
based on communication, scientific thought, and scientific method. Judges will then collectively select division winners, who will receive up to $1,000 in monetary awards and a nomination for associate membership in Sigma Xi. Qualified Sigma Xi members can sign up as judges by completing the volunteer form at www.sigmaxi.org/judge21. Nonstudent members are qualified if they have experience in the research field that they are judging. Those who can’t judge may still follow the competition and leave comments on participants’ websites at www.sigmaxi.org/srs. The more constructive feedback students receive, the more they will learn. After judging ends, the public will vote for the People’s Choice Award; that student winner will receive an additional $250.
Learn more at www.sigmaxi.org/srs
2021 March–April 127
EVENTS
Virtual STEM Art and Film Festival Showcased Science–Art Collaborations

The 2020 STEM Art and Film Festival, held virtually on November 8, celebrated the visual and performing arts as tools that help teach the public about science, technology, engineering, and math. Featuring more than 40 works of art and more than 20 films, the public festival was the final event of the virtual 2020 Annual Meeting and Student Research Conference held by Sigma Xi, The Scientific Research Honor Society. Juniper Harrower, founder and director of SymbioArtlab, walked away with the festival’s top two honors: Best Artwork and Best Film. Harrower’s installation, The Joshua Tree Soil series, was inspired by scientific findings from her ecological research. To create her
art, Harrower integrates elements of her research such as Joshua tree fibers, seed oil, and the spines of trees as paint brushes. Data is also a key component in Harrower’s work—she uses intricate algorithms from her research to establish all the mechanisms of her artwork. Harrower’s film, A Joshua Tree Love Story, is a stop-motion short film that follows a female scientist as she investigates the effects of climate change on the ecology of Joshua trees and their life cycle. The festival featured artwork and films presented by winners of #SciCommMake, a new competition hosted by Sigma Xi and Science Talk. The program forms collaborations between
scientists and artists to create projects that communicate evidence-based scientific research findings to the public. The agenda also included a panel discussion with the creators of the As Above As Below art exhibit. This collection features six collaborative multimedia exhibits created by teams of artists, astrophysicists, and neuroscientists. This panel highlighted two of the exhibits and discussed different aspects of each collaboration. In addition to the special events, attendees of the virtual STEM Art and Film Festival enjoyed screenings of short films, documentaries, and animations. Judges selected the following artwork and films to receive the 2020 awards.
Film Awards
Best Film Overall: A Joshua Tree Love Story, submitted by Juniper Harrower
Best Documentary: The Missfits, submitted by Ellie Wen
Best Short Film: Origami in STEM, submitted by Claire Morton
Best Animation: Cellular Senescence, submitted by Nitya Ayyagari
People’s Choice Award: COVID-19 Doesn’t Care, submitted by Jeffrey Toney and Stephanie Ishack

Artwork Awards
Best Artwork Overall: The Joshua Tree Soil series, submitted by Juniper Harrower
Honorable Mention: Chaos at the Crest, submitted by Philip Schein
Distinguished Recognition: Zoom in on STEAM, submitted by April Bartnick
People’s Choice Award: Forest of Life, submitted by Julia Miao
Best Performing Arts: Stethophone, submitted by Anika Fuloria and Barbara Nerness
Art and Film Competition Judges
Sigma Xi thanks the following judges for viewing and evaluating each film and artwork submission.

Film Judges
Clare Gibson, Filmmaker, Allegorical Alchemy
Kirsten (Kiki) Sanford, Principal, Broader Impacts Productions, LLC
Myrna Jacobson Meyers, Assistant Research Professor of Biological Sciences, University of Southern California

Art Judges
David Goodsell, Professor of Computational Biology, The Scripps Research Institute; Research Professor, RCSB Protein Data Bank
Kimberly Moss, Associate Professor of Art and Visual Culture, Iowa State University
Fenella Saunders, Director of Science Communications and Publications and Editor-in-Chief of American Scientist, Sigma Xi, The Scientific Research Honor Society
Esther Mallouh, Art Liaison, Keen on Art
You’re Invited
Roots to Fruits: Responsible Research for a Flourishing Humanity
How scientific virtues serve society
November 4–7, 2021
Meeting location: Conference & Event Center Niagara Falls, New York*
Lodging: Sheraton Niagara Falls
Registration opens April 1, 2021, at www.sigmaxi.org/amsrc21

*The plan for an in-person Annual Meeting in Niagara Falls, New York, is subject to change based on federal and state recommendations concerning travel and public events. Updates will be posted on the event website: www.sigmaxi.org/amsrc21.