Bettina Bock von Wülfingen (ed.)
TRACES GENERATING WHAT WAS THERE
This publication was made possible by the Cluster of Excellence Image Knowledge Gestaltung. An Interdisciplinary Laboratory at the Humboldt-Universität zu Berlin (grant number EXC 1027/1), with financial support from the German Research Foundation as part of the Excellence Initiative.
A German language edition is also available: Bettina Bock von Wülfingen (ed.): Spuren. Erzeugung des Dagewesenen, Berlin/Boston 2017, Bildwelten des Wissens 13 (ISBN 978-3-11-047650-7)
Copy-editing: Rainer Hörmann, Jim Baker
Typesetting and design: Andreas Eberlein, Berlin
Printing and binding: DZA Druckerei zu Altenburg GmbH, Altenburg
ISBN 978-3-11-053478-8
e-ISBN (PDF) 978-3-11-053506-8
e-ISBN (EPUB) 978-3-11-053483-2
© 2017 Walter De Gruyter GmbH Berlin/Boston
www.degruyter.com
This publication, including all parts thereof, is legally protected by copyright. Any use, exploitation or commercialization outside the narrow limits set by copyright legislation, without the publisher’s consent, is illegal and liable to prosecution. This applies in particular to photostat reproduction, copying, scanning or duplication of any kind, translation, preparation of microfilms, electronic data processing, and storage such as making this publication available on the Internet.
7 EDITORIAL

IMAGE DESCRIPTION
11 Kathrin Friedrich: Layers of Operation. Lars Leksell’s Neurosurgical Planning Image

15 John A. Nyakatura: Description, Experiment, and Model. Reading Traces in Paleobiological Research Exemplified by a Morpho-functional Analysis

IMAGE DESCRIPTION
29 Kathrin M. Amelung, Thomas Stach: Visualizing Viruses. Notes on David S. Goodsell’s Scientific Illustrations and Their Use in Molecular Biology between Picture Model and Trace

35 Dieter G. Weiss, Günther Jirikowski, Stefanie Reichelt: Microscopic Imaging. Interference, Intervention, Objectivity

55 Soraya de Chadarevian: “It is not enough, in order to understand the Book of Nature, to turn over the pages looking at the pictures. Painful though it may be, it will be necessary to learn to read the text.” Visual Evidence in the Life Sciences, c. 1960

65 Bettina Bock von Wülfingen: Giving a Theory a Material Body. Staining Technique and the “Autarchy of the Nucleus” since 1876

INTERVIEW
75 Traces and Patterns. Pictures of Interferences and Collisions in the Physics Lab. A Dialogue between Dr. Anne Dippel and Dr. Lukas Mairhofer

89 Barbara Orland: Liquid or Globular? On the History of Gestalt-seeing in the Life Sciences of the Early 19th Century

99 Marietta Kesting: Traces of Bodies and Operational Portraits. On the Construction of Pictorial Evidence

111 Sophia Kunze: Reduced Complexity or Essentialism? Medical Knowledge and “Reading Traces” in the History of Art

121 IMAGE CREDITS

123 AUTHORS
EDITORIAL
On April 28, 1906, a patient named Auguste D. died in the Clinic for Lunatics and Epileptics in Frankfurt am Main. The physician Alois Alzheimer had been closely following the course of her illness since her admission in 1901; although he had since left the clinic, he was informed of her death immediately. Her confused behavior and speech seemed paradigmatic for his idea that a confused state need not, in every patient referred to him, be an expression of syphilis. In this case, Alzheimer combined patient observation with histological findings and publications.

After Auguste D.’s death, her brain was sent from Frankfurt to Alzheimer’s new place of work, the Royal Psychiatric Clinic in Munich. He made sections of the cerebral cortex, which had already been fixed in alcohol in Frankfurt, and treated them with different staining methods. The stains and preparations fixed the material further, storing the moment of Auguste D.’s death in time. Through these stains, numerous plaques and fibrils became visible in and between the nerve cells: the color chemicals adhered to them, producing light-refracting densities in the tissue that were visible with a magnifying glass and a backlight. No such condensations emerged – using the same staining method – in the brain material of deceased persons not suffering from this type of dementia. Alzheimer described them as changes in the cerebral tissue and considered them the cause of the confused behavior and speech that could be regarded as typical of Auguste D.’s case and similar ones. Shortly afterwards, he published the case history together with the findings as a specific illness that, at the suggestion of his superior, hospital director Emil Kraepelin, would later bear the name “Alzheimer’s Disease.”

This is not the end of the story of the traces of Auguste D.’s dementia: in December 1995, the long-sought medical report on Auguste D. was rediscovered in the basement of the now modernized clinic in Frankfurt, and two years later the original preparations were recovered in Munich by the neuropathologists Mehraein and Graeber. There was great interest in whether the categorization of Auguste D.’s case as a case of Alzheimer’s – the case from which the disease takes its name – would stand up to today’s criteria. In addition, only a few years after Konrad Kujau’s forgery of the Hitler Diaries, the question arose as to whether these really were the original preparations made by Alzheimer on the specimen slides.
1: Alzheimer’s preparation “Deter,” numbering by Graeber.
In order to prove, more than a hundred years after their production, that these were indeed the long-sought preparations, Graeber had the police apply a technique from criminology to the slides as trace carriers: the age of the ink used on the glass was verified. Experts also compared the handwriting on the slides with the signature on Alzheimer’s handwritten CV, which could be clearly attributed to him. This additional production and reading of traces demonstrated that it must have been Alzheimer himself who left these particular traces.

Alzheimer’s preparations are still temporal evidence even a century after their production, albeit in a different way than the cut preparation once was. What had previously been a picture that could be experienced three-dimensionally with a height-adjustable microscope now yielded a new center of interest: a molecule in the preparation that had played no role in Alzheimer’s trace production – the DNA of the former patient. To obtain the DNA, preparations were removed from the slides; the image was destroyed, and the DNA was amplified and analyzed using chemically elaborate, now robotized, methods. Computer-generated images and printouts presented the gene sequence as letters and graphs.
The history of the traces of Alzheimer’s disease in Auguste D.’s brain preparations spans more than one hundred years and reveals – despite all the differences in the technical approaches – consistencies in the foundations of the various procedures: traces hold time as a form. Then as now, it is not actually the traces themselves that are sought, but a reference to something that had been there. Yet only the reference remains: what is revealed is not the thing sought, but an event related to it in a particular way – the adhesion of color molecules, the signing of a CV, the electrical reaction of certain molecular constituents of the patient’s DNA. Examining trace production shows that the traces have to convince first the researchers themselves, but then also a particular audience. Temporal storage is likewise required to present the trace of what was previously seen. The persistent absence of what is sought reveals the specific epistemic continuity of many traces read and produced in the laboratory during the last century. In microscopic images, as in the particle accelerator at CERN – which can be described as a large microscope – maps are produced with the aid of traces, and these maps give us orientation. At least if the maps are good, because in that case the traces replace what was actually sought, which eventually fades into oblivion.

Bettina Bock von Wülfingen
IMAGE DESCRIPTION
Kathrin Friedrich
Layers of Operation
Lars Leksell’s Neurosurgical Planning Image

This planning image for a neurosurgical procedure was published in 1971 by the Swedish neurosurgeon Lars Leksell in his book Stereotaxis and Radiosurgery. An Operative System (fig. 1). Alongside the technical and clinical bases of radiosurgical and minimally invasive operations on the brain, such as for the treatment of tumors, Leksell describes in particular the use of a stereotactic frame of his own design in conjunction with imaging procedures and visual planning. In the image here, various material elements, processes, and intentions created “a basis of evidence,” the decoding of which would guide the final therapeutic intervention on the patient’s body.

The image suggests the superimposition of several layers. The undermost layer of the whole image, the grayscale background, was created through pneumoencephalographic imaging of a human skull. This X-ray method is prohibited today. Cerebrospinal fluid would have been taken from the patient shortly before or during the radiographic imaging process via a tap in the lower back and replaced with air. Before the pneumoencephalographic visualization could yield information on the patient’s condition, he or she had to be prepared in line with the conditions of the imaging process such that a contextually significant image could be produced. The air would collect in the hollow spaces of the brain, such as parts of the ventricular system, as indicated here by the darker shadows in the center of the image. It was thought that the distribution of air would enable the identification of malformations in the cavities themselves or in the surrounding brain tissue.

Leksell’s Figure 26A appears in the section entitled “Targets that cannot be visualized.”1 The actual target of the neurosurgical intervention could not be shown by means of radiography; its position could merely be deduced from referential points. It is only by establishing aesthetic and epistemic relationships between the distribution of contrasting shading and typical neuroanatomic visualizations, between normal and pathological forms of ventricles, that assumptions can be made regarding
supposed changes in the surrounding areas. The actual target of the intervention could not be made visible with the X-ray techniques established in the years before computed tomography was introduced into clinical practice.

On another layer, the image shows a similarly drastic form of “preparation of a model,” which is revealed with and through the image. Along the vertical edges and the bottom of the image, faint measurement scales can be seen; these stand out against the diffuse background due to their ordered, geometric nature. Similarly, the viewer can distinguish two whitish blocks at the bottom corners and a structure midway down the right edge of the image. Since they appear as white shapes, it can be assumed that they correspond to physical objects formed of a very dense, homogeneous material, which the X-rays could penetrate only to a limited degree. A metal frame had been screwed onto the head of the patient undergoing the pneumoencephalography; this would remain in place until the final neurosurgical operation. Using this kind of stereotactic frame, a Cartesian system of coordinates could be constructed around the skull and viewed in the X-ray image. The system of coordinates would serve both to plan the surgical procedure and to control the implementation of the plan on the patient during the operation. The bolt and surround seen in the image clearly show that the frame encompasses not just an epistemic and temporal interconnection of bodies, coordinates, visualization, and intervention, but also a physical correspondence.

Yet another layer was imposed on Leksell’s planning image in the form of a transparent film. On it, annotations of the normal morphology of the ventricular system and anatomical lines serve as guidelines for the comparison with the patient in question. Coordinates and angles for the later procedure can be read off and marked. Leksell had devised a mathematical and geometrical process for calculating intervention points in the form of a diagram with finely drawn spirals and concentric circles extending over almost the entire image. Using this diagram in conjunction with the stereotactic frame, and drawing on other calculation formulae, angles were derived for the subsequent insertion of instruments and electrodes into the inner brain. Calculation results could be noted on the planning image in corresponding spaces and thus be transferred to the surgical situation as an “operation programme.”2 Hence the “electrode angles” noted at the top left of the image serve intraoperatively as guides for the placement of instruments in the stereotactic frame, enabling the insertion of electrodes into the brain through holes drilled in the cranium.
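The geometric step from the frame’s coordinate system to such “electrode angles” can be made explicit in modern notation (a schematic reconstruction for illustration only, not Leksell’s own formulae): once a target point T = (x_t, y_t, z_t) and an entry point E through the burr hole have been located in frame coordinates, the insertion direction is their difference, and depth and angles follow by elementary trigonometry,

\[
\vec{d} = T - E, \qquad \text{depth} = \lVert \vec{d} \rVert, \qquad
\varphi = \operatorname{atan2}(d_y, d_x), \qquad
\theta = \arccos\!\left(\frac{d_z}{\lVert \vec{d} \rVert}\right).
\]

Here φ and θ are generic stand-ins for the two angular settings; how such quantities mapped onto the graduated arcs of the actual frame is part of the calculation process Leksell devised.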
1 Lars Leksell: Stereotaxis and Radiosurgery. An Operative System, Springfield, Illinois: Thomas, 1971, pp. 37–41.
2 Leksell (s. fn. 1), p. 16.
1: Neurosurgical planning image by Lars Leksell.
Then the surgeon applied targeted thermal impulses to create a lesion in a specific area of the brain, the aim being to manipulate cerebral functions and thereby treat conditions such as schizophrenia. The labelling and numbering in this image also assume a particular status, one that applies to the entire ensemble: the various components and layers of the visualization point both to a current situation and to the treatment to be performed, to the preliminary preparation and to its transfer or translation into a subsequent therapeutic intervention. This type of planning image can be produced only by amalgamating temporal, material, aesthetic, and epistemic relationships. It exists not as a singular, unique object but as a central step in a process built of relations that are constantly being reformed, but that must also be differentiated. This neurosurgical planning image programmatically unifies instructions for surgical action, which it seems to coordinate as much as the stereotactic frame on the patient’s head does. The layered traces of past intervention and of the absent patient are extended into the traces of surgical planning in order, ultimately, to intervene in the body and its condition.
John A. Nyakatura
Description, Experiment, and Model
Reading Traces in Paleobiological Research Exemplified by a Morpho-functional Analysis

“[O]ur approach is to exploit motion to illuminate morphology in extant forms so that morphology can help to illuminate motion in extinct ones.”1

The epistemic activity of reading traces can be attributed to an archaic practice that is based on the human capability to infer something that is absent from something that is present.2 This activity is closely linked to the term “trace,” which has been the subject of controversy in the literature. In her introductory text, Sybille Krämer concisely compiles the constituent attributes of traces.3 From her list, the following aspects appear to be the most important for the discussion here: whatever has caused the trace is now absent; traces left behind are unintentional; traces are not inherently active but originate passively from external activity; traces are dependent on being read by someone; and, finally, traces are produced through interpretation. Traces are therefore characterized by inherent latency.4 This quality of traces requires a directed attentiveness from the trace-reading person, who is initially in a state of incertitude and ambiguity.5 This last aspect is also underscored by Werner Kogge, who lists trying and doubting, changing one’s perspective, putting something to the test, and/or putting something into a new context as typical practices
1 Steven M. Gatesy, David B. Baier: Skeletons in Motion: An Animator’s Perspective on Vertebrate Evolution. In: Kenneth P. Dial, Neil Shubin, Elizabeth L. Brainerd (eds.): Great Transformations in Vertebrate Evolution, Chicago: University of Chicago Press, 2015, pp. 303–316.
2 Louis Liebenberg: The Art of Tracking. The Origin of Science, Claremont, Cape Town: D Philip, 1990, pp. 1–187.
3 Sybille Krämer: Was also ist eine Spur? Und worin besteht ihre epistemische Rolle? Eine Bestandsaufnahme. In: Sybille Krämer, Werner Kogge, Gernot Grube (eds.): Spur. Spurenlesen als Orientierungstechnik und Wissenskunst, Frankfurt am Main: Suhrkamp, 2007.
4 See also the summary of a workshop that focused on the epistemic characteristics of traces by Johanna Sackel: Tagungsbericht. Workshop: Spur – Zur Belastbarkeit eines epistemologischen Modells. H-Soz-u-Kult, January 9, 2014, http://www.hsozkult.de/conferencereport/id/tagungsberichte-5170, acc. 04–2016.
5 Krämer (s. fn. 3), pp. 11–33. In contrast to Krämer, Kogge sees reading traces not as an activity that is directed a priori, but rather as one that directs itself. Werner Kogge: Spurenlesen als epistemischer Grundbegriff: Das Beispiel der Molekularbiologie. In: Sybille Krämer, Werner Kogge, Gernot Grube (eds.): Spur. Spurenlesen als Orientierungstechnik und Wissenskunst, Frankfurt am Main: Suhrkamp, 2007, pp. 182–221.
in reading traces.6 In particular, actively engaging with the material and actively searching for traces appear to distinguish the reading of traces from other activities.7 A brief working definition of the epistemic activity of reading traces for this article might therefore be: reading traces denotes an interpretive process characterized by an active search and enabled by direct engagement with the material. Because of traces’ latency, only this directed activity allows them to become apparent to their reader. Against this tentative working definition, the following discussion attempts to identify the role of reading traces in paleobiological research on the basis of a specific research project.

Challenges and practices in paleobiological research
Paleobiology is concerned with the biological investigation of extinct organisms. In contrast to analyses of modern animals, morpho-functional analyses of extinct animals in paleobiological research are confronted with a specific set of problems and challenges: on the one hand, fossils preserve only part of the information on the once-living organism. Many tissues, such as skin and muscle, are usually not preserved. Moreover, processes such as an animal’s movements in certain behaviors can no longer be observed. On the other hand, the fossil record8 is naturally incomplete and biased.9 In summary, only scant information is available on the specimens studied, other species of their time, and their habitats. To address these challenges, diverse research approaches are integrated in contemporary paleobiological research. Functional analyses in particular apply a wide spectrum of methods from diverse research fields including, but not limited to, anatomy and (bone) histology, physiology, behavioral biology, biomechanics, and engineering. Modern animals are frequently used as proxies10 in order to study aspects

6 Kogge (s. fn. 5), pp. 182–221.
7 Kogge (s. fn. 5), pp. 182–221.
8 The fossil record is the sum of the scientifically described fossils, taking into account the information necessary for age determination (stratigraphy).
9 Fossilization is critically dependent on climatic factors, among others. This results in an unbalanced representation of organisms in the fossil record. Another reason for the bias in the fossil record is the highly unbalanced distribution of fossil sites (there is a bias towards sites in western industrialized countries).
10 A number of concerns about this approach have been expressed in the literature. Among other things, the selection of modern species as models for fossil species is problematic. Witmer proposed a transparent approach. Larry M. Witmer: The Extant Phylogenetic Bracket and the Importance of Reconstructing Soft Tissues in Fossils. In: Jeffrey J. Thomason (ed.): Functional Morphology in Vertebrate Paleontology, Cambridge, New York: Cambridge University Press, 1995, pp. 19–33. In addition, the complexity of musculoskeletal function means that similar structures
that cannot be studied in fossils, or specific features of the investigated fossil are modelled. Analyses of extinct animals are therefore supported by diverse but not independent research practices such as description, experimentation, and modelling.

Describing the diversity of organismic forms and structures is a fundamental area of inquiry in morphology and hence also in morpho-functional analyses in paleozoology. On the basis of descriptions, morphology aims to identify natural groupings of organisms, to derive explanations of the origins and transformations of organismic structures, and ultimately to determine the relationship of organisms to their environment by analyzing the relatedness of form and function.11 All preserved structures that make up a fossil are potentially informative for such a research question, and it is the researcher’s task to identify, describe, and interpret informative structures.

Experiments are conducted whenever the objective is principally reproducible measurements concerning the functioning of a structure as part of a morpho-functional analysis. This calls for a controllable environment that makes it possible to minimize potentially confounding factors and to focus on the specific function. In addition, equipment is needed that, for instance, enables movements, force exertions, muscle activity, etc., to be directly or indirectly recorded. Experiments can be exploratory or can test specific hypotheses. These quantitative approaches are called for partly because of the subjective nature of descriptions and in order to facilitate comparisons of (quantifiable) differences that are used as evidence in a functional interpretation. It is only in rare cases that the functioning of structures can be experimentally tested directly on the fossil; hence researchers turn to the experimental analysis of comparable modern animals or use physical or virtual models as a proxy for the fossil.

Models are often constructed on the basis of such experiments. The diversity of models is very difficult to describe. In morpho-functional analyses they encompass abstract mathematical descriptions of specific, usually mechanical properties of an organism as well as virtual or physical constructions of structures,
do not necessarily imply similar function (cf. Lauder’s highly skeptical opinion with regard to paleobiological research). George V. Lauder: On the Inference of Function from Structure. In: Jeffrey J. Thomason (ed.): Functional Morphology in Vertebrate Paleontology, Cambridge, New York: Cambridge University Press, 1995, pp. 9–18.
11 See Gerhard Scholtz: Versuch einer analytischen Morphologie. In: Bildwelten des Wissens, 9, 2013, 2, pp. 30–44.
body parts, or even entire bodies.12 The common characteristic of all models is that they enable the control of all input parameters.13 If further assumptions are needed, these can be accounted for by systematically testing the influence of individual parameters on the final result.14
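The logic of the sensitivity analyses mentioned in footnote 14 can be sketched in a few lines of code (a minimal illustration; the model function, parameter names, and values are invented stand-ins, not code from any of the projects cited):

```python
# Minimal sketch of a sensitivity analysis: vary one model input at a time
# while holding all others at baseline, and record the effect on the output.
# `simulate_stride` is a hypothetical stand-in for an actual biomechanical model.

def simulate_stride(params: dict) -> float:
    """Toy model: a made-up stride-length estimate in meters."""
    return params["limb_length"] * params["joint_range"] * 0.01

baseline = {"limb_length": 0.25, "joint_range": 60.0, "body_mass": 4.0}

def sensitivity(param: str, spread: float = 0.2, steps: int = 5):
    """Vary `param` by ±spread around its baseline; return (value, output) pairs."""
    out = []
    for i in range(steps):
        factor = 1.0 - spread + 2.0 * spread * i / (steps - 1)
        trial = dict(baseline, **{param: baseline[param] * factor})
        out.append((trial[param], simulate_stride(trial)))
    return out

for name in baseline:
    print(name, sensitivity(name))
```

Note that `body_mass` deliberately has no effect in this toy model: a sweep of this kind exposes which inputs the final result is, and is not, sensitive to.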
Reconstructing the locomotion of an early land-living vertebrate

The following section presents a research project that aims to reconstruct the locomotor mechanics and the functional morphology of the locomotor system of the almost 300-million-year-old early land-living vertebrate Orobates pabsti.15 The figures presented in this article conform to the usual visual presentation of results in the field. The symmetrical, table-like arrangement of the panels signals neutrality towards what is shown and indicates a documentary aim, as listing the visual evidence is given higher priority than aesthetic criteria. The project’s starting point is the exceptionally well-preserved holotype specimen16 recovered from a site in the Thuringian Forest of central Germany and then transferred to the Carnegie Museum of Natural History in Pittsburgh, USA, where it was scientifically described.17 Moreover, further more or less complete fossils of the same species, and fossil trackways assigned to Orobates as the trackmaker, are available to the project (fig. 1a+b).18 The correlation between the body fossil and the

12 The spring-mass model for human locomotion and a robotic bat wing to test aerodynamic characteristics can be cited as examples of this spectrum. Reinhard Blickhan: The Spring-Mass Model for Running and Hopping. In: Journal of Biomechanics, 22, 1989, 11/12, pp. 1217–1227. Joseph W. Bahlman, Sharon M. Swartz, Kenneth S. Breuer: Design and Characterization of a Multi-Articulated Robotic Bat Wing. In: Bioinspiration and Biomimetics, 8, 2013, doi:10.1088/1748-3182/8/1/016009.
13 Philip S. L. Anderson, Jen A. Bright, Pamela G. Gill, Colin Palmer, Emily J. Rayfield: Models in Paleontological Functional Analysis. In: Biology Letters, 2013, doi:10.1098/rsbl.2011.0674.
14 This is achieved in so-called sensitivity analyses. See, for example, Callum F. Ross: Finite Element Analysis in Vertebrate Biomechanics. In: Anatomical Record Part A, 283, 2005, pp. 253–258.
15 The research project involves an interdisciplinary collaboration between biologists, paleontologists, biomechanists, geologists, materials scientists, and mechanical engineers from the US, Great Britain, Switzerland, and Germany.
16 A holotype specimen is the “typical” specimen that is used to scientifically describe a species. It is deposited as a reference in a public collection.
17 David S. Berman, Amy C. Henrici, Richard A. Kissel, Stuart S. Sumida, Thomas Martens: A New Diadectid (Diadectomorpha), Orobates pabsti, from the Early Permian of Central Germany. In: Bulletin of Carnegie Museum of Natural History, 35, 2004, pp. 1–36.
18 Sebastian Voigt et al. were able to correlate the relative lengths of individual fingers and toes preserved in the fossil tracks with those of the body fossil. Sebastian Voigt, David S. Berman, Amy C. Henrici: First Well-Established Track-Trackmaker Association of Paleozoic Tetrapods Based on Ichniotherium Trackways and Diadectid Skeletons from the Lower Permian of Germany. In: Journal of Vertebrate Paleontology, 27, 2007, 3, pp. 553–570.
species’ fossilized tracks in particular presents an ideal case for the reconstruction of its locomotion, because the fossil tracks preserve information about their maker’s locomotor characteristics. The holotype specimen was only superficially prepared and largely remains within the rock matrix to prevent damage. This circumstance, as well as the plastic deformation and fragmentation of the fossil bones due to taphonomic processes, prevents an effective analysis of its morphology.

In brief, and leaving aside the many dead ends and detours, the project can be summarized as follows: In a first step, and in collaboration with the Institute of Lightweight Engineering and Polymer Technology at the Technical University Dresden, Germany, microfocus computed tomography (µCT), otherwise used for nondestructive materials testing, was used to derive a virtual, three-dimensional version of the holotype specimen (fig. 1c+d).19 Using direct and indirect indicators (based on the reference material), plastic deformation and fragmentation could be restored (fig. 1e–h).20 At the same time, for comparison, individuals from four different modern species21 were analyzed following the same protocol: an instrumented trackway within a Plexiglas enclosure was used to measure the forces exerted on the ground by individual limbs (fig. 2a+b). Simultaneously, two synchronized high-speed X-ray cameras recorded the locomotion from two perpendicular projections (fig. 2c+d). This enabled a temporally precise correlation of three-dimensional skeletal motion and ground reaction forces to help characterize the locomotor mechanics.22 Track making by the modern species was analyzed in relation to the speed of locomotion and the degree of humidity of the substrate in order to decode information on the fossil’s locomotion preserved in the fossil trackways

19 For a detailed description of the virtual reconstruction using different animation and product design software packages, see: John A. Nyakatura, Vivian R. Allen, Jonas Lauströer, Amir Andikfar, Marek Danczak, Hans-Jürgen Ullrich, Werner Hufenbach, Thomas Martens, Martin S. Fischer: A Three-Dimensional Skeletal Reconstruction of the Stem Amniote Orobates pabsti (Diadectidae): Analyses of Body Mass, Centre of Mass Position, and Joint Mobility. In: Plos One, 10, 2015, doi:10.1371/journal.pone.0137284.
20 Nyakatura et al. (s. fn. 19).
21 The selection was based on phylogenetic, ecological, morphological, and pragmatic (limitations of the technical equipment, availability, etc.) criteria.
22 For an example of the characterization of the locomotor mechanics of a modern lizard see: John Nyakatura, Emanuel Andrada, Stefan Curth, Martin S. Fischer: Bridging “Romer’s Gap”: Limb Mechanics of an Extant Belly-Dragging Lizard Inform Debate on Tetrapod Locomotion During the Early Carboniferous. In: Evolutionary Biology, 41, 2014, 2, pp. 175–190.
(fig. 2e+f).23 Finally, in the modern species used for comparison, the influence of usually unpreserved soft tissues, such as skin and muscles, on the mobility of specific joints was studied and compared to the actual movement in these joints during locomotion.24 This was used as the basis for the functional interpretation of joint mobility in the fossil (fig. 2g+h). General principles of locomotion were identified in the comparative analysis of modern species and can be assumed to apply to the fossil as well.

Last but not least, in collaboration with professional science illustrators and computer graphic designers, a virtual whole-body model of the skeletal locomotor system of the fossil was created. The model is animated in such a way that it walks within the constraints of the fossil trackways and in accordance with the principles identified in the comparative analysis of modern species. Furthermore, the model enables the systematic variation of the locomotion parameters (fig. 3a–e).25 The model is used to search for parameter combinations that lead neither to collisions of bones nor to a virtual de-articulation of joints. It is thus an instrument that helps to identify plausible solutions to the problem of how to reconstruct the fossil’s locomotor mechanics. The reconstruction of the fossil’s locomotion therefore relies on an integration of morphological description and interpretation, the application of biomechanical locomotion principles derived experimentally from modern reference animals to the fossil, and the insight gained from modelling the fossil.
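The search the model supports amounts to a constraint-based exclusion over a gridded parameter space. The following sketch illustrates the principle only; the parameter names, ranges, and both geometric tests are hypothetical placeholders, not the project’s actual code:

```python
from itertools import product

# Sketch of constraint-based exclusion: step through a grid of locomotion
# parameters and keep only combinations under which the animated skeleton
# neither lets bones collide nor pulls any joint apart.

def bones_collide(spine_bend: float, limb_swing: float) -> bool:
    # Stand-in for a mesh-intersection test on the virtual skeleton.
    return spine_bend + limb_swing > 110.0

def joints_disarticulate(spine_bend: float, limb_swing: float) -> bool:
    # Stand-in for a joint-surface separation test.
    return limb_swing - spine_bend > 60.0

spine_bends = range(0, 41, 10)    # degrees of lateral spine bending
limb_swings = range(20, 81, 10)   # degrees of limb protraction/retraction

plausible = [
    (s, l) for s, l in product(spine_bends, limb_swings)
    if not bones_collide(s, l) and not joints_disarticulate(s, l)
]
print(f"{len(plausible)} of {len(spine_bends) * len(limb_swings)} combinations remain")
```

The surviving combinations delimit the solution space; everything outside it is excluded as anatomically implausible rather than positively disproven.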
Reading traces in paleobiological research

In his reflections on his circumstantial paradigm, Carlo Ginzburg points out that it is often minute clues in particular that hold the key to a deeper reality.26 In contrast, when analyzing fossil material, all clues that become apparent to the investigator are used in the paleobiological interpretation: the entire spectrum of the gradual
transition from obvious to subtle indications. The former include fossil trackways, as they place constraints on important locomotor parameters. It is important here to distinguish between analyses of traces of organismal behavior preserved in the fossil record (trace fossils or ichnia) and the reading of traces as an epistemic activity, which is the focus of this article. In paleoichnology, a sub-discipline of paleobiology, trackways, individual footprints, burrows, and other biogenic structures are studied.27 However, trace reading is, of course, also performed as an epistemic activity during paleoichnological research. A trackway offers traces that provide information on stride length, stride width, the orientation of the hands and feet, etc. A criterion for a trace generating new morpho-functional knowledge is therefore not only which species left the trackway behind, but also how the trackway was left behind, so that inferences can be made as to the functioning of the locomotor system. More subtle indicators are derived from the analysis of the body fossil’s joint morphology. The degree of congruence of the joint surfaces that make up a joint enables a rough evaluation of the joint’s possible mobility, but this relationship appears to be far from straightforward.28 Moreover, similarly intangible as joint surface information, muscle attachment sites on the bones provide information about the size and line of action of muscles.

The identification of traces useful for addressing a research question is largely determined by their accessibility and visibility. As a stand-in for the actual fossil material, digital representations of fossil structures that previously could not be prepared and remained within the surrounding rock matrix can now be visualized, analyzed, and potentially described and interpreted.29 It is not surprising that µCT reconstruction in particular has led to a “digital revolution” in paleozoology.30 Despite these positive technological advances, the reading of traces remains a process of searching and interpreting that is dependent on the subjects conducting the research. To address this problematic subjectivity, and for further reasons, quantitative approaches using experimentation and modelling are increasingly being adopted.
23 See, for example, Stefan Curth, Martin S. Fischer, John A. Nyakatura: Ichnology of an Extant Belly-Dragging Lizard – Analogies to Early Reptile Locomotion? In: Ichnos, 21, 2014, pp. 32–43.
24 See Patrick Arnold, Martin S. Fischer, John A. Nyakatura: Soft Tissue Influence on Ex Vivo Mobility in the Hip of Iguana: Comparison with In Vivo Movement and Its Bearing on Joint Motion of Fossil Sprawling Tetrapods. In: Journal of Anatomy, 225, 2014, 1, pp. 31–41.
25 The model can be understood as a three-dimensional image that originates as a result of the feedback, as described by Cynthia M. Pyle in: Art as Science: Scientific Illustration, 1490–1670 in Drawing, Woodcut and Copper Plate. In: Endeavour, 24, 2000, 2, pp. 69–75. See also Bettina Bock von Wülfingen in this publication.
26 Carlo Ginzburg, Anna Davin: Morelli, Freud and Sherlock Holmes: Clues and Scientific Method. In: History Workshop, 9, 1980, pp. 5–36.
27 See the introductory textbook by Arno H. Müller: Lehrbuch der Paläozoologie. Band 1: Allgemeine Grundlagen, Jena: G. Fischer (5th ed.) 1992, pp. 1–514.
28 Cf. Arnold et al. (s. fn. 24), pp. 31–41. It has been shown that soft tissues that are not usually preserved in fossils have a significant influence on joint mobility.
29 For Orobates, a dorsal lip on the acetabulum that constrains the abduction of the thigh could be cited as an example here. This structure was hidden in the rock matrix and was only made accessible for analysis by µCT imaging.
30 Digital visualization enables the researcher to look at the object from all possible angles and to superimpose or mask neighboring structures on the screen.
1: Idealized visual presentation of the process of digitally reconstructing a vertebrate fossil. The arrangement and numbering of the panels suggests a step-by-step linear approach. Dead ends and failures in the reconstruction process are not shown. A: Dorsal aspect of the holotype specimen of Orobates pabsti, Diadectidae. B: Fossil trackways were unequivocally assigned to Orobates pabsti as the trackmaker. C: The holotype specimen was scanned using µCT at the Institute of Lightweight Engineering and Polymer Technology at the Technical University Dresden, Germany. D: High-resolution volume rendering of the skull. E+F: Plastic deformation of the skull was corrected using symmetry axes. G: Complete digital skeleton. H: Reconstructing a life-like posture requires further assumptions.
2: The procedure for reconstructing the fossil’s movement. Locomotor characteristics of modern animals can be correlated with the morphological structural specificities that can also be identified in the fossil. A: The biplane high-speed X-ray videography facility at the Institute of Systematic Zoology and Evolutionary Biology at the Friedrich-Schiller Universität Jena, Germany. B: A green iguana in the experimental setup for motion analysis. C+D: Digital bone models are superimposed onto the X-ray shadow (here the lateral projection of a green iguana) of the respective bones in order to visualize and quantify skeletal movements. E+F: The virtual model of the fossil can be manipulated and moved in any way the investigator wishes with 3D software packages. G+H: The model enables the systematic variation of individual locomotion parameters within the constraints of the fossil trackway. Despite the successive arrangement of the panels, the transfer of locomotor principles identified in modern animals (A–D) to fossils (E–H) remains problematic. Here the solution space is explored with the aid of the interactively controllable model.
3: Modelling: A digital model can be manipulated using animation software packages and allows the systematic variation of individual parameters. A+B: Within the software (here Autodesk Maya), every joint can be moved individually. C–E: Fossil trackways are also digitized and constrain the locomotor reconstruction. Using the manipulable model, the solution space is explored. Despite the superficial similarity to the experimental settings and the comparable arrangement of panels, transferring movement principles (fig. 2) to fossils (fig. 3) is problematic.
For paleobiological research, the key advantage of experimenting with modern animals is that characteristics that cannot be examined in the fossil itself can be studied and subsequently inferred to apply to the fossil as well (e.g. the presence of certain muscles or the occurrence of a specific behavior). For the reading of traces, it is paramount in experimental settings that the detection of traces that are latently present within the material – here in the anatomical structures – can be actively prompted. The decoding of form-function relationships in experimental approaches with living animals enables a targeted search for morphological correlates of a function in the fossil material. This means that experiments can help the researcher become aware of a trace in the sense of the working definition proposed in the introduction to this article. An analyzed function of a structure in modern animals must always be linked to the fossil’s structures to justify the inference. Experimental results therefore become arguments for the interpretation of describable morphological characteristics in the fossil.

It can be argued that modelling is qualitatively different from the reading of traces as part of morphological description and from the identification of additional traces through experimental observation. Models enable cases that do not occur in nature to be tested. A model can be used to systematically vary an individual parameter until anatomically feasible limits are violated, so as to characterize a solution space step by step in a transparent and reproducible manner.31 However, to justify the use of a model, it is necessary to return to the other research practices: a model needs to be validated with experimental data.32 Experiments thus act as a fulcrum, because this practice is important both for the reading of traces in the fossil material and for the model. The model is based on the fossil material but becomes disassociated from it, as it is no longer used to facilitate the interpretation of the preserved material but is aimed directly at the characteristic being studied in the extinct animal.

In conclusion, reading traces as an epistemic activity is undertaken in paleobiological research either through direct observation and description of the

31 A highly illustrative example is provided by Steven M. Gatesy, Martin Bäker, John R. Hutchinson: Constraint-Based Exclusion of Limb Poses for Reconstructing Theropod Dinosaur Locomotion. In: Journal of Vertebrate Paleontology, 29, 2009, 2, pp. 535–544.
32 Anderson et al. (s. fn. 13) and John R. Hutchinson: On the Inference of Function from Structure Using Biomechanical Modelling and Simulation of Extinct Organisms. In: Biology Letters, 2011, doi:10.1098/rsbl.2011.0399.
preserved material, indirectly through experimental approaches, or even more indirectly through modelling. Experiments within the context of functional analyses of modern organisms used as proxies, or models used in a similar way, always have to be linked to the structures of the fossil material through identified form-function relationships. Modelling involves a change of materiality, especially if the models are virtual. The increasing distance from the preserved material is also evident in the increased use of representations (e.g. the digital reconstruction and the selected modern species in the project used as an example here). The degree of engagement with the material consequently decreases as traces are identified and read. The goal is no longer to improve the interpretability of the fossil material, but to make direct inferences as to the preserved material’s characteristics or the function in question. The working definition of reading traces as an epistemic activity formulated at the beginning of this article should therefore be expanded to include the qualitative leap that occurs when materiality changes from the actual to the virtual.
1: The figure depicts an HIV virus in cross section immersed in blood at a magnification of one million. At this magnification the human eye would be able to detect individual macromolecules.
IMAGE DESCRIPTION
Kathrin M. Amelung, Thomas Stach
Visualizing Viruses
Notes on David S. Goodsell’s Scientific Illustrations and Their Use in Molecular Biology between Picture Model and Trace

“Imagine that we had some way to look directly at the molecules in a living organism. […] Think of the wonders we could witness firsthand. […] Many of the questions puzzling the current cadre of scientists would be answered at a glance.”1

In his scientific illustrations, David S. Goodsell, Professor of Molecular Biology at the Scripps Research Institute in La Jolla, California, aims to realize the vision set out in the above quotation (fig. 1): In the center of the drawing, which is in portrait orientation, is a green circle. Originating slightly below the midline, this circle almost completely occupies the upper half of the drawing. At regular intervals, the outline of the circle is interrupted by pale red tree-like structures, which form a link between interior and exterior. Despite these connecting points, the circle, together with the shapes and structures inside it, contrasts distinctly with its surroundings: the circle encompasses individual light blue, brown, red, and purple shapes, as well as a structure composed of individual elements that resembles a cross-sectioned eggplant in shape and coloration. Surrounding the circle, however, are innumerable individual elements, each of which is similar to all the others. These slightly translucent elements, colored in warm earth tones, overlap each other and fill all the remaining space in the picture. If we turn to the image’s caption, “HIV virus in blood serum, 1,000,000x magnification,” for clarification at this point, the striking difference between the structure of the circle and its surroundings conveyed by the drawing is immediately clarified: “HIV virus” refers to the circle, and “blood serum” to the speckled surroundings. Only when the viewer takes a second look does it become clear – if it does at all – that the seemingly seamless interlacing of visual perception and term is in fact merely a necessary construct of molecular biology in the form of an imaginary
1 David S. Goodsell: The Machinery of Life, New York: Copernicus Books, 2009, p. vii.
model. This effect is quite intentional, as the stated aim of Goodsell’s visualization is to give the viewer an apparently immediate insight into a world beyond normal human perception. Goodsell writes: “[T]he world of molecules is completely invisible. I created […] illustrations […] to help bridge this gulf and allow us to see the molecular structure of cells, if not directly, then in an artistic rendition.”2

In order to realize his scientific illustrations artistically, Goodsell adopts a cartoon-like3 style. This choice gives him two decisive advantages: the straight, simple forms and flat colors of a comic make a good visual overview more readily achievable and thereby facilitate faster comprehension of the depicted information. At the same time, the link to cartoons hints at a divergence between the representation and its object, a divergence rooted in the very conventions of comic style. Just as a cartoon reduces a complex story to its essentials in a single image, Goodsell’s scientific illustrations cannot be seen as the “thing in itself,” in this case a representation of blood serum and an HIV virus.4 Recognizing this divergence directs the viewer’s attention to the real potential of Goodsell’s scientific illustrations: it is not the presence of the representation that is the decisive factor, but the trace that the drawing establishes toward an epistemic object or entity. In fact, “trace” should not be understood as referring to the image model per se, but rather to the allusion present in the image, or one that might be experienced in the image model – an allusion to something not present in the image itself.5

In this context, it is clear that the production of model images presents a particular challenge for increasingly complex knowledge in molecular biology. Goodsell confronts this task in two separate but successive stages: Firstly, Goodsell
2 Goodsell (s. fn. 1).
3 Goodsell explains the link to cartoons in a video, stating: “So I wanted to come up with a style that’s simple enough that you can see the whole picture. So that’s why I use these flat colors and simple outlines, kind of cartoony stuff.” https://www.youtube.com/watch?v=f0rPXTJzpLE, acc. 05–2016.
4 The problem of referring to a picture model as the “Sache an sich” (“thing in itself”) occurs in teaching in particular, cf. Alexander Vögtli, Beat Ernst: Wissenschaftliche Bilder. Eine kritische Betrachtung. Basel: Schwabe, 2007, p. 58.
5 Gisela Fehrmann, Erika Linz, Cornelia Eppig-Jäger (eds.): Spuren, Lektüren. Praktiken des Symbolischen, Munich: Fink, 2005, p. 9.
2: Part of the scientific process of generating the evidence leading to David S. Goodsell’s integrative watercolor painting. The capsid, a structural protein, was chosen as an example. It has the protein database accession number 3h47. A: Fourier transformation of an electron micrograph of a flattened sphere of a capsid protein preserved in ice. B: Three-dimensional reconstruction of capsid proteins based on their optical density in electron micrographs. C: Ribbon diagram of the protein structure of the capsid protein.
summarizes existing information6 on a particular epistemic object and illustrates it in an unusually complex model such as the HIV illustration. This is not only grounded in a detailed knowledge of molecular biology and a meticulous collection of facts, but is also based on the transformation of this “Spurenlese”7 (trace reading) into a “readable”8 image concept. By aiming for “readability,” Goodsell places the image in a particular functional context: images are not normally read but viewed. By emphasizing readability, Goodsell stresses the function of his images as a reference to something that escapes immediate perception. The fascination and productivity of his images lie in the construction of a complex yet simultaneously clear-cut vividness. This fascination and productivity cannot be experienced beyond his scientific illustrations, because outside them the extensive information compiled in them exists only in isolation (fig. 2).

6 The information used is derived from different media and research methods, such as texts, graphs, X-ray crystal analysis and electron microscopy.
7 “Reading traces is a complex and laborious process as it is not possible to merely find and read the research object but necessary to produce the very subject by selection.” (Translation by Kathrin M. Amelung) Sybille Krämer: Was also ist eine Spur? Und worin besteht ihre epistemologische Rolle? Eine Bestandsaufnahme. In: Sybille Krämer, Werner Kogge, Gernot Grube (eds.): Spur. Spurenlesen als Orientierungstechnik und Wissenskunst, Frankfurt am Main: Suhrkamp, 2007, pp. 18–19.
8 Here Goodsell speaks of “readable concepts.” David S. Goodsell: Visual Methods from Atoms to Cells. In: Structure, 13, 2009, 3, p. 353.
At the same time, this reference in the scientific illustrations to the basic information behind them does not operate in only one direction; rather, there is a mutually dependent stabilization of their readability: “The reading of the trace produces […] the very construct that in hindsight is interpreted as a creator of the trace. Thus traces are […] simultaneously a material appeal to, and the product of, transcriptional processes.”9

With this in mind, if we take a closer look at the illustration of the HIV virus in fig. 1, we see that Goodsell chose a concept for this image that he himself calls a cross-sectional metaphor. This concept allows for the representation of large molecular domains and contexts of entire molecular scenes, as the graphical representation of a cross-sectioned HIV virus immersed in blood plasma is explicitly related to the preparation of objects in light and electron microscopy. In this way, the constructed character of Goodsell’s image, which is based exclusively on disparate information, is blurred in favor of a uniform narrative, thereby enhancing the intended legibility of the image. The colors and forms used play a not unimportant role in this process: the perception of difference between the diverse individual elements (blood) and the compact, closed circle (HIV) rests on both the contrasting effect of the colors used and the distinctness of their forms. The selection of forms supports this distinction, as it enables the comparison of similar and dissimilar forms, thus ensuring the (re)cognition or differentiation of structures. The choice of forms and colors is similarly rooted in the use of a uniform style (in this case, cartoon-like) and in the restriction, from a large variety of possibilities, to only two visualization techniques: watercolor and drawing. This enables the viewer to experience the image as a self-contained composition, which in turn makes it possible to obtain an overview and hence a sense of orientation.10

Secondly, Goodsell links his final scientific illustrations interactively; for example, the illustration of the HIV virus is linked to the latest research results

9 Translation by Thomas Stach; Fehrmann et al. (see fn. 5), p. 9.
10 “Orientation does not relate to an entire area of knowledge but to choosing and directing suitable points of view. […] Traces are more than mere landmarks; they are exceptionally attractive clues for one searching for a particular phenomenon. By finding and keeping them, one can return to them so as to link them to other traces.” (Translation by Kathrin M. Amelung) Werner Stegmaier: Anhaltspunkte. Spuren zur Orientierung. In: Krämer et al. (see fn. 7), pp. 86, 91.
3: Digital tracks. A: Clicking on the area marked with an arrow redirects the viewer to the webpage shown in B. B: Watercolor painting of the capsid protein in the online protein database (protein database accession number 3h47). By viewing the hyperlinks step-by-step, the viewer can access additional information and can ultimately access the original scientific publications (e.g. C). In this way, a trail is laid from the painting to the experiment that situates the evidence in an intersubjective reality. C: One of the original scientific publications describing the analysis of molecular structures depicted in Goodsell’s watercolor painting.
in the RCSB Protein Data Bank (fig. 3).11 In this way, the model image that had previously been created through the transformation of individual insights refers back to the foundations of its production and simultaneously reaches beyond them: the linking of individual structures via hyperlinks – such as those in the scientific illustration of the HIV virus – to the information in which these structures are grounded is not static. If, for example, scientific knowledge about a specific protein, such as the capsid structural protein, changes, the updated information can replace the original information without the interactive link to the scientific illustration having to be deleted.

To summarize, the reading of latent traces in the visible representation of scientific illustrations points to the epistemic entity of molecular biological research. However, this reference is not simply given; instead, it always depends on an operational practice or process, as the reference only materializes and is only updated when the trail is followed. Hence each of David S. Goodsell’s scientific illustrations can be interpreted as the intersection of different tracks whose reading(s) leads to the heart of the production of knowledge in molecular biology.
11 The protein data bank is a freely accessible online resource providing 3D structure data on proteins and nucleic acids (www.rcsb.org).
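The trail from painting to database described above can also be followed programmatically. A minimal sketch (the download URL pattern is RCSB’s public file service as known at the time of writing, and may change; entry 3h47 is the capsid structure named in figs. 2 and 3):

```python
from urllib.request import urlopen

# Follow the trace from the painting into the database: fetch entry 3h47
# from the RCSB Protein Data Bank's public file-download service.
url = "https://files.rcsb.org/download/3H47.pdb"

with urlopen(url) as response:
    pdb_text = response.read().decode("utf-8", errors="replace")

# PDB files open with HEADER and TITLE records identifying the structure;
# printing them is a quick check that the right trace has been followed.
for line in pdb_text.splitlines():
    if line.startswith(("HEADER", "TITLE")):
        print(line.rstrip())
```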
Dieter G. Weiss, Günther Jirikowski, Stefanie Reichelt
Microscopic Imaging
Interference, Intervention, Objectivity

“But why, it may be asked, should a philosopher care how they [the microscopes] work? Because a correct understanding is necessary to elucidate problems of scientific realism […].”1

When seeing an object under a microscope,2 one never sees the object itself, but traces of it that the microscopy process creates, collects, and combines to form an image. This can support observations or presentations. The traces made visible serve as evidence in a manifested image, while the object itself remains inaccessible. Today, the creation of such traces is performed with increasingly complex techniques. In this context, “objectivity” means, as we argue in this article, the trained ability to interpret the resulting traces correctly. We introduce the term “epistemic virtue of educated selectivity” to describe the corresponding virtue and skills developed by microscopists between 1830 and 1850.

Contemporary light microscopes can visualize objects that are not only invisible to the naked eye, but also smaller than the classical limit of resolution (according to Abbe, roughly half the wavelength of the light used, approximately 250 nm, at 2,000× magnification).3 As figs. 1 and 2 demonstrate, the techniques developed since the 1980s even provide information on the shape of structures within cells that are smaller than 25 nm. These methods are called super-resolution techniques, and they enable us to exceed the classical resolution limit by a factor of 10. To this end, modern microscopes no longer work solely with optical components such as lenses, optical filters, or grids; these parts are supplemented by electronic camera sensors, amplifiers, digital filters, and photon multipliers. Thus, only with today’s technology could it be shown, as fig. 1 illustrates, that certain proteins in the centrioles of the cell are arranged in the shape of a diamond ring.

1 Ian Hacking: Do We See Through a Microscope? In: Paul M. Churchland, Clifford A. Hooker (eds.): Images of Science, Chicago: University of Chicago Press, 1985, p. 132.
2 The question of whether we “see” with the microscope was discussed by Ian Hacking in: Representing and Intervening. Introductory Topics in the Philosophy of Natural Science. Chapter 11: Microscopes, Cambridge: Cambridge University Press, 1983, pp. 186–209. He conceded that it is legitimate to say that we see with the microscope, albeit a different kind of seeing.
3 One nanometer (nm) is one millionth of a millimeter.
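The classical limit invoked above follows from Abbe’s diffraction formula; as a worked example (assuming green light, λ ≈ 500 nm, and an objective with numerical aperture NA ≈ 1):

\[
d \;=\; \frac{\lambda}{2\,\mathrm{NA}} \;\approx\; \frac{500\ \mathrm{nm}}{2 \times 1.0} \;=\; 250\ \mathrm{nm}.
\]

Super-resolution techniques push d below this classical value by roughly a factor of ten, consistent with the sub-25 nm structures described in the text.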
1: Microscopy beyond the resolution limit. a: Confocal microscopy image of B-cells from human blood. The centrioles that control cell division (green and red) are located close to the nucleus (blue). b: Super-resolution microscopy image using the structured illumination method (SIM). The diameter of the centrioles including the rings is 300 nm.
Similarly, it was only discovered by employing video microscopy that cell organelles (diameter ~50 nm, shown as round objects) move along microtubules (diameter 25 nm, elongate objects) in living cells (fig. 2). Comparing fig. 2a and b reveals that the microtubules themselves move over the surface. None of these objects would be visible when looking through the microscope with the naked eye.4 The new methods of light microscopy complicate the relationship between the object to be imaged and the visual effects that it produces5 so fundamentally that the question of how to evaluate these images epistemically becomes pressing. How microscopic images come into being and make the invisible visible is therefore outlined below in three steps. We begin with the physical fundamentals of contrast generation in the microscope, which occurs not by simple magnification, but by optical interference of the light diffracted by the object.
4 Robert D. Allen, D.G. Weiss: An Experimental Analysis of the Mechanisms of Fast Axonal Transport in the Squid Giant Axon. In: Harunori Ishikawa, Sadashi Hatano, Hidemi Sato (eds.): Proceedings of the 10th Yamada Conference on Cell Motility at Nagoya, September 11–13, 1984, Tokyo, 1985, pp. 327–333. 5 Cf. the chapter by Amelung & Stach in this volume.
2: AVEC-DIC super-resolution microscopy shows the dynamic behavior of microtubules and synaptic vesicles in the nerve cell. a: Video microscopy following the Allen method, resolving objects down to 150 nm and visualizing objects down to 20 nm; image width 10 µm. b: Same view 40 seconds later.
We then show how dramatically human intervention can influence the process of image generation on all levels, namely by modifying the sample, the microscope, and the way in which images are recorded, and by postprocessing the images. These interventions aim to create images that are specifically designed for certain purposes and thus generate intended artifacts. We provide an overview of the increasingly sophisticated techniques for modifying contrast and color, which – combined with increasing resolving power – allow us to obtain a greater number of traces. Contemporary microscopy is not restricted to imaging the topography of an object; it is also capable of imaging a multitude of physical and chemical properties as well as, in the case of living objects, physiological processes. Given that the imaging process in microscopy is influenced on all levels, we discuss in the third section whether objective imaging is possible at all. Very early in the history of microscopy, the increased production of traces made it necessary to distinguish between traces arising from the object itself, and hence representing true images in the sense of intended artifacts, and traces of other origins, the unintended artifacts.
3: a: The Airy disc of a point light source shows the intensity distribution that is determined by diffraction. b: High-resolution lenses generate Airy discs with small diameters. c: This allows two discs originating from different adjacent point sources to be recognized as separate (resolved) objects.
At this point, we wish to emphasize that we use the term interference here in a strictly optical sense. By contrast, following Hacking,6 we refer to the influences exerted on the imaging process by the microscopists as interventions.

Interference: Image generation and resolving power
Interference describes the optical phenomenon used in light microscopy whereby two superimposed waves give rise to a new wave with a larger or smaller amplitude. In the case of light waves, this difference is perceived by our eyes as contrast. We are familiar with similar phenomena from daily life, for instance, when concentric wave fronts originating from two stones tossed into water interact with each other. Whenever an optical instrument is used to investigate an object, the light does not simply pass through the optical system so that the lenses create an enlarged image of the specimen. Instead, light waves close to each other are diffracted in different directions when passing the edges of the specimen’s structures. The diffracted and undiffracted waves interfere constructively and destructively, thus producing differences in brightness, i.e. contrast. The diffraction pattern that arises from a tiny object is called an Airy disc (after Sir George B. Airy, a 19th-century British astronomer). It consists of a central bundle of undiffracted light surrounded by several rings of diffracted light. The image of a group of very small point objects in a microscope is therefore not simply an enlarged image of them but the sum of their Airy discs.
6 Hacking (s. fn. 2), Part B: Intervening, p. 147ff.
4: In the microscope, a grid-shaped specimen is first transformed into its diffraction pattern (Fourier spectrum) and subsequently magnified for investigation. The Fourier spectrum in the Fourier plane or back focal plane shows the zero-order maximum (direct light, center) and the first-order maxima (diffracted light, surrounding the center).
In the nomenclature of diffraction theory, the center is called the zero-order maximum, and the rings are called first-order maxima, second-order maxima, etc. (fig. 3). However, before the light waves reach the primary image plane, they pass the back focal plane of the objective lens, also known as the Fourier plane (fig. 4). Light waves diffracted by the object travel different distances and thus propagate with a phase shift relative to each other and to the direct light; they are therefore subject to interference. In the back focal plane of the objective lens, the entire object, in our example a grid, which can be imagined as being composed of a large number of points, is transformed into a two-dimensional diffraction pattern, the so-called Fourier transform or Fourier spectrum. Despite its completely altered form, it contains all the information (geometry and intensity) that constitutes the image of the specimen.7 In the Fourier spectrum, the distances between all the details of the specimen and their directions are transformed into distance information (spatial frequencies).

7 In X-ray illumination of crystals of minerals or macromolecules, diffraction patterns are created following the same principle (3D Fourier spectra); these contain information on the distances between all the atoms in the crystal. 3D models of the molecules are obtained by back-transformation. See Soraya de Chadarevian in this volume.
Small distances (high spatial frequencies) are represented by points that are farther away from the center. The smallest distances resolvable by the optical system are represented in the first-order maxima farthest from the center (fig. 4). Since different wavelengths of light (i.e. colors) are diffracted at different angles, the spectral colors of white light are separated. Direct light (zero-order maximum) and diffracted light (higher-order maxima) then generate the image by interference. In the image plane, the radius of the Airy discs defines the resolving power of the optical system, that is, the smallest possible distance between two objects that are still recognizable as separate; in other words, the smallest object that is truly resolved, i.e. depicted in the correct size and shape in comparison to reference objects. The theory of diffraction-limited resolution was developed by Ernst Abbe in 1873, when he discovered that lateral resolution is defined as:
Abbe resolution_{x,y} ≈ λ / (2 NA)
where λ is the wavelength of the illumination light, and NA is the numerical aperture of the objective lens, a measure of the lens’s capacity to collect light. As a consequence, the size of a depicted point decreases with decreasing wavelength and increasing numerical aperture (fig. 3a+b). For the visible part of the light spectrum (400 nm–800 nm wavelength) and an optimal optical setting of the microscope, the resolution limit according to Abbe lies between 200 nm and 400 nm, approximately half of the wavelength used. The maximal useful magnification that can be achieved with a light microscope is therefore close to 2,000-fold (Abbe magnification). Although larger magnification would generate larger images, it would not reveal more details (so-called empty magnification). The light waves of the Fourier plane diffraction pattern are then projected onto the primary image plane, whereby they are transformed back into a real intermediate image of the specimen (fig. 4). The image is a mosaic of Airy discs too small to be recognized, but together forming the bright and dark areas of the image. The eyepiece focuses the image like a magnifying glass onto the lens of the observer’s eye. In turn, the lens of the eye projects the image onto the retina and thus completes the vision process.8
8 Rudi Rottenfusser: Proper Alignment of the Microscope. In: Methods in Cell Biology, 114, 2013, pp. 43–67.
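As a quick plausibility check of the figures quoted above, the following minimal sketch (our illustration, not the authors’) evaluates the Abbe criterion for typical values:

```python
# A minimal sketch of the Abbe criterion: lateral resolution is roughly
# the wavelength divided by twice the numerical aperture.
def abbe_resolution_nm(wavelength_nm: float, numerical_aperture: float) -> float:
    """Smallest resolvable distance between two point objects, in nanometers."""
    return wavelength_nm / (2.0 * numerical_aperture)

# Green light and an oil-immersion objective (NA = 1.4 is a common value,
# assumed here for illustration):
print(abbe_resolution_nm(550.0, 1.4))   # ≈ 196 nm, about half the wavelength
```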
Intervention
For the scientific investigation of cellular objects or processes, it is important to generate images that serve the purpose of understanding and convincing. The microscopist therefore utilizes a broad range of possible methods of intervention that enable intended modifications of the imaging process on all levels, seeking to obtain contrast-rich images of the specimen details. Hence, it is evident that any process of microscopic image generation will produce intended artifacts. Anyone using a microscope should be familiar with the possibilities for intervention so as to be able to use them to increase image clarity and expose certain details, while at the same time avoiding distorting influences and prejudice. A person who is unaware of the nature of the possible interventions can easily fall victim to unintended artifacts or misinterpret the images.

Modification of the object under study
Staining techniques or the production of histological sections typically require a chemical fixation procedure to be performed on the biological specimen (e.g. with formaldehyde) to prevent the organic material from rotting. For sectioning purposes, the specimens are usually embedded in polymers such as paraffin or resins. These steps can cause a series of unintended artifacts, such as shrinking, swelling, or the formation of cracks; most importantly, however, all dynamic processes that occur in the living specimen are frozen.9 In histology and pathology, staining reagents are applied; these must absorb light of characteristic wavelengths and, when bound to specific components of the specimen, make the distribution of these components visible by increasing their contrast. Appropriate color filters can be used to block light of certain wavelengths and thereby increase or decrease the visibility of stained components.

The variety of optical contrasting methods
Based on knowledge of the physical basis of microscopic imaging, especially the interference of diffracted and non-diffracted light, a variety of techniques that interact with the imaging process have been developed to generate contrast by employing optical modulators in the front focal plane (condenser plane) or in the back focal plane (Fourier plane) of the objective lens (fig. 4).10

9 Cf. the chapters by Bettina Bock von Wülfingen and Barbara Orland in this volume.
10 For a detailed description, see Savile Bradbury, Peter J. Evennett: Contrast Techniques in Light Microscopy, Oxford: BIOS Scientific in association with the Royal Microscopical Society, 1996.
With the exception of brightfield and darkfield microscopy, which follow the same principles as our macroscopic vision in everyday life, these methods create contrast from various physical and chemical properties of the specimen that are otherwise inaccessible to the eye. In brightfield microscopy, contrast is created by the absorption of light by the specimen; in darkfield microscopy, solely by indirect diffracted light; in interference microscopy, by the thickness of layers; in phase contrast and differential interference contrast (DIC) microscopy, by specimen-dependent retardation of the passing light; in polarization microscopy, by the birefringent properties of the specimen; and in fluorescence microscopy, by the excitation of fluorescent molecules and the emission of fluorescent light. Some examples are shown in fig. 5. Many translucent objects such as cells are phase objects that cannot be visualized with the brightfield or darkfield method because they do not absorb light. However, light that passes through a phase object is retarded, and the associated phase shift gives rise to interference, which can be visualized with phase contrast or differential interference contrast optics.11 Both these contrasting techniques create visible optical artifacts: In phase contrast microscopy, objects appear with an artificial, bright halo; DIC microscopy creates a pseudo-three-dimensional appearance as opposite sides of objects show bright and dark shades (figs. 2+5d). In order to evaluate DIC images correctly, it must be kept in mind that the image represents a very thin optical section of the specimen, with details above and below the focal plane being excluded. Only by means of artificial shadowing does the object become visible; however, DIC images do not reveal anything about the object’s true height. When publishing such images, dense objects must be depicted with the shadow oriented downward so that they are perceived as convex particles. Fluorescence microscopy is today probably the most important contrasting technique in biology and medicine, as it makes it possible to label substances specifically. A fluorescent molecule behaves like a torch in front of a dark background. Labeling with fluorescent dyes coupled to antibodies enables the specific visualization of one particular species out of thousands of proteins due to the high specificity of the antigen–antibody reaction. In one variant of this technique, the cells under study are genetically modified: The gene of the green fluorescent protein (GFP), which was originally extracted from a jellyfish species, can be coupled to the gene of any protein of interest by genetic engineering.

11 Frits Zernike: Das Phasenkontrastverfahren bei der mikroskopischen Beobachtung. In: Z. techn. Phys., 16, 1935, pp. 454ff.; Georges Nomarski: Microinterféromètre différentiel à ondes polarisées. In: J. Phys. Radium, 16 Supplement, 1955, pp. 9S–11S.
5: Different images are obtained from the same object, the unicellular alga Micrasterias (size approximately 250 × 220 µm), by employing different contrasting techniques. a: Brightfield contrast (transmitted light). b: Darkfield contrast (scattered light from side illumination). c: Fluorescence of chlorophyll. d: Differential interference contrast (DIC).
Cells modified in this way, as well as all their daughter cells, express a fluorescent variant of the target protein, whose distribution can be recorded with fluorescence microscopy. Other fluorescent dyes visualize traces of physiological processes and their dynamics in living cells, such as the intracellular pH, calcium ion concentration,12 or membrane potential. This rapid development allowed microscopes to advance into fields previously restricted to biochemistry and physiology. However, fluorescent specimens can suffer from a disturbing background glow. Here, confocal laser scanning microscopy makes the crucial difference. This method improves resolution by a factor of 1.4 and eliminates the background glow from fluorescent molecules outside the focal plane. This is achieved by using only a tiny laser spot for illumination and detecting only the light emitted from this point through a pinhole. The spot scans the specimen pixel by pixel and layer by layer, creating a stack of virtual images from which a three-dimensional image of the specimen can be calculated. The observer can digitally navigate through these images and obtain a complete understanding of the cellular structures and their arrangement. This method of imaging and three-dimensional reconstruction demands a hitherto unseen quantity of computational interventions.
12 Grzegorz Grynkiewicz, Martin Poenie, Roger Y. Tsien: A New Generation of Ca2+ Indicators with Greatly Improved Fluorescence Properties. In: J. Biol. Chem., 260, 1985, pp. 3440–3450.
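To give a sense of what such computational interventions look like in practice, here is a minimal sketch (our illustration, with synthetic random data standing in for a real recording) of a confocal stack as a three-dimensional array that can be navigated or projected plane by plane:

```python
import numpy as np

# Hypothetical confocal stack: 50 optical sections of 512 x 512 pixels,
# filled with random intensities in place of real fluorescence data.
rng = np.random.default_rng(seed=0)
stack = rng.random((50, 512, 512))

single_plane = stack[25]              # digitally "focus" on one optical section
max_projection = stack.max(axis=0)    # collapse the stack into a single 2D image
```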
6: Improved resolution through video microscopy. Resolution and visualization are two very different things, as comparing fig. 6 and 7 shows. The resolution limit is demonstrated by the combined brightness distribution of the Airy discs of closely adjacent bright objects. a: Abbe resolution for the human eye. b: Contrast enhancement for the upper region in (a) by amplifying the signal and re-adjusting the thresholds for black and white. c: The resolution limit for electronic eyes is reached when Airy discs are so close that the difference in contrast is almost but not quite equal to zero (compare b and d). d: Sparrow criterion valid for machine vision.
The resolution limit and super-resolution
When electronic cameras and electronic image editing became available around 1980, new possibilities arose for visualizing additional image details. In an unexpected way, this led to the surpassing of the resolution limit formulated by Abbe and previously considered insurmountable. Scientists recognized that, although this limit is valid for seeing with the eye, it does not hold for seeing with electronic cameras. Thus, the realm of super-resolution microscopy was entered.

a) Video-enhanced contrast (VEC) microscopy
Robert D. Allen discovered by chance that the Abbe resolution limit depends on the human capability to distinguish between different shades of gray. Whilst two closely adjacent objects can only be distinguished by the eye if they are separated by a sufficient contrast difference (figs. 3c+6a), electronic recording of the image enables an impressive enhancement of contrast (fig. 6d); when the bright and dark thresholds are redefined, this leads to an increase in resolving power (fig. 6c+d). Allen recognized that, with his method of Allen video-enhanced contrast differential interference contrast (AVEC-DIC) microscopy, not only could true resolution be increased twofold, but objects as much as ten times smaller, and their motility, could also be visualized. In order to evaluate these images correctly, it must be kept in mind that objects below the resolution limit are not depicted to scale but appear inflated by diffraction (fig. 7b+c).
7: Improved visualization of objects below the Abbe resolution limit by video microscopy. a: Intensity distribution for a bright object. b: The diameter of the Airy disc remains constant for very small objects. Only the amplitude decreases. c: However, if the intensity of the upper area is amplified, contrast also increases, making very small objects clearly visible, even though they are not truly resolved as in fig. 6.
They can thus be depicted for the first time, but in reality they may be ten times smaller, as is the case for the microtubules in fig. 2.13 The publication of the AVEC-DIC method by Allen in 1981 led to a revolution in cell biology because it made live cell imaging possible, thereby enabling the observation of cellular processes at up to 20,000-fold magnification without fixing and drying the cells. This range of magnification had until then been restricted to electron microscopy, which requires fixation and dehydration.14 We owe today’s knowledge of cell dynamics, cell motility, and motor enzymes (fig. 2) to the video microscopy technique.15
13 Robert D. Allen, Nina S. Allen, Jeffrey L. Travis: Video-Enhanced Contrast, Differential Interference Contrast (AVEC-DIC) Microscopy: A New Method Capable of Analyzing Microtubule-Related Motility in the Reticulopodial Network of Allogromia laticollaris. In: Cell Motil., 1, 1981, pp. 291–302. 14 Robert D. Allen: New Observations on Cell Architecture and Dynamics by Video-Enhanced Contrast Optical Microscopy. In: Ann. Rev. Biophys. Biophys. Chem., 14, 1985, pp. 265–290; Watt W. Webb: Light Microscopy – A Modern Renaissance. In: Ann. N. Y. Acad. Sci., 483, 1986, pp. 387–391; David Shotton: The Current Renaissance in Light Microscopy. In: Proc. Roy. Microsc. Soc., 22, 1987, pp. 37–44. 15 Tobias Breidenmoser, Fynn Ole Engler, Günther Jirikowski, Michael Pohl, Dieter G. Weiss: Transformation of Scientific Knowledge in Biology: Changes in Our Understanding of the Living Cell through Microscopic Imaging. In: Preprint Series of the Max Planck Institute for the History of Science, Vol. 408, Berlin, 2010, pp. 1–89.
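The electronic contrast enhancement at the heart of VEC microscopy can be sketched in a few lines (a schematic illustration under our own assumptions, not Allen’s original analog video circuitry): the signal is amplified and the black and white thresholds are redefined so that faint differences span the full grayscale range.

```python
import numpy as np

def enhance_contrast(image: np.ndarray, black: float, white: float) -> np.ndarray:
    """Linearly rescale intensities: 'black' maps to 0.0 and 'white' to 1.0.

    Choosing the two thresholds close together around a faint signal
    amplifies small contrast differences, as in fig. 6b.
    """
    stretched = (image - black) / (white - black)
    return np.clip(stretched, 0.0, 1.0)   # values outside the window saturate
```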
b) Super-resolution fluorescence microscopy
For roughly 20 years, different opto-electronic techniques have been developed to overcome the resolution limit for fluorescence microscopy as well. Super-resolution methods resolve objects smaller than approximately 200 nm down to 20 nm, the size of large protein complexes. As good review articles on this topic already exist, we will only briefly mention these methods here. The structured illumination microscopy (SIM) method (fig. 1b) exceeds the classical resolution limit by superimposing a fine grid structure onto the unknown specimen and reconstructing the image. Stimulated emission depletion (STED) microscopy achieves a resolution of less than 20 nm by laser-optically decreasing the diameter of the illumination spot in a confocal laser scanning microscope. The highest resolution can be achieved with photo-activated localization microscopy (PALM) or stochastic optical reconstruction microscopy (STORM). With these stochastic techniques, single fluorescent molecules can be distinguished by switching their photon emission off and on or by separating them according to their photon blinking patterns.16
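The principle behind these localization techniques can be illustrated with a toy sketch (ours; real localization software fits a point spread function rather than taking a simple centroid): although a single molecule’s diffraction image is some 250 nm wide, the position of its center can be estimated far more precisely from the recorded photon distribution.

```python
import numpy as np

def localize(spot: np.ndarray, pixel_size_nm: float = 100.0) -> tuple[float, float]:
    """Estimate a single fluorophore's position as the intensity centroid.

    'spot' is a small camera image of one blinking molecule; the centroid
    pins down its center far below the width of the Airy disc itself.
    """
    ys, xs = np.indices(spot.shape)
    total = spot.sum()
    y_nm = float((ys * spot).sum() / total) * pixel_size_nm
    x_nm = float((xs * spot).sum() / total) * pixel_size_nm
    return y_nm, x_nm
```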
Objectivity

Selectivity and the region-of-interest problem

Selectivity plays an important role, starting with the physiological process of visual perception between the observer’s retina and brain.17 In his essay on the relationship between vision and perception, Wolf Singer demonstrated the physiological problems associated with the mental uptake of optical information.18 It is surprising how little aware we are of the autonomy of our visual system and of the large extent to which pre-existing information merges into the perception of what we consider to be true. Our brain prefers to offer us information that is already familiar from experience, or that it considers striking or important, so as to make visual perception particularly efficient and informative. This particular kind of subjectivity is developed in early childhood and influences the interpretation of optical information; but by doing so, it improves visual orientation in the macroscopic world.

16 Lothar Schermelleh, Rainer Heintzmann, Heinrich Leonhardt: A Guide to Super-Resolution Fluorescence Microscopy. In: J. Cell Biol., 190, 2010, pp. 167–175; Siegfried Weisenburger, Vahid Sandoghdar: Light Microscopy: An Ongoing Contemporary Revolution. In: Contemp. Phys., 56, 2015, pp. 123–143.
17 Christoph von Campenhausen: Die Sinne des Menschen, Stuttgart: Thieme, 1993.
18 Wolf Singer: Das Bild in uns, vom Bild zur Wahrnehmung. In: Hubert Burda, Christa Maar (eds.): Iconic Turn. Die neue Macht der Bilder, Cologne: DuMont-Literatur-und-Kunst-Verlag, 2004, pp. 56–76.
Microscopists striving for objectivity must therefore keep this in mind and always ask themselves whether their brain might have perceived an image intuitively, in other words, subjectively. In order to observe complex situations responsibly and in depth, we need to consider that the brain may suppress information that it deems unimportant, for instance, if it does not correspond to the objects we are looking for or to our hypotheses about them.19 It seems that we need a hypothesis to avoid overlooking things, but at the same time we must try not to overlook things that are (as yet) unrelated to our hypothesis. Objective imaging begins with an unbiased selection of a representative region of the specimen under investigation. The importance of this step increases with the magnification of the microscope. One cubic centimeter of tissue would provide material for five to ten million thin sections for electron microscopy, each of which would yield roughly 1,000 different images. This implies that, to date, all the laboratories in the world combined may only have imaged a few cubic centimeters of biological material in detail with this technique. Since a high number of images of one single cell can likewise be taken at high magnification with light microscopes, microscopic imaging requires a working hypothesis, concepts, and prior knowledge. These must serve as the basis for an educated selection of the proper region of the specimen to be studied, from which informative, representative, and therefore true images are to be obtained. Random or automated recording of images would be useless for answering scientific questions. When evaluating the reliability of a scientific image, microscopists must therefore take into consideration how our brain handles visual information just as much as the methodical circumstances under which the image was generated.

Theory-driven or biased?
Early in the history of microscopy, microscopists became aware of the above issues and realized that they needed a concept in order to identify the objects of investigation and select the regions appropriate for investigation in a particular context; otherwise they would get lost in the endless world of the microcosm. Theory-driven and object-related selective visual attention was thus considered necessary. At the same time, they attempted to avoid overlooking objects or processes that appear unrelated to the working hypothesis, but also to recognize and filter out aspects that were the result of preconceptions.

19 Herbert Hagendorf, Josef Krummenacher, Hermann-Joseph Müller, Torsten Schubert: Wahrnehmung und Aufmerksamkeit. Chapter 15: Selektive Aufmerksamkeit, Heidelberg: Springer, 2011, pp. 179–201.
With this in mind, Henry Baker (1698–1774) admonished his fellow microscopists: “When you look through the microscope, shake off all prejudice, nor harbour any favourite opinions; for, iff [sic!] you do, ’tis not unlikely fancy will betray you into error, and make you see what you wish to see.” 20 The history of microscopy contains numerous examples of microscopists failing in the balancing act between too weak a working hypothesis, which led them to overlook important information, and too strong a focus on a working hypothesis, which led to the overinterpretation of image information and the biased confirmation of their pet hypotheses.21

Educated selectivity: The virtue of objectivity in classical microscopy
If an image has been created through these multiple interventions in the specimen and in the optical and electronic imaging procedures, the question arises whether it can be objective at all. Lorraine Daston and Peter Galison have presented an in-depth analysis of the question of the objectivity of epistemic, predominantly macroscopic images in scientific atlases and catalogues.22 They describe the basic types of epistemic virtues that were proposed for the imaging process and their successive emergence over the past centuries. The first of these virtues, truth-to-nature, emerged in the early 18th century. Generalization and idealization were considered legitimate means to depict true images. This virtue was followed by the virtue of mechanical objectivity between the 1830s and 1930s, according to which it was considered appropriate to let nature speak for itself. Automated imaging methods, such as photography, were called for to reduce human influence on the images as much as possible. The first third of the 20th century then saw growing unease with mechanical reproduction, and the epistemic virtue of trained judgment emerged.23
20 Henry Baker: The Microscope Made Easy, 2nd ed., London: Dodsley, Cooper and Cuff, 1743, p. 62. 21 For examples, see Breidenmoser et al. (s. fn. 15). 22 Lorraine Daston, Peter Galison: Objectivity, New York: Zone Books, 2007, p. 501. 23 Daston, Galison (s. fn. 22), pp. 309–361.
If we apply Carlo Ginzburg’s metaphor of reading traces,24 which is cited extensively in the cultural history of images, to the microscopic imaging process, we can say that, in microscopy, the traces of the object created by physical and chemical techniques need to be collected, read, and evaluated in order to approximate an objective image. While Ginzburg’s methodological paradigm aims to include all usable traces, our focus here is on the identification and educated exclusion of irrelevant traces in microscopy. To this end, the leading microscopists of the time, Pieter Harting, Hermann Schacht, Lionel S. Beale, Leopold Dippel, Carl Wilhelm von Nägeli, and others, developed their own rules between 1830 and 1850. These rules were collected in their handbooks on microscopy (first editions 1850, 1851, 1857, 1867, and 1867) as a counter to the dominant virtue of mechanical objectivity and with the intention of replacing it. They argued that the specimen should be thoroughly investigated under the microscope and then recorded in a drawing.25 The drawings should restrict themselves to those aspects stemming from the specimen itself, while all other aspects, although visible, should be excluded as unwanted artifacts on the basis of experience and prior knowledge. The following five demands were advanced for the generation of objective microscopic images:

1. As seeing with the microscope differs fundamentally from everyday vision, special training in microscopic vision was called for. This would allow investigators to obtain a three-dimensional understanding of the specimen from the two-dimensional optical planes seen in the microscope. It should also enable them to recognize optical and chemical artifacts, deceptions, and image components originating from outside the focal plane, and to judge whether the selected part of the specimen is representative.26
24 Carlo Ginzburg, Anna Davin: Morelli, Freud and Sherlock Holmes: Clues and Scientific Method. In: History Workshop, 9, 1980, pp. 5–36. 25 Jutta Schickore: Fixierung mikroskopischer Beobachtungen. Zeichnung, Dauerpräparat, Mikrofotografie. In: Peter Geimer (ed.): Ordnungen der Sichtbarkeit, Fotografie in Wissenschaft, Kunst und Technologie, Frankfurt am Main: Suhrkamp, 2002, pp. 285–310. 26 Leopold Dippel: Das Mikroskop und seine Anwendung, Erster Theil, 1st. ed., Braunschweig: Vieweg, 1867, pp. 305ff. “Eigenthümlichkeit des mikroskopischen Sehens”; Carl Nägeli, Simon Schwendener: Das Mikroskop. Theorie und Anwendung desselben. 2nd ed., Leipzig: Engelmann, 1877, pp. 167–168 and pp. 188‑247.
2. Before the drawings were made, the specimen should be explored and the results combined with existing knowledge. The specimen should be investigated under a variety of well-chosen conditions (different kinds of illumination, fixation techniques, staining methods, contrasting methods, focus planes, etc.) to achieve a comprehensive understanding of it.27

3. Only aspects that were recognized as parts of the specimen as a result of the investigation should be recorded in the drawing, whilst all others should be eliminated by active exclusion. This method of educated selectivity made it possible to avoid unintended artifacts, optical illusions, and fallacies. Idealization and abstraction in the sense of Daston and Galison’s virtue of truth-to-nature were rejected as improper. The microscopists of the time proclaimed that truth-to-nature (Naturtreue) could only be achieved by drawings created according to their rules.28

4. The elimination of particular image information required the creation of hand drawings, which were later transferred to copper or stone engravings.29 In contrast to scientific fields dealing with macroscopic objects, microscopists for the most part rejected mechanical objectivity through photography, since photography does not eliminate anything and consequently depicts useful and useless information in the same manner instead of producing a true image of the specimen.30 Harting wrote in 1866 on photography: “It is this excessive faithfulness that renders such images not only unclear, but also untrue.” 31

5. It should be the microscopist’s responsibility to provide the observer with an understandable image that results from scientific examination of the specimen, contains all relevant traces, avoids irrelevant ones, and spares the observer the trouble of having to investigate the object of interest himself, which may not be accessible to him.32

27 Pieter Harting: Das Mikroskop. Theorie, Gebrauch, Geschichte und gegenwärtiger Zustand desselben. Zweiter Band: Gebrauch des Mikroskopes und Behandlung mikroskopischer Objecte, 2nd ed., Braunschweig: Vieweg und Sohn, 1866, pp. 14–15; Dippel (s. fn. 26), pp. 306ff. “Methoden der mikroskopischen Beobachtung”; Lionel S. Beale: How to Work with the Microscope, London: Harrison, 1868, pp. 187ff.
28 Dippel (s. fn. 26), p. 306, pp. 308–317; Harting (s. fn. 27), p. 276; Beale (s. fn. 27), pp. 191–195; Schacht: Das Mikroskop und seine Anwendung, 3rd ed., Berlin: Müller, 1862, p. 273.
29 Dippel (s. fn. 26), pp. 456–468; Harting (s. fn. 27), pp. 276ff.; Beale (s. fn. 27), p. 33; Schacht (s. fn. 28), pp. 267 and 272.
30 Harting (s. fn. 27), p. 276, pp. 281–294.
31 Harting (s. fn. 27), p. 276. Translation by the authors.
These requirements clearly show that the microscopic community had, as early as 150 years ago, thoroughly dealt with the question of how objective images are to be obtained and had developed its own epistemic virtue to this end. The practice that they established also takes into account the fact that only a very small number of regions can be selected from the immense number of possible views at high magnification. Furthermore, the depiction should be based on the careful choice of appropriate interventions for imaging and on a selective drawing process. For this image practice, we want to introduce, as noted above, the new term virtue of educated selectivity.

The virtue of educated selectivity today?
While light microscopy is undergoing a continuing revolution in which new technologies emerge at ever-increasing frequency,33 the pressing question remains as to which epistemic virtue modern microscopy might follow. To our knowledge, written rules do not exist; nor are epistemic virtues a much-discussed subject among professional microscopists today. However, it is evident that essential aspects of educated selectivity are still followed very effectively today, even if in part only by tradition. Nowadays, microscopic images are taken either with analog video cameras or with digital CCD or CMOS chip cameras, and recorded pixel by pixel. Images have thus become intensity maps that result from the transformation of photons into electrons and back again, from electrons into the photons that we observe on screen. Drawing is no longer popular, but the possibilities presented by image processing allow us to follow a large part of the traditionally required imaging rules in digital form. The focusing through the object plane by plane that was called for in order to achieve a complete understanding of the object, and the elimination of, for example, fluorescent light from out-of-focus planes, are achieved today by confocal microscopy. The elimination of image components that do not belong to the specimen itself is solved today in a particularly elegant way with the help of video microscopy, using digital background subtraction (fig. 8).

32 Dippel (s. fn. 26), p. 458; Schacht (s. fn. 28), p. 272: “[…] was […] Zeichnung sein soll: ein getreues Bild der Natur, aber keine subjektive Vorstellung”.
33 Weisenburger, Sandoghdar (s. fn. 16).
First, an image is recorded just below the focal plane; it contains only the unintended artifacts, such as dust particles and uneven illumination (fig. 8c). This background image is stored and then subtracted from all incoming pictures from the camera (fig. 8b), so that the resulting images contain only the details of the object itself (fig. 8d). In this way, even the weakest traces of an object concealed by disturbing artifacts can be made visible, as sketched in the code example below. It is often assumed that images recorded on silver or paper film were in principle true traces of the specimen, whereas digitally recorded or processed images should rarely be considered true because they are digitally edited in most cases and frequently subject to alterations. Anyone familiar with darkroom practices knows how utterly wrong this is, because the falsification or elimination of image components can be practiced in the darkroom, too. All other microscopic methods are similarly subject to interventions and can thus be misused to contaminate seemingly objective images. This attitude of general suspicion is typical of those who still adhere to mechanical objectivity and doubt that image processing will ever be applied responsibly in microscopy. Digital techniques are in fact vitally important to microscopy precisely because, if used responsibly, they can visualize traces that would otherwise remain invisible or shrouded by unwanted artifacts. In contrast to digitally generated images, digitally distorted images are commonly excluded from publications. As part of their peer-review process, high-ranking journals require researchers to disclose the editing steps performed and to submit their images additionally in the unedited (RAW) format.34 Contrast enhancement is permitted for improved clarity, whereas falsification by adding or removing particular details or by postprocessing only certain areas of the image is considered fraudulent, and such images are excluded from publication. All of the physical, chemical, biological, analog, and digital steps in microscopic image generation are applied as intended interventions, turning images of the object into intended artifacts ab initio. Whether the images are true representations or are influenced by the desire to prove false theories is subject to careful evaluation by a responsible scientific community. Validation is a necessary prerequisite for ensuring the objectivity of microscopic images and the longevity of the conclusions drawn from them. Validation is achieved by testing the compatibility of the images with observations from other scientific fields, by independent biophysical or biochemical methods, by parallel investigations with alternative microscopic methods, and, especially in the case of super-resolution techniques, by correlative electron microscopy.

34 www.councilscienceeditors.org/resource-library/editorial-policies/white-paper-on-publicationethics/3–4-digital-images-and-misconduct, acc. 09–2016.
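A minimal sketch of this background subtraction (our illustration with array placeholders, not the authors’ video system):

```python
import numpy as np

def subtract_background(frame: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Remove fixed artifacts (dust, uneven illumination) from a live frame.

    'background' is the stored out-of-focus image (fig. 8c); subtracting it
    from each incoming camera frame leaves only traces of the specimen.
    """
    difference = frame.astype(np.float64) - background.astype(np.float64)
    return np.clip(difference, 0.0, None)   # negative residues carry no signal
```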
8: Elimination of all image details that do not belong to the specimen by video microscopy using a completely transparent thin section of fixed striated muscle tissue as an example. a: In-focus image with very low contrast as seen in the eyepiece of the microscope. b: Electronic contrast enhancement of the same specimen as seen on a video screen. Dust and fibers disturb the image; the specimen is barely visible along the right margin. c: The background image recorded in a focal plane outside the specimen shows only the unintended artifacts. d: Subtraction of the background image selectively reveals the muscle specimen (image width 15 µm).
In addition, electronic communication media make epistemic images immediately accessible worldwide, so that they are subject to evaluation by thousands of peers; this reduces the danger of deceiving the scientific community with incorrectly produced or misinterpreted images and is a further advantage of the digital era for microscopy. However, the further we proceed into the previously invisible microscopic world, the greater the distance to the actual physical object becomes,35 and the greater the need for prior knowledge of imaging methods in order to understand the meaning of the images correctly. While microscopists themselves may follow the epistemic virtue of educated selectivity, problems might arise on the side of less informed observers. Since they are often not trained in observing and interpreting contemporary epistemic images, and may be unaware of the context of technical image generation, they can easily fall victim to false interpretations. DIC microscopy, for instance, creates an artificial relief-like shading around an object that, despite its appearance, reveals nothing about the true height of the depicted objects (figs. 2+5d). Educating the audience in the correct interpretation of epistemic images could therefore be another virtue – perhaps not an epistemic one – that microscopists should dedicate themselves to.

35 Horst Bredekamp: In der Tiefe die Künstlichkeit. Das Prinzip der bildaktiven Disjunktion. In: Horst Bredekamp, John Michael Krois (eds.): Sehen und Handeln, Berlin: Akademie Verlag, 2011, pp. 206–224.
We have shown, firstly, that microscopic imaging is a complex process based on the interference of diffracted light; secondly, that there is a multitude of possible interventions in the imaging process; and thirdly, that microscopists since 1850 have considered it important to actively make a set of choices in order to obtain true and meaningful images. If the selection process is based on prior knowledge, which may also include non-microscopic knowledge, and on the experience of individuals trained in microscopy, the resulting image is as free as possible of extrinsic details and unintended artifacts. Such images consequently represent neither a subjective idealization nor a mechanical reproduction. Instead, such a collection of traces can be understood as a true image of the specimen. Daston and Galison describe the shift from the virtue of mechanical objectivity to the virtue of trained judgment that occurred between 1900 and 1930: “Within the first third of the twentieth century, new possibilities emerged as the self-abnegating scientists and their various modes of automatic registration began to yield to scientists who worked with highly sophisticated instruments but were, nonetheless, proud of their well-honed judgments in the formation and use of images.” 36 There are similarities between the virtue of trained judgment and the virtue of educated selectivity in microscopy described in this article. It would appear that microscopists anticipated by about 100 years questions that only became relevant to macroscopic epistemic imaging in the 20th century, when scientists began to work in all fields of imaging with increasingly complex apparatuses and to edit images specifically for presentation. By using the term educated selectivity, we want to highlight a means of obtaining objective epistemic images that was developed by microscopists over 150 years ago and that has retained its validity to this day, even after the introduction of digital microscopic imaging techniques and super-resolution methods.

This article was supported by grants from the Mecklenburg-Vorpommern Regional Excellence Funding Program and the Mecklenburg-Vorpommern Research Fund to DGW and GJ. We are grateful to Fynn Ole Engler, Julia O. M. Lippmann, and Bettina Bock von Wülfingen for their fruitful discussions and valuable comments.
36 Daston, Galison (s. fn. 22), p. 319.
Soraya de Chadarevian
“It is not enough, in order to understand the Book of Nature, to turn over the pages looking at the pictures. Painful though it may be, it will be necessary to learn to read the text.” Visual Evidence in the Life Sciences, c. 1960

The quote in the title comes from Francis Crick, a key proponent of the new molecular biology in the post-WWII era. The target of Crick’s not-so-subtle attack was cytogeneticists, who supposedly spent their time poring over microscope images of chromosomes. Crick was not uninterested in the structure and function of chromosomes, the key subject of cytogeneticists’ toil. On the contrary, in the early 1970s he had declared the structure of the chromosomes of higher organisms to be “probably the major unresolved problem in biology today.” 1 He himself had published an article in which he boldly proposed a “general model for the structure of chromosomes of higher organisms.” It suggested that intricately folded regions containing the large regulating portions of the DNA molecule were followed by extended stretches containing the shorter coding regions. He used diagrammatic sketches accompanied by page-long captions to explain the various aspects of the model, which he described as “speculative” but “logically coherent” and “compatible with a large amount of experimental data obtained using very different techniques.” 2 The model did not hold up under the critical scrutiny of some of his colleagues. Nevertheless, it was presumably on the strength of his work on chromosomes that Crick was invited to comment on the proceedings of the Chromosome Conference, a long-standing gathering of chromosome researchers, which he attended in 1977. It was on this occasion – and with characteristic irreverence towards the conveners of the meeting – that he made his critical remarks. This was not the first or the only time that Crick expressed disdain towards microscopists. When Crick, who started his biological career under the guidance of Arthur Hughes, a cell biologist, was asked about his former teacher, he summed up his response saying, “Well, you see, he was a microscopist.” 3
1 Francis H. C. Crick: The Double Helix. A Personal View. In: Nature, 248, 1974, p. 767. 2 Francis H. C. Crick: General Model for the Chromosomes of Higher Organisms. In: Nature, 234, 1971, p. 27. 3 Quoted in Robert Olby: Francis Crick. Hunter of Life’s Secrets, Cold Spring Harbor, NY: Cold Spring Harbor Laboratory Press, 2009, p. 78.
Crick’s biographer surmised that, coming from Crick, this was a rather “damning remark.” 4 Crick was not alone in his skepticism about visual evidence. Indeed, there is a long tradition, especially in the physical sciences, of spurning images in favor of formal mathematical and law-like structures.5 Without doubt, molecular biologists relied on visual evidence in their work – or so it appears. It suffices here to recall the role the X-ray diffraction image of DNA (now known as photograph 51) taken by Rosalind Franklin and her assistant Raymond Gosling played in the work that led to Watson and Crick’s proposal of the double helical structure of DNA. The two researchers’ (unacknowledged) use of the photograph has been widely condemned. Such a polemic would not have ensued if the picture had not provided useful evidence. Indeed, to the trained crystallographer the picture pointed to a helical conformation – even if Crick was quick to note that Watson, the only one of the two researchers to see the picture, did not have sufficient mathematical and crystallographic knowledge to fully grasp its meaning.6 This raises the questions: What distinguished crystallographic images from microscope images and their respective use by molecular biologists and cytogeneticists? For that matter, what is an image? And which images provide acceptable evidence for whom, and why? To answer these questions, it will be useful to move beyond the polemical statements and examine in greater depth the visual practices of both chromosome researchers and molecular biologists. This essay begins by examining the production and uses of images in cytogenetics around 1960. It then investigates the role of visual evidence in molecular biology. Here the focus is on protein crystallography, Crick’s own specialty and a key approach in the early history of molecular biology. The final section briefly considers the current excitement around high-resolution microscopy in the molecular life sciences and reflects on the place of visual evidence in current data practices.

Microscopy and visual evidence
Techniques for making chromosomes visible, including cell preparation, staining, trained microscopic observation, drawing, photographing, and the ordered arrangement of chromosomes, were constitutive steps in the study of chromosomes.

4 Olby (s. fn. 3), p. 78.
5 Lorraine Daston, Peter Galison: Objectivity, New York: Zone, 2007, pp. 253–307.
6 Francis Crick: What Mad Pursuit. A Personal View of Scientific Discovery, London: Weidenfeld and Nicolson, 1989, p. 67.
1: Photomicrographs of human chromosomes.
Chromosome pictures also became recognizable images for a wider public, appearing in clinical settings and in media reports, and even providing the pattern for a Marimekko fabric print. The X and Y chromosomes became particularly iconic. Well into the 1970s, lined-up chromosome sets were the most recognizable images of genetics. Here, however, our focus is on the construction of visual evidence in the laboratory. Without doubt, the chromosome pictures that cytogeneticists were looking at were highly constructed. For instance, to produce their famed images that led to the correction of the number of human chromosomes from 48 to 46, Joe Hin Tjio and Albert Levan at the Cancer Chromosome Laboratory at Lund University set up cell cultures of embryonic tissue, treated the cells with colchicine to stop cell division at a stage when the diffuse chromatin fibers had condensed into compact chromosome structures, added a hypotonic solution to the culture medium to swell the cells, stained the cells, placed the preparations on a slide, and skillfully pressed the covers with their thumbs in a final attempt to spread the chromosomes apart. Chromosome preparations could then be viewed under the microscope, drawn, and photographed. The epistemic value of these two approaches was a matter of dispute between the authors of the paper. Whilst for Levan seeing was intimately connected to the act of drawing, Tjio saw the photomicrographs as providing the decisive evidence. The photomicrographs included in their joint paper showed “the ease with which the counting could be made” and, at least according to Tjio, provided direct evidence of the new chromosome count (fig. 1).7
7 Joe Hin Tjio, Albert Levan: The Chromosome Number of Man. In: Hereditas, 42, 1956, p. 2.
Both camera lucida drawing and photography required special skills. Perhaps not surprisingly, Levan excelled in India ink drawing, while Tjio was a skilled photographer. Far from relying on the “mechanical objectivity” of photography, Tjio continued to manipulate the images in the darkroom.8 Despite Levan’s defense of drawing, developments in the field increasingly favored photography. Photographs (and, to a certain extent, drawings) captured the microscope image and preserved the experimental evidence. They provided proof and invited other scientists to check the evidence. Discussions about chromosome counts took place around photographic images. Nevertheless, chromosomal observation required training, and even such a seemingly basic activity as counting chromosomes was anything but straightforward, as Aryn Martin has shown.9 Indeed, a curious aspect of the confirmation of the new human chromosome count was that previously published photomicrographs were now seen as consistent with the new count of 46. Photographs themselves were subject to further manipulation. Enlarged prints were cut up and the chromosomes ordered according to a standardized scheme agreed upon in specially convened standardization conferences. Photographic slides could be projected onto a screen, facilitating the measurement of the magnified chromosomes. Measuring played a key role in the early practice of human chromosome researchers. The report of the first standardization conference in 1960 intentionally did not include any images. Instead, the centerpiece of the report consisted of a table detailing, for each chromosome, the average length relative to the total length of all the chromosomes in the set, the ratio of the longer arm to the shorter arm, and the length of the shorter arm relative to the overall length of the chromosome (fig. 2). All three measures derive from just two arm lengths per chromosome, as the sketch below illustrates. The prominent human geneticist Lionel Penrose, who did not attend the conference, took the trouble to check all the figures published in the report, finding various arithmetical errors. He regarded measuring as fundamental to the practice of human chromosome research. Thus, through photography, chromosomes became tangible objects that could be manipulated, measured, and sorted.
8 Soraya de Chadarevian: Chromosome Photography and the Human Karyotype. In: Historical Studies in the Natural Sciences, 45, 2015, pp. 115–146. 9 Aryn Martin: Can’t Any Body Count? Counting as an Epistemic Theme in the History of Human Chromosomes. In: Social Studies of Science, 34, 2004, pp. 923–948.
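A hypothetical sketch of the standardization-style quantities described above (our naming and units; the report’s exact conventions may differ):

```python
def karyotype_measures(long_arm: float, short_arm: float, set_length: float) -> dict:
    """Reduce one chromosome to the three measures tabulated in 1960.

    Arm lengths and the total length of the chromosome set are assumed to
    be in the same (arbitrary) units, e.g. millimeters on a projected print.
    """
    length = long_arm + short_arm
    return {
        "relative_length": length / set_length,   # share of the whole set
        "arm_ratio": long_arm / short_arm,        # longer arm vs. shorter arm
        "short_arm_index": short_arm / length,    # shorter arm vs. chromosome
    }
```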
2: Table of chromosome measurements compiled at the standardization meeting in Denver in 1960.
What exactly irritated Crick in cytogeneticists’ use of images? What were images for crystallographers, and which epistemic role did they play in their experimental practice?

X-ray diffraction and the molecular structure of biological molecules
X-ray analysis started with an X-ray diffraction image, obtained by exposing single protein crystals to a beam of X-rays – often for hours (in this respect, X-ray diffraction can be considered a type of microscopy, and vice versa). The result was captured on photographic plates (fig. 3). For crystallographers, these images represented a pattern of spots of various intensities. From the intensities of the spots, indicated by their darkness, and the calculated phases, the distribution of electron density in the molecule could be calculated. Intensities were measured directly from the photographs. Initially, this was done manually by matching the darkness of the spots to a reference scale; later, densitometers were employed to measure the diffraction images. Determining the phases and the electron distribution involved an immense amount of calculation, for which crystallographers made pioneering use of electronic computers. Electron density maps looked very much like topographical maps (fig. 4). To capture the 3D distribution of electron densities, sections were laid through the molecule and contour maps plotted on
3: X-ray diffraction photograph of myoglobin, the first protein whose molecular structure was solved (late 1950s).
translucent sheets, which could be superimposed and illuminated. The map provided evidence for the 3D structure of the molecule, although its precise configuration could only be made out by constructing a 3D model – another kind of visual representation. Here a choice had to be made about which construction material to use, with different kinds of model parts highlighting different aspects of the structure and allowing for different manipulations and further refinements. The crystallographic analysis of a molecule, from diffraction image to 3D model, can thus be reconstructed as a series of transformations of visual representations, with each representation providing evidence for the next step of analysis.10 Given the importance of visual representations in crystallography, was Crick’s dismissal of the pictorial practice of chromosome researchers perhaps just polemical? Or can differences be identified in the way the two communities interpreted and used images?

Images and data
One difference lies in the level of scale. Pitting text against images, Crick may simply have been defending molecular analysis against an analysis on the cellular level amenable to light microscopy. Nevertheless, the terms in which he couched the difference between the two approaches are significant. Epistemically, cytogeneticists honed observational skills, while crystallographers were obsessed with the amount of calculations required to move from one step of analysis to the next (the step from measured intensities and phases to an electron density map is sketched below).
10 See Amelung and Stach, and Weiss, Reichelt and Jirikowski (in this volume).
4: Drawing contours on electron density maps (c. 1957).
Starting with John Kendrew, who presented the first atomic structure of a globular protein in 1957, protein crystallographers saw structure analysis as a huge data-handling problem. Approaching the problem in these terms, they recognized early on the benefits of using electronic computers, then still experimental machines, for calculating and storing data.11 The amount of data processed for the atomic resolution of the first protein structures was unprecedented, a fact that crystallographers never failed to emphasize and that added an aura to their achievements. Like crystallographers, chromosome researchers attempted to harness computers to facilitate the tedious and time-consuming work of karyotyping. Yet their needs lay in the field of pattern recognition. Automation translated the work of the human observer into the following steps for each chromosome: digitization of the visual image, field segmentation, feature extraction, and classification. Computer scientists were trying to tackle similar problems in other fields. One significant example was the challenge of character recognition in both typescript and handwriting. The link between character and chromosome reading is intriguing, as chromosomes (pace Crick) were often compared to letters, metaphorically and otherwise. Yet chromosome recognition proved to pose even greater problems than character recognition, as chromosomes could overlap and lie in different orientations. Character recognition involved no measuring and no interest in the density of letters or in anomalies, both of which were crucial in chromosome analysis.
11 Soraya de Chadarevian: Designs for Life: Molecular Biology after World War II, Cambridge: Cambridge University Press, 2002, pp. 98–135.
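The four steps can be illustrated with a schematic sketch. The following Python fragment is a toy reconstruction for illustration only – not the historical Edinburgh software – and every threshold, feature, and reference value in it is invented:

from scipy import ndimage

def karyotype(image, reference):
    """Classify chromosome blobs in a digitized field.

    image: 2D array of optical densities (step 1, digitization, assumed done).
    reference: dict mapping a class label to an (area, density) prototype;
    in practice such values would come from a standard table.
    """
    # Step 2, field segmentation: separate blobs from the background.
    mask = image > image.mean()        # crude global threshold
    labels, n = ndimage.label(mask)    # connected-component labelling
    assignments = []
    for i in range(1, n + 1):
        blob = labels == i
        # Step 3, feature extraction: area as a length proxy, mean optical density.
        area = int(blob.sum())
        density = float(image[blob].mean())
        # Step 4, classification: nearest prototype in the toy feature space.
        best = min(reference, key=lambda c: (reference[c][0] - area) ** 2
                                            + (reference[c][1] - density) ** 2)
        assignments.append(best)
    return assignments

Even this toy version shows where the difficulties named in the text arise: a global threshold cannot separate overlapping chromosomes, and two crude features cannot reliably distinguish 46 objects.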
5: The Cytoscan for computer-aided chromosome analysis developed by the Pattern Recognition group in Edinburgh. The machine included a computer-linked microscope and a computer screen for editing.
Different algorithms were developed to overcome these problems in chromosome analysis, yet despite the immense amount of work and money invested in this area, humans continued to outperform machines. As one of the leaders of the chromosome automation project in Britain put it: very soon it became clear “that human eyes can do tricks that are damn difficult to mimic with machines.” 12 The machine the group eventually developed and commercialized consisted of an interactive system that combined the scanning capacities of the computer with the pattern recognition capacity of a human operator. It was a self-contained machine with a computer-linked microscope and a screen-based editor (fig. 5). On closer inspection, the distinction between crystallographers and chromosome researchers in their use of computers was not quite as neat as just suggested. On the one hand, many crystallographers bemoaned the absence of a display function in the digital computer and, for a while, preferred to run their calculations on an analog computer, purpose-built to calculate Patterson projections, which displayed the calculated electron density maps on the screen of a cathode ray oscilloscope (fig. 6a+b). This, so its proponents claimed, aided interpretation. In addition to using computers to run calculations, molecular biologists also attempted to harness computers for molecular modelling. The approach, pioneered by Cyrus Levinthal at MIT in the 1960s, attempted to link computer modelling software with interactive graphic display systems to study 3D molecular structures on the screen (fig. 7a+b).13
12 Interview conducted by the author with Denis Rutovitz, Edinburgh, 1 April 2008.
6a: X-Ray Analogue Computer built by Ray Pepinsky, capable of performing crystallographic calculations in real time and displaying the results on a screen (1951). b: Display of an electron density projection on the screen of Pepinsky’s machine.
On the other hand, chromosome researchers hoped that, in addition to speeding up karyotyping, computerized methods for measuring the optical density of chromosomes – a step similar to the use of automated densitometers to measure the density of diffraction spots – might be applied to find aberrations that could not be detected with a light microscope. The method would, so to speak, “weigh” chromosomes rather than measure their length. Nevertheless, on the whole, molecular biologists appealed to images only when they were supported by extensive measurements and calculations. The appeal to numbers to justify visual evidence has not abated – evidence to the contrary notwithstanding. Publications in molecular biology are now awash with colorful illustrations, and super-resolution microscopes that enable live-cell imaging of molecular processes are creating a buzz in the field. Even sequencing, the quintessential textual analysis of DNA, is now based on microscopic techniques. Indeed, the Illumina sequencer, a machine widely used for sequencing, is essentially a fluorescence microscope combined with clever chemistry. Similarly, recent developments enable researchers to follow DNA synthesis in vivo on the screen. Yet these images are the product of much number crunching.
13 Eric Francoeur, Jérôme Segal: From Model Kits to Interactive Computer Graphics. In: Soraya de Chadarevian, Nick Hopwood (eds.): Displaying the Third Dimension. Models in the Sciences, Technology and Medicine, Stanford: Stanford University Press, 2004, pp. 402–429.
7a: Computer workstation with the interactive display system (also known as the ‘Kluge’) used for protein modeling by Cyrus Levinthal at MIT, mid-1960s. b: Detail of the display system and the globe that enabled the operator to move and rotate the structure on the screen.
In the new generation of microscopes, software algorithms that analyze and recompose the images from vast amounts of data are crucial in increasing the optical resolution of the instruments. Colors are also added to the black-and-white fluorescence images via computer algorithms. Bioinformatics, too, increasingly relies on images to visualize data and make them amenable to examination and manipulation. Yet despite this reliance on images, database research is considered theoretical work.14 To paraphrase Crick, we may conclude that historians of images cannot simply rely on what they see. What matters is what the images represent, and for whom.
14 Hallam Stevens: Life Out of Sequence. A Data-Driven History of Bioinformatics, Chicago: University of Chicago Press, 2013, pp. 171–201.
Bettina Bock von Wülfingen
Giving a Theory a Material Body
Staining Technique and the “Autarchy of the Nucleus” since 1876
The zoologist Oscar Hertwig was one of the first cell microscopists interested in conception to use the then new staining techniques. In the dispute over whether the cytoplasm or the nucleus was responsible for initiating and determining conception and heredity, Hertwig used these techniques in 1875 to observe what happened after the fusion of an egg cell and sperm, and to prove that it was the nucleus that played the essential role in the hereditary process. Hertwig’s methodology is typical for cell microscopists of his time, who, in Jutta Schickore’s view, followed an epistemology that valued the researcher’s judgment and ability to synthesize. With the aid of a vertically adjustable microscope stage, they had to be able to produce a spatial image of their observations from a variety of single impressions of a particular object. In addition, Hertwig drew upon an older tradition, namely the temporal ordering of his specimens, thereby reaching beyond the epistemological frame described by Schickore: based on a multitude of individual specimens, he created a sequence of images representing the temporal process after conception. This was only possible because the staining technique not only enabled the differentiation of what was formerly invisible, but also fixed the movements within the cell at a specific point in time. This article presents the following thesis: beginning with Hertwig, among others, during the late 1870s, the staining technique became the established methodology for effective visualization “fit to serve as evidence” [beweiskräftig].1 Thereby, objects and processes that were not affected by staining and were possibly only perceivable through movement disappeared from sight.2 The theory that the cytoplasm played an essential part in the process of conception and heredity was thus probably discarded far too early,3 and not only due to the often described
1 Jutta Schickore: Fixierung mikroskopischer Beobachtungen. Zeichnung, Dauerpräparat, Mikrofotografie. In: Peter Geimer (ed.): Ordnungen der Sichtbarkeit, Fotografie in Wissenschaft, Kunst und Technologie, Frankfurt am Main: Suhrkamp, 2002, pp. 285–310, p. 293. 2 See the analogous argument concerning the relationship between liquids and solids in the daguerreotype as an instrument in physiology presented by Barbara Orland in this volume. 3 The relevance of the cell plasma to conception and inheritance was defended by many biologists from the beginning of this dispute and has gained growing acceptance since the 1990s.
values ascribed to the different genders at that time.4 In addition, the new techniques and the corresponding epistemological frame of values – specifically the fact that the role of the plasma in conception and heredity could not be turned into a chemical body – contributed to the delegitimization of accounts that focused on the role of plasma in heredity and conception.
Microscopy at the end of the 19th century: True to nature through disciplined observation
Early on in the discourse of the Enlightenment, the public visualization of a phenomenon became part of scientific evidence. During the 19th century, the methods of such visualizations became essential elements in cultural and scientific debates,5 while visualization itself became a scientific object. This also applies to the role of the researcher himself,6 whose disciplined and correct approach increasingly became the subject of publications and training courses – in microscopy first in England during the mid-19th century and, from the 1860s, in German-speaking countries as well.7 According to Carlo Ginzburg, by the end of the 19th century it was of the utmost importance, in different cultural arenas and hence also in the sciences, to be able to demonstrate the not-obvious, the inconspicuous, as the material trace of a given phenomenon, and to do so in systematic and reproducible ways.8
4 See on this issue Helga Satzinger: Differenz und Vererbung. Geschlechterordnungen in der Genetik und Hormonforschung 1890–1950, Cologne: Böhlau, 2009. 5 Martin Jay: Downcast Eyes: The Denigration of Vision in Twentieth-Century French Thought, Berkeley: UC Press, 1993; Hans Jonas: Der Adel des Sehens. In: Ralf Konersmann (ed.): Kritik des Sehens, Leipzig: Reclam, 1999, pp. 257–271. 6 This self-mastery corresponds to the ideal of a then male scientist as such (Lorraine Daston, H. Otto Sibum: Introduction: Scientific Personae and Their Histories. In: Science in Context, 16, 1–2, 2003, pp. 1–8). 7 Graeme Gooday: ‘Nature’ in the Laboratory: Domestication and Discipline with the Microscope in Victorian Life Science. In: The British Journal for the History of Science, 24, 3, 1991, pp. 307–341; Soraya de Chadarevian: Sehen und Aufzeichnen in der Botanik des 19. Jahrhunderts. In: Michael Wetzel, Hera Wolf (eds.): Der Entzug der Bilder. Visuelle Realitäten, Munich: Fink, 1994, pp. 121–144; William Coleman: Prussian Pedagogy. Purkyne at Breslau, 1823–1839. In: William Coleman, Frederic L. Holmes (eds.): The Investigative Enterprise, Berkeley: UC Press, 1988, pp. 15–64; Jutta Schickore: The Microscope and the Eye. A History of Reflections, 1740–1870. Chicago, London: University of Chicago Press, 2007. 8 Carlo Ginzburg, Anna Davin: Morelli, Freud and Sherlock Holmes: Clues and Scientific Method. In: History Workshop, 9, 1980, pp. 5–36.
Daston and Galison, however, focus on the production of traces instead of the mere reading of traces:9 in the early 19th century, representations true to nature were achieved by drawing an ideal natural object deemed characteristic, even if it could scarcely be found in such an ideal state in nature itself. By the middle of the century, this ideal of an objectivity pursuing truth to nature was replaced by an ideal of mechanical objectivity. This consisted in enabling nature itself, by means of a variety of recording devices, to show itself on paper or in other media. In the ideal scenario, the scientific observer would be completely replaced by mechanized processes. This ideal, as Schickore shows, does not seem to apply to late 19th-century microscopy. Here we find a different concept of objectivity in visualization and its presentation, demonstrated amongst other examples by the textbook authors Matthias Jacob Schleiden and Leopold Dippel. This concept emerges in the discussion of specimens, drawings, and microphotography; it is an objectivity compatible neither with one which presents an ideal drawn from a multitude of objects, nor with one which pushes the human observer into the background. On the contrary, the observer, armed with his10 microscope, has to be able to use his “productive imagination” 11 to synthesize a 3D object – viewed at different depths but seen in the monocular on only one specific level at a time – so as to reproduce the observed natural experience in its multidimensional complexity. Experienced and skilled as the observer is, he will also be able to eliminate possible errors in the observation or, if that proves impossible, to discuss them in his presentation.
Making even the vanished nucleus visible: Staining and heredity
The zoologist Oscar Hertwig’s methodology is in fact exemplary of Schickore’s descriptions. Hertwig studied with Ernst Haeckel in Jena and with the microscopist Max Schultze in Bonn. Since the 1860s, various scientists had tried to prove that the egg cell and sperm fused during the process of conception, and Hertwig’s study was – according to its reception – able to prove this indisputably.12 A question remained, however, as to what exactly happened to the nuclei during and after conception. The pathologist Leopold Auerbach, for example, described their dissolution in a living object.13 Was it possible that the nuclei fell apart after conception, or did they fuse and persist in the new embryo? Hertwig examined the movements of nuclei and cytoplasm during the summer of 1875 with the aid of sea urchins, which he shook in a bottle to initiate conception. But in a living object, due to the limited resolution and contrast, Hertwig could only observe the nucleus up to the point where the sperm entered the egg cell and after its reappearance a little while later. Hertwig was well aware of the restrictions even in his excellent Zeiss microscope, so he applied the chemical staining method at the very first occasion after its publication in tissue research. It required a great deal of preparation to make the nuclei visible; Hertwig wrote half a page “advising” other researchers on this process.14 The eggs, fertilized in a watchcase, first had to be doused in osmium tetroxide, rinsed, and then stained: “With a fast insertion into Beale’s carmine on the one hand, we avoid the darkening of the egg induced by the application of osmium; on the other hand, the nuclei are dyed red, while the egg yolk [the cytoplasm] is hardly dyed at all and therefore remains translucent. I was thus able to create staining images of convincing clarity.” 15 The observer, well versed in his research, wanted first and foremost merely to convince himself. As the staining chemicals interrupt all processes in the cell at the same time, Hertwig was able to create different fixed states, which he then collected in a sequence of pictures of cellular moments. The movement within the cells was stopped after standardized periods of time. Hertwig claims that none of his colleagues had ever before “meticulously observed and described step-by-step the division in all its phases” 16 in this way. They had, like Auerbach, “not considered the possibility that the nucleus was made invisible in its fresh state.” 17 (fig. 1)
9 Lorraine Daston, Peter Galison: Objectivity. New York: Zone Books, 2007.
10 Regarding the male observer (s. fn. 6).
11 Leopold Dippel: Das Mikroskop und seine Anwendung, Braunschweig: Vieweg, 1867, p. 467. The observer is in these descriptions conceptualized necessarily as a male researcher.
12 Oscar Hertwig: Beiträge zur Kenntniss der Bildung, Befruchtung und Theilung des thierischen Eies. In: Morphologisches Jahrbuch, 1, 1876, pp. 347–432, p. 382. In the history of science, Fol’s publication in the same year is usually considered the first proof of the penetration of the sperm, although contemporary discussions at the end of the 19th and beginning of the 20th century first praised Hertwig’s convincing report on fertilization. The year of publication in the study itself is recorded as 1875.
13 Leopold Auerbach: Organologische Studien 1: zur Charakteristik und Lebensgeschichte der Zellkerne, Breslau: Morgenstern, 1874.
14 Hertwig (s. fn. 12).
15 Hertwig (s. fn. 12).
16 Hertwig (s. fn. 12), p. 419.
17 Hertwig (s. fn. 12), p. 425, emphasis added.
1: Two nuclei advance on each other in the fertilized egg – or they are pushed or pulled towards each other (“Egg, five minutes after conception. Immigration of the nucleus of the sperm […]” Hertwig 1875, appendix, Plate 11, fig. 7).
The introduction of the staining technique coincided with a rising interest in sexual dualism; the respective sexes’ contributions to the conception process became one of the central topics in embryology.18 An earlier theory in this field held that the sperm’s role was merely to immaterially trigger the egg cell’s development. The observation made by Hertwig and others that the nuclei of the sperm and the egg cell approached each other and fused was at first seen as proof that this theory had to be replaced by the idea that both sexes provided the same material contribution to the development of the hereditary material within the cell. However, ten years later, Hertwig embedded his observations within a theory of a sexual division of labor – which Haeckel had first introduced – now for the first time concerning parts within the cell. Hertwig now considered the sperm, the so-called “matter of fertilization equally as a matter of heredity.” 19 “The female cell or egg now has the role of caring for substances necessary for the nourishment and growth of the cytoplasm for a rapid beginning of the development processes […].” 20 And: “We call one the male, the other the female organization, male and female sexual characters.” 21 In contrast to Hertwig and to the zoologists, botanists, and embryologists who similarly believed in such a division of labor – including Eduard Strasburger, August Weismann, and Carl Wilhelm von Nägeli – other researchers maintained that the cytoplasm played the dominant role.
18 Frederick B. Churchill: From Heredity Theory to Vererbung – The Transmission Problem, 1850–1915. In: Isis, 78, 1987, pp. 337–364; Helga Satzinger (s. fn. 4).
19 Oscar Hertwig: Das Problem der Befruchtung und der Isotropie des Eies, eine Theorie der Vererbung. In: Jenaische Zeitschrift für Naturwissenschaft, 18, 11, 1885, pp. 276–434, p. 277.
20 Hertwig (s. fn. 19), pp. 222ff.
21 Hertwig (s. fn. 19), p. 222.
US researchers around Charles Otis Whitman explicitly objected to the labor division described by Hertwig and others. Whitman called it an “anthropomorphic conception.” 22 Running counter to the “autarchy of the cell nucleus,” 23 the embryologist Hans Driesch and the physiologist Max Verworn demonstrated how the centrosome, a radial structure in the cytoplasm, moved the nucleus and, during division, parts of the nucleus through the cell. However, they provided only a few images of the observed cellular material as evidence for their observations. The centrosome, today considered to consist of protein strings, could scarcely be differentiated from the surrounding plasma by staining. Even in 1905, the anatomist and textbook author Ernst Ziegler concluded that centrosomes were hardly visible due to their small size. They could not be stained with nucleus-staining chemicals, “but instead with acid fuchsin, safranin and ferric hematoxylin,” and yet remained “difficult to prove.” 24 In contrast to the centrosome, the nucleus was so easily stained, not only by Beale’s carmine but also by aniline, that in 1882 Walther Flemming named those particles in the nucleus that became visible through staining “chromatin,” derived from the Greek word for color, chroma.25
“Only staining can advance knowledge” 26
In contrast to mere enlargement or the refraction provided by the microscope, staining makes objects visible by first materially producing them, chemically changing them through the addition of the staining materials. Many objects thus became visible even to the naked eye; they could be differentiated and examined for the first time.27 Specifically, it was the staining of pathogenic microorganisms that achieved a long-lasting success in the first trials using staining materials in microscopy. Pathogens had previously only been assumed to exist; the staining of bacteria made them visible to both researchers and the general public.
22 Charles Otis Whitman: The Inadequacy of the Cell-Theory of Development. In: Journal of Morphology, 8, 1, 1893, pp. 639–658, pp. 648ff.
23 Max Verworn: Allgemeine Physiologie, Jena: G. Fischer, 1897, p. 56; Hans Driesch: Analytische Theorie der organischen Entwicklung, Leipzig: W. Engelmann, 1894, pp. 151ff.
24 Ernst Ziegler: Lehrbuch der allgemeinen Pathologie und der pathologischen Anatomie, Jena: Fischer, 1905, p. 299. Hence Flemming and Hertwig, he wrote, derived the spindle figure, which occurs in the nuclei and moves the chromosomes, in analogy to the centrosome, “from the achromatic substance of the nuclear skeleton,” while Strasburger had them arise from the cell plasma (ibid.).
25 Walther Flemming: Zellsubstanz, Kern und Zelltheilung, Leipzig: F. C. W. Vogel, 1882, pp. 99–129.
26 Santiago Ramón y Cajal: Recollections of My Life, transl. by E. Horne Craigie with Juan Cano, Cambridge: MIT Press, [1937] 1996, pp. 526ff.
27 Axel C. Hüntelmann: „Ehrlich färbt am längsten“. Sichtbarmachung bei Paul Ehrlich. In: Berichte zur Wissenschaftsgeschichte, 36, 2013, pp. 354–380.
This procedure was first successfully performed in 1871, but only described in 1874 by Rieder and Weigert as a method for “showing different tissues as qualitatively different in the microscopic image.” 28 As Koch’s postulates about pathogenesis could finally be proven during the 1880s by staining microorganisms, the “staining of pathogenic organisms served as ‘objective’ evidence,” according to Hüntelmann’s historical review of the introduction of tissue-staining chemicals by Paul Ehrlich.29 Together with Koch’s postulates, “the visualized bacteria formed an objective proof which could be discussed and juxtaposed with other competing models independent of place and person.” 30 The fact that staining techniques advanced so quickly in microscopy was, according to Axel Hüntelmann with reference to Ehrlich’s biographer Ernst Bäumler, the result of “historical intersections” 31 of different developments: during the 1860s, earlier staining materials, which had previously been laboriously produced from plants and animals, were replaced by more economical materials derived from tar production. Their staining effects advanced the chemical industry, which had initially emerged from the dye industry. With the increasingly chemical orientation of microscopy – a shift also reflected within heredity research, as can be seen in the work of August Weismann, Carl Wilhelm von Nägeli, and in particular Albert von Kölliker – arguments were more and more often based on chemistry.32 The visualization and proof offered by staining techniques gained such importance that the observation of the living cell lost all scientific relevance. Like many others before him, the neuroanatomist and Nobel Prize winner Ramón y Cajal stated around 1937 that it was only possible “[to] advance in the knowledge of the tissues […] by impregnating or tinting.” 33 Staining, however, made observation of the living object impossible, which in some scientific areas, such as research on the development of neuronal cells, became an obstacle.34
28 Robert Rieder: Carl Weigert in seiner Bedeutung für die medizinische Wissenschaft unserer Zeit. In: Robert Rieder (ed.): Carl Weigert. Gesammelte Abhandlungen, Bd. 1, Berlin: Springer, 1906, pp. 1–132, p. 14.
29 Hüntelmann (s. fn. 27), p. 273.
30 Hüntelmann (s. fn. 27), p. 274.
31 Ernst Bäumler: Paul Ehrlich. Forscher für das Leben, Frankfurt am Main: Societäts-Verlag, 1979, p. 36.
32 Rudolph Albert von Kölliker: Die Bedeutung der Zellenkerne für die Vorgänge der Vererbung. In: Zeitschrift für wissenschaftliche Zoologie, 42, 1895, pp. 1–46.
33 Ramón y Cajal (s. fn. 26), pp. 526ff.
34 Hannah Landecker: New Times for Biology: Nerve Cultures and the Advent of Cellular Life In Vitro. In: Studies in History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences, 33, 4, 2004, pp. 667–694.
The staining technique in the contest of theories
The visualizing process in Hertwig’s work, which convinced him (and his audience) that the nucleus did not dissolve after conception but remained and was essential to heredity, can be broken down into at least four image-generating steps: the specimen, at first an “‘image’ of itself,” 35 becomes – if the staining also involved a chemical fixation – lasting proof that could be communicated to others. Such a proof or form of evidence needed a counterpart: as Hertwig first wanted to “convince” himself with his stained objects, they served as an instrument36 for a “micro feedback,” the term Cynthia Pyle uses for researchers’ communication with themselves. She differentiates this from the “macro feedback” obtained from publications.37 Hertwig’s publications contain a number of lithographs, signed not only with the lithographer’s name but also with his own. The originals of these sketches are never mentioned at all. The images depict the cells as always round, mostly symmetrical, and ordered, free of any organelles in the cytoplasm, giving them a schematized appearance. Since Hertwig, like his contemporaries, does not mention the drawings themselves, we cannot know whether he perceived them as contributing to micro feedback, as a means of gaining insight,38 which could be described as a disciplined reading of traces and which, given its capacity for synthesis, is also superior to microphotography.39 Hertwig at least allows himself, as his methodological descriptions show, to select and interfere. These methodological descriptions, however, refer only to the staining of the objects. The staining gave body to parts of the nucleus that had otherwise been invisible during specific stages of the cell cycle. It created a visual artefact which not only convinced its first observer, Hertwig himself, but also prompted the interest of a wide public audience.40
35 Hans-Jörg Rheinberger: Präparate – ‚Bilder‘ ihrer selbst. Eine bildtheoretische Glosse. In: Bildwelten des Wissens, 1, 2, 2003, pp. 9–19.
36 In Simondon’s sense of the term, instruments are characterized by the fact that they have a feedback structure inherent in them. Gilbert Simondon: Du mode d’existence des objets techniques, Paris: Aubier, 1989.
37 Cynthia M. Pyle: Art as Science: Scientific Illustration, 1490–1670 in Drawing, Woodcut and Copper Plate. In: Endeavour, 24, 2, 2000, pp. 69–75.
38 Cf. Christoph Hoffmann, Barbara Wittmann: Introduction: Knowledge in the Making: Drawing and Writing as Research Techniques. In: Science in Context, 26, 2, 2013, pp. 203–213.
39 Soraya de Chadarevian: Chromosome Photography and the Human Karyotype. In: Historical Studies in the Natural Sciences, 45, 1, 2015, pp. 115–146; Schickore (s. fn. 1).
40 Cf. Axel C. Hüntelmann (s. fn. 27).
Hertwig later admitted that this had been his aim: to prove the theories of Haeckel and Nägeli, who had been unable to provide visual evidence. He needed to demonstrate that the nucleus (defined by him and his predecessors as male) served as the hereditary organ, whilst the cytoplasm (defined as female, since it originated mainly from the egg cell) was the organ of adaptation.41 Competing against this theory, which was based on an easily stainable nucleus, embryologists who believed in the essential role of the centrosome during the hereditary process had few possibilities to prove their theories with the very methods that the moral economy of microscopy valued at the time.42 In her discussion of other microscopists using these very staining methods, Schickore still employs the term “truth to nature” as the criterion for objectivity. In the case studies of microscopy during the last third of the 19th century which she discusses, this term does not, however, imply that the drawing would create an ideal object based on a multitude of objects. It could therefore be asked whether those microscopists who had a movable microscope stage, and hence could perceive the object on different levels and represent their multidimensional observations in one image, were not already following a third form of ideal objectivity. Apart from truth to nature and mechanical objectivity, Daston and Galison assert that another ideal existed around the turn of the century, one that allowed the researcher to include his own discretion and schematization: the ideal of objectivity of “trained judgment,” which is clearly anticipated here in microscopy.
41 Oscar Hertwig (s. fn. 19), pp. 277ff. 42 Observing stained and moving particles in the cell plasma recently became possible in the late 1980s (see Weiss et al. in this volume).
2: Hertwig’s series of drawings of the process of the fecundation of the egg cell (in the first two images Hertwig didn’t use staining, in the others he did).
INTERVIEW
Traces and Patterns
Pictures of Interferences and Collisions in the Physics Lab
A Dialogue between Dr. Anne Dippel and Dr. Lukas Mairhofer
Dippel:
Lukas, I got to know you as a philosopher interested in ways of thinking about nature who therefore decided to study physics. What do you do all day in the laboratory?
Mairhofer:
Here in our Viennese basement, we interfere matter waves – this is actually quite philosophical: We are demonstrating that it is inadequate to describe our world as being constituted by individual, indivisible, and impenetrable particles whose properties are determined prior to any interaction. In our experiments, the building blocks of matter exhibit wave behavior – they can superpose and interfere with one another and even with themselves. This is why we talk about matter waves.
Dippel:
What does the term ‘matter waves’ mean?
Mairhofer:
In 1924, the French physicist Louis de Broglie postulated that what were thought to be indivisible particles are sometimes better described as waves. Soon physicists showed that, under certain circumstances, the movement of electrons, neutrons, and even small clusters of helium cannot be reconciled with localized, impenetrable particles. This seemed to be acceptable to physicists for those micro-objects, but unfortunately the theory does not predict a limit for this strange wave behavior – a point at which the classical world would begin. At the end of the 1990s, the interference of the fullerene C60 was demonstrated; these are large molecules with about 700 times the mass of a neutron. Now we interfere very large and complex molecules like vitamins and polypeptides, and hopefully in the near future we will do the same with the code and building blocks of life – with DNA strands and proteins. However, the process of interference cannot be observed directly. We infer it from the pattern that the molecules form behind the interferometer, the traces that they leave in our detector and that we translate into a scientific picture.
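For reference – the relation is standard textbook physics, not quoted from the dialogue – de Broglie’s postulate assigns to any object of momentum $p = mv$ the wavelength
\[
\lambda_{\mathrm{dB}} = \frac{h}{p} = \frac{h}{mv},
\]
where $h$ is Planck’s constant. The heavier and faster the molecule, the shorter its wavelength, which is one reason interference becomes ever harder to demonstrate for vitamins, polypeptides, or DNA strands.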
Dippel:
If I understand you correctly, it is about changing a whole worldview: For a long time, physics’ idea of nature consisted of the assumption that the smallest units of our universe are indivisible, inseparable – as the term ‘a-tom’ means ‘the un-cuttable’ in ancient Greek, the verb ‘temnein’ meaning something like cutting or separating. As early as the 5th century before Christ, Democritus wrote: “Some given matter may have the appearance of a given color; may appear to us as tasting sweet or bitter – but in reality there are only atoms and empty space.” 1 Early modern physicists and mathematicians, such as Newton or Leibniz, formulated mathematical laws of nature based on these ideas, and experiments were conducted on the basis of these abstract mathematical laws. Through a steady process of falsification and verification, atomistic assumptions were revealed to be false prejudices that cannot properly be linked with insights into the subatomic world. First, the electron was detected. Then it became apparent that light, i.e. photons, is discrete, behaves in a quantized way, and crosses the cosmos in the form of wave-particles. On the subatomic scale, physicists observe entities that appear to be in a relationship of reciprocal interaction with one another. They turn into particles at the moment of measurement, but theoretical calculations lead to the hypothesis that they behave like waves on their way between their source and the point of measurement. This is how the images of interference that you get are explained: the probability density spreads between the source and the detector like the amplitude of a wave. Therefore physicists initially talked about electron clouds. Eventually, they decided to call these objects ‘elementary particles,’ as they can be localized only at the moment of interaction, where they have the character of a particle. In contemporary proton–proton collision experiments, the calculations include the probability that particles can hit each other without necessarily decaying. Particles can penetrate each other without leaving marks.
1 Democritus, fragment 125 (in Galen of Pergamon). In: Harald Fritzsch: The Fundamental Constants, transl. by Gregory Stodolsky, Hackensack, NJ: World Scientific, 2009, p. vi.
Mairhofer:
Well, that is a fascinating question – why do these particles, quarks, atoms, molecules behave like delocalized waves? According to our classical notion of material bodies, this should not be possible. It seems that a necessary condition for this is that they are very well isolated from their environment; this is why we work in an ultrahigh vacuum. Our vacuum chambers are as empty as space is around the moon. We evaporate the molecules into this empty space, for example, by heating them in an oven. There are more advanced methods for bringing them into the gas phase, but in any case, they have to form a dilute beam of non-interacting molecules. The physicist John Fenn described very poetically how the molecular beam method was developed from leaks, which are the most serious problem in vacuum technology: ”Born in leaks, the original sin of vacuum technology, molecular beams are collimated wisps of molecules traversing the chambered void that is their theater […]. On stage for only milliseconds between their entrances and exits, they have captivated an ever-growing audience by the variety and range of their repertoire.” 2
Dippel:
Original sin has quite a biblical dimension. This seems to me a bit exaggerated, but the idea of the ‘chambered void,’ which corresponds to a scene in a theater, sounds very convincing. What exactly is happening there?
Mairhofer:
When the molecules are isolated from one another and from their environment, the motion of their center of mass can no longer be precisely determined – it no longer follows a classical trajectory, but is much better described by a quantum mechanical wave function. For example, you can observe diffraction effects when the matter wave hits an obstacle, just as water waves do. When this obstacle has a regular structure – like a grating – a regular interference pattern forms behind the obstacle. This pattern can, for example, be obtained by collecting the molecules on a glass plate, where we irradiate them with laser light until they start to fluoresce. The fluorescence is then observed under a microscope. This makes each individual molecule visible, but it is only when they are taken together that they form the typical pattern of dark stripes, where only a few molecules arrive, and bright stripes, consisting of many molecules (fig. 1).
2 John Fenn: Foreword. In: Giacinto Scoles (ed.): Atomic and Molecular Beam Methods, Vol 1, New York: Oxford University Press, 1988 and 1992.
In our case, this modulation in the density of molecules cannot be explained by claiming that the molecules were like billiard balls following well-defined trajectories through one of the slits in the grating. But it [the modulation] can be derived from the diffraction of an extended matter wave that passes through several slits and interferes with itself behind the grating, as the wavelets emanating from each slit superpose. Although the pattern is formed by many individual and localized particles, we therefore infer a wave-like behavior. However, we cannot observe that behavior directly – if we try to do so, we only see particles passing through a single slit, and the interference pattern vanishes at that moment.
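For reference, the standard grating condition (textbook notation, not the speakers’ own) locates the bright stripes: constructive interference occurs at angles $\theta_n$ for which
\[
d \sin\theta_n = n\,\lambda_{\mathrm{dB}}, \qquad n = 0, \pm 1, \pm 2, \ldots,
\]
with $d$ the grating period and $\lambda_{\mathrm{dB}}$ the de Broglie wavelength given above.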
Dippel:
How much truth can there be in an observation that is mathematically, medially and materially mediated?
Mairhofer:
Mathematical modeling and theoretical conceptualization play an important role in any experiment in modern physics. In our case, for example, our results can be compared to the predictions of classical physics, which fail, and to the much better-matching predictions of quantum physics. But I do not think that abstraction and modeling are the only basis of our experiments. The truth of a model lies in the practice that it corresponds to. This practice depends on craft and the engineer’s knowledge. If a model can describe and predict the results of our experiments sufficiently well, it is true for our purposes. The physical experiment translates a theoretical model into a technical setup and a measurement procedure.
Dippel:
I have a practical question that has emerged from my fieldwork. I have often observed that you ask yourselves about the possible applications of fundamental research. Which application-oriented technologies could be developed from the findings of quantum physics?
Mairhofer:
Quantum computers, and communication that is protected against eavesdropping. Inertial sensors and very sensitive devices for gravity measurements, which the military wants to use for detecting enemy submarines and oil companies want to use for detecting oil fields. Our experiments are much more peaceful. They enable the determination of the electric, magnetic, and optical properties of molecules – the position of the interference pattern is very sensitive to the external forces acting on the molecules.
1: Fluorescing molecules are collected on a quartz plate where an interference pattern builds up over time.
What is strange here is that, by interfering matter waves, we are investigating properties that are ascribed to the molecule as a particle and that depend on the internal structure of this particle. Particle physics uses a very different approach to study the internal structure of its objects. But you certainly know more about this than I do – after all, you are studying these physicists at CERN. Which is strange, since you are an anthropologist.
Dippel:
CERN is a fascinating place for the humanities and social sciences. Zeeya Merali wrote an article titled “The Large Human Collider,” published in Nature, about the importance of laboratory studies for the social sciences.3 If we look at the high density of collaborations at CERN, it seems to me much more appropriate to speak of ‘Human Bubble Chambers.’ CERN is a heterotopic space in the Foucauldian sense. Social and cultural spheres merge there which often do not seem to be compatible in other areas of our globalized world.4 It is a ‘site of projection,’5 as Sophie Houdart writes.
3 Zeeya Merali: The Large Human Collider. In: Nature, 464, March 2010, pp. 482–484. 4 Michel Foucault: Of Other Spaces. In: Diacritics, 16, Spring 1986, pp. 22–27.
2: Four-lepton invariant mass distributions from the ATLAS experiment. The data are displayed as points and the background expectation as a histogram. Several SM Higgs boson signal contributions are included for different hypothetical Higgs boson masses: background Z+jets and tt̄ (bottom); background ZZ (middle); Higgs boson signal (top). The same process can be visualized for a singular event with Feynman diagrams, as seen in fig. 3.
First and foremost, it is a place where you can find the Faustian dream of modern physics, where people are seeking to understand what holds the world together in its inmost folds. It also mirrors social fears about the destructive powers of the natural sciences in particular, and of humans generally. And it is a highly differentiated workplace for thousands of people who do their daily work there. To use Bourdieu’s terminology, at CERN near Lake Leman you can observe a high density of cultural, symbolic, social, and economic capital. That’s why it would even be possible to install an anthropology department at CERN, and many physicists dream about this, as they told me jokingly during my fieldwork. Among other things, I am studying a cosmopolitan habitus and its local practices. Plus, there are many questions in the field of media theory that can be studied here empirically, from gathering and analyzing big data all the way through to algorithmic knowledge production and its representation.
Mairhofer:
How important are pictures in particle physicists’ work?
Dippel:
On the basis of collision experiments, such as those conducted at the Large Hadron Collider (LHC), knowledge can be gained about the inside of our atoms. Imaginaries rather than images are the key.
5 Sophie Houdart: The Noise of the World: The Apocalypse and the Crazy Farm Scenario, Limn 2013/3, http://limn.it/the-noise-of-the-world-the-apocalypse-and-the-crazy-farm-scenario/, acc. 10–2016.
3: Tree-level Feynman diagram for the gg-initiated production of an H→ZZ→4l candidate.
Today, physicists undertake fundamental research with tools that were initially developed for war projects, such as the Monte Carlo simulation for the hydrogen bomb in Los Alamos.6 Work on collisions at CERN is purely for fundamental scientific purposes; there is no searching for possible applications. Uses for technology developed at CERN are called ‘spin-off effects’: they happen, but they are not the center of its research. Experimental high-energy physics is ahistorical, focusing on the present. It is designed for measurements of Gedanken-images, approximations to what can be called material reality. Here, the detector is the focus of interest; its electronic signals are analyzed in terms of momentum, direction, and interaction region in order to define the signature7 of a particle. What matters is the statistical density of events. This work process includes what is known as event reconstruction, which is only possible through a combination of simulation and data analysis. After data is taken from collisions, a histogram functions like a container to visualize the data: a histogram is a diagrammatic representation of many events. The spatiotemporal developments of events are recorded. It is only through the recursive reappearance of many singular events that a pattern can be recognized (fig. 2+3). Images are the hidden phantasma of the research, not a tool of insight.
6 Richard Rhodes: Dark Sun. The Making of the Hydrogen Bomb, New York: Simon and Schuster, 1995, p. 304.
7 Arpita Roy: Ethnography and Theory of the Signature in Physics. In: Cultural Anthropology, 29, 3, 2014, pp. 479–502.
4: ATLAS event display of an H→ZZ→4l candidate.
Mairhofer:
On the theoretical level, the fundamental entities of particle physics are described as matter waves, just like our molecules. How does the notion of collision come into this model, and what is actually colliding at CERN?
Dippel:
Packages of positively charged hadrons – protons obtained from hydrogen atoms – are accelerated to nearly the speed of light and brought to collision at four different points. Each of these protons consists of two up quarks and one down quark, surrounded by gluons and quark–antiquark pairs. In other words, energies are colliding, and they decay in the second and third generations into other forms of energy. These processes leave traces in the different subdetectors and are recorded and measured in different channels. The images of events emerge from a highly diversified interaction of humans, machines, data, and models. The visualizations that are created from such events serve merely as prostheses, as control images, or are used in outreach to the public (fig. 4+5). At the LHC, traces of events are visualized, but those visualizations are not the object of research itself. And even what has been visualized can only be measured and understood through computer simulations. A non-reconstructed event consists of about 40,000 detector hits.
5: 3D reconstruction of a ZZ candidate (CMS).
The reconstruction of the event is merely a filtering of traces that had been theoretically assumed and simulated beforehand. From the 40,000 point-like detector hits, traces need to be reconstructed. The background – superfluous information, which can be thought of as the interference you hear when listening to the radio – is enormously high.
Mairhofer:
These measurements are only possible because the background is not primarily constituted of white noise, but consists of events that are already understood. This makes it possible to calculate the rates of those events in complex simulations. The expected events are subtracted from the measurement, and it is the difference between the simulation and experimental result that gives the signal.
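A minimal numerical sketch of this subtraction logic – with invented numbers throughout, and no claim to reproduce the actual CERN analysis chain – could look like this in Python:

import numpy as np

# Invariant-mass histogram with invented numbers (bin edges in GeV).
edges = np.linspace(100.0, 160.0, 31)
# Stand-in for measured events; real counts would come from the detector.
measured = np.random.default_rng(0).normal(125.0, 2.0, 300)
data_counts, _ = np.histogram(measured, bins=edges)
# Simulated background expectation per bin (a flat toy model here).
background = np.full(len(edges) - 1, 8.0)
# The difference between measurement and expectation is read as the signal.
signal = data_counts - background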
Dippel:
Besides theory and method, simulation is added as another distinct element within the process of cognition.
Mairhofer:
The model has to be implemented in experimental practice. But on the other hand, this step has to be reversed, and the apparatus that performs this implementation itself has to be translated into a theoretical model.
Dippel:
Exactly. The detector is based on the logic of a counting experiment that requires a significant number of events. But at the same time, it is situated in the tradition of bubble chamber experiments, where traces of particles become visible within a liquid hydrogen medium.
6: Interference pattern of the largest molecules to date for which quantum interference has been demonstrated.
Physicists analyze the speed, decay, momentum, and mass of particles. Since the experiment is about observing atomic processes, it is only possible to extrapolate the probabilities of the events. Prognostics is central to physics’ heuristics. The hits are the results of protons, which ‘meet’ each other as wave functions, and which in turn are called ‘particles.’ They are both particle and wave in one. Particle physics is expanding the cosmos and our language with objects that first existed as ideas and which are no longer ideas, but ‘proven’ material entities. Nevertheless, to define them correctly, you can only use mathematical symbolization. You can know about them, even if it is hard to ‘believe’ in them or to ‘talk’ about them without feeling a sense of skepticism. What is your relationship to the phenomenon of time in your laboratory, and what do you do with the data that you gather in your experiments?
Mairhofer:
It takes the molecules a millisecond to fly through our interferometer; our turbo molecular pumps, which maintain the vacuum, rotate eight hundred times per second; a typical interference scan takes two minutes; lunch break takes an hour, and a few days pass before the chamber that contains the interferometer is pumped down to operating pressure. We do not vent it very often and keep it under vacuum if possible.
We organize our data into interference patterns. As I already mentioned, a somewhat intuitive method is collecting the molecules on a quartz plate at some distance behind the diffraction grating. It is also possible to scan a slit over the molecular beam and count the number of molecules that arrive behind it in a quadrupole or time-of-flight mass spectrometer. This also yields the typical periodic distribution of intensity. Figure 6 shows such an interference pattern. It was produced by scanning a mask over the molecular beam. The x-axis gives the position of the mask, the y-axis the number of molecules arriving in the detector at the respective position. The points are actual measurement data; the curve is a sine fit to that data. The normed difference between the minima and maxima of the count rate gives the interference contrast.8 The interference pattern is always the result of the detection of many thousands of molecules. The position of a single molecule never permits a statement as to whether it behaved as a wave or as a localized particle. It is the pattern, the relationship between the depictions of the individual molecules, from which we infer their behavior in the interferometer.
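The “normed difference” mentioned here has a standard name and form in interferometry; stated for reference (the notation is ours, not the speakers’):
\[
V = \frac{c_{\max} - c_{\min}}{c_{\max} + c_{\min}},
\]
where $c_{\max}$ and $c_{\min}$ are the maximum and minimum count rates of the fitted sine curve. $V = 1$ would be perfect contrast; $V = 0$, no interference at all.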
Dippel:
What makes it so hard to think about wave and particle as one, from a physical and a philosophical perspective, or, in other words, why is the wave-particle dualism such a wondrous and fascinating idea?
Mairhofer:
This is a remarkable difference between my work and your research project at CERN: While we talk about matter waves, particle physicists talk about particles. However, in our detectors we count individual particles, whereas the particle physicists in their theory use wave functions when they describe what happens before detection, right?
Dippel:
The standard model itself is a field theory, which implies particles that can be measured. They are quanta of the field, a certain type of form. Even if wave functions need to be included in LHC physics, the idea of the particle prevails in everyday language while doing the experiments. Scattering theory itself serves to deduce from one particle to another particle. It describes the particles as waves. But when those hits are measured and simulated, spin, momentum, energy, and mass are enough to determine a particle.
8 Sandra Eibenberger et al.: Matter–Wave Interference of Particles Selected from a Molecular Library with Masses Exceeding 10 000 Amu. In: Phys. Chem. Chem. Phys., 15, 2013, pp. 14696– 14700.
You have to imagine it like this: if you shoot a plane wave onto something, then parts of it will be reflected at the surface of the targets and will continue in a given direction. The wave turns into a spherical wave. The hits are like manifestations of those waves at a measurement point. But still, the particle image prevails in the language used in high-energy physics.
Mairhofer:
So you would argue that particle physicists use the concept of matter waves in their model just like us, and that it is the visualizations they make of those experiments that reintroduce the notion of particles? However, these visualizations are intimately connected to the experimental practice that produces them: The data of particle physics consists of trajectories that were initially detected in bubble chambers and nowadays are recorded with very complex and large digital cameras. To our intuition, these trajectories seem to contradict the behavior of a delocalized wave.
Dippel:
That’s why particle terminology dominates physics at the LHC. Experimental high-energy physicists call themselves ‘particle hunters’ and not ‘wave riders.’ That is what you should be called, by the way. The form of the experiment determines whether you are describing waves or particles, and in turn the images physicists create of what they call “nature”. The fundamental conditions of nature reveal the limitations of our perception and descriptions. The closer and more precisely physicists investigate, the stranger and more unfathomable their insights prove to be to the physicists themselves. But it is through this attempt to observe and understand that new words and images emerge, and this is shifting our horizon of perception. Through these kinds of ontological shifts and epistemic arrangements, physics, too, participates in what Ingeborg Bachmann called the “never completely to be completed dream of expression,” because physics is pushing us to the limits of our language and therefore of our being in relation to our environment.9
9 Ingeborg Bachmann, Literatur als Utopie. Frankfurter Vorlesungen V. In: Werke. Vierter Band: Essays, Reden, Vermischte Schriften, Koschel, Christine, u.a. (eds.), München: Piper, 1978, pp. 255–271, p. 268 [Translation by the author].
Ultimately, physics, too, is full of wonder and lost for words before the cosmos. The discipline demonstrates the meaning of signs, and the arbitrariness of their assignment, more clearly than many other fields in the natural sciences. Or, as one of the physicists from CERN told me: ”Nature does not care whether it is a wave or a particle. There is no real wave and no particle. There are just mathematical models, invented by humans. We make them the basis for everything in order to understand the behavior of what we call nature.”
Barbara Orland
Liquid or Globular? On the History of Gestalt-seeing in the Life Sciences of the Early 19th Century
1. The daguerreotype as an extension of the microscope
Alfred Donné (1801–1878), physician and microscopy teacher at the Hôtel Dieu in Paris, was lucky enough to take part in the legendary meeting in the summer of 1839 when Louis Daguerre (1787–1851) presented his invention, the daguerreotype, to the Académie des Sciences. Donné immediately recognized its potential as a documentation technique for micrographs, and within weeks he transformed himself from a scientific correspondent who reported in journals about the event to an explorer of the daguerreotype.1 He promptly came up with a groundbreaking change in the process. For him as a microscopist, the daguerreotype’s weak point was that it only yielded unique pictures. Because each recording resulted in only one picture, Donné simply combined the new technology with the old method of engraving on silver plates. In this way, the image from the daguerreotype plate could be transferred and reproduced with the usual copperplate printing on paper. In the same year he was able to show the scientists of the Académie des Sciences twenty paper prints from a daguerreotype with different images (including the microscopic enlargement of a fly’s eye). Convinced that “the series of experiments, we have employed […] are not far from being exhausted,” 2 he now focused on the tricky combination of the microscope and the daguerreotype. His goal was to present ephemeral traces of living material, which the doctor could discover under a microscope, but was unable to display to a wider audience. As early as February 27, 1840, Donné and his student and colleague Léon Foucault (1819–1868) were able to exhibit the first results of their work.3 They had replaced the eyepiece of a microscope with a sun-coated, light-sensitive plate. This enabled them to project shots of the microstructure of bones and teeth onto the wall. Five years later, in 1845, the two pioneers published numerous histological daguerreotypes in an atlas.4
1 See Steffen Siegel (ed.): Neues Licht. Daguerre, Talbot und die Veröffentlichung der Fotografie im Jahr 1839, Munich: Wilhelm Fink Verlag, 2014, pp. 381ff. 2 Siegel (s. fn. 1), pp. 383–384. 3 Monique Sicard: La Fabrique du Regard. Images de science et appareils de vision (XVe–XXe siècle), Paris: Odile Jacob, 1998, p. 108.
1: Blood cells from humans, camel, and frog, engravings from daguerreotypes. Donné/Foucault: Cours de microscopie, 1845, Table 2, Figures 5–8.
Five years later, in 1845, the two pioneers published numerous histological daguerreotypes in an atlas.4 Specifically, they presented engravings of 20 plates (45 × 32.5 cm), each including four daguerreotypes (magnifications of 200–400×). The pictures showed floating particles in bodily fluids, including blood, pus, milk, and mucus (fig. 1).
4 Alfred Donné, J.-B. Léon Foucault: Anatomie microscopique et physiologique des fluides de l’économie. Atlas exécuté d’après nature, Paris: J.-B. Baillière, 1845. The atlas accompanied the publication Cours de microscopie, a microscopy handbook for medical students: Alfred Donné: Cours de microscopie complémentaire des études médicales. Anatomie microscopique et physiologie des fluides de l’économie, Paris: J.-B. Baillière, 1844. All translations are the author’s.

2. Fluids under the microscope
These first microphotographic images were exceptional not just because they made the elusive traces of hitherto invisible worlds permanent. As “realistic” pictures, they objectified a very specific appearance of the living. As noted above, Donné’s first daguerreotypes provided images of the microscopic structure of liquids. Donné very consciously focused on corpuscles, which seemed to float as independent entities in space. As he stated, “the microscopic examination of fluids can only serve to see its floating solid particles, such as cells in the blood, or the spermatozoa in the semen, and rarely gives us information about the composition of the liquid itself; it is even necessary for the floating solid particles to possess some stability and regularity of forms, because if this is not the case, you risk mistaking them for quite indifferent strange particles.” 5
To study liquid matter under the microscope was not difficult in itself. A drop, placed between two glass plates, produces a film that can be analyzed quite readily. However, organic substances such as blood, lymph, or milk tend to coagulate immediately after they are removed from the body. In the open air their texture – and hence the microscopic image – changes its structural character. As ephemeral and volatile substances, humors cannot be preserved, in contrast to vascular and tissue preparations.6 Changes of consistency “in dying blood,” as the Berlin professor of medicine Carl Heinrich Schultz (1798–1871) remarked in his study on blood circulation, are therefore “the source of most of the contradictions and errors in considering living blood.” 7
But beyond physiologically induced transformations, the viewer also faced an epistemological problem. Only solid structures and individualized particles can be detected under the microscope. Low-viscosity secretions remain colorless and more or less transparent. However, all solid, opaque entities that move within a fluid in an unusual way (e.g. on shaky paths) draw attention to themselves. What is seen is not liquids but things in liquids. In fact, microscopy is a description technology for fixed tissues and structures; it is an object-fixed technique. Undifferentiated liquid substances without definite boundaries resist observation with this scientific tool.
This epistemological property of microscopy considerably influenced the development of anatomy and physiology. As long as anatomists confined themselves to seeing the body with the naked eye, to feeling it and smelling its fumes, they practiced a topographic mode of knowledge that readily confirmed ancient humoral doctrine. No one expected to see flows, as all moisture dissipated and dried at the time of death. Organ tissue, which still felt soft and moist during dissection, appeared to be particularly soaked in blood.
5 Donné: Cours de microscopie (s. fn. 4), p. 10.
6 On preparation history, see Hans-Jörg Rheinberger: Schnittstellen. Instrumente und Objekte im experimentellen Kontext der Wissenschaften vom Leben. In: Helmar Schramm et al. (eds.): Instrumente in Kunst und Wissenschaft. Zur Architektonik kultureller Grenzen im 17. Jahrhundert, Berlin: Akademie Verlag, 2006, pp. 1–20.
7 Carl Heinrich Schultz: Das System der Circulation in seiner Entwicklung durch die Thierreihe und im Menschen, und mit Rücksicht auf die physiologischen Gesetze seiner krankhaften Abweichungen, Stuttgart/Tübingen: Cotta, 1836, p. 7.
For this reason, the anatomist Thomas Bartholin (1616–1680) described the flesh of the intestines as “a pouring [Zugießung] of blood,” whilst declaring the lean flesh dry and therefore “fibrous.” 8 Yet as the microscope made the smallest structures, invisible to the naked eye, the focus of anatomical studies, the nature of the conclusions drawn shifted from humoral-physiological gestalt-seeing to structural analysis. The composition of the single body part became more important. For this reason, Donné considered the microscope to be more than just a representation technology; to him, it was a dissection instrument, comparable to chemical analysis or an autopsy. Its main purpose was to “get to know the physical characteristics of the body and its composition,” and “to differentiate the elementary parts from each other, to recognize their essence and structure that escape the ordinary senses because of their delicacy, and the phenomena that they bring forth.” 9

3. Hydraulic processes
However, the microscope did not immediately set the stage for modern tissue and cell theory. When Antonie van Leeuwenhoek (1632–1723) examined blood, semen, milk, and tears as one of the first scientists to do so in the 1670s, he caused a sensation not because he clarified the composition of fluids, but because of his discovery of hitherto unknown entities. Although anatomists had already come to the view that dark spots in tissue were pores, most spots were considered either to be microorganisms (“animalcules”) or were described as “lively atoms.” 10 The latter often appeared as spheroids or vesicles, called globules. Alternatively, the microscopic shapes had a line-like form and were therefore identified as fibers or tendons. The medium in which the various entities moved – called serum, or blood water – retreated behind this discovery. The liquid itself appeared as nothing but a carrier substance. Instead of thinking about the humores, anatomists – following the discovery of blood circulation by William Harvey (1628) – began to take a close interest in the vessels in which liquids move. Later, they began to investigate even the most delicate parts of the vascular system. Harvey had introduced new challenges to anatomy.
8 Thomas Bartholin: Neu-verbesserte künstliche Zerlegung des menschlichen Leibes … denen Johannis Walaei 2 Send-Schreiben … beygefüget sind, Nürnberg: Hofmann, 1677, p. 7.
9 Donné: Cours de microscopie (s. fn. 4), p. 9.
10 On the history of microscopy see Dieter Gerlach: Geschichte der Mikroskopie, Frankfurt am Main: Harri Deutsch, 2009; on Leeuwenhoek and the beginnings of microscopy in the 17th century: Catherine Wilson: The Invisible World. Early Modern Philosophy and the Invention of the Microscope, Princeton: Princeton University Press, 1995.
For instance, he claimed that blood does not subside somewhere in the periphery but always returns to its starting point; if true, the question then arose as to where the paths of the circulating fluids extend. Marcello Malpighi (1628–1694) and other microanatomists after him drew a picture of a vascular body, consisting of pipes and ducts right down to the finest capillaries, that connected the arterial to the venous system.11 Others described organs as sump strainers, filters, or secretory mechanisms embedded in a flow system; on the one hand, this system was determined by the pressure, elasticity, and velocity of liquids, and on the other, it had a whole series of valves and flaps (glands) that directed the flow of the fluids.12 From the late 17th century until the beginning of the 19th century, anatomical-physiological research was directed at answering three key questions that all revolved around the problem of understanding the circulation of blood and other body juices. First, with the help of the microscope, anatomists explored the paths of blood movement in living body parts of chicks, mice, frogs, salamanders, etc. The intention was to “see” all the arteries and veins, and to observe stagnations or even blockages. Secondly, using ligatures and vascular injections on living animals, they wanted to determine the flow directions and paths of circulation. And thirdly, since tissue formation was conceived as a metamorphosis of substances in the circulation process, anatomists had to find out where urine was separated from the blood, or how blood was transformed into milk within the female breast or elsewhere. To understand these transformations of bodily matter, it seemed indispensable to study the mechanics of the invisible microcirculation. As the microscope was introduced into anatomy at a time when natural philosophy was debating corpuscular modes of matter generation and had rediscovered ancient atomism, the claim that anatomists had seen swimming beads in flowing blood fell on fertile ground.13
11 See Domenico Bertoloni Meli: Marcello Malpighi. Anatomist and Physician, Florence: Olschki, 1997.
12 See Barbara Orland: The Fluid Mechanics of Nutrition. Herman Boerhaave’s Synthesis of Seventeenth-Century Circulation Physiology. In: Barbara Orland, E. C. Spary (eds.): Assimilating Knowledge. Food and Nutrition in Early Modern Physiologies, Special Issue of Studies in History and Philosophy of Biology and Biomedical Science, 43, 2012, 2, pp. 357–369.
13 On globulism, see John V. Pickstone: Globules and Coagula: Concepts of Tissue Formation in the Early Nineteenth Century. In: Journal of the History of Medicine, 28, 1973, 4, pp. 336–356; Jutta Schickore: Error as Historiographical Challenge: The Infamous Globule Hypothesis. In: Giora Hon, Jutta Schickore, Friedrich Steinle (eds.): Going Amiss in Experimental Research, Boston: Springer, 2009, pp. 27–45.
The hypothesis that globules are the smallest particles of life seemed plausible. This is not the place to explore the resulting controversies surrounding microcirculation in the tissues. Using the microscope, it was easy to amalgamate findings from circulation physiology with corpuscular philosophical theories. In our context, it is vital to emphasize that, in the 1820s and 1830s, any kind of descriptive anatomy was still intended not merely to obtain exact information on tissue structure (histology), but rather to understand the processes of tissue formation as part of blood circulation.14 The properties of substances and the conditions of their coming into being and constant transformation were the focus; the classification of tissues, by contrast, was secondary. The Viennese anatomist Joseph Berres (1796–1844) reflected that only after researching for some time with a microscope did the precise delineation and classification of organic elements come to his mind. Having started with the imaging of the vascular network and the peripheral capillaries, he then studied human tissue as districts where “those mysterious workshops existed in which the functions of the life process take place, such as appropriation, secretion, and excretion.” 15 Only then did he find himself forced to develop a classification of “classes and orders” in order “to avoid annoying repetitions and tedious descriptions of varied peripheral vascular conditions” because of the “subtlety and delicacy of the specimen.” 16

4. Metabolic matter
Returning to Alfred Donné and the question of what he expected from medical photomicrography, the foregoing discussion has shown that, at the time when the daguerreotype came to his attention, the terms and concepts of the microworld were by no means settled. Although the microscope had been in use for at least two centuries, histology was a relatively new field of descriptive anatomy. The theory of globules existed in many varieties; an immense number of objects with the smallest deviations were observed. Some microscopists described them as perfectly spherical solid bodies; others saw elliptical bubbles, and yet others described them as pearl necklaces and claimed that the fibers were formed of globules. The historian of science Jutta Schickore has shown that such divergent observations were not merely the result of the technical state of microscopes.
14 See Ann Mylott: The Roots of Cell Theory in Sap, Spores, and Schleiden, Diss. Phil. Univ. of Indiana, 2002.
15 Joseph Berres: Anatomie der mikroskopischen Gebilde des menschlichen Körpers, Wien: Carl Gerold, 1837, p. 6.
16 Berres (s. fn. 15), p. 36.
She argues that the globule hypothesis cannot simply be dismissed as the prehistory of cell theory.17 The epistemic situation was much more complex. Microscopists not only debated the appearance of tissue structures; they also had to interpret their observations physiologically, that is, to explain them as part of the life processes within an organism. In his dissertation Recherches physiologiques et chimico-microscopiques sur les globules du sang, du pus, du mucus et sur ces humeurs de l’œil (1831), Donné showed that the globules visible under the microscope were of diverse natures and that, in most cases, they had to be considered as matter in transition, whose appearance changes depending on the moment in the life process at which they are found.18 Methodologically, he saw himself as a follower of the anatomie élémentaire et globulaire proposed by François-Vincent Raspail (1794–1878), a chemist who claimed to have transferred the chemical laboratory onto a microscope slide.19 Raspail and other physiologists and chemists of the time studied digestion and nutrition as forms of tissue formation.20 In contrast to purely chemical research, on the basis of which researchers assumed that solid tissue structures were formed by fermentation or crystallization of the liquid parts – processes imitated in the laboratory – Raspail and later Donné identified the blood nuclei as globules albumineux, and not as globules organisés. They argued that blood globules were not the basic elements of all living things, as was previously believed; rather, these beads come from the material that the human organism generates during the process of digestion.21 During digestion, they claimed, food is formed into small beads, which initially make fresh blood and finally – as coagulated blood – are transformed into flesh. The food pellets are considerably smaller than blood cells because they have to pass through the walls of blood vessels. Blood globules form fibers; fibers form membranes and tendons, which in turn form the base material for larger body parts (the bone system, brain, muscles, etc.). From a physiological point of view, globules were therefore metabolic matter.
17 Schickore (s. fn. 13).
18 Alfred Donné: Recherches physiologiques et chimico-microscopiques sur les globules du sang, du pus, du mucus et sur ces humeurs de l’œil, thèse de médecine de Paris n° 8, Paris: Impr. Didot jeune, 1831.
19 Raspail is said to be one of the founders of histology. Years before cell theory was proposed by Schleiden and popularized in medicine by Virchow, Raspail talked about cellular laboratories (“La cellule laboratoire”). See Georg Dhom: Geschichte der Histopathologie, Berlin/Heidelberg: Springer, 2001, p. 14.
20 Pickstone (s. fn. 13).
21 Donné: Recherches (s. fn. 18), p. 10.
Their existence could only be explained by the physiology of digestion and nutrition. The quality of food affected the condition of the blood because the life process constantly rearranges bodily substance.22 With the aid of the microscope, Donné and his contemporaries were searching for signs and traces of the life process. This becomes particularly evident in the question of the so-called globules du lait, which Donné studied a few years later in the context of his health-policy work on mothers’ versus nurses’ milk (fig. 2).23 Until that time, no body fluid had resisted chemical-physiological analysis as effectively as milk. In dairy research, the unanimous verdict prevailed that there is no other substance whose properties change as quickly and intensely as those of milk. This was a source of frustration because it seemed impossible to obtain comparative criteria by which to evaluate nurses’ milk. Donné now hoped to have found an evaluation criterion in the varying milk globules. From the moment that he began exploring differences rather than commonalities in milk samples, he felt able to recognize the “more delicate changes” that characterize this substance.24 This shift in perspective created a new problem, which the daguerreotype promised to solve. To get meaningful results from a plethora of comparisons of globules, it was necessary to identify similarities and differences between them, based on pattern recognition in shape, size, color, etc. Drawings appeared much too crude to reflect the many variations of this sensitive substance. While “normal conditions” could be studied with a microscope alone, the many variations revealed by comparing changing or readily decomposing milk required a representation technology for adequate documentation. With the combination of photography and microscopy, Donné hoped to have found a media technology that enabled the documentation of pathological deviations in the form of juices at the globular level, so that they could be compared in detail. This hope was not fulfilled.
22 Donné, Foucault (s. fn. 4), p. 76.
23 Alfred Donné: Du lait, et en particulier de celui des nourrices, considéré sous le rapport de ses bonnes et de ses mauvaises qualités nutritives et de ses altérations, Paris: l’auteur, 1837; see also Ann F. La Berge: Mothers and Infants; Nurses and Nursing: Alfred Donné and the Medicalization of Child Care in Nineteenth-Century France. In: Journal for the History of Medicine, 46, 1991, pp. 20–43.
24 Donné: Cours de microscopie (s. fn. 4), p. 313.
2: Breast milk of “average quality,” engraving from daguerreotype. Alfred Donné: Mothers and Infants, Nurses and Nursing, Boston 1859, Plate 1.
Although he provided comprehensive details of his microscopic milk analysis for the first time in his 1842 publication Conseils aux mères sur la manière d’élever les enfants nouveau-nés (with three editions and four revisions published by 1905), it never contained daguerreotypes of milk – with the exception of the translation for the US market in 1859.25 One can only speculate that the printed images were either too expensive or appeared unsuitable for a wide audience. In any case, Donné’s daguerreotypes did not become accepted as a diagnostic tool for globular pattern recognition, and the microscopic analysis of traces shifted from globules to cells, from the search for minimal particles to the study of elementary organisms.
25 His first book on milk, Du lait et en particulier de celui des nourrices (1837), included a plate with engravings that were not from daguerreotypes, but drawings from microscopic samples.
Marietta Kesting
Traces of Bodies and Operational Portraits
On the Construction of Pictorial Evidence

This article briefly retraces some of the historical events leading to the creation of photographic identity cards with passport photos and fingerprints, and to the present development of DNA forensics, focusing on the transformation of the broad documentation of the body and its visualization. Since when, and how, have bodies been tied to a name, a place of birth, and a country of citizenship by passport photos and fingerprints as an index – or in the future potentially by a DNA trace – with the objective of making a particular body unambiguously recognizable and identifiable?
“The passport is the noblest part of a human being. […] A human being can come into existence anywhere, anytime, in the most careless manner, but never a passport.” 1 Or to switch perspective, as the title of a book on diaspora studies summarized, “God needs no passport.” 2 Human beings, however, are obliged to allow themselves to be identified and need to be recognized – something that even Odysseus struggled with on his return in ancient Greece – otherwise they are not welcome. Nowadays passports usually combine fingerprints and a passport photo. The fingerprint is a contact print, a physical trace, produced by the direct contact of a finger with a surface. As the art historian Bettina Uppenkamp notes, “[…] the ease with which in the imprint of the human hand the gesture results in a figure, contact in a likeness, or to express it semiotically, an index turns into an icon.” 3 Hand imprints are the oldest surviving pictorial symbols. Fingerprints are different for each individual and are useful for unambiguous identification, but this cannot happen at “first sight”; instead it requires the respective person to cooperate or to be arrested.
1 Bertolt Brecht: Flüchtlingsgespräche, Frankfurt am Main: Suhrkamp, 1961, p. 7. [Translation by the author] All translations are the author’s unless otherwise indicated.
2 Peggy Levitt: God Needs No Passport, New York: The New Press, 2007.
3 Bettina Uppenkamp: Der Fingerabdruck als Indiz. Macht, Ohnmacht und künstlerische Markierung. In: Bildwelten des Wissens, 8, 1, 2010, p. 7.
Citizens’ identity cards
The history of mandatory identification starts with the French Revolution in 1789, which declared all male inhabitants of France citizens.4 When war was imminent, passport laws were introduced to control the crossing of state borders, to prevent the nation’s own population from leaving, and to register all army conscripts and distinguish them from “foreigners.” This introduction of mandatory identification is part of the modern disciplining of the population, which Michel Foucault analyzed in detail and described as creating specific laws and regulations for every aspect of people’s lives.5 These events in France therefore mark the creation of the first modern administrative state. Many other European countries followed suit and implemented similar laws, not just “at home” but also in their colonies.

Being recognized
The situation changed completely in the first half of the 19th century with the invention of a number of photographic processes. Physical objects were depicted mechanically through the optical-chemical trace of light. The media and perception theorist Rudolf Arnheim noted: “The physical objects draw their own likeness by means of the optical and chemical functionality of the light.” 6 For the first time it seemed technically viable to record the human face indexically and automatically, without any disturbance or interference. In France, the inventor and photographer André Disdéri obtained a patent for visiting cards that included small photographic portraits. These cartes de visite were affordable and soon became very popular.7 At the same time, the advantages that photography offered for the identification of persons were also recognized. In 1871, hundreds of members of the Paris Commune were identified using photographs and subsequently killed.8
4 The first declaration was only meant for “mature” citizens, meaning women were excluded. The suffragette Olympe de Gouges presented her “Declaration of the Rights of Woman and the Female Citizen” at the National Assembly in 1791. Olympe de Gouges: Schriften, Frankfurt am Main: Stroemfeld/Roter Stern, 1989.
5 Michel Foucault: Discipline and Punish. The Birth of the Prison, New York: Random House, 1975.
6 Rudolf Arnheim: Die Seele in der Silberschicht. Medientheoretische Texte. Photographie – Film – Rundfunk, Frankfurt am Main: Suhrkamp, (1974) 2004, p. 24.
7 Jochen Voigt: Faszination Sammeln. Cartes de Visite. Eine Kulturgeschichte der photographischen Visitenkarte, Chemnitz: Edition Mobilis, 2006.
8 Gisèle Freund: Photographie und Gesellschaft, Reinbek: Rowohlt, 1979, p. 119.
This demonstrates how the recording and storing of the bodies and faces of the population made each individual identifiable by the police. In this historical situation, however, to be recognized amounted to a death sentence.

Government-issued passports with a passport photo
In 1914, Prussia passed a new passport law that for the first time legally required passport photos for identity documents. The issuing authority asked for two passport photos; one was kept on file in the local office, whilst the other was inserted in the identification documents. The photograph was glued onto the official document and stamped on both sides so that it could not be removed undetected. The following text was added: “The owner of the passport is effectively the person represented in the photograph, and he or she has personally signed his or her name below it.” 9 (fig. 1) Here we find a symbiotic relationship between image and text, which are only able to create evidence in conjunction with each other. Furthermore, “wanted” photographs used to apprehend suspects or criminals and passport photographs are genealogically linked.10 The formal prerequisites and the standardized procedure for taking the photo in front of a neutral, bright background aesthetically turn all passport photos into wanted photos, changing the individuals represented into wanted criminals. For this reason, subjects almost always disidentify with their passport photo, often saying, “I don’t really look like this,” when they present it in private. In addition, people having their passport photo taken are acutely aware that they are being photographed. This knowledge is translated into their facial expression. Arnheim described this special looking back: “This is the human being when he is exposed to others’ gaze: He needs a persona and asks himself how he comes across; he is in danger or may receive a fortune – simply due to the fact that he is being looked at.” 11 For this reason, passport photographs seem to mortify their subjects and create self-evidence.
9 Cf. Andreas Reisen: Der Passexpedient. Geschichte der Reisepässe, Baden-Baden: Nomos, 2012, image 65, p. 97 (see fig. 1 here).
10 Cf. Susanne Regener: Bildtechnik und Blicktechnik. In: id.: Fotografische Erfassung: zur Geschichte medialer Konstruktionen des Kriminellen, München: Fink, 1999, pp. 161–167.
11 Rudolf Arnheim (s. fn. 6), p. 27. Cf. also Roland Barthes: Camera Lucida, New York: Farrar, Straus & Giroux, 1981, pp. 10–16, and Allan Sekula: The Body and the Archive. In: October, 39, Winter 1986, pp. 3–64.
1: Identity card, 1932.
They have the status of a visual document, presenting the most factual portrait. A passport or identification document is usually a material object in which photography and citizenship intersect, and that assists the nation-state in producing or constructing the male or female citizen with a fixed nationality and place and date of birth. The passport with a photo has a dual purpose: making every individual identifiable and hence controllable, but also turning each and every individual into a set of data that can be compared with other individuals and organized into groups by religion, age, or other categories of difference in statistical analyses.

Recognizing citizens, foreigners and gender roles
Passport photos can be considered a special kind of operational image. According to the image theorist W. J. T. Mitchell, being addressed by an image that “includes the spectator as the target of pictorial gaze” 12 can be linked to Louis Althusser’s concept of “interpellation.”
12 W. J. T. Mitchell: What Do Pictures Want? The Lives and Loves of Images, Chicago: University of Chicago Press, 2004, p. 49.
In Althusser’s famous example, a policeman calls out, “Hey, you there!” on the street, whereupon the individual hailed turns around, and in doing so becomes a subject “because he has recognized that the hail was ‘really’ addressed to him, and that ‘it was really him who was hailed.’” 13 The next logical step in this scene would be for the policeman to say, “Identify yourself.” In this way, identification documents and police controls construct both citizens and foreigners. Gender theorist Judith Butler reconsidered Althusser’s concept of interpellation with regard to the creation of gender roles. Butler analyzes the act of baptism that turns a human being into a gendered subject (“It’s a girl!”).14 However, Butler criticizes Althusser’s concept of interpellation because it assumes the figure of a sovereign godly voice that is reduced to the moment of its articulation and does not include any possibilities for resistance and rearticulation.15 In fact, a special case illustrating a new articulation of subjectivities can be found even in the history of photographic identification. This is highly significant, as the ID photo together with the passport is usually a normative instrument that does not allow any ambiguities or in-between positionings, and codifies gender, citizenship, and age. This exception is a special identity card that permitted and facilitated the performing of the other gender role, and that was introduced and distributed during the Weimar Republic by the pioneer of the study of sexualities Magnus Hirschfeld. It was called a “transvestite certificate or card.” 16 These exceptions, permitted by the police and the psychiatric establishment, only existed for a few years. As an addendum, the continuation of the history of the identity card under National Socialism should be noted. The National Socialists introduced a new law on domestic mandatory identification on September 10, 1939. The identity card and passport system was actively used to differentiate between Jews and non-Jews, as is well known and has been extensively researched.17
13 Louis Althusser: Ideology and Ideological State Apparatuses. In: id.: Lenin and Philosophy and Other Essays, transl. by Ben Brewster, New York: Monthly Review, 1971, pp. 127–186, quote p. 174. Emphasis in the original.
14 Cf. Judith Butler: Bodies That Matter, New York: Routledge, 1993, p. 232.
15 Butler (s. fn. 14), pp. 81–97.
16 Magnus Hirschfeld Archive. Cf. Magnus Hirschfeld: Transvestiten, Berlin: Verlag Alfred Pulvermacher, 1910, pp. 192–198, and Rainer Herrn: Die falsche Hofdame vor Gericht: Transvestitismus in Psychiatrie und Sexualwissenschaft oder die Regulierung der öffentlichen Kleiderordnung. In: Medizinhistorisches Journal, 2014, 49, pp. 199–236.
17 On this issue, see Götz Aly, Karl-Heinz Roth: Die restlose Erfassung. Volkszählen, Identifizieren, Aussondern im Nationalsozialismus, Frankfurt am Main: Fischer, (1984) 2000.
DNA snapshots & Stranger Vision – Constructing visualizations from physical traces
Since 2007, biometric passport photos have been recorded and stored together with digital fingerprints on a chip integrated in EU passports and German identity cards.18 When entering the United States, foreign citizens are photographed again, and prints of all their fingers are taken. This indicates that the passport photo has lost its privileged role in identification and is now only seen as evidence when combined with other sets of data. The “truth” about a person is no longer visible from the outside when comparing the individual to his/her photograph, but is increasingly located within the individual and searched for by analyzing other physical traces, including DNA. These types of physical traces are invisible to the human eye and have to be extracted, sequenced, and calculated. And yet even in the age of DNA, forensics has not abolished identification images – on the contrary, visualizations similar to passport photos are still widely used when a search is being conducted for an unknown person. Human perception does not function without an image, and the face remains the most recognizable part of a person, in spite of all the difficulties in accurate identification. Here a connection between bodily traces and their imagery is reconstructed that recalls the earliest interactions between people and photographs. The writer Hubert Fichte, a keen observer of the photographic medium, described the “magical beliefs” ascribed to early photographic portraits: “The photograph is considered a particle of the original; it gives one – as fingernails or a lock of hair do – power over the other. Our grandparents carried daguerreotypes of their loved ones on necklaces made of hair, or a photo of their fiancée together with a lock of hair.” 19 It is not possible today to create an accurate image, similar to a photograph, from a lock of hair, but imagery can be created from this physical trace of a specific body. In this way, a missing person can be searched for, and the final validation of his/her identification can be provided by comparing the DNA.
18 For information on electronic passports, see http://www.bmi.bund.de/DE/Themen/Moderne-Verwaltung/Ausweise-Paesse/Reisepass/reisepass.html, acc. 08–2015.
19 Hubert Fichte: Schwarz/Weiß doppelt belichtet. Kleine Chronologie zum Werk des afroamerikanischen Fotografen van der Zee. In: Frankfurter Rundschau, January 12, 1980.
2: Heather Dewey-Hagborg: Stranger Vision.
The American artist Heather Dewey-Hagborg explored this development by collecting DNA traces of strangers in New York City, such as saliva on discarded chewing gum or a cigarette butt, and a strand of hair. At the biohacking lab Genspace, working under scientific supervision, she extracted genetic material from her samples and amplified specific sequences using different primers in polymerase chain reactions.20 She could then search the DNA sequences for markers of individual traits such as eye and hair color, a predisposition to freckles, a tendency toward obesity, and a number of other characteristics.21 She fed the results into 3D modeling software that generated sculptural portrait busts from the data, which were then printed with a 3D printer. The resulting portraits are not a completely accurate match; rather, they show something akin to a family resemblance (figs. 2+3). As a cross-check, the artist had first created a portrait from her own DNA sample; the strong likeness it presented had surprised her.22 Dewey-Hagborg’s piece received a lot of media attention and prompted a discussion about (genetic) surveillance and the protection of personal data, visibility in public spaces, and ethical issues.23
20 Cf. NGO Genspace: http://genspace.org and http://thenewinquiry.com/scificrimedramawithastrongblacklead/, both acc. 08–2015.
21 Dewey-Hagborg documents her process on her blog: https://deweyhagborg.wordpress.com/2013/06/30/technical-details/, acc. 08–2015.
22 Natalie Angley: Artist creates faces from DNA left behind in public. In: CNN, http://edition.cnn.com/2013/09/04/tech/innovation/dna-face-sculptures/, acc. 08–2015.
23 See also Megan Gambino: Creepy or Cool? Portraits Derived from Hair and Gum Found in Public Places. May 3, 2013, http://www.smithsonianmag.com/ist/?next=/science-nature/creepy-orcool-portraits-derived-from-the-dna-in-hair-and-gum-found-in-public-places-50266864/, acc. 08–2015.
3: Heather Dewey-Hagborg: Stranger Vision.
These were exactly the questions she had wanted to provoke, as she had done previously in other works commenting critically on new technologies and scientific methods. Without doubt, the portraits she created differ from passport photos in their aesthetics, formal qualities, and technical aspects, and yet they have a similar function: to trace an individual and make his or her face recognizable. Significantly, the verbal metaphors for describing this new process are also taken from photography: the technique has been termed a “DNA snapshot” and has been commercially trademarked by the American company Parabon NanoLabs, even though the process has nothing in common with the quick photographic snapshot, relying instead on an elaborate calculation process that ultimately retranslates data into visual form (fig. 4).24 “DNA Phenotyping is the prediction of physical appearance from DNA. It can be used to generate leads in cases where there are no suspects or database hits, or to help identify remains. Using in-depth data mining and advanced machine learning, and with support from the US Department of Defense, we have built the Snapshot™ Forensic DNA Phenotyping System, which accurately predicts genetic ancestry, eye color, hair color, skin color, freckling, and face shape in individuals from any ethnic background, even mixed individuals.” 25 Parabon NanoLabs already offers this method commercially, and it is, of course, no surprise that it is supported by the US Department of Defense.
24 Parabon NanoLabs, https://snapshot.parabon-nanolabs.com, acc. 08–2015.
25 Parabon NanoLabs (s. fn. 24).
4: Poster Parabon Nanolabs.
Conclusion: Mis/Recognition
The pursuit of exact recognition and identification still determines the making of passports. Current photographic identification has to adhere to biometric standards and is complemented with digital fingerprints. This is meant to create “a closer congruence between the identity card and its holder,” as EU governments put it.26 The process of creating identities by means of DNA traces is not yet fully viable, but could be used in the future. These technologies are already being applied in their current state in criminal forensics.27 However, they are still dependent on visualizations because human beings have no organs of perception capable of directly perceiving DNA sequences. While trans* and inter* persons have been allowed to change their first names and assigned gender in their passports in Germany since the Transsexual Act came into force in 1981,28 DNA analysis seems to establish another way to reach the genetic and objective “truth” of a person’s gender.29 Moreover, critics fear that DNA snapshots will further strengthen ethnic/racial profiling by the police. “Racial profiling” refers to the more frequent targeting of, and identity checks on, non-white people by the police.
26 http://www.ausweis-app.com/elektronischer-aufenthaltstitel/, acc. 08–2015.
27 Andrew Pollack: Building a Face, and a Case, on DNA. In: New York Times, March 23, 2015.
28 Cf. the German Federal Anti-Discrimination Agency on trans*: http://www.antidiskriminierungsstelle.de/DE/ThemenUndForschung/Geschlecht/Themenjahr_2015/Trans/trans_node.html, acc. 08–2015.
29 For a critical discussion of the history of “objectivity,” see Lorraine Daston, Peter Galison: Objectivity, Cambridge, MA: MIT Press, 2007.
Image theorist Nicholas Mirzoeff describes this practice as creating a “nomadic border” that can be established at any time when a “citizen” looks at a suspicious person who could be an undocumented migrant.30 This nomadic border is grounded in visual differences and seems to insist on essentializing the materiality of the body. As gender theorist Verena Namberger argues, existing structures of racial exclusion are not overcome by neomaterialist conceptions of the body, and yet they are also not simply upheld. Rather: “In the world of cyborgs, genetic codes, and virtuality, the articulations of racism are altered. With the concept of bodies as assemblages, it is possible to analyze how different concepts of race – biological or cultural, analogue or digital – are not superseded, but ‘intra’-act and are sometimes mutually reinforced.” 31 Needless to say, DNA samples can also be forged or altered, tainted or swapped, as is portrayed in the 1997 science fiction film Gattaca.32 The main character, Vincent, whose genes are considered substandard, assumes the identity of Jerome, who was a great athlete before an accident confined him to a wheelchair. In this way, Vincent, who had been conceived “naturally” by his parents, can live his dream of working for the elite space agency. Vincent receives blood and urine samples, skin cells, and a fake fingerprint from Jerome. The film presents possibilities for escaping the seemingly perfect control of subjects and assuming a new identity. “Self-identity is a bad visual system,” 33 stated the techno-science pioneer Donna Haraway as early as 1988, thereby criticizing both the limitations of self-positioning in academic life and the often unreflected power structures of optical apparatuses. Perceptual processes that we believe to be means by which to recognize others and ourselves are always linked to the production of discernible differences. Photographic and similar images and visualizations are thereby always complicit in producing knowledge about the bodies of others. They are part of delineation processes, a function they still hold today, even after photography has lost its privileged position as a trusted producer of evidence and objectivity.
30 Nicholas Mirzoeff: The Right to Look. A Counterhistory of Visuality, Durham, NC: Duke University Press, 2011, p. 282.
31 Verena Namberger: Rassismustheorien und die Materialität des Körpers. In: Tobias Goll et al. (eds.): Critical Matter, Münster: Edition Assemblage, 2013, p. 145. Emphasis as in the original.
32 Andrew Niccol (director): Gattaca, USA, 1997, http://www.imdb.com/title/tt0119177, acc. 08–2015.
33 Donna Haraway: Situated Knowledges. The Science Question in Feminism and the Privilege of Partial Perspective. In: Feminist Studies, 1988, 14, 3, p. 585.
At the same time, it is impossible to frame subjects in a fixed position – as we have seen in the Weimar Republic, where, just as modern citizens were created, subcultures and non-normative modes of subjectivation simultaneously emerged. For this reason, subjects will always find modes of rearticulating imagery that is saturated with (state) power so as to criticize and politically subvert it.
Sophia Kunze
Reduced Complexity or Essentialism?
Medical Knowledge and “Reading Traces” in the History of Art

For a long time, the humanist exploration of the body and its representation in images has been a point of collaboration and interdependence between the arts and medical/anatomical studies. As a result, there is a scientific tradition of looking at certain artistic images from a medico-historical perspective.1 These images show:
1. in the broadest sense, a medical treatment, often presented as a genre scene;
2. the representation of illness, such as Christian representations of plague saints;
3. the representation of so-called “medical oddities”;
4. anatomical/medical illustrations which serve a specific educational purpose;
5. no specific topic; rather, the images are created by the “sick” themselves (studies then mostly focus on the artists’ psychological state and its effects on their work).2
Furthermore, every image that shows a human being, and hence a human body, could potentially become the subject of a medico-historical analysis. Anatomical science has a traditional link to the visual arts, as the genesis of anatomical drawing requires an interdisciplinary exchange. These drawings require a medical scientist, who researches body structure and physiological processes, as well as a technically and visually accomplished artist who can translate the subject into an understandable and compelling – in the sense of “evident” – image. The strategies chosen to accomplish this goal vary depending on the period, context, and established assumptions. In early anatomical prints and studies, for example, a direct transfer from Christian iconography and composition can often be seen; later in the process of establishing medical knowledge, we find a shift towards – from a modern perspective – a more “objective” representation.3 On the one hand, artists are part of the production of medical images; on the other, anatomical studies themselves form the basis for artistic works.
1 Boris Röhrl: History and Bibliography of Artistic Anatomy. Didactics for Depicting the Human Figure, Hildesheim: Olms, 2000; Karl Eduard Rothschuh: Konzepte der Medizin in Vergangenheit und Gegenwart, Stuttgart: Hippokrates-Verlag, 1978.
2 On point 5, see Gottfried Böhm: Die Kraft der Bilder. Die Kunst von „Geisteskranken“ und der Bilddiskurs. In: id. (ed.): Wie Bilder Sinn erzeugen. Die Macht des Zeigens, Berlin: Berlin Univ. Press, 2007, pp. 229–242.
3 Jonathan Sawday: The Body Emblazoned. Dissection and the Human Body in Renaissance Culture, London/New York: Routledge, 1995.
Leon Battista Alberti states in his treatise De Pictura, published in 1435/36, that knowledge of the anatomical structure of the human body – bones, muscles, and movement – should be the basis for a successful picture. However, it has to be emphasized that Alberti’s advice does not call for a naturalistic image in a mimetic sense, but rather for a conjunction of well-formed elements of “nature” to create an ideal image – better than nature.4 This article will focus on interdisciplinary exchanges between the history of art and the history of medicine with reference to so-called representations of oddities or curiosities – meaning everything situated by definition outside the limits of a normative corporeal conception. Looking at the research on these images, it can be noted that most of the published literature is written by medical practitioners or medical historians (in most cases by authors active in both fields at the same time). José de Ribera’s Bearded Woman (1631, fig. 1), for instance, has been the subject of multiple analyses, which read like medical records. The pediatrician Ottmar Tönz states: “A fully virilized woman with a full beard and a receding hairline appears here to be nursing her child. Can that be possible? Yet it has to be assumed that, in such a case of heavy virilization, the gonadotropins are fully suppressed so that conception and pregnancy would appear utterly unthinkable.” 5 Looking at the scarce research by art historians on The Bearded Woman, it can be seen that they too employ medical explanations to cope with the seemingly strange subject. The art historian Elizabeth du Gué Trapier states in 1952: “Ribera [was commissioned] to paint a subject which helped to earn for him the reputation of a painter of abnormality and ugliness.” 6 A catalog on Neapolitan artists calls the image a “clinical case,” “but [Ribera’s] artistic mastery transformed this clinical and disturbing topic into a brilliant work of art.” 7
4 Leon Battista Alberti: Das Standbild. Die Malkunst. Grundlagen der Malerei, ed. and transl. into German by Oskar Bätschmann, Darmstadt: WBG, 2000, pp. 256–257.
5 Ottmar Tönz: Curiosa zum Thema Brusternährung. Von stillenden Vätern, bärtigen Frauen und saugenden Greisen. In: Schweizerische Ärztezeitung, 81, 2000, Vol. 20, pp. 1058–1063, translation by the author; a similar analysis can be found in: Juan Falen Boggie: En torno a la mujer barbuda de José de Ribera. In: Revista Peruana de Pediatría, 2007, Vol. 60, part 2, pp. 1024–1035.
6 Elizabeth du Gué Trapier: Ribera. In: The Hispanic Society of America, 1, 1952, Vol. 35, pp. 68–74, p. 71.
7 Silvia Cassani (ed.): Civiltà del Seicento a Napoli, exhib. cat., Museo di Capodimonte, Naples, 1984, p. 410. Translation by the author.
1: José de Ribera: The Bearded Woman (Magdalena Ventura), 1631, oil on canvas, 196 × 127 cm, Museo del Prado, Madrid.
In the anthology L’invention du corps, Nadeije Laneyrie-Dagen describes Ribera’s depiction and similar ones as “women suffering from adrenal or endocrinological disorder.” 8 The same pathological explanations can be found in the discussions in the literature on any other kind of curiosity, for example the representation of so-called “dwarves,” who were extensively painted during the Spanish Golden Age. Many historical studies examine the social and cultural status of these individuals, but a wide range of studies also focus on their “illness.” 9 With reference to the latter, Henry Meige has to be mentioned.
8 Nadeije Laneyrie-Dagen: L’invention du corps. La représentation de l’homme du Moyen-Age à la fin du XIXe siècle, Paris: Flammarion, 1997, p. 174. Translation by the author.
9 Lothar Sickel: Zwerg. In: Uwe Fleckner, Martin Warnke (eds.): Handbuch der politischen Ikonographie, München: Beck, Vol. 2, 2011, pp. 567–574.
2: Pieter Brueghel the Elder (workshop): The Yawning Man, 1567, oil on canvas, 96 × 128 cm, Collection of Stockholm University.
At the end of the 19th century, he attempted to classify artistic representations of dwarves based on medical taxonomies, stating that it was possible to determine whether “these dwarves were of achondroplastic or rachitic origins, or whether they exhibited microcephalic, hydrocephalic, infantile, or obese habitus.” 10 Meige’s medical approach to analyzing art led to a painting attributed to Pieter Brueghel the Elder, showing a yawning man, lending its name to a nervous disease classified in 1910: the so-called Brueghel’s syndrome (also known as Meige’s syndrome, fig. 2).11 Meige was a student and later a coworker of Jean-Martin Charcot, director of the French Salpêtrière Hospital.12
10 Henry Meige: Les nains et les bossus dans l’art, Paris: Vve Lourdot, 1896. Translation by the author.
11 Henry Meige: Les convulsions de la face, une forme clinique de convulsion faciale, bilatérale et médiane. In: Revue Neurologique, 20/1910, pp. 437–443; cf. Peter Berlit (ed.): Klinische Neurologie, Berlin: Springer, 2012 (3rd edition), pp. 999–1000. On Brueghel: Larry Silver: Pieter Bruegel, Paris: Citadelles & Mazenod, 2011, p. 384.
12 Jean-Martin Charcot: Iconographie photographique de la Salpêtrière, Paris: Bureaux du Progrès médical, 1878; for further information, see Georges Didi-Huberman: Invention de l’hystérie. Charcot et l’Iconographie photographique de la Salpêtrière, Paris: Macula, 1982.
Charcot, his students and coworkers Paul Richer and Henry Meige, and Ludwig Choulant are considered the founders of what is known as medico-historical research, a branch of research that analyzes artworks medically and which emerged as a subgenre in the 19th century, primarily in France.13 All of them had received a medical education and were practicing physicians; they also shared an interest in the pictorial representation of diseases. The most important representative of medico-historical research in Germany was Eugen Holländer, who proclaimed himself the successor of Charcot and his students around 1900, and who published the treatises Die Medizin in der klassischen Malerei (medicine in classical painting, 1905), Die Karikatur und Satire in der Medizin (caricature and satire in medicine, 1905), and Plastik und Medizin (sculpture and medicine, 1912). Although his works focus on artistic artifacts, he is scarcely acknowledged by art historians. As a medical practitioner, however, he has gained a considerable reputation, especially in the last two decades, mainly for his other achievements: Holländer is known as a pioneer of aesthetic surgery and for carrying out the first “transplantation of body fat for aesthetic and reconstructive purposes” in 1906. He is also known for having performed the first surgical facelift in 1901, “seduced by female persuasion,” as he himself stated.14 In his introduction to Plastik und Medizin in 1912, Holländer explains and describes “medical art history, its importance, purpose, and development”: “One requirement for doing a ‘pathology of the image’ is a general knowledge of the bodily appearance in its constantly altering manner, depending on artists’ styles, technology and culture. […] From these artworks, we learn about the difficulty of determining disease from examining its manifestations within the body.” Holländer later explains his methodical approach: One has to “note the general representation of the body in its ever-changing manner.” 15 In Holländer’s methodology, the so-called connoisseur of art, who knows about context and artistic style as he or she considers an image, is just as necessary as a physician’s experience-based evaluative diagnosis.
13 Ludwig Choulant: Geschichte und Bibliographie der anatomischen Abbildung nach ihrer Beziehung auf anatomische Wissenschaft und bildende Kunst, Leipzig: Rud. Weigel, 1852; Jean-Martin Charcot, Paul Richer: Les démoniaques dans l’art, Paris: A. Delahaye et E. Lecrosnier, 1887; Paul Richer: L’art et la médecine, Paris: Gaultier, Magnier et Cie, 1900.
14 Andreas Gohritz: Eugen Holländer (1867–1932). Ein weitgehend unbekannter Pionier der ästhetischen Chirurgie und autologen Fettinjektion und medizinischen Kunstgeschichte in Deutschland, creative commons 2011, http://www.egms.de/static/de/meetings/dgch2011/11dgch647.shtml, acc. 06–17–2016.
15 Eugen Holländer: Plastik und Medizin, Stuttgart: Ferdinand Enke, 1912, p. 2. Translation by the author.
This methodical approach supports Ginzburg’s thesis that the history of art and the natural sciences around 1900 followed the same principles of “reading traces” (or “reading trace evidence”), but within different epistemic models.16 In addition, Holländer clearly states what should not be expected from his studies: “to extract sober and practically usable knowledge; or while looking at an anatomical drawing by Rembrandt, to gain knowledge about the human muscles; or while looking at the portrayal of disease, to gain experience in the art of medical diagnosis.” 17 In 1852, Ludwig Choulant had already drawn attention to this problem by stating in his study on the history and bibliography of anatomical illustration: “On the one hand, it is difficult to find good information about the objects [medical imagery] by looking at historical analysis by art historians because these objects are too far from the usual artistic pieces considered in their research. On the other hand, anatomical and medical studies provided by physicians are not satisfying either, as they usually lack the required historical and cultural knowledge.” 18 How do we assess this interaction between medical and (art) historical analysis? What is the status of the image within this methodological model? At the beginning of this article, I briefly presented the art-historical categorization of medical images produced by artists; now I would like to ask how much objectivity – here referring to naturalism – can be assumed to exist in a picture which is unquestionably a work of artistic production. From the perspective of image criticism, one would undoubtedly say that no image can ever present an objective and hence naturalistic view of the object represented – naturalism is a concept. Even a scientific illustration will always be the result of categorization, processes of transfer, conditioned views, attribution, and so forth. Therefore, any medical image can only be a representation of the concept of objectivity, a representation of an epistemological agreement.
16 Carlo Ginzburg: Morelli, Freud, and Sherlock Holmes. Clues and Scientific Method. In: Umberto Eco, Thomas Sebeok (eds.): The Sign of Three. Dupin, Holmes, Peirce, Bloomington: Indiana University Press, 1983, pp. 81–118.
17 Choulant (s. fn. 13), p. 5.
18 Choulant (s. fn. 13), p. 5.
At this point I would like to return to Ginzburg’s thesis of reading traces as a scientific method. In his article, Ginzburg describes Giulio Mancini as the founder of so-called connoisseurship in art history.19 Prior to Giovanni Morelli’s art-historical analyses, which are based on the same principal idea, Mancini developed a method which would enable the connoisseur to attribute paintings (or copies of paintings) to a particular artist by focusing on details. Mancini explains that, when an image is copied, it is primarily the unimportant details, those which are not part of the image’s core narrative, that are usually not copied as precisely and thereby reveal the image’s producer. In retracing Mancini’s and Morelli’s methods, Ginzburg also emphasizes the fact that both were famous physicians, especially well known for their exceptional gift for diagnosis. Mancini, and later Morelli, assume that “readable traces” in paintings occur involuntarily and can therefore be discovered. Ginzburg notes that this forms a parallel to the symptoms of a disease, which also appear involuntarily and can therefore be read as clues leading to the illness.20 The use of the term “symptom” is relevant as it refers to symbolic value, which, in contrast to the results of an empirical analysis, has to be “discovered” and interpreted. This leads to a major problem: The symptom of an illness refers indexically to the ill body and has to be interpreted – different illnesses can present the same symptoms; the symptom presented in an artistic painting refers indexically to its material genesis but is, as I have shown, similarly interpreted. But if a physician examines a painting expecting to find medical evidence, the question arises: Did the producer share the same knowledge of the disease and its symptoms, and attempt to represent them? Or did the producer, unaware of the illness, mimetically present a “sick body” and thereby involuntarily underscore the symptoms of its illness? One of these possibilities must be assumed if one seeks to legitimize the evaluation of a painting from a modern medical perspective, as if producing an anamnesis for a person. But running counter to this idea, Mancini explicitly stresses the singularity and individuality of an artwork, stating in his Considerazioni that the essence of painting in general lies in the difference between real life and painted object: “le specie della pittura nate dalla differenza delle cose imitate” (the species of painting born from the difference of the things imitated).21 He also sees the best representation of nature in a combination of ideal elements, like Alberti, cited above.
19 Ginzburg (s. fn. 16), p. 25.
20 Ginzburg (s. fn. 16), p. 24.
21 Giulio Mancini: Considerazioni sulla pittura, Adriana Marucchi (ed.) and Luigi Salerno (commentary), Rome: Accademia nazionale dei Lincei, 1956; Alberti (s. fn. 4), pp. 256–257.
In the above discussion of The Bearded Woman, I have shown that medico-historical research argues from the viewpoint of modern medical knowledge. In such studies, the epistemic models applied – hormones and the endocrine system – lead to the assumption of illness and to statements that there is a disorder within a presumably balanced hormonal system. However, hormones and their significance for bodily functions are a category of knowledge that only emerged at the beginning of the 20th century. This means that a modern epistemic model is being used to explain a premodern image. At this point, the awareness of the cultural and historical circumstances of an object's emergence that Holländer and his French predecessors claimed for themselves is lost. A 17th-century artistic image is described, not only by physicians but also by art historians themselves, in the same way as a depiction of a modern illness would be, as if they were looking at a patient and producing a diagnosis.

It seems to be common sense that every interdisciplinary collaboration, especially one between the humanities and the natural sciences, requires some degree of simplification. The respective areas of knowledge, the methodological and theoretical competences, and the often unspoken fundamental assumptions can only be transferred with a certain degree of abstraction and generalization if some valuable essence is to be communicated. The danger of this necessary simplification or reduction lies in essentializing a certain subject or theory, especially as disciplinary discourses are frequently lost in the transfer. 22 With reference to this process in the humanities, Maren Lorenz states: "One has to be amazed that, although historians today know about the instability of scientific truth, modern natural scientific knowledge is often incorporated in lines of historical reasoning without any critical reflection or historization." 23 By contrast, Klaus Walter, a modern endocrinologist with an interest in bearded female saints, states: "Such diseases exist today and, of course, existed in the Middle Ages." 24
22 For example: The endocrine system is often used to describe the phenomenon of bearded women, with statements that their combination of bodily hormones is too "masculine" or generally incorrect. At the same time, there is a wide range of discussions on the evaluation of statistical research leading to the definition of correct or incorrect levels of certain hormones in a body. This discursive information is lost in the process.
23 Maren Lorenz: Wozu Anthropologisierung der Geschichte? Einige Anmerkungen zur kontraproduktiven Polarisierung der Erkenntnisinteressen in den Geisteswissenschaften. In: Rebekka Habermas, Alf Lüdtke, Hans Medick, Edith Saurer, Beate Wagner-Hasel (eds.): Historische Anthropologie, 3, 2003, pp. 415–434, p. 416. Translation by the author.
There is thus a certain claim to objectivity in this kind of analysis, one that goes against the postmodern insight into the constructedness of diseases and their historical variations. 25 It is possible to categorize images by Velázquez based on the type of "dwarf" shown, as Henry Meige suggests, but why should we? There was no such categorization when those paintings were made, nor was there the kind of separation of norm and abject that Meige derives from this analysis. Our contemporary epistemic models are therefore irrelevant to the function and historical reception of these artworks. These images are considered not as translations or framed representations with all their stylistic ingenuities, but as depictions of disease. Medical evidence can seemingly be gathered, and the body's anthropological consistency can seemingly be proven. The interpretation of these images on the basis of medical-biological knowledge draws attention to the dangers of essentialism, as it mainly serves to naturalize our contemporary system of normative medical categories.
24 Klaus Walter: Gedanken über die Entstehung der Legende der Heiligen Kümmernis aus medizinischer Sicht. In: Sigrid Glockzin-Bever, Martin Kraatz (eds.): Am Kreuz eine Frau. Anfänge – Abhängigkeiten – Aktualisierungen, Münster: Lit, 2003, pp. 98–121. Translation by the author.
25 Lorenz (s. fn. 23), p. 416.
IMAGE CREDITS
Title: Traces of particles in the bubble chamber, 1976, Bubble Chamber Film (B&W), © Fermi National Accelerator Laboratory.
Editorial: 1: Alois Alzheimer (1906): Preparation Auguste D., Zentrum für Neuropathologie und Prionenforschung, Klinik für Psychiatrie und Psychotherapie, LMU München. Photo: Bettina Bock von Wülfingen.
Friedrich: 1: Lars Leksell: Stereotaxis and Radiosurgery. An Operative System, first edition, 1971. By courtesy of Charles C Thomas Publisher Ltd., Springfield, Illinois.
Nyakatura: 1A–B: By courtesy of Thomas Martens. 1C: By courtesy of Tanja Kirsten. 1D–H, 2A, 2C–H, 3A–E: John Nyakatura. 2B: By courtesy of Stefan Curth.
Amelung, Stach: 1: © 2010, Rights Managed by Nature Publishing Group. By courtesy of Thomas Söderqvist: Selling Point. David Goodsell. In: Nature Medicine, vol. 16, 2010, p. 943. 2A: © 2007 Elsevier Inc. By courtesy of B. Ganser-Pornillos, A. Cheng, M. Yeager: Structure of Full-Length HIV-1 CA. In: Cell, vol. 131, issue 1, p. 71. 2B: © 2009 Elsevier Inc. Reprinted by courtesy of I. Byeon, X. Meng, J. Jung, G. Zhao, R. Yang, J. Ahn, J. Shi, J. Concel, C. Aiken, P. Zhang, A. Gronenborn: Structural Convergence between Cryo-EM and NMR Reveals Intersubunit Interactions Critical for HIV-1 Capsid Function. In: Cell, vol. 139, issue 4, p. 781. 2C: http://www.rcsb.org/pdb/explore/explore.do?structureId=3H47, acc. 09–2016. 3A–B: http://pdb101.rcsb.org/learn/resource/the-structural-biology-of-hiv-flash, acc. 09–2016. 3C: http://www.cell.com/cell/abstract/S0092-8674(09)00580-7?_return, acc. 09–2016.
Weiss, Jirikowski, Reichelt: 1: Stefanie Reichelt; for methods see Joo-Hee Sir, Alexis R. Barr, Adeline K. Nicholas, Ofelia P. Carvalho, Maryam Khurshid, Alex Sossick, Stefanie Reichelt, Clive D'Santos, C. Geoffrey Woods, Fanni Gergely: A Primary Microcephaly Protein Complex Forms a Ring around Parental Centrioles. In: Nat. Genet., 43, 2011, pp. 1147–1153. 2: Modified from: Dieter G. Weiss, Willi Maile, Robert A. Wick: Video Microscopy. Chapter 8. In: Alan J. Lacey (ed.): Light Microscopy in Biology. A Practical Approach. IRL at Oxford University Press, 1989. 3–4: Modified from: http://zeiss-campus.magnet.fsu.edu/articles/basics/imageformation.html and http://zeiss-campus.magnet.fsu.edu/articles/basics/opticaltrain.html, acc. 11–2016. 5: Willi Maile and Dieter G. Weiss. 6: Modified from: Dieter G. Weiss, Willi Maile, Robert A. Wick: Video Microscopy. Chapter 8. In: Alan J. Lacey (ed.): Light Microscopy in Biology. A Practical Approach. IRL at Oxford University Press, 1989, p. 227. 7: Modified from: D. G. Weiss: Videomicroscopic Measurements in Living Cells: Dynamic Determination of Multiple End Points for In Vitro Toxicology. In: Molec. Toxicol., 1, 1987, p. 470. 8: Modified from: Dieter G. Weiss, Willi Maile, Robert A. Wick: Video Microscopy. Chapter 8. In: Alan J. Lacey (ed.): Light Microscopy in Biology. A Practical Approach. IRL at Oxford University Press, 1989, p. 226.
De Chadarevian: 1: J. H. Tjio, A. Levan: The Chromosome Number of Man. In: Hereditas, vol. 42, 1956, fig. 1a. By courtesy of Hereditas. 2: J. Lejeune et al.: A Proposed Standard System of Nomenclature of Human Mitotic Chromosomes. In: Lancet, vol. 275, 1960, p. 1064, plate II. © Elsevier. 3: By courtesy of Bror Strandberg and MRC Laboratory of Molecular Biology, Cambridge. 4: By courtesy of MRC Laboratory of Molecular Biology, Cambridge. 5: J. R. I. Piper: Cytoscan. In: MRC News, 56, September 1992, p. 27, fig. 3. By courtesy of Medical Research Council (UK). 6a–b: R. Pepinsky: X-RAC and S-FAC: Analogue Computers for X-Ray Analysis. In: Ray Pepinsky (ed.): Computing Methods and the Phase Problem in X-Ray Crystal Analysis. Report of a Conference Held at The Pennsylvania State College, April 6–8, 1950, State College, Pennsylvania, 1952, p. 183, fig. 7 and p. 234, fig. 72. © State College Pennsylvania. 7a–b: E. Francoeur: History of Visualization of Biological Macromolecules. On-line Museum. Early Interactive Molecular Graphics at MIT, fig. 1 and 2, http://www.umass.edu/molvis/francoeur/levinthal/lev-index.html, acc. 10–2016. By courtesy of Martin Zwick.
Bock von Wülfingen: 1–2: Oscar Hertwig: Beiträge zur Kenntniss der Bildung, Befruchtung und Theilung des thierischen Eies. In: Morphologisches Jahrbuch, 1876 (1875), no. I, appendix, plate 11, fig. 7; plate 12, fig. 16, 18; plate 13, fig. 21, 22 and 26.
Dippel, Mairhofer: 1: Thomas Juffmann et al.: Real-time single-molecule imaging of quantum interference. In: Nature Nanotechnology, 2012, no. 7, pp. 297–300. 2: Gregorio Bernardi, Matthew Herndon: Standard model Higgs boson searches through the 125 GeV boson discovery. In: Rev. Mod. Phys., 86, 2014, no. 2, 479, arXiv:1210.0021 [hep-ex], p. 19. 3: https://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/CONFNOTES/ATLAS-CONF-2015-031/fig_03a.pdf, acc. 06–2015. 4: https://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/HIGG-2013-12/fig_06a.pdf, acc. 06–2015. 5: https://cds.cern.ch/record/1459462/files/eemm_run195099_evt137440354_ispy_3d-annotated-2.png?subformat=icon-640, acc. 06–2015. 6: Sandra Eibenberger et al.: Matter-wave interference of particles selected from a molecular library with masses exceeding 10000 amu. In: Phys. Chem. Chem. Phys., 2013, no. 15, p. 14699.
Orland: 1: Alfred Donné, J.-B. Léon Foucault: Cours de Microscopie, 1844, plate 2, fig. 5–8. Wellcome Images. 2: Alfred Donné: Mothers and Infants, Nurses and Nursing, Boston 1859, plate 1.
Kesting: 1: Andreas Reisen: Der Passexpedient. Geschichte der Reisepässe, Baden-Baden 2012, p. 97, fig. 65. 2–3: https://deweyhagborg.wordpress.com/2013/06/30/technical-details/; http://deweyhagborg.com/projects/stranger-visions, acc. 08–2015. 4: Parabon Nanolabs, https://snapshot.parabon-nanolabs.com, acc. 08–2015.
Kunze: 1: José de Ribera, The Bearded Woman (Magdalena Ventura), 1631, 196 × 127 cm, oil on canvas, Museo del Prado, Madrid. Fig. in: Civiltà del Seicento a Napoli, Silvia Cassani (ed.), Museo di Capodimonte, 24.10.1984–14.4.1985, Naples 1984, p. 410. 2: Pieter Brueghel the Elder (workshop): The Yawning Man, 1567, 96 × 128 cm, oil on canvas, Collection University of Stockholm. Fig. in: Larry Silver: Pieter Bruegel, Paris 2011, p. 384, no. 312.
AUTHORS
Kathrin Mira Amelung, M. A., AG Morphologie und Formengeschichte, Cluster of Excellence Image Knowledge Gestaltung. An Interdisciplinary Laboratory, Humboldt-Universität zu Berlin
PD Dr. Bettina Bock von Wülfingen, Cluster of Excellence Image Knowledge Gestaltung. An Interdisciplinary Laboratory, and Institute for Cultural History and Theory, Humboldt-Universität zu Berlin
Prof. Dr. Soraya de Chadarevian, UCLA Department of History and Institute for Society and Genetics, University of California, Los Angeles
Dr. Anne Dippel, Department for Cultural Anthropology/Cultural History, Institute for Cultural Science and Art History, Friedrich-Schiller-Universität Jena, and Cluster of Excellence Image Knowledge Gestaltung. An Interdisciplinary Laboratory, Humboldt-Universität zu Berlin
Dr. des. Kathrin Friedrich, Cluster of Excellence Image Knowledge Gestaltung. An Interdisciplinary Laboratory, Humboldt-Universität zu Berlin
Dr. Günther Jirikowski, Cluster of Excellence Image Knowledge Gestaltung. An Interdisciplinary Laboratory, Humboldt-Universität zu Berlin, and Zentrum für Logik, Wissenschaftstheorie und Wissenschaftsgeschichte (ZLWWG), University of Rostock
Prof. Dr. Marietta Kesting, cx centre for interdisciplinary studies, Academy of Fine Arts, Munich
Sophia Kunze, M. A., Cluster of Excellence Image Knowledge Gestaltung. An Interdisciplinary Laboratory, Humboldt-Universität zu Berlin
Dr. Lukas Mairhofer, Institute for Quantum Optics, Quantum Nanophysics and Quantum Information, University of Vienna
Prof. Dr. John A. Nyakatura, Cluster of Excellence Image Knowledge Gestaltung. An Interdisciplinary Laboratory, and Institute of Biology, Humboldt-Universität zu Berlin
PD Dr. Barbara Orland, Pharmacy Museum, University of Basel
Dr. Stefanie Reichelt, Light Microscopy Core, Cancer Research UK Cambridge Institute, Li Ka Shing Centre, Cambridge, UK
PD Dr. Thomas Stach, Cluster of Excellence Image Knowledge Gestaltung. An Interdisciplinary Laboratory, and Electron Microscopy, Institute of Biology, Humboldt-Universität zu Berlin
Prof. Dieter G. Weiss, Center for Logic, Theory and History of Science (ZLWWG), University of Rostock