IN GRATITUDE: A BRIEF RETROSPECTIVE

Guenter Bernhard Risse, distinguished historian of medicine and Professor Emeritus, passed away on February 15, 2026, at the age of 93. He died peacefully at his home in Lincoln, California, surrounded by his family, after a long struggle with Parkinson's Disease. (Obituary)

He requested that this retrospective of his life be published after his passing, in gratitude to all the people who influenced his life and assisted him in his academic career.

IN GRATITUDE: A BRIEF RETROSPECTIVE

Guenter B. Risse

In a recent publication, Aging Thoughtfully: Conversations About Retirement, Romance, Wrinkles, and Regret, authored by University of Chicago faculty members Martha C. Nussbaum and Saul Levmore, the authors discuss and support the traditional penchant for retrospection that infects those of us who are aging and wish to “find wisdom in the wrinkles.” My intent, however, is to express gratitude to a group of people who decisively influenced my professional life. Perhaps looking backward, drawing on a notoriously foggy and selective memory, and crafting a coherent narrative will help me achieve a measure of self-understanding. Moreover, the story lends meaning and positive feelings to one’s own life at a time of physical decline and social privations in the midst of the coronavirus pandemic afflicting our planet.

While selfish, such an endeavor often turns altruistic and reaches out to the future, hoping that the memories, insights, and even lessons could be of value for those left behind. Leaving a legacy to the next generation is, as Nussbaum and Levmore assert, a vehicle for posthumous remembrance as well as understanding and creative stimulation. In this essay, I wish to stress that the popular notion of being self-made, in total control of one’s destiny, is a myth. In the course of a lifetime, everybody experiences critical turning points and can easily identify and credit persons who proved to be consequential, influencing and facilitating life’s trajectory and eventual success.

Parents: F. Bernhard Risse (1902-1988) and Kaete A. Westernhagen Risse (1911-2009)

Both my parents were born in Germany. My father came from Barmen, an early industrial town on the steep slopes of the Wupper Valley in the Rhineland, famous for its textile and metallurgy mills as well as home to the first suspended electric monorail system in the world, the Schwebebahn. In fact, my paternal grandfather Josef (1854-1948) was a weaver working at a nearby plant under deplorable conditions. The youngest surviving male in a family of ten children--two older brothers had died at the battle of Verdun--my teenaged father barely managed to complete his primary schooling before looking for work to support the extended household. Luckily, he qualified for an apprenticeship at a German import-export firm, Staudt & Co., with multiple subsidiaries in South America including Argentina, Uruguay, Chile and Paraguay. Officially registered in 1887 by businessman Wilhelm J. Staudt with headquarters in Berlin and Buenos Aires, the company initially offered European textile products in exchange for raw materials such as sheep wool, cotton, and cowhides.

Neutral during the conflict that devastated Europe, Argentina’s economy rebounded after WWI because of an increase in foreign trade. Like other German enterprises, Staudt & Co. now also offered industrial products such as iron and steel goods, notably tools and machinery. As part of his job, my father agreed in 1922 to go overseas and work at the Buenos Aires affiliate. By this time, the resident German community in Buenos Aires was not particularly hospitable to newcomers, but the trading houses that had selected and recruited their employees through the main offices in Germany were quite patronizing and supportive; they even encouraged return visits to the homeland. Indeed, my father was able to make a brief trip to see his parents in 1926. Eventually, half of these recruited employees decided to leave Argentina permanently as conditions and opportunities in Weimar Germany improved.

From several accounts, my father was a model employee. Blessed with an ironclad contract, he was a privileged worker insulated from the anxieties afflicting most German immigrants. With an extraordinary facility for foreign languages, he added Spanish, French, and later English to his international portfolio. Meticulous, self-taught bookkeeping followed. On rare occasions, he would reminisce about periodic sales trips away from Buenos Aires, to the interior of Argentina. Carrying two extremely heavy suitcases full of sample tools, he used trains and buses to make visits to Staudt subsidiaries in Rosario, Santa Fe, and Concordia, Entre Rios, as well as faraway German settlements, especially in the provinces of Entre Rios, Cordoba and Santa Fe. To locate potential customers, he was also keen to use local telephone directories and check for Germanic-sounding listings.

My mother hailed from Berlin, capital of Germany and one of the most famous European cities. In sharp contrast to my father’s early trajectory, she emigrated in March 1924 with her family at the crest of the German immigration wave to Argentina. Endowed with considerable technical skills--he was an expert locksmith by trade--her father, Paul Westernhagen (1884-1974), had fought during WWI in Germany’s Reichsmarine, barely avoiding several deadly naval battles. Retired from active service, he became a small shopkeeper in Berlin, ruined by inflation in the early 1920s. Like other middle-class Germans facing an extremely bleak future in their homeland, he considered emigration. In florid letters, his older brother, a chemist already living in Buenos Aires, extolled the attractions of his ammonia-producing factory, promising work, partnership, and lodging. After a long and stormy sea voyage, the anxious and seasick Westernhagen family--father, mother, and one sister--was allowed to disembark with their belongings after their sponsor finally showed up hours later. They all climbed into a streetcar for a trip to the outer suburbs and open fields, ultimately ending at a primitive hut that could only be reached by a muddy footpath. Living in abject poverty with his girlfriend and a black child, the unmarried uncle apologized for the conditions and toxic fumes, hoping that the by-now extremely disappointed newcomers would provide the necessary effort and capital to make improvements.

Suffering greatly for months, my mother’s family managed to extricate itself from this nightmare, returning to a northern Buenos Aires neighborhood in 1925 and renting a property that was converted into a boardinghouse. While my grandfather proved adaptable and turned to the manufacture of safes and metal furniture, the women were busy spending their days cooking, cleaning, and washing. Subsequent moves to another property located further south of the city slightly improved their finances but failed to lift the burdens of full-time menial domestic work. Unfortunately, my teenage mother also never got a chance to go to secondary school; her parents found some occasional private tutors who volunteered to improve her Spanish language skills. Like many other newcomers, the shock of exchanging a bourgeois life in a key European metropolis for the desolation, poverty, and insecurity of sparsely populated suburban enclaves created an unflattering perception of Argentina: it was the Affenland (monkey land), primitive, full of tricks and broken promises. For the rest of her life, my mother would remain a proud German citizen extolling her cultural superiority over the locals and traveling on a German passport. In her vocabulary, the word hiesiger (local) permanently acquired a stigmatizing, negative quality.

Visiting German passenger ships temporarily anchored in the harbor of Buenos Aires was then a popular forum for new and lonely immigrants to socialize. Attending afternoon teas enriched with music and dancing allowed young people to exchange glances and share their experiences under the stern supervision of their parents. Quenching their nostalgia for German life, language, and music, they congregated and relaxed after weeks of woes and hard work. It was there that my parents first met briefly in 1926 and then again in 1928. Both were quite attractive people; still a teenager and to the chagrin of my grandparents, my mother was often the target of instant marriage proposals given the scarcity of German-born women among the immigrants. She still did not understand much Spanish. After a brief courtship under the scrutiny of chaperons, they became engaged on December 24, 1929; their wedding took place a few months later, on March 8, 1930. In an autobiographical essay written before her death, she wrote with pride that she had finally managed to become an independent German Hausfrau. Indeed, the kitchen remained her absolute domain, albeit with expenses kept under a strict budget. Menus and three regular meals a day were part of her routine. She made her daily rounds through the neighborhood to procure fresh ingredients. Mending clothing and knitting sweaters were meant to save money. Assistance was declined, except for the daily shopping and dishwashing chores.

In 1933, a year after my birth, my father left the Staudt Company, ready to partner with a representative of foreign pharmaceutical firms. Exploiting his linguistic skills, “small pharma” imports from laboratories in France, Germany, and the US became the focus of his new commercial enterprise. His timing was perfect: Agustín P. Justo had just been elected President of Argentina a year earlier, inaugurating a period of international trade and prosperity that lifted the country out of the Depression. Dedication and hard work came to provide our family with enough resources to rent a modest apartment in the center of Buenos Aires and join the local middle class. Although retaining their German cultural identity, my parents did not actively participate in the life of its ethnic community. They refused to join clubs and societies. This also allowed them in the 1930s to evade the growing presence and influence of national socialism in Buenos Aires. An extended business trip in 1938 that also included visits with family members further dampened their lingering nostalgia. Although my parents were already apolitical, the encounter with life under Hitler left an indelible mark on all of us. Moreover, on Christmas Eve 1938, a new addition to the family, my brother Edgar, would make all previous plans obsolete. After our return, we moved to an apartment house especially selected for its proximity to the German Hospital where my mother expected to deliver.

At the onset of WWII, both my parents clearly realized that the chances of ever returning to live in Germany were over. The commitment to a life-long stay in Argentina crystallized around 1940. Moreover, threats to blacklist my father’s pharmaceutical business prompted him to become an Argentine citizen: Francisco Bernardo. With two children, it was time to get out of urban apartment living and move to a suburban dwelling with a garden. We fled Buenos Aires for Florida, a northern town linked by electric train service. After renting for a while, my father purchased a vacant lot and found a contractor to build a new house in 1942. For the next decades, we lived a comfortable middle-class life. Wisdom was dispensed in the form of traditional proverbs. Frugality was key, with spending devoted to essentials. We did not own an automobile; restaurant visits were limited. There was no desire to impress others. Household expenses were kept on a strict, no-frills budget, and clothing was mended. Constant encouragement and material resources were directed towards the goal that had eluded my parents in their formative years: more schooling. We were given monthly allowances for books and materials, and discouraged from chasing part-time jobs that could prove distracting. There was considerable pressure to excel and get good grades. Although public education was free in Argentina, private primary schooling and supplemental classes were meant to expand our horizons. When I finally obtained my medical diploma in May 1958, my parents were ecstatic: “we finally have a doctor in the family.”

Health scares: Carlos Neumeier and Ricardo V. Rodriguez, Buenos Aires, 1937-1940

My parents’ decision in 1937 to send me to a private kindergarten backfired six months later when I contracted a serious case of whooping cough that became complicated by bilateral pneumonia. Since I was critically ill in the pre-antibiotic era, a local pediatrician, Carlos Neumeier, came to our home and spent two nights at my bedside with my parents, employing one of the new imported sulfa drugs that saved my life. A long and slow period of convalescence followed, including an extended time at sea in the spring of 1938 during the trip to Germany.

In early 1939, I came down with a sudden and painful abdominal condition, and my parents took me to the nearby German Hospital emergency room for evaluation--we still lived at the Arenales Street apartment in the Palermo district of Buenos Aires. After a thorough examination and X-rays, the diagnosis was acute appendicitis, and I was officially admitted and readied for surgery. My father was not convinced and requested a second opinion from Ricardo V. Rodriguez, a prominent Buenos Aires surgeon, who promptly showed up, checked me out, and recommended my immediate discharge, explaining that surgery was not the answer. Risking acute peritonitis and possible death, I went home with my parents and thankfully recovered under his dedicated care.

Fredericus Schule, Vicente Lopez: Karl Schade and Virginia B. Kreutel, 1940-1945

My parents supported the contemporary notion of a dual German-Argentine education. In 1939, I spent my first year of elementary instruction at the prestigious and traditional Germania Schule located in the center of Buenos Aires. Founded in 1903, the school followed the approved six-year curriculum common to all Argentine establishments, with additional afternoon sessions for teaching German language and literature. Although I cannot remember many details except a lack of friends and long bus rides in heavy traffic, the institution lost favor with my parents, who considered it elitist, stuffy, as well as expensive, and looked forward to a change of school after resettlement in the suburban town of Florida. Our new address on Juan B. Justo Street seemed a good omen: a decade earlier, President Justo’s trade policies had been a godsend for my father’s business.

But where to go? Just a mile away, German families living in the adjacent towns of Vicente Lopez and Olivos had already come together to establish their own dual-track institution in 1935. Its name, Fredericus Schule, honored the memory of Frederick II, King of Prussia (1712-1786), a complex and contradictory figure often known as “the Great.” Based in good measure on his military successes and cosmopolitan approach during his rule, Frederick was viewed as a patron of the 18th-century Enlightenment, notably art, music, and literature, as well as a strong leader and precursor of German unity. Therefore, in the 1930s, adherents of Germany’s new Führer sought to legitimize his rule by drawing favorable comparisons between the two.

The school was located on an acre of prime land used for gardening and botanical studies. Some fields were devoted to sports, notably soccer and gymnastics. Volunteering parents cooperated in the organization of year-round festivities, national anniversaries, musical and theatrical performances, craft fairs, and graduation ceremonies, transforming the Fredericus Schule into a veritable cultural center under the efficient direction of Karl Schade and Virginia Bruno de Kreutel. Born in Germany, Schade was among the first teachers selected in 1935 to devise and implement a German curriculum for primary education at the school. He also edited a useful handbook of German poetry for class use. Demanding but fair, he was the institution’s Director during the five years, 1940-1945, when I attended, from the second to the sixth year of primary education. In his free time, he was an avid mountain climber, eventually participating in a 1947 expedition that made an attempt to reach the top of Aconcagua, the highest peak in the Americas.

Director Kreutel was in charge of the school’s official Spanish course of studies. She managed to obtain official approval from local and provincial educational authorities. She was also keen to cultivate good relations with the local government in Vicente Lopez. Both displayed superb leadership in a difficult period created by the events of WWII. Tensions and deep divisions arose between ideologically committed Nazi teachers, parents, and students known as the “zackigen” (jagged like the swastika symbol, physically fit and smart) versus the apolitical “runden” (round, plain, flabby, evenhanded). Since we belonged to the latter, the schism came to afflict our family’s social relationships, particularly as the war took a decidedly unfavorable turn. Sadly, in March 1945, as the fighting was nearing an end in Europe, the Argentine military government, then led by President General Edelmiro G. Farrell and Vice-President Juan D. Peron, finally abandoned its neutral position and, under pressure from the United States, declared war on Germany. Moreover, the authorities issued a decree proclaiming all German assets in Argentina to be “enemy property” and thus ordering their seizure. Included in this unprecedented and unconstitutional property grab were not only schools like the Fredericus Schule but banks, business firms, associations, and sport clubs. By November 1945, my class was the last to graduate before the institution was forced to close its doors forever.

Colegio Nacional de Buenos Aires: Osman Moyano and Jose Maria Monner Sans, 1946-1951

Even before my 6th grade graduation, my father had already decided that I should apply to the Colegio Nacional de Buenos Aires, then the oldest and most prominent institution for secondary education in the country, which became coeducational only in 1959. Originally founded by the Jesuits next to the Church of San Ignacio in colonial times, the College was located within the perimeter of the so-called Manzana de las Luces, Buenos Aires’ Enlightenment quarter. Later, it became a lay institution administered by the local University and thus the educational center for the creation of what a recent author, Alicia Mendez, has called a “meritocratic elite” that included presidents and other high government officials, intellectuals, artists, scientists, as well as diplomats, judges, and industrialists. Indeed, CNBA was frequently perceived as a center of Argentine culture, its exceptionality resting on historical traditions. All six years featured the study of Latin. Like all other public institutions, education was free: students were only responsible for the cost of books, supplies, meals, and transportation.

My father’s rationale for insisting that I apply to the CNBA was not influenced by economic considerations or dreams for social advancement. What worried him was the progressive intellectual degradation occurring in Argentina since the military revolution of 1943. An energized nationalism and militarism coupled with religious indoctrination sought to reform public education and create a “new Argentina,” prompting the firing and voluntary departure of thousands of notable scholars and teachers, including several presidents and two Nobel Prize winners. With the quality of instruction and research already declining, the CNBA appeared to be less compromised than the university system it was part of. Many liberal teachers managed to hold their secondary posts. While similarly opposed to the military government, CNBA’s socially heterogeneous teenagers, drawn mostly from urban centers in Argentina, failed to exhibit the same active resistance towards Peron’s government as their older university counterparts, who were prone to protests and strikes that reached new heights in the riots of October 1945.

Thus, together with nearly 1,500 other yearly applicants, I spent several months in the fall of 1945 with a personal tutor in preparation for a highly competitive written examination designed to test my knowledge in four main subjects: mathematics, Spanish, history, and geography. The quest was successful: to the joy of my family, especially my father, I received my acceptance letter. About 200 young men were admitted each year and placed in eight divisions, each composed of 25 students. The school functioned in two shifts, one meeting in the morning and the other in the afternoon. I chose the afternoon shift, since I lived in a suburb and faced a full hour of commuting using trains, subways, and a fair amount of walking. CNBA was housed in a monumental neoclassical building with white marble staircases, vast courtyards with water fountains, wood-paneled hallways, an ornate library, an assembly room, and a swimming pool, located within the historical center of Buenos Aires.

In February 1946, just before I started my first year at the CNBA, General Juan Peron won the popular vote and was legitimately elected president. After an extended period of labor unrest and crippling strikes, he assumed office in June. Perhaps reflecting the renewed emphasis on compulsory Christian instruction after the 1943 military revolution that sought to establish a “New Argentina,” the CNBA began listing religion among the ten subjects to be studied during the first- and second-year courses.

Just before classes ended in November 1946, the Colegio also inaugurated its new director, Osman R. Moyano (1890-1952), a former veterinary physician, ex-student, and teacher at the institution. Humble and austere, Moyano was deeply impressed by CNBA’s mystique; he was in love with its aura of erudition, the solemnity of its auditorium, library, and majestic hallways. He vowed to protect its integrity, the dignity of its professorial and auxiliary staff, as well as the highly select student body. Ironically, Moyano owed this appointment to an old family friend, the surgeon and now despised Peronist politician Oscar Ivanissevich, who had been installed at the helm of the autonomous and more liberal University of Buenos Aires by the new government.

Moyano’s educational philosophy was perceived as politically correct since it made a clear distinction between mere instruction--the transmission of current knowledge--and actual education, a more comprehensive and desirable approach that promised to “shape souls.” The latter mode demanded a closer relationship with students and was designed to also convey moral values to adolescents towards the formation of future informed citizens. For Moyano, the traditional barriers between professorial chair and student bench needed to be lifted to create a more harmonious learning environment.

One of the highlights during the first five years were the courses given by Jose Maria Monner Sans (1896-1987) that dealt with aspects of the Spanish language, ranging from style, syntax, and spelling to literary theory, poetry, and history. A prominent author and university leader, Monner Sans was a demanding teacher who had published an early monograph about history as a literary genre. Science and data collection were then the new goddesses; everything seemed susceptible to measurement. Admitting, however, that objectivity was an illusion, Monner Sans emphasized history’s subjective nature, since all our narratives of the past were based on collected information driven by conscious or subliminal sympathies and intentions, a position that helped shape my own ideas on the subject.

As my course of studies entered its sixth and final year, political pressures mounted. Graduation in late November 1951 could not come soon enough. To be sure, the Colegio remained a mostly apolitical island surrounded by a sea of faithful Peronistas. Provided with a bust of Peron’s spouse, Evita, for prominent display in the building, Moyano quietly hid the statue out of sight in an obscure corner of the building. In early May, CNBA students protested in front of the building housing the distinguished newspaper La Prensa, which had been taken over by the autocratic government, only to be arrested and dispersed. With a general election approaching, President Peron’s supporters organized numerous rallies in support of his reelection. Since the aging Vice President, Hortensio Quijano, was in failing health, support for the candidacy of First Lady Eva Peron to become the president’s running mate gained momentum. At a rally on Plaza de Mayo on August 22, President Peron was expected to reveal his final choice. Organized by labor unions and the Ministry of Education, the large gathering required the compulsory presence of students and teachers from all major schools. Our CNBA delegation, fronted by Rector Moyano, occupied a prominent place next to the old Cabildo building. After Peron appeared on the balcony of the government building, the Casa Rosada, the Education Minister at the time, Oscar Ivanissevich, introduced Eva Peron as “Mrs. Vice-President,” prompting loud boos from students of our CNBA delegation. The crowd fell silent, the presidential couple and their immediate retinue directing their angry gaze towards the dissenters, much to the dismay and embarrassment of Moyano. Within a few minutes, members of the mounted presidential guard surrounded our group. Brandishing their sabers, these men on their excited horses simply galloped into the rows of fleeing CNBA students, indiscriminately chasing and hitting everyone with the flat side of their weapons.
Luckily, amid the confusion, I found myself near a subway entry, thus evading the onslaught and rushing home, but some of my classmates were badly hurt. Peron never revealed his choice for the presidential ticket, but there were rumors that his wife was ill. Indeed, she died of cancer nearly a year later. Moyano was summoned to the Ministry but retained his job thanks to his close personal ties to Ivanissevich. No student was punished, but we came under greater scrutiny by the afternoon prefect, Mr. Amoroso, a short and elderly sympathizer of the current regime who zealously policed the school’s corridors, threatening to compose incriminating reports or recommend suspensions that would prevent our graduation. At a solemn ceremony, I received my bachelor’s diploma in December 1951. Moyano called it a fiesta de hogar, a home celebration.

Indeed, CNBA had been my intellectual shelter, and Moyano’s quiet leadership--he called it his “apostolic mission”--managed to protect it by paying lip service to the outside forces of Argentina’s fascism. He died a year later. The most critical and formative years of my adolescence were over. Since my subsequent career unfolded in the United States, I never developed that sense of belonging, fellowship, and solidarity characteristic of most CNBA graduates. Like the local University, the Colegio remained an independent and rebellious center of opposition to Peron’s Argentina. In fact, our struggles were merely a prelude to further bloody clashes with authoritarian military leaders that occurred in the decades ahead.

 

Medical School: Luis Dellepiane and Juan C. Gonzalez Peña, Buenos Aires, 1952-1953

At the start of my first year in Medical School on April 1, 1952, I signed up to study topographic and functional anatomy in the Department headed by the notable surgeon and traumatologist Professor Luis Dellepiane. Because of open enrollment, the large number of students admitted that year forced Dellepiane to create an auxiliary teaching team of honorary “monitors” drawn from selected students who had successfully passed the first partial examination. Selected for such a post, Disector Ayudante Voluntario, I carried out my own dissections and demonstrated my findings to other groups of students, under the supervision and encouragement of my section chief, Professor Juan C. Gonzalez Peña. The latter became the kind of inspirational leader and personal friend any incoming medical student should have. This positive experience allowed me in 1953, during my second year, to remain in the Anatomy Department and continue to function as a member of Professor Dellepiane’s School of Dissectors, dissecting for and teaching theoretical subjects to small groups of incoming students.

 

Medical School: Bernardo A. Houssay and Miguel R. Covian, Buenos Aires, 1953-1954

During Year 2, the primary field of studies was physiology. The Department was ably represented by Jose B. Odoriz (1908-1971), one of the last collaborators of the 1947 Nobel Prize winner Bernardo A. Houssay (1887-1971), one of the most prominent and influential 20th-century Latin American scientists, a key spokesman for modern biomedicine and defender of academic freedom. Houssay’s intellectual prominence and liberal political leanings, however, had brought him into conflict with the military dictatorship that ruled Argentina after 1943. Dismissed from his university posts, he was forced to shift his research to a privately funded laboratory in Buenos Aires, the famous Instituto de Biologia y Medicina Experimental in Palermo, from 1944 until the fall of General Peron’s regime; in 1959 the Instituto became affiliated with the University of Buenos Aires.

Odoriz’s inspirational lectures and an acquaintance with another Houssay associate, Miguel R. Covian (1913-1992), piqued my interest in biomedical research. The latter had recently spent two years at Johns Hopkins University studying the physiology of the hypophysis, and upon his return established an experimental laboratory of neurophysiology. With Covian’s encouragement, I started attending weekly sessions at Houssay’s Instituto on Costa Rica Street in Palermo devoted to the presentation of research conducted by Houssay’s other collaborators and students, including Braun Menendez, Rapela, and Brandt.

During Year 3, my preclinical studies shifted to medical semiology under the chairmanship of Professor Roque Izzo, taught at Buenos Aires’ most prominent sanatorium, the Hospital Municipal Enrique Tornu, named for a notable Argentine physician and climatologist, a graduate of the Paris Medical School who specialized in lung diseases. Since 1904 this venerable institution, a vast complex of pavilions, administrative facilities, and gardens, had not only collected local sufferers of tuberculosis, notably domestics and day workers, but also housed individuals sent from distant Argentine provinces. Indeed, flooded with immigrants like other modern urban centers around the world, Buenos Aires continued to have its share of infected patients. Since the basement of the pavilion housing provincial cases was reserved for experimental studies, I was encouraged by Gil Mancini, an independent docent, by Covian, and by others to conduct a series of studies using guinea pigs, an animal frequently employed for diagnostic purposes and thus in great supply. Indeed, Covian was already working on the neural basis of thirst and the physiology of appetite at Houssay’s Instituto employing Norway rats, and had published a preliminary paper on the subject in 1952. He was curious whether his findings could be confirmed employing another animal model. With the assistance of a few other students, we tried to follow existing protocols until it became clear that our experimental animals, the guinea pigs, kept perishing from tuberculosis, a fate not surprising in a perennially contaminated institution like the Hospital Tornu.

Medical School: Juan Jose Beretervide / Hospital Juan A. Fernandez, Buenos Aires, 1957-1958

Year 4 of Medical School was hectic and confusing, spent fulfilling my military obligation at the Argentine Army’s Escuela Profesional General Lemos outside Buenos Aires. After a series of interruptions caused by the military revolts that eventually toppled the sitting president, Juan Peron, in September 1955, I returned the next year to complete the required Year 5 clinical work, covering internal and emergency medicine, obstetrics and gynecology, pediatrics, as well as infectious diseases. An influx of migrants from the poorer Argentine provinces to the capital city over the past decade threatened to overwhelm its health care facilities, notably the decaying municipal hospitals erected during an earlier era when shelter and rest were the only offerings for a possible recovery. Housed in flimsy hovels surrounding Buenos Aires, sarcastically referred to as villas miseria (villas of misery), the poor were exposed to a broad spectrum of contagious diseases, notably venereal. Lacking means to obtain care, they flooded the salas de guardia (emergency rooms), a bonanza for so-called perros (dogs), lowly but loyal medical students like myself who staffed such facilities and followed orders, including administering multiple shots of the newly available wonder drug, penicillin.

My last year of medical school, from January to March 1957 and from December 1957 until April 1958, was spent as a volunteer working daily at the municipal Hospital Juan A. Fernandez. Completed in 1943, this modern high-rise building located in the fashionable northern district of Palermo housed the Medical Service of Dr. Juan J. Beretervide (1895-1988), then one of the most distinguished Argentine clinicians of his generation. Chief of the Service in Ward 2 since 1928, professor of clinical medicine, and member of the Argentine Academy of Medicine, Beretervide, assisted by Drs. Masoch and Lazzari, was instrumental in guiding my interests towards general internal medicine. In fact, he also graciously agreed to sponsor my medical thesis in September 1960, researched and written during the first year of my residency in internal medicine at Mercy Hospital in Buffalo, New York, during 1959-1960.

Hubert Lando and Angelo S. Deloia, New York, 1958-1960

During my last year of medical school, my father connected with a business partner in New York, Hubert Lando (1886-1968), to explore the possibility of post-graduate medical training in the US. Given the increasingly competitive nature of medical practice in Argentina because of the ever-larger number of post-Peron graduates without an equivalent expansion of population and medical facilities, many young physicians faced a bleak economic future. Under such circumstances, obtaining additional training through the completion of an internship in the US could prove advantageous. On a visit to Buenos Aires in early 1958, Lando related that one of his former neighbors (he lived in the northern New York suburb of Scarborough) was a general practitioner now specializing in radiology on the staff of Mercy Hospital in South Buffalo. Since 1951, this institution had managed to create an affiliation with the Georgetown University Medical School in Washington, DC, that brought prestigious faculty members such as cardiologist Proctor Harvey and nephrologist George Steiner to Buffalo for biweekly conferences and rounds. Yet, in spite of its distant academic connections, private hospitals such as Mercy Hospital encountered difficulties in attracting newly graduating American doctors for internships and residencies at a time when the American hospital system was experiencing a dramatic expansion. Instead, Mercy Hospital had already come to rely on several foreign medical graduates drawn from predominantly Catholic countries such as the Philippines and Puerto Rico.

Mr. Lando contacted Angelo S. Deloia (1906-1988) at Mercy Hospital to explore the possibility of a one-year rotating internship. The inquiry was successful: a vacancy had occurred. Given the lateness (all house staff positions in America start July 1), I was allowed to begin October 1, 1958, giving me enough time to obtain a student visa under the Exchange Visitor Program, resign from a postdoctoral fellowship at the Army Medical Hospital in Buenos Aires, and travel by ship to New York. When I arrived in late September, Lando kindly picked me up upon landing, arranged for temporary accommodations in the city, and guided me through the initial culture shock with special tutorials in American English that would be critical for communicating with patients.

Following a train ride to Buffalo, my first encounter with Dr. Deloia almost never happened: he was at the station looking for a swarthy, dark-haired South American instead of a blond young man. Later, however, we managed to bond, and his wife, three sons, and grandmother literally adopted me, making me feel part of the family. Even after Dr. Deloia abruptly left Mercy Hospital a few months later because of professional disagreements with his partner at the Department of Radiology, this support never wavered. I often spent my weekends at his Buffalo house with his three sons, Toni, Gregory, and Steve. They became my American family and played an important role during my early efforts to assimilate.

John J. O’Brien, Anthony C. Constantine and Joseph Kopp, Buffalo, New York, 1959-1960

As my rotating internship entered its final months, I reached a decision to continue my medical training in the US instead of returning home. In doing so, I was following the advice of John J. O’Brien, an internist with cardiology training who was Mercy Hospital’s Director of Medical Education. While I worked 36-hour shifts and nearly 100 hours weekly, earning room and board as well as a very low stipend, my internship turned out to be more of a secondary service role of limited educational and practical value. Most attending physicians sought to bypass the house staff, making unannounced rounds to see their private patients late in the evenings and phoning their orders directly from their offices to the nursing desk. Only Dr. O’Brien and his deputy insisted on regular morning reports and the biweekly conferences. Under such institutional circumstances, a residency in internal medicine under O’Brien, with rotations in pathological anatomy with Anthony C. Constantine and tutorials with Joseph Kopp, another internist with hematological credentials, seemed attractive. To the disappointment of my parents, who eagerly expected my return, the additional prospect of conducting clinical research on selected Mercy Hospital patients for the purpose of writing my doctoral thesis made my choice easier. By October 1959, I had accepted the post of first-year resident in internal medicine.

Meanwhile, thanks to my father’s business contacts in Buenos Aires, I was able to write and send a series of articles in Spanish for publication in a prominent Argentine medical journal, Orientacion Medica. Arranged in the form of travel notes, the essays provided information about the current situation of foreign medical graduates in the US as well as discussed topics of interest from meetings of the American Federation for Clinical Research in Atlantic City that I attended as part of my medical training.

By early 1960, however, I came to the realization that my goal of seeing a larger number of patients and being allowed to participate actively in their management could not be achieved in a smaller suburban private hospital. Clinical research for completing the MD thesis faced opposition from the attending physicians; promised rotations, except for a stint in the Pathology Department performing autopsies, failed to materialize. Moreover, some of my reports, presented at clinico-pathological conferences, exposed diagnostic errors as well as surgical mismanagement with the potential to trigger malpractice litigation, an uncomfortable situation for me since Mercy Hospital had a small, closed medical staff.

John G. Mateer, Detroit, Michigan, 1960-1961         

Searching for alternatives, I came across information about training programs at Henry Ford Hospital in Detroit. Originally opened in 1915 under the sponsorship of the Ford family and known for its early clinical trials with penicillin during WWII, this institution had flourished since the early 1950s under the leadership of its Executive Director, Robin C. Buerky, one of the top and most innovative hospital directors in the country. Unbound by the Board of Trustees and Medical Advisory Board, Buerky had managed to recruit a number of prominent specialists to the institution’s closed professional staff. Moreover, he accomplished the construction of a new 17-floor Clinical Building, formally opened in 1955, as well as the expansion of the existing hospital’s wings in 1957. A Division of Medical Education and Research allowed for the growth of house staff positions, including residencies in nineteen specialties and postdoctoral fellowships. With its academic status enhanced, Henry Ford Hospital had become by the late 1950s one of the most prominent group-practice hospitals in America, sponsoring yearly Clinical Conferences in affiliation with the Cleveland Clinic in Ohio, the Ochsner Clinic in New Orleans, the Mayo Clinic in Rochester, and the Lahey Clinic in Boston.

A trip to Detroit to investigate the possibility of a residency position in internal medicine at Henry Ford Hospital brought me into contact with John G. Mateer (1890-1966), who despite his age remained the Physician-in-Chief of the institution and therefore responsible for assembling its house staff. A Johns Hopkins medical graduate, Mateer was not only a gastroenterologist of note but also well known in automotive circles as a respected physician. A personal friend of Henry Ford, he had been called to his side and pronounced him dead from a stroke in 1947. We discussed the history of HFH, his institutional trajectory since 1920, his philosophy of medicine, and his commitment to medical care. Mateer seemed intrigued by my background and experiences in Buffalo. We chatted about the importance of so-called “functional” or psychosomatic disease. In the end, he offered me a one-year residency in internal medicine starting July 1, 1960. Excited, I immediately accepted: my path towards quality medical training had taken a decisive leap forward. Later, after completing my assignments, Mateer allowed me to spend three additional months at the Hospital, working from July to September 1961 in a newly established Psychosomatic Clinic, in order to complete effectively two years of internal medicine training that seemed adequate for my future professional life outside the United States. Indeed, an expiring exchange visa and an anticipated return to practice my craft in Argentina after marriage beckoned. In fact, my bride, Alexandra (Sandy), had been the head nurse of HFH’s psychiatric ward, and I had gotten to know her in November 1960 during a 30-day rotation assisting staff psychiatrists with strictly medical problems.

Daniel J. Bordoli, Martha W. Griffiths, 1962        

Following the October wedding and a trip by boat from New York to Buenos Aires, we landed in late November facing another political and economic crisis. There were rumors of an imminent military coup. Before my departure to the US, the current president, Arturo Frondizi, had been elected and assumed power in May 1958. A progressive politician, he had been battling former Peron labor supporters on the left and conservative forces, including the military, on the right. Such uncertainties and a veritable glut of new medical graduates looking for employment made it difficult to set up a private practice. In the meantime, without a local license, Sandy tried in vain to land a nursing position in spite of her degree and advanced skills. Contacts with the Ford Motor Company and Kaiser Industries in Argentina were unsuccessful. The option of moving away from Buenos Aires promised few rewards. The provinces were impoverished; all my foreign training and experience would be wasted. Temporarily living with my parents, we also faced an acute shortage of housing due to high demand and frozen rents. The unusually hot summer triggered frequent power outages leading to a dearth of water. My previous mentors, including those at Dr. Beretervide’s Medical Service at the Hospital Fernandez, shared similar frustrations, recommending a return to the US. The only alternative that could potentially be profitable was to move to the oilfields of Comodoro Rivadavia in Patagonia, a lonely outpost far away from cosmopolitan Buenos Aires.

Calling on one of my old friends from Medical School, Daniel J. Bordoli, I managed to obtain a temporary position at the Centro Gallego de Buenos Aires, a prominent local cultural center serving a population with ancestry from the region of Galicia in Spain. My task was to replace vacationing staff physicians during the summer months of January-February 1962. Bordoli was also kind enough to allow me to handle his private practice while he was away, but without prospects for a steady job and income we reached the difficult decision to consider returning to the United States. This determination, reached during our first Christmas together, devastated our families, notably my parents, and cast a pall over all our futures.

The Exchange Visitor Program stipulated that previous visa holders remain outside the US for two years before returning for another sojourn, but there was the possibility of a waiver of this requirement. Since I was married to an American citizen, grounds for the waiver could be framed as exceptional hardship to that person. Such a document would initially be processed by the Office of Cultural Exchange in the State Department and, if successful, forwarded to the Immigration and Naturalization Service at the Justice Department for further disposition, including extensive background checks. Even assuming success, the entire process, involving two separate government bureaucracies, promised plenty of red tape and a lengthy wait. After some initial time-consuming confusion, the application went to the US Immigration and Naturalization Office in Detroit, my last place of residence before leaving for Argentina.

After the petition was promptly returned to us with demands for additional documentation, the situation turned ugly as well as embarrassing. The State Department had a different, much more optimistic vision of conditions in Argentina, and our struggles were deemed exaggerated. My father volunteered to compose a gut-wrenching affidavit corroborating the prevailing professional and financial crisis afflicting Argentina, as well as expressing his inability to support us for an extended period of time. Other friends and former colleagues wrote similar letters, painting pictures of strife, confusion, and marginal living standards they would rather ignore, let alone put on paper. Bordoli dramatically summarized his current professional activities, mostly voluntary, confessing that without his relatives’ assistance, he would be forced to look for another line of work. He was also kind enough in early March 1962 to compose a medical affidavit recommending that my now pregnant wife, Sandy, consider returning to the US in view of potentially serious complications due to a progressive blood type incompatibility that would prompt an early induced delivery by Caesarean section followed by an immediate and complete blood transfusion of her child.

The developing scenario, reinforced by further turmoil in Buenos Aires, including President Frondizi’s overthrow late in March by the military leadership, suggested that we needed to plan for a temporary separation, with Sandy returning home to manage the last months of her pregnancy under careful medical observation while I pursued possible medical work and continued practicing as a volunteer at Dr. Beretervide’s clinic. My Detroit in-laws, in turn, contacted their friend and Congressional representative, Martha W. Griffiths (1912-2003), to lobby on our behalf at the State Department for the waiver. A member of the powerful Ways and Means Committee and an ardent advocate for women’s rights, Griffiths arranged a meeting with Dean Rusk, President John F. Kennedy’s Secretary of State. The surprising outcome, communicated to us via telegram, was that the State Department had agreed to issue me a permanent visa. Suddenly, our eventual return to the United States was assured.

Phillip T. Knies, Columbus, Ohio 1962

Following the decision to return to the US if allowed by the Immigration authorities, I had applied in January to nearly 180 institutions in the US and Canada. A sudden house staff vacancy at Mount Carmel Hospital in Columbus, Ohio prompted Phillip T. Knies (1907-1980), its Medical Director, to look for a third-year resident in internal medicine for the 1962-63 rotation at that institution beginning July 1st. Professor of Medicine at the Ohio State University School of Medicine, Knies was also interested in public health issues, notably the prevention of imported diseases; he headed the Epidemiology Division at the federal Quarantine Branch in the capital city. After carefully checking my credentials and previous performance as a house officer both in Buffalo and Detroit, he offered me this position in mid-March. Although his mail was slow to reach Argentina, I managed to send him a telegram on the last day of his deadline, March 31, accepting the position. Obtaining this salaried job was key to our return to the US.

Knies proved to be a kind but also demanding and outspoken medical leader. While focusing on clinical topics, he also enthusiastically supported humanistic perspectives, including my research for a historical paper that was awarded first prize by the Columbus Society of Internal Medicine in a contest sponsored by Ohio State University. The competition was limited to house officers from the area’s community hospitals as well as the University Hospital.

Owsei Temkin, Baltimore, November 1962

Since graduation from medical school, I had been interested in pursuing studies in the history of medicine and paleopathology with special emphasis on ancient Egypt. During my student years, I became the honorary secretary of the Instituto Argentino de Egyptologia under the directorship of Enrique Piñero Jr., with worldwide links to other academic and governmental organizations. In fact, before returning to the USA, I gave a lecture at the United Arab Republic Embassy on April 25, 1962 on the topic of paleopathology in Ancient Egypt, notably the employment of radiology to investigate its mummies. Given the bleak professional future then awaiting most foreign medical graduates in the American Midwest, I conceived the somewhat controversial idea of returning to the classroom and combining my medical credentials with the study of history. Indeed, to eschew a potentially lucrative medical career for another lengthy course of studies and a less well remunerated academic career was considered a folly, even the butt of numerous jokes. To further explore such plans, I arranged a visit to Baltimore for the purpose of interviewing with Owsei Temkin, then the Director of the Institute for the History of Medicine at the Johns Hopkins University School of Medicine. At that time, this unit offered a number of NIH fellowships for physicians who wished to obtain PhD degrees in history. A gracious host, Dr. Temkin, who specialized in Ancient Greek medicine, discussed my plans but, given my interests in ancient Egypt, ended up suggesting that the University of Chicago, notably its Oriental Institute, would be a better venue for my studies. He recommended that I contact Ilza Veith, a former Hopkins graduate and now a professor at the Medical School. After a visit to Chicago in February 1963, an agreement was reached between Professor John A. Wilson from the Oriental Institute, Professor F. Eggan from the Anthropology Department, and Veith to cosponsor an interdivisional plan of studies leading to a Ph.D. degree in Anthropology. To support it, Professor Veith promised to request a postdoctoral fellowship.

Walter Palmer and Henrietta Herbolsheimer, Chicago, Illinois, 1963

Unfortunately, as my plans for a radical career change from practicing internist to graduate student matured, Professor Veith announced her departure for the University of California, San Francisco, a move that cancelled the expected fellowship and put my academic future in limbo. However, Knies’ friendship with Walter Palmer (1896-1993), a famous American gastroenterologist and Richard T. Crane Professor Emeritus at the University of Chicago Medical School, allowed me to obtain a temporary license to practice medicine in the state of Illinois as well as a recommendation for a full-time paid position at the University Health Service of that institution, headed by Henrietta Herbolsheimer (1913-1999). This post also allowed me to receive an Assistant Professor appointment at the Medical School, making me both a faculty member and a graduate student in the same institution.

An early advocate for women’s health and epidemiology as well as an associate professor of medicine in the Medical School, Herbolsheimer became a crucial supporter, helping to schedule my student-health appointments around my classes. In addition to some tuition waivers, this staff appointment allowed me to finance my studies towards a Ph.D. degree from the History Department starting in the fall of 1963. However, as a humanist, I lamented the growing emphasis on basic biological science in patient care, sharing in a September 16, 1963 letter to Dr. Knies that “the cooler winds of science are overcoming the warmer human understanding and reassurance,” a trend I wished to highlight in my future historical work.

John A. Wilson, Lester King, and William H. McNeill, Chicago, 1963-1969.

I started my academic studies at the University of Chicago part time in the fall quarter of 1963 at the Oriental Institute, taking a course in the archeology of ancient Egypt under the supervision of Professor John A. Wilson (1899-1976), a prominent Egyptologist and successor of James H. Breasted. Subsequent course work on ancient Near East culture, hieroglyphics, and Middle Egyptian texts followed. By early 1966, the issue of a topic for my MA dissertation arose. Since all extant medical texts had already been translated, what could be considered original work? A plan to participate in an excavation project at Saqqara in Egypt, conducted by the prominent Egyptologist Walter B. Emery near the suspected tomb of Imhotep, the ancient god of healing, appeared promising. Since Wilson traveled each winter to Egypt as part of the University of Chicago’s bibliographic work at Luxor, a personal meeting with Emery was arranged in London. Unfortunately, the project’s sponsor, the Egypt Exploration Fund, frustrated by the lack of significant findings and the high costs of removing a vast network of tunnels filled with embalmed ibis birds, was cancelling the Imhotep dig. Since the Fund retained its concession in that zone, there was little hope that further explorations would occur in the foreseeable future. Moreover, UNESCO’s urgent call to save Nubian monuments from the impending flooding caused by the new Aswan dam shifted resources and explorers to new projects. With Wilson’s advice (he chaired the UNESCO Consultative Committee for the Salvage of Nubian Monuments), I decided to switch departments, joining the Humanities Division and History Department, where I was awarded an MA degree in December 1966 in both the History of the Ancient Near East and the History of Science. My career as an Egyptologist was over, but my lifelong interest in Ancient Egypt’s health and medicine, spurred by Professor Wilson, persists.

A Lecturer in the History of Medicine at the University of Chicago, Lester S. King (1908-2002), a pathologist and Harvard graduate, had a full-time day job: he was a Senior Editor of the Journal of the American Medical Association, with headquarters in downtown Chicago. Versed in the philosophy and history of medicine, his interests centered primarily on the evolution of medical thought with particular reference to modern times. King frequently lectured on the “heritage of problems,” stressing that while answers changed, problems and questions remained the same. Since he was working on a book surveying eighteenth-century topics, I shifted my research and found the topic for my doctoral dissertation: the transmission and influence of medical theory from Scotland to Germany. A notable bonus of being a Lester King student was that he was a champion of clear and concise scientific writing, mincing no words and heaping criticism on authors who employed jargon and tortured prose. At once, my Germanic style and employment of the passive voice became his favorite targets, transforming drafts of my term papers into puddles of red ink. While feared, periodic pilgrimages to his Chicago office for frank discussion and feedback became valuable lessons in my transformation from medical practitioner to academic and author.

Chairing the History Department at Chicago during part of my student years, the famous world historian William McNeill (1917-2016) had won the National Book Award in History and Biography in 1964 for his Rise of the West.

In early 1967, McNeill encouraged me to seek outside financial support from the Josiah Macy, Jr. Foundation in New York, headed by John Z. Bowers. The application was successful in March of that year, allowing me to receive a Fellowship in the History of Medicine and the Biological Sciences, renewable for three years, that enabled me to end my full-time job in the UC Student Health Service and accelerate my remaining PhD requirements on a full-time basis. In the late 1960s, McNeill was working on a new book that sought to place past human disease outbreaks within an ecological framework. Because I had medical training, we frequently engaged in conversations concerning the spread of major pandemics. He was particularly amazed by the influence of smallpox in the conquest of Mexico and other transcontinental diffusions of contagion that altered the course of history. His book, Plagues and Peoples, was published in 1976 and became popular, although some mainstream medical historians dismissed it as a work of fiction. With their strong endorsements, both Dr. King and Professor McNeill were also instrumental in persuading the University of Minnesota to offer me a position after earlier efforts to obtain a curatorship at the Smithsonian Institution failed. This development followed my presentation of a paper at the History of Science annual meeting in Dallas, Texas, December 27-30, 1968, attended by Professor Leonard Wilson, the head of a new Department of Medical History at that institution.

Owen H. Wangensteen, University of Minnesota School of Medicine 1969-1971.

When I accepted an offer to assume the position of Assistant Professor in December 1969, Owen H. Wangensteen (1898-1981), the world-famous surgeon and recently retired chief of surgery, together with his wife Sally, a medical editor and historian, immediately offered support and friendship. Wangensteen had attracted a large number of students, including several heart transplant surgeons. After his retirement in 1967, he had been instrumental in adding a floor to the Health Sciences Library to house an impressive collection of rare medical books and in starting a program of medical history with a graduate program for physicians. The Wangensteen Historical Library of Biology and Medicine officially opened in 1972, and his book on the history of surgery was published in 1978.

When I was notified in September 1970 that my appointment in the new Department of the History of Medicine at the Medical School would not be renewed after June 30, 1971, since my salary came from “soft” money, Dr. Wangensteen immediately came to my assistance. Given the economic conditions then prevailing in most medical schools, I decided to return to private medical practice in Minneapolis by February 1971. Yet Dr. Wangensteen contacted several university leaders and educational foundations around the country, shopping my academic record to keep me on an academic track. A month later, with his help, I obtained a tenured position at the University of Wisconsin. He was a pillar of support to my family and me during two difficult years of financial insecurity.

Glenn C. Faith and Peter L. Eichman, Madison, Wisconsin, 1971.

Dr. Faith was a faculty member in the University of Wisconsin Pathology Department and a member of the Search Committee, headed by the anatomist Otto A. Mortenson, formed in March 1971 to fill the chair in medical history vacated by the resignation of visiting professor Nicolas Mani. Mani had resigned in October 1970 to assume the directorship of the Institute of the History of Medicine at the University of Bonn in Germany. After my invited lecture in Madison on April 8, Faith was quite enthusiastic about my candidacy; I believe his efforts on my behalf were decisive in my selection.

Just before the end of his administrative term, Dr. Eichman, Dean of the Medical School since 1965 and an internist and neurologist, became involved in the recruitment process. In view of the fact that all my predecessors, including the first professor, Erwin Ackerknecht, who had arrived in 1946, had returned to Germany, Eichman asked me the rhetorical question: “Do you have any ambitions towards a chair of medical history in Europe?” Getting a negative response, he agreed on May 12 to generously offer me a tenured position as Associate Professor in spite of the fact that I had received my Ph.D. in History from the University of Chicago only a few months earlier. His enthusiasm and commitment following our frank exchange of ideas about the future of medical history in Madison laid the foundations for reactivating this departmental unit, developing new elective courses for medical students, and eventually offering a limited number of fellowships for graduate studies. The unit continues to flourish in the 21st century.

Ralph H. Kellogg and Rudi Schmid, San Francisco, 1984-2000.

A University of California, San Francisco Professor of Physiology and an avid student of history among his broad spectrum of interests, Ralph H. Kellogg (1920-2009) had been appointed in 1984 to the Search Committee for a new chair of the Department of the History and Philosophy of Health Sciences at his institution. In that capacity, he repeatedly encouraged me to apply for the position. After my appointment and our arrival in July 1985, he generously guided our first steps in the enchanting city of San Francisco. When we managed to purchase a home across the street from his house, Kellogg also became a friendly neighbor, hosting occasional dinners and meetings. He also offered sage advice concerning administrative matters and supported critical departmental rebuilding efforts.

Dean of UCSF’s Medical School since 1983, Rudi Schmid (1922-2007) was an early supporter of my candidacy for Professor and Chair of the Department of the History of Medicine. A world-famous expert on liver diseases, the Swiss-born researcher and teacher was eager to put his personal stamp on the UCSF faculty, appointing more than a dozen departmental chairs during his highly successful administrative tenure, which unfortunately ended in 1989. Our personal relationship flourished because of his unique interdisciplinary approach and internationalism. We bonded further over our shared experiences at the University of Chicago, where he had been a professor during my years on that campus pursuing a Ph.D. degree; he had also been at the University of Minnesota Medical School and knew Owen Wangensteen. From the very start, Dr. Schmid and his Executive Assistant Dean, Valli McDougle, became strong supporters of the Department, and soon our friendship extended beyond purely administrative formalities. Indeed, my wife and I had the pleasure of attending a number of Rudi Schmid’s evening soirees at his beautiful home and garden in Kentfield in Marin County.

Family

Last but not least, I would like to acknowledge the critical roles played by my loving wife Alexandra (Sandy) as well as my three children. Without their support, most of the above steps in my career would not have been possible. From the very start, Sandy became a faithful companion and advisor. After our wedding in Detroit in October 1961, she strongly supported my plan to move to and live in Argentina, a plan that proved frustrating and futile. Our stay in Buenos Aires sadly ended a few months later with a return to the US. Her citizenship and problematic pregnancy determined my fate, giving me the chance of a permanent visa and a possible medical career in America. A year later, Sandy bravely signed on to my unexpected quest for an academic career, an uncommon shift with significant social and economic implications for our future. More importantly, while I resumed the role of graduate student, this change created a growing burden of domestic and childrearing responsibilities for her. Juggling the needs of a growing family, the initial years in suburban Chicago were quite difficult, but with Sandy’s steady encouragement and hard work we succeeded. Purpose and common sense prevailed. Because of dwindling academic positions in my field of expertise, the ensuing moves, first to Minnesota, then Wisconsin, and ultimately San Francisco, all required flexibility, understanding, and above all careful planning.

 During an especially stressful time in Minnesota in 1981, when it appeared that my position of assistant professor financed by an extramural grant would no longer be funded, Sandy immediately started preparations for returning to her previous nursing career, ready to contribute financially. Later, when I obtained an appointment in Wisconsin and moved to Madison, she was left alone not only to manage our Minneapolis household but also to organize and execute the sale and subsequent interstate move.

 In the 14 years in Madison and later the 15 years in San Francisco while I was the senior professor and departmental chair, Sandy managed to organize and host a colossal number of social functions, receptions, and dinners with graduate students, faculty members, and visiting scholars, in addition to her weaving and volunteering at the California Marine Mammal Center and State Parks. Finally, to celebrate my retirement and 70th birthday, she was instrumental in organizing a special luncheon at the American Association for the History of Medicine meeting in Kansas City, MO to give colleagues and friends a chance for a typical roast.

 Finally, I would like to acknowledge the contributions of my three daughters Heidi, Monica and Lisa. Change can often be difficult and frustrating. Searching for adequate housing, making new friends, and finding adequate schools for our children took courage, patience, good will and time. Their acceptance of the multiple moves through the Midwest, demanded by my academic career at key points of their schooling and socializing, was admirable. As they grew older, their help included chores related to my scholarly writing. I will never forget the Christmas holiday season of 1998, when the entire family created a general index for my 700-page book on hospital history after the graduate student hired for the task unexpectedly quit. Sitting around the tree and the wrapped presents, my girls shouted and shuffled their cards to meet the publisher’s final deadline. Much was at stake: after nearly ten years of research and writing, the book needed to materialize before the next millennium. Editorial help was always forthcoming, especially from Lisa, who read my last two books on Chinatown. Monica worked with illustrations and created PowerPoint presentations. Heidi created and still maintains my private website.

 Under the rubric of family, I also wish to point out the important role played by neighbors, personal friends, colleagues, and students in each of the places where I had the privilege of interacting with them. Along the way they vastly enriched our lives. An apt and humorous summary of my odyssey can be found in the lyrics of a song composed in Madison, Wisconsin on the occasion of my academic move to San Francisco in 1985.

 
LOST TO SAN FRANCISCO: AN ACADEMIC LAMENT

(Song to the tune of My Darling Clementine) 

Chorus:
Oh my Guenter
Oh my Guenter
Gone to Frisco far away
You are lost but not forgotten
Gone to Frisco far away

 From the Pampas
Came to Buffalo
Where the snow fell everyday
To Chicago
Studied mummies
But medical history held sway

(Chorus)

 Then to Minnie
Minnie-apple
Where Lord Leonard had his way
With his ste-tos-
scope he learned
What your plans were every day

(Chorus) 

From Minneapolis
You came to Madison
Athens of the Middle West
Brought in Ronald
And then Judy
Made medical history the very best

(Chorus) 

TV Lenny
And his warehouse
Kept you warm and entertained
But the call from
San Francisco
Was too loud to be detained

(Chorus) 

AIDS and earthquakes
Not withstanding
You leave our Athens
For the West
You are lost but not forgotten
And we wish you all the best!

(Chorus)

My Rare Book Collection Has A New Home


After retiring and moving to Seattle in 2001, I decided in subsequent years to donate the books and journals acquired during my thirty-year career as a medical historian. Given my eclectic and wide-ranging interests, books and journals obtained from purchases at auction, publishers’ review copies, as well as gifts from colleagues living abroad had created a valuable assortment of primary and secondary sources related to the history of disease and medicine. Meanwhile, irked by the activities of aggressive, largely Californian bottle collectors who went after popular remedies by dumping the contents and destroying their labels, I had also started in the 1970s to assemble a veritable assortment of pharmaceutical ephemera from small rural Wisconsin and Iowa drugstores forced to close their doors.

My early and favorite recipient was the National Library of Medicine in Bethesda, MD. Books written in Spanish and collected during a WHO-sponsored visit to Latin America in 1989 went to the UCLA Medical Library in Los Angeles. Since the 2010s, German medical works have enriched the holdings of the University of Washington in Seattle. British reference materials found a new home at the Huntington Library in Pasadena, while some Argentine periodicals reinforced the splendid collections of the New York Academy of Medicine. However, since the core of the book collection and the pharmaceutical memorabilia remained intact, the final and logical step was to search for an institution interested in the medical past but devoid of historical materials.

Because of my affiliation with the University of Washington Medical School, I knew about efforts in the State Legislature to establish a second medical school under the auspices of Washington State University on its Health Sciences campus in Spokane. Despite bitter debates and considerable opposition, the proposal was approved in 2015, leading to the official establishment of the Elson S. Floyd College of Medicine. Subsequent accreditation allowed the institution to officially schedule the admission of its first class of students for August 2017. The rest was easy. That summer, I contacted Jonathan D. Potter, director of the WSU Academic Library in Spokane. Coincidentally, Potter was already actively involved in collecting and preserving digital and physical documents from the new Medical School. After a brisk email correspondence, several visits involving Potter and Spokane faculty members, as well as strong support from my family, we agreed by Christmas 2017 on the transfer of the books and pharmaceutical antiques, a task completed in spring 2018.

To learn more about where my antique and rare book collection is located, read their announcement here:

Medical historian gifts College of Medicine with rare collection

Screening Immigrants for Disease: A Personal Story

Abstract

This essay discusses the roots of American anxieties and fear of imported disease as well as efforts to screen immigrants suffering from tuberculosis. Using a personal experience, the story examines this “loathsome” menace as articulated in successive US Immigration Laws since 1882.


America’s persistent fears regarding foreign threats from imported disease have deep historical roots. Much has been written about the health-related aspects of US immigration. As ethicist Arthur Caplan recently reaffirmed, “there is a long, sad and shameful tradition in the United States in using fear of disease, contagion, and contamination to stigmatize immigrants and foreigners.” As I argued recently in a separate essay, our country has traditionally sensationalized “invasions” of foreign infectious diseases, blaming arriving weak or unhealthy “others” for their appearance and transmission. My hypothesis regarding such strong and negative emotions suggested that they were rooted in experiences reaching back to the earliest conquest and colonization of what was believed to be a pristine and healthy American continent. Imported scourges--notably smallpox--quickly overwhelmed and decimated America’s original adult population, which exhibited genetic uniformity and immunological incompetence. In Colonial America, fear, disgust, and paranoia shaped responses to imported smallpox in Boston after 1636, notably the Salem witchcraft proceedings.

The birth of a new and free nation--the United States--generated enthusiasm for protecting and improving the wellbeing of its citizens. “Good health” enhanced the quality of life and became an integral part of the new nation’s identity. Moreover, it ensured a person’s capability to obtain employment and thus create the financial foundations essential for living independently and successfully accomplishing life’s most desirable goals. For this purpose, it was essential that all risks and threats with the potential of impairing wellness be forcefully thwarted. By the nineteenth century, increasing flows of European immigrants led to periodic epidemics of cholera afflicting larger population centers such as New York. New threats like leprosy and plague originated in Asia, blamed on hordes of “uncivilized” Chinese migrants. Indeed, as analyzed in my two recent books on San Francisco, the specter of this “yellow peril” drifting towards the rest of the country ignited the flames of American nationalism and racism, spreading panic in California and beyond.

On a gray Saturday afternoon, September 20, 1958, the Dutch-owned Liberty freighter SS Alpherat slowed down at the entrance of New York Harbor, stopping next to a cutter from the US Coast Guard station on nearby Staten Island. Soon a three-man team emerged and proceeded to board our vessel. Their purpose: check all pertinent documentation and interview crew and passengers. As one of three “working passengers” traveling from Buenos Aires, Argentina, I was eager to arrive at my final destination in Brooklyn, where a friend of my father was expecting to welcome me. The trip had lasted almost a month, with successive stops in Brazilian ports such as Paranagua, Santos, Rio de Janeiro, and Vitoria that had allowed extensive sightseeing, followed by a somewhat perilous Atlantic crossing during hurricane season. Although cargo--mostly bananas--was the captain’s priority, passengers were excused from performing chores; we dined daily with him and his officers and enjoyed excellent food and conversation.

This was my first trip to the US. I had graduated a few months earlier from the University of Buenos Aires School of Medicine and, with the help of my father’s business connections, accepted an offer from Buffalo’s Mercy Hospital for a year of postgraduate training: a rotating internship that would start October 1st. After a series of bureaucratic snafus--I had to fill out a number of questionnaires assuring Americans that I was not a criminal, polygamist, revolutionary, communist, or homosexual--I finally obtained a student exchange visa issued by the US Consulate in Buenos Aires and took leave from my family. The timing seemed right: Argentina was in the midst of another round of serious political and economic turmoil. A general strike of physicians had already led to riots and violence at my alma mater, aggravated by rampant inflation and postal paralysis.

Thanks to my father’s preliminary efforts, my two shipmates came to share one of the two cabins while I had the luxury of occupying the other, an arrangement that allowed a greater measure of privacy needed to intensify my study of the American English language. After all, I would soon be interacting with very sick patients and making fateful medical decisions that required a measure of fluency and familiarity with a new vocabulary. One of the passengers was a middle-aged businessman who had periodically visited his brother in Miami and now expected to remain in the US and become his partner. Stressing his Jewishness, friendly and ebullient, he was a valuable source of information regarding American life, although it soon became clear that the “business” was linked to a sleazy gambling and prostitution ring. In sharp contrast, the other, younger male protected his privacy and seemed quite reserved. He admitted to being originally from Croatia, but it was not clear when he had immigrated to Argentina. I came away with the impression that something mysterious surrounded his persona, a suspicion that was validated when the Coast Guard officials came on board. Intimidated by their authoritative demeanor and colorful uniforms, we all crowded in a reception area, clutching our travel documents and waiting to be called for personal interviews. To our astonishment, we observed that the Croatian was promptly escorted to the waiting boat; according to his roommate, the man had turned out to be a suspected Nazi collaborator during World War II traveling on dubious documentation.

As Alan Kraut remarked in his book about the so-called “immigrant menace,” for most immigrants landing on American shores the first encounter was with a physician. While my papers seemed to be in order, one of the officials from the US Public Health Service began to take a particular interest in my health certificate and conventional chest X-ray. Both had been issued in Buenos Aires and deemed acceptable by the American Consulate. I was aware that the film depicted a small, calcified tuberculous lesion in my upper left lung, the result of an exposure to actively infectious patients. The infection, which made me tuberculin-positive, had probably occurred three years earlier during a clinical rotation in Buenos Aires’ most prominent sanatorium, the Hospital Municipal Enrique Tornu. Since 1904, the venerable institution, a vast complex of pavilions and gardens, had not only collected local sufferers of this disease, notably domestics and day workers, but also housed individuals sent from distant Argentine provinces. Indeed, like any other modern metropolis around the world flooded with immigrants, Buenos Aires continued to have its share of contagious residents afflicted with tuberculosis, although the new antibiotic treatments were now drastically lowering their numbers and calming past obsessions regarding the danger they posed to the public at large. For local physicians, my lesion was judged to be common and benign. I had been reassured that in time it would surely be reabsorbed and disappear.

While my chest X-ray seemed to attract further scrutiny and unfavorable commentary, I wondered what all the fuss about a solitary and, in my mind, innocuous spot in my lung meant. I guessed that in the absence of prophylactic antibiotic treatment--then uncommon in my country of birth--the question arose: was the disease still active? Then and now, the natural history of tuberculosis is poorly understood. The presumption was that this was still a latent case, with live tubercular bacteria encased in a calcified box. Although such containment ruled out the risk of current contagion, there was still a possibility that the dormant bacilli could be reactivated and dispersed throughout the body, leading to a generalized miliary form of infection.

At the time of my arrival, section 212 of the Immigration Law of 1952, known as the McCarran-Walter Act, stipulated that aliens suffering from leprosy, dangerous contagious diseases, and “tuberculosis in any form” could be prevented from landing and detained in a US Immigration facility for further examination. Based on Section 13 of the Act, such people were “potentially excludable,” with or without a special inquiry, if the medical officer certified that the alien was afflicted with disease, and without a right of appeal. Unbeknownst to me, tuberculosis had already been successfully employed decades earlier not only to prevent the entry of Mexican workers but also to force their removal. An unfavorable judgment could spell disaster. I suddenly realized that I could be taken off the ship, perhaps quarantined before being repatriated, thus jeopardizing my plans for an internship and residency in America; my hopes for a postgraduate education abroad would be shattered.

Delving into the spectrum of emotions that drove Americans to harsh measures like segregation and isolation is illustrative. Fed by psychological, ideological, and pragmatic urges, notably xenophobia and overt racism, these efforts succeeded in scapegoating and stereotyping not only victims of smallpox, but later also sufferers from leprosy, tuberculosis, plague, and syphilis. In fact, a few months after the Chinese Exclusion Act, the first formal US Immigration Act, approved on August 3, 1882, excluded people with mental and physical defects considered threats to the country. The emphasis was on visible abnormalities, notably those “suffering from loathsome or dangerous disease.”

Since our skin bears many natural attributes--wrinkles, colors, scars, and infectious eruptions--these features have been employed throughout history as means to identify diseases and stigmatize social groups. The use of “loathsome,” a code word, was deliberate. Readily institutionalized by the medical profession, the term became part of an emotional vocabulary designed to instill aversion and rejection. Loathsomeness implied a broad range of revolting feelings, from physical disgust to moral contempt, fear to outrage and repulsion, horror to odium. Primarily intended to identify acute ailments with hideous skin manifestations, the attribution was also easily linked to moral infringements and deficiencies. Visual “skin reading” thus became the bedrock for the diagnosis of contemptible diseases, although few practitioners possessed the necessary skills for rising to the challenge. Given the stigmatized character of these diseases, most physicians, already warped by moral prejudices, relied mainly on quick visual glances during their clinical screenings, occasionally enlisting additional clinical or laboratory criteria to back up their initial interpretations.

In fact, “loathsome or dangerous contagious disease” remained a key criterion for exclusion in the Immigration Act of 1891. Opened in 1892, the Ellis Island immigration station subsequently witnessed the arrival of millions of people, peaking between 1910 and 1914, until its closure just four years before my arrival. Numerous scholars have uncovered personal stories and official documentation related to the medical inspections of immigrants that took place there. For steerage passengers, the “six-second physical” occurred following disembarkation, as men, women, and children lined up single file with their belongings, joining several queues headed by officers from the US Public Health Service. With experienced eyes targeting facial features and bodily postures, the doctors rudely retracted the eyelids of every newcomer with their fingers, searching for signs of a dreaded disease: trachoma, a ubiquitous infectious condition that could lead to total blindness. Following the brief inspection, examiners identified those newcomers requiring additional medical evaluation through a system of coded alphabetical letters, written on their clothing with a stick of chalk. Detention and further examinations awaited many.

This momentous formality became seared in the collective memory of the new arrivals. Decades ago, searching in the archives of the New York Public Library, I discovered a medical examination play written in the early 1900s, obviously based on contemporary immigrant experiences. I surmised that the script was perhaps intended for local school children, with the goal of familiarizing them with America’s health culture and medical system. The screenings, a crucial rite of passage often termed the “Final Day of Judgment,” were justified as protecting the homeland from nasty and crippling imported diseases, particularly contagious conditions with the potential of triggering epidemics deemed dangerous. But beyond the new awareness based on the germ theory of disease loomed other interests--racial, economic, cultural, and political--all wishing to restrict or halt immigration. As one health official, A. C. Reed, wrote in 1913: “The medical phases of immigration blend very quickly into the subjects of national health protection, national eugenics, and even the future existence of the ideals and standards of life which we are proud to call American.”

In addition to arrivals suffering from “loathsome and dangerous contagious disease,” the Immigration Act of February 1907 added a separate category for exclusion: “persons afflicted with tuberculosis.” In fact, “pulmonary tuberculosis” had already been classified as a “dangerous contagious disease” in the Classification of Excludable Medical Conditions in 1903. The new focus on this old and lethal scourge at the turn of the 20th century, previously referred to as “consumption” or the “wasting disease,” reflected greater medical and public awareness of its presence among arrivals from North and Central Europe. Affecting all social classes but predominantly infecting poor urban dwellers living and working in overcrowded environments, pulmonary tuberculosis had acquired nearly epidemic proportions, but its manifestations remained hidden from sight, in sharp contrast with more spectacularly disfiguring human foes like smallpox and leprosy. Nicknamed the “white plague,” the silent disease, previously considered hereditary, lurked in unventilated tenement houses and sweatshops. Stereotypical among its New York victims were Jewish tailors.

Thanks to the bacteriological revolution of the 1880s, the etiological agent of this scourge--the tubercle bacillus--had rapidly gained credence in medical circles, helping not only to confirm the diagnosis by its presence, but also to explain the mechanism of its aerial person-to-person transmission through coughing and spitting. By the early 1910s, culturing such discharges for bacterial presence and visualizing the location and damage in lung tissues with the help of chest X-rays decisively contributed to the screenings of potential sufferers, including immigrants. Fear and discrimination reared their ugly heads. In the new world of germs, sufferers, stigmatized for their lack of hygiene and careless behavior, were now held responsible for the spread of tuberculosis and often subjected to involuntary confinement. The sick could threaten the health of the nation. A decade later, the Immigration Act of 1917 included “persons afflicted with tuberculosis in any form.”

A key factor driving the expanding category of excludable immigrants was the belief that if allowed to enter the United States, they would “likely become a public charge.” The country needed healthy laborers capable of tolerating the physical challenges of a rapidly industrializing world. Eugenic considerations played an influential role, building on earlier hereditary prejudices. Racially constructed notions of debility, notably those applied to “rice-eating” Asians, favored wholesale exclusion. Sickness, especially long-term, could prove disastrous for their job-dependent lives. Indeed, since tuberculosis often turned into a chronic disease, the expense of caring for those admitted to America would be considerable, especially if carried out in newly created specialized facilities: sanatoria. To this day, prejudice and stigmatizing loathing continue to be a key component of our invasive screening procedures carried out in the name of public health and safety.

Fortunately, my own story had a happy ending. Whether the US Public Health officials subscribed to earlier notions that people from other countries in the Americas possessed greater susceptibility to tuberculosis could not be determined. On the other hand, skilled immigrants--myself included--faced fewer obstacles to enter the US under the McCarran-Walter Act. After a brief consultation, I was finally allowed to disembark with the proviso that I schedule serial checkups with some of my future medical colleagues in Buffalo. More than half a century later, the lonely calcium spot--like a veritable time bomb--still lodges in my lung, occasionally spooking my caregivers.

Sources

Emily Abel, “From Exclusion to Expulsion: Mexicans and Tuberculosis in Los Angeles, 1914-1940,” Bulletin of the History of Medicine 77 (Winter 2003): 823-849.

American Federation of Labor, “Some Reasons for Chinese Exclusion,” reprinted in US Senate Documents of the 57th Congress, Washington, DC: Government Printing Office, 1902.

Diego Armus, The Ailing City: Health, Tuberculosis, and Culture in Buenos Aires, 1870-1950, Durham: Duke University Press, 2011.

Barbara Bates, Bargaining for Life: A Social History of Tuberculosis, 1876-1938, Philadelphia: University of Pennsylvania Press, 1992.

Susan Craddock, “Tuberculosis, Tenements, and the Epistemology of Neglect,” in City of Plagues: Disease, Poverty and Deviance in San Francisco, Minneapolis: University of Minnesota Press, 2000.

Amy L. Fairchild, “Politics of Inclusion: Immigration, Disease, Dependency and American Immigration Policy at the Dawn and Dusk of the 20th Century,” American Journal of Public Health 94 (April 2004): 528-539.

Alan M. Kraut, Silent Travelers: Germs, Genes, and the Immigrant Menace, New York: Basic Books, 1994.

Howard Markel and Alexandra M. Stern, “The Foreignness of Germs: The Persistent Association of Immigrants and Disease in American Society,” The Milbank Quarterly 80 (2002): 757-788.

A. C. Reed, “Immigration and the Public Health,” Popular Science Monthly 83 (1913): 320-328.

Guenter B. Risse, Plague, Fear and Politics in San Francisco’s Chinatown, Baltimore: Johns Hopkins University Press, 2012.

Guenter B. Risse, Driven by Fear: Epidemics and Isolation in San Francisco’s House of Pestilence, Urbana: University of Illinois Press, 2016.

Guenter B. Risse, “Fear of Outsiders is an American Tradition,” History News Network, Feb 1, 2016. http://historynewsnetwork.org/article/161836

Nayan Shah, Contagious Divides: Epidemics and Race in San Francisco’s Chinatown, Berkeley: University of California Press, 2001.

Beth Rowen, “Immigration Legislation From the Colonial Period to the Present,” www.infoplease.com/us/immigration/legislation-timeline-html

Elizabeth Yew, “Medical Inspection of Immigrants at Ellis Island, 1891-1924,” Bulletin of the New York Academy of Medicine 56 (1980): 488-510.

Gut Reactions: Fear and Disgust in Public Health History

Featured in the History of Emotions blog this week:

As far back as Charles Darwin’s 1872 work, The Expression of the Emotions in Man and Animals, it has been assumed that the emotion of disgust is an evolved universal trait, found in all cultures at all times. In the late 1960s and early 1970s, Paul Ekman and Wallace Friesen presented evidence for a universal facial expression of disgust, all but cementing it as one of the six basic emotions that are still used as the backbone of a great deal of emotion research.

This week, a group of scholars from a range of disciplines have come together to explore different aspects of disgust, attempting to define it and questioning whether it had an origin, whether it is universal, and whether it can even be called an emotion. The first in this series is a guest post by Guenter B. Risse.

Risse is professor emeritus of the history of medicine at the University of California, San Francisco, and currently affiliate professor at the University of Washington’s Department of Bioethics and Medical Humanities. He has been working on the influence of emotions in public health, notably in his Plague, Fear and Politics in San Francisco’s Chinatown (Johns Hopkins University Press, 2012) and the recent Driven by Fear: Epidemics and Isolation in San Francisco’s House of Pestilence (University of Illinois Press, 2016). His blog attempts to sketch the dynamics responsible for the construction of health risks and the drastic separation of persons deemed dangerous to society.


Testifying in April 1876 before a commission appointed by the California Senate to study the nefarious impact of Chinese immigration, Hugh Toland, a prominent surgeon and founder of his own medical school, asserted that 90 percent of venereal cases in San Francisco could be blamed on Chinese prostitutes. Engaging in “beastly sex,” they ultimately threatened America’s survival by “infusing poison into the Anglo-Saxon blood.” Revealing himself as a notorious racist, Toland expanded on the alarming incidence of “some of the worst cases of syphilis I have ever seen.” For a few dimes, white preteen local boys were initiated into the pleasures of the flesh with “frightful” consequences, poisoned by the seeds of a nearly incurable disease filling “our hospitals with invalids.” Indeed, some women in advanced stages of the disease came to the city hospital, where frightened and disgusted patients as well as physicians sought their immediate expulsion and transfer to a stigmatized isolation facility: San Francisco’s Pesthouse, where they briefly languished before dying.

Continue reading at the History of Emotions blog, August 22, 2016.

SF Public Library - Facebook Event Announcement

The San Francisco Public Library has created a Facebook Event page for our event on June 2 at 6pm - a talk on the new book, Driven by Fear. It's a Q&A style format rather than a lecture, with lots of personal stories and anecdotes. If you are interested, or know of a friend who might be interested, please share it.

Book sales will be available too.

San Francisco Public Library - Facebook event

Q&A - University of Illinois Press

Guenter B. Risse is a professor emeritus of the history of medicine at the University of California, San Francisco. He answered some questions about his book Driven by Fear: Epidemics and Isolation in San Francisco’s House of Pestilence.

(Read full version of the Q&A on the University of Illinois Press blog site HERE)

Q: What was the San Francisco Pesthouse and why was it unique?

Guenter Risse: So-called “pest houses” or lazarettos were mostly temporary facilities created to segregate and isolate individuals suspected of, or actually suffering from, diseases deemed contagious and therefore capable of igniting an epidemic outbreak. Initially conceived during the Black Death pandemic in Europe, such secluded institutions and quarantine stations were preferably located in port cities or on solitary islands. Unlike other institutions of forcible confinement such as hospitals, asylums, and prisons, they have seldom been studied. Delving into the spectrum of feelings that drove communities to harsh measures like segregation and isolation is illustrative. Their role must be more fully explored so that it can inform our current beliefs and behavior.

The San Francisco Pesthouse opened its doors after the Gold Rush to separate arriving migrants suffering from smallpox, notably Chinese. Given the perceived dangers of mass infection, the choice of location was a hillside in the largely rural Potrero Nuevo district, away from the city. Here it remained for almost a century, in spite of subsequent urban sprawl. Its uniqueness stems from the fact that beyond harboring infected people, the SF Pesthouse became an instrument of blatant racial segregation, not only confining Chinese suffering from syphilis and leprosy but also warehousing a host of other terminal patients with chronic ailments.

Q: In Driven by Fear, you show that we cannot solely understand reactions to San Francisco’s Pesthouse and its inmates by framing the problem in purely rational terms. What does the story tell us about the need to understand the role of emotions in shaping local responses?

Risse: California’s emotional landscape was shaped by environmental and cultural factors linked to the consequences of the Gold Rush. Self-reliance, independence, and the pursuit of personal wealth favored narcissism and discouraged social cohesion. Distrust of government led to weak and corrupt political, administrative, and legal institutions. Health and illness were mostly private concerns in a world of intense social and economic competition requiring a great deal of physical and spiritual stamina. Threats to individual well-being lurked everywhere, sparking distress and fear of their implications for employment and survival. Often deprived of aid provided by family and friends, vulnerable newcomers easily panicked, blaming and lashing out at strangers. The results in San Francisco were not surprising: a profusion of private physicians with dubious credentials, together with a corrupt and underfinanced municipal health department that provided its hospitals and the Pesthouse with insufficient funds, compounded the inmates’ misery.

Q: How can we enhance public health history?

Risse: Traditional narratives of public health depict a successful rational enterprise based on successive scientific discoveries, progressive legislation, and their steadfast implementation in the service of government and the well-being of a population. Since emotions are essential forms of human experience, however, they are central for understanding human conduct and survival. Indeed, behavior and language expose an array of negative emotions surrounding threats or states of sickness, from anxiety to fear of contagion, disgust with deformed physical appearances, as well as contempt concerning cultural practices that create undesirable landscapes and unfamiliar odors. As Pacific Coast cities such as San Francisco came to fear the scope and consequences of Asian migration, notably from China, feelings of revulsion extended to persons suffering from “loathsome” diseases, especially smallpox and leprosy. The use of this code word was deliberate; it was part of a popular and medical vocabulary designed to instill aversion. Loathsomeness implied a broad range of revolting feelings, from physical repulsion to moral scorn, racial fears to outrage, odium to hate. Taken together, these emotions were instrumental in creating a political and social climate responsible for shaping the institutional trajectory and reputation of the SF Pesthouse.

Q: Can the history of emotions also offer assistance in understanding America’s persistent fears regarding recent threats of intrusive foreign diseases such as Ebola fever and Zika virus infections?

Risse: Prompted by xenophobia and overt racism, the United States has traditionally sensationalized “invasions” of foreign infectious diseases, blaming arriving weak or unhealthy “others” for their appearance and transmission. Awareness of dangers, real or imagined, is largely based on assumptions derived from past exposures. Indeed, strong and negative emotions are rooted in experiences reaching back to the earliest conquest and colonization of what was erroneously believed to be a pristine and healthy continent. Unfortunately, the first contacts between indigenous people, conquerors, and settlers led to the involuntary spread of deadly endemic European diseases conveyed by the new arrivals. These imported scourges—notably smallpox—quickly overwhelmed and decimated America’s adult population, which exhibited genetic uniformity and immunological incompetence.

Empires declined or were destroyed, with depopulation forcing the introduction of slaves from Africa afflicted with their own equally fatal cohort of tropical diseases like malaria and yellow fever, all ostensibly sullying the Promised Land. It is therefore not surprising that since colonial times, fear, disgust, and paranoia have shaped responses to imported contagions. “Good health” enhanced the quality of life and became an integral part of the new nation’s identity; all risks and threats with the potential to impair wellness had to be forcefully thwarted. Similar prejudices about foreign health threats have persisted over the centuries. The 1918 flu pandemic was blamed on a surge of European migrants. Tuberculosis was associated with arriving Jews, while Italian and Polish arrivals were held responsible for spreading poliomyelitis. For a while, Haiti was incriminated in the origin of HIV/AIDS. SARS came from China. Not to be excluded were Mexican newcomers, repeatedly denounced for bringing tropical scourges into the United States.

Q: Stigma and scapegoating play powerful roles in your narrative. What made them so critical in San Francisco?

Risse: Stigma in San Francisco resulted from a unique combination of powerful nativist, racial, medical, cultural, and economic ideologies. The term, originally employed in ancient Greece, sought to highlight the presence of certain cutaneous markers or blemishes suggestive of hidden internal defects based on a popular premise that the skin could “speak” and thus mirror troubles of the body and soul. Such a concept easily led to racial profiling and social distancing, misogyny and cultural conflict. Indeed, “stigmata” revealed much about a person’s identity, moral and social standing, as well as health status. The latter could be quickly recognized and diagnosed for a group of infectious diseases with obvious facial lesions that posed a serious danger of widespread contagion.

Given the presence of a substantial number of migrants from China, San Franciscans eagerly discredited Chinatown residents, considering them as potential purveyors of deadly contagious diseases such as smallpox, leprosy, syphilis, and plague. Blaming this ethnic population for each epidemic outbreak and threatening to close the district by forcefully returning the inhabitants to their homeland became routine. The vehemence of this critique and blame was exacerbated by San Francisco’s 19th-century reputation as one of the most healthful U.S. cities, a welcome and profitable destination for convalescents, notably those suffering from tuberculosis. Known for its mild weather and tonic breezes, the “city of refuge” worried about the impact of Chinese casualties on its low mortality rates.

Q: Finally, the rather sad story of the shunned and poorly funded San Francisco Pesthouse has a much happier ending. What convinced San Franciscans (rugged individualists, to say the least) to change their approach to fighting contagious diseases?

Risse: The devastating 1906 earthquake and fire with its vast panorama of urban ruin prompted a dramatic shift in outlook. With large populations of homeless and unemployed, the citizens of San Francisco rallied to display solidarity and empathy, adopting a true communitarian approach to deal with all aspects of the catastrophe. The new emotional tone was aided by the attendant rise of progressivism with its plea for political reform, honest government, and respect for scientific achievements. Moreover, San Francisco’s traditional apathy with regard to public health measures was further shaken following another outbreak of bubonic plague in May 1907 that, unlike its 1900 predecessor, afflicted mostly non-Chinese residents.

At the Pesthouse, expanded operating budgets and a brand-new ward for plague cases reversed decades of neglect. Federal demands for an anti-rat campaign (the rat was widely believed to act as host for the plague bacillus) led to the creation of a Citizens’ Health Committee, a broad coalition of civic, commercial, transportation, and academic interests, in January 1908. In the face of feeble municipal resources, the organization pledged to fund a grassroots drive to educate the public and eliminate rats along with their sources of food and shelter. This preventive endeavor succeeded admirably within a few months, eliminating the scourge and allowing civic leaders to proclaim San Francisco “the healthiest large city in the United States.”

 

New Book Announcement: Driven by Fear

I'm very pleased to announce that my new book, Driven by Fear: Epidemics and Isolation in San Francisco's House of Pestilence, has just been published by the University of Illinois Press (December 30, 2015).

Filling a significant gap in contemporary scholarship, this book looks at the past to offer critical insights for our age subjected to bioterror threats and emerging infectious diseases. Public health history requires an understanding of irrational as well as rational motives. To that end it is essential to start delving into the spectrum of negative emotions that drove policy decisions like segregation and seclusion. Feelings of dread and disgust fed psychological, ideological, and pragmatic urges to scapegoat and stereotype victims—particularly the nineteenth-century immigrant victims of smallpox, leprosy, plague, and syphilis who were isolated in San Francisco’s Pesthouse. Practically erased from the historical record, this important establishment arose within a political climate dominated by xenophobia and racism. Since the local authorities chose to protect the city's reputation as a haven for health restoration, the institution was kept outside the metropolis, in an environment of want and despair. Ultimately, the Pesthouse story aims to reclaim people and events hitherto invisible while offering valuable comparisons with American reactions to AIDS, SARS, and more recently Ebola fever.


For more information, including Table of Contents and reviews, see my page under Books.

This book is available for sale on Amazon in Kindle, Hardcover and Paperback versions.

Manipulation of Ancestral Dread: Nativist Responses to Perceived Foreign Health Threats


Newly perceived health risks elicit instinctive and protective emotional responses, mostly anguish, fear, and disgust. Their intensity varies based on a person’s previous experiences, available knowledge about the perceived danger, and efforts by politicians and the media designed to validate, exploit, amplify, or distort the menace. Stigmatization and blame seek to ameliorate anxiety by deflecting responsibility to others, notably foreigners. Earlier it was someone from Haiti who supposedly brought HIV/AIDS to America. China was held responsible for spreading SARS and the avian flu. More recently, in 2014, a traveler from West Africa introduced Ebola fever. Courtesy of the “girls” and even “boys” from Brazil, dread of another contemporary introduction, the Zika virus, sustains our current national fear mongering.

A further historical example is quite illustrative. The bizarre incident involving Charles C. O’Donnell (1835-1912) and his procession with a Chinese sufferer of leprosy through the streets of San Francisco in the fall of 1878 was recounted elsewhere (History News Network, February 1, 2016). It remains a stark example of stirring up strong anti-immigrant feelings through the exposure of visibly sick foreign individuals pronounced extremely dangerous to public health. Using terror to mobilize like-minded people and create panic, the macabre parade led to O’Donnell’s arrest at the Palace Hotel. Yet, following his release, the racist doctor turned politician continued to display his xenophobia and obsession with both smallpox and the “scaly disease.” His nativist rants, including the charge that China had already conquered California and that its elected officials were receiving their orders from the Emperor in Peking, contributed to the eventual passage of the Chinese Exclusion Act of 1882.

Dr. Charles C O'Donnell


Practicing medicine at the edge of Chinatown since the Gold Rush (his office was located at 704 Washington St., at the corner with Kearny), O’Donnell boasted of treating families, notably women and children. Originally from Baltimore, his medical credentials remained murky and he often felt professionally threatened, accused of being an abortionist and quack. Undaunted, O’Donnell frequented the streets of the Chinese district, hunting for sufferers of smallpox and especially leprosy “like a hog rooting truffles,” as a New York Times reporter put it. Detecting and confronting lepers with highly visible lesions, this self-proclaimed healer took advantage of their despair and apathy. Pretending to offer a remedy (lumps of pork fat), O’Donnell claimed to “own or seize” his “patients.” Gleaned from Chinese practitioners, his rationale was that this popular ethnic food article would act homeopathically. He blamed those who submitted to his advice for any lack of improvement.

In the summer of 1884 O’Donnell unveiled a new project to spread his fear mongering beyond San Francisco: a trip by train across the country in the company of two Chinese afflicted with advanced symptoms of leprosy. The main purpose was to exhibit them at anti-immigration rallies organized during successive stops in St Louis, Cincinnati, Indianapolis, Chicago, New York, Philadelphia and Baltimore, ending in Washington DC. At the nation’s capital, these unfortunate individuals would be brought to the Capitol building and handed over to US Congress members debating immigration legislation.

The ostensible reason for the transcontinental show was O’Donnell’s determination to enlighten East Coast dwellers and authorities about the horrors of foreign leprosy and the risks of admitting further “coolies” suffering from despicable diseases. Asked O’Donnell: “Who gave us lepers for our companions, who installed yellow harlots for the avowed purpose of maintaining gambling dens and opium decays?” Public officials, mostly arrogant and ignorant “Eastern monomaniacs,” were at fault, allowing Chinese “harbingers of disease and couriers of death” to freely roam the streets of many American cities and towns.

Transportation of diseased Chinese


To achieve his goal, O’Donnell requested a special boxcar to house the sorry victims of this disease, with a partition for an exhibit of photographs taken of other sufferers similarly diagnosed. For him the “heathen Chinee” were like domestic cattle, useful specimens for scientific experimentation and exhibition. However, citing potential risks of contagion and violation of interstate commercial regulations, the railroad companies refused to transport the sick Chinese in spite of repeated threats from the doctor’s rowdy supporters, who wished to force the issue by creating a human barrier around the travelers. To avoid further clashes with police, O’Donnell decided in late June to start his journey without the company of his “scaly pets,” substituting a photographic exhibit described by a local newspaper as “rude portraits of a dozen hideously seamed, scarred and swollen faces.”

His departure was celebrated with a spectacular procession through downtown San Francisco to the waterfront before he took the Oakland ferry on the way to the train station. Accompanied by two small German bands, an open hack drawn by four horses carried O’Donnell to his destination, the trip interrupted by a long-winded diatribe promoting the destruction of the infidel Chinese. Another supporter, dressed up as a medieval crusader, sought to symbolize the religious nature of the racist quest, a clash framed between the Christian cross and a pagan dragon, characterized as a “monster of evil omen” spreading its wings.

The leprosy dragon


Seeking to attract the curious and idle at successive stops along his pilgrimage, O’Donnell extolled his expertise concerning leprosy, always promising to unveil his horrific patients after his lectures. Of course, the sick men never materialized, prompting the doctor to blame the railroad for having delayed the freight car in which his specimens allegedly traveled and then to share his gruesome photographs with the audience. An editorial in the San Francisco Chronicle asserted: “Some of Dr. O’Donnell’s talk about leprosy is wild enough to entitle him to a place in a lunatic asylum, while his whole scheme is to secure personal notoriety, not to redress any public evil.”

Arriving in New York City on August 1st, O’Donnell sought lodgings at the Grand Union Hotel. Attracted by newspaper advertisements, nearly 2000 persons congregated at Union Square the next morning, looking for a free show and especially the chance to glimpse the “living dead.” When the prejudiced provocateur announced that the mayor had denied a permit for their display and that the freight car in which they lived had been diverted to Brooklyn, the disappointed crowd grew restless. O’Donnell was forced to seek shelter in a nearby tailor’s shop. A week later in Washington DC, a largely Black audience of merely 200 idlers listened impassively at the steps of City Hall as O’Donnell delivered the same racist rant without illustrations. On mayoral orders, no exhibitions or delivery of leprous individuals would be permitted. Undaunted, O’Donnell quietly returned to the West Coast.

Perhaps the most bizarre trip in the annals of bigotry sustained by fear of a “loathsome” and incurable disease was over, but its high priest of ethnic hate was rewarded: thanks to voters from wards in white working-class neighborhoods and waterfront dwellers, O’Donnell was elected coroner of San Francisco in November 1884. Emboldened, he later ran unsuccessfully as an independent candidate for governor of California and, in the 1890s, repeatedly for mayor of San Francisco. In his History of the Pacific Coast Metropolis (1912), John P. Young, the veteran editor of the San Francisco Chronicle, called O’Donnell a “malpractitioner.” A brash medical charlatan and political demagogue, he cunningly exploited contemporary fears about the importation of leprosy to further his racist, anti-immigration stance.

SOURCES:

American Federation of Labor, “Some Reasons for Chinese Exclusion,” reprinted in US Senate Documents of the 57th Congress, Washington, DC: Government Printing Office, 1902.

John H. Boalt, “The Chinese Question,” in Chinese Immigration; its Social, Moral, and Political Effects, Report to the California State Senate of its Special Committee on Chinese Immigration, Sacramento: State Office, F. P. Thompson, 1878, pp. 254-6.

Yong Chen, Chinese San Francisco, 1850-1943: A Transpacific Community, Stanford, Cal.: Stanford Univ. Press, 2000.

Noble D. Cook, Born to Die: Disease and the New World Conquest, 1492-1650, Cambridge: Cambridge University Press, 1998.

John Duffy, Epidemics in Colonial America, Baton Rouge: Louisiana State University Press, 1971.

Philip J. Ethington, The Public City, Cambridge: Cambridge University Press, 1994, pp. 322-24.

New York Times, Jul 16, 1884, Jul 26, 1884, Jul 29, 1884, August 2, 3, and 9, 1884.

Guenter B. Risse, “Fear of Outsiders is an American Tradition,” History News Network, Feb 1, 2016. http://historynewsnetwork.org/article/161836

Guenter B. Risse, Driven by Fear: Epidemics and Isolation in San Francisco’s ‘House of Pestilence,’ Champaign, Illinois: University of Illinois Press, 2016.

Charles E. Rosenberg, The Cholera Years: The United States in 1832, 1849, and 1866, Chicago: The University of Chicago Press, 1962.

San Francisco Bulletin, Jun 30, 1884.

San Francisco Call, Jul 20, 1884.

San Francisco Chronicle, Jul 19, 1884, Aug 4, 1884.

Nayan Shah, Contagious Divides: Epidemics and Race in San Francisco’s Chinatown, Berkeley: University of California Press, 2001.

Peter N. Stearns, American Fear: The Causes and Consequences of High Anxiety, New York: Routledge, 2006.

John P. Young, San Francisco: A History of the Pacific Coast Metropolis, San Francisco: S.J. Clarke Publ. Co, 1912.

Fear of Outsiders is an American Tradition

Thank you to the History News Network for publishing my latest article, "Fear of Outsiders is an American Tradition," which features my recently published book, Driven by Fear: Epidemics and Isolation in San Francisco's House of Pestilence.

On the morning of September 19, 1878, Charles C. O’Donnell, a physician with dubious credentials and the leader of the rabidly racist Anti-coolie League, seized a Chinatown dweller grotesquely afflicted with highly visible leprous sores. Forcing the man to mount an open delivery wagon, this practitioner turned politician proceeded to parade the disgraced individual through the streets of San Francisco. Stopping at several key downtown intersections before reaching Market Street and the swanky Palace Hotel, O’Donnell harangued a growing and terrified crowd, emphasizing the great danger of contagion posed by his repulsive “moon-eyed leper.” At the same time, pamphlets were distributed featuring the drawing of a Chinese face ravaged by the disease and proclaiming the existence of “a thousand lepers in Chinatown.”

THE EBOLA QUARANTINE: ROLE OF HUMAN EMOTIONS

This essay completes the analysis of the recent quarantine imposed on West Point, a district of Liberia’s capital, Monrovia, by exploring the present and past emotional dimensions of epidemic outbreaks and quarantine measures. This approach is timely given recent remarks by the Liberian president explaining her reasons for ordering the failed closure of an entire neighborhood. Human feelings have become an important subject for interdisciplinary inquiry, since they offer valuable assistance and further insights into past events, including the danger posed by epidemics and the imperative to segregate and isolate the sick. Historical recollections--from the quarantines imposed on San Francisco’s Chinatown in 1900 to the recent Ebola fever scare--illustrate as well as help explain the importance of negative sentiments such as fear and disgust for understanding new epidemic threats and aggressive public health measures.

Previous essays, The Ebola Outbreak: Historical Notes on Quarantine and Isolation (August 26, 2014) and The Ebola Outbreak: History Repeats Itself: Another Failed Quarantine (September 22, 2014), discussed the establishment and subsequent lifting of a total quarantine around West Point, a northern township of Liberia’s capital, Monrovia. The drastic and ineffective action occurred between August 20 and 29 of last year. Ordered by Liberia’s President, Ellen Johnson Sirleaf, the 2011 Nobel Peace Prize winner, the operation was promptly executed by members of an Ebola Task Force in riot gear and armed with automatic weapons. In response to inquiries from local and foreign health officials, the Liberian military leaders insisted that the quarantine was designed to restore community order and cooperation. After the failed blockade was hastily lifted, a presidential board of inquiry justified the intervention and the accidental killing of an adolescent boy as rational and necessary given the atmosphere of chaos and rioting, a rationale strongly challenged by an independent Liberian Commission on Human Rights. The irony was not lost on local and international observers: human rights advocates chastising a world peace-prize winner for her actions.

In a recent interview granted to the New York Times, Sirleaf changed her story, seeking to defend her decision: “We did not know what to do. We were all frightened. It was an unknown enemy. People attributed it to witchcraft. I was personally frightened.” Her instinctive intent had been to stop the transmission of the Ebola virus at all costs: “we went into a security approach,” she admitted, closing the borders between the healthy and the sick, and thus triggering confusion, anger, and mistrust instead of cooperation. Sirleaf sounded subdued: “Now I know that people’s ownership and community participation work better in a case like this. I think the experience will stay with us.”

Perhaps, but our emotions usually have the last word. In Africa and around the world, the pervasive fear concerning Ebola persists with regard to other potential killers, like avian flu, malaria, cholera, and plague. This sentiment is frequently mentioned but rarely analyzed. To be sure, awareness of dangers, real or imagined, is mostly based on assumptions derived from past experience. Self-preservation is intuitive and we observe examples of emotion-driven behavior in response to an ever-more complex sequence of risks around the world. Feelings are essential forms of human experience; they are central for understanding human communication and survival. With the help of disciplines like cognitive neurobiology, anthropology, as well as evolutionary, social, and clinical psychology, new insights are emerging concerning a broad range of sentiments and their influence on human decision-making and behavior. Negative emotions seem to operate as an early warning system, monitoring and detecting environmental dangers and social threats. While gruesome images provide physical, aesthetic and moral appraisals, smell and touch offer their own impressions and language to reinforce such aversive emotions. Such sensory cues trigger or reinforce repugnance. Trumping reason and knowledge, fear and disgust prompt protective responses meant to be beneficial for survival.

Emotion-driven aversion and protection tend to flourish in crowded urban settings. Early medieval quarantines—envisioned as forty days of rest and renewal sanctioned by the Scriptures—were first established near key Mediterranean seaports and trade centers. Since the Black Death, so-called “lazarettos” flourished in Europe, together with secluded facilities for sufferers of leprosy and syphilis. German “pox houses,” British “lock houses” and “fever hospitals,” as well as American isolation facilities or “pest houses” testify to a far from benign tradition of institutional segregation and isolation. America’s persistent fears regarding threats of epidemic disease from abroad and their evasion through protective public health measures are particularly noteworthy. During the bubonic plague scare in March 1900, the leading Republican newspaper, the San Francisco Chronicle, launched into an attack on the city administration claiming that even a suspicion of the disease “is sufficient to terrify the community, paralyze commerce, turn away strangers and prevent even the visits of neighbors and friends.”

Plague Doctors

Plague Doctors

As described in my recent book Plague, Fear, and Politics in San Francisco’s Chinatown (Johns Hopkins, 2012), one of William R. Hearst's newspapers, the New York Journal, issued a special Sunday “Plague Edition,” leading with a headline that announced “The Black Plague Creeps into America.” The accompanying news story painted a sensational scene of the purported epidemic, with men collecting bodies of plague victims in the streets of Chinatown. These fictitious accounts, based on biblical and historical descriptions and medieval iconography, conjured up dread and panic among San Franciscans. Even interviews with Chinese residents uncovered “much fear.” Some of these feelings were directed at American physicians said to poison their children. No one played in the streets; mothers were afraid of leaving their homes since “doctors are going to send all Chinese away far out to sea on rock; no room. no place. Chinese must all leave Chinatown.” There was also fear of “Mexican” soldiers invading the district and forcing all inhabitants to be inoculated with deadly poison.

Today, globalization and the formation of pluralistic societies seem to enhance feelings of aversion directed at virtually all aspects of human relationships: we are said to be a fear and disgust-obsessed species. Deviance is a social construction employed to define, separate, and marginalize certain categories of individuals or groups believed to threaten the order, morality, and conformity of an established majority, including those supporting public health and safety. As my new book Driven By Fear: Epidemics and Isolation in San Francisco’s ‘House of Pestilence’ makes clear, a highly charged rhetoric frequently employs emotional imagery and language to assert cultural superiority and achieve social separation. Under such circumstances, dread and revulsion help develop prejudice and stereotyping, especially around issues of morality, ethnicity and nationality—but also with regard to religious beliefs, gender roles, and bouts of sickness.

Since the nation’s inception, people in the United States have displayed a singular emotional style: constant fear of contracting diseases brought to its shores by immigrants from all over the world. According to the modern Western sanitary gospel, the newly arrived “unwashed” were expected to adopt hygienic values on their road to assimilation and eventual citizenship. In fact, dread has recently been called “one of the dominant emotions in contemporary American public life.” By the end of summer 2009, as fears of a lethal and catastrophic H1N1 influenza pandemic escalated, Fox News aired a television segment with flashing signs of “Mass Quarantines” and a repetitive sound track declaring “Be Very Afraid.” Similar warnings were expressed during the 2003 worldwide SARS outbreak.

Last spring, residents in a small Southern California town angrily protested the illegal arrival of refugees--notably children from Central America. Amid howls of “invasion” and references to other calamities, the question “what happens when they come here with diseases?” revealed this deeply embedded cultural fear. In fact, replaying a sequence of events linked to the 1900 outbreak of bubonic plague in San Francisco, several US cities near the border with Mexico passed resolutions banning illegal aliens “suffering from diseases endemic in their countries of origin” from their communities.

Ebola’s association with the poverty, filth, and backwardness of underdeveloped Africa similarly offends the eye and generates widespread dread and repugnance. These anxieties are compounded by the disease’s horrific terminal bodily disintegration, including the messy discharge of bodily fluids. Given Ebola’s lethality, and with no obvious cure or vaccine as yet available to protect sufferers, a revolting foreign invader threatened last summer to slip across America’s borders and cause mayhem. Fear mongering thus became ubiquitous; rumors, print media, and an ever-expanding Internet succeeded in transforming the presence of a few cases of an exotic killer in the homeland not only into an urgent national problem but also an international security crisis.

With barely disguised racial, class, and ethnic overtones, the “pandemic fear” reached nearly apocalyptic levels in mid-October 2014 when two hospital nurses caring for the dying Liberian patient also contracted the highly lethal scourge. The reaction was immediate: “letting the unknown into the country” was totally unacceptable. Perhaps all Ebola suspects should be placed on offshore hospital ships. Moreover, a relentless media blitz went viral, distorting the scientific information concerning the true risks posed by the disease. In word and image, America’s journalists magnified the danger, creating a veritable panic widely characterized as “hysteria.” With its proximity to Halloween, the drama seamlessly melded Ebola fright with entertainment. Trick-or-treat revelers featured beaked medieval plague doctors and space-suited (“hazmat”) health workers. Freaked out by the threats, “who could blame you for deciding to remain indoors, alone in bed, indefinitely,” caustically observed one New York Times columnist.

Humans have always framed their reaction to the presence of disease in military metaphors. Because mass disease posed an existential risk, coercive responses powered by aversive emotions have relied on police or military force. According to the Centers for Disease Control and Prevention, quarantine and isolation are still considered effective “police power” functions designed “to protect the public by preventing exposure to infected persons or to persons who may be infected.” In emergencies, the Department of Defense plays an important role in mobilizing troops, while state, local, and tribal law continues to guide the implementation of similar protective measures to control the spread of infectious disease within their borders. After the Ebola outbreak, a presidential executive order urged the Defense Department to prepare for a call-up of reservists from the National Guard and set up a rapid reaction squad. The so-called “Ebola SWAT team” was envisioned as a specialized group of experts in logistics, epidemiology, medicine, and specialized caregiving, assembled by the CDC and ready for deployment anywhere in the US to assist local authorities and healthcare systems in safety and infection control.

Nearing panic, a majority of the public demanded a more muscular response. A national poll suggested that over 80% of the population favored stringent measures to deal with the Ebola outbreak. Several governors (some engulfed in reelection campaigns) obliged “out of an abundance of caution,” ordering strict and mandatory 21-day home quarantines. The seclusion targeted all health care workers returning from West Africa after an emotionally draining tour of duty attending Ebola patients. Tracked by police, the potentially infected were to monitor their temperature but were not allowed to contact family members or receive visitors. “You can’t take chances on this stuff,” commented New Jersey’s governor, in reference to an arriving nurse who, despite negative tests and a lack of symptoms, was summarily confined.

Yet suspicion and apprehension about militarized federal interventions linger, notably in an American culture proud of its organizational prowess and “can do” resolve. Detention is often counterproductive; its humiliating and degrading violations of human dignity frequently encourage resistance and evasion. Like their predecessors centuries ago, contemporary public health authorities dealing with Ebola admitted that aggressive monitoring and watching can create “perverse incentives” to evade quarantine. Historically, suspects or sufferers of particularly “loathsome” diseases were expected to cope and adjust to their stigmatized status as pariahs. Medieval castaways were forced to dispose of their properties and leave homes and communities. Shamefully concealing their disgusting appearances, suspects and sufferers were often forced to abandon occupations, break up relationships, seek admission to special institutions, or simply go into hiding. Coercion always compromises human dignity, a complex religious and secular concept closely linked to identity and social status. For many in America, the crass inhumanity of such an exile was obvious.

With the advent of civil rights, so-called “social distancing” in private homes or hospital isolation is now seldom involuntary. Yet the psychological effects of strict social isolation can be serious. Anxiety and fear of contagion, even of the possibility of death, can leave many suspects traumatized, lonely, and depressed. During the SARS outbreak, some observers claimed that, following their ordeal, quarantined persons suffered from post-traumatic stress disorder. In the Ebola crisis, Kaci Hickox, the nurse in New Jersey, found her experience “painful and emotionally draining.” After a tough tour of duty in Africa attending the sick and dying, she was isolated across from a Newark hospital in a drafty tent equipped with a portable toilet. Deprived of her clothing and outfitted with skimpy paper scrubs, the involuntary patient rightly questioned the inhumanity of her virtual imprisonment. Appalled, she was quoted as saying “I don’t plan on sticking to the guidelines” before seeking legal recourse, a rebellion that galvanized the country and led to her eventual release, illustrating the deeply emotional plight of those subjected to quarantine. Every unwarranted abuse of power can heighten and spread the very fear it was meant to suppress. While emotions will always accompany the advent of mass sickness, we must be aware of their influence in shaping our behavior and seek to moderate all responses so that they fairly protect not only the public health but also the suspected or sick victims of disease.

SOURCES

Guenter B. Risse, Plague, Fear and Politics in San Francisco’s Chinatown, Baltimore: Johns Hopkins University Press, 2012.

Rick Gladstone, “Liberian Leader Concedes Errors in Response to Ebola,” NY Times, Mar 12, 2015.

Peter N. Stearns, American Fear: The Causes and Consequences of High Anxiety, New York: Routledge, 2006.

Paul Rozin, Jonathan Haidt and Clark R. McCauley, “Disgust,” in Handbook of Emotions, ed. M. Lewis, J. M. Haviland-Jones, and L. Feldman Barrett, New York: Guilford Press, 2008, pp. 757-76.

William I. Miller, The Anatomy of Disgust, Cambridge, Mass.: Harvard University Press, 1997.

David Gentilcore, “The Fear of Disease and the Disease of Fear,” in Fear in Early Modern Society, ed. William G. Naphy and Penny Roberts, Manchester: Manchester University Press, 1997, pp. 184-208.

Guenter B. Risse, “Epidemics and History: Ecological Perspectives and Social Responses,” in AIDS: The Burdens of History, eds. E. Fee and D. Fox, Berkeley: University of California Press, 1988, pp. 33-66.

Guenter B. Risse, Driven by Fear: Epidemics and Isolation in San Francisco’s ‘House of Pestilence,’ Champaign, Illinois: University of Illinois Press (in press)

Cassandra White, “Leprosy and Stigma in the Context of International Migration,” Leprosy Review 82 (2011): 147-54.

Judith W. Leavitt, “Public Resistance or Cooperation: A Tale of Smallpox in Two Cities,” Biosecurity and Bioterrorism: Biodefense Strategy, Practice and Science 1(2003): 185-92.

George J. Annas, “Pandemic Fear,” in Worst Case Bioethics: Death, Disaster, and Public Health, New York: Oxford University Press, 2010.

Lawrence Downes, “When Demagogues Play the Leprosy Card, Watch Out,” NY Times, Jun 17, 2007.

Centers for Disease Control and Prevention, Legal Authorities for Isolation and Quarantine, http://www.cdc.gov/quarantine

Jeffrey M. Drazen et al., “Ebola and Quarantine,” New England Journal of Medicine (online), Oct 27, 2014.

Noam N. Levey and Kathleen Hennessey, “Obama Tells CDC He Wants Ebola SWAT Team Ready to Go,” NY Times, Oct 15, 2014.

Interview with Kaci Hickox, Dallas Morning News, Oct 25, 2014.

Karin Huster, “Don’t Let Fear Drive Response,” The Seattle Times, Nov 2, 2014.








The Ebola Outbreak: History Repeats Itself – Another Failed Quarantine

As predicted earlier, another fiasco! After lasting less than ten days, the quarantine of Monrovia’s West Point neighborhood was suddenly lifted on August 29, 2014, by order of Liberia’s President Ellen Johnson Sirleaf. She had imposed this draconian measure on the advice of military security officials in spite of warnings from international public health experts who stressed its futility. As noted in my previous blog, the isolation came in retaliation for an attack by panicked residents on a new makeshift center for the treatment of suspected Ebola sufferers collected from other districts of the capital. The violent assault on the temporary facility occurred as tempers flared in the township. The measure, imposed without local consultation, was considered highly discriminatory, reinforcing notions that the slum was a favored dumping ground for misfits and undesirables.

Already infamous for living in an overcrowded, unsanitary dump, West Pointers did not wish to be further stigmatized by hosting potentially infected outsiders who could spread contagion and death. The assault on the center forced the dispersal of its seventeen detained patients and their belongings, all possibly incubating the dreaded disease. When the quarantine took hold on August 20, heavily armed security forces managed to block all entries into the township, ordering its residents to return to and remain in their hovels. Resisters were beaten with the isolating ropes; others were sprayed with tear gas and finally greeted with live bullets, resulting in the death of a child. “We were treated like prisoners,” commented one denizen. Officials announced that the isolation would last 21 days, the average incubation period determined for Ebola fever.

From a historical point of view, the West Point quarantine eerily followed in the footsteps of San Francisco’s second shutdown of Chinatown in early June 1900. Unlike the seclusion of entire villages with their intact infrastructures, mass urban quarantines create an immediate problem: how will the isolated dwellers be supplied with vital necessities such as food and water without violating the isolation rules imposed to prevent contagion? High-density areas preclude the creation of backyard plots for cultivating crops and raising chickens. Residents depend almost totally on products imported from surrounding suburban farms. In San Francisco, the new city charter made no provision for the Health Board to supply food. Fortunately, as demonstrations and riots threatened, legal rulings days later lifted the embargo on shipments of rice and milk.

Another problem was unemployment: many Chinatown residents held menial jobs in the rest of the city, and their inability to show up for work led employers to dismiss them promptly. Loss of commerce forced shops and restaurants to close. Desperate Chinese openly expressed their willingness to bribe officials, eventually agreeing to pay $10,000 for the lifting of the quarantine, but the offer was somehow botched and never accepted. By the time it was over, more than 1,000 Chinese had managed to leave San Francisco and California.

With existing provisions rapidly dwindling and often looted, ostracized West Point residents turned to outside family members and friends to smuggle supplies across the lines encircling the segregated township. Except for some food donated by the United States, Monrovia’s leadership failed to organize any assistance. Neighbors scrambled for the remaining scraps in garbage bins. Some business owners living elsewhere tried to check on and resupply their starving employees. Local clinics ran out of medicines. Increasingly hungry, frustrated, and unemployed, the trapped inhabitants, estimated at more than 10,000 people, reacted in desperation, throwing stones at the soldiers guarding the exits. Confrontations between residents and security forces turned into running battles. Others simply bribed the soldiers and police officers or, most often, “sneaked out,” escaping from the sealed-up “Ebola Jail” and rendering the purpose of the quarantine meaningless.

Officials invariably defend their actions. In San Francisco, the coerced segregation was justified by the unclean habitations of the Chinese. They deserved to be locked up and to “suffer from the consequences of their own acts after constant notice.” The quarantine was not only a form of punishment but also a reasonable way of dealing with an incipient epidemic of plague. It was designed to prevent “promiscuous communications” between persons living in the district (all previous plague victims were Chinatown residents) and the people living outside. Moreover, officials claimed that the rules were being enforced without distinction of race, age, sex, or nationality, a blatant and obvious lie since they exempted white businesses at the edge of Chinatown. Therefore, it was time for the Chinese leadership to take responsibility, cease denying the presence of plague, and cooperate with the authorities. For this purpose, the Health Board passed a resolution promising to raise the quarantine twenty days after the last confirmed case of plague.


In Monrovia, the government similarly explained its intent: the quarantine would give West Point residents an opportunity to accept the presence of Ebola fever in their district, then mobilize their own local leadership, and finally devise a local surveillance plan that would contribute to the national effort toward Ebola’s containment. A businessman explained that the quarantine was not a punishment, just an effort to save the isolated population from Ebola; the government was protecting the township from the ravages of a deadly fever. It was unfortunate that such noble intentions were misinterpreted by the local and international media, which seemed only to stress the hardships inflicted by mass segregation. Instead of protesting, citizens should consider it a privilege and be appreciative; it was time to cooperate with the authorities. However, West Point residents, already deeply distrustful of and even hostile toward the government (many were refugees from Liberia’s previous civil wars), viewed the quarantine as an attempt to deflect Ebola fever toward the township and allow it to burn itself out there as an expedient measure for grappling with the epidemic crisis.

The same was true of San Francisco’s Chinatown during the plague outbreak. Everybody seemed aware that the quarantine actually exposed the isolated population to the disease. Medical experts testifying in court spoke on behalf of the beleaguered inhabitants, warning that the artificially created confinement of about 20,000 people had created a potential “plague center” ready to spread disease throughout the rest of the city. Instead of such a drastic and dangerous mass quarantine, traditional public health experience suggested the “strict quarantining of particular rooms or houses supposed to be infected.” The failure to individually isolate plague-infested houses enhanced the danger, especially if the inhabitants were not allowed to leave the district. No new cases of plague were discovered during the quarantine period.

As in Chinatown, the West Point quarantine was lifted well before its original deadline. Amid traffic jams, shops opened, and long queues of hungry residents formed at distribution centers issuing rations of rice from the United Nations food program. With the military finally out of sight, people were free to move around. A community leader and student was quoted as saying: “The majority of people in West Point are happy for the fact that they just got out of a dungeon of hardship and the dungeon of being dehumanized.” Indeed, “most of our people went to bed on wrinkled bellies.” As in San Francisco’s Chinatown, the mood turned festive; celebrations were in order after more than a week of deprivation and suffering. Ironically, there were apparently no cases of suspected Ebola fever among them, although, given the pervading climate of suspicion, dead bodies could have been hidden anywhere or spirited away. Nevertheless, stigmatization and scapegoating persisted. “We have been called all sorts of names. People point fingers at us labeling us the Ebola patients. We want the government to tell the public that the people from West Point are indeed Ebola free.” Whatever remnants of confidence residents retained toward their government, the quarantine had delivered the final, crushing blow.

Given such circumstances, fearful and leery Monrovians were surprised when a local resident, a Harvard-educated epidemiologist, sought to personally reverse this trend. Acting alone, Mosoka Fallah began to crisscross the capital, seeking to encourage cooperation with the authorities by acting as an intermediary between anxious, often desperate residents and local officials. “Ebola will keep spreading,” he said, unless trust was restored; time was of the essence. As an unpaid advisor to the Health Ministry, Fallah performed heroic acts of compassion and empathy, facilitating the transport of abandoned sick residents lying in the streets to treatment centers. Moving from one neighborhood to another, the former member of Doctors Without Borders systematically visited every block and hut. In West Point, where he had previously resided, Fallah went back to a pre-civil-war arrangement that had divided the township into separate zones, prodding community leaders to establish food supply lines and voluntary surveillance teams that could identify potential Ebola victims and remove them. Such grass-roots community efforts remain essential in public health work, but they can only be achieved through mutual respect and collaboration. As previously noted, the Liberian government had similarly sought to stimulate such local efforts but decided to achieve its goal through coercion, imposing a heavy-handed, top-down, military-assisted quarantine.

Nevertheless, Lewis Brown, the Minister of Information, reiterated the earlier claim: “this (quarantine) was a tool intended to help the community to help themselves.” President Sirleaf was more cautious. Although she insisted that the measure had been necessary to control the spread of the epidemic, Sirleaf ordered Brownie Samukai, the Minister of Defense, to set up a board of inquiry to investigate the circumstances and events that took place during the West Point quarantine. This move preceded a daily brief from Human Rights Watch, an independent international organization devoted to the support of human rights around the world. The September 15, 2014, summary stressed that the protection of such rights was crucial for the control of Ebola fever in West Africa and questioned the use of quarantines. A day later, Human Rights Watch issued a lengthy statement reiterating its stance. The organization noted that quarantines had been imposed in several West African nations, including Liberia, thus “restricting peoples’ rights to liberty and freedom of movement.” Indeed, these quarantines were inadequately monitored, “making them ineffective from a public health perspective and disproportionately impacting people unable to evade the restrictions, including the elderly, the poor, and people with chronic illness or disability.” In the Ebola crisis, better results could be achieved through community engagement and cooperation combined with social support that included home-based care and food aid. Perhaps the lessons of the West Point fiasco will endure.


Sources:

New York Times, Aug 29, 2014.

James Butty, “Liberia’s West Point: Life After Ebola Quarantine,” Voice of America, Sep 01, 2014.

The New Dawn (Monrovia) Sep 04, 2014.

Guenter B. Risse, Plague, Fear and Politics in San Francisco’s Chinatown, Baltimore: Johns Hopkins University Press, 2012.

Human Rights Watch, “West Africa: Respect Rights in Ebola Response,” Sep 16, 2014.

The Ebola Outbreak: Historical Notes on Quarantine and Isolation

“Fear remains the most difficult barrier to overcome”

Margaret Chan, WHO Director-General

Recent headlines immediately drew my attention. Dispatches from Monrovia, the Liberian capital, reported that, under presidential orders, armed police and soldiers from the Ebola Task Force, in riot gear and equipped with automatic weapons, had quarantined West Point, a northern township of the city, on August 20, 2014. To physically cordon off the district, makeshift barricades were erected from red rope lines, wood scraps, and barbed wire. Ferry service to the peninsula was cancelled, and the coast guard began patrolling the surrounding waters, turning back people attempting to flee in their canoes. A subsequent image depicted a quarantine violator summarily detained under the chassis of a car.

Segregation and detention of people suspected of or suffering from lethal contagious diseases have a long past. In an era of globalization, “contagious anxieties” become ubiquitous. Responses to an ever more complex sequence of biological dangers tend to flourish especially in overcrowded urban settings like Monrovia. In despair, the lessons of history are conveniently ignored. Blaming the victim continues to be a common survival skill, employed for self-defense and preservation. Appealing to basic instincts of survival, successive Western societies developed regulations and erected institutions designed to cope with incoming diseases. As in Monrovia today, much was judged to be at stake: catastrophic mass dying, social breakdown, political chaos, economic decline, and, at times, even national survival.

Since the Renaissance, the term quarantine has designated spaces for the temporary detention of residents, travelers, and cargo suspected of carrying infection. This tactic differs somewhat from another traditional public health scheme: the sanitary cordon or blockade that closes checkpoints, roads, and borders with the similar purpose of preventing the spread of deadly infectious diseases. In both cases, groups previously marginalized on the basis of age, sex, race, religion, or class were blamed for epidemic outbreaks. The favorite scapegoats are often strangers, newcomers, and ethnic minorities. I am particularly sensitive to issues of stigma and isolation because of my recent research and completion of a book manuscript on the San Francisco Pesthouse, now under academic review. My thesis is that both fear and disgust have historically framed coping behaviors and driven actions. In my opinion, anger and revulsion explain the heartless, even brutal nature of the responses.

Plague in Rome 1665

In Liberia, Monrovia’s crowded and dilapidated slum, with its narrow alleys and rickety shanties, had long been a public health nightmare: neglected by the authorities, residents lacked sewage disposal and obtained their water in wheelbarrows. Ebola was only the latest and perhaps the most dangerous disease to visit this community of an estimated 75,000 people. With hospitals closing to prevent further institutional contamination (some health workers were already dead or dying), potential victims had no place to go. Without consulting residents, the government decided to use a former schoolhouse in West Point as a makeshift neighborhood center for concentrating, isolating, and managing all Ebola cases reported in the city. This decision only stoked the growing local panic, leading to protests and violence. Perhaps Ebola was only a “government hoax.” The building was stoned and its patients, presumed to be infectious, forced to flee. The attack led to widespread looting of furnishings and medical equipment, including blood-soaked bedding believed to be contagious.

Enforced by police and even military units, indiscriminate segregations often prove counterproductive: historically they fostered mistrust, fear and panic, resistance and aggression, while encouraging flight, concealment, and social chaos. Paradoxically, while often unsuccessful in curbing an epidemic, such containment policies add to the suffering and risk additional fatalities. Since the Black Death, families and populations trapped in their homes and neighborhoods have suffered severe hardships, ranging from a lack of water and food to a higher risk of cross-infection. Loss of employment and economic ruin followed. When an outbreak of plague occurred in Rome in 1656, the entire city was immediately closed, with temporary stockades placed in front of two gates for screening supplies and people. Police patrols enforced the sealing of homes suspected of housing victims of the disease. Individuals considered tainted through contact with the sick who failed to follow isolation procedures were arrested and often condemned to death. In fact, contemporary iconography depicts public executions by firing squad or hanging from gallows erected in various city piazzas. In Monrovia, a crowd of desperate residents sought to break out of their prison by storming the barricades, hurling stones and attacking their jailers, only to be greeted with live ammunition that killed one boy and injured others. One dweller caught in the crossfire sarcastically asked, “You fight Ebola with arms?”

Risk fluctuates; it can be subjected to reevaluation within shifting social contexts, sanctioned by experts, and then amplified by information systems for the purpose of mobilizing public opinion. This is particularly true if uncertainty, rumors, and sensationalist media stoke fears of contagion, causing societal support systems like Monrovia’s public health departments and hospitals to break down. In the United States, three such intra-city quarantines were imposed in response to a pandemic of plague: Honolulu (1899-1900) and San Francisco (March 1900 and again in May 1900). None proved beneficial.

Like Monrovia’s township, the targets were overcrowded and impoverished enclaves, unsanitary urban slums largely populated by Chinese migrants. In Hawaii, faced with a confirmed case of plague, the Honolulu Health Board issued an order on December 12, 1899, to quarantine the Chinatown district, roping it off with the help of rifle-toting Hawaiian national guardsmen. To avoid losses for white businesses, the perimeter was gerrymandered to exclude them. The afflicted neighborhoods, home to about 5,000 people, half of them Chinese, featured crumbling shanties with cesspools planted on stagnant, marshy land strewn with refuse. In such circumstances, potentially plague-carrying rats posed an unacceptable risk.

As my colleague and friend James C. Mohr has shown, frequent medical inspections and a search for new cases ensued. Disinfection, garbage removal, and the burning of “infected” lodgings that had sheltered the sick began. The rationale for burning houses to kill plague germs stemmed from recommendations adopted by Hong Kong’s Sanitary Board during an 1894 epidemic involving the walled-in, overcrowded, and dilapidated district of Taipingshan. While the official quarantine boundaries in Honolulu were constantly patrolled, security remained lax, and direct contact with residents through the influx of food supplies and services continued. Still, angry denizens protested the seclusion with threats, concealment, and flight. Businesses suffered, exports dwindled, and a paucity of new cases created pressure for the Health Board to terminate the quarantine six days after its initial decree. Reinstated on December 28 after the appearance of new cases, Honolulu’s quarantine was responsible for reducing Chinatown to ashes on January 20, 1900. With orders to burn down additional rat-infested housing, the local fire department misjudged the weather conditions: strong winds spread the flames to nearby buildings and a church, forcing residents to flee the district. The fire eventually scorched thirty-three acres of the city and left thousands of residents homeless. Rejected by dwellers of the outlying neighborhoods, the Chinese were forced into makeshift detention camps, initially monitored by national troops who were soon replaced by Hawaiian Republic guardsmen. While the accidental fire and destruction of Honolulu’s Chinatown contributed to the eradication of plague, this episode remains a stark reminder of the adverse consequences of coerced mass segregation.

Honolulu Chinatown 1900 Quarantine

Given plague’s relentless march across the Pacific and the close commercial ties with Honolulu, particularly the extensive sugar trade, San Franciscans braced for its arrival. Indeed, the events in Honolulu had already prompted an inspection of the city’s Chinatown by Health Board officials in early January 1900. The subsequent story is told in greater detail in my recent book Plague, Fear, and Politics in San Francisco’s Chinatown (2012). Not surprisingly, the death of a Chinese laborer on March 6 with a presumptive diagnosis of bubonic plague prompted health officials to immediately impose a quarantine on the entire district. The enforcement was quite similar. Early Chinatown risers, including cooks, waiters, servants, and porters heading for their jobs outside the district, discovered that ropes encircled the space between Broadway and California, Kearny and Stockton Streets. Two policemen on every corner demanded that everybody turn around and return home. Traffic was blocked, and streetcars crossing the area were not allowed to stop. Massive confusion ensued as frantic Chinese escaped across rooftops or sneaked through the lines, fearful that outside employers would dismiss them if they remained absent. Stores closed, although some provisions were passed across the lines. The quarantine hampered the movement of Western physicians living outside the district, preventing them from attending the sick at the Oriental Dispensary. Chinese passengers could not board river and coastal steamers to leave the city. Many managed to cross the Bay in small boats and find shelter in the vegetable gardens and laundries of suburban friends. Others hid with Chinese cooks in private residences. Crowds quickly assembled in the streets, stunned by the encirclement. Rather than respond forcefully, the Chinese at first aimed for a restoration of harmony. For them, the quarantine was an operation designed to impress upon the district’s ethnic population that the local health department and police were, after repeatedly faltering, up to their jobs.

San Francisco Chinatown 1900

Bacteriological confirmation of plague’s presence, however, remained elusive. Facing growing skepticism, the Health Board, as in Honolulu, unanimously voted to lift the “preventive” quarantine a few days later. No further cases of plague were found. The temporary encirclement of Chinatown, widely characterized by the press as a “bubonic bluff,” turned into a setback for sanitarians: with regular scavenger service suspended, mountains of rotten food littered the streets, creating an unbearable stench while providing further food sources for the district’s hungry rats. With the Honolulu experience vivid in their minds, Chinese residents expected another possible razing or burning of their homes. In a pattern replicated in the months ahead, laboratory findings and quarantine threats would be announced, manipulated, and denied in a climate of profound political and economic divisions. In the meantime, the reported mortality rate in Chinatown had dropped dramatically.

When cleaning and disinfection operations failed and an anti-plague vaccination program fizzled, San Francisco’s authorities decided to rent a number of grain warehouses at Mission Rock, a small island near the city’s port, converting them into a temporary detention facility for Chinese suspected of suffering from plague. However, an empowered Chinese leadership hired prominent local lawyers and challenged the plan in federal court. Soon thereafter, in a landmark decision, the presiding judge ruled that public health measures, while lawful, were not totally immune to judicial scrutiny. Inasmuch as they impaired personal liberties, totally arbitrary measures could not be permitted to stand. In this particular case, the quarantine was also racially discriminatory and harmful, since confinement actually increased the risk of infection in the district.

Within hours, however, business leaders and health officials from San Francisco and California met to deal with a growing trade embargo against the state because of the presence of plague. The only proper response was to reassure the trading world by sending a clear signal of California’s determination to control the presumed foe. Thus the Health Board was authorized to proceed immediately and once more close down Chinatown, again as a “merely precautionary” move. Instead of ropes, a large police force and workmen descended on Chinatown, erecting a veritable fence around the district with wooden planks, posts, cement blocks, and barbed wire, creating an enclosure that sealed off the entire area. The issue of detention centers for suspected plague victims raised its ugly head once more. To force the issue, San Francisco’s mayor refused to take responsibility for feeding the blockaded population.

By June 5, the Chinese leadership had filed another request in US District Court to stop the planned deportation. The federal judge immediately lifted the embargo on food supplies to Chinatown. He also delayed the removal of Chinese to the Mission Rock detention camp but granted a continuance, further infuriating residents of the district. Many had already lost their jobs and were without food and other supplies. Because of growing shortages, prices for the remaining merchandise were skyrocketing. Thousands would soon be destitute, and a serious disturbance was expected. While the case continued to be debated in court, protests and demonstrations escalated. Attorneys for the Chinese argued that by confining thousands of residents, the quarantine had, in effect, created a potential “plague quarter” capable of spreading disease to the rest of the city.

Ten days after the filing, the judge handed down his decision in favor of the Chinese plaintiffs. Like the previous ruling, the court employed the legal standards of due process and equal protection to grant the injunction against the quarantine. Judge William Morrow ruled that the San Francisco Board of Health had acted “with an evil eye and unequal hand,” in an arbitrary and racially discriminatory manner, actually increasing rather than lowering the risks of infection in the isolated population. The siege of Chinatown was over. A defiant ethnic community, with the help of the federal judicial system, had managed to thwart two quarantines in the course of a few months. Municipal workers began taking down the makeshift fence that had encircled the district for sixteen days, allowing more than one thousand trapped Chinese residents to pour across the breached barriers. Wagon after wagon of supplies arrived. After two weeks of fright, deprivation, and protest, euphoria descended on Chinatown. Numerous celebrations marked the passing of the “fake” quarantine. During these judicial proceedings, only three additional deaths had been bacteriologically confirmed as caused by plague.

The above historical examples illustrate the challenges, hazards, and futility of mass quarantines, as well as their adverse economic impact and human toll. Public health is ill served when brutal coercion exacerbates mistrust and triggers hostility. Moreover, as in the Chinatown examples, forceful segregation was totally out of sync with the contemporary epidemiological understanding of disease transmission. In plague, the vectors were mostly fleas and rats rather than humans, while Ebola fever is originally contracted from handling and cooking contaminated bush meat. Monrovia’s quarantine, imposed on an overcrowded population already burdened with poverty, is unconscionable: the slum already shelters ill and dying victims of the disease. Their contagious body fluids will only magnify the outbreak already in progress. As provisions dwindle in West Point, food prices will skyrocket, making supplies unaffordable and soon unavailable. Faced with a standoff, hunger and desperation will lead to further skirmishes with the military units guarding the township. Interviewed by a CNN reporter, one resident admitted: “The hunger, the Ebola, everything. I’m scared of everything!”


Sources:

Liberian News, August 17, 2014.

Margaret Chan, “Ebola Virus Disease in West Africa—No Early End to the Outbreak,” New England Journal of Medicine.org, August 20, 2014.

Guenter B. Risse, “Pesthouses and Lazarettos,” in Mending Bodies, Healing Souls: A History of Hospitals, New York: Oxford University Press, 1999, pp. 190-216.

Alison Bashford and Claire Hooker, eds., Contagion: Historical and Cultural Studies, London: Routledge, 2001.

New York Times, August 21, 2014.

CBC News and CNN News, August 26, 2014.

Howard Markel, “The Concept of Quarantine,” in Quarantine!: East European Jewish Immigrants and the New York City Epidemics of 1892, Baltimore: Johns Hopkins University Press, 1997, pp. 1-12.

James C. Mohr, Plague and Fire: Battling Black Death and the 1900 Burning of Honolulu’s Chinatown, New York: Oxford University Press, 2005.

Guenter B. Risse, Plague, Fear, and Politics in San Francisco’s Chinatown, Baltimore: Johns Hopkins University Press, 2012.

The Historian as Sleuth: Malaria in Scotland

In a section of his 2002 collection of essays, the distinguished, late professor Owsei Temkin recommended old age to the historian. Indeed, Dr. Temkin, the William H. Welch Professor of the History of Medicine at Johns Hopkins University, lived long enough--he became a centenarian--to function as his own critic and revisionist, allowing himself to have “second thoughts” about some of the work he had penned decades earlier. For Temkin, senior historians had clear advantages: “the freedom from the hustle and bustle of life and the reduced nightly sleep allow more time for thinking.” If scientists reached the peak of their careers early, in their youthful years, aging historians—like good wine—became better at their craft, seasoned by a lifetime of experiences and themselves becoming primary sources sought out by the next generation of journalists and scholars. After all, studying history is meant to unravel the complexity of human lives and society, a practice that favors elders enriched by ample exposure to worldly events.

Now in my early eighties, I find myself in the same boat, drifting into the sunset with a brain refusing to quit. In the middle of the night, or while swimming or pedaling on an exercise machine, random thoughts emerge, drift, and then spontaneously coalesce, leading to novel insights. Because of a deteriorating memory and vocabulary, a trail of hand-scribbled phrases seeks to capture the momentary inspiration, although many notes will fail to make sense later. Freed of the usual academic gamesmanship, administrative burdens, shrinking budgets, and good cop-bad cop deans, I empathize with Temkin’s desire to revisit certain aspects of our body of work, with the proviso that the reconstructions and supplements should always include some human-interest stories to stir the emotions of new potential readers. It was Temkin who jumpstarted my second career as a historian. I was grateful for his advice to attend the University of Chicago when, towards the end of my medical residency in Ohio, I visited Baltimore in the fall of 1962. He made it clear that physician/historians with proper academic credentials in both fields would stem the growing estrangement between medicine and history, an issue that still challenges today’s professionals.


For my part, more than just rethinking or rewriting, I hold closer to the model of the historian as perpetual investigator and synthesizer, albeit at a more leisurely pace. An early teenage fascination with detectives--notably Sherlock Holmes, Hercule Poirot, and Ellery Queen, who solved mysteries by tracking down facts, assembling evidence, and then logically sifting through clues for interpretation--proved seductive. Through the engagement of our grey brain cells and deduction, coherence and plausibility emerged, dots were connected, the gaps skillfully bridged. At this point, Ellery always issued his challenge to readers to find the answers on their own. No matter; with tensions building, the final summing up was at hand: everything was explained and the case came to a successful end. Since 2003, a popular PBS documentary television series, “History Detectives,” has followed a similar path, its stated mission “exploring the complexities of historical mysteries, searching out the facts and conundrums,” albeit mostly limited to specific artifacts related to American history. The word ‘detective,’ stressing detecting and collecting, has recently been upgraded to ‘investigator,’ one who examines and evaluates the evidence.

My two great loves, medicine and history, share similar investigative methodologies. Both patiently track down all pertinent information from multiple sources. The data are then carefully organized and given meaning. Far from being solely logical, this problem-solving process is embedded in emotional states. The investigator’s mood is key: cherishing the suspense of the chase and the excitement of finding clues, together with the growing exhilaration of solving puzzles, all ending in joy and satisfaction as the quest successfully concludes. Or does it? Can the final conclusions stand up to scrutiny, particularly if new pieces of information become available or the historian’s feelings, beliefs, and experiences shift? Indeed, ideas do not last forever and nothing is written in stone. New evidence and understanding, historical or scientific, may emerge. Historians live in a changing world with shifting priorities and perspectives. As the context changes, they must be ready to ‘recalculate’ if they intend to revisit previous topics.

A brief, concrete example of my historical sleuthing relates to the presence of malaria in Scotland. The subject is relevant given current concerns about global warming and dire predictions that warmer temperatures will trigger a nefarious resurgence of this disease in Europe. Malaria is still a common, worldwide mosquito-borne parasitic disease that can be traced back at least to the Neolithic agricultural revolution, especially in tropical and subtropical regions of the world. For Europe, there is evidence that the disease was already present during the first millennium in coastal regions surrounding the North Sea. While temperature remains an important factor in malaria’s transmission, this essay stresses the pivotal role humans play in shaping its epidemiology. Preliminary findings were presented at the John F. Fulton Lecture at Yale University in 1988, and a fuller story was published as a chapter in my book New Medical Challenges During the Scottish Enlightenment (2005). Malaria’s causal agent, a microscopic protozoan parasite called Plasmodium, initially enters the human bloodstream via bites from a variety of previously infected female Anopheles mosquitoes. Part of the parasite’s life cycle takes place inside the human liver and red blood cells. Here the parasites rapidly multiply, bursting free into the bloodstream at regular intervals and destroying their cellular hosts. The release occurs every two or three days depending on the species of Plasmodium, triggering sudden episodes of chills followed by a spike of high fever that ends with profuse sweating.

The intermittent and rhythmic character of these fevers is unique to malaria, making it easier to follow and diagnose even in historical times. In a forthcoming blog I will explain this approach in more detail for other historical places and times, emphasizing that historians must usually exert great caution when equating vague and shifting past disease constructions and nomenclatures with modern ones. In venturing into the hazardous currents of retrospective diagnosis, malaria is a rare exception because of its telltale clinical manifestations and our current scientific understanding of the environmental factors involved in its spread.

My first clues to the existence of malaria came while researching aspects of 18th-century Scottish medicine. Originally known as “ague” in northern Europe, malaria (mal’aria, or bad air)--the name derived from the somewhat sulfurous odor of bacterial decomposition in saltwater mudflats--became what contemporaries characterized as the “scourge of Scotland,” vanishing after the early 1800s never to return except as an occasional tropical import. While malaria probably existed in Scotland for a millennium, I detected its presence when I discovered numerous “intermittent fever” cases between 1770 and 1800 in the records of charitable institutions such as the Edinburgh Infirmary and the Kelso Dispensary, supplemented by accounts in medical student notebooks and writings from prominent Scottish physicians. Their presence and care--the patients were poor--confirmed the gradual medicalization of Scotland’s society during the Enlightenment.

Any Google search will uncover a steady parade of sporadic cases detected in recent decades, all imported in an era of globalization and rapid human travel from Asia and especially West Africa. However, thanks to several historians, especially Mary Dobson, we know that since at least the sixteenth century the salty marshlands of Kent and Essex in England were notorious for their lack of salubrity, depopulation, and high levels of mortality. Together with the river estuaries and flood plains of Lincolnshire, Cambridge, and Norfolk, these regions were singled out for the frequency of ‘marsh fever’ or ‘ague’ among their dwellers, imported by migrant Dutch farmhands and traders from Baltic ports where the disease was also endemic. In fact, eighteenth-century Scottish physicians frequently referred to intermittent or “autumnal” fevers as the “Kentish ague.” Malaria flourishes near stagnant pools of salted water, ideal breeding grounds for several genera of vector mosquitoes, notably Anopheles atroparvus and A. messeae. Advanced molecular methods suggest that Europe’s malaria was primarily caused by a faster-reproducing but less deadly variety of protozoan parasite, Plasmodium vivax, still present in wild apes and derived from an ancestor that escaped Africa, and perhaps Plasmodium malariae, an even less aggressive species identified with chronic cases.

My working hypothesis was that, like elsewhere in Europe and Africa, malaria was a component of a distinctive ecology of disease, a product of the unique geography, microclimates, and particular social organization prevailing in the Scottish Lowlands. My first task was to determine the geographical location of the reported cases, an undertaking made easier by the publication of Scottish parish reports in John Sinclair’s Statistical Account of Scotland in 1794. The collection contains detailed information about climate, population, agricultural development, commerce, and prevailing diseases not available for previous centuries. Two separate regions stood out: both were near important trading centers and areas of considerable agricultural activity, thus attracting seasonal Scottish farmhands, many of them previously employed in the English fens. The first comprised lowlands located in Angus County northeast of the Firth of Tay as well as a stretch of alluvial terrain between Perth and Dundee on the northern edge of the River Tay known as the Carse of Gowrie. The second region lay to the south, distributed around the Scottish Borders and involving various floodplain parishes near the market town of Kelso at the confluence of the Teviot and Tweed Rivers. It also included Roxburgh and, towards the northeast, the Tweed’s estuary in Berwickshire bordering England. As in the north, and because of their salinity, the marshes all offered ideal breeding grounds for A. atroparvus mosquitoes.

Climate seemed less of a factor. As researchers studying malaria in England have pointed out, the “ague” was already firmly established in stagnant waters near the North Sea coast during the Little Ice Age that lasted from the 1560s to the 1750s. More important were seasonal variations, with warmer and drier summers creating conditions that not only accelerated parasitic reproduction in the vectors but concentrated hungry hordes of mosquitoes, desperate for blood meals, in the remaining pools of water. Deficient clothing, common among poor workers, became another factor facilitating exposure. So far, so good: I concluded that ague could be, and indeed was, present in Scotland, an occasional and temporary import linked to poverty and to the lack of employment forcing migration. I surmised that new local cases could be prevented, since the country’s cold winters made it virtually impossible for the offending insects to survive.

Not so fast. Instead of just a trickle of indigenous cases from notorious swampy areas, Scotland in the early 1780s--particularly the Borders region--witnessed some unusual epidemic outbreaks that defied previous explanations. As noted, Kelso traditionally functioned as a regional job market for farmhands and servants. With seasonal unemployment in the Highlands, this exchange place would periodically swell with people as job seekers and their families crowded into decrepit, thatched cottages. Surrounded by dunghills, these windowless cabins served as ideal shelters and potential hibernation spaces for the malaria-infested mosquitoes. A perusal of eighteenth-century weather charts confirmed that the Scottish lowlands had been subjected to an erratic climatic pattern, especially during the decade 1780-90: excessive rainfall during winter and spring but warm and dry summers. This produced excellent conditions for the proliferation of the pesky vectors necessary for parasite transmission. Moreover, in spring, strong easterly winds allowed vast mosquito populations from coastal estuaries near the Tay and Tweed Rivers to drift inland towards urban centers in Perthshire and around Kelso. Such conditions coincided with harvest time and its usual movements of traders and itinerant migrant workers. Scottish mosquitoes were ready to pounce on the native rural population after being infected by blood meals containing plasmodia obtained from previous visitors to the English fens--many currently in remission or suffering from chronic malaria. The warmer weather ensured their year-round presence and posed the threat of larger outbreaks. Under such circumstances, climate, people, and poor housing conspired to temporarily make malaria endemic in Scotland.

Kelso Dispensary

Examples of eighteenth-century malaria in Scotland can be found among cases extracted from medical student notebooks. A typical patient was Leslie C., a 22-year-old mother of a three-year-old daughter, Margaret. Both were admitted together to the teaching ward at the Royal Infirmary of Edinburgh on February 4, 1795. People diagnosed with “intermittent” fevers represented a special population of sufferers admitted to the hospital during the winter months after extended and stressful voyages from the endemic areas of malaria in southeast England. Many had experienced the first debilitating paroxysms of ague during the previous harvest season there before going into a temporary remission. Given the severity of their new symptoms, Edinburgh professors wanted students to follow the clinical course of the most challenging cases. Exhausted from riding in open wagons, Leslie claimed that for three months she had been experiencing regular attacks of high temperature every 72 hours, precisely around 4:30 PM, a malaria variety defined as a “quartan” fever. Her sallow complexion and lack of menstruation hinted at anemia and emaciation. Each fit lasted for about an hour and left her extremely tired and weak. Margaret’s shivers, headaches, and high fever occurred every other day, known as a “tertian” fever. Weaned for eight months, the child seemed to be grinding her teeth. Her belly was quite swollen, a sign of possible starvation or of the spleen enlargement common in malaria. The mother’s symptoms had started near the marshes around Hilton, in Huntingdonshire. Her daughter’s symptoms began shortly thereafter, further north at Stamford in Lincolnshire. Both locations suggest that the pair were part of an itinerant labor force returning to Scotland, presumably from Essex or Kent, desperately attempting to escape the insalubrious region and survive the winter.
Hospital consultants such as William Cullen, the famous University of Edinburgh professor, believed that the “Kentish ague” was much easier to cure by nature than by art. Indeed, left alone, most vivax malaria sufferers eventually went into long-term remission. According to the ledgers, Leslie and Margaret remained in the hospital for about ten weeks, provided with flannel shirts to stimulate sweating, a phenomenon believed to forestall further fits. Initially, both patients received small, ineffective doses of a brew obtained from cinchona, the Peruvian tree bark containing the active ingredient: quinine. Unfortunately, the bitter-tasting powder seriously “loaded up” Leslie’s stomach, prompting the attending physician to temporarily stop the medication and administer the drug to the child in the form of enemas. Other patients were exposed to electric sparks in hopes of stimulating further sweating and thus “escaping the fits.” With rest, a full diet, wine, and other tonics, later supplemented with higher doses of cinchona, Leslie’s and Margaret’s fever attacks eventually ceased. Both patients survived their ordeal and left the Infirmary in better condition.

[Graph: ague at the Kelso Dispensary]

So, having figured out that malaria indeed existed in Scotland early on, my next question was why it declined and eventually disappeared from the country. From further research, I deduced that the key was agricultural reform and the concomitant demand for additional land for cultivation, including drainage of impermeable clay soils. Over time, in Scotland’s Lowlands, old salt marshes and other stagnant pools of water were eliminated. Indeed, they were turned into arable land for turnip husbandry and crop rotation, thereby eliminating most breeding habitats for mosquitoes and gradually diminishing malaria transmission. Encouraged by the presence of root crops, additional cattle and sheep were kept grazing in the fields year-round, taking the place of humans as providers of blood meals for the shrinking population of insects.

Most importantly, however, the new demands of mixed farming halted the seasonal exodus of farmers to and from England, traditionally the main source of malaria importation. New crops and tools reduced the farming workforce. Displaced tenants were ejected and sent towards cities like Edinburgh and Berwick, incipient centers of the Industrial Revolution, thus reducing the number of potential human hosts. Farmsteads were consolidated, separated from the traditional rows of primitive cottages hitherto crowded with laborers. Dank, dark, and buggy abodes were demolished, and front-door dunghills removed. With steady work, the remaining population sought higher-quality wooden housing with window frames for ventilation that would prove less hazardous to their health. Last but not least, infusions or tinctures of cinchona bark, a successful remedy for malaria at high doses, became a popular domestic remedy among remaining sufferers in the 1800s, further breaking the disease’s infective cycle.

Aha! Riddle solved! My historical sleuthing revealed that malaria, while relatively benign, not only influenced the course of Scottish medicine; its origins and evolution also came to be recognized as expressions of the particular geographical, climatic, biological, and, above all, social and economic forces prevailing in Scotland during the late eighteenth century. Case closed.

Sources

Owsei Temkin, “On Second Thought” and Other Essays in the History of Medicine and Science, Baltimore: Johns Hopkins University Press, 2002.

Guenter B. Risse, “Ague in Eighteenth-Century Scotland? The Shifting Ecology of a Disease,” in New Medical Challenges During the Scottish Enlightenment, Amsterdam: Rodopi, 2005, pp. 171-97.

L.J. Bruce-Chwatt, “Ague as Malaria: An Essay on the History of Two Medical Terms,” Journal of Tropical Medicine and Hygiene, 79 (1976): 168–76.

Otto S. Knottnerus, “Malaria Around the North Sea: A Survey,” in Climatic Development and History of the North Atlantic Realm, eds. Gerold Wefer et al., Berlin: Springer, 2002, pp. 339-53.

Paul Reiter, “From Shakespeare to Defoe: Malaria in England in the Little Ice Age,” Emerging Infectious Diseases (Feb 2000): 1-11.

J. Sinclair, Analysis of the Statistical Account of Scotland, II parts, Edinburgh: A. Constable, 1825, reprint, New York: Johnson Reprint Corp., 1970.

“Register of the Weather,” in Transactions of the Royal Society of Edinburgh 1 (1788): 206.

C. Wilson, “Statistical Observations on the Health of the Labouring Population of the District of Kelso in Two Decennial Periods, From 1777 to 1787 and from 1829 to 1839,” Proceedings of the Border Medical Society (1841): 47-85.

Cases of Leslie and Margaret Campbell, in Andrew Duncan, Sr., Clinical Reports and Commentaries, February–April 1795, presented by A.B. Morison (Edinburgh: 1795), MSS Collection, Royal College of Physicians, Edinburgh. For more details of Leslie's treatment see J.W. Estes, “Drug Usage at the Infirmary: the Example of Dr. Andrew Duncan, Sr.,” in Guenter B. Risse, Hospital Life in Enlightenment Scotland: Care and Teaching at the Royal Infirmary of Edinburgh, New York: Cambridge University Press, 1986, (note 65), 359–60.

F. Home, “Experiments with Regard to the Most Proper Time of Giving the Bark in Intermittents,” in Clinical Experiments, Histories, and Dissections, 3rd edn, London: J. Murray, 1783, pp. 1-13.

L.J. Bruce-Chwatt and J. de Zulueta, The Rise and Fall of Malaria in Europe: A Historico-Epidemiological Study, Oxford: Oxford University Press, 1980.

M. Dobson, “Malaria in England: A Geographical and Historical Perspective”, Parassitologia, 36 (1994): 35–60, and “Marshlands, Mosquitoes and Malaria,” in Contours of Death and Disease in Early Modern England, Cambridge: Cambridge University Press, 1997, pp. 306–27.

J.H. Brotherston, “The Decline of Malaria”, in Observations on the Early Public Health Movement in Scotland. London: H. K. Lewis & Co, 1952, 26–36.

T.M. Devine, The Transformation of Rural Scotland: Social Change and the Agrarian Economy, 1660-1815, Edinburgh: Edinburgh University Press, 1994.


Getting started

Blogging. Why not? Just because I have retired from my position as a full-time teaching professor doesn't mean I have given up researching and writing. And with today's technology, I have the opportunity to share my current projects with a much wider audience.

I am currently working on my latest masterpiece. Let me tell you all about it.