Displaying the Dead: A Day Out at the Paris City Morgue

People have always been fascinated by the bizarre and macabre. During the nineteenth century, one of the most popular tourist attractions in Paris was not a famous monument or an art gallery, but the city morgue. At its peak between 1830 and 1864, it attracted around 40,000 visitors a day as people flocked to see the victims of grisly murders and tragic suicides. Since the morgue stood on the banks of the Seine, drowning victims were particularly common, and the bodies of children often drew the largest crowds.

Practical Purposes

The original intent of the viewing galleries of the city morgue was to allow relatives of missing or deceased persons to identify unclaimed bodies. Bodies were displayed behind a plate glass window, with their clothing (and any other personal effects they may have been found with) hung on a rail above them. Decomposition was slowed by dripping cold water onto the bodies from taps running across the ceiling. As well as putting a name to unidentified corpses, the city morgue also played a role in crime solving. Often, the police would bring murder suspects to the morgue to confront them with their actions, in the hopes of spurring a confession. Indeed, the technique was thought to be so successful that electric lights were introduced to the morgue in 1888 with the sole intention of increasing the effect of the confrontation.

Sensational Media

However, as urban society grew to encompass both mass media and exhibition culture, the morgue soon developed a reputation as a public spectacle. Newspaper readership expanded enormously during the nineteenth century, so it is unsurprising that stories of industrial accidents and gruesome murders were reported both more often and in greater detail than ever before. Sensationalist headlines and dramatic speculation not only sold more papers, but also sent readers rushing to see the bodies of the victims at the city morgue. These crowds were not necessarily just morbid onlookers, however. Many of the people who gathered at the morgue on any given day may have felt a sense of sympathetic solidarity with the unfortunate corpses laid out on the slab, and in many ways they helped draw attention to the shocking and often avoidable fates the victims had suffered.

The Paris City Morgue became the place to see…and be seen

Display by Design

When the city morgue was relocated in 1864, it was designed with its huge visitor numbers and international repute in mind. Its new location behind the famous Notre Dame cathedral made it easy for even the most directionally challenged tourist to find, and its opening hours of dawn till dusk, seven days a week, made it vastly more accessible than any comparable morgue in Europe. Though death was by no means a stranger to either country, both France and Britain saw attitudes to death shift during the nineteenth century, as death itself and mourning became far more public and performative. The viewing window of the morgue was often compared to that of a department store, and the grim display represented an accessible form of entertainment for rich and poor alike. The Paris City Morgue closed its doors to the public in 1907, and today most countries have strict laws dictating if and how human remains can be displayed.

Sources and Further Reading

https://www.atlasobscura.com/articles/paris-morgue-public-viewing

https://wellcomecollection.org/articles/W-RTBBEAAO5mfQ3M

Tredennick, B. “Some Collections of Mortality: Dickens, the Paris Morgue, and the Material Corpse.” Victorian Review, vol. 36, no. 1, 2010, pp. 72–88

Shaya, G. “The Flaneur, the Badaud, and the Making of a Mass Public in France, circa 1860-1910.” The American Historical Review, vol. 109, no. 1, 2004, pp. 41–77

The Most Curious Cravings: Pica

Pica is a psychological disorder that causes a strong desire to consume non-food items, such as dirt, glass, or hair. The phenomenon was first described by Hippocrates, and it is found most often in children, who usually grow out of it. Occasionally, adults are also diagnosed with the disorder, though genuine cases can be difficult to identify. Pica can have a range of serious consequences, including stomach ulcers, intestinal blockages, kidney failure, and internal bleeding.

Rapunzel Syndrome

Pica has many different sub-categories, depending on the substance being consumed. Rapunzel Syndrome refers to people who are compelled to eat hair, a behaviour known medically as trichophagia. The swallowed hair compacts in the stomach into a mass called a bezoar, which can normally only be removed by surgery. Many medical museums have such curiosities in their collections, as the advent of modern surgical techniques meant that doctors became better able to identify and remove these foreign objects.

A hairball removed from the stomach of a young woman, from around 1890

Though rare, modern cases do still occur. In 2015, a 23-year-old woman had a hairball weighing a massive 14 lbs removed from her stomach after she had been plucking and eating her own hair for around seven years.

The Human Pincushion

In 1825, a doctor from Copenhagen described a highly unusual case of a patient suffering from some form of delirium. After a few days of feverish gibbering, he reported, small lumps began to appear all over the patient’s body. The doctor incised one, expecting to draw out blood and pus, but was instead confronted by a handful of sewing needles! In total, he removed over 400 needles from numerous lumps, after which the patient apparently recovered. The most likely explanation is that the patient had been eating the needles, as small sharp objects like these tend to perforate the stomach and migrate around the body. The body’s immune response would have rejected the needles, forcing them out wherever it could and producing the lumps the doctor found.

Aiding Digestion

Another historical occurrence of apparent pica was observed among the enslaved population of Georgia during the 1800s, where it was noted that people would often eat a white clay known as kaolin. Though at first written off as inexplicable behaviour, later studies showed that kaolin can actually treat diarrhoea and other abdominal ailments, and it went on to become the active ingredient in many commercial medicines. What appeared to be an irrational act therefore turned out to be a perfectly sensible course of action for people with little or no access to proper medical care.

Consumption and Consumerism

Genuine cases of pica in adults can be difficult to identify, as the desire to eat inedible objects is often the result of other psychological disorders. Additionally, TV shows such as ‘My Strange Addiction’ and internet phenomena like the ‘Tide Pod Challenge’ have increased the temptation to eat strange things for ‘five minutes of fame’. People genuinely suffering from pica can face ridicule and misdiagnosis, and the prevalence of real cases may therefore be higher than realised, as sufferers may be unwilling to seek medical advice.

Sources and Further Reading

http://psychiatriapolska.pl/uploads/images/PP_3_2016/ENGver497Michalska_PsychiatrPol2016v50i3.pdf

Morris, T., The Mystery of the Exploding Teeth and Other Curiosities from the History of Medicine, (2018)

https://en.wikipedia.org/wiki/Pica_(disorder)

The Tonsil Problem: Surgical Bravado and Childhood Trauma

Following both the First and Second World Wars, children came to be viewed as ‘citizens of the future’, and there was increasing concern for their health, welfare and education. After the formation of the NHS in 1948, many children were seeing doctors regularly for the first time, which increased the rate at which minor health conditions like recurrent sore throats were picked up during routine medical examinations at schools. Tonsillectomy was the standard treatment for recurrent throat infections, and the procedure remained extremely common for most of the rest of the twentieth century.

Why was the Procedure so Popular?

Tonsillectomies became popular for a number of reasons. Firstly, the operation was considered quite safe and could be performed extremely quickly. Many surgeons boasted of their speed, leading to a flurry of professional bravado that helped to cement the procedure as a benchmark of surgical competence. Additionally, doctors at the time were often paid per procedure, so there was also a financial incentive for surgeons to perform quick and easy operations such as tonsillectomies.

A tonsillectomy being performed using a tonsil guillotine

Secondly, the theory that diseases could be caused by localised infections elsewhere in the body, known as ‘focal infection theory’, had become popular around 1900 and was quite widely accepted by the 1920s. Maladies were often attributed to oral infections, either in the teeth or the tonsils. It became common practice to remove teeth in order to prevent certain health conditions, even if the tooth itself was healthy. Diseases such as tuberculosis, the biggest cause of adolescent death at the turn of the century, were believed to be caused by infection in the tonsils. Many doctors were therefore of the opinion that all tonsils should be removed as a matter of course in order to protect children from future disease.

A Simple Surgery?

By the 1940s, thousands of children every year were undergoing routine tonsillectomies, often at the request of their parents. In 1948, the Ministry of Education reported that more than 98,000 procedures had been performed that year alone. Having one’s tonsils removed was considered a normal part of childhood, much like learning to ride a bike or going to school. However, there was some debate between paediatricians and throat surgeons as to how effective or necessary tonsillectomies really were. The surgery, though simple, was not without risk, and complications and infections following the procedure were often dismissed. It was not unheard of for children to die as a result of the surgery, but such deaths were put down to ‘accidents’, and the question of whether the operation should have taken place at all was never addressed. Paediatricians also argued that the anxiety, pain, and discomfort that children experienced both before and after the procedure far outweighed whatever minor benefits the operation might have.

The Polio Epidemic

During the 1940s and 50s, doctors began to notice a link between tonsillectomy and polio. Children who had had their tonsils removed were increasingly found to be more susceptible to the disease than those who had not. Though many surgeons argued against the claim, empirical evidence from the United States, which had experienced a very similar epidemic, was undeniable, and in 1955 the Ministry of Health officially reported that tonsillectomy carried an increased risk of polio infection. The report prompted parents to rethink the routine nature of the operation, and by the late 1960s the procedure was drawing closer scrutiny and heavier criticism.

Archaic and Unnecessary

By the 1970s, antibiotics had rendered tonsillitis a minor ailment, akin to a cough or a heavy cold, and the need for patients with recurrent sore throats to have their tonsils removed was drastically reduced. Evidence for the procedure’s benefits was also increasingly thin, as study after study showed that tonsillectomies were not as effective as had once been thought. Further studies showed that many children suffered long-term anxiety and mental trauma from the procedure and the subsequent hospitalisation. Through various parliamentary enquiries, it also became apparent that elitist surgeons had been using the procedure to elevate their own standing and to push for further funding for their own departments. As a result of these revelations, tonsillectomy became synonymous with medical overreach, and the procedure quickly fell out of fashion.

Sources and Further Reading

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5883156/

https://en.wikipedia.org/wiki/Tonsillectomy#History

Freaks of Nature: Animal Abnormalities

During the early nineteenth century there was huge interest in physical anomalies in both humans and animals, from a scientific point of view as well as out of a more voyeuristic fascination on the part of the general public. There was great demand for taxidermied and preserved specimens of deformed animals, and many such examples were showcased in sideshows, museums and private collections throughout the United States, Britain, and continental Europe.

Macabre Fascination

Up until the early twentieth century, taxidermied animals displaying deformities were not an uncommon sight in farmhouses. Farmers often kept newborn animals born with extra limbs or other such abnormalities, taking them to a local taxidermist to have them stuffed and mounted. The specimens undoubtedly made an interesting conversation piece, and it wasn’t long before owners started charging visitors to see them. Soon, anatomists began acquiring these ‘monstrosities’ for their own studies. The Scottish anatomist and physician William Hunter had an enormous collection, much of which is still on public display today.

An example of conjoined piglets

Faking It

As demand from private collectors grew, it wasn’t long before taxidermists began to fabricate their own creations. Conjoined (‘Siamese’) animals were often made by stitching two separate specimens together, with two-headed animals being the most popular. The stitching was usually well concealed, but the unnatural symmetry of a piece is often a good indication that it is not authentic. It also became popular to sew two different species together to create hybrid or mythical animals: small monkeys were commonly sewn onto fish to create ‘mermaids’, and pigs and goats were sometimes altered to give them human-like forms.

Another popular creation was the miniature dog, usually made either by mounting stillborn puppies in a more adult posture or by covering a carefully crafted model in fur to give it a realistic appearance. These specimens were purported to be tiny adult dogs, and often sold for large sums of money to unsuspecting collectors.

Scepticism 

It was partly due to the prevalence of these fakes that genuine scientific discoveries were often dismissed as hoaxes by the scientific community. Famously, when the duck-billed platypus was discovered in Australia, British scientists insisted that it had to be a fake, and that such a ludicrous-looking creature could not exist. Robert Knox, the Scottish anatomist, even took a pair of scissors to the pelt that had been sent from New South Wales to check for stitches that would confirm the specimen’s inauthenticity.

Similarly, the British ornithologist Richard Bowdler Sharpe was convinced that the newly discovered ‘King of Saxony’ Bird of Paradise was the work of a taxidermist when the species was described to him in the late nineteenth century. The bird had extraordinary plumage that Sharpe was sure could not exist in nature. It was only when he saw specimens with his own eyes that he admitted that the creature was indeed real.

Scientific Study

Of course, the study of these physical abnormalities (known as teratology) is invaluable for understanding the causes of birth defects in humans and preventing them. By studying cases of abnormal development in animals, scientists have been able to identify numerous environmental and genetic factors that can cause birth defects and developmental problems, including lead exposure, alcohol (the cause of fetal alcohol syndrome), and now-discontinued medications such as thalidomide.

Sources and Further Reading

https://en.wikipedia.org/wiki/Teratology

Turner, A., Taxidermy, (London, 2013) 

Remember You Must Die: Memento Mori and the Art of Victorian Mourning

In times before modern medicine and the introduction of health and safety legislation, it was common for people to die young, and to die suddenly and unexpectedly. Since the beginning of human history, people have followed a huge variety of grieving rituals in order to mourn the dead. However, the Victorians famously made something of an art of mourning, turning it into a conspicuous public display.

Love Never Dies

Queen Victoria started the trend when Prince Albert died in 1861; she famously spent the remaining 40 years of her life in mourning, withdrawing from public life, wearing only black, and continuing to have her husband’s linens and towels changed daily. Following her example, it became commonplace for families to hold elaborate rituals to mark the passing of their loved ones. There were strict etiquette rules to follow, with specific dress codes and different periods of mourning prescribed according to one’s relation to the deceased. Widows were expected to remain in mourning for two years, while the mourning period for the death of a child was six months. These periods were supposed to reflect the accepted natural course of grief, and to break from the customs was viewed as a desecration of the memory of the dead.

Memento Mori

Another custom attributed to Queen Victoria was the practice of taking hair from the deceased and having it fashioned into jewellery; after Prince Albert’s death, Victoria carried a lock of his hair with her at all times. Rings, brooches and necklaces were all common memento items, often inscribed with the person’s name and their date and place of death. Sentimental pieces like this were passed down through families, or else buried with the mourner when they themselves died. Sometimes more elaborate pieces were commissioned, and entire shrines were created dedicated to the person’s memory. More unusually, the body itself might be preserved and kept on display, though this was very rare and only an option for very wealthy families.

Mourning brooches containing hair from the deceased.

Post-Mortem Photography

As photography became cheaper and more accessible, photographic portraits became very popular, and it became common practice to have a photograph taken of a person after they had died as a memento, especially in the case of babies and young children. Sometimes the eyes and cheeks were coloured to give a more lifelike appearance, or the body was posed as if sleeping. However, the practice may not have been as widespread as commonly thought, as many photographs purporting to show a deceased person actually show living subjects who are using a posing frame, or who simply have a slightly odd facial expression!

Mass Mourning

Such elaborate public mourning rituals fell out of fashion after the First World War. With loss on such an unprecedented scale, the desire during the 1920s to move on from the horror of the war meant that the long-term grieving the Victorians had performed seemed both outdated and excessive. Today, mourning is seen as a much more private and personal affair, though many religions and cultures continue to follow specific practices to honour and remember the dead.

Sources and Further Reading

https://www.bbc.co.uk/news/uk-england-36389581

The Man with the Silver Jaw

The case of Alphonse Louis is one of the earliest and most famous examples of the use of a prosthetic jaw. Not only was the prosthesis a masterful piece of craftsmanship, it also returned its wearer to a relatively normal life after a horrific disfigurement.

War Injuries

Today, most facial prosthetics are used by people who have undergone surgery for tumours. Until the end of the Second World War, however, the most common cause of disfigurement was war injury. As warfare evolved, more powerful weapons meant more horrific wounds, with heavy artillery and the resulting shrapnel making loss of limb and facial disfigurement a very real danger on the battlefield.

During the siege of Antwerp in 1832, Alphonse Louis was hit in the face by a large piece of shrapnel which completely destroyed his lower jaw. Thanks in equal measure to luck and to the quick actions of the field surgeon, he did not die from the severe injury. Though Alphonse survived the initial ordeal, however, he was left with virtually no quality of life: he could no longer eat or talk, and he drooled constantly as his tongue dangled from his throat. At the time even the best surgeons had little idea how to reconstruct an entire jaw, and so a more creative solution had to be found.

An Elegant Solution

Once it became clear that Alphonse was going to recover, it was suggested that a mask should be made for him to allow him as normal a life as possible. Not only would a prosthetic help to disguise the extent of his injury, it would also allow him to use his tongue much more easily, enabling him to speak and eat again. A cast was taken of his face and passed to a silversmith so that a replacement jaw could be constructed for the injured soldier. The completed silver mask weighed three pounds and was painted to resemble flesh; it even had a moustache and whiskers made of real hair to make it appear as realistic as possible. The mask incorporated a tray that could be emptied of saliva, and its fastenings could be hidden by a cravat.

Alphonse’s new jaw was designed to look as realistic as possible

Rehabilitation

Alphonse returned to Lille in 1833. Doctors from the Royal College of Physicians in Edinburgh reported that when they visited him the day before his departure, he was in high spirits, could talk perfectly clearly, and had even managed to gain weight since his unfortunate injury. Alphonse seemed quite happy to show off his mask, which he was able to take off and adjust one-handed, and was reportedly as proud of it as he was of his medals.

Sources and Further Reading

Strackee, S. D., ‘Mandibular reconstruction revisited; on modeling and fixation techniques of the fibular free flap’, University of Amsterdam, (2004)

Kaufman, M. H., McTavish, J., and Mitchell, R., ‘The gunner with the silver mask: observations on the management of severe maxillo-facial lesions over the last 160 years’, Journal of the Royal College of Surgeons of Edinburgh, 42(6), (1997), pp. 367–75

Gout: A Most Desirable Disease

Gout is a painful type of arthritis caused by an excess of uric acid in the blood, which builds up in the joints and causes painful swelling and inflammation. Though not typically dangerous, it can be debilitating if left untreated. The condition is commonly linked to overconsumption of alcohol and rich foods combined with general inactivity.

Though gout is usually associated with the decadence of the pre-revolutionary era, the disease has been affecting people for thousands of years. The ancient Greeks observed that the malady only seemed to affect older, sexually active men, and thus associated it with those who were typically healthy, rather than people who were usually sickly. This association continued throughout the centuries, causing gout to become something of a desirable disease to have, as it was seen as a sign of a vigorous constitution.

A Badge of Honour

Gout became a fashionable malady partly because it afflicted many wealthy and influential people. As well as royalty and nobility, famous sufferers included intellectuals such as Samuel Johnson, John Napier, and Pitt the Elder, and the desire to be associated with the great and good made gout an attractive status symbol. The disease was widely believed to be hereditary, running in good families, and so was taken as a sign of good breeding. It was also thought to ward off other, more serious illnesses, such as dropsy, apoplexy (stroke), or fevers. Gout was therefore seen not as a problem but as a boon, and many of those afflicted were quite happy to endure its agonies in exchange for its supposed benefits.

Tincture Treatments

One of many remedies marketed for gout sufferers

Though people were unwilling to have their gout cured, they were eager to relieve its painful symptoms. Since antiquity, the typical treatment had been bloodletting, following the assumption that the ailment was caused by an excessive buildup of blood. In 1683, Thomas Sydenham, known as the English Hippocrates, wrote his own treatise on gout, perhaps because he himself was a long-term sufferer. He recommended a light diet and regular doses of his own digestive remedy containing various herbs, which he dubbed ‘bitters’. The tincture became very popular, and other entrepreneurs soon began marketing their own remedies. In 1783, Nicolas Husson added meadow saffron (colchicum) to his mixture; the new ingredient turned out to be effective, and colchicum became a popular remedy. George IV famously suffered from gout among a plethora of other health problems caused by his gluttony and laziness, and he rejected the more traditional treatments of his physicians, preferring to self-medicate with colchicum. This royal endorsement helped to elevate colchicum’s status, and it soon became the remedy of choice.

A Chemical Cure

In 1776, Carl Scheele, best known for introducing the arsenical pigment Scheele’s Green, isolated and identified uric acid. Over the following century, other chemists discovered the acid in the joints of gout sufferers and made the connection between the two. Colchicum was later found to contain the alkaloid colchicine, which relieves gout by suppressing the inflammatory response to the urate crystals in the joints, and colchicine remained central to the treatment of gout until very recently. Today, the condition is generally treated with non-steroidal anti-inflammatory drugs (NSAIDs) such as ibuprofen.

The Rich Man’s Disease?

Following the French Revolution, gout soon fell out of fashion as it came to be associated with the decadence of the wealthy that the populace had so vehemently rejected. It was no longer flaunted as a sign of success, but instead came to be viewed as a mark of laziness and self-indulgence. Though the disorder today affects only around 2% of people, it is seeing something of a resurgence as people live longer and general lifestyles and eating habits shift.

Sources and Further Reading

Barnett, R., The Sick Rose: Disease and the Art of Medical Illustration, (London, 2014)

Porter, R., ‘Gout: Framing and Fantasizing Disease’, Bulletin of the History of Medicine, (Spring 1994)

https://www.nhs.uk/conditions/gout/

Leprosy: The Ancient Scourge

Leprosy is one of the oldest known diseases, with written references to an illness that scholars suspect was leprosy dating back to as early as 1500 BC. Geneticists have traced the origins of the disease to East Africa, from where it spread across the continent and then into Europe along trade and migration routes. When leprosy became a common affliction in Europe during the medieval period, it was seen as much a moral affliction as a physical one, as it was widely believed that sufferers were being punished by God.

Today, leprosy (now referred to as Hansen’s Disease) is known to be caused by bacteria, and is treatable with antibiotics. Though cases of the disease do still occur, instances are much rarer and tend to be isolated to areas with tropical climates.

The Living Dead

In medieval Europe, people afflicted with leprosy were commonly referred to as ‘lepers’ and were shunned by mainstream society. They were often segregated and forced to live in colonies with others like them, many of which were run or aided by monasteries. Though most of these church-run hospices, known as leprosaria, provided food, comfort and what medical care they could, some religious sects took a more dismal view of the disease. Some Catholic communities, for example, viewed leprosy as a kind of living death and held ceremonies in which lepers were declared symbolically dead: sufferers were made to lie in a grave while a priest recited burial rites over them. Lepers were not allowed to own property, and those who could not find a place in a colony had to rely on begging. Famously, lepers had to ring a bell to warn people of their approach.

A Horrific Affliction

One reason that leprosy inspired such fear was its disfiguring appearance. While some forms of the disease were relatively mild, particularly when the infected person had a strong immune response, it could also present in ways that made the sufferer appear quite frightening. Famously, some advanced forms of the disease could seem to cause body parts to fall off, as nerve damage led to a loss of feeling that meant injuries and infections often went unnoticed. Sometimes the septum (the central part of the nose) would collapse, leaving the nose disfigured or even completely detached. In other cases nerve damage paralysed the facial muscles, causing the face to sag lopsidedly. These physical disfigurements only exacerbated the superstitious beliefs surrounding the disease, and people suffering from leprosy often received little sympathy.

A victim of tuberous leprosy, where the nose has completely detached

A Cure in Oil

For several centuries, the only known effective treatment for leprosy was chaulmoogra oil, from a tree native to India and South-East Asia. Though the oil had long been used in Eastern medicine, it only entered Western practice in 1854, when it was introduced to Europe by a professor at Bengal Medical College. Taking the oil, however, was almost as unpleasant as the disease itself: if swallowed it caused severe nausea, and injections caused fever and rashes. Nor was the treatment completely effective, and there was no guarantee of being cured. Despite these drawbacks, chaulmoogra oil remained the best treatment for leprosy until the 1940s, when antibiotics began to be developed. Promin, trialled by Guy Henry Faget at the National Leprosarium in Carville, Louisiana, was the first drug shown to be effective against leprosy. The hospital, now known as the Gillis W. Long Hansen’s Disease Center, still houses some patients, and new treatments and vaccines for Hansen’s disease are still being developed today.

Sources and Further Reading

https://web.stanford.edu/class/humbio103/ParaSites2005/Leprosy/history.htm

https://www.healthline.com/health/leprosy

https://en.wikipedia.org/wiki/History_of_leprosy

https://www.hrsa.gov/hansens-disease/history.html

Barnett, R., The Sick Rose: Disease and the Art of Medical Illustration, (London, 2014)

It’s all in your Head: The Phrenology Movement

Phrenology is the idea that the shape of the head is a physical manifestation of the personality. By feeling the shape and contours of a subject’s head, phrenologists believed they could determine their overall personality, including any predisposition they might have to insanity or criminality. The theory was developed in 1796 by Franz Joseph Gall, who believed that bumps on the skull were caused by pressure from the brain, and that this could help inform assumptions about a person’s fundamental characteristics. The theory is now recognised as a complete pseudoscience, but it nevertheless enjoyed considerable popularity for well over 100 years.

A phrenology chart showing the supposed locations of different personality traits

Well-Intentioned Ideas

Gall believed that the brain was made up of 27 separate organs, each responsible for a different personality trait, such as integrity, willpower, or intuition. He reasoned that areas that were frequently used and well exercised would grow larger, whilst areas that were neglected would shrink, creating a unique topography that could be ‘read’ to determine a person’s character. The idea was popularised in Scotland, and subsequently the rest of Britain, by George Combe, who in 1820 formed the Edinburgh Phrenological Society, which became the centre of the British phrenology movement. At the peak of its popularity the society had 120 members, many of whom came from medical backgrounds. Many early proponents of phrenology thought it could be used as a basis for positive social reform, believing that the mind had only to be exercised in the right way to improve a person’s character, and many were advocates for education, asylum reform, and the rehabilitation of criminals.

Popular Science

Phrenology became extremely popular in Britain partly because Combe cleverly marketed it towards the newly emerging middle classes. Scientific lectures were becoming a popular form of entertainment as people hastened to appear intellectual and sophisticated, distinguishing themselves from the uneducated lower classes. The fashionable ideal of self-betterment (and the benevolent improvement of the poor) meant that phrenology became a huge money-maker for practitioners, who offered personal examinations and sold their charts and books to the nouveau riche.

However, this popularity and rapid monetisation meant that many intellectuals soon came to regard phrenology as little more than a parlour trick, and it was relegated to a fringe movement within the medical and scientific communities. Interest among scientists and social reformers began to wane by the 1840s as they drifted towards psychiatry, anthropology, and criminology instead.

Pseudoscience to Neuroscience

Although phrenology was considered a pseudoscience very early on, it was based on principles that were not too far from the truth. The idea that certain areas of the brain are connected to particular thoughts and emotions was an important step towards modern neuropsychology and the understanding of brain function. It is now understood that different areas of the brain do indeed control different things, but rather than dictating specific personality traits, the brain is divided by much more general functions such as motor skills, cognition, and perception. In 2018, a group of neuroscientists conducted a study to see whether phrenology had got anything right at all, comparing thousands of MRI scans with lifestyle and demographic data and with cognition and language tests. They did discover some interesting associations, including a strong link between the traits ‘amativeness’ (sexual arousal) and ‘words’: in short, the more sexual partners someone had had, the higher their verbal fluency. Overall, however, they (unsurprisingly) found no correlation between skull contours and Gall’s personality traits.

A Theory Disproved

Ironically, it was the efforts to prove the accuracy of phrenology that ultimately led to its downfall. Early experiments in which areas of the brain were dissected or impaired, in both animals and humans, showed that the regions Gall had originally identified as dictating certain faculties did not match up with their actual functions. As more and more emphasis was placed on empirical evidence for scientific theories, it became apparent that phrenology simply did not hold up. The theory nevertheless retained some support well into the twentieth century, and continued to be used to justify racist ideologies such as eugenics until the 1950s.

Sources and Further Reading

https://archive.org/details/systemofphrenolo00combuoft/page/n7/mode/2up

https://www.cambridge.org/core/journals/british-journal-for-the-history-of-science/article/science-and-selfassessment-phrenological-charts-18401940/8224EA2B31FB3EA67BEA89AE5B63EAF5/core-reader

https://www.history.com/topics/germany/eugenics#:~:text=Eugenics%20is%20the%20practice%20or,characteristics%20from%20the%20human%20population.

https://theconversation.com/neuroscientists-put-the-dubious-theory-of-phrenology-through-rigorous-testing-for-the-first-time-88291

https://www.sciencefocus.com/the-human-body/phrenology-from-bumps-on-the-head-to-the-birth-of-neuroscience/

https://en.wikipedia.org/wiki/Edinburgh_Phrenological_Society

A Picture of Health: Asylum Photography

Before the reform movement of the mid 1800s, insane asylums were seen as little more than dumping grounds for those who were considered to have no place in society. Likewise, the people who found themselves confined in ‘madhouses’ were often regarded with fear and ridicule. There were many different theories about what caused mental illness, and just as many about how to diagnose and cure it. As new technologies such as photography emerged, forward-thinking practitioners began to explore the potential therapeutic and diagnostic benefits they might bring.

A Difficult Diagnosis

Before psychiatrists had a clear idea of what caused mental illness, they had only external symptoms on which to base their diagnoses. As well as observing abnormal behaviours, many early psychiatrists believed that the key to diagnosing mental illness lay in facial expressions, an idea known as physiognomy. Many practitioners believed that a person had to be able to present an outward appearance of normality in order to be considered ‘cured’; conversely, any abnormal appearance could be classified as a symptom of mental disorder or even criminality. The idea was closely linked to phrenology, the theory that a person’s personality and predispositions could be determined from the shape and contours of their head.

Photography as Therapy

The advent of new photographic techniques gave doctors a useful tool with which to diagnose and classify their patients. The first person to photograph patients in an asylum setting was Hugh Welch Diamond, superintendent of Surrey County Asylum during the 1850s. He was an avid proponent of photography and firmly believed it could be used therapeutically to help cure his patients: by showing patients photographs of themselves, he argued, they would be forced to recognise and confront their illness, which would ultimately aid their recovery.

Many other influential psychiatrists agreed with this notion, including John Conolly, director of Hanwell Asylum and the first practitioner to adopt complete non-restraint in an asylum. Conolly was a strong advocate of the humane treatment of psychiatric patients, and used photographs to demonstrate the positive impact of his regimen. He published a number of articles illustrated with ‘before and after’ photographs of his patients, demonstrating their recovery and return to respectability.

Reproduced recovery photos published in the Medical Times Gazette

A Clearer Picture

Some medical practitioners considered photographs more scientific than drawings because they were seen as completely objective: unlike drawings, they would be free from any preconceptions or biases that might have been consciously or unconsciously added by an artist. This was not entirely the case, however. Like any portrait artist, photographers often posed their subjects in very specific ways, and asylum practitioners frequently staged their photographs to highlight the physiognomic features they believed they saw in their patients. Photographs were often taken after a diagnosis had already been made, for example, and were therefore intended to display the features that the psychiatrist believed typified the disorder in question.

The invention of the calotype in 1840 meant that photographs could be produced much more quickly than before and, most importantly, could be easily duplicated, as the images were produced on paper rather than on copper plates. Photographs could therefore be reproduced and distributed in publications available both to other professionals and to the general public, allowing asylum patients to be presented in a more humanised way, rather than in the fanciful or romanticised manner in which artists had often portrayed madness.

A Lasting Impact

Even though physiognomy was little more than a pseudoscience, the images of asylum patients still produced some positive outcomes. By publishing photographs of recovered patients, Conolly and Diamond helped to humanise those suffering from mental disorders. Far from being ‘hopeless cases’, these images demonstrated that with the right care people could recover. The public came to view them with sympathy rather than fear or derision, and the treatment of asylum patients gradually improved as a result.

Sources and Further Reading

https://fr.wikipedia.org/wiki/Hugh_Welch_Diamond

https://www.britannica.com/technology/photography/Photographys-early-evolution-c-1840-c-1900

https://repository.upenn.edu/cgi/viewcontent.cgi?article=1510&context=asc_papers