An Abbreviated History of Neuroscience | Guest Blog

Image of a human head depicting neuroscience

Where We Were vs. Where We Are Today

Ah, neuroscience, the study of the squishy, slimy, three-pound computer that rests between our ears. Although the brain is the most complex organ in the body (or as a review in Trends in Cognitive Sciences aptly put it, “one of the most complex multicellular structures in biology”), neuroscience itself is a mere 55 years old.

That’s right — the study of the brain, this omnipotent blob of protein, fat, and soft tissue, is probably much younger than your grandparents.

However, that’s not to say humans didn’t attempt to study the brain prior to 1965. There were certainly gallant efforts to make sense of the inner workings of the human mind, arguably starting with some of the greatest ancient philosophers, artists, authors, and poets.

Then we have psychology, the study of human behavior, founded as a formal discipline at the University of Leipzig in Germany in 1879. Understandably, psychology and neuroscience are deeply intertwined, woven together like thick vines laced around the bark of an ancient tree.

Perhaps in the fashion of academic sibling rivalry, neuroscience would ultimately lead to breakthroughs that shattered long-standing tropes projected from the works of notable figures in psychology, like Sigmund Freud.

“In the 1950s, especially in the United States, Freud and his successors stood at the center of all cultural explanations for psychological suffering,” says an essay by the Society for Neuroscience. “In the new millennium, we perceive such suffering as erupting no longer from a repressed unconscious but, instead, from a pathophysiology rooted in and caused by brain abnormalities and dysfunctions.”

And so our story begins in the mid-20th century, at the dawn of pop psychology and radical advancements in science and medicine — just on the brink of the counterculture movement, mind you.

“The physical nature of the brain made it especially difficult to study,” the essay continues. “On gross visual inspection, the brain looks like a gelatinous mass. The invention of the microscope at the end of the 17th century did little to help scientists visualize the inner substrates of neurons and glia.”

Even with improved microscopes, it was still rather difficult — dare I say, near impossible — for scientists to get a clear view of what a neuron actually looks like. After all, the size of a neuron ranges from 4 microns to 100 microns long. Or, 0.004 mm to 0.1 mm, to put it into perspective.

As you can imagine, picturing some of the most complex tasks stemming from 0.004 mm cells in a groovy, fleshy mass was an arduous effort. The earliest neuroscientists weren’t just intelligent; they had to be pretty damn imaginative, too.

Nevertheless, they persevered. Throughout the decade, researchers who specialized in anatomy, biochemistry, neurology, physiology, and pharmacology put their heads together to map neural pathways and systems. They identified and characterized neurotransmitters. Simultaneously, they dirtied their hands unveiling the mysteries behind movement, pain, memory, and vision. These studies took place in pristine laboratories at universities across the globe.

A vibrant energy radiated through the field, akin to childlike wonder. It didn’t take long for the Committee on Brain Sciences — the very first neuroscience committee in the United States and a forerunner of the Society for Neuroscience — to realize the field lacked a central focus.

The committee published various reports to inspire more united efforts among researchers who were scattered across miles of land and ocean. After all, what good was it to have so many brilliant brains chipping away at some of the greatest biological wonders if their efforts lacked order or consistency? Who would fund such radical, disjointed research?

During the 1970s, neuroscience continued to expand our understanding of human behavior. Most prominently, the discovery of opioid receptors captured the public’s attention and drew focus toward “natural highs,” pain, and addiction as the War on Drugs raged without an ounce of mercy. We gained new insights into the inner workings of memory, as well.

Moral and ethical dilemmas plagued neuroscience, just as they did every other field of science during this era. These became particularly apparent during the 1980s, as People for the Ethical Treatment of Animals (PETA) began putting “mad scientists” under fire for their seemingly blatant disregard for animal welfare. This perceived aloofness during pain and spinal-injury experiments on primates and house pets would no longer be tolerated by many.

Not all researchers were heartless in the pursuit of science, of course. But a few bad apples quickly spoiled the pie, leaving a foul taste in the mouths of activists, the media, prosecutors, and judges.

Proactively, the Society for Neuroscience formed the Committee on Animals in Research to establish more ethical practices and to demonstrate the (unfortunate) necessity of animal testing in modern medical research. The discoveries made during these animal studies would go on to improve the welfare of countless individuals worldwide, after all. But that progress came at the cost of running the trials on animals first.

Like a dividing cell, neuroscience continued to expand and diversify as the 20th century drew to a close. A survey distributed by the Society for Neuroscience in 1982 found that 92 percent of neuroscientists worked in at least two broad areas of the field.

Approximately half of the respondents held positions at hospitals or at veterinary or medical schools. Thirty-four percent were affiliated with universities, while the rest worked in a variety of other medical and scientific departments.

Although the society strove for ethnic and gender diversity, it should come as no surprise that 79 percent of respondents were male and 91 percent were Caucasian.

Technological advancements allowed the scope of research within the field to reach new heights, as well. fMRI (functional magnetic resonance imaging) and PET (positron emission tomography) gave neuroscientists the ability to peer deeper into the magic behind brain functioning.

fMRI scans measure and map brain activity based on changes in blood flow, while PET scans measure metabolic processes within the brain, i.e., which regions are consuming energy during certain tasks or brain states.

Throughout the late 1980s and early 1990s, neuroscientists identified the genes responsible for Huntington’s disease and Duchenne muscular dystrophy, and began to uncover the biological roots of Alzheimer’s disease.

In an effort to garner federal recognition and underline the importance of neuroscience during this period, the society’s Government and Public Affairs Committee campaigned to have the 1990s proclaimed the “Decade of the Brain.”

And it worked. “In July 1989, President George H.W. Bush signed a joint congressional resolution designating the 1990s as the ‘Decade of the Brain,’” the essay explains. “Decade of the Brain” became a powerful slogan, used to persuade legislators to allocate more funds toward scientific research.

In a testimony before the Senate Appropriations Committee in 1991, the Society for Neuroscience’s former president Dominick Purpura said, “[The Decade of the Brain] could be a prelude to the Century of Man, in which humankind will be emancipated from the dread of disability and the stigma of dehumanization that attends dissolution of the human spirit.”

Where We Are Today

It has been nearly 30 years since Purpura’s prophetic testimony. Two decades into “the Century of Man,” and facing a ruthless, global pandemic, where do we stand today?

Well, for starters, “emancipation from the dread of disability” does not come without its moral and ethical dilemmas. As physician and author Siddhartha Mukherjee explains in his book “The Gene: An Intimate History,” the technology to screen fetuses for devastating ailments is just around the corner.

But the repercussions of such technology are fertile ground for a revival of eugenics, the unforgiving discipline of deciding which heritable traits are desirable. The same unforgiving discipline that has a history of fueling mass genocide and horrifying discrimination.

“We want to eliminate the suffering, but we also want to ‘keep those sufferings,’” Mukherjee says while describing such moral conflicts. “… If the history of the last century taught us the dangers of empowering governments to determine genetic ‘fitness,’ then the question that confronts our current era is what happens when this power devolves to the individual — to carve out a life of happiness and achievement, without undue suffering — with the desires of a society that, in the short term, may be interested only in driving down the burden of disease and the expense of disability.”

While comparing genetics and neuroscience may seem like comparing apples to oranges, they’re both relatively new fields with ambitious goals. Perhaps it is not “emancipation from the dread of disability” we should seek, but rather ways to establish equilibrium between suffering and comfort.

Effective treatment options — behavioral, pharmaceutical, and natural — have continued to surface since the Century of Man took hold. Notable public efforts to destigmatize seeking treatment have taken place throughout the last decade, too.

Our school teachers, counselors, employers, politicians, screenwriters, celebrities, parents, and children are paying more mind to mental and cognitive health than ever before.

As of August 2020, in the late summer heat and blaze of the pandemic, the US National Library of Medicine’s ClinicalTrials.gov registry lists 134 active, enrolling, and recruiting clinical studies under the search term “neuro.” And that figure certainly doesn’t do the field justice, either.

There are, of course, neurological studies that will not appear under the term “neuro” because they bear the name of the neurological illness they’re focused on. And naturally, there are neurological studies taking place elsewhere, in the distant landscapes beyond our nation.
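For readers who want to poke at the numbers themselves, a search like this can be reproduced programmatically. Below is a minimal Python sketch, assuming the current ClinicalTrials.gov v2 API maintained by the National Library of Medicine; the endpoint, parameter names, and status values are assumptions about that API rather than anything cited in this article, and the interface has changed since 2020, so treat it as illustrative only.

```python
# Rough sketch: count recruiting studies matching a search term on ClinicalTrials.gov.
# Assumes the v2 REST API (https://clinicaltrials.gov/api/v2/studies); field names
# and status values come from that API and may differ from the 2020-era interface.
import requests

API_URL = "https://clinicaltrials.gov/api/v2/studies"

params = {
    "query.term": "neuro",                 # same broad search term used above
    "filter.overallStatus": "RECRUITING",  # one of several "active" statuses
    "countTotal": "true",                  # ask the API to include a total count
    "pageSize": 1,                         # we only care about the count here
}

response = requests.get(API_URL, params=params, timeout=30)
response.raise_for_status()

total = response.json().get("totalCount")
print(f"Recruiting studies matching 'neuro': {total}")
```

Broadening the status filter, or the search term, will of course return a very different count than the 134 studies cited above.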

Neuroscience has made, and will continue to make, profound breakthroughs throughout this century, especially as technology, funding, and awareness continue to grow. While it may seem the “big discoveries” have already been made, that couldn’t be more inaccurate.

Five and a half decades of research have taught us so much about our minds and bodies, from the phenomena of memory to the enigmatic gut-brain axis. But major gaps in our understanding of the brain still loom over us. We hardly understand how dreams work, and there’s still much, MUCH debate surrounding consciousness.

Yet these two big questions hardly scratch the surface of what’s left for us to explore. The outlook is optimistic; the future is undoubtedly ripe with scientific opportunities.

Image: Adobe Stock

