7 Myths about Creativity
Creativity researchers and psychologists may disagree about exactly what creativity is, but there is growing consensus about what it certainly is not!
If your friend circles are anything like mine, then you’re probably sick of hearing this question:
What is creativity?
Pose this question to ten different neuroscientists and psychologists and you’ll get ten different answers. Trust me, I’ve asked.
In all fairness, there are some consistencies across the different responses I’ve received; however, any hope of consensus around a definition of creativity is still on the distant horizon.
After reading several dozen books about the topic of creativity (there are millions), I’ve noticed that even though experts may disagree on a definition of the word, they share broad agreement on what creativity is not. More surprisingly, what they have discovered often contradicts popular beliefs about creativity.
Even though experts may disagree on the definition of [creativity], they share broad agreement on what it is not.
So in our ongoing attempts to understand our creativity, here are some myths about the process that you should definitely put to rest.
Myth 1: Only a chosen few are born creative
The prevailing man-on-the-street wisdom is that certain people are born creative while others are not. Some may even cite anecdotal evidence about a friend, cousin or brother who could “always draw,” forgetting that most children, up to a certain age, enjoy drawing.
There is overwhelming evidence from recent brain science that human beings are not born with a fixed creative capacity. In fact, it's quite the opposite: what we recognize as creativity can be expanded and improved with effort, much like training a muscle. Furthermore, this improvement can be accomplished at any age.
People who “appear” to be creative from birth simply have more immediate agency over the brain states that facilitate creative work. For instance, most human beings are born with the ability to breathe on their own; it's not a special trait. However, not every successfully breathing human being knows how to use their breathing to control mood, mental state or cognitive function. This sort of proficiency can be acquired early or late in life.
Myth 2: Painters and poets are creative but certainly not doctors or accountants
People often associate the word “creative” with individuals who have chosen a profession in the arts of some kind. They think of poets, painters, and musicians as creative but fail to draw the same conclusion about professions in the sciences.
The start-up accountant who figures out how to extend the life of his cash-strapped firm for another three months is being just as creative as the poet who pens a new sonnet. Physicists developing new theories, entrepreneurs developing new services or products and doctors diagnosing and developing new treatments for diseases are all exercising tremendous creativity.
A helpful way to understand the universality of creativity is not to think of the practitioner as a “creative,” but to think of the person as engaging in “creative thinking.” Viewing creativity as a mode of thinking regardless of the domain emphasizes just how pervasive it is in humans.
Myth 3: People are Left-brain or Right-brain dominant
This model about how the brain is structured, though once useful, is now widely believed to be inaccurate and a bit antiquated. The idea that humans are either left-brain or right-brain dominant originated from a series of studies conducted on patients who had their corpus callosum severed.
In these patients, the left hemisphere could no longer communicate with the right, and a series of interesting behaviours was observed as a result. For example, some patients could recognize and read letters and symbols presented in their right visual field but not their left (the right side of the brain controls the left side of the body and vice versa). Others could hum along with a tune piped into their left ear but not their right one.
Based on such compelling results, scientists developed the familiar left-brain, right-brain model of brain function. The left brain is believed to specialize in rational, analytic thought such as mathematics and language processing, while the right brain specializes in more non-verbal tasks like spatial awareness, music, emotion and intuition.
Current brain-imaging technology has revealed that this simple left-right delegation of tasks to particular hemispheres isn’t exactly accurate. There is far more cross-communication between the hemispheres on all tasks than was previously thought. Although there is some degree of hemispheric specialization, most brains engage both hemispheres quite dynamically on many tasks.
It's actually more accurate to think of different brain states for different types of activities rather than different brain regions handling one type of activity over others.
Myth 4: Mental illness improves creativity
Forget all the Hollywood lore and popular fiction fetishizing the “mad genius” or “tortured artist.” Despite well-known examples like Van Gogh, Howard Hughes or Nina Simone, there is no evidence to support a link between mental illness and enhanced creative output. Such famous examples remain the exception, not the rule.
Despite the perception, highly creative individuals with mental illnesses make up only a tiny percentage of the larger population of exceptional creatives. In fact, upon further investigation, artists like Van Gogh and Sylvia Plath were creative despite their illnesses, not because of them.
The image, though, is a persistent one. This could be because of what creativity researcher Scott Barry Kaufman calls the “messy creative mind.” He reports that highly creative minds are often characterized by a compendium of paradoxes, capable of maintaining contradictory extremes such as fierce individualism and devoted collectivism, or a high tolerance for disorder alongside an ability to wrest order out of chaos.
Such extremes in the same person may lead observers to conclude that creatives are mentally unstable when, in fact, they are quite psychologically sound.
Myth 5: Creativity is an individual endeavour
Consider this assertion: Thomas Edison invented the lightbulb. Or this one: Steve Jobs created the iPhone. As much as we love to repeat them, both statements are, of course, false.
Thomas Edison employed an army of engineers at Menlo Park, and so did Steve Jobs at Apple. These creative geniuses understood something fundamental about creativity: it thrives and multiplies with collaboration. Some historians, like Edmund Morris, will even say Edison’s greatest invention was his team and culture at Menlo Park, not the thousands of inventions with which he is credited.
Similarly, Steve Jobs crafted the cultures of Apple and Pixar to encourage group interaction and what he called “social collisions.” Jobs insisted the work space at Pixar be built around an enormous atrium which everyone had to pass through to get to their cubes and offices. This raised the frequency of worker interactions, sharing of ideas and likelihood of collaborations. Apple's current headquarters is built in the shape of a giant ring for the same reasons. These creative collaborations gave us our beloved iPhones, iPods and Macs; hardly the products of a singular tireless genius tinkering away in a dark bunker.
Myth 6: Creativity strikes in a flash of brilliance
This one is a myth only in the way it is recast as a near-mystical event. Tales of Archimedes running naked through the streets of Syracuse shouting “Eureka,” or August Kekulé discovering the ring structure of benzene* in a dream, are quite inspiring to hear. Sadly, these over-dramatizations of miraculous moments of insight represent only a tiny sliver of the entire creative process.
The truth is less dramatic. Research by creativity myth-debunker, David Burkus, shows that such moments of insight are usually the culmination of months and often years of prior hard work. If the insight is the visible tip, then the prior failed attempts and long hours of study are the entire iceberg.
It takes time for the brain to assimilate and synthesize information in a particular problem space, but our minds are remarkably good at condensing information in this way. Think about the amount of mental effort you had to exert when you were learning how to drive a car. You had to think about steering, speeding up and slowing down, shifting gears (if you learned on a stick-shift transmission) and many other details that seemed overwhelming at the beginning. However, ten years later, when was the last time you consciously thought about those things while driving? More likely, you’re thinking about an office meeting, picking up the kids and fixing your hair, all while switching lanes on a freeway at 80 mph.
Our minds are good at this type of chunking and synthesis of information, and we do it all the time. When such synthesis happens in a specific problem space, the end result can seem like a flash of insight from the gods. It's more likely your mind finally grasping the problem space after countless hours of wrestling with it.
* Benzene is an important hydrocarbon used to make everything from plastics and synthetic fibers to life-saving drugs and detergents.
Myth 7: Creativity cannot be learned
Perhaps the greatest and most debilitating myth of all is the view that humans are given a fixed amount of creative capacity. Thankfully for us life-learners, this has been thoroughly debunked by neurologists and psychologists all over the world.
In the past, scientists believed the human brain reached its intellectual peak around 40 years old before slowly declining all the way into the 70s and 80s when new learning was thought to be almost impossible. Today we know that the brain remains incredibly elastic and pliable well into the late 60s, and even later for some.
So, since we know that the mental “hardware” remains capable for the majority of one’s lifetime, the next question becomes: is creativity learnable?
According to psychologists like Mitchel Resnick, head of the Lifelong Kindergarten group at the MIT Media Lab, the answer is a resounding “Yes.” To improve educational creativity, Resnick says, “all stages of education should allow more time for students to work collaboratively on interdisciplinary projects that pique their interests.” Which is to say, a creative learner’s curiosity should lead them to the subjects and people that most interest them.
As long as curiosity is maintained, it can be used as a driver for cultivating creative capacity throughout one’s life. After all, we are all born with curiosity and some degree of creative capacity; learning and problem-solving would be next to impossible if this were not the case. Matisse, Georgia O’Keeffe and Buckminster Fuller are just a few examples of creative notables whose output increased (and changed) in the latter years of their lives as their craft improved over a lifetime. They allowed themselves to be led by their curiosity, even when it meant changing direction or, in some cases, changing fields entirely.
Summary
There are many more myths about creativity worth debunking, but in my studies and everyday conversations, these seem to be the most pervasive. Psychology and neuroscience, like many fields today, are developing at a rapid pace. Especially in the study of creativity, which is receiving increased attention as job automation advances, it's not uncommon for once-heralded ideas to fall out of favor due to new evidence and better technology-aided experiments.
So to summarize, here are the ideas you can take away with you:
All human beings are born with a capacity for creativity.
Creativity is exhibited in all fields, not just the “artistic” ones.
The left vs. right brain model is incomplete; the brain uses both hemispheres in tandem for most tasks.
There is no correlation between mental illness and enhanced creativity (despite what Hollywood tells us).
Creativity is not a solo act; it works better in collaboration.
Creative flashes of insight are usually the result of prior, extensive hard work.
Creativity can be cultivated at any age, partly because the brain remains elastic and supple well into old age.