Is suicide the deadly result of a deep psychological condition — or a fleeting impulse brought on by opportunity?
"We must not think too much," cries Euripides' Medea. "People go mad if they think too much."
In other videos and blog postings, Spikol, 39, a writer in Philadelphia who has bipolar disorder, describes a period of psychosis so severe she jumped out of her mother's car and ran away like a scared dog.
In lectures across the country, Elyn Saks, a law professor and associate dean at the University of Southern California, recounts the florid visions she has experienced during her lifelong battle with schizophrenia — dancing ashtrays, houses that spoke to her — and hospitalizations where she was strapped down and force-fed medications.
Like many Americans who have severe forms of mental illness such as schizophrenia and bipolar disorder, Saks and Spikol are speaking publicly about their demons. Their frank talk is part of a conversation about mental illness that stretches from college campuses to community health centers, from YouTube to online forums.
"Until now, the acceptance of mental illness has pretty much stopped at depression," said Charles Barber, a lecturer in psychiatry at the Yale School of Medicine. "But a newer generation, fueled by the Internet and other sophisticated delivery systems, is saying, 'We deserve to be heard, too.' "
About 5.7 million Americans over 18 have bipolar disorder, which is classified as one of a group of mood disorders, according to the National Institute of Mental Health. Another 2.4 million have schizophrenia, which is considered a thought disorder. The small slice of this disparate population who have chosen to share their experiences with the public liken their efforts to those of the
gay rights movement of a generation ago.
Just as gay rights activists reclaimed the word queer as a badge of honor rather than a slur, these advocates proudly call themselves mad; they say their conditions do not preclude them from productive lives.
Mad pride events, organized in at least seven countries including the United States, draw thousands of participants, said David W. Oaks, director of MindFreedom International, a nonprofit group in Eugene, Ore., which tracks the events and says it has 10,000 members.
Recent activities include a Mad Pride Cabaret in Vancouver, British Columbia; a Mad Pride March in Accra, Ghana; and a Bonkersfest in London that drew 3,000 participants.
Members of the mad pride movement do not always agree on their aims. For some, the objective is to destigmatize mental illness. A vocal, controversial wing rejects the need to treat mental afflictions with psychotropic drugs and seeks alternatives to the shifting, often inconsistent care offered by the medical establishment. Many say they are publicly discussing their struggles to help those with similar conditions and to inform the public.
"It used to be you were labeled with your diagnosis and that was it; you were marginalized," said Molly Sprengelmeyer, an organizer for the Asheville Radical Mental Health Collective, a mad pride group in North Carolina. "If people found out, it was a death sentence, professionally and socially. We are hoping to change all that by talking."
In 1996, Tom Wolfe wrote a brilliant essay called “Sorry, but Your Soul Just Died,” in which he captured the militant materialism of some modern scientists.
To these self-confident researchers, the idea that the spirit might exist apart from the body is just ridiculous. Instead, everything arises from atoms. Genes shape temperament. Brain chemicals shape behavior. Assemblies of neurons create consciousness. Free will is an illusion. Human beings are “hard-wired” to do this or that. Religion is an accident.
In this materialist view, people perceive God’s existence because their brains have evolved to confabulate belief systems. You put a magnetic helmet around their heads and they will begin to think they are having a spiritual epiphany. If they suffer from temporal lobe epilepsy, they will show signs of hyperreligiosity, an overexcitement of the brain tissue that leads sufferers to believe they are conversing with God.
Wolfe understood the central assertion contained in this kind of thinking: Everything is material and “the soul is dead.” He anticipated the way the genetic and neuroscience revolutions would affect public debate. They would kick off another fundamental argument over whether God exists.
Lo and behold, over the past decade, a new group of assertive atheists has done battle with defenders of faith. The two sides have argued about whether it is reasonable to conceive of a soul that survives the death of the body and about whether understanding the brain explains away or merely adds to our appreciation of the entity that created it.
The atheism debate is a textbook example of how a scientific revolution can change public culture. Just as “The Origin of Species” reshaped social thinking, just as Einstein’s theory of relativity affected art, so the revolution in neuroscience is having an effect on how people see the world.
And yet my guess is that the atheism debate is going to be a sideshow. The cognitive revolution is not going to end up undermining faith in God; it’s going to end up challenging faith in the Bible.
Over the past several years, the momentum has shifted away from hard-core materialism. The brain seems less like a cold machine. It does not operate like a computer. Instead, meaning, belief and consciousness seem to emerge mysteriously from idiosyncratic networks of neural firings. Those squishy things called emotions play a gigantic role in all forms of thinking. Love is vital to brain development.
Researchers now spend a lot of time trying to understand universal moral intuitions. Genes are not merely selfish, it appears. Instead, people seem to have deep instincts for fairness, empathy and attachment.
Scientists have more respect for elevated spiritual states. Andrew Newberg of the University of Pennsylvania has shown that transcendent experiences can actually be identified and measured in the brain (people experience a decrease in activity in the parietal lobe, which orients us in space). The mind seems to have the ability to transcend itself and merge with a larger presence that feels more real.
This new wave of research will not seep into the public realm in the form of militant atheism. Instead it will lead to what you might call neural Buddhism.
If you survey the literature (and I’d recommend books by Newberg, Daniel J. Siegel, Michael S. Gazzaniga, Jonathan Haidt, Antonio Damasio and Marc D. Hauser if you want to get up to speed), you can see that certain beliefs will spread into the wider discussion.
First, the self is not a fixed entity but a dynamic process of relationships. Second, underneath the patina of different religions, people around the world have common moral intuitions. Third, people are equipped to experience the sacred, to have moments of elevated experience when they transcend boundaries and overflow with love. Fourth, God can best be conceived as the nature one experiences at those moments, the unknowable total of all there is.
In their arguments with Christopher Hitchens and Richard Dawkins, the faithful have been defending the existence of God. That was the easy debate. The real challenge is going to come from people who feel the existence of the sacred, but who think that particular religions are just cultural artifacts built on top of universal human traits. It’s going to come from scientists whose beliefs overlap a bit with Buddhism.
In unexpected ways, science and mysticism are joining hands and reinforcing each other. That’s bound to lead to new movements that emphasize self-transcendence but put little stock in divine law or revelation. Orthodox believers are going to have to defend particular doctrines and particular biblical teachings. They’re going to have to defend the idea of a personal God, and explain why specific theologies are true guides for behavior day to day. I’m not qualified to take sides, believe me. I’m just trying to anticipate which way the debate is headed. We’re in the middle of a scientific revolution. It’s going to have big cultural effects.
The final vision of Sylvie and Ruth, after all, is not far from that of two homeless people, adrift in the world. I don’t agree with John Shannon that Sylvie is clinically insane — far from it. But I think she is pitiably alone, and that, hard as she may have tried, she couldn’t really save herself or Ruthie. All she could do was to keep the last remnants of her family together, if not housekeeping, at least maintaining the essential bond.

Years after Helen has been cast out of the family, she returns to the scene to cast her own children into what will be a “mourning that will not be comforted.”
If Rod Serling were alive and writing episodes for “The Twilight Zone,” odds are he would have leaped on the true story of Anne Adams, a Canadian scientist turned artist who died of a rare brain disease last year.
Trained in mathematics, chemistry and biology, Dr. Adams left her career as a teacher and bench scientist in 1986 to take care of a son who had been seriously injured in a car accident and was not expected to live. But the young man made a miraculous recovery. After seven weeks, he threw away his crutches and went back to school. According to her husband, Robert, Dr. Adams then decided to abandon science and take up art. She had dabbled with drawing when young, he said in a recent telephone interview, but now she had an intense all-or-nothing drive to paint.
Dr. Adams, who was also drawn to themes of repetition, painted one upright rectangular figure for each bar of “Bolero.” The figures are arranged in an orderly manner like the music, countered by a zigzag winding scheme, Dr. Miller said. The transformation of sound to visual form is clear and structured. Height corresponds to volume, shape to note quality and color to pitch. The colors remain unified until the surprise key change in bar 326 that is marked with a run of orange and pink figures that herald the conclusion.
Ravel and Dr. Adams were in the early stages of a rare disease called FTD, or frontotemporal dementia, when they were working, Ravel on “Bolero” and Dr. Adams on her painting of “Bolero,” Dr. Miller said. The disease apparently altered circuits in their brains, changing the connections between the front and back parts and resulting in a torrent of creativity.
The debate about the effectiveness and safety of psychiatric drugs rambles on while new (if not conclusive) psychological studies come out with the frequency of fad diets.
We invited some people who think a lot about such issues — David B. Baker, John Medina, Dan Ariely, Satoshi Kanazawa, Peter D. Kramer, and Laurie Schwartz — and asked them the following:
How much progress have psychology and psychiatry really made in the last century? Do we know enough about the human psyche to prescribe the medication that we do?
As a professor of computer sciences at Carnegie Mellon University, Randy F. Pausch expected students to pay attention to his lectures. He never expected that the rest of the world would listen, too.
But today, more than 10 million people have tuned into Dr. Pausch’s last lecture, a whimsical and poignant talk about Captain Kirk, zero gravity and achieving childhood dreams. The 70-minute talk, at www.cmu.edu/randyslecture, has been translated into seven languages, and this week Hyperion is publishing “The Last Lecture,” a book by Dr. Pausch and a collaborator, Jeff Zaslow, that tells the story behind the story of the lecture.
Downloads and transcripts available here ...
The Monty Hall Problem has struck again, and this time it’s not merely embarrassing mathematicians. If the calculations of a Yale economist are correct, there’s a sneaky logical fallacy in some of the most famous experiments in psychology.
The economist, M. Keith Chen, has challenged research into cognitive dissonance, including the 1956 experiment that first identified a remarkable ability of people to rationalize their choices. Dr. Chen says that choice rationalization could still turn out to be a real phenomenon, but he maintains that there’s a fatal flaw in the classic 1956 experiment and hundreds of similar ones. He says researchers have fallen for a version of what mathematicians call the Monty Hall Problem, in honor of the host of the old television show, “Let’s Make a Deal.”
Here’s how Monty’s deal works, in the math problem, anyway. (On the real show it was a bit messier.) He shows you three closed doors, with a car behind one and a goat behind each of the others. If you open the one with the car, you win it. You start by picking a door, but before it’s opened Monty will always open another door to reveal a goat. Then he’ll let you open either remaining door.
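The counterintuitive answer — that switching doors wins twice as often as staying — is easy to verify by brute force. Here is a short simulation of the standard formulation described above (my own illustrative sketch; the function name `play` is not from the column):

```python
import random

def play(switch, trials=100_000):
    """Simulate the Monty Hall game; return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        doors = [0, 1, 2]
        car = random.choice(doors)
        pick = random.choice(doors)
        # Monty always opens a door that hides a goat and isn't your pick
        opened = random.choice([d for d in doors if d != pick and d != car])
        if switch:
            # take the one remaining closed door
            pick = next(d for d in doors if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", play(switch=False))   # about one third
print("switch:", play(switch=True))    # about two thirds
```

The simulation converges on roughly 1/3 for staying and 2/3 for switching, because Monty's reveal is not random: it funnels the other 2/3 of the probability onto the remaining door.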
For half a century, experimenters have been using what’s called the free-choice paradigm to test our tendency to rationalize decisions. This tendency has been reported hundreds of times and detected even in animals. Last year I wrote a column about an experiment at Yale involving monkeys and M&Ms.
The Yale psychologists first measured monkeys’ preferences by observing how quickly each monkey sought out different colors of M&Ms. After identifying three colors preferred about equally by a monkey — say, red, blue and green — the researchers gave the monkey a choice between two of them.
If the monkey chose, say, red over blue, it was next given a choice between blue and green. Nearly two-thirds of the time it rejected blue in favor of green, which seemed to jibe with the theory of choice rationalization: Once we reject something, we tell ourselves we never liked it anyway (and thereby spare ourselves the painfully dissonant thought that we made the wrong choice).
But Dr. Chen says that the monkey’s distaste for blue can be completely explained with statistics alone. He says the psychologists wrongly assumed that the monkey began by valuing all three colors equally.
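Chen's objection amounts to a conditioning effect that needs no psychology at all. The sketch below (my own illustration, not code from his paper) gives each color an independent hidden value that the experimenter cannot see. Among trials where the monkey happens to choose red over blue, green beats blue roughly two-thirds of the time purely by statistics: having lost once, blue is more likely to be the monkey's least-favorite color.

```python
import random

def chen_artifact(trials=200_000):
    """P(green preferred over blue | red was already chosen over blue),
    when the three hidden values are independent and uniform."""
    favors_green, conditioned = 0, 0
    for _ in range(trials):
        red, blue, green = random.random(), random.random(), random.random()
        if red > blue:                      # the observed first choice
            conditioned += 1
            favors_green += (green > blue)  # no rationalization involved
    return favors_green / conditioned

print(chen_artifact())  # about 2/3
```

With three equally likely orderings of the hidden values, conditioning on red > blue leaves a 2/3 chance that blue is the worst of the three, which matches the "nearly two-thirds" rejection rate reported in the monkey experiment without invoking choice rationalization.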
You May Be a Bit Histrionic...

And you'll do anything it takes to get noticed. You love to be seductive, even when it's inappropriate. If you're ignored, you're easily hurt ... and act out even more!