Asian University Presents Psychological Perspectives

"Asian University Presents Psychological Perspectives" is a weekly column appearing in the English language newspaper The Pattaya Mail, Pattaya, Thailand.

Friday, July 29, 2005

Threats to freedom from within

Freedom is a precious commodity. Those of us fortunate enough to have lived our lives in relatively free, democratic societies might take freedom for granted. Perhaps we feel secure that threats to freedom will remain in check, protected by our values and democratic institutions. It may be difficult to imagine that our most valued rights and freedoms could suddenly be taken from us by an authoritarian regime.

Perhaps we think of fascism as being restricted to remote lands, or as relegated to the pages of history. Yet, a body of research in the social sciences suggests that fascism is never far from us. There is persuasive evidence that the potential exists for totalitarian rule to arise even in the most democratic societies on earth. This potential lies within an attitude, a personality trait that is identifiable and measurable. Like the aliens in H.G. Wells’s “The War of the Worlds,” this potential exists among us, dormant, ready to spring forth when the conditions are ripe.

In his 1996 book, The Authoritarian Specter, psychologist Robert Altemeyer summarized the then-current state of the research on “right-wing authoritarianism” (RWA), a label used by Altemeyer and other theorists to describe the conjunction of three clusters of attitudes in a given person. According to Altemeyer, RWA consists of:

“1. Authoritarian submission – a high degree of submission to the authorities who are perceived to be established and legitimate in the society in which one lives.

“2. Authoritarian aggression – a general aggressiveness, directed against various persons, that is perceived to be sanctioned by established authorities.

“3. Conventionalism – a high degree of adherence to the social conventions that are perceived to be endorsed by society and its established authorities.”
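
Altemeyer measures these attitudes with a questionnaire, the RWA scale, on which respondents rate their agreement or disagreement with a series of statements, some of them reverse-keyed so that disagreement indicates the trait. To give a rough sense of how such an instrument is scored, here is a minimal sketch; the four-item scale, the -4 to +4 response range, and the code are illustrative assumptions, not Altemeyer’s published scale.

    # Hypothetical sketch of Likert-style scale scoring, in the spirit of
    # instruments like Altemeyer's RWA scale; the four-item scale and the
    # -4..+4 response range are illustrative assumptions only.

    def score_item(response: int, reverse_keyed: bool = False) -> int:
        """Map a -4..+4 agreement rating onto a 1..9 score.

        Reverse-keyed items, where disagreement signals the trait,
        are flipped so a higher score always means more of the trait.
        """
        score = response + 5                      # -4..+4 becomes 1..9
        return 10 - score if reverse_keyed else score

    # (response, reverse_keyed) pairs for a hypothetical four-item scale
    responses = [(3, False), (-2, True), (4, False), (-4, True)]
    total = sum(score_item(r, rev) for r, rev in responses)
    print(f"Scale score: {total} (possible range 4-36)")

Researchers can then ask how people scoring near the top of the range, the high-RWA individuals discussed below, differ in their behavior from low scorers.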

Respect for authority, of course, is a quality we generally consider innocuous, even beneficial. It is something that, as parents, we like to instill in our children and, as teachers, like to receive from our students. It contributes to a pleasing community atmosphere. It greases the skids of social interaction.

Similarly, obedience to authority can be viewed as an essential requirement of social life. It is what keeps our complex societies from collapsing into anarchy. On a local level, respect for and obedience to authority are seemingly inextricable parts of the cultural fabric of Thai society, and are promoted through a variety of local practices and rituals.

Unfortunately, authorities cannot always be trusted to make decisions that are just and humane. This fact seems to have escaped the notice of the high-RWA individual, who generally accepts the statements and actions of established authorities unquestioningly. Yet, between 1933 and 1945, millions of people were systematically exterminated in response to the commands of legal authorities. It took thousands of participants from every walk of life to carry out the horrors of the Holocaust.

There are, unfortunately, many other examples that could be cited, of authority gone seriously awry. Can obedience to authority be defended under such circumstances?

The notion that these inhuman acts were carried out by monsters was dispelled by the pioneering work of the psychologist Stanley Milgram, described in his 1974 book, Obedience to Authority. In laboratory studies, he showed how easy it is for a sanctioned authority to obtain the cooperation of normal subjects in delivering what they believed were excruciatingly painful, even life-threatening, electrical shocks to others.

Not all subjects in Milgram’s experiments obeyed orders with equal ease. The results showed, moreover, that certain elements of the situation could be arranged to either increase or decrease the chances of eliciting subjects’ obedience. The fact that normal subjects varied in their willingness to obey orders to inflict pain upon others lends support to the idea of a personality trait that varies within the normal population.

One need not look farther than today’s headlines to see evidence of these dangerous patterns emerging within ostensibly free societies. The U.S. House of Representatives voted overwhelmingly to renew provisions of the so-called “Patriot Act” that seriously curtail the rights and freedoms of American citizens in the name of combating terrorism. British police have “shoot to kill” orders for terror suspects. According to Human Rights Watch, police in Nigeria, where military dictatorship ended six years ago, routinely engage in torture and killings of criminal suspects. Police tactics reportedly include the use of electrical shocks and the rape of women to extract confessions.

Here in the Kingdom, government officials are accused of “stealing” water intended for local and agricultural use to benefit private industry. Government authorities recently instituted a controversial executive decree which grants immunity to law enforcement officials who commit criminal offenses in the line of duty, and increases government control of the media.

It is vital to our free societies that we learn about the threats to our freedom represented by high RWAs. Familiarity with the work of psychologists like Stanley Milgram and Robert Altemeyer is highly recommended for anyone who holds freedom dear.

Saturday, July 23, 2005

Public policies on counter-terrorism need to be informed by research

In the wake of the recent attacks on London’s public transit system, and of similar events elsewhere, you may be wondering what, if anything, can be done to protect our societies from the increasing threat of suicide terrorism. The answer to this question may lie in a growing body of knowledge produced by the social sciences.

The increased use of suicide bombers in terrorist attacks in recent years is a cause for particular concern. The method, timing, and location of these attacks seem to be chosen based upon their potential to produce maximum casualties and disruption of community life. To make matters worse, terrorists have undoubtedly set their sights upon acquiring the most dangerous and deadly weapons known to man. Despite increased awareness of the threat since 9-11, and a palpably heightened level of security, authorities seem largely incapable of preventing someone, willing to sacrifice his or her life for a cause, from carrying out a deadly attack against hapless civilians.

Following the London attacks, politicians could again be heard making emotional speeches vowing to defeat the terrorists, and policy makers could be seen scrambling to enact new laws intended to increase security. In the midst of these events, however, one can’t help wondering to what degree those officials have fully grasped the nature of the problem. To what extent have they considered the most effective means of tackling it?

Politicians, for example, often portray suicide bombers as evil, inhuman, mentally deranged misfits. President Bush has asserted that supporters of terrorism “hate our freedoms.” Some have suggested that conditions of extreme poverty, illiteracy, and anarchy promote terrorism. Policies to defeat terrorism presumably proceed from such assumptions. Research on these issues, however, does not suggest that suicide attackers are any more mentally disturbed than the average person, nor that, as a group, they are particularly ignorant, poor, or uneducated. Moreover, surveys consistently show that those who support suicide terrorism and bin Laden nevertheless value democracy and the freedoms that go with it.

There is a danger that well-intentioned yet poorly informed steps taken to combat terrorism might unintentionally exacerbate the problem by increasing extremist sentiment, or by facilitating the recruitment of moderates by terrorist organizations. The war in Iraq, for example, was launched for the express purpose of reducing the risk that weapons of mass destruction developed by Saddam’s regime might fall into the hands of terrorist organizations. The invasion was obviously successful in toppling Saddam’s corrupt regime, and did lead to seemingly democratic elections. Nevertheless, the current situation, with a growing military insurgency and increasing casualties, is, to put it delicately, not looking good.

The open-ended occupation of Iraq by American troops is clearly an irritant to Islamic militants and Iraqi nationalists. The American CIA recently issued a report identifying post-invasion Iraq as a potent new training ground for extremists from around the world, providing them with practical experience in kidnappings, assassinations, car bombings, and other methods of urban warfare. The report warns that militants could later export these methods to other trouble spots around the world. It seems unlikely that those involved in planning the American-led invasion intended this result. Academics and other experts knowledgeable about Middle East politics, however, had predicted as much.

Another related concern is that poorly informed public policies intended to reinforce the fight against terrorism could unnecessarily erode the civil liberties of free nations. When government agencies are given greater power to collect surveillance information on citizens, hold suspects under conditions in which their basic rights are denied, and use threatening and coercive methods to extract information from witnesses and suspects without due process, we are justified in being concerned that our most cherished values have been compromised.

Given the importance of implementing effective counter-terrorism measures and avoiding the unnecessary curtailment of individual liberties, one would hope that public policies would be informed by empirical research into the problem. One gets the impression, however, that public officials often respond to terrorism in a “knee-jerk” manner, or with an eye to public opinion, which may itself be out of step with the current state of knowledge about terrorism.

Our understanding of terrorism has grown tremendously in recent years, thanks to numerous empirical investigations. An excellent example of this work is the analysis and recommendations provided by anthropologist Scott Atran in “Mishandling Suicide Terrorism,” available for download at http://www.twq.com/04summer/docs/04summer_atran.pdf.

It is time for policy makers to become better informed about our current state of knowledge concerning the nature and causes of terrorism. I believe psychologists and other social scientists have an important role to play in this endeavor.

Friday, July 15, 2005

Psychological theory applied in filmmaking and community awareness campaigns

Thabo, Thabiso and Moalosi are three energetic and engaging young men, who travel to communities throughout the countryside of Lesotho, a mountainous enclave of South Africa. They carry with them a mobile cinema unit. In these remote communities where a third of the population is HIV+, they screen an HIV awareness film featuring themselves as the main characters. They discuss openly with their audiences their own struggles with HIV and public acceptance.

Young women they meet along the way find the men irresistible. They are, after all, movie stars!

The three friends are featured in the documentary, “Ask Me I’m Positive.” The premise of the film is that a person who is infected by the AIDS virus need not live a life of secrecy. There is no shame in being HIV+.

Last year people from around the world gathered in Bangkok to view “Ask Me I’m Positive,” along with over 50 other films offered during the 2004 AIDS Film Festival, organized by the XV International AIDS Conference. Many of these films featured real or fictional characters depicted as dealing courageously and effectively with problems brought about by HIV and AIDS.

The power of such films to educate and change people’s attitudes and behavior is based upon a psychological theory advanced during the 1960s by a remarkable psychologist named Albert Bandura. His revolutionary idea, first known as “social learning theory” and later elaborated into “social cognitive theory,” has found applications not only in the more traditional clinical settings, but also in campaigns to increase literacy, reduce the stigma of HIV/AIDS, reduce unwanted pregnancies, promote environmental responsibility, and empower women in male-dominated societies.

At first glance, Bandura’s insight may seem obvious: people can learn through observing the experiences of others. However, this notion stood in stark contrast to the ideas that were in vogue in psychology when it first appeared, namely those of strict behaviorism.

The behaviorists, led by the prolific experimentalist B.F. Skinner, taught that all learning is based upon the individual’s direct experience of the consequences of his or her own behavior. According to this view, when one’s behavior is followed by consequences he considers favorable, he tends to repeat it. Conversely, when rewards are not forthcoming, or if desired conditions are removed as a consequence of an act, that act tends to occur progressively less frequently.
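
This “law of effect” lends itself to a simple computational illustration. The toy simulation below is my own sketch, with arbitrary made-up numbers rather than anything from the behaviorist literature; it shows a tendency to repeat an act strengthening when the act is rewarded and weakening when it is not.

    import random

    # Toy sketch of behaviorist learning: reward strengthens the tendency
    # to repeat an act; non-reward weakens it. The learning rate and the
    # 80% reward schedule are arbitrary, illustrative values.

    p_repeat = 0.5          # initial tendency to perform the act
    learning_rate = 0.1

    random.seed(0)
    for trial in range(50):
        if random.random() < p_repeat:           # the act occurs
            rewarded = random.random() < 0.8     # most acts are rewarded
            target = 1.0 if rewarded else 0.0
            # Nudge the tendency toward repetition after reward,
            # and away from it after non-reward.
            p_repeat += learning_rate * (target - p_repeat)

    print(f"Tendency to repeat the act after 50 trials: {p_repeat:.2f}")

On this purely individual account, nothing is learned unless the learner himself acts and experiences the consequences, which is precisely the assumption Bandura challenged.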

The behaviorists could not bring themselves to acknowledge the importance of learning by observing the experiences of others. That is because learning by observation suggests that mental processes are somehow involved in the learning process. The behaviorists were committed to the doctrine that only observable events like behavior could form the basis of a scientific psychology. The admission of unobservable mental or “cognitive” elements into psychology seemed to many a step backward from the establishment of a truly objective behavioral science.

Bandura’s earliest experiments on observational learning demonstrated that children allowed to view videotapes of others behaving aggressively and without restraint subsequently showed more aggressive behavior. By contrast, children exposed to videotapes lacking displays of aggression exhibited no such increase in aggression.

It stands to reason that if children can be influenced to behave badly by observing others, they can also be influenced favorably by observing attractive role models behaving well. And not just children, but adults, too, are subject to the principles of observational learning. The research has supported this conclusion, and it forms the basis for a new genre of filmmaking exemplified by “Ask Me I’m Positive.”

The characters presented in these films are ordinary people, folks with whom the audience can easily identify. Story lines are typically compelling. The challenges faced are those encountered by ordinary people. Positive role models exhibit behavior that has favorable consequences. Negative role models suffer unpleasant consequences for their mistakes. Lessons are learned not through wordy speeches and sermons, but through credible actions and their consequences: the experiences of the characters.

Studies have shown that attitudes in communities where films like “Ask Me I’m Positive” are shown do change in the desired direction. Misinformation concerning HIV/AIDS in parts of Tanzania, for example, began to evaporate following the airing of a 1993 radio drama called “Twende na Wakati” (“Let’s Go with the Times”), aimed at increasing AIDS awareness. Researchers believe the popular radio drama was instrumental in producing the changes.

It is gratifying to see research in psychology put to such practical use. Successes like these can point the way to new and innovative applications of psychological ideas like Bandura’s social cognitive theory, aimed at improving awareness of important issues of public concern.

Tuesday, July 12, 2005

Why do we fail to act in an emergency?

Scientists are in agreement. Our planet is getting hotter.

Global warming, of course, is not exactly breaking news. Stories about climate change have been around for many years. Nevertheless, we continue our daily activities with little thought of a looming disaster. What psychological explanation could account for our complacency in the face of predicted irreversible global calamity?

Although the Earth’s temperature has fluctuated naturally over the past 4.5 billion years, it has remained relatively stable since the end of the last ice age. Changes currently underway, however, provide a reason for particular concern. That is because they have occurred relatively suddenly, are accelerating rapidly, and appear to be driven largely by human activities since the industrial revolution.

According to scientists, the major factor responsible for producing today’s climate change is the burning of fossil fuels: gas to power our cars, and coal and oil to generate electricity. The byproducts of these activities, the so-called “greenhouse gases,” accumulate in the atmosphere, trapping heat that would otherwise radiate back into space.

Scientists also agree that left unchecked, the consequences of global warming will have dramatic and far-reaching effects upon life on the planet, some of which are already being experienced. Experts indicate that warming has begun destroying ancient mountain glaciers, the source of water for millions of people. As the process continues, northern forests will shift further north. Melting polar icecaps will produce rising sea levels, flooding the world’s coastal areas, including our beloved Pattaya and Thailand’s eastern seaboard. The list of warming effects reads like a horror story.

As a solution, scientists have suggested reducing human consumption of fossil fuels and protecting threatened forests that store carbon in their biomass. They warn, however, that the window of opportunity to take effective action will close within roughly the next 17 years. Beyond that time, global warming effects may become irreversible.

The issue of climate change was a major item on the agenda of the G8 Summit last week in Scotland. All of the other G8 nations, unlike the United States, have demonstrated a willingness to cooperate by ratifying the Kyoto Protocol agreement to reduce greenhouse gas emissions. Unfortunately, without the participation of the U.S., the world’s leading consumer of fossil fuels, efforts to halt global warming seem doomed to failure. Even with an agreement by the leaders of all nations, the willingness of the world’s people to take the tough steps necessary to avoid disaster seems highly questionable.

Humans have never before faced a problem of such global magnitude, one whose solution demands a concerted effort by so many around the world. Will we act in time to save our planet? What psychological process could explain our current inaction?

It could be an issue of “denial,” the human capacity to ignore or reject an unpleasant or unacceptable reality. Another possibility is a profound feeling of “helplessness,” the conclusion that a problem lies beyond our capability to solve or meaningfully affect. Maybe it’s a case of “habituation”: we’ve gotten so used to hearing the warnings that they have become part of the background noise, no longer commanding our urgent attention.

Another explanation might rest on a theory from social psychology known as “diffusion of responsibility.” This notion is based upon the observation that often, when a large group of people witnesses an emergency in progress, no one provides assistance, perhaps because each assumes that others will. The most infamous anecdotal example is the rape and murder of Kitty Genovese, a vicious attack reportedly carried out on a city street within sight and earshot of numerous onlookers, none of whom took immediate action to intervene or to summon help.

Laboratory investigations into diffusion of responsibility suggest that as the number of bystanders to an emergency increases, the likelihood that someone will intervene decreases. The idea is that with many observers present, the responsibility for taking action is shared, each person experiencing a diminishing portion of the total responsibility.
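
A back-of-the-envelope probability model makes the logic vivid. Suppose, purely for illustration (this is my own sketch, not an analysis from the bystander experiments themselves), that each of n onlookers feels only 1/n of the responsibility and so intervenes with probability p/n. The chance that anyone at all helps then falls as the crowd grows:

    # Toy model of diffusion of responsibility: each of n bystanders
    # intervenes with probability p/n, as if responsibility were split
    # n ways. The baseline p = 0.7 is an arbitrary illustrative value.

    def prob_anyone_helps(p: float, n: int) -> float:
        """Probability that at least one of n independent bystanders acts."""
        individual = p / n        # each feels 1/n of the responsibility
        return 1 - (1 - individual) ** n

    for n in (1, 2, 5, 20):
        print(f"{n:2d} bystanders -> P(someone helps) = {prob_anyone_helps(0.7, n):.2f}")

Under these made-up numbers, a lone witness helps 70% of the time, while a crowd of twenty produces help only about half the time, even though twenty people are watching.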

There is, nevertheless, evidence that people can and will take action, even at extreme personal sacrifice, during certain types of emergencies. Evidence of this appeared during the terrorist attacks upon New York, Madrid, and London, and in the aftermath of the Asian tsunami.

By contrast, global warming is an invisible problem, and thus, easy to ignore. The effects, so far, are subtle, and hard to detect without sophisticated scientific techniques of observation and analysis. Its victims are not bruised, bloodied, and crying for help on CNN.

It would be unfortunate if we must wait until global warming reaches a stage of vividness comparable to a devastating tsunami before taking action. Such delayed action, according to the experts, would likely be too little, and too late.

Friday, July 01, 2005

Researchers identify factors responsible for change in psychotherapy

Although still something of a novelty in Thailand and other developing nations of the world, the use of psychological therapies to resolve interpersonal issues and alleviate emotional suffering has exploded in recent years. In 1996, researchers reported that the number of professional therapists had increased by 275% since the mid-1980s.

Today people in the developed nations of the world seeking help with mental health problems need not look far. Mental health services are currently provided by numerous categories of professionals, including psychologists, psychiatrists, psychoanalysts, social workers, licensed professional counselors, marriage and family counselors, psychiatric nurses, alcoholism counselors, addiction specialists, members of the clergy, and any number of others who profess competence in the area.

A similar trend has been seen in the growth of therapy models and techniques used to treat mental health problems. It has been estimated that there are more than 200 therapy models to choose from, each claiming particular effectiveness and curative power embodied within its various rituals and procedures. Studying growth trends in psychotherapy techniques during the 1980s, researcher Sol Garfield predicted “…that sometime in the next century there will be one form of psychotherapy for every adult in the Western World.”

Before long, the proliferation of rival therapies generated an interest among researchers in identifying the most effective therapies, and in weeding out therapies demonstrating little or no utility. The ensuing 40+ years of therapy outcome research, taken as a whole, produced a very surprising conclusion: although therapy clearly works, no specific system of therapy has emerged as consistently more effective than the others. Luborsky, Singer, and Luborsky dubbed this the “Dodo Bird Verdict,” after the Dodo’s pronouncement in Alice in Wonderland: “Everyone has won and all must have prizes.”

This finding eventually led to efforts to identify elements common to various forms of psychotherapy that might account for their general effectiveness. As early as 1936, psychologist Saul Rosenzweig had suggested that the effectiveness of different therapy approaches might be attributable to such common elements, as opposed to their theoretical foundations.

In their recent book, The Heart and Soul of Change, psychologists Mark Hubble, Barry Duncan, and Scott Miller, building on the work of Dr. Michael Lambert of Brigham Young University, identified four factors they believe to be the “active ingredients” of effective therapy: 1) client/extratherapeutic factors, 2) relationship factors, 3) placebo, hope, and expectancy, and 4) model/technique factors.

According to these investigators, characteristics of the client and his or her life circumstances are the primary contributors to the therapeutic outcome. Each client brings to therapy a number of personal resources, such as a sense of responsibility, persistence in the face of adversity, supportive family members, religious faith, and an education and work history.

Chance factors that periodically affect clients’ life circumstances would also be included here, such as landing a new job, forming new supportive friendships, winning the lottery, or an easing of life stresses. According to Lambert, client and extratherapeutic factors are, by far, the most potent contributors to change in psychotherapy, accounting for 40% of the outcome variance.

Relationship factors are those that result from the quality of the relationship or “alliance” between the client and therapist. These are identified as “caring, empathy, warmth, acceptance, mutual affirmation, and encouragement of risk taking and mastery.” These factors are believed to account for 30% of the therapy outcome variance.

It is well known that the client’s hope and expectancy also contribute to favorable treatment outcomes. When a person enters treatment with a credible professional, he generally expects a favorable outcome, and this expectancy can in itself contribute to feelings of relief and a sense of well-being. Drug researchers, aware of this fact, routinely use inert substances, known as “placebos,” to control for such factors in clinical drug trials. Lambert places the contribution of these factors to treatment outcome at 15%.

Model/technique factors consist of the set of beliefs and procedures promoted by the particular treatment approach. Lambert suggested that these characteristics, like those of expectancy, account for only 15% of the outcome variance.
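
Taken at face value, Lambert’s four estimates partition the outcome variance completely. A trivial sketch makes the breakdown easy to see at a glance (the percentages are simply those quoted above):

    # Lambert's estimated shares of psychotherapy outcome variance,
    # as quoted in the text above.
    factors = {
        "client/extratherapeutic factors": 40,
        "relationship factors": 30,
        "placebo, hope, and expectancy": 15,
        "model/technique factors": 15,
    }

    assert sum(factors.values()) == 100   # the estimates exhaust the variance
    for name, share in sorted(factors.items(), key=lambda kv: -kv[1]):
        print(f"{name:>32}: {share}%")

Notably, on these estimates the factors specific to any given model or technique account for the smallest share, tied with hope and expectancy.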

The surprising conclusion offered by this body of research and its implications have yet to be fully absorbed by professionals in the mental health field. Some theoretical camps continue to squabble about the differences among various techniques. Controversies over theoretical models seem minor, however, compared to the relative importance of the more salient common factors in affecting treatment outcomes. During the coming years it will be interesting to see to what degree practitioners modify the way in which they provide mental health services, in response to the compelling findings produced by these clinical investigations.