2.1 What Makes Psychological Research Scientific?

When we say that psychologists are scientists, we do not mean they work with complicated gadgets and machines (although some do). The scientific enterprise has more to do with attitudes and procedures than with apparatus (Stanovich, 2010). Here are a few key characteristics of the ideal scientist.

Precision and Reliance on Empirical Evidence

Scientists sometimes launch an investigation simply because of a hunch they have about some behavior. Often, however, they start out with a general theory, an organized system of assumptions and principles that purports to explain certain phenomena and how they are related. Many people misunderstand what scientists mean by a theory. A scientific theory is not just someone's personal opinion, as in “It's only a theory” or “I have a theory about why he told that lie.” Many scientific theories are tentative, pending more research, but others, such as the theory of evolution, are accepted by nearly all scientists.

From a theory, a psychological scientist derives a hypothesis, a statement that attempts to describe or explain a given behavior. Initially, this statement may be quite general, as in, say, “Misery loves company.” But before any research can be done, the hypothesis must be made more precise. “Misery loves company” might be rephrased as “People who are anxious about a threatening situation tend to seek out others facing the same threat.”

A hypothesis, in turn, leads to predictions about what will happen in a particular situation. In a prediction, terms such as anxiety or threatening situation are given operational definitions, which specify how the phenomena in question are to be observed and measured. “Anxiety” might be defined operationally as a score on an anxiety questionnaire, and “threatening situation” as the threat of an electric shock. The prediction might be, “If you raise people's anxiety scores by telling them they are going to receive electric shocks, and then give them the choice of waiting alone or with others who are in the same situation, they will be more likely to choose to wait with others than they would be if they were not anxious.” The prediction can then be tested using systematic methods.

Any theory, idea, or hunch may initially generate excitement because it is plausible or imaginative, but it must eventually be backed by empirical evidence—information that is observable and verifiable, gathered using the techniques of science. A collection of anecdotes or an appeal to authority will not do, nor will the intuitive appeal of the idea or its popularity. As Nobel Prize–winning scientist Peter Medawar (1979) once wrote, “The intensity of the conviction that a hypothesis is true has no bearing on whether it is true or not.” In 2011, Richard Muller, a prominent physicist who had doubted that global warming was occurring, made headlines when he reported, after a 2-year investigation, that temperatures really are rising, and the following year, that human beings are a large part of the reason. Muller had been funded in large measure by two conservative oil billionaires who did not welcome his results. But Muller let the evidence trump politics, as a scientist should.

Figure 2.1 illustrates the process of moving from a theory to evidence and back again, which is the central process of all sciences.

Figure 2.1

The Cycle of Scientific Research

“Doing science” involves many interacting elements. Theories allow a researcher to derive testable hypotheses and to make predictions about the pattern of results that should occur. Hypotheses are tested empirically by gathering data on operationally defined variables. Examining the evidence leads to modifications, extensions, and revisions of the theory, which in turn generate new hypotheses and continue the cycle of research.

Skepticism

Scientists do not accept ideas on faith or authority; their motto is “Show me!” Some of the greatest scientific advances have been made by those who dared to doubt what everyone else assumed to be true: that the sun revolves around the earth, that illness can be cured by applying leeches to the skin, that madness is a sign of demonic possession. In the world of science, skepticism means treating conclusions, both new and old, with caution. Don’t take our word for it; the video Thinking Critically 1 provides more insight into the value of being skeptical.

Watch

Thinking Critically 1

Thus, in the case of facilitated communication, psychological scientists did not simply say, “Wow, what an interesting way to help autistic kids.” Rather than accept testimonials about the method's effectiveness, they have conducted experiments involving hundreds of autistic children and their facilitators (Romanczyk et al., 2003). Their techniques have been simple: They have the child identify a picture but show the facilitator a different picture or no picture at all; or they keep the facilitator from hearing the questions being put to the child. Under these conditions, the child types only what the facilitator sees or hears, not what the child sees or hears. This research shows that what happens in facilitated communication is exactly what happens when a medium guides a person's hand over a Ouija board to help the person receive “messages” from a “spirit”: The person doing the “facilitating” unconsciously nudges the other person's hand in the desired direction, remaining unaware of having influenced the responses produced (Wegner, Fuller, & Sparrow, 2003). In other words, facilitated communication is really facilitator communication (Schlosser et al., 2014). This finding is vitally important because if parents waste their time and money on a treatment that doesn't work, they may never get genuine help for their children, and they will suffer when their false hopes are finally shattered by reality.

“Skepticism” is not simply a matter of debunking a claim; it means showing why the claim is invalid so that better methods can replace it. Skepticism and caution, however, must be balanced by openness to new ideas and evidence. Otherwise, a scientist may wind up as shortsighted as the famous physicist Lord Kelvin, who reputedly declared with great confidence at the end of the 19th century that radio had no future, X-rays were a hoax, and “heavier-than-air flying machines” were impossible.

Willingness to Make “Risky Predictions”

A reliance on empirical evidence and a sense of skepticism are important characteristics of scientists. A related principle is that scientists must state an idea in such a way that it can be refuted, or disproved by counterevidence. This important rule, known as the principle of falsifiability, does not mean that the idea will be disproved, only that it could be if contrary evidence were to be discovered. In other words, a scientist must risk disconfirmation by predicting not only what will happen but also what will not happen. In the “misery loves company” study, the hypothesis would be supported if most anxious people sought each other out, but would be disconfirmed if most anxious people went off alone to sulk and worry, or if anxiety had no effect on their behavior (see Figure 2.2). A willingness to risk disconfirmation forces the scientist to take negative evidence seriously and to abandon mistaken hypotheses.

Figure 2.2

The Principle of Falsifiability

The scientific method requires researchers to expose their ideas to the possibility of counterevidence. Examining the outcomes of a simple study testing the idea that “misery loves company” would allow a researcher to either support or refute that hypothesis.

The principle of falsifiability is often violated in everyday life because all of us are vulnerable to the confirmation bias: the tendency to look for and accept evidence that supports our pet theories and assumptions and to ignore or reject evidence that contradicts our beliefs. If a police interrogator is convinced of a suspect's guilt, he or she may interpret anything the suspect says, even the person's protestations of innocence, as confirming evidence that the suspect is guilty (“Of course he says he's innocent; he's a liar”). But what if the suspect is innocent? The principle of falsifiability compels scientists—and the rest of us—to resist the confirmation bias and to consider counterevidence. Learn more about the benefits of challenging your assumptions by watching the video Thinking Critically 2.

Watch

Thinking Critically 2

Openness

Science depends on the free flow of ideas and full disclosure of the procedures used in a study. Secrecy is a big “no-no”; scientists must be willing to tell others where they got their ideas, how they tested them, and what the results were. They must do this clearly and in detail so that other scientists can repeat, or replicate, their studies and verify—or challenge—the findings. Replication is an essential part of the scientific process because sometimes what seems to be a fabulous phenomenon turns out to be only a fluke.

If you think about it, you will see that these principles of good science correspond to the guidelines of critical thinking. Formulating a prediction with operational definitions corresponds to “define your terms.” Reliance on empirical evidence helps scientists avoid the temptation to oversimplify. Openness to new ideas encourages scientists to “ask questions” and “consider other interpretations.” The principle of falsifiability forces scientists to “analyze assumptions and biases” in a fair-minded fashion. And until their results have been replicated and verified, scientists must “tolerate uncertainty.”

Do psychologists and other scientists always live up to these lofty standards? Not always. Being only human, they may put too much trust in their personal experiences, be biased by a conflict of interest when they are funded by private industry, or permit ambition to interfere with openness. Like everyone else, they may find it hard to admit that the evidence does not support their hypothesis; it is far easier to be skeptical about someone else's ideas than about your own (Tavris & Aronson, 2007).

Commitment to one's theories is not in itself a bad thing. Passion is the fuel of progress. It motivates researchers to think boldly and do the exhaustive testing that is often required to support an idea. But passion can also cloud perceptions, causing scientists to misinterpret their own data to confirm what they want to see. Other scientists, motivated by the desire for fame, discovery, or fortune, have even resorted to plagiarism, faked data, or deceptive methods. That is why science is a communal activity. Scientists are expected to submit their results to professional journals, which send the findings to experts in the field for evaluation before deciding whether to publish them. This process, called peer review, is an effort to ensure that the work lives up to accepted scientific standards. Peer review and scientific publication are supposed to precede announcements to the public through press releases, Internet postings, or popular books. The research community acts as a jury, scrutinizing and sifting the evidence, judging its integrity, approving some viewpoints, and relegating others to the scientific scrap heap.

The peer-review process is not perfect, but it does give science a built-in system of checks and balances. Individuals are not necessarily objective, honest, or rational, but science forces them to subject their findings to scrutiny and to justify their claims.

Journal: Thinking Critically – Examine the Evidence
A lighthearted article in the New England Journal of Medicine noted that there is a strong relationship between the per capita consumption of chocolate in a given nation and the number of Nobel laureates produced by that nation (Messerli, 2012). One conclusion readily presents itself: People who consume more chocolate are smarter, and therefore national populations that consume more chocolate produce more Nobel laureates (who, by definition, need to be pretty smart). But how else could this relationship be interpreted? Be precise, be skeptical, look at the evidence, and offer some alternative interpretations.