In this draft of an article that appeared in an American Medical Association publication, "Prism," Luria suggested
that a pathology existed in the personality of a researcher who would cheat in science. Because science is marked by external
and internal control measures which continually demand verification, he noted that only a distorted sense of reality could
account for someone who would falsify or fabricate results.
Luria, Salvador E.
Original Repository: American Philosophical Society. Library. Salvador Luria Papers
Reproduced with permission of Daniel D. Luria.
Reproduced with permission of the American Philosophical Society.
Every profession has its Watergates. There are scandals in science as there are scandals in politics. But the scandals in
science -- the uncovering of falsification of scientific findings -- are different from those in politics. Like scandals
in the banking world, the scandals in science resemble scandals in the religious world: they have a quality of desecration
and violate public expectations. For different reasons, scientists and bank managers are expected to be honest. What would
happen to our lives if the factual knowledge provided by science were revealed not to be true knowledge? Or if bank accounts
were found to be subject to willful manipulation?
Public expectation is in fact well founded, with respect both to banking and to science. Scandals in either field are rare
and integrity is presumably the norm. In both fields integrity is guaranteed in the long run by the internal structure of
the operations. Sooner or later the accounts of a bank must tally -- at least we members of the general public are naively
confident that it is so. Likewise, scientific findings in order to be valid and become part of the structure of science must
be repeatable and must mesh with the body of information already available -- confirming it or disproving it, but always in
a rationally interpretable way.
How does the integrity mechanism work in scientific research? (I shall say no more about banking; I may already have been
too optimistic in my remarks). The operations of research are relatively simple: thinking about a problem, performing experiments
or observations, and reporting results, conclusions and theories. At each level of these operations controls are needed and
these controls are both internal and external.
As far as experimentation itself is concerned, the most important part of the training of scientists is to acquire the habit
and the insight to introduce into their protocols the necessary controls in order to exclude as many accidental factors as
possible. This is the internal control mechanism. Even more important are the external controls: other scientists must
be able to repeat the findings. If findings are important, they will generally be verified within a matter of months or days.
This accountability to verification is the guardian of scientific integrity. I shall return to this point further on.
Reporting the results of experiments is also in need of controls. For the serious scientist the selecting of data for publication
is always an upsetting operation. If he publishes the "best" experimental results, that is, those that fit a certain
hypothesis most closely, he will add a caution about the extent of variation from one experiment to another. Leaving out
any data that are inexplicably in conflict with the published ones or with the proposed interpretation is anathema.
Finally, there must be controls at the level of theory. Scientific thinking at its best is a form of creative imagination.
It leaps from a set of known facts and new findings, not only to an integrated explanation, but to a synthesis that envisages
a new or more consistent structure of a field and, therefore, predicts novel findings that will in turn be testable. The
most important control mechanism is in the relation of theory to facts: theory must account for all well established facts
and its predictions must be subject to critical tests that can potentially disprove it. An open-ended, undisprovable theory
is no scientific theory at all. In this sense, for example, scientific theory differs from such hypothetical structures as
Freud's psychoanalytic theory. No matter how much illumination psychoanalytic theory may throw on certain aspects of
human behavior, it cannot be considered a scientific theory in the same sense as the theory of relativity or the gene theory
or the theory of evolution: it is not clear that it could be disproved by any critical set of tests.
In science as in any other field trouble begins when the controls fail. At the level of theory, creative imagination may
give way to wishful thinking. In publication, faithful reporting may be replaced by selective reporting. In experimentation
itself, careful skepticism may yield to self-delusion and, at worst, to manipulation of data and outright cheating. I doubt
that any scientist ever set out to plan and build a structure of deceptions, in the way a dishonest cashier may plan a consistent
pattern of falsification in his accounts. Rather, what may be involved is a developmental process, not unlike that of some
degenerative diseases, in which an initial error of function generates a response that accentuates the damage to the tissue.
In other words, cheating may be like a cirrhosis of the integrity: an initial slip made in good faith gives rise to a reaction
that magnifies the emotional commitment to the mistaken belief, finally leading to the actual destruction of the truth.
Cheating in science is admittedly so rare, or at least so rarely discovered, that it is difficult to attempt any generalizations.
The important cases can be counted on one's fingers. Each one of them becomes a cause célèbre. The actual incidents
seem to have all involved a progress from self-delusion to willful distortion -- the developmental process outlined above
-- rather than a process of deliberate planned fabrication. Some classical examples may serve to illustrate this point, even
though in no case have all facts been published by unbiased reporters.
Paul Kammerer was an Austrian zoologist who, like many other embryologists in the first decades of this century, questioned
the dominant genetic theory, which denies the inheritance of characteristics environmentally acquired by individual organisms
(as distinct from characteristics brought forth by natural selection). In newts and toads and other animals, Kammerer began
to find, or at least to report, all sorts of instances of inheritance of characteristics produced by external manipulations.
Distrust apparently arose even before others tried unsuccessfully to reproduce his findings. What made people uncomfortable
from the start was probably not any entrenched orthodoxy, as has been argued by Kammerer's defenders. It was the fact
that Kammerer had set out, not to test a theory but to prove it. The end was tragic. Two respectable investigators gained
access to the only specimen available to support Kammerer's claims and found that this specimen, a toad, had been doctored
by injection of India ink where a supposedly inherited dark pigmentation was claimed to be present. A few weeks later Kammerer
shot himself, in a rather romantic fashion that jibed with a romantic streak apparent in his personal life.
Almost fifty years separate Kammerer's case from a somewhat similar one that made headlines two years ago. An American
medical scientist having reported the surprising and, if true, extremely important finding that cold-stored skin and other
organs could be transplanted between genetically and immunologically incompatible individuals, was found to have doctored
with black paint the skin of some of his experimental animals. Whereupon he retracted some of his claims to discovery,
pleading, among other extenuating circumstances, the pressure to produce results supposedly present in his institution. (His
claims, however, had originated several years before he joined that institution).
In the half century between these two well publicized events there must have been several that did not reach public notice.
I know of at least two cases in which highly respected scientists had to retract findings that had been reported from their
laboratories because they discovered that these findings had been manufactured by some collaborator.
I already mentioned the controls that operate within the practice of research and are certainly responsible for the apparent
rarity of cheating: most important, the awareness that valid findings must be verifiable and are going to be verified, the
sooner the more significant or unexpected they are. More interesting from the human and social viewpoints is the question,
What goes into making a scientific cheater? Which elements of personality or of society contribute to produce this phenomenon?
A search for the answers is bound to be elusive because of the small size of the sample. Yet this very fact -- that many
do research, yet few cheat -- immediately serves to discount any dominant role of certain social factors that are often cited:
pressure to produce results, competition with other scientists, ambition for advancement. If these were determinant rather
than contributory factors, the number of deviants would evidently be much greater.
The answer I wish to propose is that cheating in science is the manifestation of a peculiar type of pathologic personality,
whose closest analog may be the personality of the compulsive gambler. Given the existence and effectiveness of controls
-- the demand for verifiability of findings -- the illusion that one can get away with false or doctored data presumably requires a distorted
sense of reality. As in gambling, there must be a delusion of one's ability to beat the odds, even a belief that one's
will may force reality to turn out to be as one wants. Such delusions can stem from an initial good-faith hope to be on the
right track, just as a gambling compulsion may arise from enjoyment of the game (or a disastrous deficit in the books of a
bank manager may develop from an "innocent" unauthorized loan). But the subsequent stages of the process are compulsive
and abnormal: career and self-respect become sacrificed, or rather gambled on a single compulsive commitment to a scientific
claim. In Dostoyevsky's famous story The Gambler the author, himself a compulsive gambler, has portrayed in a compelling
way this behavior and the corresponding personality, including the blocked response to normal emotional impulses.
It is not difficult to visualize how this process can start. Scientists are quite as human as everyone else. We all are
subject to the excitement of discovery, to the inner drive of wanting to believe a questionable result, to the pressure of
competition, and to the desire for professional success. These inner pressures are part of the functioning of a scientist
in his laboratory. But scientists internalize, as a matter of training, the realization that all findings are subject to
verification and, more important, that only true reproducible findings are part of the body of science. To contribute deliberately
a false finding would be like a musician playing deliberately a false note in the midst of a Beethoven sonata, or a printer
inserting a dirty word into the Lord's Prayer. I use these examples advisedly: in each of them an element of desecration
is coupled with the certainty of being found out. Cheating, therefore, means a failure to understand, or rather to internalize
the austere rules of the game. Whatever the emotional drives that impel the cheating scientist may be, there must be a separation
between emotion and understanding.
There is one emotional element that scientists must continuously guard against: this is, surprisingly, enthusiasm. Enthusiasm
for science, for its intellectual constructions, for the power it gives mankind over the forces of nature is, of course, a
valuable and even essential component of the scientific personality. But enthusiasm should bring the scientist to the door
of the laboratory and then be left outside, together with umbrella and overshoes. Within the laboratory the password must
be skepticism. The first reaction of most scientists, including myself, to a student announcing a novel finding is invariably
"What did you do wrong?" Enthusiasm -- yielding emotionally to the excitement of discovery -- means the risk of uncritically
"believing" a finding or a theory, of investing emotionally in it and, at worst, trying to protect it from disproof.
As Karl Popper has demonstrated, disproof is the sole valid method of scientific research. Belief must never go beyond the
conviction that a finding or a theory is worth subjecting to further tests. Repetition is but an attempt to disprove an experimental
finding. Devising critical tests is an attempt to disprove a theory. Science consists of facts and theories that have, up
to now, withstood the Popperian challenge. The trap is in the emotional commitment to an unchallenged statement of fact or
of theory. The mirage of being privy to a new truth may, when this truth melts away upon re-examination, lead to a conscious
distortion of the truth. Trapped by his own delusion and assertions as they progressively become lies, the unhappy scientist
comes ultimately to the manufacturing of untruth -- an action doomed to exposure and disgrace. Such pushing of one's
commitment to what one had believed to be a true fact or a valid theory to the point of cheating is certainly more frequent
than the outright dishonesty -- the deliberate manufacturing of a scientific "finding" from scratch. Both, however,
are likely to be manifestations of some derangement of personality that blurs the awareness of the futility of the cheating.
Are we to conclude, therefore, that the integrity of science is guaranteed, like that of bank managers, only by the controls
intrinsic in the operation of the system? Are scientists not more honest than bankers or dentists? Some writers, Jacob Bronowski
for example, have speculated that the habit of truth enforced by the internal structure of science may make scientists more
reliable in other respects as well. There is no evidence for or against this. We do not know whether scientists report their
income more faithfully, or lead a more faithful married life than average persons of similar background and social status.
Like everyone else, scientists are subject to the temptations of the society of which they are part. Thus a Lysenko can yield
to the lure of Soviet political power and propagandize a false genetic theory and many followers may join him in his opportunistic
path. Fortunately, in capitalist society great economic power is not so readily available to scientists to lure them into
faking discoveries: fake discoveries do not pay except in trouble. This does not mean that the ethos of our competitive,
gain-oriented society is without influence on the practice of science. But its impact is exerted more on the choice that
people make of research discipline and research projects than on the integrity with which they perform.
There is current in our society today an anti-rationalist viewpoint, which considers science a self-serving orthodoxy
and scientists as hypocrites. In a subtle way, this viewpoint does represent a threat to the integrity of science insofar
as it tends to depreciate the value of rigorous scientific thinking and to extol the value of an intuitive approach to the
world of reality. With Theodore Roszak and Charles Reich, this current of thought welcomes the subjective as superior to
the objective. With R. D. Laing, it praises insanity above sanity. The resulting confusion would probably by-pass the practice
of science were it not that it influences a substantial fraction of the young generation of beginning scientists. Will this
confusion provide bridges between sound observation and wild belief, between the recognition of the laws of nature and the
illusion of mental power over nature?
In one respect we already see the impact of anti-rationalism on scientific integrity. The addicts of this current of thought
have a romantic view of the scientific faker as hero. A remarkable example is in Arthur Koestler's book The Midwife Toad,
which purports to rehabilitate Kammerer, the embryologist with the doctored toad, as the victim of a conspiracy by geneticists.
A reading of Koestler's book reinforces indirectly the conclusion I tentatively drew above, that cheating in science is
the expression of a distorted personality. For, in defending Kammerer, Koestler himself indulges in such consistent and unabashed
intellectual cheating that any pretense of credibility is abandoned. (1) Koestler's book is as much an example of the
pathological commitment to a screwball thesis as the cheating operations of a few unfortunate scientists are evidence of the
desperate plight to which their distorted personalities have brought them. For both the data-falsifying scientist and his
self-appointed champion, the key element appears to be willful neglect or ignorance of the structure of science -- in fact,
a challenge to science as an intellectual discipline. The champion of the deviant scientist is by far the more dangerous
of the two since his defense distorts, not just one set of findings, but the entire scientific undertaking and thereby obscures
the public understanding of science.
1. For example, Koestler implies that Mendel's laws, the basic rules of genetics, are as invalid as Kammerer's results
because of the well-known suspicion that Mendel, in his 1866 paper, may have selected those results that best fitted his theory.
As if literally thousands of experiments on hundreds of organisms had not since then confirmed Mendel's laws!
There is one area, not belonging to science but often claimed to be part of it, in which the question of integrity in research
receives some further illumination. I refer to the so-called field of para-psychology, including the investigation of an
entire series of non-facts and non-phenomena: extra-sensory perception or ESP, telepathy, and the like. Here self-delusion
and cheating are not at the two ends of a tragic chain of events as in the falsification of scientific data. Rather I suspect
that self-delusion and cheating, in about equal proportion, constitute the entire content of this field. Here, of course,
cheating is aimed at exploitation of the gullible rather than at achievement of professional success. But the situation is
instructive precisely because it is outside the domain of science. A claim is made that a phenomenon has been observed whose
existence is not readily reconciled with the existing body of knowledge. Moreover, the phenomenon cannot be reproduced at
will under controlled circumstances. The proponents and their self-deluded supporters insist that the burden of disproof
is on the skeptics and that, disproof not forthcoming, they are entitled to funds and facilities for further "research".
In fields like these no proof is needed, only the will to believe: there is no attempt to disprove a hypothesis, only the
resolve to confirm one's fantasies. This is a different kind of deviance. The ESP proponents are not unfaithful to the
integrity of their profession in the way cheating scientists are: they are in a sense in the mainstream of a disreputable
profession. The cheating scientist may be betrayed by his will to believe a finding that might have been true. The para-psychologist,
if honest, is betrayed by a lack of understanding of the basic principles of science.
The rewards that society offers to the successful scientist are many and sweet, at least by moderate standards. But they
are in no way forceful enough to lead the normal person astray. Yet, it is reasonable to be aware of the dangers that arise
as scientists become more and more coupled with the world of high reward and low controls. The jet-set scientist may be more
likely to fall into temptation than the bench-bound laboratory worker. Yet, the greatest danger to the integrity of science,
at least at the level discussed in this paper, is likely to come from the weakening of the spirit of rationality by the
anti-scientific attitude manifest in many aspects of our culture, an attitude whose roots are in social malaise rather than
in ignorance. It would be disastrous, for both science and society, if contempt for factual truth and reliance on subjective
intuition should infiltrate the edifice of science and play havoc with the scientific enterprise.