Reasoning Strategies in Molecular Biology:

Abstractions, Scans, and Anomalies

Lindley Darden and Michael Cook

University of Maryland at College Park and Rockefeller University

Abstract

Molecular biologists use different kinds of reasoning strategies for different tasks, such as hypothesis formation, experimental design, and anomaly resolution. More specifically, the reasoning strategies discussed in this paper may be characterized as (1) abstraction-instantiation, in which an abstract skeletal model is instantiated to produce an experimental system; (2) the systematic scan, in which alternative hypotheses are systematically generated; and (3) modular anomaly resolution, in which the components of a model are stated explicitly and methodically varied to generate alternative hypotheses for resolving an anomaly. This work grew out of close observation over a period of six months of an actively functioning molecular genetics laboratory.

1. Introduction

In the spring of 1994, Lindley Darden spent five months as a visitor in Joshua Lederberg's Laboratory for Molecular Genetics and Informatics (MGI) at Rockefeller University. Michael Cook is a Research Associate in the MGI Lab. This paper discusses reasoning strategies in use in that lab. In Part I, Lindley Darden recounts her experiences in the lab and discusses reasoning strategies that she observed in use there. In Part II, Michael Cook presents a technique of methodical hypothesis generation in the light of an anomaly for a model.

Part I by Lindley Darden

2. Overview of Reasoning Strategies

The reasoning strategies to be discussed in this paper are reasoning in hypothesis formation, reasoning in experimental design, and reasoning to generate hypotheses in the light of an anomaly. More specifically, the reasoning strategies may be characterized as (1) abstraction-instantiation (Darden, 1987; 1991; Darden and Cain, 1989), (2) the systematic scan (Lederberg, 1965), and (3) modular anomaly resolution (Darden, 1991; 1992; forthcoming). All three of these strategies play roles in the reasoning about molecular genetics experiments in the MGI Lab.

3. Experiment Planning

The research focus of one group at the MGI Lab during the spring of 1994 was the question of how conformational states of DNA relate to DNA's susceptibility to mutagenesis. More specifically, the research focused on the relation of transcription and mutagenesis. (This research program is related to the controversial topic of adaptive mutation, but a discussion of that topic is beyond the scope of this paper. It is a fascinating idea that the genes that are being actively used, that is being actively transcribed, may have the opportunity to change more, presumably in a non-directed way. On the topic of adaptive mutation, see Foster, 1993; Keller, 1992; Sarkar, 1991; Thaler, 1994.)

A more specific hypothesis investigated by the lab group is that DNA that is being actively transcribed (genes that are being expressed) will be in a state that renders that section of the DNA molecule more susceptible to alteration by mutagens (Davis, 1989). During DNA transcription, the DNA double helix is "open," that is, the Watson-Crick base pairs are separated and the DNA is single-stranded inside the bubble produced by the RNA polymerase enzyme that is copying the DNA into RNA. The hypothesis is that DNA in this state is more vulnerable to mutagens than when it is tightly packed.

One version of an abstract skeletal model of this process is as follows:

DNA=>"open DNA"=>(more) lesions=>production of mutants=>detect mutants
   |           |                                      |
 induce     introduce                              repair?
transcription mutagen                           replication

When transcription (gene expression) is induced in DNA, then the DNA double helix opens. The Watson-Crick base pairs separate, as the messenger RNA is produced. The hypothesis is that during such transcription the DNA is more susceptible to mutagens, so that if a mutagen is present, then more lesions (mutations) will be caused in the transcribing genes than in other portions of the DNA. In order to detect those mutations, bacteria are allowed to grow and the mutant colonies observed. So, between the possible effect of the mutagen and the detection of the mutant, DNA replication must occur. A possible confounding factor is the repair of mutations that is known to occur during DNA replication. If the mutations are repaired before being detected, then the hypothesized effect may be occurring but it may not be measured or the amount measured may be less than the amount that occurred.

The design of an experimental system that provides a good instantiation of this skeletal model has been a focus of research in the MGI Lab for several months. Thus a reasoning strategy for designing an experimental system may be characterized: develop an abstract skeletal model of the steps (known and hypothesized) in the process under study; then instantiate the variables in the model. For example, choose a particular gene in a particular organism whose expression can be induced; determine which mutations in that gene are to be the target of the mutagen and how those mutations can be detected; determine which mutagen is to be used; and so on for all the abstract components of the skeletal model. The realization of this instantiation in laboratory work results in an actual experimental system. This reasoning strategy provides a method for reasoning from a hypothesis to a skeletal model (an elaboration of the steps of a process in which the hypothesized step plays a role) to an experimental system.
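The strategy can be rendered as a small sketch. The following Python fragment is only an illustration: the slot names and the chosen values are our stand-ins, not the lab's own representation. The point is that the skeletal model carries abstract components ("slots"), and an experimental system results only when every slot has been given a value.

```python
# A sketch of the abstraction-instantiation strategy. The slot names and
# the chosen values are illustrative assumptions, not the lab's notation.

skeletal_model = {
    # The steps of the hypothesized process, in order.
    "steps": ["DNA", "open DNA", "(more) lesions",
              "production of mutants", "detect mutants"],
    # Abstract components that any instantiation must supply.
    "slots": {"organism": None, "gene": None, "inducer": None,
              "mutagen": None, "detection": None},
}

def instantiate(model, values):
    """Fill every abstract slot; refuse partial instantiations, since an
    experimental system requires a value for each component."""
    missing = set(model["slots"]) - set(values)
    if missing:
        raise ValueError(f"uninstantiated slots: {sorted(missing)}")
    return {**model, "slots": dict(values)}

# One possible instantiation, corresponding to the system discussed below.
experimental_system = instantiate(skeletal_model, {
    "organism": "E. coli",
    "gene": "lac Z",
    "inducer": "IPTG",
    "mutagen": "EMS",
    "detection": "growth on lactose minimal medium",
})
```

The refusal of partial instantiations mirrors the point in the text: an abstract model with uninstantiated variables is not yet an experimental system.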

The traditional philosophical account, the hypothetico-deductive (H-D) model, totally obscures this reasoning. Recall the steps in the traditional H-D account: conjecture a hypothesis, plug in initial conditions, derive a prediction, test the prediction. The step of designing an experimental system that will provide a good test of the hypothesis is simply omitted; no clues are given as to how such a design is made. Yet this is an important reasoning task of scientists. Hans Rheinberger (1992) argued for the importance of the concept of an experimental system in the history and philosophy of molecular biology, although he did not discuss the reasoning involved in constructing such experimental systems.

Thus, the reasoning to be discussed here is the elaboration of this abstract model and its instantiation in a laboratory experimental system. To give the game away a bit: the prediction that genes undergoing transcription will show an increase in the rate of mutagenesis has been tested in several experimental systems, one of which will be discussed in detail. The prediction has not yet been confirmed; the failed prediction constitutes an anomaly. Reasoning to resolve this anomaly is discussed in Part II by Michael Cook.

4. The Systematic Scan

Reasoning about this skeletal model often employs a reasoning strategy that was devised by Lederberg and is frequently used by the lab group--the systematic scan (Lederberg, 1965; see also Zwicky, 1967). Before showing a role that this strategy played in the design of an experimental system, it is useful to characterize the general strategy. Given a problem, find one approximate solution to it. Then examine the solution to find constants that can be converted into variables. If the variables are numeric, then the scan could potentially go from minus infinity to plus infinity. Construct a set of solutions and aggregate subsets at appropriate grains of resolution. Evaluate the subsets and choose one; then evaluate the solutions within that subset to choose which particular one to pursue.
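As a rough illustration, the steps of the scan can be sketched in Python. The variables, their ranges, and the grouping criterion below are hypothetical stand-ins chosen for the example, not the lab's actual choices.

```python
from itertools import product

# A minimal sketch of the systematic scan. Variable names, ranges, and the
# grouping criterion are illustrative assumptions, not the lab's own scan.

# Step 1: start from one approximate solution, expressed as constants.
initial_solution = {"mutagen": "EMS", "inducer": "IPTG", "temperature_c": 37}

# Step 2: convert each constant into a variable with a range of values.
variable_ranges = {
    "mutagen": ["EMS", "DES", "UV"],
    "inducer": ["IPTG", "lactose"],
    "temperature_c": [30, 37, 42],
}

# Step 3: construct the full set of candidate solutions -- here the
# Cartesian product of the ranges, an exhaustive, non-redundant scan.
candidates = [dict(zip(variable_ranges, values))
              for values in product(*variable_ranges.values())]

# Step 4: aggregate candidates into subsets at a chosen grain of
# resolution (here, grouped by mutagen); evaluate subsets before members.
subsets = {}
for c in candidates:
    subsets.setdefault(c["mutagen"], []).append(c)

print(len(candidates))  # 3 * 2 * 3 = 18 candidate solutions
print(sorted(subsets))  # ['DES', 'EMS', 'UV']
```

The grouping step matters: as the text notes, the power of the method comes from organizing candidates along well-chosen axes rather than leaving them as a mere heap.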

A key issue in using the scan method is to identify the variables and to determine the range of the variables in order to construct sets and subsets of types of solutions. Most powerfully, the sets and subsets should not be a mere heap but should be constructed via a vector or a matrix with orthogonal axes. The choice of variables to construct the axes, and thus the sets, is a key to success of the systematic scan method. One must have good items, constructed in a systematic way, in order for the scan to be productive.

Philip Kitcher (1993) advocates a method of reasoning about types of hypotheses and the use of eliminative induction to narrow consideration to one of them. However, he does not attempt to develop a method for systematic generation of alternative hypotheses and talks vaguely of prior constraints providing the alternatives. Kitcher, as well as most of his colleagues in twentieth century philosophy of science, neglects the reasoning in hypothesis generation. This systematic scan method is a useful addition to Kitcher's view of scientific methodology.

For some problems, given a set of variables and the range of their values, one can do a complete scan, that is, a scan that is exhaustive and non-redundant. Lederberg wrote the structure-generation algorithm for DENDRAL, the first expert system, which could exhaustively generate all possible chemical structures for a given chemical formula. The search space for such a problem thus comprises all possible topological connections of the atoms (Lindsay, Buchanan, Feigenbaum, and Lederberg, 1980; 1993).

A complete scan will typically have too many possibilities to consider all of them explicitly. With regard to actual experimental design, the problems are too vague and the possible solutions too little specified to have any confidence that one is doing a complete scan. The looser application of the systematic scan method involves a question: what are all the possible solutions to a problem that come to mind or that can be extracted from a search of the scientific literature? Or, another way of posing a question: what is the set of which the single instance that I have found is a member? Would some other member of the set be better than the current instance under consideration?

5. Scanning and Instantiation to Produce an Experimental System

The task now is to see how these reasoning strategies of abstraction-instantiation and the systematic scan are related, and how their use results in the design of an experimental system. The most abstract level is the abstract hypothesis, which is elaborated to produce an abstract skeletal model of a process. The abstract model can be further elaborated with additional steps or various steps can be left as implicit. Then, variables that must be instantiated are identified and a scan of the range of possible values of the variables is made. Finally, one value for each variable is specified and a fully instantiated experimental system results. Recall the transcription-mutation skeletal model to be instantiated:

DNA=>"open DNA"=>(more) lesions=>production of mutants=>detect mutants
   |           |                                      |
 induce     introduce                              repair?
transcription mutagen                           replication

In the reasoning to develop an instantiation of the transcription-mutation skeletal model, a scan has been made of possible mutagens. A number have been used in actual experiments. Much of this scanning occurred before I (Lindley Darden) arrived at the Lederberg lab. Various criteria were used to generate this list and to choose the best candidates. The one that was the focus of study during my visit was EMS (ethylmethane sulfonate).

In addition to the choice of mutagen, other necessary components for constructing an experimental system that instantiates the skeletal model are a gene that is (1) able to have its expression induced and (2) has mutations that can be easily and rapidly detected. The lac operon of E. coli is a system of choice because it is so well understood and because it has been engineered to be a versatile laboratory tool. (In the late 1940s and early 50s, Lederberg pioneered in the study of lac mutants and in developing laboratory methods for their manipulation. See Brock, 1990.) No scanning of other genetic systems as alternatives to the lac operon system occurred during the period of my visit to the MGI Lab. Lederberg, in reflecting on their earlier decisions, suggested that they "scanned in their heads" before beginning the current line of research focused on the lac operon.

As Jacob and Monod (1961) had worked out in the 1960s, the lac operon in E. coli bacteria has three structural genes that produce enzymes. The one of interest here is the lac Z gene that produces the enzyme b-galactosidase, which functions to break down the sugar lactose. In the absence of lactose, b-galactosidase is not produced. But if lactose is added to the system, it induces the production of the enzymes that break it down. In order to explain this induction phenomenon, Jacob and Monod postulated the existence of another gene, the lac I gene, that produces a repressor. If no lactose is present, the repressor prevents transcription of the lac operon. If lactose is added, then lactose forms a complex with the repressor, freeing the genes to begin transcription. However, if the lac Z gene is mutated, then no functional b-galactosidase is produced, even if an inducer is present.

Since the 1960s, the lac operon has been extremely well-studied. Numerous experimental techniques have been developed to make the lac operon very useful in genetic experiments. Of particular interest to us is the gratuitous inducer, the chemical, isopropyl-thio-b-D-galactoside, called IPTG. IPTG induces transcription of the lac operon genes in the absence of lactose, the normal inducer; but, unlike lactose, IPTG is not itself affected by the enzymes that are produced. Another engineering feat is the development of strains of bacteria that do not have the lac operon on their chromosomes. In such strains, the lac genes can be altered in various ways and introduced into the bacteria on plasmids, which are extra-chromosomal bits of DNA. (Historians and sociologists of late have discussed the engineering of laboratory artifacts for experimental purposes and the lac operon certainly qualifies.)

A major task for geneticists is strain construction. Mutations are moved from one strain into another to create a desired combination of mutations. The reasoning in strain construction is an important topic in a study of reasoning in genetics, but time will not permit pursuing that topic here. (Michael Cook and David Thaler are designing an expert system to plan bacterial genetic experiments and do strain constructions.)

In the strain construction for the mutagenesis experiments, the MGI Lab group used a strain in the lab (labeled A70) which lacks the entire lac operon on its chromosome and also lacks the gene for making the amino acid proline; the strain is designated "delta lac pro." The strain is also resistant to streptomycin. A set of bacterial strains was engineered and supplied by Claire Cupples (formerly at UCLA, now in Montreal) (Cupples and Miller, 1989; Cupples et al., 1990). Cupples's strains are designated as derivatives of an E. coli strain (P90C) which is delta lac proB on the chromosome. The Cupples strains also contain a sex factor (a plasmid, a piece of DNA separate from the bacterial chromosome) specifically engineered with lac mutants, designated: F' lac I- Z- proB+. The F' is a plasmid that carries a gene for making proline and has a specific lac Z- mutation. Cupples created six strains with different lac Z- mutations at a site (residue 461) in the lac Z gene that is critical for the gene to produce a functional enzyme (b-galactosidase). Each strain has a specific known base substitution that must be restored to the normal DNA codon (GAG, which codes for glutamic acid) in order to get the wild type phenotype (i.e., functional enzyme).

Note how the molecular level gets used in engineering a plasmid to go into a strain for a genetic experiment. Schaffner (1993), when he isn't discussing his formal reduction model, correctly notes such "intertwining" of the genetic and the biochemical levels in biology and medicine. It is interesting to note that this is the first point in the experimental design that DNA sequences become relevant. Most of the work is done at the genetic level, or at a level in between the gene (considered as a functional unit that produces an enzyme) and the detailed DNA sequence; this intermediate level is the level of the three-dimensional, conformational state of DNA during transcription. Investigation of the lowest level, the sequence level, can be deferred during the planning and execution of the genetic experiments. DNA sequencing of the mutants would be done only after a positive result in the genetic experiments.

In addition to the bacterial chromosome, which does not have the lac operon genes, and the F plasmid with the specific lac Z- mutants, the engineered strain also carried another plasmid, called pZC21, which carries the lac I gene. Having lac I means that the lac repressor is produced, and, thus, the lac Z gene is repressed unless an inducer is present. This engineered bacterial strain for use in the mutagenic experiments is thus (A70) delta lac pro/F' lac Z- pro+/pZC21 lac I+ ampR.

The skeletal model can now be instantiated to produce one specific experimental system:

  lac Z- DNA=>"open DNA"=>(more) lesions=>production of mutants=>
            |           |                                      |
          induce     introduce                              repair?
      transcription   mutagen                             replication
       with IPTG        EMS

  => detect mutants by growth on E lac medium (minimal medium with lactose as the sole carbon source)

After scanning for possible values of the variables and the instantiation of each, a detailed experimental system is designed. The DNA is the lac Z- gene on the F plasmid. The lac Z- gene is inducible, because lac I+ is present and is producing the repressor. Transcription is induced by adding IPTG to the medium: the IPTG interacts with the repressor and allows transcription to begin. A mutagen, EMS, is then put into the medium. The prediction is that more mutations (lesions) in the lac Z- gene will be produced in a system in which transcription is induced than in a system where transcription has not been induced.

The mutation that is being detected after subjecting these bacteria to the mutagen EMS is the reversion of lac Z- to lac Z+. In other words, a dysfunctional gene is being mutated and what is detected are those mutants which restore functionality to the gene. What is actually detected is evidence for the activity of the enzyme that the gene produces. The presence of the enzyme is detected because the bacterial colony will grow on a minimal medium with lactose as its food (that is, carbon) source; the functional enzyme is necessary for lactose metabolism.

A complete instantiation of the skeletal model results in a detailed experimental system in which transcription is induced in the mutant lac Z- gene on the F' plasmid by IPTG, the mutagen is EMS, and growth of colonies on a minimal medium with lactose indicates that a lac Z+ reversion occurred. The control system has no IPTG and thus has very little transcription of the lac Z gene. (There is a small amount of residual activity in normal cells without inducer, but that may be neglected in these experiments.)

Raphael Stimphil (a technician in the MGI Lab) kindly supplied me with his detailed protocol from one of the mutagen experiments:

Two cultures of CC102pZC21 were grown overnight in NCE(.2%)Glucose 100ug/ml Amp +/- 0.5mM IPTG at 37°C. The cultures were diluted 1:10 in fresh medium of the same composition +/- 0.5 mM IPTG, and grown to an optical density at 600 nm of 0.2 at 37°C. They were aliquoted 1 ml each in microcentrifuge tubes on ice. Ethylmethane Sulfonate (EMS) was added to 35mM and incubated at 37°C for the indicated time periods. The mutagen was neutralized with 2-Mercaptoethanol to 35mM. Survival was assessed on NCE(.2%) glucose plates, and mutagenesis on NCE(.2%) lactose. Plates were incubated at 37°C, and counted after 48hrs.

Raphael continued: "there were other variations to this protocol, mainly in the temperature at which mutagenesis was carried out...with this protocol I [Raphael Stimphil] was not able to observe any difference between repressed and induced. There was another protocol through which I could observe a 1.6 fold higher yield of lac+ revertants when the operon was induced. However, I have not been able to widen the gap (get more of an effect) between repressed and induced. This was the starvation protocol: cells were starved for 4hrs +/- 0.5 mM IPTG [and] were mutagenized with DES (Diethyl Sulfate)."

6. Experimental Results--An Anomaly

Thus, the particular EMS experiment did not show an increase in lac Z+ revertants in the presence of IPTG. No difference in the number of lac Z+ revertants was found between the experimental system and the control system. In various other experiments done with different mutagens and slightly different protocols, no consistent increase in lac Z+ revertants was found, although in one a slight increase was observed. If the hypothesis is correct that mutagenesis increases during transcription and if this experimental system is a good test of that hypothesis, then more lac Z+ revertants would have been expected consistently in the presence of IPTG. So, the data do not match the prediction of the model that a consistent increase in revertants would occur in the cells that were subjected to the mutagen during transcription. This lack of agreement between the prediction and the data constitutes an anomaly.

7. Reasoning in Anomaly Resolution

I (Lindley Darden) participated in discussions in which the MGI Lab group did anomaly resolution. (It was great fun for me to watch scientists using some of the anomaly resolution strategies that I had proposed hypothetically in my book but had never seen scientists actually using, especially the strategy of considering each module of a model and generating alternatives to each one. Also, I saw how the process of anomaly resolution led to uncovering previously implicit assumptions, and I sometimes nudged the group to make their implicit assumptions explicit.)

As I have argued elsewhere, reasoning in anomaly resolution is, first of all, like a diagnostic reasoning process (Darden, 1990; Darden, 1991, Ch. 12; Darden, 1992). That is, it is analogous to debugging a computer program, to determining a mechanical failure in a car, to diagnosing a disease. One must find locations in the model which may be failing. Secondly, after localization, anomaly resolution is a redesign process, like engineers designing a new part to prevent failure in the future (Karp, 1990). The model must be redesigned to avoid the failure.

Such anomaly resolution reasoning involves three major factors: the modular representation of the failing system, the nature of the anomaly, and the reasoning strategies to localize and to fix the failure. (These reasoning strategies are interestingly similar to the reasoning strategies of decomposition and localization, discussed by Bechtel and Richardson (1993).)

This particular anomaly provided geneticists with less positive guidance in localization than others that I have examined (see, for example, the discussion of linkage anomalies in Darden, 1991, Ch. 9). What happened here is simply the lack of an effect, when one had been predicted. One could conclude that the original hypothesis--that genes undergoing transcription show an increased mutation rate--has been disproved. But such a conclusion is premature. Whether this particular experimental system is an adequate test is a prior question. Efforts were made to reason about the experimental details and tinker with them.

Collectively in lab meetings and individually by email, we outlined the skeletal model in more detail and considered what could be going wrong at each step. Lederberg and I tend to draw pictures of processes with sequential steps (these correspond nicely, I think, to what Ken Schaffner (1993) calls "causal, mechanistic models" and Bechtel and Richardson (1993) call the "causal components" of systems). (On the role of visualization in his hypothesis formation, see quotations from Lederberg in Judson, 1980, pp. 184-186.)

Lederberg said in one email message during this anomaly resolution process:

We probably want to expand this [the skeletal model] to at least 2-fold more detail, and try to attack any of the steps. This model expansion is too case specific to lend itself to exhaustive enumeration. But for each step we could try to construct "all the possible ways" that it could be wrong, or experimentally bolstered or tested.

Michael Cook elaborated the model by stating assumptions in sentences and performing systematic permutations on the sentences. He discusses these reasoning strategies in Part II.

Part II by Michael Cook

8. Hypothesis Generation Based on a Technique Suggested by a Model for Defense Mechanisms

Scientific discussions are often devoted to explaining experimental results. Over the past year we had a series of lab meetings in which we tried to explain why a certain expected effect was not observed. In this context, I (Michael Cook) outlined a method for systematically generating such explanations, based on a technique suggested by Patrick Suppes and Hermine Warren in a different context (Suppes and Warren 1975).

The first step is to abstract a verbal model of the knowledge we are working with in the lab. Although the underlying biochemical models are highly technical, it is necessary to verbalize the basic principles, and much of the discussion in our meetings is an exercise in just that. In one particular case, I suggested the following sentences as a summary of the relevant knowledge, based on diagrams put on the whiteboard by Joshua Lederberg:

BACKGROUND KNOWLEDGE:

(1) Transcription opens DNA.

(2) EMS mutates DNA.

(3) IPTG induces transcription.

HYPOTHESIS:

(4) EMS has better access to open DNA than closed DNA.

PREDICTION:

(5) IPTG will result in more mutation in the transcribed gene.

The first three statements are considered to be "well known," whereas the fourth statement is a working hypothesis. The fifth statement is a prediction.

Experiments were performed, and the predicted result (statement 5) was not observed. In analyzing this, we questioned the hypothesis, but we also ended up questioning the background knowledge. As the discussion progressed I suggested the above model as an abstracted summary of our assumptions.

Once a verbal model is made explicit, one can generate explanations as to why the expected effect was not observed by systematically transforming the sentences in the model. This technique was used in a paper by Suppes and Warren (1975), which represents defense mechanisms as transformations of propositions. (For example, the thought "I hate my mother" gets transformed to: "I love my mother" or to "My mother hates me" or to "My husband hates my mother." Suppes and Warren enumerate a set of such transformations and correlate them with known defense mechanisms.)

When we don't see the expected effect, we generate possible explanations by systematically varying our assumptions. This is done sentence by sentence as follows:

(a) Negate the sentence. (DENIAL)

(b) Substitute a new variable for an existing one. (DISPLACEMENT)

(c) Insert a new clause in the chain. (RATIONALIZATION)

Comment on (c): The new clauses are generally of the "Yes, but..." variety. They acknowledge the truth of the sentence, but add a new sentence whose function is to suggest why the original sentence did not result in the expected observation. This process is reminiscent of the "qualification problem" described in expert systems literature: a "rule", such as "all birds fly" is constantly being qualified to account for reality: penguins, ostriches, baby birds, wounded birds, dead birds, comic strip birds, Kentucky Fried Chicken, etc. It often happens that when a rule-based system is applied to a real problem, the exceptions start to drown out the original simplicity and modularity of the initial rule set (see Genesereth, 1987).

Here are some illustrative examples of inserted clauses. Case 1: Transcription opens DNA. Yes, but something is closing the DNA, e.g., a single-strand binding protein. Case 2: IPTG induces transcription. (a) Yes, but something (the EMS?) is preventing the IPTG from doing its job. (b) Yes, but something (the EMS?) is "stalling" the transcription. Case 3: EMS has better access to open DNA than closed DNA. Yes, but repair enzymes repair open DNA more efficiently than closed DNA.

Often the insertion of new clauses amounts to bringing in the possibility of side-effects, or exogenous variables that have not been considered but are in fact relevant. This of course is especially prevalent when dealing with in vivo systems.

Thus, four sentences and three transformations lead to twelve basic "explanations." It takes non-trivial semantic processing to perform the transformation, and to interpret it in terms of the result to be explained. The twelve transformed sentences, with some discussion, are listed below. Not all twelve are equally meaningful or suggestive, but they are all listed for completeness.
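The combinatorics of the scheme can be sketched in Python. The string templates below are crude syntactic placeholders of our own devising; as just noted, the real transformations require semantic processing that no template captures.

```python
# Sketch of the 4-sentences-by-3-transformations scheme. The templates are
# purely syntactic stand-ins; interpreting each result takes semantic work.

sentences = [
    "Transcription opens DNA",
    "EMS mutates DNA",
    "IPTG induces transcription",
    "EMS has better access to open DNA than closed DNA",
]

transformations = {
    "DENIAL": lambda s: f"It is not the case that: {s}",
    "DISPLACEMENT": lambda s: f"Something else, not the stated agent: {s}",
    "RATIONALIZATION": lambda s: f"{s} -- yes, but another variable cancels the effect",
}

# Each sentence crossed with each transformation: 4 x 3 = 12 explanations.
explanations = [(name, transform(s))
                for s in sentences
                for name, transform in transformations.items()]

print(len(explanations))  # 12
```

The mechanical part ends with the twelve strings; selecting which of them are biologically meaningful, as the list below shows, remains a human task.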

(1) Transforming "Transcription opens DNA":

(a) Transcription does not open DNA. Perhaps the "transcription bubble" image is misleading? Sri Sastry pointed out that the "openness" of the complex is inside the enzyme that is doing the transcribing (the RNA polymerase), which thus may protect the transcribed DNA from the action of the mutagen.

(b) Something else opens DNA. For instance, "breathing," namely the normal opening and closing of DNA base pairs. This would explain the negative result, since then both untranscribed and transcribed DNA would be open, hence similarly susceptible to mutagenic effects of EMS.

(c) Another variable is canceling out the effect of open DNA. This variant suggested the possibility of single-strand binding proteins binding to the exposed strands of the transcription bubble, protecting the DNA from the EMS.

(2) Transforming "EMS mutates DNA."

(a) EMS does not mutate DNA. The issue of the mutagenic effect of EMS must certainly be looked at, although it is not in dispute. But perhaps the conditions of the experiment are insufficient to get maximum effects?

(b) Something else mutates DNA. This idea doesn't seem relevant to the situation at hand, because mutagenesis was detected in these very experiments, and not detected in controls in which the EMS was left out.

(c) Another variable is canceling out this effect. For instance, EMS might mutate the DNA, but repair enzymes might repair the damage.

(3) Transforming "IPTG induces transcription."

(a) IPTG is not inducing transcription (or: not noticeably). This led to the speculation that the so-called glucose effect--the presence of glucose inhibits expression of the lac gene even in the presence of inducer--may be at work here.

(b) Something else is inducing transcription. This is hard to interpret.

(c) Another variable is canceling out this effect. For example, EMS/IPTG interactions.

(4) Transforming "EMS has better access to open DNA than closed DNA."

(a) EMS has better access to closed DNA. This doesn't seem likely, as there is no observed difference.

(b) Something else has better access to open DNA. (Again, repair enzymes)

Or: EMS has better access to open DNA in replication, and this effect swamps out the one we are looking for, because all the DNA would be affected equally, not just the lac Z- gene.

Or: EMS has better access to the nucleotide precursors and it is they which are mutated before incorporation into the DNA during replication. This would explain why there was no observed difference between transcribed and untranscribed DNA because all the nucleotides would be just as likely to be affected.

These last two possibilities are, strictly speaking, not derived from the original sentence by substitution. They were thought of by David Thaler, who observed that if "open" DNA is really an important variable, then DNA undergoing replication is a strong candidate for susceptibility to mutagenesis, because replicating DNA (as well as transcribing DNA) is open. In fact, replication effects may be much stronger than transcription effects, and may swamp them out. In addition, precursor nucleotides are in some sense the most "open" of all (since they are not yet bound to any DNA) and may be the most subject to mutagenesis. The question then becomes: can a mutated precursor be incorporated into a DNA molecule? These questions have opened up many interesting subsequent and ongoing discussions in our group.

(c) Another variable is canceling out this effect. Again, single stranded binding proteins; or IPTG/EMS interactions.

This list covers the results of several hours of joint brainstorming in the lab, and can be derived "mechanically" by applying transformations to the verbal model. I put "mechanically" in quotes, because there are many semantic nuances involved here, and much contextual knowledge. The point is, though, that given the model, there are rules for using it to generate hypotheses, and the need for "insight" is reduced, although not eliminated.

Two additional points need to be made about this hypothesis generation method. First, the approach depends on the formulation of the four statements in the model. Thus, if the statement had instead been "Transcription makes DNA more susceptible to mutation," we would not have been locked into the model of "open." Not surprisingly, generating the representation of the model is a significant part of the task. Second, the transformations are syntactic, but their rationalization must be done by a human. For example, "something else opens the DNA" does not by itself generate the concept of "breathing." But adding "something else" (namely a confounding variable) can act as a catalyst for thinking of an instantiation, namely breathing.
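The mechanical character of these transformations can be illustrated with a small program. The following Python sketch is our own illustration, not a program used in the lab: each statement of the verbal model is represented as a tuple (agent, verb in third-person form, verb in base form, object), and the three substitutions generate the (a), (b), and (c) alternatives for each statement.

```python
# Hypothetical sketch of the syntactic hypothesis-generation scheme.
# Each model statement yields three alternatives:
#   (a) negate the statement,
#   (b) substitute "something else" for the agent,
#   (c) posit a confounding variable canceling out the effect.

def generate_alternatives(agent, verb_3sg, verb_base, obj):
    """Apply the three syntactic transformations to one model statement."""
    original = f"{agent} {verb_3sg} {obj}"
    return [
        f"(a) {agent} does not {verb_base} {obj}.",
        f"(b) Something else {verb_3sg} {obj}.",
        f"(c) Another variable is canceling out the effect of '{original}'.",
    ]

# The four-statement model behind the experiment:
model = [
    ("Transcription", "opens", "open", "the DNA"),
    ("EMS", "mutates", "mutate", "DNA"),
    ("IPTG", "induces", "induce", "transcription"),
    ("EMS", "has better access to", "have better access to", "open DNA"),
]

for statement in model:
    for hypothesis in generate_alternatives(*statement):
        print(hypothesis)
```

As the text notes, a program of this kind supplies only the syntactic skeletons; recognizing that "something else opens the DNA" can be instantiated as "breathing" still requires human semantic and contextual knowledge.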

How can we be certain we have covered all the bases? How systematic was our scan? As usual, such questions are hard to answer. We can at least begin to be certain that we have covered all the bases vis-a-vis our model; the one door into the greater outside universe is the ability to insert new sentences. This allows us to generate other variables not in the model, and to think about them without fully instantiating them.

The question is, which variables are likely to be important? Choosing variables on which to focus can be a hard problem. A fruitful line of investigation might be to find heuristics used in variable choice. Once the choice to focus on a particular set of variables is made, the universe of discourse is defined, and work can begin: model-building, model-fitting, model-testing, etc.

It is interesting to note that the use of defense mechanisms in hypothesis generation is not that far-fetched. After all, "rationalization" is in the basic catalog of defense mechanisms, and it is also true that science does attempt to "rationalize experience." That the same word is used for both activities--"making up excuses" and "generating hypotheses"--is more than just a pun, and points out a general pattern in human psychology. Moreover, perhaps defense mechanisms are not necessarily neurotic, as the phrase tends to connote. For example, Lakatos lists as one of the elements of his "positive heuristics": Defend the ramparts of your theory (Lakatos, 1970).

More can be said along these lines. The process of more fully articulating the model on which the experiment is based can be compared to abreaction, a term in psychoanalytic theory referring to the process in therapy by which unconscious material is made conscious. The model sketched above was certainly not fully spelled out before the discussion of the experimental results. The model and the experimental results were clarified together, and the failure of the prediction led to a deeper elaboration of the assumptions behind the work.

Finally, if one views a defense mechanism as an attempt by the mind to avoid the tension associated with being conscious of two or more contradictory statements at the same time, then the connection with scientific research becomes even clearer. In this context, one may delineate a hierarchy, in which denial is the most primitive method, displacement is slightly more clever and roundabout, and rationalization is the most sophisticated (of these three). Why is rationalization the most sophisticated? Because it allows the contradictory statements to remain in consciousness, and deals with the tension by making up new, reconciling statements. On this spectrum, the defense mechanism of sublimation may tentatively be identified with "revolutionary" science (Kuhn, 1970). Sublimation is the creative resolution of psychic tensions on a higher level of being; the usual examples are the activities of science, art, and philosophy as sublimations of sexual energies. But so-called revolutionary science involves precisely the ability to accept seeming contradictions, and to find a new framework in which they appear as harmonious expressions of a hitherto unidentified reality.

These ideas are preliminary, but they seem well suited to being developed to the point where a computer can take part in the process of hypothesis generation during anomaly resolution.

9. Conclusion

In summary, we have examined reasoning strategies used in a molecular biology laboratory: strategies for hypothesis formation; strategies for designing an experimental system; and strategies for debugging a model in the face of an anomaly. These strategies are abstraction-instantiation, the systematic scan, and modular anomaly resolution. An original abstract hypothesis was further specified in an abstract skeletal model that underwent several elaborations. Reasoning about this abstract model involved a scan of possible values of variables. Several were actually instantiated in detailed plans for experimental systems, and experiments were run in the laboratory. For the one instantiation discussed here, the predicted effect was not found. Then, the modules of the abstract model were systematically formulated and manipulated to form alternative hypotheses to remove the failure. The experiments to choose among the alternative modifications to the model are still ongoing. The problem with working at the forefront of science, rather than with its history, is that you do not know how the story will come out.

Acknowledgments

Lindley Darden's visit at Rockefeller University was supported by a grant from the Andrew W. Mellon Foundation. She thanks Joshua Lederberg for making this opportunity available and for stimulating discussions during the visit. She also thanks others in the lab who were so friendly and helpful: Michael Cook, David Thaler, Sri Sastry, Greg Tombline, Raphael Stimphil, Ken Zahn, Mick Noordewier, Mary Jane Zimmermann and Joice Johnson. Michael Cook's work was supported by a grant from the Defense Advanced Research Projects Agency, ARPA Order No. 8145, No. MDA972-91-J-1008. They both thank Joshua Lederberg, David Thaler, Raphael Stimphil, Sri Sastry, Nancy Hall (of the Committee on the History and Philosophy of Science at the University of Maryland, College Park) and William Wolfe (of the Mathematics Department at the University of Colorado, Denver) for inspiring comments or specific comments on earlier drafts of this paper.

References

Bechtel, William and Robert C. Richardson (1993), Discovering Complexity: Decomposition and Localization as Strategies in Scientific Research. Princeton, N. J.: Princeton University Press.

Brock, Thomas D. (1990), The Emergence of Bacterial Genetics. Cold Spring Harbor, New York: Cold Spring Harbor Laboratory Press.

Cupples, Claire and Jeffrey H. Miller (1989), "A Set of lacZ Mutations in Escherichia coli That Allow Rapid Detection of Each of Six Base Substitutions," Proc. Natl. Acad. Sci. USA 86:5345-5349.

Cupples, Claire, Mila Cabrera, Chris Cruz, and Jeffrey H. Miller (1990), "A Set of lacZ Mutations in Escherichia coli That Allow Rapid Detection of Specific Frameshift Mutations," Genetics 125:275-280.

Darden, Lindley (1987), "Viewing the History of Science as Compiled Hindsight," AI Magazine 8(2):33-41.

Darden, Lindley (1990), "Diagnosing and Fixing Faults in Theories," in J. Shrager and P. Langley (eds.), Computational Models of Scientific Discovery and Theory Formation. San Mateo, California: Morgan Kaufmann, pp. 319-346.

Darden, Lindley (1991), Theory Change in Science: Strategies from Mendelian Genetics. New York: Oxford University Press.

Darden, Lindley (1992) "Strategies for Anomaly Resolution," in R. Giere (ed.), Cognitive Models of Science, Minnesota Studies in the Philosophy of Science, Vol. 15. Minneapolis: University of Minnesota Press, pp. 251-273.

Darden, Lindley (forthcoming) "Exemplars, Abstractions, and Anomalies: Representations and Theory Change in Mendelian and Molecular Genetics," in James G. Lennox and Gereon Wolters (eds.), Philosophy of Biology. Konstanz, Germany: University of Konstanz Press and Pittsburgh, PA: University of Pittsburgh Press, pp. 137-158.

Darden, Lindley and Joseph A. Cain (1989), "Selection Type Theories," Philosophy of Science 56:106-129.

Davis, Bernard D. (1989), "Transcriptional Bias: A Non-Lamarckian Mechanism for Substrate-Induced Mutations," Proc. Natl. Acad. Sci. USA 86:5005-5009.

Foster, Patricia L. (1993), "Adaptive Mutation: The Uses of Adversity," Annual Review of Microbiology 47:467-504.

Froehlich, Werner D.; Gudmund Smith; Juris G. Draguns; and Uwe Hentschel (eds.) (1984), Psychological Processes in Cognition and Personality. Washington, D.C.: Hemisphere Publishing Corp.

Genesereth, Michael R. and Nils J. Nilsson (1987), Logical Foundations of Artificial Intelligence. San Mateo, California: Morgan Kaufmann.

Jacob, Francois and Jacques Monod (1961), "Genetic Regulatory Mechanisms in the Synthesis of Proteins," Journal of Molecular Biology 3:318-356.

Judson, Horace Freeland (1980), The Search for Solutions. New York: Holt, Rinehart and Winston.

Karp, Peter (1990), "Hypothesis Formation as Design," in J. Shrager and P. Langley (eds.), Computational Models of Scientific Discovery and Theory Formation. San Mateo, California: Morgan Kaufmann, pp. 275-317.

Keller, Evelyn Fox (1992), "Between Language and Science: The Question of Directed Mutation in Molecular Genetics," Perspectives in Biology and Medicine 35:292-306.

Kitcher, Philip (1993), The Advancement of Science: Science without Legend, Objectivity without Illusions. New York: Oxford University Press.

Kuhn, Thomas (1970), The Structure of Scientific Revolutions. 2nd Edition. Chicago: The University of Chicago Press.

Lakatos, Imre (1970), "Falsification and the Methodology of Scientific Research Programmes," in I. Lakatos and Alan Musgrave (eds.), Criticism and the Growth of Knowledge. Cambridge, England: Cambridge University Press, pp. 91-195.

Lederberg, Joshua (1965), "Signs of Life: Criterion-System of Exobiology," Nature 207:9-13.

Lindsay, Robert K.; B. G. Buchanan; E. A. Feigenbaum; J. Lederberg (1980), Applications of Artificial Intelligence for Organic Chemistry: The DENDRAL Project. New York: McGraw-Hill.

Lindsay, Robert K.; B. G. Buchanan; E. A. Feigenbaum; J. Lederberg (1993), "DENDRAL: A Case Study of the First Expert System for Scientific Hypothesis Formation." Artificial Intelligence 61:209-261.

Rheinberger, Hans-Joerg (1992a), "Experiment, Difference, and Writing: I. Tracing Protein Synthesis," Studies in the History and Philosophy of Science 23:305-331.

Rheinberger, Hans-Joerg (1992b), "Experiment, Difference, and Writing: II. The Laboratory Production of Transfer RNA," Studies in the History and Philosophy of Science 23:389-422.

Sarkar, Sahotra (1991), "Lamarck Contre Darwin, Reduction Versus Statistics: Conceptual Issues in the Controversy Over Directed Mutagenesis in Bacteria," in Alfred I. Tauber (ed.), Organism and the Origins of Self. The Netherlands: Kluwer, pp. 235-271.

Schaffner, Kenneth (1993), Discovery and Explanation in Biology and Medicine. Chicago: University of Chicago Press.

Suppes, Patrick and Hermine Warren (1975), "On the Generation and Classification of Defense Mechanisms," International Journal of Psychoanalysis 56: Part IV, pp. 405-414.

Thaler, David S. (1994), "The Evolution of Genetic Intelligence," Science 264:224-225.

Zwicky, Fritz (1967), "The Morphological Approach to Discovery, Invention, Research and Construction," in Fritz Zwicky and A. G. Wilson (eds.), New Methods of Thought and Procedure. New York: Springer-Verlag.