On the different ways of doing theory in biology

‘Theoretical biology’ is a surprisingly heterogeneous field, partly because it encompasses ‘‘doing theory’’ across disciplines as diverse as molecular biology, systematics, ecology, and evolutionary biology. Moreover, it is done in a stunning variety of ways, from formal analytical models to computer simulations, from graphic representations to verbal arguments. A few years ago I co-organized a workshop on this topic at the Konrad Lorenz Institute for Theoretical Biology in Vienna, and then edited a special issue of the journal Biological Theory collecting the contributions.

In my paper I surveyed a number of aspects of what it means to do theoretical biology, and how they compare with the allegedly much more restricted sense of theory in the physical sciences. I also tackled a somewhat recent trend toward the presentation of all-encompassing theories in the biological sciences, from general theories of ecology to an attempt to provide a conceptual framework for the entire set of biological disciplines, and discussed the roles played by philosophers of science in criticizing and shaping biological theorizing. The full paper is available for download here (free), and the edited volume can be found here (articles behind a paywall). Let me, however, summarize my main points to facilitate a general discussion.

First, I discussed the issue of alleged laws in biology. If there is anything that characterizes physics as a science it is its unending quest for universal laws, from Newton’s mechanics to the current (and highly controversial) string theory. This is the case despite the fact that influential philosophers of science like van Fraassen and Giere maintain that laws play a marginal and mostly didactic role, even in physics. Regardless, it is not surprising that discussions of general laws in biology are a recurrent staple of the literature and—interestingly—one that provides a good example of positive interactions between theoretically inclined biologists and philosophers of science.

In a number of cases authors draw a direct parallel between physical laws and proposed biological equivalents. For instance, M. Elgin argues that the ‘‘epistemic functions of a priori biological laws in biology are the same as those of empirical laws in physics.’’ Elgin begins by acknowledging the (almost) universal agreement among philosophers who subscribe to the concept of laws that these must be both universal and empirical in nature, though he hastens to say that these conditions are necessary but not sufficient to distinguish laws from ‘‘accidental’’ generalizations. He then discusses Elliott Sober’s proposal that the Hardy–Weinberg principle in population genetics is an example of a biological law, even though it is universal but not empirical.
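
For reference, the principle itself is one line of algebra: at a locus with two alleles at frequencies p and q = 1 − p, a single round of random mating yields genotype frequencies

$$p^2 + 2pq + q^2 = 1,$$

and those frequencies persist in subsequent generations in the absence of selection, drift, mutation, migration, and non-random mating. The equation holds by mathematical necessity, which is exactly what makes its status as an empirical law questionable.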

There are several problems with this proposal, chiefly the fact that Hardy–Weinberg cannot meaningfully be thought of as a ‘‘zero force law’’ analogous to, say, the law of inertia (as Elgin suggests), as well as the above-mentioned lack of empirical content. Jonathan Kaplan and I, back in 2006, discussed in detail why the various evolutionary mechanisms that can cause a population to deviate from Hardy–Weinberg equilibrium are not conceptually equivalent, and should not be thought of as ‘‘forces’’ characterized by intensity and direction. Moreover, it simply seems strange to suggest that a scientific law can have no empirical content and instead be true a priori (as Hardy–Weinberg surely is, mathematically speaking). This risks sending philosophy of science down the slippery slope of considering logical and mathematical principles themselves as ‘‘laws,’’ a usage that clearly does not accord with scientific practice at all. Apparently, however, this point is not at all clear in the minds of some biologists, since it is possible to find statements like the following: ‘‘The global-optimum model is not so much a predictor of nature as a definition of nature. It must be true that a perfectly adapted organism leaves the most possible offspring!’’ (in a paper by Nonacs and Dill, published in 1993). Or: ‘‘The existence of a global-optimum point is a ‘deep axiom’: a tautology that guarantees logical consistency at the core of a theory’’ (by Stearns and Schmid-Hempel, in 1987). This is surely one area where more communication between theoretically minded biologists and philosophers of science would be welcome.

Besides appeals to Hardy–Weinberg as an example of biological laws, the next most popular area of discussion concerning this topic is perhaps the possible existence of laws in ecology. For instance, G.M. Mikkelson makes a case for moving ecology from an idiographic (historical) mode of explanation to a nomothetic (law-based) one. He maintains that—contrary to what he perceives as the practice among ecologists—generalizations should be interpreted in terms of law-like generalizations, because functional kinds (such as ‘‘predators’’) and structural kinds (like the various community types) correlate better than taxa (historical kinds) with fundamental ecological patterns and processes. As Mikkelson puts it, ‘‘Imagine being dropped at a random spot on the land surface of the Earth. Which would allow you to predict the density of plant species around you—that is, the number of species per 10,000 square kilometers—most precisely: knowing the climate, or knowing the landmass on which you stand? Answer: climate wins, hands down.’’ Well yes, but it is arguable that such predictions are the result of ‘‘laws’’ in any way like those that physicists are after, and it is telling that Mikkelson is in fact cautious enough to talk about ‘‘law-like generalizations.’’

Interestingly, the issue of, shall we say, ‘‘physics envy’’ shows up explicitly in yet another author’s treatment of laws in ecology, that of D.R. Lockwood. In this case it is an ecologist who takes on the matter, and he comes down rather negatively on the possibility of laws in his discipline. Lockwood discusses two frequent suggestions as examples of ecological laws: Malthusian growth and the logistic equation (see below). He quickly finds them inadequate to the task, as they do not support counterfactuals, are not temporally universal, and in fact repeatedly fail empirical tests. In the end, Lockwood agrees with philosopher W.C. Wimsatt’s suggestion that ‘‘aggregative systems’’ (those typically studied by physics) do follow robust laws, while emergent systems (like those studied in biology) do not. This does not mean that biologists cannot generalize their empirical findings (within certain limits), or that such generalizations cannot be used to make reasonable predictions about the behavior of the systems of interest to them. And that, after all, is what actually matters.
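
For reference, the two candidate ‘‘laws’’ Lockwood examines are simple one-equation models: Malthusian (exponential) growth and its density-dependent refinement, the logistic equation,

$$\frac{dN}{dt} = rN \quad\text{and}\quad \frac{dN}{dt} = rN\left(1 - \frac{N}{K}\right),$$

where N is population size, r the intrinsic growth rate, and K the carrying capacity. Real populations deviate from both, often dramatically, which is Lockwood’s point about failed empirical tests.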

If not laws, are there general theories in biology? Theodosius Dobzhansky famously said that ‘‘nothing in biology makes sense except in the light of evolution.’’ Add to that Richard Dawkins’ quest for ‘‘universal Darwinism’’ and Daniel Dennett’s contention that Darwinism is a ‘‘universal acid’’ of sorts that cuts across disciplines, extending the idea of Darwinian evolution well beyond biology itself, and one would think that biologists settled on their version of a theory of everything long ago. One would be surprised. A perusal of the recent literature shows quite a bit of activity in this department, again largely on the side of ecologists. I will briefly comment on one such attempt, referring the interested reader to two more case studies discussed in the paper.

Stephen Hubbell’s unified neutral theory of biodiversity and biogeography attempts to do precisely what its name implies: to propose a combined theoretical framework for biodiversity (measured by species–abundance curves) and biogeography (measured by species–area curves), where the ‘‘neutrality’’ consists in assuming that the differences among species belonging to the same trophic level within a given ecological community do not matter for the dynamics of that community. Hubbell’s theory draws explicit parallels with the neutral theory of molecular evolution proposed by Motoo Kimura back in 1968, and with the above-mentioned Hardy–Weinberg equilibrium in population genetics.

The unified theory has generated a significant literature, including a number of criticisms and empirical tests. It is important to realize a couple of things, however. First, the scope of the theory is crucially limited by the clause that it applies to species of similar trophic level within a given community, which makes it quite a bit narrower in scope than its name (and some of the discussion that followed the publication of Hubbell’s book) might otherwise suggest. Moreover, the theory is notoriously difficult to test: while it does make distinctive predictions when compared to, say, niche assembly theories (which are non-neutral), the predicted differences are very small, and easily lost in the noise characteristic of ecological data sets. This is not the place for an in-depth discussion of Hubbell’s theory, but I can hazard a prediction based on the similar history of the neutral theory of molecular evolution: in that case, more than a decade of discussion led to the conclusion that a modified ‘‘quasi-neutral’’ theory was the best bet, which basically means that stochastic as well as selective processes affect the outcome of evolution, just as one would reasonably expect.
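
To make the neutrality assumption concrete, here is a minimal sketch of the zero-sum drift at the core of the theory; it is a toy illustration of mine, not Hubbell’s full model, which adds immigration from a metacommunity and speciation. A community of fixed size is tracked while each death is filled by the offspring of a randomly chosen survivor, species identity playing no role in the dynamics:

```python
import random
from collections import Counter

def neutral_drift(community, steps, rng=random.Random(42)):
    """Zero-sum ecological drift: at each step one random individual
    dies and is replaced by the offspring of another randomly chosen
    individual. Species identity never enters the dynamics."""
    community = list(community)
    for _ in range(steps):
        dead = rng.randrange(len(community))
        parent = rng.randrange(len(community))
        community[dead] = community[parent]
    return community

# Ten species with 20 individuals each (local community size J = 200).
start = [species for species in range(10) for _ in range(20)]
print(Counter(neutral_drift(start, steps=50_000)))
# Abundances wander at random; without immigration or speciation,
# one species eventually takes over the entire community.
```

Even this toy version hints at why the theory is hard to test: purely random abundance trajectories can end up looking much like the patterns produced by niche-based mechanisms.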

My more general point in the paper was that even a cursory look at the literature allows one to distinguish four modalities for theoretical biology (though similar distinctions can also be found in, say, physics, especially if one considers the entire discipline, and not just specific subsets like particle physics). I refer to these as analytical modeling, statistical modeling, computer modeling, and conceptual analysis.

The classic example of analytical approaches in theoretical biology is much of the body of work that makes up population genetics theory, beginning again with the Hardy–Weinberg principle and arriving at more recent advances such as coalescent theory. The basic approach here is to use mathematical formalism to arrive at analytical (i.e., exact, non-statistical) solutions of sets of equations describing the behavior of idealized populations of organisms.
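
A textbook example of the kind of closed-form result this modality delivers (a standard population-genetic identity, not something specific to my paper): in an idealized, randomly mating population of constant size N, heterozygosity decays under pure drift as

$$H_t = H_0\left(1 - \frac{1}{2N}\right)^t,$$

where H_t is the expected heterozygosity at generation t. The result is an exact consequence of the model’s assumptions, derived rather than simulated.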

The second general type of approach to biological theorizing is statistical in nature, beginning with Ronald Fisher’s famous ‘‘fundamental’’ theorem of natural selection, which was explicitly proposed as the equivalent of one of the most solid pieces of theory in classical physics, the second principle of thermodynamics. Fisher laid the foundations of statistical genetics, which—once reconciled with the apparently discrepant Mendelian genetics—resulted in the Modern Synthesis of the 1940s, basically the still current standard model in evolutionary theory (but see this).
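
In one common formulation, Fisher’s theorem says that the change in a population’s mean fitness due to natural selection equals the additive genetic variance in fitness divided by mean fitness:

$$\Delta \bar{w} = \frac{V_A(w)}{\bar{w}},$$

and since a variance cannot be negative, mean fitness can only increase under selection (given the theorem’s restrictive assumptions), just as entropy can only increase under the second principle: hence Fisher’s analogy.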

The third way of doing theoretical biology is based on computer modeling, and it is in a sense the continuation of a long-standing trend in the field: when things get too complicated even for a quantitative (i.e., statistical) genetic approach (let alone for a population genetic, analytical one), researchers move toward computationally intensive simulations of biological populations. There are many examples of this, some continuous with the population-quantitative genetic type of issues just discussed, some having to do with broader questions concerning the evolution of evolutionary mechanisms (evolvability), and some concerning the relationship between structural biology and evolutionary dynamics.
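
As a minimal illustration of what such simulations look like at their simplest (again a toy example of mine, not one from the paper), consider genetic drift in a Wright–Fisher population, the kind of process one simulates when closed-form solutions become unwieldy:

```python
import random

def wright_fisher(p0, N, generations, rng=random.Random(1)):
    """Track an allele's frequency under pure drift: each generation,
    all 2N gene copies are resampled from the previous generation's
    allele frequency."""
    p, trajectory = p0, [p0]
    for _ in range(generations):
        copies = sum(rng.random() < p for _ in range(2 * N))  # binomial draw
        p = copies / (2 * N)
        trajectory.append(p)
    return trajectory

print(wright_fisher(p0=0.5, N=100, generations=500)[-1])
# The frequency wanders and eventually fixes at 0 or 1; layering in
# selection, mutation, migration, and population structure quickly
# produces models with no analytical solution.
```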

The fourth and last modality of biological theorizing is based on the articulation of verbal-conceptual models, and it obviously comes closest to what philosophers of biology themselves engage in when they analyze the concepts deployed by working biologists. Verbal-conceptual models in science have the reputation of being second-rate when compared to ‘rigorous’ mathematical modeling, even though both Darwin’s original work and much of the work done during the Modern Synthesis (except for the explicitly population-genetic part) fall into this category. Indeed, there seems to be a resurgence of this approach as a necessary complement to increasingly ‘‘experimental’’ mathematical treatments like the ones discussed above. Verbal-conceptual models also include a broad category of biological theorizing that is particularly popular in molecular biology and biochemistry: many papers in those fields present the results of complex experiments on the structure of genetic networks or biochemical pathways in the form of conceptual diagrams, meant both to summarize the current state of knowledge and to provide food for thought for the development of new hypotheses and subsequent empirical tests.

My conclusions at the end of the full paper: the term ‘‘speculation’’ has a rather bad reputation in science, often associated with the much-dreaded accusation hurled at philosophers that they engage in ‘‘armchair theorizing.’’ But of course all theory is armchair speculation, and unless one thinks of mathematics in a special Platonic fashion, mathematical approaches are simply continuous with, and complementary to, all the other ways of doing theory in science.

Which brings me to the role of philosophy of science in all of this. I think that philosophy of science itself is characterized by different modalities, some of which have little to do with helping scientists and reflect instead on the logic of scientific theories, the epistemology underlying scientific claims, and so on. Indeed, philosophy of science itself is continuous with the history of science, since it would be difficult to attempt generalizations about the nature of science while focusing only on currently ongoing (and therefore far from being settled) scientific research.

To begin with, then, classic philosophy of science is concerned with the study of the logic of scientific discovery, as exemplified by the well-known names (even among scientists!) of Popper, Kuhn, and—to a lesser extent—Feyerabend and Lakatos (as well as a number of contemporary scholars, too many to mention). This type of philosophy of science is, arguably, of very little direct relevance to scientists themselves (except insofar as they are curious about how outsiders see and analyze their own activity). It is perhaps this sort of philosophizing that has brought a number of physicists (e.g., Steven Weinberg, Stephen Hawking, and Lawrence Krauss) to claim that ‘‘philosophy is dead’’ on the grounds that, of late, it has not managed to solve any scientific problem with which physics is concerned. In so arguing, these scientists are committing an elementary category mistake, prompted by a combination of intellectual hubris and a surprising amount of ignorance.

Philosophy of science, however, also functions in modalities that are (or ought to be) of more direct interest to practicing scientists themselves—whether the latter realize it or not. One such modality is the always necessary (if prone to annoy the targeted scientists) external criticism of socially relevant scientific claims (e.g., concerning race, gender, or the validity and application of certain types of medical research). I hesitate to use the label ‘‘science criticism’’ for this activity—even though it is arguably the most appropriate one available—because the term has been, possibly irreparably, tainted by much postmodern-inspired nonsense at the height of the so-called ‘‘science wars’’ of the 1990s. Regardless of what we end up calling it, this is the sort of philosophical inquiry that has actual practical implications, analogous to the better-known ones usually associated with, say, business ethics, medical ethics, and bioethics, and it should develop into an earnest dialogue between philosophers and scientists about the social implications of science itself.

The third and final modality for philosophy of science stands in an even closer symbiotic relationship with science, one that seems to be welcomed by scientists themselves. Indeed, recent years have seen an increasing number of philosophers of physics, biology, and other disciplines publishing conceptual papers on a large variety of topics that are hard to distinguish from theoretical physics, biology, and so forth. This is, I think, a most welcome development, and a small (but, hopefully, growing) number of scientists have started to collaborate with philosophers and/or to publish in philosophical journals, as the debates about laws in biology discussed above exemplify. As I pointed out elsewhere, this is along the lines of what Hasok Chang called ‘‘the continuation of science by other means’’:

Complementary science [based on history and philosophy of science] is critical but not prescriptive in relation to specialist science. … Complementary science identifies scientific questions that are excluded by specialist science. … The primary aim of complementary science is not to tell specialist science what to do, but to do what specialist science is presently unable to do. It is a shadow discipline, whose boundaries change exactly so as to encompass whatever gets excluded in specialist science. (pp. 249–250)

From this perspective, then, philosophy of biology represents a fifth type of theoretical biology, albeit one that is practiced from the outside looking into the core discipline. Because of that, it is uniquely positioned, I think, to perceive the threads connecting the other four modalities, as well as the advantages and limitations of each. The idea, of course, is not to make philosophers the ultimate arbiters of theoretical biology (or of anything else, for that matter). Rather, it is a recognition that it does take some distance from the nitty-gritty of the specialized literature to perceive the broad picture that is necessary for the advancement of theoretical biology, broadly construed. Accordingly, it is not by chance that when biologists themselves step back to contemplate a more inclusive level of analysis they begin to sound like philosophers. Perhaps, then, ongoing cross-fertilization—like the one fostered by that special issue of Biological Theory—will bring less distrust and more fruitful collaboration between the two disciplines.

109 replies

  1. Hi Massimo,

    But my understanding — per Bunsen — is that quantum mechanics actually makes it impossible to do in principle.

    What Bunsen is saying is that the current “official” and rigorous formalism of quantum mechanics doesn’t lend itself to computer Monte-Carlo simulations. But (unless you think that one-big-wavefunction, many-worlds QM is the final answer) that is because the current formalism is incomplete: it doesn’t do the “collapse” of the wavefunction, and thus doesn’t handle quantum indeterminacy properly.

    But computer simulations are never a matter of doing things fully rigorously; that is always way too impractical. Such simulations are always good-enough approximations, and the art of being such a theorist is to develop ways of approximating that work well enough.

    Thus, to computer-simulate a quantum system, what one does is to throw in some random-number probabilities. This works fine because, in any complex system, the probabilistic aspects almost always average out. The modelling of the mass of the proton that I mentioned is a good example: internally the proton is a huge mess of virtual particles, all coming and going according to QM probabilities, but these probabilities average out to give a definite value for the total overall mass.

    For any macro-scale ensemble we know that QM probabilities average out because classical physics does work very well in the macro-scale world. Thus computer modelling techniques are routine in many areas of physics.
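
    A toy version of the averaging point, using nothing but pseudorandom coin flips (a sketch, not actual quantum code): each individual event is random, yet the ensemble mean converges to a definite value as the sample grows.

    ```python
    import random

    rng = random.Random(0)
    for n in (10, 1_000, 100_000):
        # Each 'event' is random, but the average over n events stabilizes.
        mean = sum(rng.random() < 0.5 for _ in range(n)) / n
        print(n, round(mean, 4))
    ```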

  2. I think the non-collapse of probabilities is similar to determinism, as it projects future probabilities onto the past, as determinism projects past determination onto the future.
    Biology, at least neurology, is presentist;
    https://www.newscientist.com/article/2132847-your-brain-is-a-time-machine-why-we-need-to-talk-about-time/?utm_campaign=Echobox&utm_medium=Social&utm_source=Twitter#link_time=1496743708
    I realize that the point I keep making (that time is change, by which future possibility coalesces into the present and fades into the residual, so that future becomes past, making time an effect of activity, similar to temperature) gets roundly ignored; but consequently, no one is disproving it either.
    By treating time as a measure of duration, physics explicitly assumes it is the flow from past to future, with duration as evidence of this temporal dimension, between two events, just as distance is a measure of a single spatial dimension, between two points.
    Yet duration is only the present, as events unfold.
    This lack of distinction between past and future leads to projecting them onto each other, thus determinism, or Everett’s many-worlds. The present is where the inputs of events are calculated and determined.

  3. Nearly every theory in physics is a computational simulation of a sort, and has been since the time of Archimedes.

  4. Hi ej,

    … didn’t comment further because I admit I am an amateur at biology, …

    Then how can you be sure that “the physicists and mathematicians commenting here have a difficult time wrapping their heads around biology”? Do you just mean that you disagree with them?

  5. Bunsen,

    Again, thanks for the clear explanation.

    Synred,

    Maybe I misunderstand either you or Bunsen, but it seems like you are not talking about a Laplace-type problem, nor about what DM is suggesting. It is perfectly possible, as you say, to simulate quantum behavior on a classical computer. The question is whether one can, by using only the equations of QM, predict the unfolding of the current universe. The answer to that question seems to be no. My understanding of physics is limited, but I gather also from previous discussions that the reason for the negative answer is that while the equations describing QM systems are deterministic, actual physical QM events are fundamentally random. So, as someone (Bunsen?) said earlier, all you can do is simulate all possible universes, which is of no help at all.

    DM,

    “Laplace imagined that a demon who had perfect knowledge of the positions of all the particles in the universe could predict what was going to happen. But on QM there are many possible futures (and on some interpretations all these futures happen), so no computational process can pick out a single future.”

    Right, but then you say:

    “I claim that (a) one can in principle simulate all possible futures (and their respective probabilities) and (b) that one can in principle simulate a single arbitrary possible timeline using pseudorandom number generators”

    That sounds to me awfully close to saying that you want to do what Laplace’s demon was supposed to do, especially your move from (a) to (b).

    Saphsin,

    “Like not just in terms of overlap but what it “means” to calculate an electron’s position in the electron cloud”

    Good point, but that’s why the discussion has been mostly about “calculating” macroscopic events from QM principles. And as Bunsen said above, we have no idea, really, of how to do that. So at the macroscopic level the distinction between ontology and epistemology still holds clearly, I think.

    David,

    Back to biology! Mendel’s “law” has all sorts of exceptions, so it is a fairly limited empirical generalization, and nothing like the laws of physics. Again, I suspect early 20th-century biologists called it a law because of their pronounced physics envy. Fisher explicitly modeled his “fundamental” theorem of natural selection (which also has all sorts of known exceptions) after the second principle of thermodynamics, precisely because he thought that the new science of biology should model itself after physics. Fortunately, the new generations of biologists seem to have gotten over that particular hang-up.

    • >Maybe I misunderstand either you or Bunsen, but it seems like you are not talking about a Laplace-type problem, nor about what DM is suggesting. It is perfectly possible, as you say, to simulate quantum behavior on a classical computer. The question is whether one can, by using only the equations of QM, predict the unfolding of the current universe. The answer to that question seems to be no. My understanding of physics is limited, but I gather also from previous discussions that the reason for the negative answer is that while the equations describing QM systems are deterministic, actual physical QM events are fundamentally random. So, as someone (Bunsen?) said earlier, all you can do is simulate all possible universes, which is of no help at all.

      When we do a simulation, we do collapse the wave function, usually using a pseudorandom number. One could use a ‘real’ random number by, say, using the time between radioactive decays to ‘generate’ them. There can be problems with pseudorandom number generators that have correlations in them, but the art of making random number generators is well advanced, and such problems are minimal if you choose one of the good ones (freely available in, e.g., CERN software products).

      It’s also possible to make a ‘many worlds’ MC, but the amount of memory and CPU needed grows exponentially, so it’s not practical except for a very small problem run for a very short time. It’s better to ‘collapse’ (a.k.a. pick one) as you go. It’s not an approximation, but how QM is done in the standard interpretation, which gives the same answer as ‘many worlds’.

      I really don’t understand Bunsen’s argument, but then I’m only an experimentalist, and we use eyes and ears and brains to observe our results. I do not know what Bunsen’s field is, but to me his argument appears meaningless.

      As Coel points out, no simulations today are done without approximation. Simulations of the evolution of the Universe are mostly done using Newtonian gravity, with little GR, and QM put in only at the beginning to predict the spectrum of fluctuations that seed galaxy formation; they do not work at the level of even atoms, much less quarks and such, and still take every bit of CPU and memory we can muster.

      Any thought of simulating all the way from whatever the fundamental physics is to a cat is just a ‘thought simulation’. It would be nice. You could do experiments on ‘MC cats’ that would be considered unethical on real cats – or could you? </:-]

      https://goo.gl/lJbcx5 ‘Causality — a tale of many worlds’, in which a truly super computer is imagined and the programmers are plants (very rough draft)

  6. Hi Massimo,

    My understanding of physics is limited, but I gather also from previous discussions that the reason for the negative answer is that while the equations describing QM systems are deterministic, actual physical QM events are fundamentally random.

    That depends on your interpretation. On MWI, for instance, there is no randomness fundamentally. Everything happens. It just looks random subjectively because you’re seeing an arbitrary set of events, not all of them.

    That sounds to me awfully close to saying that you want to do what Laplace’s demon was supposed to do, especially your move from (a) to (b).

    (a) You can simulate all futures and predict how the overall wavefunction of the universe will evolve. But that doesn’t tell you what you will subjectively experience, so it differs from Laplace in this respect.

    (b) You can simulate one possible future, but there is no reason to suspect this will be the future you will subjectively experience, so again it differs from Laplace.

    So I agree with you that it is fundamentally impossible to predict the future (unless by predict the future you mean predict all possible futures, which on MWI will all happen to you, so that’s actually an accurate prediction if not all that helpful).

  7. EJ: Yeah, I found this essay interesting too, but don’t know enough about the subject to really push the discussion. Unfortunately, it is clear that no one else does either, as the discussion has slid back to the old favorites: “supervenience” (one of the most explanatorily useless words ever concocted by philosophers) and “computable universes” and “mathematical descriptions of everything,” none of which is even remotely serious, but is sort of the internet’s version of bar chatter.

  8. From my point of view, the problem is only that Massimo covered the topic very well in the original article. There doesn’t seem to be much to say on that apart from congratulating him. I’m indulging my interest in the tangents only because Massimo also seems interested. I’m not sure that it comes at the cost of a deeper discussion on Massimo’s article.

  9. Massimo: Here is an effort at several questions, regarding the article’s topic. If they are completely inapt, please say so.

    Does part of the difficulty with theory in the biological sciences have to do with the fact that biological explanations are teleonomic? Or at least, that some are? That not simply material and efficient causality are involved, but a type of cause that is dependent on the sorts of variables that do not easily admit of lawlike generalization?
    On a related note, I take it that biological laws, to the extent that there are any, rely far more on ceteris paribus clauses than do laws in physics or chemistry. Is this also part of the problem as you see it? After all, once a ceteris paribus clause gets too large, it’s unclear in what sense the generalization is a law at all.
    I wonder to what extent many explanations in biology, while grammatically appearing to be causal in a roughly mechanistic way, are really more natural-historical in nature; i.e., as a matter of their underlying logic, rather than their surface structure.

    **Again, please let me know if these questions are inapt or so wrong as to be unanswerable.

    • Newton’s first law states that every object will remain at rest or in uniform motion in a straight line unless compelled to change its state by the action of an external force. … The third law states that for every action (force) in nature there is an equal and opposite reaction.

      Ceteris paribus? Seems so to me.

      Of course you can just write F = ma and fit the first two in one line.

      Or you can invoke the principle of least action (God is lazy <|:-o) ) and make it appear teleological.

  10. “On a related note, I take it that biological laws, to the extent that there are any, rely far more on ceteris paribus clauses than do laws in physics or chemistry. Is this also part of the problem as you see it?”

    Excellent question, I think. One of my problems when I try to understand ET is that the theory does not always explain which ceteris are supposed to be paribus.

  11. I’m not sure that it comes at the cost of a deeper discussion on Massimo’s article.

    = = =

    Well, obviously, a number of us disagree. And the idea that there is nothing to discuss in an essay on a subject of this complexity and depth isn’t credible. Also, it isn’t just some coincidence that this particular cluster of tangents is brought up over and over and over and over again, regardless of the topic of the initial post.

  12. Dan,

    Those are all damn good questions!

    “That not simply material and efficient causality are involved, but a type of cause that is dependent on the sorts of variables that do not easily admit of lawlike generalization?”

    Right. That is the basis on which Fodor — mistakenly — thinks that Darwin “got it wrong.” Because Fodor seems to think that the only acceptable scientific theories are those that lend themselves to law-like generalizations. I don’t have to explain to you why that’s very clearly off the mark.

    “After all, once a ceteris paribus clause gets too large, its unclear in what sense the generalization consists of a law at all.”

    Right, that was also part of Fodor’s problem. But all it means is that there are no law-like generalizations to be made in biology, not that the theory of evolution is somehow wrong or unscientific.

    And of course, Cartwright argues that a very similar problem holds also for physics, only to a different degree.

    “I wonder to what extent many explanations in biology, while grammatically appearing to be causal in a roughly mechanistic way, are really more natural-historical in nature”

    That’s an ongoing open debate in the philosophy of biology. I wouldn’t exactly say natural-historical, but certainly there is disagreement about mechanisms vs. causal explanations, and “cause” means different things in different biological contexts. I participated in a workshop at the KLI this past spring on that very topic; the proceedings are due out next year.

    As for your debate with DM concerning why we keep going off topic into physics, I strike a balance between the two positions. In this case I was genuinely curious about Laplace vs QM, but yes, it gets rapidly away from the focus of the OP…

  13. Per Dan, Couvent, et al., the environmental factor in biology is always an issue. But how much, where, and when certainly varies. I just saw a piece a couple of days ago suggesting that prion-like proteins could be part of the reason for the spread of Type II diabetes. http://www.sciencemag.org/news/2017/08/could-diabetes-spread-mad-cow-disease

    ==

    Otherwise, Dan’s question kind of gets at what I was trying to get at in the question I wasn’t sure how to ask yesterday: how much environment and other ceteris paribus factors, sometimes stochastic ones, can be addressed by a theory. Darwin himself used the famous “tangled bank” phrase, after all.

  14. Massimo: Interesting point re: Fodor, which is why I thought that his book, while mistaken, was mistaken in an interesting way, such that it did not deserve the abuse it got. In a sense, you, Coel, and Fodor are on a spectrum, considering what should be called “science.” Coel has the broadest sense in mind. You a narrower one. And Fodor an even narrower one. (In a sense, Fodor’s stance is not entirely surprising, given the generation he comes from and the influence that Hempel had over the philosophy of science.)

    But, as a virtue ethicist, of course, you choose the mean between extremes. 😉

  15. My remark was really a footnote to an idea that is not yet fully formed. I’m getting the notion that biology and physics require two very different epistemic strategies for research and understanding, and that the epistemic strategies of physics are similar to those of math, while those of biology are closer to the strategies we use to understand language. I don’t know why that would be so. Nor is it clear to me why some scientists (or social scientists) trust the epistemic strategies dominant in fields of research other than their own as somehow superior to, or more explanatory than, those in their own field, while others are quite at home with the strategies dominant in their own field. But this raises another issue, namely, the manner by which these choices of epistemic strategy lead to further choices in epistemic or even ontological claims.

    All this would help explain, or rather raise the question, why it is that scientists from one field would try to bring a conversation in another field into the domain of the epistemic strategies they are most comfortable with.

    In any event, this article, and the comment thread, triggered these thoughts, because at the base of Massimo’s discussion here of theory in biology, there does seem to be a problem of bringing together stories of how we know what is learned in research into a viable explanation of what we know – what I mean by “epistemic strategy.”

  16. EJ, per your comment, at a minimum, biology’s greater feedback loops vis-a-vis physics alone would require a different research strategy, I would think. Or rather, a different research mindset. I think that’s the difference between different sciences, and between social sciences and hard sciences, for that matter.

    And, of course, this would directly militate against the likes of Coyne.

    It would also underscore Massimo noting that there is no “philosophy of science,” but rather various philosophies of different sciences. For that matter, we should probably speak of different philosophies for different social sciences.

    And, to go meta on Massimo … “philosophy of philosophy”? Ask Doug Hofstadter to work that into a GEB type essay. 😉

  17. Massimo,

    … the reason for the negative answer is that while the equations describing QM systems are deterministic, actual physical QM events are fundamentally random.

    If you count the Born rule as part of the equations of QM then it does do the non-deterministic part, and one can then use QM to do computer simulations.

    Dan,

    And Fodor an even narrower [view of science]

    If Fodor is insisting “that the only acceptable scientific theories are those that lend themselves to law-like generalizations” (where “laws” need to be both simple and universal) then that would rule out large swathes of physics, much of chemistry, nearly all of biology, all of geology, meteorology, oceanography, archaeology, psychology et cetera.

    ej,

    … and that the epistemic strategies of physics are similar to those of math, while those of biology are closer to the strategies we use to understand language.

    It’s important to realise that physics covers a broad swathe of topics. Fundamental physics deals with the simplest and most basic entities, and that lends itself to simple and general descriptions (“laws”). But fundamental physics is only one part of physics, and physicists are just as at-home dealing with complex aggregate systems.

    For example, astronomers study the nature, formation and evolution of stars, galaxies and planetary systems. These are all vast, complicated agglomerations, and while the fundamental laws still apply, the ensembles are heavily affected by their environment, by historical contingency and by stochasticity. Further, they are way too complex and multi-part to model exactly (a galaxy can contain 10^68 particles), so absolutely all models of them are rough approximations at best. Again, physicists are quite at home dealing with this. Ditto lots of things that physics gets applied to, such as meteorology and climate, physical geology, physical chemistry, et cetera.

    The point is that the nature of biology is not that different. Many of the ways in which it is different from fundamental physics also apply to many other areas of physics. (Though obviously biological organisms have a much greater degree of complexity.) When I read Massimo’s account of how theory is done in biology, there is nothing there that seems alien to a physicist, nothing radically different from how a physicist might think if they turned to that subject.

    There are obviously differences in subject matter, and that dictates lots of pragmatic differences in how to approach it, but beyond that it doesn’t seem very different in epistemological terms. As you have noted, some successful biologists, such as Jerry Coyne, think pretty much like physicists on plenty of issues.

  18. Ej,

    “I’m getting the notion that biology and physics require two very different epistemic strategies for research and understanding”

    In part, mainly for two reasons: (i) biology deals with incredibly complex and variable systems compared to physics; and (ii) biological systems evolve, i.e., they change through time, which makes biology an inherently historical science (more like astronomy, say, than fundamental physics).

    Socratic,

    I wouldn’t say that there is no philosophy of science. There is, but it has taken a back seat to the more specialized philosophies “of.” We are no longer in the times of Popper, Kuhn, Lakatos and Feyerabend.

    As for meta-philosophy, as you know, here is my contribution to it: http://tinyurl.com/y8rb2r6e

    Coel,

    “If you count the Born rule as part of the equations of QM then it does do the non-deterministic part, and one can then use QM to do computer simulations”

    I don’t know nearly enough about this, but from what I’ve read it still doesn’t get us to anything like Laplace-style predictions of the macroscopic world, does it?

    • “If you count the Born rule as part of the equations of QM then it does do the non-deterministic part, and one can then use QM to do computer simulations”
      I don’t know nearly enough about this, but from what I’ve read it still doesn’t get us to anything like Laplace-style predictions of the macroscopic world, does it?

      Hi Massimo, w/o the Born rule QM predicts nothing and ‘many worlds’ gets the answers wrong (if you just count worlds in the obvious way).

      Advocates of ‘many worlds’ (E.g., Oxford philosophers) do elaborate mathematical hand waving to hide the fact that they are assuming the Born rule too.

  19. Massimo, I may have overstated your view a bit, but yes, that’s the gist. And a philosophy of an individual science is in part, or at least overlaps to a fair degree with, its “mindset,” is it not?

    ==

    Per the other dialog here, given that there’s at least, what, 15 different interpretations of what QM “actually” is, isn’t most of this really spit-balling?

  20. Coel: Fodor does think there are psychological laws. About a third of his quite substantial output was devoted to demonstrating that. As for evolutionary biology, he thinks it really is a form of natural history, rather than explanatory science, which is why I asked Massimo what I did about natural-historical explanations.

    Fodor comes from a generation when the Hempelian covering-law model of scientific explanation still loomed large over the philosophy of science. It’s not surprising that his views are what they are in light of that fact. While ostensibly anti-positivistic, like many others, he is still strongly affected by that framework and way of thinking.

  21. Hi Massimo,

    … but from what I’ve read it still doesn’t get us to anything like Laplace-style predictions of the macroscopic world, does it?

    Correct, it doesn’t. Laplace dreamt of being able to make exact predictions, such as predicting whether it will rain in a certain place on a day 100 years into the future. There’s no way that’s even in-principle possible, given the combination of quantum indeterminacy and chaos (sensitivity to initial conditions).

    But that doesn’t negate the general usefulness of computer simulations. For example, if you build a computer model of the climate, and how it responds to CO2, you’ve no chance of the rain-on-a-certain-day prediction, but you could in-principle predict the climate 100 years into the future and thus the average yearly rainfall.

    Further, many systems are not that affected by quantum indeterminacy, chaos and stochasticity, because the system averages over those things and the behaviour is robust to them. To give an example, if you want to predict whether a certain star will go supernova in the next 10 million years, a computer simulation can (in principle) predict its future evolution reliably and give you a pretty certain yes/no answer.

  22. Socratic: “And, to go meta on Massimo … “philosophy of philosophy”? Ask Doug Hofstadter to work that into a GEB type essay. 😉”

    In grad school, we used to chuckle over the category “bibliography of bibliographies.”

  23. WTC, “bibliography of bibliographies” — I think that’s somewhere in Borges’ library!

  24. “a fairly limited empirical generalization, and nothing like laws in physics”: I guess I would be closer to those who argue there are no “universal and exceptionless (across all space–time points)” laws, even in physics, let alone chemistry. I suspect the term “law” was applied to Mendel’s theories because he found nice ratios like 1:2:1 and 3:1 in such messy stuff as the results of biological experiments – Galton’s attempts to find similar numbers were not quite right.

  25. Hi Arthur,

    At most (even DM I think) would claim that nature is the result of a QM Hamiltonian operating on stuff that obeys QM ‘law.’

    I guess so, although I wouldn’t necessarily put it in those terms. I’m just imagining that if you had a perfect simulation of physical law on an infinitely powerful computer, you ought to be able to look inside it and explore it much as we do with much simpler virtual worlds all the time (e.g., World of Warcraft). You could do so by simulating a viewpoint within the simulation which would act as an “observer”, and we could render to a screen what that “observer” would see. That “observer” would see stars and planets much as we do. The simulation wouldn’t think of them as stars and planets — they’re just collections of particles from the point of view of the code — but we humans peering into the simulation would recognise them.

  26. Given the feedback loops and the epistemic nature of knowledge, philosophy of philosophy would probably amount to the politics of philosophy.
