Is game-based learning assessment a Big Data problem? Lessons from piles of horse manure.

[Image: loading horse manure]

Several years ago I was invited to participate in a panel on Big Data and Machine Learning in on-line learning games at the annual Neural Information Processing Systems meeting. NIPS has become THE annual meeting place for the leaders of the Machine Learning field, as well as for the leaders of Google, Facebook and others looking to hire them. After I was introduced at the workshop as someone who has been involved in on-line learning for many years, a participant in the front row remarked that it was interesting that one of the founders of the NIPS meeting, way back in 1986, had my same name. I responded that that was more interesting than he knew. 🙂

It turns out that at about the same time I was helping to found NIPS, our idea that computerized games could be used to assess science learning started the research and development effort that resulted, 15 years later, in the founding of Whyville. Now, an additional 17 years later, it turns out we have been thinking about and working on game-based learning and assessment for a very long time.

Perhaps for this reason, this year at the annual South by Southwest EDU festival in Austin, Texas, I was invited, this time by the U.S. Department of Education, to participate in a workshop on the potential use of on-line games for assessing student learning. Several in the room introduced themselves as being interested in applying NIPS-like Machine Learning approaches to the enormous amounts of data potentially generated by learning games. However, when it came my turn to introduce myself, I decided to self-identify instead with my latest commercial enterprise: selling high quality horse manure compost in Southern Oregon.

Judging from the reaction, several participants appeared to regard this as some kind of confession, and then seemed surprised when I went on to suggest that the process of making high quality organic horse manure compost was potentially relevant to the question at hand. How is that?


With the growth in backyard gardening and reduce, reuse and recycle (in Texas: blast it, burn it, and bury it 🙂), more and more people are familiar with the seemingly miraculous biological process by which table scraps turn into organic soil.

My own experience with this biological process goes back even further than my involvement in game-based learning, 60+ years to my early life on the family dairy and horse farm in upstate New York. One can argue about how much I know about game-based learning, but I do know how to compost horse manure.

So how does that process work? First, it is important to start with high quality horse manure from horses fed good food that is as free as possible of pesticides and other drugs. Many pesticides in particular go straight through your horse and into the compost, where they then suppress plant growth wherever it is eventually applied. Likely, the level of exercise and fitness of the horses matters as well, although I don't know anyone who has studied that. In other words, the quality of the horse manure you start with is critical.

For any biological process, the quality of the eventual product depends strongly on establishing the best possible environment for development. In the case of horse manure, ideal conditions for the microbes that actually do the work are provided by a pile approximately 5 feet high, 5 feet across, and as long as you want.

In any biological process, then, the trickiest part is process control, i.e. knowing how to monitor the process and when and how to intervene to keep it going to its full and best completion. In the horse manure business, progress is monitored using temperature. At the start, if you have the right manure and the right conditions, the pile reaches 100 degrees (F). When the temperature drops to about 80 degrees, it is time to turn the pile to restart optimal composting. At that point the temperature can go as high as 160 degrees, where decomposition is at its maximum and weed seeds, fly larvae and other organisms potentially carrying diseases are killed. As the temperature starts to drop again, you turn the pile one more time and wait for it to return to ambient temperature. The whole process takes about 3 months.
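For readers who think in code, the monitoring logic above can be sketched as a tiny decision function. This is purely illustrative: the temperature thresholds come from the description above, while the function name, the `turns_done` counter, and the 70-degree "ambient" cutoff are my own assumptions.

```python
# Illustrative sketch of temperature-based compost process control.
# Thresholds (degrees F) follow the text; the rest is hypothetical.

def compost_action(temp_f, turns_done):
    """Decide what to do with the pile given its current temperature
    and how many times it has already been turned."""
    if turns_done >= 2 and temp_f <= 70:
        # Turned twice and back near ambient: composting is complete.
        return "finished"
    if turns_done == 0:
        # First phase: pile heats to ~100 F, then cools; turn at ~80 F.
        return "turn pile" if temp_f <= 80 else "wait"
    if turns_done == 1:
        # Second phase: pile can reach ~160 F (killing weed seeds and
        # fly larvae); turn again once it cools back toward 80 F.
        return "turn pile" if temp_f <= 80 else "wait"
    return "wait"  # turned twice; waiting to return to ambient

print(compost_action(100, 0))  # wait
print(compost_action(78, 0))   # turn pile
print(compost_action(160, 1))  # wait
print(compost_action(68, 2))   # finished
```

The point of the sketch is the same as the essay's: the whole process is steered by one cheap global measurement, not by instrumenting the trillions of individual microbes.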


Learning as a biological composting process

So what possible relevance could any of this have to game-based learning and its assessment? First, abundant research suggests that the physical and emotional state of children is critical to success in school. Recent evidence suggests that the right conditions for learning actually start in utero; a big problem in a country like the US, where there are huge and growing disparities in the quality of life of young children.

Second, optimal learning depends on the quality and sophistication of the overall learning environment. Whyville was built on the assumption that optimal on-line game-based learning requires a sophisticated overall environment, not just a series of one-off games. As in an effective classroom, many different forms of learning are available, not just one. The sophistication with which those different forms interact to promote learning is key to success.

Third, success takes time. Current classroom structure puts time to do anything sustained at a premium, and as a result many learning games have been designed accordingly. However, children, like the microbes in horse compost, need sustained involvement for success. Whyville is increasingly being used as an outside-the-classroom supplement for sustained involvement in game-based learning.

Fourth, and perhaps most importantly, the sophistication of the methods used to monitor progress is critical. In the world of formal learning, the process of monitoring progress is referred to as formative assessment, and, as in horse manure composting, it is essential for success. Applying current trends in Big Data to horse composting, one could imagine yet another computational opportunity: instrument the trillions of microbes, collect all that data, push it into the cloud, and use machine learning to figure out the best metric for assessing progress. Similarly, many of the discussions I have heard about big data in on-line learning imagine collecting every possible bit of data, pushing it to the cloud, and using some analysis method (Machine Learning, for example) to figure out what is there. In the worst examples, all that big data is simply pushed at teachers on the assumption that they will be able to figure it out.

Fortunately, when humans began composting (likely a very long time ago), we didn't have sophisticated measuring devices, computers, the cloud, etc. Instead, temperature became the global measure for composting, telling you when conditions were optimal for the microbes to do their thing, and when conditions needed to be changed for re-engagement.

It is my assertion that the best and most successful teachers effectively use engagement measures (i.e. temperature) to monitor progress in their classrooms. However, for this global measure to work, the learner must be in a sophisticated, well-designed learning environment. If all you want is heat, you can construct a compost pile that goes over 160 degrees; however, at those temperatures the organisms beneficial for decomposition are immobilized. Similarly, if all you want is maximum addictive human engagement in games, you can certainly design the games accordingly, but it is not likely that your users will learn much that is useful.

On the other hand, as in manure composting, a wonderful thing about biological organisms is that if they are properly prepared, if the environment is properly constructed, monitored and adjusted, and if they are given time, they will naturally do what they evolved to do. For humans, that is learning. Skilled teachers who use engagement measures to monitor their classrooms know that if you engage children, they will learn. If you engage them in a rich and well-designed learning environment, temperature is an ideal global measure of their progress. For this reason, we have spent considerable effort in Whyville developing measures of engagement.

Finally, there is one more assessment lesson from composting that I would like to suggest could and should be applied more vigorously to learning games. In addition to formative assessment, we also care about summative assessment, both to know whether the manure composted and to know what our children have learned. In the case of horse manure composting, there is one age-old measure of the quality of the final product: you pick it up in your hands and smell it. If it still smells like horse shit, the composting didn't work. It might be useful to apply a similar smell test to learning games.

Training your brain with games: The “Brain Science” (BS) of Neuro-marketing.

One could wonder at the sophistication of Google Ads, when the most common ad I am served, as an aging boomer, is from Lumosity, hawking their brain training games. Or, since I am actually a neurobiologist, one could question Google’s sophistication, because, in fact, I know that there is no scientific basis for Lumosity’s claims of “cognitive improvement”. Further, Google should know that over the last several years I have become more and more strident, concerned and vocal about the growing misuse of neuroscience by companies and individuals hawking complete nonsense like “the Science of Neuroplasticity” to sell their wares.

While the growing number of non-neuroscience keynote speakers now referencing brain imaging and some version of pop cognitive neuroscience in their talks can perhaps be forgiven to some extent, there is a decidedly disturbing trend towards commercial enterprises, like Lumosity, that claim to include neurobiologists in their ranks. Of even more concern is the increasing number of faculty with actual neuroscience appointments who are giving talks or hosting PBS specials without acknowledging up front their commercial interests. In my view, this kind of deception hurts neuroscience, the public, and the development of games for learning. As evidenced in this blog, I have for years steadfastly refused to justify my efforts with Whyville based on my knowledge of neuroscience.

Not surprisingly, perhaps, there is a common thread of false assumptions, exaggerated claims, and misuse of neurobiological data, and especially brain imaging, running through many of these enterprises. While this forum is not appropriate for a full scientific refutation of these misrepresentations, perhaps it is useful to briefly mention a few:

1) Is it possible to improve cognitive function by making a game out of a task originally developed as a cognitive assessment measure? This sleight of hand is the basis for many of the brain training games, and especially those now being promoted by Lumosity. To understand the issue, let me change the frame of reference to nutrition. While it is not generally known, the original objective of the FDA’s recommended daily adult nutritional requirements was to identify the amounts of certain nutritionally important substances (calcium, protein, Vitamin C, etc.) which, if present in a normal 1950s diet of ‘real’ food, meant one was eating well. In other words, the FDA’s objective was to come up with a set of indicators as to whether your diet was healthy, given that you were eating non-manufactured food. Of course, food manufacturing companies distorted the intent, producing, for example, breakfast cereals like ‘Total’, claiming that one bowl would fulfill all your daily requirements. Can you survive eating ‘Total’ alone? Of course you can’t. The FDA’s ‘indicators of a healthy diet’ were simply misused as a marketing ploy to sell cereal. Similarly, when Lumosity converts an ‘indicator’ of cognitive function into a game claimed to enhance that cognitive function, it is once again only a marketing ploy.

2) Does improved performance on a game actually mean that you have ‘cured’ a cognitive deficit? While Lumosity would like you to believe that you can tune up your brain with their games (no evidence; see below), a new set of companies founded or directed by neurobiologists want you to believe that their games have the potential to cure cognitive disease. Another marketing ploy used by these Neuro-marketers is to generate as large a bibliography of apparent scientific support for their products as possible. Lumosity itself builds its “scientific bibliography” with its own research papers as well as by making its game data available to ‘collaborating scientists’. In one such recently published example, a clinical researcher reported that his patients suffering from liver disorders performed significantly slower on several Lumosity games. Using this study as an example: if, as is usually the case, repeated play of the game increases performance on the game, does that mean that the livers of his patients have been returned to health? Of course not. It simply means that repeated playing of the game results in improved performance on that game. Similarly, does this mean that you can diagnose liver disease by playing the game? Of course not; there are probably thousands of reasons why your performance might be slower on this game. It is only when you already know someone has liver disease that you can interpret the results, which is not diagnostic at all.

3) Does improved performance on a game mean general improvement in cognitive function? While it is perhaps obvious that improving game performance through repeated play is unlikely to affect a patient’s liver, the same concern also applies to claims of general cognitive enhancement from playing games. This is known as the question of transference, and it has probably been the subject of the largest number of scientific studies to date related to the claims of the brain-training marketers. While a review of that literature is beyond the scope of this article, there is no question that the preponderance of the current evidence shows no transference at all. In other words, the only thing you learn by playing a game is to improve your performance on that game, or closely related games, with no measurable influence on your general cognitive abilities. That is not to say that there are no claims in the literature for such transference. However, not infrequently, those claims have been made by individuals with vested interests. Repeatedly, when scientists without commercial interests run the experiments again, they fail to show the same results. For a recent review of this issue see:

4) Are changes in human cognition with aging “cognitive decline”, and can or should we all be striving to have brains that work like 20-year-olds? This assumption is a core assertion and marketing ploy of most of the brain-training marketers. As already mentioned, there is a reason that Lumosity’s Google AdWords campaign is aimed at aging boomers, while their on-air marketing is primarily on PBS and NPR (mostly watched and listened to by boomers). Their marketing strategy is based on what I believe to be a societally induced (and generally profitable) cognitive insecurity in seniors, coupled with the claim that they can restore younger cognitive function through brain games; the core assumption is that brain performance peaks in your mid-20s and declines thereafter. This, however, is a false and misleading statement of the reality of human brain aging. Having spent the last 3 years in one of the top research institutes on aging in the U.S., I have come to believe that the fact that human cognition changes from birth to death most likely reflects the different roles and positions of individuals of different ages in human society. In a kind of reverse misunderstanding of the process of brain development and aging, one often hears these days that adolescent males have deficits in their frontal lobes that preclude them from accurately assessing the possible longer-term consequences of their short-term behavior. If this were not the case, I predict you would have many fewer young males signing up for the military, or assuming the long-term risks associated with a few moments of pleasure on Saturday night. Young males and females play a distinctly different role in society than do older individuals. Cognitive changes almost certainly reflect those roles. While, again, this is not the place for a long discourse on the subject, brain modeling has suggested that brains with less experience solve problems faster, but in a less sophisticated way, than do older brains with more experience.
In the parlance of brain modeling, less experienced brains solve problems using look-up tables, while older brains (with more experience and fewer neurons) solve problems through a process of generalization. Call it the Wisdom effect; the point is that brain computation is always trading off different kinds of processing: fast and less general versus slower and wiser. While there is, as mentioned, no evidence that playing games can restore ‘young brain function’, attempting to do so is also likely to be a very bad idea. Perhaps one should even consider whether the attempt might have deleterious psychological effects. For example, what are the potential psychological consequences to seniors of watching their performance on games requiring more youthful cognitive abilities deteriorate with time? Of course, Lumosity would seem to take care of that problem by assuring that performance continues to appear to improve as you play their games. Throughout marketing history, companies have made money by exploiting older people; this is just a new way to do it.

5) Do the measures of brain function used to justify and support neuro-marketing actually measure what is claimed? For a neurobiologist, this is perhaps the most disturbing aspect of the recent growth in Neuro-Marketing. It is remarkable how many marketing efforts, across a broad spectrum of products and keynote addresses, now at one point or another provide a bright red, yellow and green colored picture of the brain as proof of some claim being made. It is not unusual, for example, for presenters to show a brain image in which the brain playing the games lights up in all kinds of bright colors (best if in ‘frontal cortex’), while the one not playing games shows none at all. Having spent 10 years as a professor in a brain imaging institute, I can tell you that one of the many scientific problems with brain imaging is how remarkably easy it is to manipulate the outcome; however, a brain with no activity at all is simply dead, and a dead brain is certainly hard to engage in game play. There is a deeper and more fundamental problem with fMRI brain imaging, though: the technique does not actually measure the activity of neurons; instead it measures metabolic changes (the degree of tissue oxygenation, for example). Though a serious subject of research for many years, there is still no definitive connection between these metabolic measures and the actual behavior of the brain’s neurons. Therefore, when someone points at a colored brain and tells you that some part of the brain is being ‘activated’ by their game or product, there is actually no scientific basis for that claim. It is simply marketing. The fact that much of cognitive neuroscience itself relies on this sleight of hand is no excuse. I have suggested publicly for several years that the federal government should suspend funding for fMRI studies until we actually know what the signal means for brain function.
This situation is of even more concern when fMRI images are combined with other non-specific measures of brain function, like EEG. First, it is theoretically not possible to localize the brain sources of EEG recordings without assuming in advance where the signals are coming from. You can therefore understand the problem if you are using fMRI data, which has no known relationship to neuronal activity, as a mechanism to localize the sources of EEGs. The argument is completely circular. The fact that such a combination might make a fascinating illuminated display is useful for marketing, but not for brain science. The further claim now being made that you can use this combination of measures during game play to ‘fix’ broken brain circuits is technically ridiculous, in addition to discounting at least 40 years of research in computational neuroscience about how brains actually work.

6) Finally, one can ask: are there actually known, scientifically valid ways to stay cognitively ‘sharp’? Perhaps one of the most infuriating aspects of the endless Lumosity ads I keep being subjected to is that they almost all start by pointing out that you go to the gym to keep your body in shape, but aren’t doing anything for your brain. A great marketing ploy, but, in fact, the only scientifically proven way to maintain cognitive vigor is physical exercise. If Lumosity were the ‘neurobiologically based’ company it claims to be, they would know that. Instead, of course, their objective is to get as many seniors as possible to sign up for their brain training games, while, by the way, making it very hard to unsubscribe. A classic marketing ploy aimed at older people, and shameful.

If you are interested in a somewhat more playful discussion of these issues, you might find the session I organized at last year’s Games for Change conference of value:


The Structure of Scientific Revolutions: From Newton to Neuroscience

WARNING – A LONG READ, FULL OF PHILOSOPHY AND LARGELY UNREFERENCED. Originally written as an essay for a collection of thoughts by neurobiologists, but the editors decided not to publish it, so I decided to do so here. My apologies, but you were warned. 🙂

Many years ago, I was asked by Jack Cowan, Professor of Mathematics and Neurology at the University of Chicago, to give a lecture to his friends and colleagues in the Department of Physics on the current state of Computational Neuroscience. The title of that lecture is the same as the title of this article. When I arrived in his office 30 minutes prior to the start of the lecture, Jack, having just read my talk title, implored me to please not suggest that Neuroscience was “non-paradigmatic”, as he said this was the excuse used by his physics colleagues for not getting involved in neuroscience research. I told him that I couldn’t very well change the entire point of my talk with 30 minutes to go, but that I would try to fix the problem. After stating at the start of the talk that I believed, as apparently they did too, that neuroscience is as yet non-paradigmatic, I then asked who in the audience didn’t want to be Newton. On reflection later, I realized that, in fact, nobody did: their careers, perhaps especially including those of the Nobel Laureates among them, were entirely dependent on the fact that modern physics is ‘paradigmatic’.

Fifty years ago, it is very likely that the use of ‘paradigmatic’ and ‘non-paradigmatic’ in the first paragraph of this essay would already have lost the reader. However, it is now more than 50 years since Thomas Kuhn introduced this terminology to science (and, sadly, the mass media) in his now classic book The Structure of Scientific Revolutions (Kuhn, 1962). While Kuhn himself subsequently labored over the concept of a scientific paradigm (and the associated ‘paradigm shift’), his work has more than confirmed the sentiment expressed in the book’s first sentence:

“History, if viewed as a repository for more than anecdote or chronology, could produce a decisive transformation in the image of science by which we are now possessed.” (pg. 1, Kuhn, 1962).

In this essay, my intention is to use a “Kuhnian” framework to consider the current state of Neuroscience as a ‘Science’, as well as the consequences of that state for our likely short-term and longer-term progress toward understanding how the brain actually works. In essence, my view is that neuroscience today is in fact ‘non-paradigmatic’, reflecting a folkloric, descriptive enterprise more than a true science. Accordingly, I have serious concerns regarding current explanations and understandings of neural computation at all levels. While presenting this concern, I will also briefly attempt to demonstrate how the path to a paradigmatic structure for neuroscience can be informed by exploring the origins of modern physics, and, in particular, the accomplishments of physics in the 16th and 17th centuries, which also formed the basis for Kuhn’s own analysis.

Is Neuroscience Paradigmatic?

While a core component of the ‘Structures’ book, the question of what constitutes paradigmatic science troubled Kuhn long after he introduced the idea. As Kuhn himself stated clearly in the postscript of a subsequent edition: “The paradigm as shared example is the central element of what I now take to be the most novel and least understood aspect of this book” (pg. 186, Kuhn, 1962). As a number of other authors have pointed out, Kuhn’s use of ‘paradigm’ in “Structures” was quite inconsistent, and he subsequently published several papers in an effort to clarify the concept (“Second Thoughts on Paradigms”, Kuhn). This confusion over what constitutes paradigmatic science obviously also confuses the definition of non-paradigmatic science. In some instances Kuhn seems to suggest, for example, that physics reached paradigmatic status in the 16th and 17th centuries, while in others he suggests that both Aristotle’s Physica and Ptolemy’s Almagest provided a paradigmatic base for physics 2000 years earlier (pg. 10, Kuhn, 1962). It will be my claim that most theory in neuroscience today resembles Ptolemaic-style argumentation which, because of the complexity of biological systems, has failed to systematically organize the field.

As quoted above, a key indicator for Kuhn of the existence of paradigmatic science is the extent to which a core set of beliefs, values and approaches is shared (my emphasis) among the community of scientists involved. Quoting again from Kuhn, he believed that paradigmatic science exists when a community of scientists adopts “accepted examples of actual scientific practice—examples which include law, theory, application, and instrumentation together (my emphasis)—provid(ing) models from which spring particular coherent traditions of scientific research.” (pg. 10, Kuhn, 1962). While it is important to note that Kuhn himself acknowledged that it is possible to perform scientific experiments in a non-paradigmatic context, it is only when the community as a whole shares a common paradigm that science proceeds in an orderly way (Kuhn’s “normal science”), with periodic changes in paradigms (Kuhn’s “paradigm shifts”).

As just stated, Kuhn regarded paradigmatic science as constituting a combination of “laws, theories, applications and instrumentation” linked together to provide an accepted community model for the coherent advancement of the field. While, as a physicist, Kuhn principally used the history of physics for his examples, he clearly believed that this analysis likely applied to all scientific endeavors. While it has often been suggested that the strong emphasis on theory and laws emerging from an analysis based on the history of physics might not apply in as deeply experimental a field as biology, I disagree with this sentiment strongly. In my view it is the mathematical basis for theories and laws that explicitly establishes the necessary substructure for paradigmatic science. There are many reasons for this, but one is that mathematics provides a rigorous set of definitions of terms, which can be used as the basis for communication as well as collaboration. While almost all modern biology seminars and presentations start with a ‘box and arrow’ diagram intended to provide a ‘theoretical’ (note the quotes) context for the results to be presented, these diagrams almost never represent actual mathematical entities. Similarly, articles written about ideas for neural function typically rely on these kinds of box and arrow diagrams with little or no mathematical definition, or underlying mathematical model. In actual practice, this often means that different practitioners in a field, while using the same words, often mean quite different things. If communication in a science depends only on ideas presented without mathematical definitions, then the ideas are too poorly defined to really be tested. In this sense, I am happy to align myself with Karl Popper, who believed that hypotheses are only “scientific” if they are falsifiable.

This is not to say that there are no mathematically defined theories and models in neuroscience; there are. However, I would claim that almost none of those theories has risen to the level of community acceptance, or even general understanding, necessary to provide a basis for paradigmatic Neuroscience. At the largest scale, the vast majority of published electrophysiological and biophysical experimental studies make no mention of modeling results, and only passing, inconsequential and non-specific reference to brain theories. Typically these theories are mentioned only in the first paragraph of the introduction and the last paragraph of the conclusion. It is exceedingly rare for a theory or model to influence a paper’s methods, or to help organize its results. While on the surface this would appear to be less the case for so-called “Cognitive Neuroscience”, its theories and models, by intent, largely stand independent of the actual machinery of the brain itself, usually informed by some form of behavioral analysis. Behavioral analysis of the cognitive type, however, can be very misleading with respect to the actual structure and performance of the underlying neuronal machinery (see Vehicles, by Valentino Braitenberg). Ultimately, how the brain works depends fundamentally on how information moves through its neurons and networks. In recent times, cognitive neuroscience has claimed to have access to that movement of information through the use of brain imaging techniques. However, it is entirely unclear how the signals measured in brain imaging relate to the actual underlying behavior of neurons and their networks. Further, the design of brain imaging studies is especially subject to the built-in assumptions inherent in the theories themselves. In a section to follow I will discuss the application of Ptolemaic-style ‘curve fitting’ modeling to neuroscience, but I regard most cognitive theories to be of that type.

Even within the small subset of neuroscience referred to as Computational Neuroscience, almost no common set of models or laws has emerged that is accepted, or even referred to, by all. There are exceptions to this general rule, like, for example, the now 60-year-old Hodgkin/Huxley model of the generation of the action potential. In fact, a conference I recently helped organize to celebrate the 60th anniversary of the publication of this model was, I believe, the first meeting in neuroscience ever organized around a mathematical model. However, the Hodgkin/Huxley model addresses neuronal behavior at the biophysical level, not explicitly at the level of how the brain actually organizes and coordinates behavior, which is the level of interest of most neurobiologists.

Some have explained the lack of a coordinated or commonly accepted paradigmatic model by arguing that the structure of neurons and networks is not yet well enough understood. However, in a domain I know very well, the study of the cerebellum, we have known the anatomical structure of its neurons and networks for more than 100 years. Yet no standard set of rules or laws has emerged for what actually constitutes, for example, a Purkinje cell, the primary neuron in the cerebellum, even though this cell has been modeled for more than 60 years (20 Years of Computational Neuroscience, Springer, Bower (ed.), 2013). I would argue that, because of this lack of accepted ‘laws’, most of the Purkinje cell models published in the literature bear no actual resemblance to a real Purkinje cell. Furthermore, considering the computational function of the cerebellum as a whole, most of the dominant theories explicitly require neuronal behavior contrary to actual experimental evidence. Perhaps for this reason, as with neuroscience as a whole, the vast majority of experimental papers concerning the cerebellum make no mention of modeling, and only passing reference, of the type already discussed, to cerebellar “theory”.

This situation is not isolated to the cerebellum.  While the hippocampus and visual cortex are probably the two most modeled structures in the mammalian brain, there is currently no commonly accepted model for either structure, and no effort to make one.  Even fundamental questions, like what the appropriate measures of neuronal behavior are, or the right resolution at which to consider the synchrony of neuronal activity, lack a common operating definition.  Instead, Neuroscience is dominated by the kind of structure Kuhn described as existing prior to Newton’s publication of Opticks:  “No period between remote antiquity and the end of the seventeenth century exhibited a single generally accepted view about the nature of light. Instead there were a number of competing schools and subschools, most of them espousing one variant or another of Epicurean, Aristotelian, or Platonic theory.” (pg. 12, Kuhn, 1962).

Reflecting this non-paradigmatic structure, the vast majority of Neuroscience publications remain fundamentally descriptive in nature, while the field itself actively resists the development of a common and articulated underlying theoretical or law-based structure.  The fact that Computational Neuroscience is regarded as a sub-specialty of neuroscience is itself a clear indication of the non-paradigmatic nature of Neuroscience as a whole.  Without models and theories to provide common definitions, a coherent scientific tradition of research is not possible.  Neuroscience then defaults to the lowest common denominator of human organization: a folkloric, and effectively religious, style of enterprise, based on mythological storytelling and driven by the egos, opinions, and predetermined worldviews of the individuals involved.  Without an underlying mathematical structure, the field can easily incorporate or ignore conflicting data, or not even recognize when data conflicts exist.  As Kuhn recognized, the ability to recognize such conflicts is a key feature of paradigmatic science.

How then do we proceed?

It is clear from the historical record that the long slow evolution of planetary science provided the foundation for the origins of modern physics, as it also strongly influenced the development of Kuhn’s thesis.  It is my view that this history can also provide an important guide to establishing a paradigmatic base for neuroscience.  As has already been mentioned, however, Kuhn’s own consideration of the history of paradigms in the context of planetary science is somewhat confused.  On the one hand, he clearly recognizes that the work of the 16th and 17th century physicists fundamentally changed the underlying structure and behavior of physics and physicists, more than would be expected from a simple “paradigm shift”.  On the other hand, in the first chapter of ‘Structures’ he lists both Ptolemy’s Almagest and Newton’s Principia as serving “for a time implicitly to define the legitimate problems and methods of a research field for succeeding generations of practitioners” (pg. 10, Kuhn 1962).

For Kuhn, a critical feature of a theory’s success in supporting paradigmatic science was its ability to align scientists into a common model of practice.  Kuhn believed that for theories to provide this kind of organizing (paradigmatic) structure they should be “accurate in their predictions, consistent, broad in scope, present phenomena in an orderly and coherent way, and be fruitful in suggesting new phenomena or relationships between phenomena” (Kuhn, 1962).  In fact, however, the Ptolemaic model only partially met Kuhn’s description.  While the model was broad in scope, consistent, presented phenomena in an orderly and coherent way, and predicted the data, it did not actually suggest new phenomena or relationships between phenomena.  Indeed, one likely reason for its success was that it reinforced the assumed relationships between the phenomena it sought to describe: in this case, principally the assertion (and apparent observational fact) that the heavens revolved around the earth.  The engagement of generations of astronomers in the further progression of the Ptolemaic model largely involved adding to its complexity, without changing its core assumptions, to account for more accurate experimental data.  Accordingly, while the success of the Ptolemaic model was clearly due to its ability to predict the movement of the planets, and to its relative mathematical breadth and simplicity, the fact that it fit the dominant worldview didn’t hurt.

While not described this way in children’s textbooks, it was actually the evolved complexity of the Ptolemaic model, and especially the fact that the earth was no longer actually at the center of its dynamics, that led Copernicus to propose his model as an alternative.  In fact, however, the sun-centric model actually predicted the positions of the planets less accurately than the Ptolemaic model did for several hundred years, and the model had serious problems with consistency, breadth of scope, and the presentation of data in an orderly and coherent way.  Improving the accuracy of the Copernican model required a great deal more fundamental work, as well as the introduction of much more complicated mathematical relationships.  So why then did scientists like Galileo, Kepler, and Newton ‘gravitate’ towards this model?  While there are likely religious and sociological explanations, scientifically it was clearly the fact that this theory suggested “new phenomena or relationships between phenomena”, as well as a significant challenge and opportunity for new science, that drove its adoption and development.  The Ptolemaic model could not, by its nature, tell modelers or astronomers anything they did not already know or assume to be true about the structure of the solar system, or anything about the mechanical forces responsible for its structure.

While the work of Kepler and others on extending the Copernican model is historically interesting, it is actually the work of Newton on celestial dynamics that provided the key step in the transition to modern physics.  Interestingly, that work didn’t initially involve planetary movement, but instead an analysis of the motion of the moon with respect to the earth.  The eventual emergence of Newtonian mechanics, as manifest in the Principia, almost certainly originated in a mathematical model he constructed as a young man on his family’s farm, to help him understand why the moon moved the way it did around the earth. While documentation is sparse, Newton apparently generalized a conundrum involving the behavior of a ball swinging around at the end of a string to the relationship between the earth and the movement of the moon.  Either inventing or borrowing the calculus, he calculated the forces involved, discovering what appeared to be an inverse square relationship between that force and the distance between the earth and the moon.  While it is not possible to recount the full story here, the estimate was in fact not exactly the inverse square, and so Newton put the work aside for many years, until he was informed that someone else was about to publish the inverse square relationship.  Recalculating with a better estimate for the distance between the earth and moon, he returned to the subject of celestial mechanics, eventually producing the Principia.  In a tour de force, he used his laws of mechanics to derive Kepler’s Laws of planetary motion, even providing an explanation for why the moon failed to obey them exactly.  Again in Kuhn’s words: “No other work known to the history of science has simultaneously permitted so large an increase in both the scope and precision of research.” (pg. 30, Kuhn, 1962).
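Newton’s “moon test” can be reconstructed in a few lines, using modern values rather than the figures actually available to Newton. If the same gravity that accelerates a falling apple at roughly 9.8 m/s² also holds the moon in its orbit, and if that force weakens as the inverse square of distance, then the moon’s centripetal acceleration, at roughly 60 earth radii, should be smaller than g by a factor of 60²:

```latex
a_{moon} = \frac{4\pi^2 r}{T^2}
         = \frac{4\pi^2 \times 3.84\times 10^{8}\,\mathrm{m}}
                {\left(2.36\times 10^{6}\,\mathrm{s}\right)^2}
         \approx 2.7\times 10^{-3}\,\mathrm{m/s^2},
\qquad
\frac{g}{60^2} = \frac{9.8\,\mathrm{m/s^2}}{3600}
         \approx 2.7\times 10^{-3}\,\mathrm{m/s^2}
```

Here r is the moon’s mean orbital radius and T its sidereal period of about 27.3 days. With modern numbers the two accelerations agree to within about a percent; with the poorer estimate of the earth-moon distance Newton first used, they did not, which is consistent with his setting the work aside.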

So what about Newton and Neuroscience? 

So how, then, does this brief history of planetary science relate to, and potentially guide, modern neuroscience?  First, I would assert that most neurobiological models today are in fact Ptolemaic in nature: constructed to demonstrate a particular preexisting idea about how the brain works, and largely designed to convince others of their plausibility.   Like Ptolemy’s, the “predictions” made by most of these models are actually most often “postdictions” of phenomena already known to occur.  Like the Ptolemaic model, these models usually tout their relative mathematical simplicity as a virtue, and often have the associated property that they can be readily adjusted to account for inconsistent data.  The most important way in which they reflect Ptolemaic modeling, however, is that they do not, as their first priority, represent the actual physical structure of the nervous system itself.  Instead, they usually pick and choose convenient structural details to include, which, among other things, makes them very difficult to test or falsify experimentally.  Like Ptolemy’s model, they principally rely on their ability to ‘postdict’ experimental data as evidence of their validity.  Perhaps most importantly, however, as with Ptolemy, the fact that they are not built first and foremost to reflect the actual physical structure of the brain means that it is much less likely they will identify new and unexpected relationships.  Often, like Ptolemaic models, their assumptions represent the worldview of those who built them, as well as of the subset of neuroscientists who share the same preconceptions.  Unlike Ptolemy’s model, however, these types of models have failed to provide a strong enough attraction to the field as a whole to provide a basis for paradigmatic science to emerge.  The reasons for this are complex, but probably reflect the general disconnect from, and even suspicion of, modeling that Ptolemaic modeling actually produces in most experimentalists.
Recently, in a conversation with several computational neurobiologists of the Ptolemaic variety who were complaining that experimentalists paid no attention to their models, I repeated a statement I have heard attributed to Kant who, it was claimed, when asked why he didn’t even bother to read Voltaire, stated that he knew the conclusions and therefore the assumptions had to be wrong.  In sympathy for Ptolemaic modelers in neuroscience, they also suffer from the fact that, unlike Ptolemy, who was modeling the motion of planets, it is far from clear what the right measurements of neuronal behavior are. Unfortunately, figuring that out will very likely require that neuroscience become paradigmatic first.  It is clear, however, that for whatever reason, Ptolemaic models have failed to have the force necessary to organize the field.

Like planetary science before us, I believe that paradigmatic neuroscience will only emerge when models are built to reflect the actual physical structure of the brain first, in the way that Newton’s model reflected the physical relationship between the moon and the earth.  These models must be built without first assuming how the system works.  By analogy, Newton could have built a model of the moon/earth system that assumed Descartes’ vortex forces were at work.  While he apparently did attribute the difference between his early calculations and an exact inverse square relationship to a vortex force, had he assumed that such a force governed the dynamics to begin with, he never would have discovered the inverse square relationship, and as a consequence gravitational attraction.  Similarly, and perhaps especially in a structure as complex as the nervous system, unless we use modeling tools that put us in a position to learn its fundamental relationships from the structure itself, I believe it is unlikely we will ever generate the kinds of structurally testable and unexpected predictions necessary to organize neuroscientists into a paradigmatic enterprise.  In some sense, there is already an existence proof for this assertion.  While I believe it is largely misunderstood, especially by those who build Ptolemaic models in neuroscience, the process that Hodgkin and Huxley used to develop their model of the action potential was Newton-like.  While the model they eventually produced had a more abstract, curve-fitting form, it is very clear from their own description of their process that they originally intended to make a model based on the actual physical structure of the neuronal membrane.  The fact that they did not was a consequence of their awareness that they did not yet know enough about that physical structure to do so.
This realization, in turn, arose when early efforts to model the data made clear that their initial assumptions about mechanism were wrong.  Nevertheless, their strong predisposition towards a structurally accurate model meant that the structure of the model they did produce made specific and testable predictions.  It is on the basis of the success of those predictions, as well as the ability of the model to replicate the shape and structure of the action potential, that neuroscience as a whole adopted the model as a paradigm for the generation of the action potential.


While the Hodgkin/Huxley model is biophysical in nature, and therefore does not directly address how the brain’s neurons and networks process information, in an almost paradigmatic way their model has provided the structural basis for a new type of model of the Newtonian type, focused at the computational level.  Usually described as ‘realistic’ or ‘biophysically accurate’ models, they still represent a minority of published Neuroscience models, although their use is slowly growing.  Because the structure of the nervous system is complex, these models are complex, can only be run as numerical simulations, and accordingly depend on the continued exponential growth in available computing power.  Perhaps more importantly, a few of these models have started to emerge as true ‘community models’, with shared use and development by multiple investigators (Bower, 2013).  There is even now a database for these types of models, allowing their transfer between laboratories.
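To make concrete what “running a model as a numerical simulation” means at even the simplest level, here is a minimal sketch, in Python, of the space-clamped Hodgkin/Huxley equations integrated with forward Euler. The parameter values are the standard textbook ones for the squid giant axon; real modeling work would use a dedicated simulator and a better integration scheme, so treat this only as an illustration of the kind of computation involved.

```python
import numpy as np

# Hodgkin-Huxley squid-axon parameters (standard textbook values)
C_m = 1.0                            # membrane capacitance, uF/cm^2
g_Na, g_K, g_L = 120.0, 36.0, 0.3    # maximal conductances, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.4  # reversal potentials, mV

# Voltage-dependent gating rate functions (1/ms); the removable
# singularities at V = -40 and -55 mV are ignored in this sketch.
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

def simulate(I_ext=10.0, t_max=50.0, dt=0.01):
    """Integrate the space-clamped HH equations with forward Euler.

    I_ext: injected current (uA/cm^2); t_max, dt in ms.
    Returns the membrane-potential trace in mV.
    """
    V = -65.0                   # resting potential, mV
    m, h, n = 0.05, 0.6, 0.32   # approximate resting gate values
    steps = int(t_max / dt)
    trace = np.empty(steps)
    for i in range(steps):
        I_ion = (g_Na * m**3 * h * (V - E_Na)
                 + g_K * n**4 * (V - E_K)
                 + g_L * (V - E_L))
        V += dt * (I_ext - I_ion) / C_m
        m += dt * (alpha_m(V) * (1.0 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1.0 - h) - beta_h(V) * h)
        n += dt * (alpha_n(V) * (1.0 - n) - beta_n(V) * n)
        trace[i] = V
    return trace

trace = simulate()
print(f"peak membrane potential: {trace.max():.1f} mV")  # well above 0 mV: the model spikes
```

Even this toy version shows why realistic models demand computing power: four coupled nonlinear equations must be stepped thousands of times for a few tens of milliseconds of a single patch of membrane, before any anatomical structure is added.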

While these models exist, the majority of Computational Neurobiologists are suspicious of their complexity, and the majority of experimental neurobiologists do not have the skills or training necessary to appreciate their value or potential importance for experimental studies.  In fact, the overall non-paradigmatic nature of Neuroscience provides a general resistance to the use of these models as an organizing principle.  Like the Copernican model before them, their results often challenge the ad hoc assumptions of the Ptolemaic models.  Yet I believe the history of science makes it clear that it is only through these types of realistic models that Neuroscience can transition into a paradigmatic science.  For this to happen, however, several changes and developments will be necessary.  Many, interestingly enough, parallel developments in physics in the 17th century:

–       Like Newton’s model, understanding these new, complex, realistic biological models will also require the invention of new mathematics and analysis tools, as well as new tools for visualization.  While several modeling systems have been built to provide a foundation for this work, the lack of paradigmatic organization in neuroscience as a whole means that many would rather ‘invent their own’, hindering the formation of an open and inclusive modeling community.

–       Neuroscience must not too readily adopt the structure of modern-day physics.  By this I mean that it is only once a paradigm is established that there is a base from which to coordinate theory and experiment as separate enterprises.  Like Newton himself, neuroscience should be training students facile in both theoretical and experimental neuroscience.  Unfortunately, most computational neuroscience training programs are designed to attract a cadre of theorists trained in physics, without a deep understanding of experimental neuroscience.  And many of the ‘tricks’ of physics (scale independence, averaging) are unlikely to apply to biology.

–       A key component in the development of physics, and of science in general, in the 17th century was the invention of the scientific journal.  However, the traditional scientific journal structure is not in the least amenable to the description, let alone the reuse, of models as complex as neurobiologically realistic models.  Accordingly, new methods of publication need to be invented.  Fortunately, the Internet provides an ideal opportunity to do so.

–       The community as a whole must seek and support the construction and understanding of community models.  The development of community models must actually be promoted and encouraged by modelers themselves, many of whom would also prefer to build their own.  In principle, the capacity for understanding and sharing community models should be a central feature both of a new publication system and of neurobiological education.

–       In fact, perhaps most importantly, as Kuhn himself pointed out, the methods used to educate the next generation of neuroscientists are a key indication of the presence of a paradigmatic science: “The study of (these) paradigms… is what mainly prepares the student for membership in the particular scientific community with which he (sic) will later practice” (pg. 10, Kuhn 1962).  Instead of studying a common set of “paradigms”, our graduate students are subjected to a set of descriptive biological facts and purported mechanisms devoid of any underlying mathematical model or description. I have recently proposed that it would serve the field in its transition if the usual neurobiology survey course were replaced by a semester-long, or even year-long, study of the origins and structure of the Hodgkin/Huxley model, as this represents as close to an established paradigm as the field has, for at least one level of neuroscience. Further, as I have stated, the historical development of the model in my view reflects a fundamentally Newtonian method.  In fact, however, the majority of our introductory textbooks do not even include the model’s relatively simple core second-order differential equation.  Furthermore, at present, admission standards for most graduate programs in neuroscience do not require the mathematical skills necessary to understand or build mathematical models of any sort.  Few graduate programs currently require, or even offer, courses in this or any other kind of modeling.

–       Finally, for all these reasons, extreme caution is warranted in considering small or large-scale theories of brain function.  Those theories divorced from the actual physical structure of the brain’s neurons (I include almost all cognitive theories in this category) should be particularly suspect.  Further, it is critical for the field as a whole to recognize and understand the difference between Newtonian and Ptolemaic style modeling.  In my view, until that happens, neuroscience will remain fundamentally folkloric: dependent on what is, in effect, storytelling, without a clear and organized direction forward.
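For reference, the Hodgkin/Huxley equation mentioned in the bullet on education above, written here in the propagated-action-potential form of the original 1952 paper (modern notation and sign conventions, so details may differ slightly from the original): assuming a constant conduction velocity θ reduces the cable equation to a second-order ordinary differential equation in time,

```latex
\frac{a}{2R\theta^{2}}\frac{d^{2}V}{dt^{2}}
  = C_{M}\frac{dV}{dt}
  + \bar g_{Na}\, m^{3} h\,(V - E_{Na})
  + \bar g_{K}\, n^{4}\,(V - E_{K})
  + \bar g_{L}\,(V - E_{L})
```

where a is the axon radius, R the resistivity of the axoplasm, and each gating variable x ∈ {m, h, n} obeys a first-order kinetic equation of the form dx/dt = α_x(V)(1 − x) − β_x(V)x.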





50 Years Ago

[image: Corfu, New York]

50 years ago, in 1963, I was a child of 9 living in a small conservative town in Western New York State.  That town, of only a few hundred people, was strictly divided by the general store and gas station into a Catholic side to the west and a Protestant side to the east.  In the dead center of town was a ramshackle house in which the only African American family lived.  My father was the minister of the Presbyterian Church, and I was the preacher’s kid.  My impression at the time was that my parents were the only people, certainly on the east side and perhaps even on the west side of town, who had voted for John F. Kennedy in 1960.  In late October 1963, I remember, even as a 9-year-old, the adverse reaction when I told my fellow 4th grade students that my parents had gone all the way to Amherst, Massachusetts to see President Kennedy give a speech in honor of Robert Frost, one of my father’s favorite poets.

On Thanksgiving week 1963, my father had, as usual, prepared his sermon early in the week for Sunday’s holiday worship service.  Because my father kept careful notes of all his sermons over his entire career, I know that his original sermon was focused on his concern that Thanksgiving had become, in effect, a way to brag about our wealth and blessings, especially as a country, and especially in the context of the cold war.  Not unusually, this was not a sermon that his overly patriotic and conservative parishioners were likely to want to hear, and I would likely hear about that in school the following Monday.   By Thursday of Thanksgiving week, the Order of Service had been printed and everything was ready for worship on Sunday, November 24th.  On November 22nd, 50 years ago, in Dallas, Texas, the world changed.


The following is, word for word, the commentary with which my father started his church service on that Sunday morning, November 24th, two days after the assassination of the President.  I reproduce it here in honor of my father and mother, their courage and conviction, and also, of course, in honor of President Kennedy himself.  Ironically, in my view, my father’s thoughts and concerns resonate perhaps even more today than they did in November 1963, when evoked by a great national tragedy.

On the President’s Death

Reverend M. James Bower

Delivered in the Corfu Presbyterian Church

November 24th, 1963.


“It is both fitting and proper to depart from the regular order of service given the sad and tragic events of the last several days.

I can certainly add nothing to the vast number of eulogies being offered by all of the world’s people.  I can say to you that, with millions of fellow Americans, I am deeply saddened.  My wife and I were greatly impressed when a month ago we were standing less than twenty feet from our President.   We were especially moved by his youthfulness, his energy, and the sincerity of his words.  This was no political occasion but instead the President spoke with learned words describing the place of art and artists in the American way of life.  Especially poignant were his words about the great American poet Robert Frost, whom he had come to honor.   And now this man is cut off in his youth, and the light of probably the best educated mind that has ever served in the White House has been put out.

I know very well that the sentiments of the majority of people in our community are not with the President’s party, and many have been strongly critical of his programs.  It is our inherent right to disagree with those in high office.  Yet, I am sure that there is not one person here or in our community who does not have, somewhere in their heart, a bit of respect or even admiration for such a man who had attained so much at such a young age; who had served his country in the heat of battle as well as in the Nation’s highest office; who had stood up to the single most powerful man in the world and come out seemingly in first place.

It is this man with whom many have disagreed, but yet whom we respected that we mourn today.

The sadness, the cold-bloodedness of the act that removed him from our midst is of course the most tragic aspect of these events.  It would be so much easier to understand if illness or some other accident had claimed his life.  But the fact that one man has defied the will of 175 million people is a tragedy of greatest import.

And yet, this, it seems to me, is the risk that freedom has to take.  In many other countries of the world, this man would have been locked up, even executed long ago.  His avowal of ideas contrary to our own was open and flagrant.  But his right to criticize is protected by law, as is yours and mine – and it could not be taken away from him without at the same time demanding that you and I be silent.

In a democracy, such as ours, where the people’s freedom is the greatest protection and the leaders of the government have the greatest limitations, there is always the danger that people, in groups or individually, will abuse the very freedom that is theirs.  Our system is not wrong because a tragic abuse of freedom occurs.  Our system is right because of the page after page of responsible acts for every one of irresponsibility.  Our system is right because the son of an immigrant family, of a minority religion, can reach the highest office in the land.  Our system is right because the wound caused in our national fabric will eventually heal and only a mournful sorrow will remain.  Our system is precisely right because even this despicable slayer will be treated according to the due process of the law.

It is not a time for drastic changes in our Nation’s life and habits.  It is a time for steadfastness and evenness.

It is a time to pray for God’s guidance for our country and for our new president.

It is a time to ask God’s blessing and comfort upon the late president and his family.

It is a time to pray that our own lives be filled with constant responsibility for our nation and our community.

It is even a time to give thanks.

Now, I would ask that this congregation rise and that we stand for a minute with lowered heads in memory of John Fitzgerald Kennedy.

And now I would ask that you pray silently for Mrs. Kennedy, her children and all the members of this bereaved family.

And finally, may I lead you in a prayer for our country and our new President.”




[image: Black Elk – red face]


As you can see from the first posting on this blog, its original impetus was to post, and to some extent explain, last year’s ‘Rant’ at the Games for Change Annual Festival in New York City.  Now, one year later, I find myself once again explaining a rant.

This time the explanation doesn’t have so much to do with the rant itself (see previous post) as with my appearance doing the rant (see photo above).

From the start, when considering showing up as the Ogallala Lakota Sioux Medicine Man Black Elk at a festival of activists deeply committed to all kinds of social issues, I knew that I had to do my homework.  That included, for example, making certain that the clothing I wore was authentic: actually made by Indians and purchased from Indians.  It also meant studying the history of Black Elk, his vision, and his times.  Years ago, I had read both Black Elk Speaks and Bury My Heart at Wounded Knee, both of which I re-read for the rant. I also spent additional time and effort reading academic ethnographic sources, including several that specifically evaluated the origins and accuracy of John Neihardt’s account of his interviews with Black Elk, recounted in Black Elk Speaks.

On finishing the rant, however, I was approached by two conference attendees apparently associated with Native American museums in New York and Washington D.C.  They stated that, while they didn’t want to appear impolite, they were deeply offended on behalf of Native Americans that I had shown up in ‘red face’.

I was frankly flabbergasted.  Why would anyone think that I, or anyone else, would be so foolish as to appear at Games For Change in red-face, blackface, or any other kind of face?  Furthermore, although squeezed into 4 minutes, I had explicitly said that Indians danced the ghost dance “with red paint”.  Although, again, time limitations didn’t allow me to go into the point, one of the reasons that the initially skeptical Black Elk finally accepted the ghost dance ritual was that it shared with his own vision the painting of the body red. Although the photograph is in black and white, in the famous picture taken by John Neihardt of Black Elk attempting one more time to speak to his grandfathers from the most sacred point in the Black Hills (reproduced in the previous post), Black Elk was apparently wearing long red underwear (the best he could do at the time).  I also made it clear several times in the rant that a central focus and image of Black Elk’s original vision was the distinction between the “red road” his nation would have to be on to survive and the “black road” that led to ruin.

Why had these well-meaning defenders of Native Americans not paid attention?  Further, and perhaps more surprisingly, why didn’t they already know about the symbolism inherent in being painted red?  (I will note that the paint on my face also included black lightning bolts.)

Later that afternoon, I was again called out for my depiction of Black Elk, this time with the additional concern as to whether I had sought permission to use Black Elk’s words.  While, again, the 4-minute limitation didn’t allow elaboration, I had explicitly attributed to Black Elk the view that his vision was not only for his people, but also “for all peoples, even including you the Wasichas” (non-Indian people).  It turns out this also derives from historical fact:

In August of 1930, John Neihardt, who had written several previous books on the Plains Indians, was writing a new book on the ghost dance movement and wanted to interview Black Elk.  Arriving on the Pine Ridge Sioux reservation, he was told that Black Elk would likely not want to talk to him, as he had just the previous day declined an interview with someone else. Neihardt persisted, however, and after he had sat quietly and respectfully on the ground near Black Elk for some time, Black Elk reportedly said: “As I sit here, I can feel in this man beside me a strong desire to know the things of the Other World.  He has been sent to learn what I know, and I will teach him.”  Later, Flying Hawk, the translator, apparently remarked, “it was kind of funny the way the old man seemed to know you were coming.”   In several interviews over the next year, Neihardt and his children transcribed Black Elk’s story, which then became the basis for Black Elk Speaks, and, through that book, the basis for my own rant.

Of course, I make no claim whatsoever that my small rant at Games for Change has any significance in the larger Black Elk story.  The fact is, however, that Black Elk clearly intended and desired that his vision and his story be recorded and continue to influence history.  It is also likely that more Wasichas have read Black Elk Speaks than any other Indian account of the history of that time.  Search Twitter for “Black Elk” and you will see the continuing influence.   Had I gone on stage to represent Crazy Horse, whom Black Elk described as “the last big chief and then it’s over”, and whose one act of willingness to talk to Wasichas was met with trickery leading to his murder, I think one could rightfully criticize.  But I would like to believe that Black Elk himself would have been at least in some small part pleased by my small and insignificant effort to translate “what he knew”, and perhaps more importantly what he had learned about ghost dancing, to a more modern time.

It turns out that I do have some reason to believe that this could have been the case.  Rather remarkably, given the ethnic composition of Games For Change (on that, another rant), just after my first discussion about “red-face”, and still somewhat befuddled, I walked around the corner to be greeted by a woman who approached me saying, “I am from a Pueblo in New Mexico and am part Indian, and I wanted to let you know that I thought your presentation was beautiful and a tribute to Indian people.”  I am not a mystic, am not religious in the least, and strive hard to distinguish correlation from causality.  I also have no doubt that there are Indians, in addition to well-meaning White People, who might object to my efforts.  However, for just a moment I felt a rather strange feeling. The thunderclouds that were just then forming outside didn’t help, I have to say.  At the very least, the overall experience gave me perhaps a deeper sense of the wonderment and power that Black Elk and his people must have felt in the world.

Returning to Games for Change, and more generally to the question of the enterprise of restructuring learning and society that the festival represents: what is the lesson of these events?  For me, the bottom line is that what we are trying to do is hard, and requires a deep, not superficial, consideration of the issues and their consequences. It is a time for deep thinking and deep seriousness, as we are trying to do something extremely important.  We have to shed our own biases, preconceptions, and knee-jerk reactions if we have any chance of succeeding.  Everyone must listen and try to understand.  Superficial reactions or superficial treatments, perhaps especially in learning games, won’t get the job done.  Learning, in the end, isn’t about being handed meaning, but about uncovering meaning for yourself.  I deeply believe that Black Elk thought that.  I do too.


Black Elk Speaks

Another year, another Games For Change rant – perhaps also somewhat opaque.  But as Black Elk himself said, as reported by John G. Neihardt in his book Black Elk Speaks:  “When the part of me that talks would try to make words for meaning it would be like fog and get away from me.”

Here then is the text for this year’s Games For Change rant:

black elk in black hills

Hey a a hey, Hey a a hey, Hey a a hey, Hey a a hey


“Revealing this, they walk

A sacred herb – revealing it, they walk

The sacred life of the bison – revealing it, they walk

A sacred eagle feather – revealing it, they walk

The eagle and the bison – like relatives they walk”


I am happy to be with you here in the moon of the fat making

Here on the island of many hills, or Manna-hata, named by my brothers the Lenape of the Algonquin nations.

When I was in my 9th year, now many many moons ago, I had a great sacred vision, not only for my people, the Oglala Lakota Sioux, but also for all peoples, even including you, the Wasichas.

My vision came at a time of the start of great change for the Lakota nation, but at a time when the grass was still tall and our brothers the bison were many, and all the footed, winged and rooted beings were living in harmony.

In my vision, six grandfathers told me of 4 ascents that were coming, each of greater difficulty for my people.  They also told me of a red road north and south leading to prosperity and happiness, and a black road east and west, that was full of war and unhappiness.

Though I tried for many moons to help my people find the red road, it is hard to follow one great vision in this world of darkness and many changing shadows.  Among those shadows men get lost.

I am here to tell you when I got lost because I fear you may be lost in the same way.

In the year of the blue man, who dried up all our rivers and grass, my people were starving, as the promises of the Wasicha chiefs were empty and all the buffalo were gone.  At the height of our desperation, we heard of a new powerful vision coming from a Paiute medicine man named “Wovoka”: that a new world would soon start.  In this vision, if Indians danced the ghost dance with red paint, they could get on this new world and the Wasichas couldn’t, and would therefore disappear.

When I first heard of it, I thought it was foolish talk someone had started somewhere.  I thought it was despair that made people believe, just as a man who is starving might dream of plenty to eat.  But when I went myself to ghost dance, I came to believe that my vision and Wovoka’s vision were the same.

I started making ghost shirts, which I believed would make our warriors not feel the white man’s bullets.

The last ghost dance for my people was at Wounded Knee – where the ghost shirts did not protect our warriors and the Wasichas’ wagon guns murdered many of our people, including women, old men and children.  I did not know then how much was ended.  At Wounded Knee – my nation’s great hoop was broken and scattered.

So I am here today to warn you about the false promise of ghost shirts.  This was my great mistake, because I lost sight of my great vision and focused on little visions instead.   In times of great change, it is easy to think that a ghost shirt will protect old institutions from the coming ascents.  But digital ghost shirts will not protect you from the revolution happening all around you, just as my ghost shirts did not protect our warriors or our nation.

However, in this case, I do not understand what the digital ghost shirts are intending to protect.  For my people, change destroyed a people and many beings in harmony.  For you, digital ghost shirts are covering a system that is not in harmony, and that has certainly failed to uplift the children of my nation and many of the children of your own as well.

Square boxes, like the ones that the Wasichas make us use, and the ones that we make for our children, are a bad way to live.  In these square boxes our power is gone and we die.  Because there is no power in a square.  The power of the world works in circles and everything tries to be round.  The sky is round, the earth is round, the wind in its greatest power swirls around.  The sun goes up and comes down in an arc; the life of all peoples is a circle, from childhood to childhood.

And birds, who have the same religion as we, and who the great spirit tells us to use as a guide to raise our children, raise their children in round nests.  Our tepees are round like the nests of birds.  The tepees in our villages are placed in the round, so the community as a whole can nurture, protect and educate our children, and the great hoop of our nation was round, nurturing all.  Things that are round have many possible connections and are not easy to control.  This is why the Wasichas took away the tepees of our own making, which we could move where we wanted when we wanted, and put us in square houses that cannot be moved.

And we have put our children in boxes, in rows all facing forward, and all in lock step.  There is no roundness in that.  Further, what you would have our children do is also not round, but square: games that lead children by the nose to what you want them to know and believe.  Many of the digital ghost shirts I have seen are like this, fundamentally reflecting the structure they seek to protect, which is not the right structure.  Not round and connected, free and open, but closed and square.

But there is a new way – while many moons ago the new way was not better for my people, the new way I see now is a new version of my vision.  It is children connecting and connected to everyone in the community, no longer in rows in boxes.  Children playing real games to learn, including those of their own making, with each other and their grandfathers and grandmothers. Children with the opportunity to learn their own way, make their own connections – their own circles, connected to all our circles.

And this is important: boys of my people began very early to learn the ways of men. And no one taught us; we just learned by doing what we saw. And we were warriors.

This is the old way, and I believe it will be the new way.  In the end, there was no protection for my people from the bullets of the Wasichas, and I do not believe that there is any way to protect these old institutions from replacement by something round, not square.  We must stop building square things – and stop protecting square things.  It is time to finally seek the red road and get off the black road – and build a new hoop together.

“Grandfathers, behold this pipe In behalf of (our) children I offer this pipe, that we may see many happy days”

Hetchetu aloh  (good indeed)


Haho  (thanks)

What About Lifelong Learning?



On April 1st, at the end of Whyville’s month-long celebration of 14 years of existence, the coveted “oldbee” medals were handed out to thousands of Whyville’s longest-standing members.  Remarkably, nearly half the total medals were given to users who have been active on Whyville for more than 5 years, with some on the site for more than 10 years.  That means that some of Whyville’s users started when they were 12 and are now graduating from college.

Last week I visited both Penn State and New York University, both in the throes of quite visible upheaval, but with challenges not different in kind from those of many if not most other universities.  Over the last 10-15 years, many have seen a dramatic bloating of middle management (Deans, Vice Deans, Assistant Vice Deans, Vice Presidents, Assistant Vice Presidents, etc.) and associated dramatic increases in costs.  At the same time, a growing number of faculty seem aware that both undergraduates and graduate students are having a harder and harder time ‘cashing in’ on their degrees, with an astounding number of new college graduates now living back at home and working minimum wage jobs.  From “middle management”, faculty are hearing that they are now actually part of a business (rather than the academic institution they originally signed up for), and are under more and more pressure to raise money (largely to support the bloated and overpaid administration) and to offer more services for less.  Tenured faculty who are unwilling to do so are being replaced by non-tenure-track ‘contract teachers’, who are cheap and will do whatever the administration wants (or be fired).  Faculty are being told that MOOCs are the future (one faculty member, thousands of students) but seem not at all clear how this will work for anyone other than the universities that already have mass appeal (Harvard, MIT, Stanford, you know who you are).  It is not even clear that those so-called ‘elite’ universities have a viable MOOC business plan (don’t they know that the public still expects information on the Internet to be free, and that for years we have hired new employees based on real-world performance tests rather than their degrees?).

Perhaps, therefore, despite their ivory towers, and 600 years of stubborn survivability,  even universities may be subject to the upheaval going on in the rest of the ‘business’ world.

But what is to be done?

What I suggest is that perhaps it is time to rethink and reconsider one of the most basic assumptions underlying the modern university system – that we train students for life.   Perhaps the bloating of university middle management is evidence that 40 years ago, when many of the faculty who have now taken assistant deanships went to college, there was such a thing as a lifetime career.  That a certificate obtained by isolating yourself for 4 years as an undergraduate, and perhaps another 4 as a graduate student, would launch you on a 50 year career.  However, evidence suggests that in most cases (perhaps even in universities) those careers are gone.  Most of today’s college graduates, once they find a job, are likely to change that job multiple times, in some cases dramatically, over the next 50 years.

Whyville has now influenced and stayed connected to students from 6th grade to college graduation, with no end in sight.  With lifetime connectivity possible, shouldn’t educational institutions be enrolling students pre-K and continuing to serve them for their lifetimes?  Instead of a certificate and a handshake, shouldn’t they continue to offer services, on demand, depending on the needs of their permanent enrollees?  Wouldn’t that also make it more likely that the offerings were relevant to the real world?

As many Whyvillians in their early 20s last week celebrated long continuous involvement with a rather odd educational site they first joined when they were 12 – shouldn’t someone take notice?  We did   🙂

Ghost Dancing

Ghost Dancing at Pine Ridge

A week ago I found myself once again in a completely overrun and outmatched Austin, Texas, at the start of what has become a string of SxSW (‘south by’, newbies learn quickly) festivals, starting with the relatively new SxSW edu.  It is rumored that when the original SxSW music festival started in 1987, one could buy a pass to every performance for under $20.  This year a platinum badge cost almost $1,600, and that didn’t get you into all the events, many requiring corporate connections.  In fact, in what is an old story with ‘successful’ events of this type, SxSW has become completely corporate, through and through.  Thanks, Twitter 🙂

As I sat in panel after panel at SxSW edu I found myself reflecting on how the ‘corporate nature’ of SxSW as a whole was playing out at edu.  Yes, of course, the corporate sponsors, including the College Board (who brings you the SATs), the publishers Houghton Mifflin Harcourt, McGraw-Hill, and Scholastic, the Pearson and Bill and Melinda Gates Foundations, and others in what a friend of mine now refers to as ‘the educational cabal’, were omnipresent – on the walls and in the talks.

In fact the biggest continuous set of presentations was organized around inBloom, rumored to have received many millions of dollars to aggregate digital assets so that individual teachers could build customized educational experiences for their individual students.  The inBloom folks had enough funding, apparently, to offer free beer and snacks every afternoon for the entire meeting, assuring attendance.  In contrast, I felt sorry for the poor corporate speakers in the big Google-sponsored e-learning Lounge, who lectured enthusiastically about what is essentially the same basic idea in ‘Google world’ without altering their scripted requests for feedback from an audience not present, probably due to the lack of beer.  (Don’t these guys know why most people now come to SxSW?)

Anyway, in these corporate presentations and the largely corporate-styled panel-based product pitches, the theme over and over again seemed to be that ‘digitalization’ (a more general version of ‘gamification’) was the future of the educational system.  This trend was made glaringly obvious when an executive of one of the ‘former’ textbook publishers made the specific point that they were now in the digital media market, not the textbook publishing business.  (Hmmm, the product he showed sure looked an awful lot like a digitized textbook.)

So what was going on here?  What teachers and which educational system have the time or expertise to individually customize digital assets for individual students?  (The beer-drinking teachers in the back of the inBloom room seemed to think that this was marvelously amusing, and THEY were at SxSW 🙂.)  And what teachers, or students for that matter, were being referred to by the panelist who proclaimed that teachers need to teach children how to use digital media?  (Do they actually KNOW any children?  They clearly haven’t spent any time on Whyville.)   And did anyone really expect Bill Gates (the former CEO of Microsoft, you remember) to offer an insightful evaluation of the current state of the educational system and its future, instead of pitching more examples of Gates-certified digitalization?  Do they all really believe that putting a digital cover over the current educational system is going to protect it from being undermined by tech-savvy children?

Ghost Dancing:

Ah yes, again, history might be instructive.  In 1889, near the end of the U.S. Indian Wars, the prophet Wovoka of the Northern Paiute Nation had a vision during a total eclipse of the sun, in which his God told him that if every Indian danced a new “Ghost Dance”, all evil in the world (read: the white man) would be swept away, leaving the Western United States filled with food, love and faith.  The Ghost Dancing movement spread across the plains and all the way to the coast of California, adapted to the cultures and beliefs of individual tribes.  The Sioux, locked in a running battle with the U.S. Cavalry, introduced Ghost Shirts to the celebration, believing that by wearing the shirts their warriors would be impervious to the white man’s bullets.  Sadly, Wounded Knee was the last Ghost Dance for the Sioux.


The strong sense that I had at SxSW (and increasingly elsewhere) is that we were all being lulled into a Ghost Dance that involved covering up the mess of the educational system with digital ghost shirts.  Don’t projects like inBloom, for example, miss the point that children are already using the Internet to teach themselves and customize their own learning experiences?  Instead of providing tools for the existing educational system to impose its own preferred order, shouldn’t we be building communities in which not only teachers, but parents, grandparents and others encourage and engage with what children are already doing?   Shouldn’t we be asking what this new technology natively allows us to do, rather than how we can paste what we have always done into this new technology?  In Wovoka’s original vision, he himself was chosen by God to manage the people of the Western United States, leaving the (white) President of the United States to manage the East.  Is Ghost Dancing in the world of education also about control, and protecting the old established order from the onward march of a new technology and the kids that use it?  Surveys today already suggest that many parents believe that their children are learning more on the Internet than in school, and that most parents now regard at least K-8 as primarily custodial.  Will the SxSW panel pickers next year accept my proposed panel titled “Improving the custodial function of public education”?  Or will we hear more panels and more corporate presentations explaining how a digital ghost shirt will save the day (and their old top-down business models)?  It seems to me that, far from instructing children in the use of digital technology, the question is whether the educational system, given its encumbrances, including the industry long arrayed around it, is going to be able to catch up to what children are already doing.  If it can’t (and I have my doubts), is it headed to its own Wounded Knee?

Perhaps I will organize a raid of children (I do have the horses nearby) on SxSW next year – might be interesting.  🙂




Myths, Methods, and Madness in Science Education Reform

20 years ago, I was asked by the U.S. National Academy of Sciences to participate in a special panel at the National Research Council charged with reviewing and recommending best practices for university-based interventions in public pre-college science education.  At the time I was co-director of the Caltech Precollege Science Initiative (CAPSI), and the panel as a whole was made up of directors of other such programs around the country.  The only other practicing scientist on the committee, however, was its chairman (and NAS member) Sam Ward, then at the University of Arizona in Tucson.   To make a long story short, at the end of the two-year investigation, I felt that it was important to independently summarize what I had learned from the process, which I did in the form of a document titled Myths, Methods, and Madness in Science Education Reform.

Now, 20 years later, through my involvement in, I am watching well-meaning university faculty and staff once again attempting to intervene in science education, now using new digital technology including gaming and virtual worlds.  Based on what I have seen in many of these efforts, I thought it might, perhaps, be useful to once again publish Myths, Methods, and Madness, included below in its entirety.  This document might also provide some additional insight, for those interested, into Whyville’s origins as well as some of its structure.

The more things change, the more some things seem to stay the same.


Front cover of a version of the Myths document rewritten in the form of a children’s story – art work by Erika Oller

Originally published as:

Bower, J.M. (1994) Scientists and science education reform: Myths, methods, and madness. In: Scientists, Educators and National Standards: Action at the Local Level, Sigma Xi Forum, Edwards Brothers Inc., North Carolina, pp. 123-130.

Over the last several years, the deplorable state of public science education and the perceived consequences for our nation’s economic and intellectual vitality have attracted not only the attention of educators and politicians, but also an increasing number of professional scientists and engineers.  As a consequence, a remarkable number of science professionals are becoming or are already involved in attempts to improve public science education.  While, in principle, this increased involvement of the scientific community is encouraging, it is also the case that scientific training often includes little or no focus on science education itself.  Instead, it is simply assumed that a PhD in experimental science is adequate preparation for one’s eventual educational responsibilities.  Based on ten years of involvement in elementary science education reform, I can assure you that this is not the case.

For the last eleven years, my Caltech colleague Dr. Jerry Pine and I have been involved in a close collaborative partnership with the Pasadena Unified School District in an attempt to introduce and support high quality, inquiry-based “hands on” science teaching for all children.  As of the fall of 1993, all 650 K-6 teachers in this large urban school district teach four 10-12 week science units each year.  These units emphasize an open-ended, experiment-based approach to understanding science.  We have also developed a substantial professional development program in science for all teachers in the district, as well as an extensive materials support system.  Program extensions are now being made into middle and high school classrooms as well as preservice teacher training.  Over the last five years, we have also transplanted this project into two additional school districts, one in California and one on the island of Maui.  As a result of these successes, in the fall of 1994, the National Science Foundation established a center at Caltech intended to transfer our model for systemic reform to 14 new school districts in the state of California.  At present we are working with 9 new school districts located throughout Central and Southern California.

“Myths” of science education reform

While I believe that our efforts to change science teaching in public schools have met with some success, this success absolutely required that I, as a scientist, reexamine many of the most basic educational assumptions I had developed as a result of my own science education.  While I started these projects 10 years ago with enthusiasm and a sense of great need, I realize in retrospect that I was, in fact, poorly equipped for my role as a partner in change.  I knew essentially nothing about education in general, or science education in particular.  Many of the assumptions I had made about the change process, as well as about what good science education looked like, were flat wrong.  I also had little or no real understanding of the structure of school districts, teacher capabilities, or the effort really required to produce lasting change in public science education.  Ten years later I continue to learn important lessons regularly, guided by our school district collaborators.

Nevertheless, based on the initial success of the Pasadena projects, I am increasingly asked to evaluate other science reform efforts involving scientists.  From this exposure it has become clear that many of the incorrect assumptions I myself initially made are often evident in the plans of other science education reform efforts involving scientists and scientific organizations.  In fact, these assumptions appear to be strong enough that scientists often invent nearly identical science education reform programs, often with limited success.  The purpose of this article is to explicitly identify some of these “common myths of science education reform”.  While several of the points made will probably be regarded as controversial, at a minimum this listing will expose potential reform advocates to several important program design issues.  After all, whatever the final structure of a particular program, no program, just as no research project, should be created or run in a vacuum.

Myth 1 – The problem with public science education is that a large percentage of teachers are incompetent.


It is remarkable how widespread the view is that teachers, especially in the early grades, are minimally functioning human beings.  It is also remarkable how rapidly this notion disappears when one becomes seriously involved with teachers and the worlds they live in.  Teachers in California public schools are now expected to manage the learning of 30-40 students per classroom with almost no outside help and almost no budget.  It is absolutely remarkable that more of them do not quit outright.  The reason they do not, in our experience, is that almost all of them have a deep personal commitment to student learning.  With such a commitment, and a rational approach to science education reform, we have found that the vast majority of teachers enthusiastically participate in improving the quality of science education.

Myth 2 – Teachers are under motivated to teach science because they do not understand how exciting it is.

When surveyed, teachers actually report that they already consider science to be one of the most exciting contemporary fields of study.  However, attempts to transfer the excitement of science through lectures never give teachers the opportunity to experience the thrill of doing science themselves.  Instead, science is presented as the purview of the elite.  Even programs that combine “science excitement lectures” with later “hands on” experiments usually reinforce unproductive attitudes.  For example, in most cases, the “hands-on” activities are do-it-yourself “cook-book” demonstrations of the sort professors design for their own undergraduates.  These are usually primarily intended to assure that everyone gets the same, right answer.  This type of lab is in sharp contrast to inquiries which give teachers opportunities for real open-ended scientific discovery.  Obviously, inquiries also reflect the fact that in “real science” the answer is often not simple, singular, stable, or in many cases even known.

Myth 3 – The primary reason teachers do not teach science well is a lack of science content knowledge:

It is perhaps not surprising that many programs run by scientists focus on increasing the scientific content knowledge of teachers.  In my view this directly reflects the structure of undergraduate and graduate level science education, which is most often predicated on the assumption that a strong understanding of science content is a necessary prerequisite for eventual success in research.  While I personally doubt that this is true even in higher education, in the context of K-12 science education reform there is no question that an inordinate upfront focus on science content only reinforces the inadequacy many teachers already feel with their own science content knowledge.  This, in turn, reduces the likelihood, especially in younger grades, that teachers will actually teach science.

When the focus of science education is changed from science content, to science process, the hesitation of teachers to teach science greatly diminishes.  As teachers understand that the skills they need to teach science are not substantially different from those necessary to teach other subjects, their willingness to engage their students in real scientific inquiry increases dramatically.

Myth 3 – Supplemental teacher training is necessary because too few teachers especially in the early grades, have been required to take science classes in college.

We have found that a teacher with adequate materials, enough time, and good classroom and science experiment management skills can actually provide their students with an excellent science education with remarkably little science content knowledge.  In fact, in general, the more college science courses a teacher has taken, the more likely they are to model their teaching on the lecture based approach of most university science professors.  Accordingly, teachers with fewer college lecture-based science courses are often more amenable to fundamental change to inquiry teaching methods than are those whose examples for science teaching come from college and university professors.  In our experience, as these teachers become involved in real science experiments in their classrooms, they inevitably seek additional science content knowledge.  However, in this case the information they seek is directly related to their own needs as science teachers, not to lists of “what all teachers (or students) should know” generated by others.

Myth 4 – The key to scientist involvement with teacher training is to provide complex information in as digestible a form as possible.

nbs011-science as magic It follows from my previous statements that distributing simplified scientific information is about the last thing that a scientist should do.  Watered down lectures only serve to reinforce in teachers the sense that they are not really capable of understanding scientific principles, reinforcing the insecurity that many teachers already feel about science.  As I have also stated, scientific information in this form is almost worthless to teachers in any event.  Young students, unlike those in college and graduate school, have not yet learned what questions not to ask, and therefore will rapidly expose holes in the knowledge of a teacher trained to be a “mini-expert”.  In fact, these students regularly expose holes in my own scientific knowledge.  On the other hand, if the role of the teacher is as a guide to students in their own scientific investigations, then the lack of detailed knowledge of the teacher is a source of motivation and ownership by students.  Of course, this change also substantially alters the role of the scientist in educational reform.  The “classroom management” skills now required to organize time and materials or help students work in cooperative groups are not something that most scientists know anything about.  However, what scientists do know about is how to conduct investigations.  Accordingly, in our programs the primary role of the scientist is to model inquiry, not to fill in teacher backgrounds.  Just as we are comfortable guiding our graduate students to explore subjects for which we do not yet know the answer, teachers should be comfortable guiding their students explorations.

Myth 5 – The problem with science education is a lack of good curriculum and therefore we must develop it.

If the emphasis of the reform project is on grades K-6, this statement is absolutely wrong.  Over the last several years, numerous companies have begun marketing excellent early science curriculum.  In fact, I believe that, at this point, there is almost no need for further curriculum development in K-6.  Instead, reform programs should focus on how to implement and support the use of this existing curriculum.

Beyond the elementary school level, however, there is as yet almost no good readily available inquiry-based curriculum.  This is one of the many reasons that I believe reform efforts should being in elementary school.  The vast majority of what is available in higher grades is either fundamentally lecture based, or based on “cook book” hands on activities intended (as in our undergraduate laboratories) to assure that every student gets the “right” answer.  As I have stated, enforced “correct” answers should have no place in real science education.

This said, however, the answer to this problem is NOT to have reform efforts develop their own curriculum.  Curriculum development is a much more costly and time consuming process than most scientists believe, requiring long term revision, field testing and evaluation by a highly talented, motivated, and educated development team.  A reasonable estimate of the cost of developing a real 12 week curriculum module for elementary school, for example, is $400,000 and three years.  Curriculum developed in the context of reform efforts is often mostly of the demonstration variety that does not support good inquiry teaching.  Further, an emphasis on curriculum development tends to underestimate the far more difficult problem of curriculum support and implementation.  Many millions of education dollars spent on “grass-roots” curriculum development programs have not corrected the perilous state of science education in our schools.

Myth 6 – One reason to develop new curriculum is to introduce modern scientific techniques derived from current laboratory experiments.

It is my view that the drive to make curriculum “modern” is misplaced.  While understanding the political and social implications of modern science is clearly important, a specific focus on this objective often indicates a hidden agenda.  For example, a teacher training program in modern biology might be intended to directly counteract the effectiveness of animal rights activists.  Such political considerations, when they are primary, often directly undermine the open inquiry process that is supposed to define scientific methods.  It also places science training programs at risk of using the same tactics as those they are attempting to counteract.  Further, modern experiments and experimental techniques are often not accessible to deep process knowledge or active exploration, instead they infrequently come across as being more magical than scientific.  Classroom activities developed from research laboratory experiments, in particular, are very often only simple demonstrations of previously presented science facts.  Such activities bear little resemblance to real experimental science and seldom support inquiry-based learning.

In my view, any subject considered as a base for science curriculum should be evaluated for its value in teaching and learning, not solely for its degree of contemporary content.  While questions of relevance are often important to teachers and students, especially in higher grades, we have found that any real scientific investigation, correctly conceived and supported, is regarded as a valuable experience.

Myth 7 – Training a few highly motivated teachers will produce “trickle down” reform when they return to their schools.

Regardless of the emphasis on content or process, the most common form of educational reform project is one that assumes that a small number of highly trained teachers will transfer their abilities and enthusiasm to other teachers in a school or district.  Again, this approach to educational reform reflects the hierarchical structure of science education in universities.  In fact, there is little evidence that individual training courses have much effect outside the classroom of the trained teacher.  Teachers who have elected to take these courses are often regarded as “special teachers” by other teachers, in effect isolating them from their colleagues and reducing their effectiveness as reformers.  Further, real teachers seldom have the means or time to support or transform the teaching techniques of their colleagues.

If systemic change is the objective, then it must be the specific target, not an assumed side benefit.  In Pasadena, our initial focus on all teachers, not just the recognized mentor teachers, in a single school produced the local proof of concept necessary to convince the rest of the district to make the change.  The fact that the majority of teachers in the initial school were enthusiastic about the program, in effect, certified for the other teachers in the district that this was something that they too could do.  As we now move into other school districts, the primary problem is slowing down the implementation, not convincing other teachers to try it.

Myth 8 – If teachers are motivated enough during training, they will find a way to obtain the material necessary to teach science in their classrooms.

Over the last several years, there has been a clear migration away from lecture-based instruction towards more hands-on approaches.  Unfortunately, however, most programs supporting this change still do not take into account the need to provide material support to teachers back in their own school districts.  In fact, far too many university-based programs seem to assume that participation in a summer workshop will provide the necessary teacher motivation to change classroom instruction.  There is little evidence that this is true.  Instead, to be effective, a program needs to take into account, at the outset, that in-district support and follow-up will be necessary for success.  This is particularly true with respect to science instruction materials.  Very few public schools in the 1990s have budgets that can support the materials necessary to teach science well.  Teachers often do not have the political clout necessary to obtain what minimal money is available.  For most of our teachers today, teaching is a lonely and personally expensive occupation.  If a program intends to maintain a lasting commitment from the teachers it has trained, direct and continuing school district support is essential.  The lessons of the last 30 years make this absolutely clear.  The wonderful hands-on materials developed in the 1960s remained completely unused without support for the material and professional development needs of teachers.  Unfortunately, this means that school districts as well as project coordinators have to deal with the nuts-and-bolts issues involved in supporting real experimental science at the beginning of and throughout a project.  Without this support, it is well known that good science teaching cannot be sustained.

Myth 9 – Reform can be accomplished with existing resources if they are simply allocated more efficiently.

In my view, this is perhaps the greatest myth of education reform.  While it may be the case that 30 years ago resource allocation could fuel reform efforts, it is no longer the case today.  Public school districts, especially those serving poor children (i.e., districts that cannot rely on direct parental financial support), have been cut so close to the bone that there is little money left to support even the existing curriculum.  With cuts in social services, these school districts are rapidly becoming social service agencies rather than educational institutions.  The basic health and safety of their students inevitably takes priority over something as relatively esoteric as science education, let alone its reform.  For this reason, no matter what else happens, if public schools continue to be denied the resources they need, no reform effort will be sustainable, and the cultural, educational, and political spiral we find ourselves in now will continue.  As an advocate for science education reform, I now also spend considerable time evaluating educational projects in third world countries.  It is becoming increasingly difficult to distinguish schools in these regions of the world from our own public schools.  As the richest and most economically vital country in the world, there is no excuse for this situation.

What can I do as a scientist?

While the foregoing list of “don’ts” might be daunting, in fact, I believe that scientists should be encouraged to get involved in science education reform.  Scientists can play a critical role in the process of reform, even if the role they actually play is somewhat different from the role they imagine they should play.  The following partial list is based on our experience with several school districts and the many scientists involved in our programs.

Program Validation: Perhaps surprisingly, I believe that the largest contribution the scientific community can make to science education reform is related to the popular perception of scientists rather than to their scientific knowledge directly.  Through involvement in a reform program, scientists can certify the validity of that program.  For teachers, parents, administrators, students, and even funding agencies, the involvement of real working scientists in a science education program can lend essential political support for a project.  While this political clout may be a result of what, in my opinion, is the mistaken public impression that professional science content knowledge is a critical component of any science education reform effort, it provides scientists a tremendous opportunity not available to many other sectors of society (or members of the traditional educational community).  Of course, this makes it especially important that we use the opportunity wisely.

Teacher support: The involvement of working scientists can have a profound effect on teacher optimism.  Changing teaching style and/or adopting new curriculum requires tremendous energy and commitment on the part of the teachers involved.  Through supportive participation in the process, scientists can provide crucial emotional support for teachers and also advocate for teachers within a program, school district, and/or community.

Resource acquisition: To be a professional scientist in today’s world, it is necessary to have exemplary grant writing and communication skills.  Such skills, or the time to use them, are often lacking in school systems.  As the current financial condition of most public schools makes the need for outside funding of reform projects critical, scientists can provide an extremely valuable service as grant writers and administrators.  Without outside funding, today’s reality in public education virtually assures that innovative programs cannot exist.

Modeling the scientific process: While scientists must be very careful in the use of their content knowledge, real science, whether in the laboratory or the classroom, depends substantially on the application of good scientific process.  By scientific process I do not mean the famous four steps in the scientific method that are drilled into the heads of children from grade 3.  Instead, I mean the real scientific skills of investigation, critical thinking, imagination, intuition, playfulness, and thinking on your feet and with your hands that are essential to success in scientific research.  We have found that trained scientists, properly prepared and with attitudes adjusted, can easily apply these skills independent of their particular area of expertise.  In fact, in our programs we intentionally assign scientists to teacher training groups outside their area of expertise to reduce the likelihood that fun and exploration are replaced by a quickly offered factual answer.  In our experience, when scientists and teachers are mixed together in inquiry teams where no one has the answer (or better yet, where a “correct” answer does not even exist), the result can be extremely valuable for teachers.  There is no more effective means to convey the excitement of science than to let teachers and their students really do science, where doing depends on involvement in an open-ended, inquiry-based, student-driven exploration of almost any subject.

In Conclusion

All Teachers, All Children:  The myths I have considered in the previous sections are obvious and understandable given the type of science education most scientists themselves have encountered.  However, there is another myth that is perhaps more sinister and deeply buried than these: that only a select subset of our society can really be involved in scientific exploration.  In this view, the rest of our society simply becomes a consumer of scientific facts.  Those programs that focus on exceptional teachers or on so-called gifted students reinforce elitist views of who can and cannot do science.  Our experience in the elementary school grades of the urban and predominantly minority Pasadena Unified School District suggests that every teacher and every child can benefit from high quality science instruction when given the opportunity.  For these reasons, I believe that effective reform of precollege science education in our nation depends on supporting the professional development of all teachers in service to all students.  To do this, it is necessary to explicitly design programs that involve entire school systems, all teachers, and all students.  Any other approach effectively reinforces science as an elite subject for elite teachers and special students.  We are already living with the educational and political consequences of this attitude.

Educate and Reform Thyself: While most of the above discussion concerns scientist involvement in the public schools, perhaps the most important personal consequence of my involvement with science education reform has been a growing awareness of how poorly I have taught my own students (cf. Bower, 1995, Systemic reform from the inside out: Look who’s changing now.  The Catalyst, #3, NRC Press, Washington, D.C.).  Prior to involvement in this project, I knew remarkably little about good science education.  After ten years of involvement with precollege science, I have become profoundly aware of the negative effect that the poor teaching of science in colleges and universities has on the rest of the educational system.  In many ways, colleges and universities set the standards for the entire educational system.  So, while I wish to encourage scientists to contribute to the public schools, the most significant consequence of this involvement may very well be fundamental reform in the way we educate our own students.  After all, the curriculum we ourselves control should be the easiest to change.