Ideally, we would test and demonstrate everything we teach, everything we believe, in one or more laboratory experiments. But this is impractical: learning from experience is slow and costly. It would take too long to "PROVE" everything, we don't have the equipment to do it, and obtaining all the required hardware would cost a great deal.
If standardized student test scores were the proper measure of performance, we might do no experiments, no hands-on work at all.
But experiments are where new knowledge comes from. As professional scientists we must do experiments (or at least some of us must). Even when we are "just teaching," students need to see some experiments, even if this learning method is "less efficient." Students need to see examples of how new knowledge is obtained, and they need to see some subset of our beliefs "PROVEN." We can all argue over how much time and money to spend on this. (Nor can these experiments all be computer simulations; I've argued previously why computer simulations are not as good as real experiments. See my blog of 17 Sept. 2010.)
I am interested in the 'balance point' between experiencing everything in the laboratory and accepting some knowledge as merely demonstrABLE.
Modeling does a very good job of making laboratory work the foundation of as much as possible, and of 'extending' that experience across the breadth of the intended curriculum. But I know that MANY modelers (myself included) have difficulty maintaining a pace that lets us cover the full breadth of traditional topics in our courses.
I think my students develop their scientific skills much better now than they did when I was getting more coverage and less depth. It is a trade-off, and I push to optimize the pace of instruction every year.