Mathematica-assisted learning was .. Re: Speak errors (was Re:
- To: mathgroup at smc.vnet.net
- Subject: [mg130593] Mathematica-assisted learning was .. Re: Speak errors (was Re:
- From: Richard Fateman <fateman at cs.berkeley.edu>
- Date: Thu, 25 Apr 2013 02:51:03 -0400 (EDT)
On 4/24/2013 3:16 AM, Murray Eisenberg wrote:
> See interspersed responses:
> On Apr 22, 2013, at 3:10 AM, Richard Fateman <fateman at cs.berkeley.edu> wrote:
>> Historically, experiments at higher educational levels to introduce a
>> computer algebra system into a math course have resulted in consequences
>> like this:
>> 1. Students, on average, resented having to learn "something else" (i.e.
>> using a computer program) that wasn't "on the final".
> Simple solution: let students use the computer for all exams, too. (I've done that.)
Finding a room with (say) 350 computers, all running (say) Mathematica,
all DISconnected from the internet to avoid collusion, etc., presents
certain physical and electronic problems.
>> 2. On average they learned "no less" than students in the control group
>> not using computers. But "no more" either.
> What, exactly, does that mean? By what standards is this being judged?
I think there are a number of peer-reviewed papers in this area; I
recall one that had to do with a "modern algebra" type course.
Or consider a 1991 report regarding calculus taught with Mathematica
and how those students subsequently performed in a physics course; it found
"a nonsignificant difference in the mean grades of the two groups".
> E.g., when comparing with a conventionally taught control group, does
> the comparison test ask what-if questions that require simulation or
> calculations, etc., beyond the normal capabilities of paper and pencil?
I think the comparisons are generally with control groups that were
being taught the same material but without "benefit" of computers. It
seems to me that comparing the two groups of students on their ability
to write programs would not be pertinent to the question of whether the
two groups learned (say) calculus equally well.
If you are making the point that students gain something by learning to
write programs for computers generally, I agree. However, there is scant
evidence that introducing computers into a conventional course improves
learning of that conventional course material. You can accuse the
instructors of lacking imagination, the students of lacking ambition,
interest, or curiosity, the curriculum specifications of lacking
flexibility, the testing process of being bogus, or the selection of
control groups of being wrong, or advance any other hypothesis you can
come up with to invalidate the published results. But other than
anecdotal comments from students who really liked (though some hated...)
the course, what can you do?
I'm all in favor of technological aids to teaching that work. Finding
them is not so easy. Proving that they work is hard too. Evidence that
consists solely of anecdotes from enthusiasts doesn't count. Making neat
demos is fun for the instructor, but that's not the question here...