My college (Metro) has a course that is designed to transition you into higher (proof-based) math. It is called "Introduction to Proofs." At the Colorado School of Mines, the equivalent course is Discrete Math (which is also taught at Metro - I took it, as you might recall - but doesn't have the same gateway function). A lot of other schools seem to have a similar course.
I expected Proofs to be pretty easy, because I did very well when I took Logic at Rice, and I've always been good at doing logic and proofsy things. But actually, the course was pretty challenging, although I did end up doing well in it.
But there was one main thing that I learned, and I find that I keep learning this thing over and over again: math is about definitions. When you need to do a proof in math, the main thing you need to focus on (at least in my experience) is the relevant definitions.
Case in point. I was thinking the other day about how to prove the distributive property of multiplication over addition, that is
a * (b + c) = (a * b) + (a * c)
It's pretty easy to prove in the natural numbers (i.e., positive integers): a * b is just b added to itself a times, and you can get at the proof pretty easily from that. But when I got to thinking about the real numbers, I couldn't imagine how you could go about proving it.
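For the record, here's roughly how that proof goes if you define multiplication on the naturals recursively (a * 0 = 0 and a * (n + 1) = (a * n) + a) and induct on c. This is just a sketch, and it quietly leans on associativity of addition, which gets its own (similar) induction proof.

Base case (c = 0):

a * (b + 0) = a * b = (a * b) + 0 = (a * b) + (a * 0)

Inductive step: assume a * (b + c) = (a * b) + (a * c). Then

a * (b + (c + 1)) = a * ((b + c) + 1)
                  = (a * (b + c)) + a
                  = ((a * b) + (a * c)) + a
                  = (a * b) + ((a * c) + a)
                  = (a * b) + (a * (c + 1))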
I was talking to Ed about it, and he wasn't sure either, and at some point I realized, of course I can't prove this: I don't know how to define multiplication in the reals at all! How could I have an inkling of a proof if I don't even have a definition of multiplication? (There is no way to get at "pi times the square root of 2" through repeated addition, at least not that I can see.)
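(For what it's worth, the standard constructions do supply a definition. One common route builds each real as the limit of a Cauchy sequence of rationals and defines the product via representatives, roughly: if x = lim a_n and y = lim b_n with the a_n and b_n rational, then x * y = lim (a_n * b_n), glossing over the check that this is well defined. Dedekind cuts give another route. With a definition like that in hand, distributivity over the reals reduces to distributivity over the rationals plus facts about limits. But the point stands: the proof can't even start until the definition is pinned down.)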
And that's how it always is. If someone asks me to write a proof, the main technique I have is to look at the definitions involved and then work from there. If I'm missing a definition, then I have to solve that problem first. For some reason, I find this easy to forget, but also easy to relearn.