
Today I was grading the quizzes I gave this week to my Calculus I students on implicit differentiation. The first question asked them to compute *dy*/*dx* for the curve ln(*xy*) + π = *y*.

There were of course a handful of students who didn't differentiate π correctly, and in a few cases the algebra was too garbled for me to even figure out what they were thinking. But a certain surprising mistake occurred on a few papers: Students used the product rule to differentiate ln(*xy*), which they treated as the product ln · (*xy*) involving the *symbol* ln rather than as an application of the *function* ln to the argument *xy*. The results were variations on ln · (*y*′ · 1) + 1/*x* · (*x y*).
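For reference, the correct computation can be replayed in a computer algebra system. Here is a short SymPy sketch (Python; it assumes SymPy is installed, and the hand-derived answer *y*/(*x*(*y* − 1)) appears in the comment):

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# The quiz curve: ln(x*y) + pi = y, with y implicitly a function of x.
curve = sp.Eq(sp.log(x * y(x)) + sp.pi, y(x))

# Differentiate both sides with respect to x...
differentiated = sp.Eq(sp.diff(curve.lhs, x), sp.diff(curve.rhs, x))

# ...and solve for dy/dx.  Working it out by hand gives y / (x*(y - 1)).
dydx = sp.solve(differentiated, sp.Derivative(y(x), x))[0]
print(sp.simplify(dydx))
```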

How does this happen? Does such a student not understand that the logarithm is a function? Does the student realize it is a function but thinks that the product rule still applies? Is it simply a case of blind symbol manipulation? While we're at it, why is it that some students continually succumb to the temptation of distributing a square root over addition or pulling the 3 out of sin(3*x*)? As an instructor I get annoyed when calculus students haven't had these unlawful habits corrected, and while grading I am sometimes tempted to scrawl admonishments in big red letters on their papers.

But what is really going to make this stop? How can you really get through to these students and make them understand that the things they write are sometimes not just wrong but not even syntactically well-formed?

I don't have a complete answer. Obviously the first step is to help the student directly confront the mistake they've made, and hopefully drive home the point that it is not valid in general. Toward this end, one semester I issued written "violations" to students who committed heinous notational crimes on the midterm exams. (For this idea I credit my fifth grade teacher, who made students write "Consequences" after misbehaving, and one of my high school English teachers, who kept a notorious "ugh list" of forbidden words and phrases, like 'thing' and 'very', that earned an automatic F if used in an essay.)

But let's also take some responsibility. I propose that several issues contribute to the problem. (Some are clearly easier to fix than others.)

• Often we abuse notation. For example, sin^{2}*x* and sin^{–1}*x* are *really bad* notations for (sin *x*)^{2} and arcsin *x*, and we should never use them in lower division courses. Calculus texts should be cleansed of them as well. These notations aren't widely used until much later courses (when certain sets of functions become objects of study in their own right), so students do not have the context for dealing with them as general notations. Of course mathematicians and instructors can (usually!) figure out what is meant from context, but we shouldn't let our students see that we are careless with notation, because then they think they can do the same.

• One issue with undergraduate mathematics is that many identities *do* hold! I would like to believe that most people at some point were shown and understood why multiplication distributes over addition (say, for positive integers). But eventually, employing this rule in computation becomes a reflex. My guess is that after so many reflexes are acquired, the act of acquiring a reflex becomes a reflex! Since some students haven't reached the maturity (or willingness) to distinguish the identities that hold from those that do not, perhaps while teaching new identities (even very early on), a substantial emphasis should be placed on the fact that not *every* nice identity is actually true. There are standards, after all: No one chose to prevent square roots from distributing over addition; it comes intrinsically from the definitions.
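A single numeric counterexample is often the quickest way to show a student that one of these "nice" identities fails. In plain Python:

```python
import math

# sqrt does NOT distribute over addition:
a, b = 9.0, 16.0
lhs = math.sqrt(a + b)             # sqrt(25) = 5.0
rhs = math.sqrt(a) + math.sqrt(b)  # 3.0 + 4.0 = 7.0
print(lhs, rhs)  # 5.0 7.0

# ...and the 3 cannot be pulled out of sin(3x):
x0 = 1.0
print(math.sin(3 * x0), 3 * math.sin(x0))  # clearly unequal
```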

• Usually we prove (or at least motivate) identities that will figure prominently in computations. For example, calculus students are shown how to derive the power law (*x*^{n})′ = *n* *x*^{n – 1} from the definition of differentiation. In other cases (for example, the chain rule), proofs may not be given in lecture, but at least they are addressed by the textbook. However, in some cases we train students to use identities without providing a rigorous explanation. For example, we tell them that it is okay to use differentials as symbolic quantities that can be manipulated as such, so that e.g. *dx*/*dx* = 1. Students should not be satisfied with this explanation, and indeed a proper theory of differentials is far beyond the scope of Calculus I. I'm not sure what to do about this; certainly we should emphasize that this notation provides *intuition*.
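The power-rule derivation mentioned above can even be replayed mechanically. A small SymPy sketch, using the concrete exponent *n* = 3 to keep the limit elementary:

```python
import sympy as sp

x, h = sp.symbols('x h')

# The difference quotient for f(x) = x**3, pushed through the
# limit definition of the derivative.
limit_derivative = sp.limit(((x + h)**3 - x**3) / h, h, 0)
print(limit_derivative)   # 3*x**2
print(sp.diff(x**3, x))   # agrees with the built-in derivative
```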

• Standard mathematical notation is so "intuitive" that it is heavily ambiguous, and so it requires a lot of extra knowledge/experience/context/guidance to figure out what things mean. Perhaps students wouldn't mistake function application for multiplication if we used more discerning notation, in which the syntax is a very tangible element. To take this to the extreme, we could use a symbolic notation such as the FullForm notation in *Mathematica*, in which every expression is of the form *head*[*arg*_{1}, *arg*_{2}, …, *arg*_{k}]. For example, I could have asked to compute D[y[x], x] for the curve Equal[Plus[Log[Times[x, y[x]]], Pi], y[x]]. This is too formal for humans, but perhaps it would not be crazy to write ln[*xy*], with square brackets reserved for function application, so that the notation itself rules out the ln · (*xy*) reading.
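The distinction that FullForm enforces can be sketched with a toy expression tree in Python (the class and function names here are my own, purely for illustration): the function application ln(*xy*) and the student's product ln · (*xy*) are simply different trees.

```python
from dataclasses import dataclass

# A toy FullForm-style expression tree: every node is head[args...],
# so "apply Log to x*y" and "multiply ln by x*y" are distinct trees.
@dataclass
class Expr:
    head: str
    args: tuple

def full_form(e):
    if not e.args:
        return e.head
    return f"{e.head}[{', '.join(full_form(a) for a in e.args)}]"

x, y = Expr("x", ()), Expr("y", ())
applied = Expr("Log", (Expr("Times", (x, y)),))                       # ln(xy)
multiplied = Expr("Times", (Expr("ln", ()), Expr("Times", (x, y))))   # the student's reading

print(full_form(applied))     # Log[Times[x, y]]
print(full_form(multiplied))  # Times[ln, Times[x, y]]
```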

For that matter, if students were consistently using *Mathematica*, even simply as a tool for algebra, then they would very quickly discover that it doesn't perform an "obvious" transformation and would hopefully wonder why.
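A CAS makes the point immediately. In SymPy, for instance, the forbidden rewrites simply never happen:

```python
import sympy as sp

a, b, x = sp.symbols('a b x', positive=True)

# The square root stays put: sqrt(a + b) is not sqrt(a) + sqrt(b)...
print(sp.sqrt(a + b))
print(sp.sqrt(a + b) == sp.sqrt(a) + sp.sqrt(b))  # False

# ...and expand() will not pull the 3 out of sin(3*x).
print(sp.expand(sp.sin(3 * x)))
```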

Here I have of course assumed that all students care about correcting errors in reasoning, that their future work depends on mastering these concepts, and that they will eventually be in my position, trying to understand why a fraction of students in college mathematics courses are still confused about seemingly basic facts. And of course this is not the case, which is also something we should remember.