
TMI in the GCF and LCD of MATH: TTYL, PEMDAS!!

If we could tweet and text math, we would say things like “Need LCD, remove GCF, remember PEMDAS”.  Wait a minute, we already say those things.  Seems like math classes are ahead of the curve on not communicating well.  Let’s look a little deeper.

The human brain has some limitations that affect how well acronyms work in communication; as a teacher, I would say that communicating with acronyms tends to keep information processing at the surface level (translating the acronym into words) rather than connecting ideas to important concepts.  We say “you need an LCD to add fractions” and then have to remind students that an LCD is not needed for multiplying fractions.  Perhaps we would improve our instruction if we banned acronyms.
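To make that contrast concrete, here is a small worked example (my own generic illustration, not taken from any particular course or text).  Adding fractions forces us to rewrite the terms over a common denominator, while multiplying requires no such step:

\[
\frac{1}{4} + \frac{1}{6} = \frac{3}{12} + \frac{2}{12} = \frac{5}{12}
\qquad\text{versus}\qquad
\frac{1}{4} \cdot \frac{1}{6} = \frac{1}{24}
\]

The idea worth teaching is the common denominator itself; the acronym “LCD” is only a label for it.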

I’ve tried taking a compromise approach in my intermediate algebra class, where we are currently taking the test on rational expressions.  We use the label “LCD” after we’ve shown that terms need something in common before adding and subtracting.  I’d like to say that the approach improves student learning, but that is not likely to be measurable — partly because it is such a challenge to get students to reason mathematically instead of memorizing rules for getting answers.

Within five to ten years, we will have a different curriculum for most of our students (STEM and non-STEM alike), one that provides a better mathematical experience for all students.  In the meantime, many of us will struggle to build better understanding in our students.  It might help to say “TTYL” (talk to you later) to all acronyms, as much as we can manage.


Khan, Comfort, and the Doom of Mathematics

Perhaps you already knew this:

If students perceive instruction as clear, the result will be reinforcement of existing knowledge (which is often not such good knowledge).

I recently ran into a reference to a fascinating item posted by Derek Muller, specifically about videos like those from Khan Academy; Dr. Muller’s particular interest is science education (physics especially), and you might find the presentation interesting: http://www.youtube.com/watch?v=eVtCO84MDj8 (it’s just 8 minutes long).

In mathematics, even more than in physics, students come to our classrooms with large amounts of prior experience with the material.  Of course, much of their existing knowledge is either incomplete or just plain wrong (whether they place into developmental math classes or not).  A ‘clear’ presentation means that the existing knowledge was not disturbed in any significant way.  Clear presentations make students even more confident in the validity of the knowledge they already possess.  This is not learning.  Reinforcing wrong information is the doom of mathematics.

In Dr. Muller’s study, two types of presentations were used.  The first were the ‘clear’ ones; students felt good about watching them, but the result was absolutely no improvement in their learning.  The second type were the ‘confusing’ ones, in which the presentation deliberately stated common misunderstandings and explored them.  Students did not like watching these; however, the result was significantly improved learning.

We see this in our classrooms.  This past Friday, a young man from my beginning algebra class came in to see me … he had left class in the middle of the period, in a way that distracted other students.  It turns out that he left because he could not stand the confusion.  In talking with him, I learned that he believes he can do the algebra but is getting very confused by the discussion in class about “why do that” and “here is another way to look at it”.  In fact, this student has a very low level of functioning in algebra.  If he does not go through some confusion, his mathematical literacy will remain unchanged; that is to say … he won’t have any meaningful mathematical literacy.

Khan Academy videos are popular; I understand … I have watched some myself.  I consider them to be very clear and essentially useless for learning mathematics.  If a person already has good knowledge, they will not need the videos; if a person lacks some knowledge, they will not perceive what they lack from watching a video.  [Just as eyewitness research in criminal justice shows, students’ perception is controlled by their understanding.]

The attraction of modules and NCAT-style redesign is often the clarity and focus.  Students do not generally see anything that might confuse them; the environment is artificially constrained to exclude as many confusing elements (inputs) as possible.  To the extent that students in these programs are ‘comfortable’ and the instruction ‘clear’, that is the extent to which existing knowledge is reinforced.  Learning cannot happen if we primarily reinforce existing knowledge; confusion is an essential element in a learning environment.  [I sometimes tell my students that instead of calling me a ‘teacher’ they should call me a ‘confusion control expert’.]

I suspect some readers are thinking that “He has this wrong … I have data that shows that students do really learn.”  It’s true that I don’t have proof; I don’t even have my own research (though I would love to see some good cognitive research on these issues).  What I do know is that student performance on exams — especially procedural items — is a very poor measure of mathematical knowledge.  I suggest that you interview some average students that you think know their mathematics based on exam performance; have them explain why they did what they did … and have them explain the errors in another person’s work.  Based on what I have heard from students, I think that you will find that only the best students can show mathematical knowledge in an interview at a level equal to their exam performance; average students will struggle with the interview about their mathematics.

How do we avoid the doom of mathematics?  How do we prevent our classes from becoming reinforcers of existing knowledge?  I think we need to create environments for learning where every student faces some confusion on a regular basis … not overwhelming confusion, and not trivial confusion, but meaningful confusion about important mathematics.  Do we need an LCD to do that?  Must we move terms in an equation before we divide by the coefficient?  Is that distributing, or is that subtraction?  Confusion is where students bump into the areas of knowledge that need their attention.
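As one concrete instance of the kind of meaningful confusion I have in mind (my own illustration, not drawn from any study cited here), consider a single linear equation solved by two different routes; asking students which step was actually required, and what each step should be called, surfaces exactly the confusion worth having:

\[
3x + 6 = 15:\qquad
\begin{aligned}
&\text{subtract 6 first: } 3x = 9, \text{ so } x = 3;\\
&\text{divide by 3 first: } x + 2 = 5, \text{ so } x = 3.
\end{aligned}
\]

Neither route is ‘the’ procedure; a student who can explain why both routes work is reasoning about the mathematics rather than reciting a memorized rule.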

Our students have a strong tendency to drive through our courses as fast as possible, without really dealing with the mathematics.  They believe the myth that experts always understand, that we are never confused.  We need to be comfortable showing confusion to our students and modeling appropriate behavior for resolving it.  The appropriate response to confusion is figuring out where we went wrong … not running toward a comfortable explanation.  Confusion may call for some meta-cognitive effort, or we may simply need to polish one particular mathematical idea.

Confusion is the fertile soil of learning.  Avoiding confusion creates a sterile environment without growth.  Comfort is fine, and we all need comfort; however, comfort never learned anything. 


Are Modules the Answer for Developmental Mathematics?

The number of institutions implementing modules in developmental mathematics continues to increase, and I expect that to continue for another year or two.  Over the next five years, though, I expect most of these institutions to shift to other models and solutions for their developmental mathematics programs.  Perhaps you can think of some reasons why colleges would try modules now and then later replace them.

Our context for this problem is complex, with multiple expectations for developmental mathematics and multiple measures of current problems.  Modules are appealing because of the clear connection between a modular design and some measures of current problems — low pass rates and low completion rates in particular.  For a change to survive over the longer term, the methodology needs to address enough of the basic problems to be sustainable.

When we started the New Life project in the Developmental Mathematics Committee of AMATYC, we asked a set of national leaders in the field to identify the basic problems they saw.  In analyzing that input (gathered primarily via email), the problems clustered into a few basic categories:

The content of developmental mathematics courses is not appropriate for the majority of students.

The typical sequence of courses has too many steps for students to complete in a reasonable amount of time.

The learning methods emphasized in most programs are not effective and do not reflect the accumulated wisdom about learning and cognition.

Faculty, especially in developmental mathematics, are professionally inactive and tend to be isolated.

Faculty are not using professional development opportunities, due both to a lack of information and to a lack of institutional support.

Modules are often selected based on a rationale about content and sequence.  However, when we look deeper at the content problem, the issue is a very basic one: the typical developmental mathematics sequence emphasizes symbolic procedures presented in isolation from both applications and other mathematics.  In other words, completing a developmental math course typically does not result in a significant increase in the mathematical capabilities of students … the learning is of the type that is quickly forgotten.

One reason, then, that modules will tend to be a short-term solution is that the design does not generally address the basic content problems.  A modular program makes the sequence easier for students to complete; a consequence is that the content is deliberately compartmentalized and isolated.  Module 4 is independent of Module 3; the learning is not connected, nor is there (normally) a cumulative assessment at the end of a sequence (like a final exam).

I am hoping that you are thinking … “Wait a minute, modules can do more — the learning can be connected, and we can have a cumulative assessment”.  Great, good job thinking critically.  However, every single modular implementation I am reading about focuses on the independence of the modules, and none have a ‘final exam’.  Some colleges will eventually try to address this problem.  The challenge is that doing so is fairly difficult, and will tend to increase cost.  [You might have noticed that cost was not a general problem as identified by leaders in the profession.]

The learning methods are also a problem in the typical modular design.  Modular programs are very likely to use online homework systems, and these systems tend to be limited to symbolic procedures.  More fundamentally, though, I see modular programs as missing the learning power of groups and language.  Modular programs tend to be individual-based; social settings, such as small-group work, are either difficult to manage or just plain impossible.  Language (meaning speaking and writing) is often quite limited; as in traditional developmental programs, modules tend to emphasize the correctness of answers as the measure of learning … as opposed to quality of work, written explanations, or spoken explanations.  Therefore, I generally expect that modular programs will produce levels of learning that are statistically equal to the programs they replace; this (if true) is enough of a reason for colleges to leave the module design within a few years.

Some modular designs have addressed some of the problems related to faculty … at the point of implementation, and in limited areas.  That is not enough for long-term viability.  We, the faculty in developmental mathematics, have much to do.  The overwhelming majority of us are not engaged in any professional activity (beyond a few hours of work per year on our own campus); we generally do not attend conferences, we don’t join AMATYC or its state affiliates, and we don’t read professional journals, let alone publish in them.  We need to develop a deeper understanding of our profession; in particular, we need to be proficient in analyzing the learning of mathematics as a matter of both mathematics and cognition.  We need both deeper toolsets and knowledge about the best uses for those tools.  None of the modular designs I read about have a long-term strategy for supporting faculty.

The designs I often call “the emerging models” all deal with multiple problem areas, which gives them long-term viability.  The emerging models (AMATYC New Life, Carnegie Pathways, Dana Center Mathways) address content, sequence, learning, and faculty issues.  Over the next few years, you will begin hearing of institutions that had implemented modules switching to one of these emerging models.  We are all committed to helping our students, and these models provide a better solution.


How much practice is enough?

Do you see repetition as the enemy of a good math class?  Or do you see practice as the single biggest factor in learning?  More practice might be better … it might be worse; either way, repetition is not a trivial part of the learning process.

One reason I am thinking about repetition is the current emphasis on online homework systems, whether as part of redesigns like the emporium model, as part of modules, or in ‘regular’ classes.  Sometimes these systems are marketed with an appeal to a high ‘mastery’ level (really a percent correct … not the same thing at all).  To understand the impact of various practice arrangements, we need to review some cognitive psychology.

First, a lack of repetition normally places a high workload on short-term memory; without repetition, the long-term memory (playing the role of ‘knowledge’ in this case) is anecdotal, like remembering the last web site you visited before leaving home.  Without repetition, new knowledge does not become integrated with related knowledge.  In the extreme, a contextualized math course has almost no repetition; each problem is a novel experience.  In the science of cognition, this type of knowledge is called ‘declarative’ knowledge.

Second, the quality of the practice is a critical factor in how the information is stored.  Much research has been done on factors that raise the quality of practice; in particular, ‘blocked’ practice (one type at a time) and ‘unblocked’ practice (mixed types) both contribute to better learning.  In my view, this is one of the major drawbacks of both online homework systems and modules … one objective at a time, practice on that, test, and move on.  (In cognitive science, ‘blocked’ is used strictly … the same steps and knowledge are used each time.)

Third, there is a connection between effective practice and math anxiety.  As accuracy is established via repetition, anxiety can be lowered.  [I am not claiming that practice, by itself, will lower anxiety.  I am claiming that a lack of practice will reinforce the existing anxiety level.]

In the learning sciences, research talks about “automaticity” and “performance time”.  Higher levels of automaticity are associated with faster performance time; both are factors in the brain’s efforts to organize information and ‘chunk’ material for easier recall.

Whatever class you are teaching, keep your practice consistent with your course goals.  If you want students to organize knowledge, apply it to new situations, and improve their attitudes, ensure a sufficient quantity and quality of practice.

Here are some references:

Bruning, Roger; Schraw, Gregory; Norby, Monica; Ronning, Royce (2003). Cognitive Psychology and Instruction, 4th edition. Pearson.

Speelman, Craig P.; Kirsner, Kim (2005). Beyond the Learning Curve: The Construction of the Mind. Oxford University Press.

Anderson, John (1992). Automaticity and the ACT* Theory. Available at http://act-r.psy.cmu.edu/publications/pubinfo.php?id=91

Anderson, John; Reder, Lynne; Simon, Herbert (1998). Radical Constructivism and Cognitive Psychology. Available online at http://actr.psy.cmu.edu/~reder/98_jra_lmr_has.pdf