Modern Dev Math

Let’s pretend that we don’t have external groups and policy makers directing or demanding that we make fundamental changes in developmental mathematics.  Instead, let us examine the level of ‘fit’ between the traditional developmental mathematics curriculum and the majority of students arriving at our colleges this fall.

I want to start with a little bit of data.  This chart shows the typical high school math taking patterns for two cohorts of students.  [See  http://www.bls.gov/opub/ted/2012/ted_20121016.htm ]

[Chart: high school math course-taking for two generations]

There has been a fundamental shift in the mathematics that our students have been exposed to, and we have reason to expect that the trends will continue.  We know that this increased level of math courses in high school does not translate directly into increased mathematical competence.  I am more interested in structural factors.

Intermediate algebra has been the capstone of developmental mathematics for fifty years.  When that role was established, the majority of students did not take algebra 2 in high school … so it was logical to have intermediate algebra be ‘sort of developmental’ and ‘sort of college level’.   By about 2000, this had shifted so that the majority of students had taken algebra 2 or beyond.

The first lesson is:

Intermediate algebra is remedial for the majority of our students, and should be considered developmental math in college.

This seems to be one lesson that policy makers and influencers have ignored.  We still have entire states that define intermediate algebra as ‘college math’, and a number of states that count intermediate algebra toward general education requirements.

At the lower levels of developmental mathematics, the typical (median) curriculum includes a pre-algebra course … and may also include arithmetic.  Fifty years ago, some of this made sense.  When most students’ highest math was algebra 1, providing remediation one level below that was appropriate.  By fifteen years ago, the majority of students had taken algebra 2 or beyond.  The second lesson is:

Providing and requiring remediation two or more levels below the highest math class taken is inappropriate given the median student experience.

At some point, this mismatch is going to be noticed by regulators and/or policy influencers.  We offer courses in arithmetic and pre-algebra without being able to demonstrate significant benefits to students, even though the majority of students completed significantly higher math courses in high school.

In addition to the changes in course taking, there have also been fundamental shifts in the nature of the mathematics being learned in high school.  Our typical developmental math classes still resemble an average high school (or middle school) math class from 1970, in terms of content.  This period emphasized procedural skills and limited ‘applications’ (focusing on stylized problems requiring the use of the procedural skills).  Since then, we have had the NCTM standards and the Common Core State Standards.

Whatever we may think of those standards, the K-12 math experience has changed.  The emphasis on standardized tests creates a minor force that might shift the K-12 curriculum towards procedures … except that the standardized tests generally place a higher premium on mathematical reasoning.  Our college math courses are making a similar shift towards reasoning.  Another historical lesson is:

Developmental mathematics is out-of-date with high schools, and also emphasizes the wrong things in preparing students for college mathematics.

We will never abandon procedures in our math courses.  It is clear, however, that procedural skill is insufficient.  Our traditional developmental mathematics curriculum focuses on correcting skill gaps in procedures aligned with grade levels from fifty years ago.  We appear to start with an unquestioned premise that remediation needs to walk through each grade’s math content from 5 decades ago … grade 8 before grade 9, etc.  This is a K-12 paradigm with no basis in current collegiate needs.

The 3- or 4-course sequence of remedial mathematics is, and always will be, dysfunctional as a model for college developmental education.

There is no need to spend a semester on grade 8 mathematics, nor a need to spend a semester on grade 9 mathematics.  When students lack the mathematical abilities needed for college mathematics, the needs are almost always a combination of reasoning and procedural skills.  If we cannot envision a one-semester solution for this problem, connected to general education mathematics, we have not used the creativity and imagination that mathematicians are known for.  Take a look at the Mathematical Literacy course (MLCS Goals and Outcomes Oct2013 cross referenced 2 by 2).  If students are preparing for pre-calculus or college algebra, take a look at the Algebraic Literacy course (Algebraic Literacy Goals and Outcomes Oct2013 cross referenced).

Pretending that the policy influencers and external forces are absent is not possible.  However, it is possible for us to advocate for a better mathematical solution that addresses the needs of our students in an efficient model reflecting the mathematics required.

 

 

More on the Evils of PEMDAS!

The most common course for me to teach is ‘intermediate algebra’, and I’ve been thinking of the many issues with that course as part of the college curriculum.  However, my interest today is in poking at PEMDAS … and the poor way we often teach the order of operations.  As you know, understanding the order of operations concept is one key part of understanding basic algebraic notation.

An easy poke at PEMDAS is the “P” (parentheses for us; ‘B’ for brackets in some other countries).  The problem below is actually from our beginning algebra curriculum:

16÷(4)(2)

Operator precedence usually places products and quotients at the same level, with the normal parsing from left to right (answer: 8).  Of course this ‘tie breaking’ rule is arbitrary; however, a convention about this is necessary for all machine calculation … and our students interact with these machines.
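A minimal Python sketch (my own illustration, not part of the original problem set) shows that machine convention in action; note that parentheses around a single number do not change anything:

```python
# The original problem, with the implicit product written explicitly.
# Division and multiplication share one precedence level, evaluated left to right.
result = 16 / (4) * (2)   # (16 / 4) * 2 = 8.0, not 16 / 8 = 2.0
print(result)             # 8.0
```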

I’ve seen people say that this is a silly point, without merit … and they suggest including sufficient grouping to avoid any “ambiguity” from the expression.  I’ve also seen people say that there is no such thing as implicit multiplication (as in the problem above, or as in an algebraic term like -3x).  What they mean is that implicit multiplication has the same priority as explicit multiplication; some programming environments do not allow implicit products in order to avoid issues with that precedence.

If we state the problem algebraically, it might be:

16÷4k, where k=2

We, of course, prefer fraction notation for quotients due to the ‘confusion’ created by the division symbol (which our students write as a slash):

16/4k

One discussion site suggests using grouping symbols to be clear, and concludes with a comment that the answer changes when we use algebraic notation for the same quotient-and-product expression.  (see http://math.stackexchange.com/questions/33215/what-is-48%C3%B7293 )  This ‘changing answer’ feature should bother all of us!
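To see it concretely, here is a small sketch (my own illustration, not from the discussion thread) substituting k = 2 into 16/4k under the two readings:

```python
k = 2  # the value from the original numeric problem

machine_reading = 16 / 4 * k      # left-to-right convention: (16 / 4) * k = 8.0
algebraic_reading = 16 / (4 * k)  # reading 4k as a single factor: 16 / (4k) = 2.0

print(machine_reading, algebraic_reading)  # 8.0 2.0
```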

In the original problem above, the product involves parentheses … so our PEMDAS-based students always calculate that product first.  They have no idea that there is an issue with implied products when variables are involved; I’m okay with that at the time (we get to it later).  In all of my years of reviewing missed problems like that one, I’ve never heard a student justify their answer by ‘implied products have a higher priority’.  They always say “parentheses first”.

If we could say “GEMDAS” (for “grouping”) we would be more honest.  I’m not sure what “G” means for my poor aunt Sally … but, then, having a sentence for a mnemonic with no connected meaning is likely to be a bad thing.  When we continually talk about ‘remember my dear aunt Sally’, we encourage students to process information at the lowest possible level — instead of a beginning understanding, all they get is a memorized rule which is fundamentally flawed.

The role of mnemonics in ‘remembering’ has been studied.  The book Cognitive Psychology and Instruction (4th edition, Bruning et al.) has a review of research on this on pages 72-73 (it’s also in their 5th edition, though I don’t have that page reference).  The basic conclusion was that mnemonics help students remember when mnemonics help students remember … and can interfere with remembering when the student does not find them helpful.  That means that some students can use them to remember, some students get confused … and (in my view) all students suffer negative consequences from using poor Aunt Sally.

I think the emphasis on PEMDAS also creates a mental ‘twist’ in our students’ minds.  They take expressions which do not have stated grouping and insert parentheses so that the basic meaning is changed:

5x²  is mistakenly processed as (5x)²
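A quick numeric check (my own illustration) shows how much that inserted grouping changes the value:

```python
x = 3

correct = 5 * x**2     # the exponent applies only to x: 5 * 9 = 45
mistaken = (5 * x)**2  # the inserted parentheses square the coefficient too: 15**2 = 225

print(correct, mistaken)  # 45 225
```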

In the intermediate algebra course, some strange things happen relative to parentheses.

(3x² – 5) + (4x + 3) is treated as a product

A good portion of my class time is spent on un-learning PEMDAS and building some understanding of notation with order of operations.  The biggest problem … grouping that is done with other symbols besides parentheses (fraction bars, radical symbols, absolute value, etc.).

Because I’ve been teaching so long, I’m occasionally asked about any changes I notice.  Folks expect me to report that students are less prepared now compared to 30 or 40 years ago.  Actually, there have been improvements in the mathematics preparation of our students.  However, these improvements are not uniformly distributed, either across students or across areas of mathematics.  In particular, students struggle more now with order of operations; some of that degradation seems to be due to the over-use of PEMDAS.

We should avoid books that build in PEMDAS, and we should avoid the mnemonic in our classes.  Understanding something is much better than memorizing an erroneous rule.


Understanding the Data on Co-Requisite Remediation

We need to change how we handle remediation at the college level, because the traditional system is based on weak premises … and the most common implementations are designed to fail for most students.  Where we have had three or even four remedial courses, we need to look at one course for most students.

Because of that baseline, the fanatical supporters of “co-requisite remediation” are having a very easy time selling their concepts to policy makers and institutional leaders.  The Complete College America (CCA) website has an interactive report on this (http://completecollege.org/spanningthedivide/#remediation-as-a-corequisite-not-a-prerequisite) where you can see “22%” as the national norm for the rate of students starting in remediation who complete a college math course.  With that is a list of four states that have implemented co-requisite models … all of which show 61% to 64%.

One obvious problem with the communication at the CCA site is that the original data sources are well hidden.   Where does the ‘22%’ value come from?  Is this all remediation relative to all college math?  The co-requisite structures almost always focus on non-algebraic math courses (statistics, quantitative reasoning).  One could argue that this issue is relatively trivial in the discussion; more on this later.

What is non-trivial is the source of the “61% to 64%”.

One of the community colleges from a co-requisite remediation state came to our campus and shared their detailed data … which makes it possible to explore what the data actually means.  Here are their actual success rates in the co-requisite model they are using:

Math for Liberal Arts: 52%

Statistics: 41%

These are pass rates for students in both the college math course and the remediation course in the same semester.  Another point in this data is that ‘success’ is considered to be a D or better.

For comparison, here are similar results from a college using prerequisite remediation, showing the rate of completing the college math course for those placing at the beginning algebra level.

Quantitative Reasoning: 53%

Statistics:  52%

In other words, if 100 students placed at the beginning algebra level in the fall … there were about 52 who passed their college math course in the spring.  Furthermore, this college considers ‘success’ to be a 2.0 or better.  The prerequisite model here has higher standards and equal (or higher) results.

The problem with the data on co-requisite remediation is that only high-level summaries (aggregations) are shared.  Maybe the state average really is “61%” while the visiting college sits at about 45% overall (they have more students in statistics than in Liberal Arts).  Or, perhaps the data is being summarized for all students in the college course without separating those in the co-requisite course.  One hopes that the supporters are being honest and ethical in their communication.
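As a rough check on that 45% estimate, a weighted average of the two reported pass rates lands right around it; the headcounts below are a hypothetical split for illustration only (the college shared rates, and said only that statistics enrolls more students):

```python
# Reported pass rates from the visiting college's co-requisite model.
rates = {"Math for Liberal Arts": 0.52, "Statistics": 0.41}

# Hypothetical headcounts (assumption for illustration only); the post states
# only that statistics enrolls more students than Math for Liberal Arts.
enrollment = {"Math for Liberal Arts": 400, "Statistics": 700}

passed = sum(rates[c] * enrollment[c] for c in rates)
overall = passed / sum(enrollment.values())
print(round(overall, 2))  # 0.45
```

Whatever split is used, the combined rate stays between 41% and 52%, nowhere near the advertised 61% to 64%.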

I suspect that the skewing of the data comes more from the “22%”.  The source for this number usually includes all levels of remediation followed to any college math course (including pre-calculus).  The co-requisite data is a different measurement because the college course is limited (statistics, quantitative reasoning).

Another interesting thing about the data that was shared from the co-requisite remediation college is this statement:

Only about 20 students out of 1500 in co-requisite remediation had an ACT Math score at 15 or below.

At my institution, about 20% of our students have ACT Math scores at 15 or below.  Nationally, the ACT Math score of 15 is at the 15th percentile.  Why does one institution have about 1% in this range?  Is co-requisite remediation being used to create selective admission community colleges?  [Not by policy, obviously … but due to coincidental impacts of the co-requisite system.]
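The arithmetic behind that question is simple; the ‘expected’ figure below assumes, purely for illustration, that the college’s intake mirrored the national ACT Math distribution:

```python
observed_share = 20 / 1500    # about 0.013, roughly 1% of co-requisite students

# If ACT Math 15 sits near the 15th percentile nationally, an intake mirroring
# the national distribution would put roughly 15% of 1500 students in that range.
expected_count = 0.15 * 1500  # 225 students, versus the 20 actually reported

print(round(observed_share, 3), expected_count)  # 0.013 225.0
```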

Sometimes I hear the phrase “a more nuanced understanding” relative to current issues in mathematics education.  I suppose that would be nice.  First, though, we need to start with a shared basic understanding.  We cannot have that basic understanding as long as the data being thrown at us consists of ill-defined aggregate results lacking basic statistical validity.

Perhaps the co-requisite remediation data has statistical validity.  I tend to doubt that, as we use a peer review process to judge statistical validity … and we know that has not been the case for the co-requisite remediation data we are generally seeing (especially from the CCA).  The quality of their data is so bad that a student doing work of that quality would earn a failing grade in most introductory statistics courses.  It’s discouraging to see policy leaders and administrators become profoundly swayed by statistics of such low quality.

Reducing ‘remediation’ to one measure is an obviously bad idea.


MichMATYC 2016 conference schedule (Saturday, October 15)

The MichMATYC conference planners at Delta College are doing the final tuning of the program for October 15 (2016).  Here is an almost-final schedule of sessions:

[Session schedule grid: programgrid2_sept19_2016]

 
