Category: Research connected to practice

Dev Math: Where Dreams go to Thrive … Part II (Evidence)

Developmental mathematics is where dreams go to thrive; we have evidence that even the traditional courses help students succeed in college.  The narrative suggested by external political forces is often based on a simplistic view of students that is out of touch with reality.  Let’s help by spreading the word about a more complete understanding.

Students who need to take developmental math courses have a wide range of remediation needs.  Peter Bahr’s study on pathways with single or multiple domains of deficiency (https://www.devmathrevival.net/?p=2458) concluded that basic college outcomes (such as earning a degree) are equivalent for students who needed remediation and those who did not.

A totally different analysis, by Attewell et al. (2006) (see http://knowledgecenter.completionbydesign.org/sites/default/files/16%20Attewell%20JHE%20final%202006.pdf), also reaches a conclusion of equal results between groups in many respects.  Many studies of remediation are simple summaries of enrollment and grades over a short period of time.  The Attewell research was based on a longitudinal study of 8th graders begun in 1988 (thus, the acronym “NELS:88”) conducted by the National Center for Education Statistics.  Over a 12-year period, the study collected high school and college information as well as additional tests and surveys for this sample.

A key methodology in this research is ‘propensity matching’: using other variables to predict the probability of an event, and then using that probability to analyze key data.  For example, high school courses and grades, along with test scores, were used to calculate each student’s probability of needing remediation in college; among students with a given probability, one sample took remediation while another did not.  An interesting curiosity in the results is the finding that low-SES and high-SES students have equal enrollment rates in remedial math once ‘propensity matched’.
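The matching idea can be sketched in a few lines of code.  This is a toy illustration with invented numbers, not the authors’ actual procedure: assume each student already has an estimated propensity score, then compare each remediated student only against the non-remediated student whose score is closest.

```python
# Toy sketch of propensity matching (hypothetical data, not the study's).
# Each student is a (propensity_score, earned_2yr_degree) pair.

def match_and_compare(remediated, comparison):
    """Nearest-neighbor match on propensity score, then compare
    degree-completion rates between the two matched groups."""
    matched = []
    for score, completed in remediated:
        # find the comparison student with the closest propensity score
        nearest = min(comparison, key=lambda s: abs(s[0] - score))
        matched.append((completed, nearest[1]))
    n = len(matched)
    rate_remediated = sum(c for c, _ in matched) / n
    rate_matched = sum(m for _, m in matched) / n
    return rate_remediated, rate_matched

# invented numbers purely for illustration
remediated = [(0.80, 1), (0.70, 0), (0.60, 1)]
comparison = [(0.78, 1), (0.64, 0), (0.20, 1), (0.10, 1)]
print(match_and_compare(remediated, comparison))
```

The point of the method is visible even in the toy version: the remediated group is compared against similar students, not against the whole population.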

Thrive: Key Result #1
Among students with similar propensity scores for needing remediation, those who take remedial courses earn a 2-year degree at a higher rate than those who do not.  Instead of comparing students who take remediation with the entire population, this study compared students taking remediation with similar students who did not.  The results favor remediation (34% versus 31%).

In the bachelor-degree setting, the results run in the other direction, which the authors analyze in a variety of ways.  One factor is the very different approach to remediation in the two sectors (4-year colleges over-avoid remediation, while 2-year colleges slightly over-use it).   However, the time-to-degree for the two groups is very similar (4.97 years with remediation, 4.76 years without).

Thrive: Key Result #2
Students taking three or more remedial courses have only slightly reduced results.  This study shows a small decline for students needing multiple remedial courses: 23.5% earn a 2-year degree, versus 27.5% of similar students without multiple remedial courses.  The Bahr study, using a local sample, produced equivalent results in this same type of analysis.

It’s worth noting that the results for multiple remedial courses are pretty good even before propensity matching is applied: 25.9% complete a 2-year degree with multiple remedial courses, versus 33.1% without.  This clearly shows that dreams thrive in developmental mathematics, even among students with the greatest need.

Thrive: Key Result #3
Students taking two or more remedial math courses have results almost equivalent to other students.  The predicted probability of completion for students with multiple remedial math courses is 23.8%, compared to 26.7% for similar students without multiple remedial math courses.

Note that this study was based on data from prior to the reform movements in developmental mathematics.  Even then, the results were reasonably good and indicate that the remediation was effective at leveling the playing field.

Thrive: Key Result #4
This is the best of all: students who complete all of their math remediation have statistically equivalent 2-year degree completion compared to similar students (34.0% vs 34.7%).

This result refutes the common myth that taking multiple remedial math courses spells doom for students.  The data show that completing math remediation does what it is meant to do: help students complete their degree.

 

I encourage you to take a look at this research; it’s likely that you will spot something important to you.  More than that, we should all begin to present a thrive narrative about developmental mathematics — because that is what the data is showing.

 
Join Dev Math Revival on Facebook:

 

The Case for Remediation

Today, I am at a state-wide conference on developmental education (“MDEC”), where two presenters have addressed the question “is remediation a failure?”.  As you likely know, much of the recent conversation about developmental mathematics is based on a conclusion that the existing system is a failure.  The ‘failure’ or ‘success’ conclusion depends primarily on who is asking — not on the actual data itself.

The “failure” conclusion is presented by a set of change agents (CCA, CCRC, JFF); if you don’t know those acronyms, it’s worth your time to learn them (Complete College America; Community College Research Center; Jobs For the Future).  These conclusions are almost always based on a specific standard:

Of the students placed into developmental mathematics, how many of them take and pass a college-level math course.

In other words, the ‘failure’ conclusion is based on reducing the process of developmental mathematics down to a narrow and binary variable.  One of today’s presenters pointed out that the ‘failure’ conclusion for developmental math is actually an initial-college-course issue: most initial college courses have high failure rates and reduced retention to the next level.

The ‘success’ conclusion is reached by some researchers who employ a more sophisticated analysis.  A particular example of this is Peter Bahr, who has published several studies.  One of these is “Revisiting the Efficacy of Postsecondary Remediation”, which you can see at http://muse.jhu.edu/journals/review_of_higher_education/v033/33.2.bahr.html#b17.

My findings indicate that, with just two systematic exceptions, skill-deficient students who attain college-level English and math skill experience the various academic outcomes at rates that are very similar to those of college-prepared students who attain college-level competency in English and math. Thus, the results of this study demonstrate that postsecondary remediation is highly efficacious with respect to ameliorating both moderate and severe skill deficiencies, and both single and dual skill deficiencies, for those skill-deficient students who proceed successfully through the remedial sequence.  [discussion section of article]

In other words, students who arrive at college needing developmental mathematics achieve similar completion outcomes to those who arrive college-ready.  There is, of course, the problem of getting students through a sequence of developmental courses … and the problem of antiquated content.  Fixing those problems would further improve the results of remediation.

One of the issues we discuss in statistics is “know the author”: who wrote the study, and what was their motivation?  The authors who conclude ‘failure’ (CCA, CCRC, JFF) are either direct change agents or organizations designed to support change; in addition, these authors have seldom included any depth in their analysis of developmental mathematics.  Compare this to the Bahr article cited above: Bahr is an academic (a sociologist) looking for patterns in data relative to larger issues of theory (equity, access, etc.), and he did extensive analysis of the curriculum in ‘developmental math’ within the study before producing any conclusions.

Who are you going to believe?

Some of us live in places where our answer does not matter … for now, because other people in power roles have decided who they are going to believe.  We have to trust that the current storms of change will eventually subside and a more reasoned approach can be applied.

In mathematics, we have our own reasons for modernizing the curriculum; sometimes, we can make progress on this goal at the same time as the ‘directed reforms’.  Some of us may have to delay that work, until the current storm fades.

Our work is important; remediation has value.  Look for opportunities to make changes based on professional standards and decisions.

I’ll look for other research with sound designs to share.  If you are aware of any, let me know!

 


 

 

Avoiding the Beginning Algebra Penalty

The most commonly taken math course in two-year colleges is beginning algebra; if we select a math student at random, there is a 21% probability that they are enrolled in a beginning algebra course.  On average, taking a beginning algebra course either does not improve the odds of passing intermediate algebra … or actually decreases the odds of passing.  #NewLifeMath #MathLiteracy

According to the 2010 Conference Board of the Mathematical Sciences report (CBMS, http://www.ams.org/profession/data/cbms-survey/cbms2010-Report.pdf), about 428,000 students were enrolled in a beginning algebra course at community colleges, out of a total of 2.02 million mathematics enrollments.  The next most common enrollment was intermediate algebra (344K), followed by college algebra (230K) and pre-algebra (226K).  These extreme enrollments in courses within a long sequence have got to stop … see other posts on ‘exponential attrition’.
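The ‘exponential attrition’ argument is just compound multiplication: each additional course in a sequence multiplies completion by another pass rate and another re-enrollment rate.  A quick sketch with hypothetical rates (these numbers are invented for illustration, not taken from the CBMS report):

```python
# Illustration of 'exponential attrition' across a long course sequence.
# The pass and re-enrollment rates below are hypothetical.

def sequence_completion(pass_rate, reenroll_rate, n_courses):
    """Fraction of starters who pass every course in an n-course sequence,
    assuming each course is passed at pass_rate and each passer continues
    to the next course at reenroll_rate."""
    completion = pass_rate  # everyone attempts the first course
    for _ in range(n_courses - 1):
        completion *= reenroll_rate * pass_rate
    return completion

# e.g. a 4-course path: pre-algebra -> beginning algebra
#      -> intermediate algebra -> college-level math
for n in (1, 2, 3, 4):
    print(n, round(sequence_completion(0.70, 0.80, n), 3))
```

Even with a respectable 70% pass rate per course, fewer than one in seven starters survives a four-course sequence under these assumed rates; losses compound multiplicatively, which is the force behind shortening the path.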

The main point today is this:

Evidence suggests that students incur a penalty when they enroll in a beginning algebra course.

Progression data is difficult to obtain at the cross-institution level.  When progression data is available, the format is often an overly simplistic comparison of students who placed at level N with those who took course N−1 and then course N.  Such summaries provide little information about the results of course N−1 (beginning algebra in this case).  At my institution, for example, those taking beginning algebra prior to intermediate algebra have a slightly higher pass rate in intermediate algebra than the course average.

However, this data is not research on the impact of beginning algebra.  Fortunately, our friends at ACT routinely conduct research on various components of the college curriculum.  In 2013, ACT released a research study on developmental education effectiveness (see http://www.act.org/content/dam/act/unsecured/documents/ACT_RR2013-1.pdf).  This ACT study used a regression discontinuity method, with a very large sample (over 100K students), to examine the impact of taking certain courses, with ACT Math score as a basic variable.  Since most two-year institutions do not use ACT Math as a placement test at this level, the sample included large numbers of students at varying score levels … a portion of whom took beginning algebra and then intermediate algebra, and a portion who took intermediate algebra only.
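The core of regression discontinuity can be sketched simply: fit the outcome-versus-score relationship separately below and above a placement cutoff, then read off the gap between the two fitted lines at the cutoff.  A minimal sketch with invented numbers (not ACT’s data or their actual model):

```python
# Minimal regression-discontinuity sketch (made-up data, not ACT's).
# Students below the cutoff take beginning algebra first; students at or
# above it go straight to intermediate algebra.

def fit_line(points):
    """Simple least-squares fit; returns (slope, intercept)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    slope = sxy / sxx
    return slope, my - slope * mx

def rd_effect(data, cutoff):
    """Jump in the fitted outcome at the cutoff (above minus below)."""
    below = [(x, y) for x, y in data if x < cutoff]
    above = [(x, y) for x, y in data if x >= cutoff]
    b_slope, b_int = fit_line(below)
    a_slope, a_int = fit_line(above)
    return (a_slope * cutoff + a_int) - (b_slope * cutoff + b_int)

# (ACT Math score, pass rate in intermediate algebra) -- invented numbers
data = [(15, 0.40), (16, 0.43), (17, 0.46), (18, 0.49),
        (19, 0.58), (20, 0.61), (21, 0.64), (22, 0.67)]
print(round(rd_effect(data, 19), 3))
```

In this toy data the fitted lines jump upward at the cutoff, meaning students placed directly into intermediate algebra outperform what the below-cutoff trend predicts; the ACT study reads its estimated effect off the same kind of discontinuity.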

The results were strong and negative:

[Figure: ACT (2013) results, beginning algebra first versus direct placement into intermediate algebra]
The dashed lines represent students taking beginning algebra prior to intermediate algebra.  The upper set of lines represents ‘receiving a C or better in intermediate algebra’.  For all ACT Math scores, the data suggest that students would be better served by placing them directly into intermediate algebra (a 6% or greater increase in the probability of success, regardless of ACT Math score).

This is not what we want, at all; our personal experience might suggest that reality differs from this research.  I believe that the research study is accurate, and that our own perceptions can mislead us about the general pattern.

What strikes me about this research is that the results form a consistent pattern even though we lack a standard for what is ‘beginning algebra’ and what is ‘intermediate algebra’.  In some states, this is defined by a governing body; overall, though, we have operational definitions — beginning algebra is a course called beginning algebra, using a book titled beginning algebra.

Both courses (beginning and intermediate algebra) are heavily skill- and procedure-based, organized around discrete chapters and sections.  In practice, intermediate algebra involves enough complexity that some understanding is required … while beginning algebra tends to reward memorization techniques.  To me, the research findings make sense.

We need to avoid the beginning algebra penalty by replacing beginning algebra with a modern course that builds reasoning (like Mathematical Literacy).  Students are ill-served when we ‘keep it simple’ … students are not prepared for the future, and we also reinforce negative messages about mathematics (“I am not a math person”).  As long as we teach beginning algebra, we harm our students — we help some, but harm a larger group.

The beginning algebra course is beyond rescue; no amount of tweaking and micro-improvements will result in any significant improvement.  It’s time to start over.

At my institution, we are expecting that our beginning algebra course will decrease over the next few years while Math Literacy grows.  [We also expect to move away from intermediate algebra, but that might take longer.]  I know of other institutions, like Parkland College in Illinois, which have gone further on this path.

What is your plan for getting rid of beginning algebra at your institution?


Co-Requisite Remediation in Tennessee

Has Complete College America (CCA) collaborated with the Tennessee Board of Regents (TBR) to create a great solution … or have they inflicted an invalid model on the students of the state?  I suggest that “data” will not answer this question. #CCA #CorequisiteRemediation

To summarize some key features of the Tennessee plan in mathematics, implemented state-wide this fall (2015):

  • All students are placed into college-credit mathematics
  • If the ACT Math score is below 19, that college level math course will be statistics or quantitative reasoning
  • If the ACT Math score is below 19, the student is required to enroll in a co-requisite ‘support’ course
  • This co-requisite support course involves all developmental math learning outcomes

These elements are taken from a TBR memo (http://www.pstcc.edu/curriculum/_files/pdf/cdc/1415/Features%20of%20Corequisite%20Remediation%20-%20Memo.pdf).

From what I can see, actual practice is pretty close to this plan … learning support classes are paired with a QR course and an intro statistics course (but not college algebra or pre-calculus).  The learning support courses list topics from arithmetic, algebra, geometry, and statistics.  I noticed that the QR courses tend to be more of a liberal arts math course — set theory, finance, voting, etc. (the course is called ‘contemporary mathematics’).  In the 4-year college setting, this type of liberal arts math course is usually offered without any math prerequisite.

The initial data from the Tennessee pilot look very good; in fact, my provost is smitten with the Tennessee program, and wants us to consider doing the same thing.  I think the plan will “work” fairly well in Tennessee because of the non-symbolic nature of their QR course (intro statistics is notoriously non-symbolic, in the algebraic sense) … and the fact that they block students from STEM.  [They also had an inappropriate prerequisite on the non-STEM courses; see below.]

In Michigan, we have tried to establish three paths in math … college algebra/pre-calculus, statistics, and QR.  For statistics and QR, we have established a ‘beginning algebra level’ prerequisite (beginning algebra or math literacy).  This level maps roughly to an ACT Math score of 17, and we require more algebra in my QR course than in the Tennessee course.  When the Tennessee plan ‘works well’, part of the reason is that students with ACT Math scores of 16 to 18 never needed any remediation for statistics or QR in the first place.

In other words, the good results from the co-requisite pilot are due, in significant part, to the math prerequisite for the college-level courses.  ACT Math = 19 (the Tennessee cutoff) is a bit low for college algebra, but it is too high for statistics and QR (even if the QR course is more algebraic, like mine).  Tennessee could have achieved the benefits for about 30 to 40% of their students by changing the prerequisite on two courses to be more realistic; they had established an ‘intermediate algebra’ prerequisite for all college math, which is not appropriate.  Changing the prerequisite would have helped many students without requiring them to take another class.

The problem we face is not that ‘bad ideas’ are being used; the problem is that policy makers evaluate ideas only at a global level, when the meaning of any statistical study derives from analysis done at a fine-grained level.  Aggregated data is either useless or dangerous, and ‘aggregated’ is all policy makers consider.

CCA says “the results are in”.  Nope, not at all … we have some preliminary data about some efforts, and that data does not necessarily show what the aggregated numbers suggest.

