Category: politics of developmental mathematics

Generalities

I have to admit that I am a bit grumpy.  I even have to admit that I was grumpy yesterday in a meeting at my college about developmental education. #Pathways  #MathProfessors

As you may know, I have been in the profession of developmental mathematics for quite a while; in a way, I stumbled upon this work back in 1973, when I was not able to find a high school teaching job … I had spent a year working in a local car dealership doing odd jobs, and managed to get interviewed for an adjunct position at the local community college.

After some time in the work, I discovered that there were both rewards and significant challenges.  I stayed with the job, and eventually connected with AMATYC and our state affiliate; that connection was a key turning point in my life.  All of us involved with mathematics in the first two years of college have a responsibility to our profession and the professional group (AMATYC).

Our work is incredibly important; we make a difference in student lives every day.

So, the grumpiness … essentially, the profession that I have been committed to for over 40 years has been under attack for the past few years.  Reports, foundations, policy makers, and state lawmakers have stepped into our work; many have declared that developmental mathematics is a failure, and many suggest that students would be better served by being placed directly into college-level courses with ‘support’.  These attacks, filled with pseudo-data and carrying the features of propaganda, seek to preempt the faculty responsibility to maintain the curriculum in colleges.

Generalities … the attacks take some valid criticisms of developmental mathematics and use them, with the backing of external forces, to create the types of change that certain groups want to see.  Generalities … the challenge of speaking the truth while recognizing the extremes of variation in the work.

At one point in my meeting yesterday, I made some comments about the guided pathways work being started here.  My college has a long history of separation between academic departments, and between academics and service functions; my ‘generalities’ were meant to suggest that prolonged effort will be needed to overcome decades of those work climates.

Generalities … when a person does not agree with generalities, the response is often “don’t speak in generalities” (which is what I was told yesterday).  Generalities are the only way to describe a system; this is comparable to having shared definitions in a mathematical system.  Generalities are not the end of the conversation, nor the only factor in decision making; a successful human system requires long-term effort among the community involved.

I am tired of the ‘generalities’ presented as attacks on developmental mathematics.  We know that much needs to be fixed, and I am confident that we (the professionals in the field) can create solutions which will serve our students better.  Some people lob generalities at us in the same way that people lob the “f bomb” in groups; there is an element of bullying involved when outsiders state generalities about how bad our work is.

Rare is the profession where non-professionals are able to implement specific procedures within the profession.

We need a “TEA Party” type movement; perhaps call it “Legislated Enough Already” (LEA) or “Bashed Enough, Dummies” (BED).

Thanks for reading!

 Join Dev Math Revival on Facebook:

GPS Part III: Guided Pathways to Success … Informed Choices and Equity

In the structures used for Guided Pathways to Success (GPS), colleges are encouraged to provide information to students about selecting a major.  That is great, obviously, until one reads the next detail:

Colleges use a range of information such as past performance in high school to provide recommendations to students about programs of study that match their skills and interests. http://completecollege.org/the-game-changers/

In other words, we would limit the information given to each student based on our interpretation of their background.  I want every student coming to my institution to consider (even dream about) goals that exceed their history and the accidents of their background.

The push to have students select a program of study is well-intended … we all want students to ‘succeed’.  However, we cannot be so short-sighted that we encourage students to consider only the goals that seem reasonable based on whatever data we happen to have available.  Such methodologies tend to maintain social class and economic standing; therefore, I see a fundamental conflict between this GPS method and a basic purpose of community colleges … upward mobility.

Statistics is a poor tool for limiting choices at the individual level.  In medical uses of data, providers can come very close to a valid ‘limiting’ of choices … when the statistical analysis has a small margin of error because the underlying physical process is well understood.  Education does not deal with small margins of error (not in this decade, anyway).

The CCA website repeatedly shows ‘data’ with the implication that the results are statistically determined.  For example … community college students average 81 credits for a 60-credit degree, ‘proving’ that students accumulate too many credits.  That only makes sense if you treat the 135% credit count as a measure of wasted effort; that has never been established for any group of students, even though the CCA would like us to believe it has.

What’s your thought on what the ‘extra’ 35% represents?  Personally, I look at that excess as being composed of several parts:

  • Excess remediation (should be 10% or less, but likely accounts for about 15%).
  • Intentional program credits (programs requiring 61 to 64 credits are common).
  • Intentional student choices (deliberately taking a course at the CC … often because it’s cheaper).  This one probably accounts for 10 points of that 35%.
  • Uncertainty causing choice of courses inappropriate for the student
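
A side note on the arithmetic: 81 credits against a 60-credit requirement is 21 extra credits, or 35% of the requirement.  Here is a back-of-envelope sketch in Python; the 81 and 60 come from the CCA figure quoted above, while the component shares are my rough estimates from the list, not measured data:

```python
# Back-of-envelope decomposition of the 'extra' credits earned at
# community colleges.  The 81 and 60 come from the CCA figure quoted
# in the post; the component shares are rough estimates, not data.
required = 60
average_earned = 81

excess_pct = (average_earned - required) / required   # 21/60 = 0.35
print(f"Excess: {average_earned - required} credits ({excess_pct:.0%} of the requirement)")

# Hypothesized components of that excess (as shares of the required 60):
components = {
    "excess remediation": 0.15,
    "intentional cheaper-at-CC choices": 0.10,
}
unexplained = excess_pct - sum(components.values())
print(f"Left for program design and uncertainty: {unexplained:.0%}")
```

The point of the sketch is that no single cause has to carry the whole ‘excess’; several ordinary factors plausibly add up to it.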

I do not see the rationale that says this 35% ‘excess’ means students must choose a program of study EARLY, and that we should direct them to fields appropriate to their background.  I believe the CCA does not understand the community college environment and the factors influencing student choices about courses; this, combined with a bad use of one piece of data, results in a socially unacceptable suggestion (that we track students based on their background).  Following the CCA advice amounts to “keeping the wrong people out of the important programs”.

Equity is a fundamental part of our work in community colleges.  Equity and upward mobility are more important than arbitrary metrics of credits earned in community colleges.  State and local policy makers should be very concerned about following the CCA advice to implement GPS with a heavy emphasis on selecting the best program of study right away.


Co-requisite Remediation: When it’s likely to work

A recent post dealt with the “CCA” (Complete College America) obsession with ‘corequisite remediation’.  In case you are not familiar with what the method involves, here is my synopsis:

Co-requisite remediation has the student enroll in a credit course and a course that provides remediation at the same time.  It could also be called ‘simultaneous’ remediation, since the student deals with the credit course and the remedial work concurrently.

The co-requisite models are a reaction to the sequential remediation of the traditional models.  For mathematics, some colleges have two to five remedial courses in front of the typical college course (college algebra, pre-calculus, or a similar level).  The logic of exponential attrition points out the flaws in a long sequence (see https://www.devmathrevival.net/?p=1685 for a story on that).
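
The arithmetic of exponential attrition is easy to sketch.  A minimal illustration in Python, where the 70% pass-and-persist rate per course is an assumption for the example, not data from any college:

```python
# Illustrative only: the 70% pass-and-persist rate is an assumption,
# not data from any college.
def sequence_completion(rate_per_course: float, num_courses: int) -> float:
    """Probability a student clears an entire remedial sequence, assuming
    the same pass-and-persist rate at every step (independent steps)."""
    return rate_per_course ** num_courses

# Even a respectable 70% per-course rate collapses over a long sequence:
for n in range(1, 6):
    print(f"{n} course(s): {sequence_completion(0.70, n):.1%}")
```

Under these assumptions, fewer than one student in five would ever arrive at the college-level course after a five-course sequence.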

The co-requisite models in use vary in the details, especially in terms of the degree of effort in remediation … some involve 1 credit (1 hour per week) in remedial work, others do more.  Some models involve adding this class time to the course by creating special sections that meet 5 or 6 hours per week instead of 4.

I do not have a basic disagreement with the idea of co-requisite remediation.  Our work in the New Life Project included these ideas from the start; we called it ‘just-in-time remediation’, and that emphasis is why we did not include any course before the Mathematical Literacy course.

The problem is the presumption that co-requisite remediation can serve all or almost all students.  For open-door institutions such as community colleges, we are entrusted with the goal of supporting upward mobility for people who might otherwise be blocked … including the portion needing remediation.  The issue is this:

For what levels of ‘remediation need’ is the co-requisite model appropriate?

No research exists on this question, nor am I aware of anybody working on it.  The CCA, like the National Center for Academic Transformation (NCAT), does not generally conduct research on its models.  NCAT actually did some, though the authors tended to be NCAT employees.  The CCA is taking anecdotal information about a new method and distributing it as ‘evidence’ that something works; I see that as a very dangerous tool, and one we must resist.

However, there is no doubt that co-requisite remediation has the potential to be a very effective solution for some students in some situations.  Here is my attempt at defining the work space for the research question:  Which students benefit from co-requisite remediation?

Matching students to remediation model:

Here is the same information as text:

Portion of prerequisite material needed | Never learned it    | Misunderstands it   | Forgotten it
Small portion (5% to 25%)               | Co-requisite model  | Co-requisite model  | Co-requisite model
Medium portion (30% to 60%)             | Remedial course     | Remedial course     | Co-requisite model
Large portion (65% to 100%)             | Remedial course(s)  | Remedial course(s)  | Remedial course

The 3 by 3 grid is the problem space; within each cell, I have placed my hypothesis about the best remediation model (with the goal of minimizing the number of remedial courses for each student).
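
The grid can also be written as a simple lookup rule.  This is only a sketch of the hypotheses above; the cutoffs and assignments come from the table, and the function and category names are mine:

```python
# Hypothesized matching of students to remediation models; the
# assignments mirror the 3-by-3 grid above and are untested hypotheses.
MATCH = {
    ("small",  "never learned"):  "co-requisite model",
    ("small",  "misunderstands"): "co-requisite model",
    ("small",  "forgotten"):      "co-requisite model",
    ("medium", "never learned"):  "remedial course",
    ("medium", "misunderstands"): "remedial course",
    ("medium", "forgotten"):      "co-requisite model",
    ("large",  "never learned"):  "remedial course(s)",
    ("large",  "misunderstands"): "remedial course(s)",
    ("large",  "forgotten"):      "remedial course",
}

def recommend(portion_missing: float, issue: str) -> str:
    """Map the fraction of prerequisite material a student lacks (0 to 1)
    and the nature of the gap to a hypothesized remediation model."""
    if portion_missing <= 0.25:
        size = "small"
    elif portion_missing <= 0.60:
        size = "medium"
    else:
        size = "large"
    return MATCH[(size, issue)]

print(recommend(0.40, "forgotten"))   # a medium-sized gap of forgotten material
```

Writing it this way makes the research question concrete: each of the nine cells is a separate claim that could be tested, rather than a single blanket policy.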

As you probably know, advocates like the CCA have been very effective … some states have adopted policies that force extensive use of co-requisite remediation “based on the data”.  Of course the data shows positive outcomes; that happens with almost any reasonably good idea, simply because there is a good chance of the right students being in the room, and because of halo and placebo effects.

What we need is some direct research on whether co-requisite remediation works for each type of student (like the 9 types I describe above).  We need science to guide our work, not politics directing it.

 

 

Data – Statistics – The CCA Mythologies (or Lies?)

Decision makers (and policy makers) are being heavily influenced by some groups with an agenda.  One of those groups (Complete College America, or CCA) states goals which most of us would support … higher rates of completion, a better educated workforce, etc.

A recent blog post by the CCA was entitled “The results are in. Corequisite remediation works.” [See http://completecollege.org/the-results-are-in-corequisite-remediation-works/]  CCA sponsored a convening in Minneapolis on corequisite remediation, where presenters shared data on their results.  Curiously, all of the presenters cited in the blog post had positive ‘results’ for their co-requisite model; clearly, this means that the idea works!  [Or, perhaps not.]

The textbook we use for our Quantitative Reasoning course includes a description of ‘results’ that were very (VERY) wrong.  A survey was done in the 1930s using telephone lists and country club membership lists, back when relatively few people had telephones.  The survey showed a clear majority favoring one candidate, based on over 2 million responses.  When the actual election was held, the other candidate won, by a larger margin than the survey had predicted for its favorite.

The lesson from that story is “we need a representative sample” before using the data in any way.  The CCA results fail this standard badly; their ‘results’ are anecdotes.  These anecdotes do, in fact, suggest a connection between co-requisite approaches and better student outcomes.  College leaders, including my own provost, report that they “have seen the numbers” showing that this method is promising.
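
The failure mode in that survey story is easy to reproduce.  Below is a small simulation with invented numbers … a wealthy 20% minority is the only group the ‘survey’ reaches, and it votes differently from everyone else.  The point is only that a huge, unrepresentative sample still gets the answer wrong:

```python
# Simulated sampling bias (all rates are invented to mirror the 1930s story).
import random

random.seed(1)
population = []
for _ in range(100_000):
    wealthy = random.random() < 0.20            # e.g. owns a telephone
    # The wealthy minority favors candidate A; the rest mostly favor B.
    favors_a = random.random() < (0.65 if wealthy else 0.40)
    population.append((wealthy, favors_a))

true_share_a = sum(f for _, f in population) / len(population)

# A 'survey' built from telephone and club lists reaches only the wealthy:
biased_sample = [f for w, f in population if w]   # huge, but unrepresentative
biased_share_a = sum(biased_sample) / len(biased_sample)

print(f"True support for A:         {true_share_a:.1%}")    # about 45% (A loses)
print(f"Telephone-list survey says: {biased_share_a:.1%}")  # about 65% (A 'wins')
```

No amount of additional responses from the same biased list would fix the estimate; only a representative sample would.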

Based on that exposure to questionable data, we are running an experiment this summer … students assessed at the basic arithmetic level are taking an intermediate algebra course, with extended class time and a supplemental instruction leader.  Currently, we still allow students to use an intermediate algebra course to meet a general education requirement, so this experiment has the potential to save those 11 students two (or more) semesters of mathematics.  I have not heard how this experiment is going.

Curiously, similar ‘results’ for other ideas would not be used to justify a change by decision makers and policy experts.  People have used data to support extreme and wrong notions in the past, and still do today.  The difference with the CCA is that certain methodologies are treated as automatically valid, with the co-requisite remediation models at the top of that list.

Scientists and statisticians would never reach any conclusions based on one set of data.  We develop hypotheses, we build on prior research, we refine theories … all with the goal of better understanding the mechanisms involved in a situation.  Co-requisite remediation might very well work great — but not for all students, because very little ‘works’ for all students.  The question is which students, which situations, and what conditions will result in the benefits we seek.

The CCA claim that “corequisite remediation works” is a lie, communicated with a handful of numbers which support the claim, and the lie is then repeated again and again as propaganda to advance an agenda.  We need to be aware of this set of lies, and do our best to help decision makers and policy experts understand a scientific approach to solving problems.

Instead of using absolute statements, which will not be true but are convenient, we need to remain true to the calling of helping ALL students succeed.  Some students’ needs can be met by concurrent remediation; we need to understand which students and find tools to match the treatment to them.  Our professionalism demands that we resist absolutes, point out lies, and engage in the challenging process of using statistics to improve what we do.

