Study Suggests 15 Credits per Semester is Better: Think Again (2023)

Summary of the Recent Study

A recent study reported that students who took 15 rather than 12 credits a semester did better in college than those who carried the lower load. According to the study, those carrying 15 credits had a higher overall GPA, higher 1st to 2nd year retention and higher on-time graduation rates. These benefits reportedly hold true regardless of entering high school GPA and socio-economic status.

The study was conducted and the results released by EAB, an entity that provides research and technology to campuses. (Their enrollment division may be familiar to some within the academic community: Royall and Company.) Details were recently reported in InsideHigherEd, including a hyperlink to the research methodology, which was later added to the EAB blog. For the record, the EAB blog itself specifically refers to the study at the University of Hawaii.

Reading the study report, one would think we had found nirvana. And, if the data were accurate, imagine the benefits of more students completing college in four years (which is what 15 credits per semester would enable). Student debt would be lower, as there would be at least one less year of college; students would be more likely to earn a degree; and we could discard all the other information out there on contributors to student success (or lack thereof), as we’d have our “answer” to the completion conundrum.

Be Careful When Reviewing Data

Sometimes, we rely on and interpret data in ways that are flawed. Sadly, this is one of those cases, as best I can tell from the disclosed information and from other studies within the world of higher education. That is bothersome on many levels, not the least of which is that it will encourage more students to increase their credit load from 12 to 15 credits in the hope of garnering the promised and reported results.

I appreciate that the EAB blog tells policymakers not to rely exclusively on their data to make institutional changes but then goes on to say, and I quote, “Students, no matter their academic preparedness or financial situation, should consider taking 15 credits per semester—and then work with their advisor to make the decision that makes the most sense for them.”

Ask these questions: how do most students decide how many credit hours to take? What is the role of their academic advisors? What is required for their program? What courses are available? What credit load is compatible with needed work schedules?

Let’s start with which students are within the scope of the study: full-time, first-time students. Transfer and part-time students are not within the scope of the study. Next, the study uses Pell grant receipt as a surrogate for low socio-economic status, which is, in and of itself, not problematic. But, who are the students studied? What is their age? What is their race? What is their gender? From where do they hail – a way of assessing whether they live on or off campus? What is their area of study? Are any enrolled in professional programs like nursing or radiologic technology or physical or occupational therapy? What characteristics did those students who carried 15 credits a semester have compared to students who did not “opt up”? Were there athletes in the group (DI, DII and DIII)? And, did they increase their credits in the semester of their season?

And, the methodological summary reports that 1.3 million student records were reviewed from 137 public and private colleges. Let’s assume for a moment, then, that these data did not include community college students, even those at community colleges that grant a four-year degree. What are the characteristics of the colleges and universities in the study? Are they selective institutions? Highly selective? Open enrollment? Where are they located? What is their size? How many of the courses were online? What grades did the students receive in the courses, on a disaggregated basis? How were withdrawals from courses treated?

Now, ask whether any regression analyses were done and, if so, which variables were regressed. Were the results statistically significant, and what are the “p” and “t” values?
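To make that question concrete, here is a minimal sketch, in Python with pandas and statsmodels, of the kind of model one would want to see reported, with controls for the confounders raised above and below. The file name and column names are hypothetical illustrations; no student-level dataset accompanies the study.

```python
# Illustrative sketch only: a regression of first-year GPA on credit load
# with controls for plausible confounders. The file and column names are
# hypothetical; no student-level data are released with the EAB study.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("student_records.csv")  # hypothetical student-level file

# OLS of first-year GPA on a 15-credit indicator, controlling for
# institution, Pell status, and entering high school GPA.
model = smf.ols(
    "first_year_gpa ~ took_15_credits + C(institution) + pell + hs_gpa",
    data=students,
).fit()

# The summary reports coefficients with their t statistics and p values,
# exactly the figures the public write-up does not disclose.
print(model.summary())
```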

Now, to delve deeper, look at this redacted sentence in the EAB blog: “These students ... were retained at a rate nine percentage points higher (90% versus 81%).” Really? The only institutions with that 1st to 2nd year retention rate are selective or highly selective institutions. Something is way off here. Or, the 137 institutions studied are not representative of the over 6,000 educational institutions in the United States.

To be specific, look at these retention data reported in the College Scorecard (prepared by the US Department of Education based on IPEDS data): 1st to 2nd year retention nationally is, on average, in the 60s at four-year colleges. Looking specifically at the virtually open-enrollment institution where I was president, 1st to 2nd year retention is listed at 61% (within the range of average for like institutions). The state land grant university, the University of Vermont, has a 1st to 2nd year retention rate of 87%, well above the average. And Middlebury College, an elite, highly selective institution, has a 1st to 2nd year retention rate of 96%, nearly perfect! Ask: what colleges and universities were in the EAB study?

As noted, the EAB blog specifically calls out the program at the University of Hawaii. Looking again at the Scorecard, we start with this realization: there are four separate campuses identified. Which campus’s retention rate should we use when we are measuring? Let’s assume we are looking at the University of Hawaii campus with the most students: UH at Manoa. The 1st to 2nd year retention rate there is identified as 79%, well above average.

Let’s ponder too whether the data are sufficiently reflective of low-income students. Look at this statement in the methodology description: “The cohort of the Pell analysis was limited to the roughly 18,000 freshmen from six private institutions that Pell data was available for. Four thousand of these freshmen were Pell recipients and fourteen thousand were not.” Forgetting for a moment the linguistic gaffes in this sentence, ask whether the number of Pell students studied mirrors national averages.

Nationwide, in the years covered by the EAB study, the percentage of Pell students attending college ranged from 33% to 38%. This would mean, using simple math and the lowest percentage, that of the 18,000 students studied for purposes of Pell, approximately 6,000 should have been Pell-enrolled students. Yet, EAB looked at only 4,000 Pell students. This leads to two key observations: the data are not representative of the lowest-income students, and the institutions studied had low percentages of Pell students enrolled. This latter point means that these were among the more selective colleges, which generally enroll lower percentages of Pell-eligible students.
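The arithmetic is simple enough to check; the sketch below merely applies the national Pell share range quoted above to the 18,000-student cohort described in the EAB methodology, with no figures beyond those already cited.

```python
# Back-of-the-envelope check of the Pell cohort, using only figures cited
# above: the 18,000-student cohort and the 33-38% national Pell share range.
cohort_size = 18_000       # freshmen in EAB's Pell analysis
pell_observed = 4_000      # Pell recipients actually in that cohort

share_low, share_high = 0.33, 0.38   # national Pell share range

expected_low = cohort_size * share_low     # about 5,940 students
expected_high = cohort_size * share_high   # about 6,840 students
observed_share = pell_observed / cohort_size   # 0.22, roughly 22%

print(f"Expected Pell students at national rates: "
      f"{expected_low:,.0f} to {expected_high:,.0f}")
print(f"Pell students actually studied: {pell_observed:,} ({observed_share:.0%})")
```

At roughly 22%, the studied cohort sits well below even the low end of the national range, which is the gap described above.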

I could go on and on, but let me make three points.

First, I think the EAB analysis mistakes correlation for causation, a common problem with data analyses. The reported success could be “caused” by the studied students’ increase in credit hours from 12 to 15. But it is also possible that other factors contributed to the success – the type of institution studied, the characteristics of the students studied, the absence of sufficient low-income students within the assessment. Perhaps it is the institutions, not the credit hours, that account for the successes reported. In other words, the EAB study reflects correlation, not causation.

Second, for low-income students, particularly in the first year, adjustment to a college environment is fraught with difficulty. In addition to academic challenges, there are myriad psychosocial challenges. Even if the results were accurate as reported, I wonder whether the game is worth the candle. In other words, students may get better grades, but are they better adjusted? Are they developing the needed psychosocial skills? Are they taking the best courses for their future? Are they taking “gut” courses to get to the proposed 15 credits?

Third, I appreciate the completion agenda. Far too many students have debt but no diploma. That is an expensive problem. But, let’s get real: the barriers to success are plentiful. Were the institutions studied representative of the 6,000-plus colleges in our nation? If they weren’t, then the results send the wrong message: more is better. As a general matter, whether in education or any other field, more is not necessarily better. Less can be more. For many students, 12 credits helps them settle in and get adjusted to life away from home. It enables depth over breadth. It allows a student to concentrate.

Perhaps we need to rethink credit hours too. Is the number needed for graduation optimal? How did we get to 120 credit hours anyhow? Does it matter whether courses are 2, 3 or 4 credits? What are we measuring? Seat time? Competencies? It is true that 12 hours a semester over 8 semesters will not enable graduation (see the simple arithmetic below). But is that a problem with the credit hours, or with students under-enrolling in credits? Do we want students to take courses in the summer? Do we want to give credit for J-term or for internships that are unpaid and unsupervised? What about clinical hours?
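The credit-hour arithmetic behind that point is straightforward; this sketch simply assumes the standard 120-credit bachelor’s degree and eight semesters over four years, as discussed above.

```python
# Simple credit-hour arithmetic, assuming a standard 120-credit degree
# and eight semesters (four academic years).
CREDITS_REQUIRED = 120
SEMESTERS = 8

for credits_per_semester in (12, 15):
    earned = credits_per_semester * SEMESTERS
    shortfall = max(CREDITS_REQUIRED - earned, 0)
    print(f"{credits_per_semester} credits/semester: {earned} credits "
          f"after {SEMESTERS} semesters (shortfall: {shortfall})")
```

At 12 credits a semester, a student ends four years 24 credits short, roughly an extra year of enrollment at that pace; at 15, the student lands exactly at 120.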

Worries

I worry about college access and completion and about graduates’ successful entry into the workforce. I worry about student debt loads, but I am more worried about debt that does not lead to a degree with value. I worry about the quality of education.

I also worry about facile solutions that fail to take into account the actual lives of the students enrolled in colleges today. Yes, there are many first-time, full-time students. But the population of students enrolled today and tomorrow is changing. More and more students are non-traditional; many are transfer students; many are part-time. Many are minorities. Many attended weaker high schools. Many are low-income and first-generation.

Would that life were simple and problems could be solved with one solution – like increasing credit hours. I am reminded of the old adage: if something sounds too good to be true, it probably is. And so it is with the EAB study. Surely some students will be able to increase their course load, and perhaps that will facilitate greater retention and higher graduation rates. Were I a guessing person, which I am not, I’d suggest that the results are a product of numerous variables – and here’s the point: those variables are not disclosed, discussed or deconstructed in the EAB study.

The watchword here needs to be caution. In the meantime, let’s keep studying and reflecting on student success. Nothing could be more important if we want to ensure that the workplace of the future, the communities in which we live and the health of our nation are the best they can be.
