
A national public policy initiative to improve developmental studies in postsecondary education

Live Blogging: ATD/DEI State Policy Meeting


Going to give live blogging a whirl at the Achieving the Dream/Developmental Education Initiative State Meeting. Great people doing cutting-edge work.

10:30 AM EST
Just sat in on small-group sessions where state teams reviewed the new Developmental Education Initiative State Policy Inventory. This will be a wonderful tool for states to assess their current policies on developmental education.

The big conversation is whether states should set a floor in developmental education: what to do with students who test below a given score on the placement exam? Can Adult Basic Education funds be used to provide instruction to students who have a high school diploma but test below even the developmental level? The short answer is that it depends on the state, but more research needs to be done in this area.

11:00 AM EST
Data-Driven Improvement Plenary with Linda Baer (Bill & Melinda Gates Foundation) and Travis Reindl (National Governors Association).

Linda Baer is talking analytics. She recommends the new book Analytics at Work.

Key issues in analytics are the balance of information and insight. Information is about what happened, what is happening, and what will happen. Insight gets to the next level and asks: how and why did it happen, what's the next best action, and what's the best/worst that can happen?

Predictive modeling: the process by which a model is created or chosen to best predict the probability of an outcome.
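To make the definition concrete, here is a minimal sketch of what such a model looks like in practice: a logistic model that turns student inputs into a completion probability. The coefficients and input variables (placement score, first-term credits) are purely illustrative assumptions, not from any system discussed at the meeting.

```python
import math

# Hypothetical coefficients from a previously fitted logistic model
# (intercept, placement-score weight, credits weight). Illustrative only.
INTERCEPT, W_SCORE, W_CREDITS = -4.0, 0.05, 0.10

def p_complete(placement_score: float, first_term_credits: float) -> float:
    """Predicted probability that a student completes, under this toy model."""
    z = INTERCEPT + W_SCORE * placement_score + W_CREDITS * first_term_credits
    return 1 / (1 + math.exp(-z))  # logistic function maps z to (0, 1)

# A student with a placement score of 60 who attempts 12 first-term credits:
print(p_complete(60, 12))
```

In a real analytics effort the weights would be fit to historical student records; the point here is only that "predicting the probability of an outcome" means mapping observable inputs to a number between 0 and 1 that can guide targeted intervention.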

The keys are good data, a willingness to do something about it, analytical leadership, and setting strategic targets. Based on the data, can you make pinpoint decisions that will have a measurable impact?

Applying Analytics
What is the cost of student failure?
What is the cost of customized student services and instruction?

Minnesota State Colleges and Universities built an accountability dashboard with key, publicly available data. Not only did they share the data, they evaluated staff and departments against it and used it to drive improvement.

Question from Tamara Clunis, TX HECB – How do states build institutional capacity to do analytics?

Linda Baer – Make completion such an important issue – train trustees on what they should do to support completion. Use the Complete College America/National Governors Association progress and completion metrics.

11:24 AM EST – Travis Reindl, NGA
23 states have agreed to use the NGA/CCA completion metrics.

The goal of the metrics is to change the conversation in states, not to push uniform use and acceptance of the metrics.

NGA is diving deeper into the metrics, beyond the 30,000-foot view. The key issue is the relationship between the supply of students and the demand for their skills. They are using Tony Carnevale's research on job projections.

NGA will host a policy academy to work with states on how governors and states can more effectively use the data from the metrics at the state policy level.

Key issue: how to move states away from across-the-board cuts to higher education toward a more strategic use of resources based on completion metrics. Washington and Arizona are already models.

Regulation is another tool. How do you differentially regulate institutions based on the goals we have for them and the data about their performance? Virginia leads the way on this approach.

Another key strategy is to bring together Departments of Labor and Education more effectively.

Caution: a gradation of data is needed; certain data are appropriate for state policy. Avoid getting into the weeds with legislators. The deeper data are best utilized at the system/institutional level.

We need to both add and subtract data. If data are not helpful, stop collecting them.

There needs to be a strategy to help states manage all of this. We may have an "anarchy of good will," with too many people working these efforts in states. How do we tie the state efforts together? ATD, DEI, CCA, etc. need to find a way to agree on baseline data that we all can work from.

Question from Gretchen Schmidt, Virginia – Can data outcomes approach stifle innovation and creativity in education? Are there lessons learned from K-12?

Linda Baer – We can't lose sight of how we develop creativity in students. A new era means new solutions. The Voluntary System of Accountability tries not to lose sight of the importance of these measures.

How do we use the Common Core to support this work?
Travis Reindl – States that have adopted the standards need to have a statewide conversation on how to leverage them in higher education. You can't have different conversations with different institutions; everyone needs to be involved and engaged.

Francesca Purcell, Massachusetts – What goals do we set, and what data do we use? With CCA, ATD, the Voluntary System of Accountability, and so many efforts with different goals and different data, how do we manage it all?

Travis Reindl – States need to answer for themselves what they need and how these various efforts will support their goals.

11:50 AM EST – Break for lunch.

1:15 PM EST – Using Data to Achieve Continuous Improvement in Student Completion Rates. Davis Jenkins, Community College Research Center.

Davis Jenkins will discuss his excellent new research paper, Redesigning Community Colleges for Completion: Lessons from Research on High-Performance Organizations.

Davis Jenkins – This might be a little heretical to mention at a DEI meeting, but maybe we are a little too focused on developmental education.  Of course it is important, but the question is developmental education for what? We need to look at supports and programs throughout the system and ensure their effectiveness.

Definition: concentrated students have completed nine semester credits (three courses) in a particular field/major.

Students who enter but never concentrate rarely earn a credential. The result for developmental education is that unless there is a focus or concentration to prepare students for, your overall success rates may suffer.
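The concentrator definition above is essentially a query over enrollment records. As a rough sketch (the record shape and student IDs are invented for illustration), flagging concentrators looks like this:

```python
from collections import defaultdict

# Hypothetical enrollment records: (student_id, field, credits_completed).
records = [
    ("s1", "nursing", 3), ("s1", "nursing", 3), ("s1", "nursing", 3),
    ("s2", "nursing", 3), ("s2", "business", 3), ("s2", "math", 3),
]

def concentrators(records, threshold=9):
    """Return students with >= `threshold` credits completed in any one field."""
    totals = defaultdict(float)
    for student, field, credits in records:
        totals[(student, field)] += credits
    return {student for (student, field), total in totals.items() if total >= threshold}

print(sorted(concentrators(records)))  # s1 has 9 nursing credits; s2's are spread out
```

The key design point matches Jenkins' definition: credits must accumulate within a single field, so a student with nine credits scattered across three fields does not count as concentrated.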

Different programs have different success rates; you need to measure each program's success against its own historical performance. It does not make sense to measure programs against one another.

Key questions: How can we increase the rate at which students choose a program of study, and how quickly can they complete it? Then, do those who earn a credential get employed in the workforce?

Unfortunately, we offer too few services to help students get focused and find a path. As a result, they often get moved into developmental education without a focus. Consequently, students are less likely to succeed.

We need to create clearer pathways for students. Research from Judith Scott-Clayton shows that given many choices, students don’t fare well.

Davis' research finds that the coherence of the system, from assessment and placement to curriculum and supports, is critical to student success.

Paul Sussen – Agrees that students need to choose a program, but there is a logistical problem: the more students who choose programs, the more advisers you need, and those advisers are often faculty who are already busy.
Davis – A solution would be to create more clearly defined programs of study, combine small programs, and create more simplicity for students; that would mitigate some of the logistical problems.

What about students who are in developmental education and are unclear about their goals?
The realities of the economy mean students don't have time to contemplate their choices; we need to educate them about those realities. It does not work to set students on a path starting with developmental education without some direction. A solution would be to articulate the five main program areas you offer and then contextualize developmental education within those program areas.

We have found that students who complete developmental education but never reach the three-course/nine-credit concentrator threshold are not successful. The problem is likely the gatekeeper courses. If we contextualize developmental instruction, we can prepare students for the gatekeeper course that is tripping up so many of them. It is not all developmental education's fault; we need to look at the entire system.

Larry Nespoli, New Jersey – Programs are more than the academic, credit-based track; there are also non-credit tracks. Shouldn't we be referring students who are not ready for longer credit-based programs into shorter, non-credit options that have market value?
Davis – The bottom line is that students need some kind of postsecondary credential. Whatever the program, it must lead to some kind of certificate, certification, or other credential. Admittedly, we don't expose students to the non-credit options, particularly students with limited basic skills.

Sharon Morrisey, North Carolina – The vast majority of our students transfer without a credential. We are restructuring our certificates so that students must take more rigorous courses and have fewer choices, to ensure they are better prepared for transfer.
Davis – Research shows that students who take a more rigorous path are more successful. We need to tell more people that this is the cost-effective way to go.

Tamara Clunis – Provide students better information about which credential programs have market value and the academic requirements for those programs. The problem is that open-access, open-entry programs, where students are admitted a day before classes start, prevent a thoughtful presentation of their choices and options.
Davis – Research is clear that ABE students, for instance, who choose a program will be more successful. However, changing the system will take time and will be difficult.

Washington – We learned that the curriculum matters most. We could provide more advisers and other services, but structuring the curriculum around contextual learning through I-BEST was critical to success. Without a commitment to get students to an outcome, how is open access really equitable?

Should we require students to choose a program before getting financial aid? The problem: students could choose a program just to access financial aid, not because they are committed to it.

Davis – Clearer pathways of what is expected in the program could prevent students from choosing a program that is not suited for them.

3:45 PM EST – Performance Funding and its Implications - Kevin Dougherty, CCRC; Michelle Andreas, Washington State Board of Community and Technical Colleges; Ron Abrams, Ohio Association of Community Colleges

Kevin Dougherty will provide an overview of state performance funding.  See his new research paper on the topic here.

Higher education leaders' motivation is that an enrollment-based model is not viable for generating public revenue for their institutions. Policymakers are interested in a more market-oriented approach.

Four-year institutions opposed it because it provided a mechanism to cut funding and/or intrude in their affairs. The concern is that the models will not account for institutions' unique elements.

The immediate impact of performance funding is increased awareness of state priorities for higher education, which changes behavior at institutions.

Emerging research shows that performance funding can impact student success. Data from the Washington Student Achievement Initiative are promising.

Negative side effects:

  • Some evidence of restricting access to specific programs, but no evidence of restriction across the board.
  • Mission distortion from not investing in programs that have little payoff in the system.
  • Temptation to lower academic standards.

These systems are typically abandoned by the states that institute them.

  • Illinois and Missouri discontinued.
  • Washington discontinued and revived.

Demise often comes during drops in revenue: higher education agrees to drop performance funding in an effort to protect base funding. Erosion of interest among institutions, policymakers, and business also leads to demise.

Policy Implications:

  • Support of higher education institutions is critical.
  • Building and sustaining business support is important.
  • Reach out to equity-oriented groups to support it as a vehicle to create greater equity.
  • Insulate performance funding from the state revenue cycle.
  • Ensure original champions of program have successors.
  • Using completion of courses as funding standard may threaten academic standards.
  • Increase impact by having performance based funding take up a larger percent of state funding.
  • Use of appropriate indicators tied to institutional missions is important.
  • Fund both outcome and progress/momentum point metrics.
  • Build the capacity of institutions to reflect on and use data.
  • Combat the tendency to reduce access to institutions.
  • Tie each mission of the organization to indicators (ABE, remedial education, etc.).
  • States should pay for the compliance costs.

Michelle Andreas-Washington State Board of Community and Technical Colleges
We did everything wrong during the first go-round, and shortly after implementation it was discontinued. The new incarnation of the performance funding model flows from our "Tipping Point" research, which showed the point at which students reap a financial benefit from higher education. That led to the development of momentum points that track students toward the tipping point. The performance-based system was built around this strategy and resulted in the Student Achievement Initiative.
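Mechanically, a momentum-point system is simple: colleges earn points each time a student crosses a milestone on the way to the tipping point, and funding follows the points. The milestone names and point values below are assumptions for illustration, not the actual Student Achievement Initiative scheme.

```python
# Hypothetical milestones and point values, loosely modeled on the idea of
# momentum points. These are NOT the actual SAI milestones or weights.
MILESTONES = {
    "completed_first_15_credits": 1,
    "completed_first_30_credits": 1,
    "passed_college_math": 1,
    "earned_credential": 1,
}

def momentum_points(student_milestones):
    """Points a college earns for one student's achieved milestones."""
    return sum(MILESTONES[m] for m in student_milestones if m in MILESTONES)

# A student who finished 15 credits and passed college math earns 2 points:
print(momentum_points(["completed_first_15_credits", "passed_college_math"]))
```

The design choice worth noting is that intermediate progress earns points, not just final completion, which is what lets the metric reward movement "further and faster" rather than only credentials.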

The success of this system is now leading to policy discussions to move it to four-year institutions.

We are resisting Complete College America and Complete to Compete because their metrics are different enough that we worry about losing focus.

Ron Abrams, Ohio Association of Community Colleges
Ohio higher education is generally very decentralized, with little influence from the legislature. The development of the Ohio Strategic Plan created momentum for this approach.

Unlike Washington's, Ohio's funding comes out of the base; it was not an add-on appropriation. It does include success points and offers the opportunity to negotiate indicators related to an institution's mission, although the success points are getting all the attention and not much has happened with the mission-oriented indicators.

Richard Kazis – What percent of funding is enough? How do we ensure that institutions are well prepared to implement?

Kevin Dougherty – The appropriate percentage depends on the nature of the system and institutions. Ten to fifteen percent will certainly get institutions' attention. The question is how different percentages translate to different institutional responses. The key is to limit negative side effects and maximize impact.

Michelle – The amount of money in Washington is small, so it is not what drives implementation. The roll-out, the data, and the fact that the metrics made sense were much more important to buy-in and success. "Moving students further and faster" became our mantra. We worked hard to convince faculty and made small initial payouts. Once the payouts happened, everyone realized how politically popular this approach was, and that created greater urgency.

Ron Abrams – 20% of base funding translates to about $100 million. The system resulted in a review of policies to create greater standardization across campuses.

Peter Quigley, Hawaii – How do you stop gaming the system?
Kevin – It’s a constant process of monitoring what is happening and tweaking approaches.  As a result, the system is constantly changing.  Key is how to manage the change and to keep people on board.  The consultation structure needs to be regular and substantive.

Francesca Purcell – What advice do you have for states that have no history in performance funding?
Michelle – The continued and robust involvement of campuses is very important. Look at the data, better understand what is and is not working on campuses is important. Involve all your constituents – legislators, other state agencies, etc.  Make sure the measures are ones that institutions can influence and understand.
Kevin – Spend a lot of time on how you are going to design it. Be clear about incentives, but understand that implementation in a complex organization may mean incentives play out differently. The process is economic, sociological, and political. Ease into the system; the Big Bang doesn't work.

How do we reward the P-20 system?
Kevin – There is no one P-20 system (preK, K-12, postsecondary). The best you can do is build in joint rewards: reward both 2-year and 4-year institutions for successful transfers, reward high schools for lower remediation rates, and reward 2-year institutions for successfully remediating students.

What about those hardest to serve students, such as ABE students?
Michelle – Our system is not strong enough; there are disincentives to serve basic skills students. The result is that investments in basic skills go where colleges know they will be most successful. Consequently, the investment is going into I-BEST rather than traditional basic skills.
Ron- The commitment to ATD and the resulting statewide success initiative has provided support for the performance based funding system and maintaining a commitment to all students.

Wednesday, February 9
8:30 AM

Breakfast Plenary: Mapping and Integrating National Completion Initiatives.

I was joined on this panel by Nan Poppe of Completion by Design, Barbara Endel of JFF, and Domy Raymond of Complete College America.

Themes of the session focused on how to help align national efforts and determine a successful approach from the state policy to the institutional level. Thoughts shared include:

  • Creating flexibility in the data requests of states. Different initiatives have slightly different data requests; allowing some flexibility would be helpful.
  • How do we not only examine the strategies but also move change at the institutional level?
  • Matching data with state policy: assess whether policies are successful, or explain why there are different results across states.
  • Finding easier ways to disseminate information; the 65-page report is not always useful.
  • Engaging policymakers in the conversation that developmental education is not a symptom of system failure but a key strategy for increasing completion.

11:00 AM – Latest Research and Means of Dissemination – Shanna Smith Jaggars – CCRC

Strategies that CCRC has examined to understand completion.

  1. Assessment testing – testing is weakly predictive of success and does not effectively diagnose deficiencies.
  2. Acceleration models – not a lot of solid data, but what exists is promising.
  3. Contextualization – a promising approach, but underutilized.
  4. Non-academic supports – the strategy chosen is less important than its focus and substance.
  5. Program/institutional structure – evidence supports simplifying processes.
  6. Online courses – not as successful as traditional courses.
  7. Organizational improvement.

Organizational improvement is not a separate area; it should be infused through all the other strategies.

Complexity and Structure
The concern over the complexity and number of choices students face suggests limiting student choices. Faculty and other institutional leaders are concerned about limiting choices; however, there may be ways to simplify the decision process without eliminating choices.
Recommendations:

  • Simplify the structures and bureaucracies that students must navigate.
  • Form cross-functional teams of faculty, student services staff, and administrators.
  • Map out student experiences from first contact; identify where and why students get frustrated.

Faculty Engagement
Substantial organizational improvement requires strong engagement.
Recommendations:

  • Faculty must have a deep understanding of the goals and methods of reform.
  • Empower employees as part of reform.

Academic Alignment and Assessment
Instructional program coherence is central to successful organizations. It includes a well-coordinated curriculum, a common instructional framework, clearly defined learning outcomes, and integrated assessments and academic supports. Unfortunately, this is not a strong suit for colleges.

Recommendations:

  • Faculty across disciplines coordinate efforts around reading, math, and writing learning outcomes.
  • Help part-time instructors understand goals.
  • Help students understand program goals.
  • Help high schools understand goals.

Continuous Improvement
Practices of high performance organizations

  • Set learning outcomes
  • Measure student learning/progression
  • Identify achievement gaps
  • Align practices/policies to improve outcomes
  • Evaluate and improve alignment efforts
  • The whole process actively engages faculty and external stakeholders.
  • Rethink committee structures, professional development, and incentives.

What can states do?

  • Convening and Connecting role – facilitate discussion, discuss data and disseminate research.
  • Online learning may be place to employ continuous improvement efforts.

11:35 Discussion – How do we make sense of all this information and research to make decisions?

Sharon Morrisey, North Carolina – All of our assumptions about teaching and learning are being challenged by research; that is both good and bad news. Our equilibrium is out of balance, which will result in change.

How do you connect college practice to state policy?
In North Carolina we are looking at various policy levers that will be influenced by research. We are examining assessment, placement, and delivery models together. The key question is how to increase our data analysis capacity.

Once we have gathered evidence, we are sharing it with academic officers and others who have not engaged data previously.

Mike Collins – As states begin to share data, it makes sense to share how that is done from state to state, develop best practice.

Michelle Andreas, Washington – Convened institutional teams and used open space technology to let participants develop the agenda and identify key issues for discussion. Presentations needed to be evidence-based. We also brought in data experts to share research and "shake things up," but also included institutional presentations of data.

Mike Collins – Student Success Centers provide a space for this type of activity.