
Measuring the Effectiveness of Nursing Education

By HCPro's Advisor to the ANCC Magnet Recognition Program®  
   October 12, 2010

Several years ago, while working in nursing professional development and education, consultant Gen Guanci, MEd, RN-BC, CCRN, realized that she was doing herself and her department a disservice by reporting the "productivity" of the nursing education department.

At the time, Guanci was working on reports based on quantitative data—the department conducted 20 classes, for example, serving 200 nurses. (She calls this concept "butts in the seat reporting.") 

What they were not doing, she explains, was demonstrating what outcomes those filled seats then led to. 

"In other words, what was different as a result of our educational activities?" says Guanci, who is now a consultant with Creative Health Care Management in Minneapolis.

She undertook the task of identifying and explaining that qualitative aspect. 

"I linked our department activities to identified desired outcomes of specific education," says Guanci. These outcomes were then linked to the organizations' goals, and even pay-for-performance initiatives. 

"Many of these are stretch goals or outcomes some educators have a hard time relating their work to," explains Guanci. 

For example, say your department holds education classes on computerized physician order entry (CPOE). One of the main reasons organizations implement CPOE is to reduce transcription errors. After your classes, the order transcription error rate drops by 66%. This helps validate the critical importance of nursing professional development's role in patient safety and outcomes. 

Beginnings

The reason Guanci changed her view on how to demonstrate the effectiveness of nursing education stems from her previous organization's experiences on the ANCC Magnet Recognition Program® (MRP) journey and its pursuit of the Baldrige Award. 

"The organization I worked for at the time we implemented these concepts was [MRP] designated," says Guanci. "It's not about how many people are in the seats—it's about results." 

She advocates the use of Professor Donald Kirkpatrick's Four Levels of Evaluation. 

Level two of Kirkpatrick's model, for example, looks at what new knowledge the student has retained.

Guanci felt that every education department must be aware of the need to pursue recognition for the results of its work. "I went this way originally because in times of economic challenge, education departments are often the first to be slashed and burned," she says. "Leadership often doesn't perceive the value the department provides." 

In her department at the time, Guanci was creating an outcomes report every six months—and having a terrible time getting credit for the work the department did. She knew she had to alter the way the department's work was reported. 

Since then, the change has been notable. 

"The process still occurs there," says Guanci. "They've added positions instead of cut them—and [the education department has] really been able to create proof of worth for their department." 

Feedback

In her previous organization, the education department felt it had sufficient evidence to show that the foundational education it provided had helped drive down transcription error rates. 

"Educators have a hard time trying to take credit for things that change in an organization that start with their education," Guanci says. "We know education alone doesn't invoke change. It's a combination of many factors." 

There's a partnership that needs to be formed—educators provide the education, and then managers need to make sure that improved performance occurs following the learning. 

"You're not saying that it's only because of your work, but it was the foundational behaviors leading to future behaviors," says Guanci. 

There is also the matter of making sure the education department chooses appropriate targets. The system in which this concept was implemented was set up so that Guanci's department could access all the necessary outcomes data, which proved valuable when targeting areas for education. For example, a nursing director called and reported an increase in errors whenever nurses used a pain pump. The director then asked that the entire department be educated on pain pumps. 

Before making a decision, Guanci and her team drilled down to determine the cause of the errors. There had indeed been a spike in pain pump errors, but the errors were few in number (three), and all had occurred on one particular nursing unit. Looking deeper, the team found that the errors had all occurred on the same shift, and ultimately that they were the result of one nurse who needed additional training.

"I made the decision that we were not going to educate the entire hospital on this matter," says Guanci. "It wasn't a hospitalwide problem. We saw this nurse needed remediation." 

This is the department's mind-set. Always look to the data and hunt for cause and effect. "It's for planning education as well as reporting education," Guanci says. 

"This is huge," says Guanci. "It's something [education departments] have often never been asked to do before." 

At the national seminars where Guanci has spoken, she has found the topic to be "a bugaboo"—people are asking the wrong questions. 

"I'll hear the question, 'How are you measuring your hours per patient day?' Education shouldn't be measured in patient day!" she says. 

Measurement gurus often try to fold education into the same metrics used for any other measure of RN productivity. However, when it comes to measurability, education is as much an art as it is a science. 

"Sometimes you'll hear a department automatically jump to education—for example, let's have a class for customer service," says Guanci. "Educators will put together a customer service class. Then the original requester comes to you and says, 'But they still are engaging in the same problematic behavior!' It really is a matter of putting forward the mind-set of what do you want to see happen as a result of this education before you even plan the program."  

You have to define it before you can achieve it. 

Another challenge: Quantifying evaluations 

An evaluation might ask, "Did the program meet its objectives?" The answer might simply be yes, all of the objectives defined in the program were met. But were they put into practice after the program was over? 

"The hardest part is educating the educators on how to write an outcome," says Guanci. "I would ask for outcomes and I'd see four CPR classes with 22 attendees. That's not an outcome! We have to step back." 

Remember, you can't evoke these kinds of changes alone. It requires the entire department to understand what an outcome really is and hold fast to that belief. 

It might sound cynical, but the truth is just because you told students something doesn't mean you educated them—and just because you trained them doesn't mean they're doing it. 

Don't be afraid to let leadership see and know what the education department is doing. "You are having an effect on patient safety and outcomes in the organization, so claim it," says Guanci.

What are the Four Levels of Evaluation? 

The Four Levels of Evaluation were first published in 1959 by Donald Kirkpatrick, professor emeritus at the University of Wisconsin and a past president of the American Society for Training and Development. 

The four levels of Kirkpatrick's evaluation model essentially measure:

  • Reaction: What the student thought and felt about the training
  • Learning: The resulting increase in knowledge or capability 
  • Behavior: The extent of behavior and capability improvement and its implementation/application
  • Results: The effects on the business or environment resulting from the trainee's performance

________________________________________________________________________
This article was adapted from one that originally appeared in the October 2010 issue of HCPro's Advisor to the ANCC Magnet Recognition Program®, an HCPro, Inc. publication.
