HOW TO - Assess Like A Ranger

This is the first blog in a new series focused on HOW to adapt and adopt the educational practices used on public lands for classroom impact. Many of these topics come from discussions and questions in our Facebook community or the National Park Classroom Ranger professional development program.  

Question - How do park rangers assess the impact of the learning experiences they lead? 

This question arose during a discussion of interpretive teaching with regard to EVERY classroom teacher’s favorite subject - assessment. Tools like grades, attendance, and student work let classroom teachers measure their effectiveness in helping learners progress toward learning goals. So how exactly do park rangers assess their impact without traditional tools like these? 

One quick note - assessment is how we measure the acquisition of learner knowledge to guide our next steps in the classroom. Grading is something different, so if you’re looking for tips on grading, sorry to disappoint. 

IDP - Setting the Baseline for Self-Assessment 

While teaching may look different in a park, there are established professional standards based on best practices, just as there are for classroom teaching. A classroom teacher who wants to set themselves up for effective teaching, or get a sense of how they are doing, can find a rubric that defines and articulates effective instruction and use it to plan a lesson and its delivery. They can then use that same rubric to self-assess or during an official observation. The NPS’s Interpretive Development Program (IDP) has similar tools and processes. Each element of effective interpretation comes with a corresponding rubric of quality indicators. For example, an interpretive presentation can be judged “effective” by the presence or absence of the following indicators:

  • Communicating appropriate depth

  • Demonstrating understanding of the park’s story and resources

  • Providing a balance of facts from multiple points of view

  • Presenting in an engaging manner

  • Demonstrating creativity on the part of the ranger

  • Encouraging critical thinking about high-level concepts like stewardship and global issues

If you really want to get into the weeds on this (and fair warning, the weeds are DEEP), you can look at the standards for each interpretive practice on the official NPS IDP website here →

Classroom Application - Self-assessing your lesson plans against a rubric of “best practices” can help you make sure the learning experiences you facilitate are designed with high impact in mind. It can also surface opportunities you might be missing. Most states publish a rubric like this online (here is an example from Indiana), and other organizations offer great ones as well (such as this Teacher Practices rubric from PBLWorks).  


On The Fly - Informal Assessment

While the IDP rubrics and indicators are helpful for self- or peer assessment, judging whether a program is meeting the needs of your visitors requires different tools that can be used in the moment. Yes, you can always ask visitors to complete surveys or post-program reflections, but these are summative; by the time you read them, your learners are walking away and your chance for teachable moments and connections has passed. Just like in a classroom, rangers make adjustments as their programs proceed, and the way to do this is with informal, formative checks.  

The following are strategies I gleaned from watching Ranger John during a battlefield program at Fredericksburg and Spotsylvania National Military Park, along with some things I remember from my own time as an interpreter. 

Audience Retention - I remember my supervisory ranger at Fort Point telling me that I’d know how my talks were landing with people “if you end with the same number you start with”. Visitors have many choices for how to spend their time in a park, so they will only join a program and stick with it if it resonates with them. Beginning with ten people and ending with five is pretty normal. Ending with the same ten is considered great. Ending with more than ten, because bystanders joined partway through, is above and beyond.    

Classroom Application - learners in the classroom may not “walk away” but they can disengage in several other ways. Increased requests for bathroom breaks, off-topic behavior, and other indicators of disengagement can be signs that it's time to pause and switch your approach. At the very least, you should take a moment to check in with your learners.   

Level of Participation - Interpretive programs, at their best, are dialogic: plenty of audience participation is interwoven so that the group builds knowledge together rather than the ranger simply sharing theirs. Open-ended opportunities for the audience to interact with the ranger and each other let the ranger see whether what they are sharing is landing as intended and provoking a reaction. If hands go up or conversation starts when prompted, that’s a great indicator that things are headed in the right direction.

Classroom Application - building in moments that require discussion or learner participation can help you figure out whether your lesson design is resonating with your class. Turn-and-talk is one strategy that is quick and effective. If you prompt your learners with a question or a learning resource (a film clip or short reading) and the level of turning and talking is low, it might be time to revisit or re-teach your previous points.  

Prompting Reflection - When visitors can take the facts and stories a ranger shares and turn them into understanding or meaning-making, it’s a great indication that the program’s themes and content are resonating and making an impact. Posing questions that promote thinking and leaving space for reflection is a time-honored move that many rangers use to see how their programs are landing. 

Classroom Application - providing a place in your lessons for learners to stop and think so they can make connections to past knowledge or drum up questions to help them clarify their learning can take many forms. Exit tickets, need-to-know lists, or reflective journal entries are all possible approaches that can give you insight into individual learner needs. 

Red-Light Questions - Participation by visitors isn’t something you should take as a given. When I gave talks at Fort Point, I was taught to design my program using a Green Light - Red Light approach. Green lights were easy, icebreaker-type tasks, like raising hands, sharing where you are from, or taking a picture. Yellow lights required more interactivity, such as sharing thoughts out loud or answering a basic knowledge question. Red lights were risky, requiring participants to expose their values to the larger group. If your audience was willing to respond to red lights, you knew your program was resonating with them AND that you had built rapport, both of which are hallmarks of effective learning.  

Classroom Application - seeing what your learners are willing to do helps you understand the impact your lessons or activities have on them. The more success they have with complex or personal questions, the more learning is going on.   

“One More Thing…” - If a program is slated to last an hour but the ranger is still there 20 minutes later fielding additional questions, that’s usually a good indication of impact. After his battlefield talk, I had to wait nearly 30 minutes to chat with Ranger John because of all the questions he was fielding from visitors. His program was clearly effective at building knowledge and provoking curiosity. 

Classroom Application - if your learners linger after class, seek you out during lunch, or send you emails, all of these are indicators that your lessons have prompted thinking and that understanding is being developed. This is true even if these “after-hours” visits are just clarifying questions, because learners wouldn’t seek you out if they hadn’t bought in or weren’t invested in your lesson. 

Additional Resources - In the 2018 edition of the ACE Handbook, the NPS created a “Bingo Card” populated with indicators of successful interpretive practices. Many of them are assessment tools or indicators of high-quality learning, so feel free to use it as an additional reference.  
