Systems Thinking and Simulations in the US Public Policy Community:
NASPAA’s Student Simulation Competition
Laurel McFarland, NASPAA; Emily Reineke, NASPAA; Bobby Milstein, ReThink Health; Rebecca Niles, ReThink
Health; Gary Hirsch, ReThink Health; Ernest Cawvey, ReThink Health; Jack Homer, ReThink Health; Anand Desai,
The Ohio State University; David Andersen, University at Albany; Rod MacDonald, University at Albany; Richard
Irving, University at Albany
Abstract
During the fall of 2014 the Network of Schools of Public Policy, Affairs, and Administration (NASPAA) partnered
with Rippel Foundation to produce the first NASPAA National Simulation competition built on the ReThink
Health simulation platform. The event was held on Saturday February 28, 2015 and 181 graduate students from
93 institutions participated in-person at five regional sites. During the competition, students worked with the
simulator in small teams to craft long term policy solutions to a complex network of problems facing the health
care system in the United States. Following is a description of how this competition came to be, identification of
the key stakeholders, a description of the environment which made the competition a reality, details about the
simulation model which was the backbone of the competition, an explanation of the planning and logistics for
each site, a list of materials that were created to turn the tool into a learning experience, and speculations about
what is happening as a result of this competition and the role for simulation-based exercises in public
management and policy education.
Creating the NASPAA Student Simulation Competition
NASPAA is the international membership association of nearly 300 graduate schools of public service education.
The organization ensures excellence in education and training in public service as well as promotes the ideal of
public service. Annually, approximately 30,000 students are enrolled in NASPAA schools; these students go on to
become leaders in all sectors, but primarily the government and nonprofit sectors. Across all NASPAA schools, there are
approximately 4,000 faculty members with expertise ranging from policy specific areas to administration and
management. Public service education is a smaller field than business or law education, and NASPAA students
are different because of their desire to work in the public sector and serve the public.
NASPAA had three main goals in establishing a competition: to emphasize the usefulness and transformative
potential of a graduate degree in public service, to increase student learning and engagement, and to showcase
student excellence by highlighting their critical thinking, leadership, and collaborative decision making skills.
Students have multiple options for post-baccalaureate study, and many are not aware of graduate degrees in
public policy or public administration. Further, many students are not aware of the skills they could obtain with
an MPA or MPP; the MBA and the JD are still more widely recognized degrees. But competitions can be one way
to raise brand identity and awareness of a public service degree. NASPAA, as an accreditor of these programs,
also has a stake in trying to raise the quality of pedagogy and instruction at its schools. Contextual learning,
including simulation, has received considerable attention in the academic literature as a means of raising quality
and learning outcomes in higher education. NASPAA has therefore long sought to enhance graduate schools’ use
of internships, hands-on learning, cases, and, more recently, innovative learning experiences like simulations.
NASPAA also had specific expectations for the competition. First was to advance pedagogy at all NASPAA schools
by providing each member school the opportunity to send students, and the opportunity for faculty to use the
simulation in their classrooms after the event. Second, NASPAA hoped that students would be engaged by the
excitement and challenge of competition and empowered by the sense that they have the skills and confidence
to tackle complex public problems. Finally, NASPAA hoped that by publicizing the competence of
public service graduates and the academic training they received, employers and the next generation of
students would be excited to consider an MPA/MPP degree for their post-baccalaureate study.
To accomplish these goals and achieve the expectations, a simulation was chosen rather than other competition
formats such as case competitions or projects. This is because simulations offer several practical advantages
including a faster and more intensive event, but more importantly, a host of conceptual and theoretical
advantages. A simulation is a powerful decision-making tool that provides a means of testing alternate policy
approaches quickly, an opportunity to take big actions and see big consequences, and a means to promote
public service values by enabling students to consider intergenerational and long term impacts of policy and
management decisions. Additionally, a simulation was well suited for a one-day, immersive competition which
placed students against time constraints and competitive pressure. The world of public service is rife with scarce
time and resource constraints; a simulation in a contest format is a fun but powerful introduction to that reality.
As discussion of the competition continued, a few factors emerged as important tenets that shaped its
structure. First, students should be able to easily access the competition without depending on the
resources — both financial and human capital — of their school. This meant that students could compete
individually and did not need a full team from their school. Second, faculty members have many competing
demands for their time and NASPAA did not want to ask them to spend additional time coaching or preparing
students; this led to a competition that required minimal preparation work. Third, the competition should
complement the learning occurring in the classroom and be minimally disruptive to other curricular events,
which meant that the competition would occur in one day on a weekend. The final important decision point was
the effort to showcase the collaborative nature of public sector problem resolution. Hence, the decision was
made to create teams with students from different programs. Given all of these constraints and desiderata, the
competition format which emerged was a one-day competition held at multiple, easily accessible sites where
multi-program teams would compete to solve a complex and grand challenge.
Challenges where students compete to demonstrate their problem-solving skills are fairly common. The standard
format is to provide a case study that purportedly mirrors a generic, but realistic problem situation requiring
managerial, organizational, and policy skills to help address the problem. In this competition, NASPAA wanted to
challenge the students intellectually in addition to exposing them to a decision making tool which is not readily
available in most public service education curricula. With these goals in mind, a problem needed to be identified that
offered sufficient complexity and had national attention. The United States healthcare system offered this
complexity, and NASPAA knew that a sophisticated simulation model, the ReThink Health Model,
existed which mimicked the issues a community would encounter as it adjusted to changes brought
about by the Affordable Care Act of 2010. Additionally, this model was already in use in a course at the
Rockefeller College of Public Affairs & Policy at the University at Albany — SUNY. The focus of this course was to
help students learn how systems thinking could address complex problems where changing dynamics and
feedback were essential characteristics of the problem.
In November 2014, with the competition structure and topic in focus, a steering group formed to write a case
that mimicked how a state health commissioner’s office would implement healthcare reform. The simulation
model would show in real time the implications of their decisions over the next 25 years, and students could
test potential scenarios before determining the best solution. This competition was thus defined as a case study
with an accompanying computer simulation which created a rich learning environment while also allowing a
competitive element.
Evolution of the ReThink Health Model
The ReThink Health Model was developed as part of a process that began in 2007, convened by the Fannie E.
Rippel Foundation of Morristown, NJ and a group of influential change-makers from health care, business,
politics, and energy. In addition to extensive discussions, studies, and consideration of multiple scenarios for the
US health care system, the group was introduced in 2009 to a simulation model called HealthBound, developed
for the Centers for Disease Control and Prevention (CDC). HealthBound built on earlier community-level
models developed in the 1990s and chronic illness models that emerged in the early 2000s. HealthBound is a
system dynamics (SD) model of the entire US population and its health and health care and was created to help
the public better understand the need for including “upstream” prevention efforts in national health reform.
HealthBound illustrated the potential for using simulation models as a tool for guiding and promoting health
reform.
The Rippel Foundation subsequently decided to focus ReThink Health on local health reform and developing the
tools for communities to use for their reform efforts. A SD simulation model was seen as a key element in this
“toolkit”. It would enable communities to envision the effects of various reform initiatives and build strategies
around initiatives most likely to yield the greatest improvements. Rather than create the model in isolation, the
first version of the ReThink Health model was developed in collaboration with a group in Pueblo, Colorado, a
community facing the real challenges of local health reform. Pueblo had joined the Institute for Healthcare
Improvement’s Triple Aim program and needed a way of evaluating alternatives for achieving the program’s
objectives of better health and care and lower cost.
The ReThink Health team worked with a Pueblo steering committee in 2010 and designed a model to assist local
communities with their reform efforts, balancing local concerns with more generic considerations that would
make the model useful to many other communities. Staff supported by the Kaiser Health Plan in Colorado
helped to assemble the data from a variety of sources such as the Colorado Department of Public Health and
Environment and the Colorado Health Institute. Additional data were drawn from national databases of the
CDC’s National Center for Health Statistics. An interface developed by Forio enabled the people in Pueblo to use
the model themselves, run a number of simulations with different combinations of initiatives, and observe the
combined impacts. Their learning from using the model became the basis for a strategic plan. Pueblo is currently
working with ReThink Health to update their model so that it can continue to guide their efforts.
The model developed in Pueblo and elaborated extensively since then represents the population of a region,
disaggregated into multiple groups differentiated by age, income, and insurance coverage. The health status and
risks of each of those groups are represented, along with their utilization of the region’s health care system and
the cost of that utilization. Simulations project likely trajectories of health status, costs, and other key variables
as various initiatives are implemented. Initiatives available to users include ones that help reduce cost, such as
the coordination of care; ones that streamline care delivery and make its quality more consistent; and ones that
help to reduce behavioral and other health risks.
More implementations followed in other metropolitan areas such as Atlanta and Cincinnati with local data
inserted to particularize the model to those areas. There was additional learning with each one about the needs
of local communities and some incremental improvements to the model. In its most successful
implementations, the model became a nexus around which stakeholders with diverse interests could meet and
develop mutually acceptable strategies for moving their region toward better health and care and lower cost.
After several such implementations, an “Anytown” version was developed to represent a hypothetical
community with one-thousandth the population of the US (beginning with 280,000 in year 2000 and rising to
400,000 by year 2040) and with characteristics that corresponded to US averages for health status and risks and
health care utilization and costs. An elaborate interface makes the model accessible to a wide range of users.
The interface presents users with many options for creating change in the simulated health system and tools for
viewing and understanding the output of simulations.
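As a rough illustration of the Anytown population assumption, growing a single stock from 280,000 in 2000 to 400,000 by 2040 implies a constant net growth rate of roughly 0.9% per year. The following sketch is a one-stock toy, not the ReThink Health model itself:

```python
# Minimal one-stock sketch, NOT the actual ReThink Health model: project
# Anytown's population from 280,000 (year 2000) to 400,000 (year 2040)
# assuming a constant fractional net growth rate, stock-and-flow style.

def anytown_population(year, p0=280_000, p_end=400_000, y0=2000, y_end=2040):
    """Population stock in `year`, stepping one simulated year at a time."""
    rate = (p_end / p0) ** (1 / (y_end - y0)) - 1  # ~0.9% net growth per year
    pop = float(p0)
    for _ in range(year - y0):
        pop += pop * rate  # net inflow: births + in-migration - deaths - out-migration
    return round(pop)
```

The real model, of course, disaggregates this stock by age, income, and insurance coverage rather than treating the population as a single aggregate.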
This Anytown version has been used by scores of civic and professional groups to better understand the
dynamics of local health reform and the character of effective reform strategies. It has also been used with
students in public health and public administration programs to help them understand the complex systems
they will be dealing with when they join the workforce in their fields. This track record made the ReThink Health
model the ideal model to proceed with for the NASPAA competition.
Revising the Simulator for the Competition
Working with NASPAA, the ReThink Health team made a number of modifications to the then extant simulator
to make it easier to use in a competition format. There were three primary modifications: a “Super Score”
function, embedded diagnostics, and shortcuts to stakeholder concerns.
Creating the “Super Score” Function
During the competition, students and judges had minimal time to complete their tasks. To select a winning team
with less than one hour of judge discussion time, it was crucial to have one element of the final score be a
quantitative assessment of how well their solution performed, as determined by the simulator. This assessment
became known as the Super Score, and it measured five important health system metrics (health care costs,
death rate, quality of care, inequity, and employee productivity) which were consolidated and normalized into a
single number. The result was a single, quantitative score for each team and the ability to compare proposals
within sites and across the nation. This single Super Score also served as a clear indicator of team progress
toward possible system improvement. Each metric had a maximum score of 100, but it would be impossible to
concurrently score 100 on every metric; the maximum achievable score would be in the high 300s. Figure 1 is a
screenshot of the Super Scores collected from one competition site.

Figure 1: Screenshot of Super Scores
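The paper does not give the Super Score's exact arithmetic. As a purely illustrative sketch, one consistent reading is that each metric is linearly rescaled to 0-100 (inverting lower-is-better metrics such as the death rate) and the five normalized values are summed; all bounds and team values below are hypothetical:

```python
# Illustrative sketch only -- the simulator's actual Super Score formula is
# not published in this paper. Assumption: each metric is linearly rescaled
# to 0-100 (inverting lower-is-better metrics) and the five are summed, so
# strong performance on every metric lands in the high 300s.

def normalize(value, worst, best):
    """Map `value` onto 0-100, where `best` scores 100 (either direction works)."""
    return 100 * (value - worst) / (best - worst)

def super_score(metrics, bounds):
    """`metrics`: name -> raw value; `bounds`: name -> (worst, best)."""
    return sum(normalize(metrics[name], *bounds[name]) for name in metrics)

# Hypothetical bounds and one team's results (not real competition data):
bounds = {"costs": (1.0, 0.6), "death_rate": (9.0, 6.0), "quality": (0.0, 1.0),
          "inequity": (1.0, 0.0), "productivity": (0.0, 1.0)}
team = {"costs": 0.75, "death_rate": 7.0, "quality": 0.8,
        "inequity": 0.4, "productivity": 0.7}
```

Any monotone rescaling with a per-metric cap of 100 would produce the comparability across teams and sites that the text describes.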
Embedding Diagnostics in the Simulation
Each competition site had a technical leader on site who underwent basic training in the model. While these
leaders were familiar with the model, they did not have an in-depth understanding of healthcare or system
dynamics. That knowledge, typically supplied by the person who facilitates the experience in a community
setting, would be crucial for students so that they could understand the simulation results and make
improvements in future iterations. For this reason, diagnostics were embedded into the user interface to help
students understand the unintended consequences that resulted from their proposed strategies (see Figure 2).
Typical consequences (like unsustainable financing because of overambitious efforts, or the triggering of
unnecessary specialist procedures due to falling income after cost-cutting efforts) were flagged, and a graphical
story of causality was displayed to provide guidance on how to improve the Super Score by thinking both
systemically and strategically about how combinations of initiatives can be complementary. In addition to
easing the load on the technical leaders, this diagnostic feature also helped to reduce the natural tendency to
game the system by encouraging users to be more thoughtful in their simulated proposals.

Figure 2: Pitfalls Tab
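The mechanics of such diagnostics can be pictured as simple rules over simulated outputs. The variable names and conditions below are hypothetical, not taken from the actual Pitfalls tab:

```python
# Hypothetical sketch of rule-based diagnostics in the spirit of the Pitfalls
# tab; the output names and rules are illustrative, not the simulator's own.

def flag_pitfalls(run):
    """Inspect a simulated run (dict of outputs) and return warning strings."""
    flags = []
    # Overambitious efforts: initiative spending outruns the funds available.
    if run.get("initiative_spending", 0) > run.get("available_funds", 0):
        flags.append("Unsustainable financing: initiatives cost more than the funds available.")
    # Cost cutting lowers provider income, which pushes up specialist procedures.
    if run.get("provider_income_change", 0) < 0 and run.get("specialist_procedure_growth", 0) > 0:
        flags.append("Supply push: falling provider income is triggering unnecessary specialist procedures.")
    return flags
```

In the actual interface each flag was paired with a graphical causal story rather than a text warning, but the triggering logic is of this if-then character.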
Making Stakeholder Concerns Visible
While the focus of the competition was healthcare, most students had background knowledge in public policy
areas other than healthcare. Students did know that to implement lasting policy change, regardless of the
specific area, they need to have buy-in from stakeholders. This simulation needed to identify key stakeholders
and what was most important to them, in the same way that local communities currently use the model. In the
final task during the competition, students were asked to role-play as public health officials, hospital
administrators, medical specialists, and health insurance companies to ensure that the competition was more
than just an analytical exercise. Students were forced to contemplate the practical realities and implementation
challenges of bringing their proposal to reality. By creating a specific menu for stakeholders in the model and
directing students to metrics (represented in behavior charts over time) of concern to the various stakeholders,
they could discuss issues that local policy makers face during real-world implementation. This allowed teams
to propose compromises and quickly modify and test their proposals to address some of these stakeholder
concerns.
Competition Activities and Logistics
While the conceptual development of the competition had been occurring for a few years, planning for the
actual day of the competition began in September 2014, six months prior to the competition date.
Six Months Prior to Event
First, a date (Saturday, February 28) was selected and then host sites were approached about their availability
on this day and their interest in hosting. Sites were selected to be within driving distance of as many schools as
possible, and distributed across the United States. At this point, since the focus of the case was on the United
States Healthcare model, only US domestic host sites were chosen. The majority of NASPAA schools are located
on the East Coast, so three East Coast sites were selected: the University at Albany — SUNY, the University of
Maryland, and The University of Georgia. In addition, a Midwest site, Indiana University–Purdue University
Indianapolis; a South site, The University of Texas at Dallas; and a West site, the University of Southern California,
were selected.
Also in September, an initial kick-off meeting occurred with competition stakeholders: the NASPAA team, the
ReThink Health team, and David Andersen, who had experience using the ReThink Health model in a public
service education curriculum. This meeting set the framework for how the competition would operate and
identified the pieces that would need to occur: writing the case, identifying judges, training the technical
leaders, and determining site-specific logistics.
Three Months Prior to Event
After sites were selected, they were invited to participate in training at NASPAA’s Annual Conference in
November 2014. This training was for the site leader and the technical leader at each site. The Annual
Conference also served as the launch of the awareness campaign to educate schools about the competition and
to encourage them to send student participants. Following the conference, each of NASPAA’s nearly 300 schools
received information in the mail about the competition to post on bulletin boards.
The second in-person planning meeting occurred in December 2014 to discuss the first draft of the case, review
the status of judges, and discuss the requested adaptations to the ReThink Health model. In December, judge
recruitment began and their training requirements were identified. Each site needed three judges (for a total of
18) who were trained on the ReThink Health model; it was preferred that these individuals had experience in
healthcare or simulations.
1.5 Months Prior to Event
In the 6 weeks leading up to the competition, there were four major components that needed to be shepherded
forward for a successful event: the case, the judges, the sites, and the students. First, the case and all supporting
material needed to be finalized. While the case text had been drafted in the fall, the supporting material needed
to be created and finished. This included:
• Clear instructions for each of the 4 tasks.
• Worksheets for Tasks 1–3.
• A PowerPoint template for Task 4.
• A judging rubric (which was the same for judges to use and for peers to use).
• Short videos to show before Tasks 1–3.
• Self-tests of readiness for students to complete before Tasks 1–3.
Second, judges requested specific guidance on how to evaluate the teams and advice on how to allocate their
time during the day. This request led to the creation of a judges’ manual which included:
• Guiding questions to ask each team during each task.
• An overview of the judging spreadsheet.
• An explanation of how the winning team would be selected.
• Suggested choreography for how the judges should allocate their judging time.
Third, each site needed detailed information to ensure that the day ran smoothly. NASPAA created two key
resources to ensure that the day ran smoothly at each site:
• A PowerPoint which would be visible in a central area all day.
• A run-of-show document which detailed the minute-by-minute activities that needed to happen and
who needed to do them.
Finally, the students needed to receive e-mail communication on a specific timeline leading up to the event:
• February 13: Acceptance email.
• February 20: Assignment of team members and link to a pre-test.
• February 26: Detailed information with the case and instructions on how to prepare for the event.
Additionally, each student received a folder with printed materials when they arrived on-site. The folder
included the case, peer-judging rubric, acknowledgements, and social-media information. The folders and
additional materials were shipped to each site and arrived the Thursday prior to the event.
Day of the Event
Students arrived on site by 8 am on Saturday, February 28; received a nametag and a folder; signed a
photo/video release; and were instructed to meet their teammates. The day officially began at 8:30 am with
introductions and logistical information before moving onto the overview of Task 1 and playing the related
video. Then teams separated into their individual spaces to complete Task 1, which was identifying a policy
package without the help of the simulator. Judges circulated as observers during Task 1.
At 9:45 am, with Task 1 completed, students reconvened as a large group to watch the overview of Task 2 and
complete the second self-test of readiness. Students then returned to their individual spaces to complete Task 2,
which involved using the simulator to test different healthcare policy combinations to identify the set of policies
that would have the greatest impact in terms of improving health care costs, death rate, quality of care,
inequity, and employee productivity over a base run. Success in Task 2 was measured by the Super Score, in
which a higher score was better. Judges interacted with each team during Task 2. Host sites had lunch available
around noon for students to eat during this task.
At 1:10 pm, students reconvened as a large group to watch the overview of Task 3 and complete the third self-
test of readiness. In Task 3 students adapted their Task 2 scenario to create an implementable policy package
that met the needs of stakeholders. Judges interacted with each team during this task also. During Task 3, each
team created a PowerPoint presentation to give to their peers and a judge during Task 4.
In Task 4, running from 3:45 to 4:45 pm, teams were separated into 3 groups of 3 teams each. Within its group,
each team gave a 10-minute PowerPoint presentation and had up to 5 minutes to answer questions; presentations were
evaluated by two peer teams and one judge. Following the presentations, the judges conferred, reviewed the
judging spreadsheet, and assigned a final score to each team. Based on the score from the simulator obtained
during Task 2 (50% of the total score), the judges’ scores (35%), and the peer scores (15%), the final winner for
each site was determined.
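The stated 50/35/15 weighting amounts to a simple weighted average. The sketch below assumes all three components are expressed on a common scale before weighting, which the text does not specify:

```python
# Weighted combination per the stated weights (simulator 50%, judges 35%,
# peers 15%). Assumption: the three components are already on a common
# scale; the paper does not say how each was normalized before weighting.

WEIGHTS = {"simulator": 0.50, "judges": 0.35, "peers": 0.15}

def final_score(simulator, judges, peers):
    """Combine the three components into one per-site final score."""
    parts = {"simulator": simulator, "judges": judges, "peers": peers}
    return sum(WEIGHTS[name] * parts[name] for name in WEIGHTS)
```

For example, a team scoring 80 on the simulator component, 70 from the judges, and 60 from peers would finish at 73.5 under this reading.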
The judges then announced the winning team and that team was asked to make their presentation again to
everyone at the site. This presentation was videotaped for review by a panel of judges to determine the National
winner. After the final presentation, students were dismissed and materials from the day were collected.
Post-event
Immediately following the selection of regional winners, materials from each winning team were collected and
sent to a panel of 3 super-judges to review and determine the National winner. The super-judges met via phone
on Monday, March 2, to determine the winner, and all winners were publicly announced on March 2. The other
national award is for Best Documented Team Learning: each team submitted a USB drive with their work from
the day, and a faculty team is reviewing each submission, using a rubric to determine which team learned the
most during the event and bestow this award on them.
Each student on the winning team was able to pick a prize that would benefit him or her most professionally.
NASPAA did not want to offer a monetary prize because that would change the motivation behind why many
students participated; instead, it is working with each individual student to create a meaningful experience.
Depending on the student and where he or she is located, NASPAA is arranging things like informational
interviews with the dean of a policy school, job-shadows with high-ranking local officials, or career-building
experiences such as travel funds to attend a conference or join a professional association.
Critical Support Functions
To create a functional competition, four main elements needed to come together: student interest, host site
logistics, judge recruitment and training, and the technical infrastructure.
Student Interest
Student recruitment began with the creation of a website (studentcompetition.naspaa.org) which went live on
October 30, 2014. The website identified the host schools, gave a brief overview of the competition, and was the
main communication vehicle for the competition. At the NASPAA conference, the competition was promoted
through an open house, an exhibit table, a breakout session, and an announcement during a luncheon. This
began the awareness campaign for the competition which was followed by each member school receiving a
hard-copy mailing by the end of November, 2014. In January 2015, a series of emails were sent to school
representatives asking them to promote the competition to their students. The principal representative at each
school was given the opportunity to pre-register a student for the competition and 37 did so. During the three
weeks that registration was open, a total of 234 students registered to participate.
Host Site Logistics
NASPAA also secured and prepared each host site. Regional sites were chosen to be within a 4-hour drive of as
many NASPAA schools as possible, which proved difficult in the Central and West portions of the United States
due to the geographical dispersion of members. Once sites were identified, NASPAA signed a memorandum of
understanding with each one and asked them to provide a logistical leader, a technical leader, student
volunteers, Wi-Fi, food and beverage for the event, meeting room space, and videotaping service. The site
leader would work with NASPAA to ensure that the site was ready and the technical leader would work with
ReThink Health to ensure that all necessary technology elements were ready. The biggest difference across
competition sites was how the workspace was set up for students. Some sites had teams at tables in one large
room and others had each team in its own room. Each site assigned teams to spaces that worked best based
on their individual layout. A few days before the competition, each site received a box of materials that included
judge folders, student folders, nametags, judge awards, and a large sign with the school logo on it.
Judge Recruitment and Training
The third major element necessary for the competition was judges. A total of 18 judges were necessary, with 3
attending each site. The ideal judge team would include one of each of the following: a faculty member with
experience in health care, a practitioner with experience in healthcare, and an individual with system dynamics
experience. Judges were recruited from the host schools, from a pool of people who volunteer with NASPAA in
other capacities, and from colleagues of the individuals involved in planning the competition. The hardest part
about securing judges was the training required of them. Since it was important for the judges to understand
the technical elements of the simulator, they were asked to participate in a total of 12 hours of training across 4
dates. ReThink Health individually trained those who were unable to attend the scheduled sessions.
Technical Infrastructure
The backbone of the competition was the ReThink Health simulator. This model is a highly complex, data-rich
system dynamics model that is used in real time in regions to help align local stakeholders around effective
policies to improve the health system. Heavy facilitation had typically been necessary to make the most of its
use; however, as described above, the model was adapted for use in a low-facilitation environment. In addition
to thoroughly training judges and technical leads, ReThink Health and Forio were available for remote technical
support on competition day.
Materials Developed to Support the Competition
The competition was a multifaceted educational experience, and a comprehensive set of materials was
developed to ensure that students could quickly use the simulation as a tool to facilitate their understanding of
healthcare and the complexities of this policy problem. The following were produced for use on February 28,
2015:
ReThink Health Dynamics Model | Thinking through the complexities of the health system is fraught with
https://forio.com/app/rippel/ret | difficulties and new initiatives seem to be short-sighted, fragmented, and
hink-health/logi ml unable to alter long-term trends. This tool allowed students to use formal
modeling to bring clarity to long-term solutions to establish a healthier,
more equitable, and more sustainable health system.
Training Version of the ReThink
Simulator
The ReThink Health team created a training version of the larger simulator
that had only a 2 initiatives enabled so students could use the simulator
while watching the instructional video and not have an advantage when
the competition began.
The Case Study
https://naspaastudentcompetition.files.wordpress.com/2015/02/2015 simulation case_final.pdf
Students were presented with a case study depicting a recent graduate in his first job with the Commissioner of Health. He has been asked to work with a local task force to create a local 'bottom-up' approach to health care reform. The case included background information and specific tasks to be completed during the day, along with worksheets to complement the tasks and appendices with relevant information.
Worksheets
https://naspaastudentcompetition.files.wordpress.com/2015/03/naspaa_worksheets.pdf
Since there was a limited amount of time for students to complete each task, task-specific worksheets were created to keep students focused. Students downloaded the worksheets and saved them to a USB drive, which was collected at the end of the day.
Team Self-Assessments of Readiness
https://naspaastudentcompetition.files.wordpress.com/2015/03/naspaa_self-tests.pdf
Before the start of each task, teams were provided a checklist to test their readiness for the task they were about to undertake. The technical leader at each site facilitated a question-and-answer period on these tests to ensure that all students had an appropriate base level of knowledge to proceed. The exercise was designed to be collaborative; students were encouraged to help each other learn the simulator rather than compete over who could find the most useful graph.
ReThink Health Model Mechanics Video
https://vimeo.com/118290764
The ReThink Health team created an 18-minute Model Mechanics Video that trained students in the mechanics of using the simulator. All students were asked to view the video twice before arriving at the competition site: once to learn and once to follow along.
Instructional Videos
https: .box.com/s/xOxc3pm8mS5yajeoxnvi29d4iojo3rxcp
Three other videos were developed, one for each of the three tasks within the case, to orient students to their next required tasks. The videos were titled Overview of Urbis (the fictional city where the case was set), Simulator Basics, and Addressing Stakeholder Concerns.
Judges Manual
https://naspaastudentcompetition.files.wordpress.com/2015/03/naspaa-competion-judge-manual.pdf
There were a total of 18 judges. Through the judges manual, they were provided sample questions to ask students in order to assess their learning during each task. The manual also included a judging rubric, information about how the final winning team would be selected, and suggested choreography for how the judges should allot their time.
Judges Training Sessions
All judges participated in online trainings so that they understood the model the students would use and could ask questions to assess student learning. Training occurred over four sessions for a total of 12 hours. ReThink Health provided and recorded the trainings.
Technical Leaders Training Sessions
Each of the six sites had a technical lead who participated in an online training to learn the technical details of the ReThink Health simulator, including how to use administrator privileges. ReThink Health provided and recorded these two sessions.
Bibliography for ReThink Health Policy Options
http://libguides.library.albany.edu/NASPAAcompetition
Students were given an extensive bibliography citing available peer-reviewed research on the various policy options available within the ReThink Health simulator.
Conclusions and Next Steps
The results of the NASPAA simulation competition were outstanding along a number of dimensions. Overall, NASPAA learned that it could successfully conduct a complex, multi-site, simultaneous event to a very high standard and with a high level of participation. Following the event, NASPAA conducted a survey evaluation of student participants, which received a 63% response rate. Evaluating the competition against its initial goals, NASPAA concluded that the simulation was an extremely effective way of engaging students with public problem solving: 85% of student participants said they would do it again, and 95% would recommend it to their classmates. Although analysis of student learning is still underway, early results indicate that students did indeed realize the predicted benefits of using a simulation in a public policy/management competition. For example, student presentations referred repeatedly to the long-term impacts of their policy decisions, and many student groups shifted to significantly different strategies once they took into account the results of their initial simulation runs. Students did appear to learn important lessons about the systemic and long-term effects of their actions within the simulation.
NASPAA also concluded that the simulation was an excellent means of advancing pedagogy and contextual
learning on NASPAA school campuses: more than 100 schools expressed interest in the competition, and many
faculty members followed up with an interest in using the competition simulation in their classroom. Plans are
underway to launch a dissemination project to reach those interested schools.
The simulation competition was so effective a means of advancing pedagogy and student engagement that NASPAA's next step is to consider expanding it into a major international NASPAA initiative. NASPAA could institute an annual student simulation competition and use that annual event as a springboard to massively expand the use of simulations in public affairs classrooms across the country and around the world. NASPAA will conduct a competition for a second year in 2016, and the subject of the competition will have a global dimension so that all NASPAA members can send students.
Pairing simulations with a national competition proved potent in a public policy education setting, yielding pedagogical benefits, demonstrable student learning, and the promotion of public affairs graduate students' prowess to a wider public.
References
1. Agrawal, A. (2003). "Sustainable Governance of Common-Pool Resources: Context, Methods, and Politics." Annual Review of Anthropology 32: 243-262.
2. Hirsch, G., J. Homer, B. Milstein, J. Sterman, C. Ingersol, L. Scherrer, L. Landy and E. Fisher (2012). ReThink Health Dynamics: Understanding and Influencing Local Health System Change. Proceedings of the 30th International Conference of the System Dynamics Society. E. Husemann and D. Lane. System Dynamics Society.
3. Hirsch G, Homer J, Wile K, Trogdon JG, Orenstein D. (2014). Using Simulation to Compare 4 Categories of
Intervention for Reducing Cardiovascular Disease Risks. American Journal of Public Health, 104(7):1187-
1195.
4. Homer J, Hirsch G, Minniti M, Pierson M. (2004). Models for Collaboration: How System Dynamics
Helped a Community Organize Cost-Effective Care for Chronic Illness. System Dynamics Review, 20(3):
199-222.
5. Homer J, Hirsch G, Milstein B. (2007). Chronic Illness in a Complex Health Economy: The Perils and
Promises of Downstream and Upstream Reforms. System Dynamics Review, 23(2/3):313-343.
6. Homer J, Hirsch G. (2006). System Dynamics Modeling for Public Health: Background and Opportunities. American Journal of Public Health, 96(3):452-458.
7. Homer, J. (2014). "Levels of evidence in system dynamics modeling." System Dynamics Review 30(1-2):
75-80.
8. Homer J, Wile K, Yarnoff B, Trogdon JG, Hirsch G, Cooper L, Soler R, Orenstein D. (2014). Using Simulation to Compare Established and Emerging Interventions to Reduce Cardiovascular Disease Risks in the United States. Preventing Chronic Disease, 11(E195):1-14, November. Available at: http://www.cdc.gov/pcd/issues/2014/14_0130.htm.
9. Johnston, E. W., Ed. (2015). Governance in the Information Era: Theory and Practice of Policy Informatics. New York, Routledge.
10. Mahamoud A, Roche B, Homer J. (2012). Modelling the Social Determinants of Health and Simulating Short-term and Long-term Intervention Impacts for the City of Toronto, Canada. Social Science & Medicine, 93:247-255. DOI: 10.1016/j.socscimed.2012.06.036.
11. Milstein, B., J. Homer and G. Hirsch (2009). The HealthBound Policy Simulation Game: An Adventure in US Health Reform. Proceedings of the 27th International Conference of the System Dynamics Society. Albuquerque, New Mexico, System Dynamics Society.
12. Milstein, B., J. Homer and G. Hirsch (2010). "Analyzing National Health Reform Strategies with a Dynamic Simulation Model." American Journal of Public Health 100(5): 811-819.
13. Milstein, B., J. Homer, P. Briss, D. Burton and T. Pechacek (2011). "Why Behavioral and Environmental Interventions Are Needed to Improve Health at Lower Cost." Health Affairs 30(5).
14. Milstein, B., G. Hirsch and K. Minyard (2013). "County Officials Embark on New, Collective Endeavors to ReThink Their Local Health Systems." NACA The Journal of County Administration.
15. Sterman, J. (2014). "Interactive web-based simulations for strategy and sustainability: The MIT Sloan LearningEdge management flight simulators, Part I." System Dynamics Review 30(1-2): 89-121.
16. Sterman, J. (2014). "Interactive web-based simulations for strategy and sustainability: The MIT Sloan LearningEdge management flight simulators, Part II." System Dynamics Review 30(3): 206-231.
17. Sterman, J. D. (1996). Learning in and About Complex Systems. Modelling for Management: Simulation in Support of Systems Thinking. G. P. Richardson. Aldershot, UK, Dartmouth Publishing Company. 1: 89-128.