College Planning & Management

JUN 2012

College Planning & Management is the information resource for professionals serving the college and university market, covering facilities, security, technology, and business.


Fire & Life Safety

FOCUS ON PREPARATION AND PREVENTION

Risk-Reduction Evaluations
Review and document your efforts to prove their effectiveness.

BY MIKE HALLIGAN

THE START OF FALL for most of our programs is the start of a new budget year and a flurry of life-safety construction projects that we hope to complete before students return from summer vacation. It should also be a time when we look back at the past year and evaluate the results of our fire prevention and life-safety efforts. Taking time to look back at our risk-reduction efforts will help ensure that we are reaching our target groups with effective programs that demonstrate a reduction in losses from fires. Simply stated, we should be able to prove our efforts are working.

Interestingly, when I asked at a recent meeting who was conducting full or partial program evaluations, none of my peers were doing so. After talking with some of them, it became clear that there are several reasons why organizations don't evaluate their programs. Three concerns were the leading reasons these organizations did not conduct fire and life-safety program evaluations:

1. Fear of working with statistics, even though simple math is usually all that is required to get the job done.
2. Fear that an evaluation may actually identify program weaknesses.
3. Lack of adequate knowledge to conduct a meaningful evaluation.

With so much at stake, namely the lives of our students, guests, and staff, we should be able to draw upon the talents of other teams within our community to assemble a group that can help put together a program evaluation to measure our efforts. There is no need to be intimidated by the idea of an evaluation; failure to conduct evaluations may result in a failed risk-reduction effort and cost us lives, injuries, and disruptions to our educational mission.

Review and Compare
An evaluation is simply a review and comparison of data.
Once you make the decision to evaluate your program, you must collect objective data. Look at your incident reports, loss records, and local fire department reports (NFIRS). These sources should show you how a typical incident happens, how often incidents occur in your buildings, what types of buildings they occur in (classroom, physical plant, residence hall), when they occur (day, time, month), the cost of each incident, and which types of incidents are increasing the fastest.

Once data collection is complete and you can compare it to a baseline, you will have a reference point by which to evaluate your efforts. This may require an initial collection of data that spans several years. From the baseline, a benchmark can be set to determine the level of risk change desired and to measure the results of any recent programs.

If you will be using a survey to establish a benchmark, make sure it is designed to collect objective information. For example, if you want to measure your staff's thoughts on the leading causes of fires in your facilities, ask them, "What do you see as the leading cause of fires in our buildings?" If you ask a leading question instead, the response is more likely to be biased.

Recognize the Long Term
The outcome of an evaluation is a long-term tool. While yearly data may show short-term impacts, it may take three to five years to really see major risk or behavior changes from your efforts. The results of the evaluation, both short- and long-term, should be shared regularly with all the stakeholders in your organization. I suggest holding yearly meetings to share this information and review results. Team members will often have thoughts on how to further improve the evaluation or the program to achieve even better results. These meetings also allow all stakeholders to agree on revisions to the program.
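The baseline comparison described above really is simple math. As a rough illustration, the following sketch averages several years of incident counts to form a baseline and compares recent years against it; all of the counts here are hypothetical placeholders, since real figures would come from your own incident reports, loss records, or NFIRS data.

```python
# Sketch of a baseline comparison for incident data.
# All counts below are hypothetical placeholders, not real campus data.

# Yearly incident counts collected over several years to form a baseline.
baseline_years = {2008: 42, 2009: 38, 2010: 45}

# Counts observed after a new prevention program began.
recent_years = {2011: 31, 2012: 27}

baseline_avg = sum(baseline_years.values()) / len(baseline_years)
recent_avg = sum(recent_years.values()) / len(recent_years)

# Percent change relative to the baseline (negative means risk reduction).
percent_change = (recent_avg - baseline_avg) / baseline_avg * 100

print(f"Baseline average: {baseline_avg:.1f} incidents/year")
print(f"Recent average:   {recent_avg:.1f} incidents/year")
print(f"Change vs. baseline: {percent_change:+.1f}%")
```

With these sample numbers, the recent average of 29 incidents per year is about 30 percent below the baseline average, the kind of documented reduction an evaluation is meant to surface.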
If your project has a hard completion date, it will be necessary to conduct a final evaluation showing before-and-after data. The resulting report should be shared with the program's target audience as well as your school's management team, the local fire department, and, when approved by administration, local community political leaders. Sharing the evaluation will show your local fire department that your school takes fire risk reduction seriously and that your administration is committed to implementing programs that further reduce the risk of fire in your facilities. CPM

Mike Halligan is the associate director of Environmental Health and Safety at the University of Utah and is responsible for Fire Prevention and Special Events Life Safety. He frequently speaks about performance-based code solutions for campus building projects, is recognized as an expert on residence hall fire safety programs, and conducts school fire prevention program audits and strategic planning. He can be reached at 801/585-9327 or at WWW.PLANNING4EDUCATION.COM.
