Accurate and complete fire-response data will influence local and national fire service support
In 2012 I chaired a task force for the Ohio State Fire Marshal's Office on the effectiveness of smoke alarms. The obvious statistic was the dramatic reduction in the annual number of fire deaths in the United States from 1975 to the present.
But one of the major recommendations the task force made was to improve, in Ohio and nationally, the quality of the data reported by individual fire departments through the National Fire Incident Reporting System (NFIRS).
In Ohio, better than 95 percent of all fire departments report their fire runs through the State Fire Marshal's Office to the USFA's NFIRS program. While this percentage is well above the national average, the problem is not only whether a department reports, but whether the data accurately reflects the type of runs being made.
Why is better data important to an individual department and collectively to the national fire service?
The answer is simple. Today's administrators and public officials treat data as the definitive record of all the daily activities conducted by local government, including fire, EMS, police, zoning, recreation and public works.
Two truths
The first premise most administrators believe is simple: "If an activity isn't captured on a report that can be statistically evaluated, then it didn't happen."
Whether that is fair or not, everything you do should have a verifiable report. This can include training, education, maintenance, counseling, inspections, public education and hose testing.
The second premise may seem trite, but it is also very true. When it comes to data, it is still "garbage in, garbage out."
So you may ask what this has to do with you or your department. Remember the Ohio task force? We discovered that many fire departments responded to a large number of residential or commercial fire alarms, and one of two things happened.
First, they would take a disregard from the alarm company. Or second, they found that a detector had been set off by something such as grease or food left on the stove that gave off excessive smoke but never reached the ignition stage.
Defining success
Most of these were reported as either a "system malfunction" or a "malfunctioning smoke alarm." In fact, the smoke alarm did exactly what it was supposed to do: notify the occupants that something was wrong, such as food left on the stove.
Why, we wondered, wasn't this reported as a success story when the fire was extinguished prior to arrival? Or, in cases where there was fire or smoke damage beyond the pan, why wasn't it reported as an actual fire?
The truth appears to be that most of us don't like spending time completing reports; it's not the exciting part of what we do. To make matters worse, NFIRS offers a built-in shortcut if the incident is coded as a system or alarm malfunction.
This option cuts a considerable amount of time off the reporting process; a fire out on arrival involves more time to complete the additional reporting fields.
Take this one step further and consider the reports' statistical importance to research groups. The Ohio task force couldn't be sure that the effectiveness of smoke alarms was as poor as our statistics indicated. Hence, one of our recommendations was to improve the reporting captured in NFIRS.
Two action items
Earlier this year, the NFPA and the National Association of State Fire Marshals started a program to improve fire service data reporting within NFIRS. Specifically, this program will attempt to determine where reporting gaps exist and why they are occurring. They also launched a free online training program, "Understanding Your Role in Fire Incident Data."
Here is what I'd encourage all of us to do.
First, see that those who input fire-reporting data for your department understand why it's important to enter the data accurately. Teach them that it helps with a risk assessment of what is critical in your response area. And collectively, it will give a clearer picture of national fire service issues and the problems we face on a daily basis.
That data also drives funding. It can play a role in whether your department might be eligible for a FEMA grant and to what level these grants should be funded by Congress.
Next, make sure the reporting data gets periodic accuracy checks before it's submitted. In EMS quality assurance, the accepted rule of thumb is to check the accuracy of every 10th emergency call. Applying that same rule to fire reports will help determine whether there are gaps in your fire reporting system.
Especially on the local level, remember that accurate data will prove the worth of your department to those who live in your community, which subsequently translates into much needed support from your elected officials.
Copyright © 2024 FireGrantsHelp.com. All rights reserved.