How is the service from the new Weather Forecast Office at Morristown? The following table of statistics shows how well we have been doing since 1996, when we assumed responsibility for the County Warning Area. Much of our success is due to the power of the Doppler radar to detect and analyze severe storms. We could not, however, have attained nearly the success we have without the aid of our volunteer spotters. Most of these spotters are HAM radio operators who contribute their time and equipment to provide the National Weather Service (NWS) with continuous, up-to-date information on what is happening on the ground at any given location. Our close working relationship with State and County Emergency Managers is also invaluable, giving us feedback from all parts of our warning area.

One of the best measures of service is how well we verify our severe weather warnings. We measure our success by several different methods. The first is the Probability of Detection (POD), or how likely we are to detect a severe storm. The second is the False Alarm Ratio (FAR), or how often we 'cry wolf'. The third is the Critical Success Index (CSI), which combines the POD and FAR. Last, but definitely not least, is our Average Lead Time, which is simply how much advance warning we give.
Comparison of Verification Statistics for WFO Morristown
(table: only the row labels survive in this copy; the yearly values are not preserved)
- Number of Severe Weather Events
- Number of Severe Weather Warnings
- Probability of Detection (perfect = 100)
- False Alarm Ratio (perfect = 0)
- Critical Success Index (perfect = 100)
- Average Lead Time (higher is better)
Probability of Detection (POD): This is the percentage of all severe weather events which were warned for (a perfect score would be 100%). For example, if we issued 60 warnings and there were 100 total severe weather events reported (60 warned, 40 unwarned), the POD would be 60%.
POD = warned events / (warned events + unwarned events)
60 / (60 + 40) = 60 / 100 = 60%
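The POD calculation above can be sketched as a small Python function (the function name and the 60/40 figures simply restate the worked example from the text):

```python
def pod(warned_events, unwarned_events):
    """Probability of Detection: fraction of all severe weather
    events that were covered by a warning (1.0 is a perfect score)."""
    return warned_events / (warned_events + unwarned_events)

# Worked example from the text: 60 warned events, 40 unwarned events.
print(pod(60, 40))  # 0.6, i.e. 60%
```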
False Alarm Ratio (FAR): This ratio measures how often we issue false alarms, or in other words, a measure of 'crying wolf'. Ideally we want this number to be 0.0%. Some of our false alarms will be from storms that may appear severe, or are borderline severe. Some false alarms will be from storms that are severe, but the severe weather occurred where no one was around to observe the event (classic case of 'if the tree falls in the forest and nobody hears it, does it make any noise?').
(a separate example: 60 warnings verified and 40 did not)
FAR = unverified warnings / (verified warnings + unverified warnings)
40 / (60 + 40) = 40 / 100 = 40%
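The same calculation in Python form, using the 60 verified / 40 unverified figures from the example (the function name is just for illustration):

```python
def far(verified_warnings, unverified_warnings):
    """False Alarm Ratio: fraction of issued warnings that did not
    verify with a severe weather report (0.0 is a perfect score)."""
    return unverified_warnings / (verified_warnings + unverified_warnings)

# Example from the text: 60 verified warnings, 40 unverified warnings.
print(far(60, 40))  # 0.4, i.e. 40%
```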
Overwarning will achieve a high POD, but at the expense of a high FAR. If warnings are rarely or never issued, the FAR will be low but so will the POD. Overall success can be expressed by the Critical Success Index, which is a function of both POD and FAR.
Critical Success Index (CSI): CSI is the ratio of warned events to the total number of events plus the number of unverified warnings. As with the POD, 100% is a perfect score. As an example, if there were 80 warnings issued, and 60 warnings had verified severe weather while 40 did not have verified severe weather (in addition, there were 20 severe weather events that went unwarned), then the CSI would be:
CSI = warned events / (warned events + unwarned events + unverified warnings)
60 / (60 + 20 + 40) = 60 / 120 = 50%
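The CSI example works out the same way in code (again, the function name and figures just restate the text):

```python
def csi(warned_events, unwarned_events, unverified_warnings):
    """Critical Success Index: warned events divided by warned events
    plus unwarned events plus unverified warnings (1.0 is perfect).
    Penalizes both missed events (low POD) and false alarms (high FAR)."""
    return warned_events / (warned_events + unwarned_events + unverified_warnings)

# Example from the text: 60 warned events, 20 unwarned events,
# 40 unverified warnings.
print(csi(60, 20, 40))  # 0.5, i.e. 50%
```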
Average Lead Time: This is simply the length of time from when we issue the warning until our first report of severe weather in the warned area. This time can be anything from 0 minutes up to the total valid time of the warning.
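A lead-time calculation can be sketched as follows; the timestamps are hypothetical, and the clamp at zero reflects the text's statement that lead time ranges from 0 minutes up to the warning's valid time:

```python
from datetime import datetime

def lead_time_minutes(warning_issued, first_report):
    """Lead time for one warning: minutes from warning issuance to the
    first severe weather report in the warned area, clamped at zero."""
    delta = (first_report - warning_issued).total_seconds() / 60.0
    return max(delta, 0.0)

# Hypothetical example: warning issued at 14:05, first report at 14:23.
issued = datetime(2024, 5, 1, 14, 5)
report = datetime(2024, 5, 1, 14, 23)
print(lead_time_minutes(issued, report))  # 18.0 minutes
```

The office's Average Lead Time is then simply the mean of these per-warning values over all verified warnings.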