Proposal: Define warning and alert levels automatically.

2 years 4 weeks ago #3536 by heliosh
Hi,

Since radmon.org logs all values, it would be straightforward to set warning and alert levels automatically, based on the recorded range of values from a given station.
During a calibration phase, the average value and standard deviation could be calculated.
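As a minimal sketch of this calibration step (the data, function names, and sigma choices below are hypothetical; this does not use any actual radmon.org API):

```python
# Derive per-station warning/alert levels from calibration data.
# The readings and threshold multipliers here are illustrative only.
from statistics import mean, stdev

def calibrate(samples, warn_sigma=4.0, alert_sigma=4.5):
    """Return (warning_level, alert_level) as mean + n*sigma of the samples."""
    mu = mean(samples)
    sigma = stdev(samples)  # sample standard deviation
    return mu + warn_sigma * sigma, mu + alert_sigma * sigma

# Example: simulated CPM readings from one station's calibration phase.
readings = [14, 16, 15, 17, 13, 15, 16, 14, 15, 18, 12, 15]
warn_level, alert_level = calibrate(readings)
```

In practice the calibration window would need to be long enough to capture normal background variation, and outliers (e.g. a real event during calibration) would have to be excluded.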

Assuming the readings are approximately normally distributed, the warning level could then be set, for example, to 4 sigma (a one-sided tail probability of 0.00317% of all values, equivalent to one false warning every ~22 days when reporting once per minute).
The alert level could be, for example, 4.5 sigma (about 0.00034% of all values, or statistically one false alert every ~204 days), or even 5 sigma, which would mean one false alert roughly every 6.6 years per station.
With 132 active stations, the entire network would then statistically produce one false alert every ~18 days at 5 sigma.
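The false-alarm arithmetic above can be reproduced from the one-sided normal tail probability (a sketch, assuming normally distributed readings; the function names are my own):

```python
# Expected time between false alarms for an n-sigma threshold,
# assuming readings follow a normal distribution.
from math import erfc, sqrt

def one_sided_tail(n_sigma):
    """P(X > mu + n_sigma * sigma) for a normal distribution."""
    return 0.5 * erfc(n_sigma / sqrt(2))

MINUTES_PER_DAY = 24 * 60

def days_between_false_alarms(n_sigma, reports_per_minute=1, stations=1):
    """Expected days between false alarms across the given stations."""
    alarms_per_minute = one_sided_tail(n_sigma) * reports_per_minute * stations
    return 1 / (alarms_per_minute * MINUTES_PER_DAY)

# One station reporting once per minute:
print(days_between_false_alarms(4.0))              # ~22 days (warning level)
print(days_between_false_alarms(5.0))              # ~2423 days, i.e. ~6.6 years
# Whole network of 132 stations at 5 sigma:
print(days_between_false_alarms(5.0, stations=132))  # ~18 days
```

The per-network figure is simply the per-station interval divided by the number of stations, since independent false alarms add linearly in rate.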

I think that would make alerts more meaningful and the levels easier to compare and monitor across stations.
What do you think of this idea?

