NEWxSFC - Interim Summary…as of 21 JAN 2011

Previous Rank | Rank | Forecaster | Class | Total STN 4casts | AVG SUMSQ Error (") | SUMSQ Error Z | SUMSQ % MPRV over AVG | SUMSQ Rank | AVG STP 4cast (") | STP Error | STP Error Z | STP % MPRV over AVG | STP Rank | AVG Total Absolute Error (") | Total Abs Error Z | Total Abs % MPRV over AVG | Total Abs Rank | AVG Absolute Error (") | Abs Error Z | Abs % MPRV over AVG | Abs Rank | Mean RSQ | RSQ Z | RSQ % MPRV over AVG | RSQ Rank | Forecaster
--- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | ---
1 | 1 | donsutherland1 | Senior | 81 | 324 | -1.101 | 48% | 2 | 161.9 | 17.9 | -0.677 | 52% | 3 | 66.97 | -1.256 | 28% | 2 | 2.48 | -1.260 | 28% | 2 | 57.7% | 1.323 | 62% | 2 | donsutherland1
2 | 2 | Donald Rosenfeld | Senior | 81 | 373 | -1.016 | 44% | 2 | 173.2 | 12.8 | -0.746 | 57% | 6 | 74.95 | -1.000 | 22% | 3 | 2.78 | -1.004 | 22% | 3 | 54.2% | 1.239 | 55% | 3 | Donald Rosenfeld
6 | 3 | Brad Yehl | Rookie | 78 | 237 | -0.821 | 36% | 3 | 115.8 | 19.5 | -0.332 | 30% | 6 | 51.90 | -0.743 | 19% | 4 | 1.96 | -0.750 | 18% | 4 | 60.3% | 0.839 | 28% | 4 | Brad Yehl
5 | 4 | ejbauers | Intern | 79 | 425 | -0.685 | 31% | 6 | 190.9 | 40.5 | 0.140 | -14% | 9 | 77.63 | -0.519 | 12% | 5 | 2.91 | -0.532 | 12% | 5 | 64.5% | 0.750 | 36% | 4 | ejbauers
3 | 5 | emoran | Senior | 78 | 231 | -0.682 | 28% | 4 | 119.4 | 5.8 | -0.785 | 65% | 4 | 51.74 | -0.698 | 14% | 5 | 1.97 | -0.663 | 13% | 5 | 53.4% | 0.441 | 18% | 6 | emoran
7 | 6 | Shillelagh | Senior | 79 | 313 | -0.566 | 28% | 6 | 113.6 | 18.7 | -0.563 | 51% | 5 | 57.52 | -0.643 | 19% | 6 | 2.16 | -0.678 | 19% | 5 | 45.6% | 0.023 | 3% | 9 | Shillelagh
8 | 7 | TQ | Senior | 78 | 375 | -0.491 | 24% | 6 | 104.8 | 27.0 | -0.062 | 6% | 9 | 63.23 | -0.594 | 16% | 6 | 2.38 | -0.591 | 15% | 6 | 42.9% | 0.010 | -2% | 9 | TQ
4 | 8 | Mitchel Volk | Senior | 78 | 308 | -0.342 | 15% | 5 | 120.7 | 11.1 | -0.660 | 56% | 5 | 56.86 | -0.797 | 19% | 4 | 2.16 | -0.765 | 17% | 4 | 42.7% | 0.258 | 24% | 6 | Mitchel Volk
10 | 9 | herb@maws | Senior | 78 | 344 | -0.318 | 15% | 7 | 105.3 | 33.3 | 0.177 | -12% | 8 | 60.68 | -0.149 | 4% | 7 | 2.30 | -0.126 | 3% | 7 | 54.8% | 0.446 | 15% | 7 | herb@maws
9 | 10 | iralibov | Chief | 80 | 588 | -0.247 | 11% | 8 | 195.8 | 37.8 | -0.315 | 26% | 6 | 92.51 | -0.008 | 1% | 9 | 3.48 | 0.048 | 0% | 9 | 40.2% | 0.169 | 13% | 7 | iralibov
13 | 11 | WeatherT | Journeyman | 78 | 744 | 0.435 | -20% | 9 | 131.9 | 48.5 | 0.050 | 4% | 7 | 100.00 | 0.518 | -15% | 9 | 3.78 | 0.596 | -16% | 10 | 36.7% | -0.781 | -24% | 9 | WeatherT
12 | 12 | MarkHofmann | Journeyman | 81 | 1094 | 0.704 | -33% | 10 | 173.0 | 12.0 | -0.723 | 55% | 5 | 123.81 | 0.843 | -20% | 10 | 4.59 | 0.837 | -20% | 10 | 16.7% | -1.033 | -50% | 11 | MarkHofmann
16 | 13 | Roger Smith | Intern | 81 | 738 | 0.950 | -46% | 12 | 163.2 | 58.9 | 1.023 | -82% | 13 | 96.92 | 0.971 | -27% | 13 | 3.59 | 0.882 | -23% | 12 | 25.2% | -1.162 | -43% | 12 | Roger Smith

 

There have been four (4) snowstorm forecasting Contests…as of 21-JAN-11.  Under the ‘two-thirds’ rule…forecasters who have entered at least three (3) forecasts are included in this interim summary.

 

To qualify for ranking in the Interim and final ‘End-of-Season’ standings…a forecaster must enter at least two-thirds of all Contests.  If a forecaster has made more than enough forecasts to qualify for ranking…only the lowest SUMSQ Z-scores necessary to qualify are used in computing the average.  IOW…if you made nine forecasts…only your six best SUMSQ Z-scores are used to evaluate your season-to-date performance.  You can think of it as dropping the worst quiz scores before your final grade is determined.  The reason we have this rule is to 1) make it possible to miss entering a forecast or two throughout the season and still be eligible for Interim and ‘End-of-Season’ ranking and 2) encourage forecasters to take on difficult and/or late-season storms without fear of how a bad forecast might degrade their overall 'season-to-date' performance score(s).
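
For those who like to see it mechanically…here is a minimal sketch of how the ‘two-thirds’ qualification and ‘drop-the-worst’ averaging could work (this is not the Contest's actual scoring code…the function and variable names are made up for illustration):

```python
import math

def season_sumsq_average(z_scores, total_contests):
    """Average only the lowest SUMSQ Z-scores needed to qualify.

    z_scores       -- a forecaster's SUMSQ Z-scores, one per contest entered
    total_contests -- number of contests held so far this season
    """
    needed = math.ceil(total_contests * 2 / 3)   # two-thirds rule
    if len(z_scores) < needed:
        return None                              # not enough entries to be ranked
    best = sorted(z_scores)[:needed]             # keep only the lowest (best) Z-scores
    return sum(best) / needed

# Example: nine forecasts entered in a nine-contest season -> only the six best count
print(season_sumsq_average([-1.2, -0.8, 0.4, -0.5, 1.1, -0.9, 0.0, -1.0, 0.7], 9))
```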

 

The mean normalized ‘SUMSQ error’ is the Contest's primary measure of forecaster performance.  This metric measures how well the forecaster's expected snowfall 'distribution and magnitude' for _all_ forecast stations captured the 'distribution and magnitude' of _all_ observed snowfall amounts.  A forecaster with a lower average SUMSQ Z-score has made more skillful forecasts than a forecaster with a higher average SUMSQ Z-score.
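
As a rough illustration only (the Contest's exact normalization isn't spelled out here)…the SUMSQ error for one forecaster and its Z-score against the rest of the field might be computed like this, using made-up station values:

```python
import statistics

def sumsq_error(forecasts, observed):
    """Sum of squared differences between forecast and observed snowfall (inches) at each station."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, observed))

def z_scores(errors):
    """Normalize each forecaster's error against the field for one contest."""
    mean = statistics.mean(errors)
    sd = statistics.stdev(errors)
    return [(e - mean) / sd for e in errors]

# Example: three forecasters, four stations (all values hypothetical)
observed = [4.0, 8.5, 12.0, 2.0]
field = {
    "forecaster_a": [3.5, 9.0, 11.0, 2.5],
    "forecaster_b": [6.0, 6.0, 6.0, 6.0],
    "forecaster_c": [0.0, 10.0, 15.0, 1.0],
}
errors = [sumsq_error(f, observed) for f in field.values()]
print(dict(zip(field, z_scores(errors))))   # lower (more negative) Z = more skillful
```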

 

The 'Storm Total Precipitation error' statistic is the absolute arithmetic difference between a forecaster's sum-total snowfall for all stations and the observed sum-total snowfall.  This metric…by itself…is not a meaningful measure of skill…but it can provide additional insight into forecaster bias.
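
A minimal sketch, reusing the hypothetical station lists above:

```python
def storm_total_precip_error(forecasts, observed):
    """Absolute difference between total forecast snowfall and total observed snowfall."""
    return abs(sum(forecasts) - sum(observed))

# A forecaster can score well here while badly misplacing the snow:
# over-forecasts and under-forecasts at individual stations cancel out.
print(storm_total_precip_error([3.5, 9.0, 11.0, 2.5], [4.0, 8.5, 12.0, 2.0]))  # 0.5
```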

 

The 'Total Absolute Error' statistic sums your per-station forecast errors regardless of whether you over-forecast or under-forecast.  This metric measures the overall magnitude of your errors.
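
Again as a sketch…summing the per-station absolute errors:

```python
def total_absolute_error(forecasts, observed):
    """Sum of per-station errors, with over- and under-forecasts counted equally."""
    return sum(abs(f - o) for f, o in zip(forecasts, observed))

print(total_absolute_error([3.5, 9.0, 11.0, 2.5], [4.0, 8.5, 12.0, 2.0]))  # 2.5
```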

 

The 'Average Absolute Error' is the forecaster's 'Total Absolute Error' divided by the number of stations where snow was forecast or observed.
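
And the per-station version…following the definition above (the Contest's actual rule for which stations count may differ from this simple sketch):

```python
def average_absolute_error(forecasts, observed):
    """Total Absolute Error divided by the number of stations where snow was forecast or observed."""
    stations = [(f, o) for f, o in zip(forecasts, observed) if f > 0 or o > 0]
    total = sum(abs(f - o) for f, o in stations)
    return total / len(stations) if stations else 0.0

print(average_absolute_error([3.5, 9.0, 11.0, 2.5], [4.0, 8.5, 12.0, 2.0]))  # 0.625
```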

 

The ‘RSQ error’ statistic is a measure of how well the forecast captured the variability of the observed snowfall.  Combined with the SUMSQ error statistic…RSQ provides added information about how well the forecaster's ‘model’ performed.
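
If RSQ is the familiar coefficient of determination between forecast and observed amounts (an assumption here…the Contest rules define the exact formula)…a sketch would be:

```python
def rsq(forecasts, observed):
    """Squared Pearson correlation between forecast and observed snowfall amounts."""
    n = len(forecasts)
    mf = sum(forecasts) / n
    mo = sum(observed) / n
    cov = sum((f - mf) * (o - mo) for f, o in zip(forecasts, observed))
    var_f = sum((f - mf) ** 2 for f in forecasts)
    var_o = sum((o - mo) ** 2 for o in observed)
    return cov ** 2 / (var_f * var_o)

print(rsq([3.5, 9.0, 11.0, 2.5], [4.0, 8.5, 12.0, 2.0]))  # near 1.0: forecast tracks the observed pattern well
```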