Considering the 45th parallel north bisects our state, we aren’t exactly in the best position to have a long, drawn-out severe weather season on a yearly basis.
That being said, the state has seen severe thunderstorms in every month on the calendar.
So with that in mind, wouldn’t all 365 days of the year be considered severe weather season? Yes and no.
It is true that severe weather can occur in every single month of the year, but it is far more likely in the warmer months.
Peak severe weather season for us typically arrives when temperatures are at their warmest… so June, July, and August are usually the most active months… but not always.
Some years, there may be more severe weather outbreaks in April and May than in June, July, and August.
Other years, it could be the fall that sees the most.
This is what we call year-to-year variability, which is completely normal, but it is also why we preach so much about severe weather safety from April through October.
So in an average year, when is peak season?
Well, we can use several different tactics to find out, but one of the easiest is accounting for every severe thunderstorm warning throughout the year and finding the week that has the most.
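The counting approach described above can be sketched in a few lines of Python. The warning dates below are made up for illustration; a real analysis would pull years of severe thunderstorm warnings for a single forecast office.

```python
from collections import Counter
from datetime import date

# Hypothetical sample: issue dates of severe thunderstorm warnings
# for one forecast office, pooled across several years.
warning_dates = [
    date(2019, 6, 18), date(2019, 6, 20), date(2020, 6, 17),
    date(2020, 7, 4),  date(2021, 6, 19), date(2021, 8, 2),
]

# Tally warnings by ISO week of the year, then pick the busiest week.
counts = Counter(d.isocalendar()[1] for d in warning_dates)
peak_week, peak_count = counts.most_common(1)[0]
print(f"Peak: week {peak_week} with {peak_count} warnings")
```

In this toy sample, the mid-June dates cluster into the same week number, so that week comes out as the peak.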
The map below is split into National Weather Service forecasting areas.
Each NWS office issues severe weather warnings for their area.
By combining all of the warnings from year to year, here is when severe weather season typically peaks at each office area across the country.
For the metro area and much of southern and central Minnesota, the week of June 17th
is usually the peak of severe weather season… or at least the week of the year that averages the most severe weather warnings.
For northern Minnesota, it is typically later, usually the middle of July that gets the most severe weather warnings.
Contrast that with other areas, and in most cases the farther south you go, the earlier the peak season… with the southern Plains and Southeast typically peaking in April or May.
How much time do we spend in severe thunderstorm warnings?
Well, that answer is a little more complicated.
There are many variables that go into this calculation.
First off, before 2008, warnings were issued by county rather than by polygon, so larger counties inherently racked up more warnings than smaller ones, skewing the results. Since then, the storm-based polygon system has allowed warnings to cover just portions of counties.
So how do we break down the data? Well, we normalize it.
Basically, we use a formula that pieces together all of the variables and rescales the answer to a “normalized” value between 0 and 1.
Then we can interpret the data by comparing counties against each other to see which ones spend more time in warnings.
Think of it like an averaged scale where 0 is well below average, .5 is average, and 1 is well above average.
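The article doesn’t spell out which formula is used, but a common way to get values onto a 0-to-1 scale like this is min-max normalization. Here is a minimal sketch with made-up minutes-under-warning totals for a few counties:

```python
# Hypothetical minutes spent under severe thunderstorm warnings
# per county (values invented for illustration).
minutes = {
    "Anoka": 5200,
    "Hennepin": 3400,
    "Big Stone": 900,
    "Lac qui Parle": 1200,
}

lo, hi = min(minutes.values()), max(minutes.values())

# Min-max normalization: rescale each value into the 0-1 range
# so counties can be compared on one common scale.
normalized = {name: (m - lo) / (hi - lo) for name, m in minutes.items()}
```

With this method, the county with the most warning time lands at exactly 1, the one with the least at exactly 0, and everything else falls in between.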
From that normalized scale, we can conclude who spends more time in warnings.
By the looks of this map, major metro areas spend more time in warnings on average… likely because the National Weather Service will often err on the side of caution when so many people’s lives are at risk.
Take MSP metro for example.
Anoka County is above a .8, with much of the metro coming in at about a .5.
But if you look at western Minnesota, most areas are in the .2 to .3 range, indicating a below-average amount of time compared with other parts of the country.