We then calculate T, the average running time: the time spent on each scenario Si, t(Si), weighted by its probability p(Si):

T(n) = Σ p(Si) · t(Si), for Si∈Sn
The average-case scenario is useful when you are not sure what the data will look like and expect to have a lot of it. Your main objective here is to find an algorithm that is as efficient as possible, as often as possible, and you can tolerate the edge cases where it takes (too) long.
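As a minimal sketch of this weighted average, consider linear search over a list of n elements, assuming each target position is equally likely (p(Si) = 1/n for every scenario). The `linear_search_cost` helper is illustrative, not from the original text:

```python
def linear_search_cost(data, target):
    """Count the comparisons a linear search makes before finding target."""
    comparisons = 0
    for value in data:
        comparisons += 1
        if value == target:
            return comparisons
    return comparisons

n = 100
data = list(range(n))

# Scenario Si: the target sits at index i. Assuming a uniform distribution,
# p(Si) = 1/n, so T = sum over all Si of p(Si) * t(Si).
average = sum((1 / n) * linear_search_cost(data, target) for target in data)

print(average)  # (n + 1) / 2 = 50.5 comparisons on average
```

The result matches the closed form (n + 1) / 2: on average, a uniformly placed target is found halfway through the list.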
This is the scenario most often used to compare algorithms, because it tells you what can happen in the worst case. For Sn, the worst case is the maximum t(Si) over Si∈Sn, denoted:

T(n) = max t(Si), for Si∈Sn
In addition to being the most common basis for comparing algorithms, sometimes you simply cannot proceed without knowing the worst-case scenario. Imagine you are an open-heart surgeon: even if you find a way to operate that is efficient and risk-free, say, 98% of the time, but the patient dies the other 2% of the time, then however impressive that 98% performance is, you just cannot accept losing patients 2% of the time!
So if you absolutely need something efficient 100% of the time, you need to thoroughly analyze the worst-case time complexity of your algorithm.
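The worst case above can be sketched the same way: instead of a probability-weighted sum, take the maximum cost over all scenarios. Again using linear search as the example (the `linear_search_cost` helper is illustrative, not from the original text):

```python
def linear_search_cost(data, target):
    """Count the comparisons a linear search makes before finding target."""
    comparisons = 0
    for value in data:
        comparisons += 1
        if value == target:
            return comparisons
    return comparisons

n = 100
data = list(range(n))

# Worst case over all scenarios Si: max t(Si). For linear search, that is
# the target sitting at the very last position, costing n comparisons.
worst = max(linear_search_cost(data, target) for target in data)

print(worst)  # n = 100 comparisons in the worst case
```

Note that the worst case (n comparisons) is roughly twice the average case ((n + 1) / 2), which is exactly the kind of gap the surgeon analogy warns about: the typical cost tells you nothing about the guarantee.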