No, it's absolutely not time to make winter tires mandatory. Maybe they could be mandatory for people with poor driving records, but winter tires give drivers a false sense of security. The best way to be a safe driver is to be a defensive driver.
Insurance companies don't reduce rates for winter tires, and I think they would know if winter tires actually reduced claims. If I'm forced to get winter tires for safety, then insurers should be forced to reduce my rates.
The answer is no to winter tires.