For the record, I am not a fan of either being mandated. It's just something that never made sense to me, and no one seems able to give me a good answer. When Obama rolled out Romney-care nationwide, you were mandated to have health insurance, and people understandably got upset. But when I point out that car insurance is also mandatory, people on both sides of the aisle just shrug like it's no big thing.
I know this is a bit off topic, but for all my petty gripes about this place, GA.WIN is the last political place on the internet I can ask a question and get an answer from an actual human being.
Medical insurance covers your own expenses if you get sick.
Mandatory car insurance is liability coverage: it pays for injuries and damage you cause to others, not for your own losses.