For the record, I am not a fan of either being mandated. It's just something that never made sense to me, and no one seems to be able to give me a good answer on this. When Obama rolled out Romney-care nationwide, you were mandated to have health insurance, and people understandably got upset. But when I would point out that car insurance is also mandatory, people on both sides of the aisle just shrug like it's no big thing.
I know this is a bit off topic, but for all my petty gripes about this place, GA.WIN is the last political place on the internet I can ask a question and get an answer from an actual human being.
It's a fed vs states thing.
The real reason is that car insurance is handled by the states. Every state has its own insurance laws (there might even be a few states where car insurance is not mandatory), and they vary quite a bit. In California, car insurance was not mandatory until about 1988: I remember the ads suddenly switching from "we have a great record of always paying out quickly" to "if you don't have insurance you could go to jail!" Generally, states have a lot more power than the feds.
The feds are not supposed to have power over the regular activities of citizens. So, setting aside all the hot-button talking points about socialized medicine, the federal government had never before in its history forced citizens to buy anything, and doing so isn't within its purview. That's why the mandate was challenged so hard: state sovereignty was rudely shoved to one side at the same time that citizens got a new, unwelcome mandate. The Supreme Court made a flat-out wrong decision by allowing it to stand.
But states have been oppressing their citizens for a long time. We complain and moan, but the states are allowed to do it. Assuming the elections are legit, of course.