For the record, I am not a fan of either being mandated. It's just something that never made sense to me, and no one seems able to give me a good answer on it. When Obama rolled out Romney-care nationwide, you were mandated to have health insurance, and people understandably got upset. But when I point out that car insurance is also mandatory, people on both sides of the aisle just shrug like it's no big thing.
I know this is a bit off topic, but for all my petty gripes about this place, GA.WIN is the last political place on the internet I can ask a question and get an answer from an actual human being.
If you bump into me on the sidewalk and I break a leg, your health insurance doesn't fix my leg; mine does. I have the choice to carry that insurance or go to medical collections for not paying for my treatment. If I crash my car into your car, my insurance pays for your broken car. I don't have the choice to skip the insurance that protects your body and your car, and neither does anyone else who drives a car.