Do you believe Everybody Deserves Healthcare?
Americans have great wealth and power compared to the rest of the world, and we try to give back, or "spread the wealth," to countries in dire need. I want to turn the focus back on ourselves for a moment and ask a question: should America develop its own universal health care system, as many other developed countries have? Or should we continue to build on our current system, which allows for-profit insurance corporations to profit from our suffering? We are ranked 37th in the world in healthcare, yet we spend the most. How would you fix the problem? Should we adopt a "national health system" like Britain's, where all health services are completely public; an "all-payer" system like Germany's, which heavily regulates health insurers; or a "single-payer" system, where insurance is publicly run but care is private? Or none of the above? What do you think?