I think as long as there's still some kind of incentive that pushes doctors to keep developing new procedures and whatnot, we'd only improve as a country by giving healthcare to all our citizens... but American medicine is some of the best in the world (if you can afford it >->), so would it decline if it were no longer part of the free market? The idea that the quality of medicine might decline if it's socialized is a scary thought, but much less scary than thinking about how in four months I'll turn 26 and no longer have health insurance at all. :/