Most of us get health insurance through our jobs, a system puzzling to the rest of the industrial world, where the government levies taxes and offers health coverage to all as a basic right of modern society. But to many Americans, that approach feels alien: the heavy hand of government reaching into our business, with some bureaucrat telling doctors and patients what to do. We always seem to be fighting over the role of government in our healthcare. This long-standing tension between public and private healthcare in America has produced a unique and confusing way to provide protection against the cost of ill health.