Has it ever occurred to you to take care of yourself and make it your own responsibility? Why do you feel you deserve to get into someone else's pocket for your wants or needs? Just asking, because I think Americans are losing the fiercely independent attitude that's made us the greatest nation the world has ever known.

Oh, it's not actually free, but there's not a giant corporation whose bonuses are based on them not providing you care.