When I was growing up, I was taught that we were born with our rights, that no one gave them to us. In recent years this has shifted into the idea that our rights are granted to us by the government. It is only reasonable for those taught this to assume that if the government grants them a right, it must also supply it.
After all, how can the government proclaim that I have a right to home ownership and then refuse to pay for it? Isn't that a denial of my right to own a home?