Originally Posted By: IvyLeague
Again, you go by the assumption that a certain minority has these rights to begin with. Nowadays, if people want something, they just claim it as their "right," whether it really is or not.

They're called inherent rights. The name is self-explanatory if you are not blinded by prejudice, hatred, and religious dogma.

Originally Posted By: IvyLeague

The only true rights come from God because any rights given by the government or man can be taken away by government or man.


First of all, whose God? What if someone hid behind another imaginary god, like yours, and claimed that god granted them such a right? What would you say then, given the freedom to practice any religion?

Second, any government can take away any and all rights, regardless of whether you label them "true rights given by God." So what's your point? I don't see one.


"Fire cannot kill a dragon." -Daenerys Targaryen, Game of Thrones