I need someone to enlighten me concerning Christianity, and I'm not trying to be ugly about this, but it seems to me that the people I've run into who claim to be Christians either have a chip on their shoulders or think they are better than you. Maybe they think I'm not a Christian, and they treat me like a disease. It seems they are always putting people down, especially the poor. I would just like for some true Christian to reply and give me the real definition of a true Christian. I would really like to know your true thoughts on Christianity.
I'd like to know how you, as a Christian, should treat people. Should we fear God, and if so, why? I would also like to know what you, as a Christian, think God thinks of the war.
As a Christian, are you supposed to give back as well as take?
As a true Christian, do you respect other denominations and non-denominations? Do you respect the religions of others?
Some people believe an Antichrist is going to come and rule the world, and they claim they get this from Revelation. How much of Revelation do you believe has already happened, and how much has not? My belief is that anyone who does not believe in Christ is an antichrist. The Antichrist is already here, and there are a lot of them.
I appreciate serious replies only. Thank you very much.