[Replying to post 1 by Zzyzx]
I was taught "We are worthless, and can do nothing without god!" during my stint in christianity. However, we were also taught that what we do is worthless unless god is involved in it and it promotes god's agenda. So, basically, it's telling us we are worthless - period.
What would induce anyone to adopt a religion that starts with such a negative assumption? Guilt. Fear. Both are part of modern christianity, taught with force and apparent success. Some people just like to feel... anything... even if it's bad. Others like to be "victims," while still others like the "we will win in the end" aspect of christianity.
I don't believe that most christians truly understand what they're believing in or what their 'wondrous eternal life' encompasses; they may say they do, but in reality, they're in it for the "here and now."