Thursday, August 11, 2011
Should Christian women go to college and have a career?
I've heard all the scriptures quoted that a Christian woman should not go to college but prepare for marriage, and that a woman should counsel and help her community. I agree, but only to an extent. What about teaching (children in the school system) and nursing, professions where a woman's touch and care are necessary? Don't we need some God-fearing women out there in the hospitals and schools? Nurses make a pretty good income, so I guess maybe that might make one's husband or future husband feel emasculated? In my opinion, we need some God-fearing women out there in the school system and in the hospitals at the patient's bedside being nurturers. I'm not limiting the career opportunities Christian women can have, but these are examples of "domestic" careers similar to her "biblical role." What do you guys think?