I was reading an article about Nigeria and how the oil companies have raped the country, with leaders pocketing huge oil $$$ while the people live in tin shacks and streets full of garbage. The people were quoted as saying they have faith in God and that He will improve their lives, etc. Wouldn't it be better to get off their backsides and do something to improve themselves instead of putting faith in a myth?
2007-02-14 13:59:51 · 3 answers · asked by Jason Bourne (Level 5) in Society & Culture ➔ Religion & Spirituality