If they have always been Christianized and/or Westernized, does this have anything to do with the history of Western influence in Africa, for example, such as Rome's influence during imperial times and the influence of Saint Augustine of Hippo, etc.?
If black Americans did once have their own religion, culture and language, what happened to it all? Have they become assimilated?
I am trying to be as neutral as possible - and peace to those of African descent - so please try to understand my question, which is geared towards culture and religion (and it is not about racism).
2006-11-24 07:28:11 · 4 answers · asked by Yahoo user
Regarding the word 'pioneering': I was being extremely polite, if you know what I mean.
2006-11-24 07:36:33 · update #1