CelticViking
07-12-2011, 01:58 PM
In fact, what is it that white people, specifically white men, are supposed to "give back" to black people? Black Africans never owned any land; formal ownership of any kind is a Western concept. The 'black' tribes of the mid 19th century haphazardly
http://www.timeslive.co.za/ilive/2011/07/11/white-settlers-never-stole-any-land-from-africans