Well, as much as the social justice crowd wants to pretend that everything was going great in Africa until evil white men showed up and started kidnapping innocent people, that's not exactly what went down. A big part of the slave trade ran on rival tribes and kingdoms capturing and selling each other, and that part almost never gets mentioned. I guess it's some kind of atonement thing, the same way nobody brings up anymore that some Indian tribes were quite violent long before Europeans showed up in America. Life is never black and white, but the way some history gets taught (in K-12 anyway) sure pretends it is.