Now don’t get me wrong, Africa does have all the animals — well, you know what I’m trying to say — but that does not mean wild animals are walking around freely among us like our brothers and sisters.
I’ve seen lots of movies and music videos where, whenever Africa is mentioned or something is set there, there have to be animals. Don’t get me wrong, I understand that’s part of what makes us unique among the continents, but I feel like the idea is getting a little too old.
I mean, in Mean Girls, Cady visualises everything in a totally different way, and you’ve got us Africans looking at the TV like, no — you never see anything like that in Africa, nor do you see animals just walking about. I mean, come on.
I have no problem with animals, or with them representing Africa as a whole; I just feel like there should be more to Africa than animals and such.