We’re going to Hawai’i in March and it seems like the universe is throwing so much stuff about Hawai’i into my world. I was on Facebook and saw an article about Hawai’i posted from Everyday Feminism, so I had to click on it. I just finished reading 3 Myths about Native Hawaiians You Ought to Know Before Visiting Paradise and it made me tear up a bit. I tend to forget that so much of the “history” I’ve been taught is some bs to promote the melting pot idea of the US.
I hadn’t done any reading about Hawai’i because I wasn’t interested in going. With all of the advertising, it just didn’t seem like a place I’d enjoy. Now that I am going, I’m learning a lot about the history of Hawai’i and the Native Hawai’ian peoples. It never occurred to me that, just like the First Nations peoples who live on the mainland, Native Hawai’ians have been pushed off of their land and banished from polite circles.
In the article, the author suggests that tourists visit Hawaii’s Plantation Village to learn a bit more about the past of Native Hawai’ians and others who have immigrated. Apparently, the tours are given by Native Hawai’ians. I don’t usually visit plantations (you know, the enslavement of Africans, the beatings, the rapes, the emotional abuse, etc., etc.) but I may make an exception.
Are there any other places we can go to learn about native culture and support Native Hawai’ians? Do share. =)