At the risk of sounding like a libtard/commie... but just a curious question from a humble, clueless European:
I don't care about people's looks (women aside); I only care about people's actions and morals, whether they are patriots who live up to Western values and culture and the love of almighty God.
But in America, race seems to play a very important role in your society and in how you see each other. It seems as though you always have a need to distance yourselves from each other in that sense. (It's not about hate or anything; it's just that everyone seems so fixated on race it's ridiculous.)
Why is there a "black community"? An "Asian-American community"? A "white community"? "Latinos"? We in Europe would mostly see you all simply as Americans. I do understand that most of the immigrants to the New World were white Europeans, so in that sense it is historically a "white nation"; I respect that in regards to identity and history, and I also think it's fair to see it that way.
But today it just seems so weird to say without sounding racist, no matter what race you are. And don't get me wrong, almost every American I've met in the US was LOVELY! Only great encounters. We met a white cop in the middle of nowhere in Texas who stopped our car because I made a wrong turn. He was the typical "hillbilly cop" you see in movies, but he was just cool with us; he didn't give us a ticket because we were tourists, and he wished us a pleasant journey. Best cop I ever met. And as I said, I'm black myself, so it has nothing to do with that. It's just how you talk about each other, whether in a positive sense or not.
My friend from Sweden and I visited the US some years ago, and we did laugh about the fact that we rarely saw a black person and a white person walking side by side on the street. We did, and people assumed we were cops (LOL) xD.
All sides are doing it. I mean, it's strange to hear black people talk about racism when, in my opinion, the majority of black people are racist, and I'm black myself, just not American. I would dare to say that these days black people are more racist, against others as well as towards themselves, than any other group. But still, everyone, no matter what "group" you think you belong to, is always focusing on race. How will America ever be a united country if race-baiting is such a normal part of your culture? I hear all sides speak like this, and they always have.
Your military doesn't seem to think like this, but among civilians this is a huge part of how you look at each other. Reinforced by the Democrats, of course, but still...
Any ideas?
American people are the descendants of other countries; we lack the pride of someone who has a motherland. I think some people take advantage of our lacking that sense of belonging that comes from generations of family with real roots in the soil they stand on. We need to view America as our motherland, instead of saying, "Well, I'm half Irish, one quarter German, one-sixth Native American..." etc. We have never considered the soil we stand on our roots, and it doesn't help that we are taught from our youth that this land was stolen from Native American people. From the time we are small, we are trained that we did not originate here, and worse, current generations are being taught critical race theory. We are literally made to feel guilty for our heritage, so we naturally try to attach it to where our ancestors originally belonged. It's probably why we get along so well with Australia.