Is it true that the only places that matter in the USA are Texas and Utah?
all of the states are insignificant
Mexico matters more
as long as the midwest lives, america will survive.
Texas is the single most important state in the USA and don't let no one tell you otherwise
New Hampshire and Maine are still very American. They are just complete assholes though.
I like Texans, they're always so friendly when I go visit family.
I've visited many states in the US and everyone in all of them seems to be a stuck-up asshole.
Texas is the only one where I can walk up to any random fella in the area and he'd be happy to sit down for a cold beer.
Kek
That's also one rare pepe
Texas is the best and most important state
Spicland isn't American
Michigan is America
California is also important
Also Washington DC is important
Wrong, it's objectively Rhode Island.
why would utah matter
Yes, it is true
>utah
Only if you’re a mormon
clerk
you can’t forget best flag
based
They got some fire bitches over at Salt Lake City
I agree. Partial towards Maryland... 'cept for Baltimore
baltimore is actually pretty cool in some areas. and if you don’t like nogs, there’s a local kkk chapter there as well