Is it true that the only places that matter in the USA are Texas and Utah?

Attached: 1531825646848.png (800x800, 48K)

all of the states are insignificant

Mexico matters more

As long as the Midwest lives, America will survive.

Texas is the single most important state in the USA, and don't let anyone tell you otherwise

Attached: 1507437780535.png (1045x983, 105K)

New Hampshire and Maine are still very American. They are just complete assholes though.

I like Texans, they're always so friendly when I go visit family.
I've visited many states in the US, and the people in all of them seem to be stuck-up assholes.
Texas is the only one where I can speak to any random fella in the area and he'd be happy to sit down for a cold beer.

Kek

That's also one rare Pepe

Texas is the best and most important state

Spicland isn't American
Michigan is America

California is also important

Also Washington DC is important

Wrong, it's objectively Rhode Island.

Attached: 1499542899170.png (785x757, 47K)

Why would Utah matter?

Yes, it is true

>utah
Only if you're a Mormon

clerk

you can't forget the best flag

Attached: 559766D7-E832-4990-9FCD-E6F959BCEE36.png (2000x1333, 54K)

based

They got some fire bitches over at Salt Lake City.
I agree. I'm partial to Maryland... except for Baltimore.

Baltimore is actually pretty cool in some areas. And if you don't like nogs, there's a local KKK chapter there as well.