Just wanted to know how people feel about areas like Seattle and Portland. Both are on the west coast, but they don't seem to get the same acceptance as "west coast" cities from the rest of the country's perspective. Do y'all think it's a problem? Couldn't care less? Think I'm wrong in my assumption? Speak ya mind.