Just out of pure curiosity: what exactly is the coastal Cali culture you're referring to? To me, California has always been as American as it gets. We went there on our honeymoon, and to me it's one of the nicest places on earth, with all the nature, the great canyon roads, the beach, all that tech and startup activity in Silicon Valley... I've always considered living there and sometimes still dream about it. Is my picture completely wrong? Is it actually nothing like I imagine it to be? What makes living there so horrible in your opinion that you're even afraid of being discriminated against? As a German, I really have no idea. Asking both of you: @HelpAndProsper @LaraJF

I'm sorry you live in California. I lived there for 12 years and will never go back now that it's a haven for everything indecent and un-American in the universe. Did I mention how much I detest the coastal Cali culture?