When Joe Biden assumed office in 2021, the progressive press hoped, as the LA Times crowed, that he would "turn America into California again". To the great loss of America, the West, and, of course, Californians themselves, he is living up to that credo in spectacular fashion.