Do you feel more confident about the United States now that Joe Biden will be president?
For all of those people who stormed the U.S. Capitol building, there are millions more who feel lied to, cheated, disrespected and unrepresented by their government. That's not likely to change after Joe Biden becomes president; it will probably get much worse if Progressives continue to brazenly and arrogantly push their ideology and agenda on the entire country while disregarding or vilifying (or even punishing) anyone who opposes or resists them.
It reminds me of the old warning about winning the battle but losing the war. It's no victory for the United States as a country if half of its population feels disenfranchised and disrespected by the government, regardless of which half it is.
I think it's easy for people on the Left who are surrounded by people who share their views and are informed by a media that reinforces the same to forget that there are millions of their fellow Americans who disagree with them. Those other people are “America” too. Only catering to one side and constantly blaming and putting down the other is not good political strategy, at least not if you want a united country.
I think there is a lot of hypocrisy in the way the mainstream media covers the actions of the Left and the Right: right or wrong, justified or unjustified, major news or minor news; how an action is framed depends on which side is doing it, even when the actions are virtually identical.
Back in June, a district of Seattle, WA was taken over by Left-wing activists. The police department was seized. Activists with guns were patrolling the “Capitol Hill Autonomous Zone”. At least one person was shot and killed and others were wounded within the zone while local law enforcement had its hands tied by the city's mayor. And that situation lasted for weeks.
I wonder if people can see any difference in how that story, and other similar actions recently taken by Left-wingers, were/are being covered and how the Right-wing Capitol riot/protest is being covered and portrayed by the media and politicians? (Pay attention to the wording.)
When I heard about the Capitol riot, I was reminded of the “Boston Massacre”. History is written by the victors. If England had won the American War of Independence, the colonists would have been “despicable traitors”, not “patriot heroes”. Same events, but viewed through different eyes and framed by different authors.
Right now, I see the United States in a similar situation politically, with a large number of its citizens very dissatisfied with their government and feeling that their values, views, concerns and needs are not being respected, heard or served. That's a powder keg, and if Biden and his administration don't recognize this fact and take pains to mend what is already badly fractured, it will only result in greater resentment and hostility between the two sides and the crippling and unraveling of the Union as conceived by the Founding Fathers.
The very real ideological divide in this country (which is at the root of the political divide) needs to be seriously addressed if the United States is to remain united in any way but name only.
God Bless America and God Bless You! Be strong and stand behind the USA Constitution.