Saturday, February 29, 2020

Limbaugh Said What?


“The coronavirus is the common cold, folks.”

       With this statement, Rush Limbaugh reaffirms his position within the sordid ranks of world-class liars. He is, to be fair, simply echoing the sentiment of Pence, Trump and the others who have politicized and continue to politicize the widening world outbreak of coronavirus. On the other hand, an actual doctor, neurosurgeon and CNN medical correspondent Dr. Sanjay Gupta, put it in actual medical perspective: "This is a brand-new virus. I think what Rush Limbaugh is referring to is the idea that it is from a family of coronaviruses. That is the family name of these viruses, and some of them in the past have caused symptoms that were more consistent with the common cold. But it's also been the same family of viruses that caused SARS, that caused MERS."

       Lest we should view Limbaugh's pettifogging as unprecedented, we need only go back about a century, to 1918, to see that it is not. In early 1918, World War I was raging, and both sides viewed the end as being in sight. Germany, now on the defensive largely due to the entry of the United States a year earlier, in 1917, had morale problems, while on the battlefields of France in spring 1918, war-weary and battered Allied armies enthusiastically greeted fresh American troops arriving, as many as 10,000 a day, at French and British ports.

       In the USA, while Woodrow Wilson had achieved his goal of entering the war for various reasons, political and fiscal, there was still some significant anti-war sentiment. The last straw, or at least the most publicized and anger-inducing series of events, was Germany's declaration on 31 January 1917 that its U-boats would target neutral shipping in designated warzones. By March of that year, five American-flagged merchant freighters had been torpedoed, engendering the national outrage which preceded Wilson's request for a declaration of war, which Congress approved by wide margins: 82 to 6 in the Senate and 373 to 50 in the House.

       At this juncture, the US and its allies needed morale to remain high, and Germany, fading fast by early 1918, needed morale to remain as high as possible. Both sides would soon face an enemy which threatened them both and for which neither had a military solution.

       History sometimes "unfairly" tags persons, places, and even (in this case) diseases with a national moniker which is less than justified. Russian dressing isn't "Russian"; it was invented in New Hampshire. So it was with the "Spanish Flu" of 1918.

      In the spring of 1918, a new strain of influenza hit military camps in Europe on both sides of World War I. This (epidemics in military camps) was not a "new" phenomenon. Pneumonia, typhoid, diarrhea/dysentery, and malaria were all prevalent, and at times epidemic, during the Civil War, especially in encampments of new recruits. Altogether, two-thirds of the approximately 660,000 US Civil War deaths of soldiers on both sides were caused by uncontrolled infectious diseases, and several major campaigns were either abandoned or delayed by such outbreaks. While treatments for those Civil War-era illnesses had vastly improved since 1865, no preparation was even possible for what was soon to be labeled the "Spanish Flu."

       This new strain of flu affected soldiers significantly, but with relatively few fatalities on the first go-round. This should be no real surprise since, as today, flu was more dangerous to the elderly or those already in poor health.

       Even so, Britain, France, Germany and other European governments kept it (this first outbreak) a secret. They didn’t want to hand the other side a potential advantage. Spain, on the other hand, was a neutral country in the war. Consequently, when the disease hit there, the government and newspapers reported it accurately. Even the Spanish King Alfonso, a healthy 32-year-old, got sick, but recovered.
       Months later, however, a bigger and much deadlier wave swept across the globe. Since it seemed like it had started in Spain, even though it hadn’t, and because the Spanish told the truth, the new plague was called the “Spanish flu.” Even so, Britain, France, Germany and other European governments still kept it secret.

       Subsequent world events are well described by John M. Barry in "The Great Influenza: The Epic Story of the Deadliest Plague in History." When the second wave of Spanish flu hit globally, he writes, "there was outright censorship" in Europe. "In the United States, they didn't quite do that, but there was intense pressure not to say anything negative."

       Since US entry into the war, news about it had been carefully controlled and selectively rationed by the wartime Committee on Public Information (CPI), an independent federal agency whose architect, publicist Arthur Bullard, once said, "The force of an idea lies in its inspirational value. It matters very little if it is true or false." The "Committee" was, in essence, a national propaganda agency. Oddly enough (and unrelated to the topic at hand), Bullard was also a closet socialist and an admirer and chronicler of the Amana Society.

       Under Bullard's aegis, the CPI wrote and released thousands of positive stories about the war effort, and newspapers often republished them verbatim. Consequently, when the Spanish flu spread across the United States in the fall of 1918, both the government and the media continued the same "Candide" strategy "to keep morale up."

       Unlike the barrage of White House bloviation we are currently experiencing, Woodrow Wilson made, or specifically endorsed, no public statements. His Surgeon General observed that there was "no cause for alarm" if proper precautions were observed. Another top health official, author Barry says, dismissed it as "ordinary influenza by another name."

       But it wasn't. The Spanish flu had a mortality rate of 2 percent, much higher than seasonal influenza strains (typically around 0.1 percent), and similar to some early estimates about the coronavirus. It also differed in who died. Seasonal flu tends to be worst among the very young and very old. The Spanish flu was deadliest in young adults. You know, like soldiers crowded into military camps? (Note: this has not been, or so we've been told, the case with coronavirus.)

        In the main, most, if not all, local media followed the government's lead and censored bad news related to the epidemic. This had the effect of making matters worse.

       In Philadelphia, for a specific example, local officials were planning the largest parade in the city's history. Shortly before this event, about 300 returning soldiers started spreading the virus in the city. "Basically every doctor, they were telling reporters the parade shouldn't happen. The reporters were writing the stories; editors were killing them," Mr. Barry recounts. "The Philadelphia papers wouldn't print anything about it."

       The parade was held in spite of the suppressed warnings and, 48 hours later, Spanish flu slammed the city. Even after schools were closed and public gatherings were banned, city officials kept insisting the closures were "not a public health measure" and that there was "no cause for alarm." (One can almost hear the chubby policeman in South Park reciting "Nothing to see here!")

       By the time the epidemic had run its course, Philadelphia had become one of the hardest-hit areas of the country. Bodies of the dead lay in their beds and, in some cases, even on the streets for days; many, if not most, eventually ended up in mass graves. More than 12,500 residents died, according to the Philadelphia Inquirer's estimate. The flu can kill tens of millions of people. In 1918, that's exactly what it did.


    Army camp in US set up as Flu hospital, 1918

       If a newspaper reported the truth, the government threatened it. The Jefferson County Union in Wisconsin warned about the seriousness of the flu on Sept. 27, 1918, less than two months before the armistice stopped the guns in France. Within days, an Army general began prosecution against the paper under a wartime sedition act, claiming it had "depressed morale."

       The worldwide pandemic raged on through October of 1918, and most Americans came to see that the "blind reassurances" coming from local and national officials were simply untrue. This predictable crisis of credibility led to wild rumors about bogus cures and unnecessary precautions, author Barry says, and we are seeing them today in social media and even broadcast media. The disgraced, and epically loathsome, Jim Bakker is even huckstering a brand of colloidal silver, which he has previously claimed "cures all venereal diseases," as a sure "coronavirus killer." Note: colloidal silver has no value, medicinal or otherwise, except to the seller.

       The Spanish flu ultimately killed about 50 million people worldwide, including 675,000 people in the United States, according to the Centers for Disease Control and Prevention. Even President Wilson caught it, in the middle of negotiations to end the Great War. This outbreak (coronavirus) isn’t the Spanish Flu, by any means, but the present White House response is making the truth harder to discern.

       My point? Listen to your doctor, take sensible precautions if and when they become necessary, and vote in November, because the current administration has, by putting Mike Pence in a position which should be held by a qualified medical expert, demonstrated its true allegiances: to the Dow Jones Industrials and its red-hat moron cadre.

Saturday, February 22, 2020

Class and advertising


        As I listen, to the extent that I can stomach it, to the ebb and flow of current political discussions, it remains obvious that we have a huge gap between the haves and have-nots in our society, not to mention the widely varying ways proposed to address (or ignore) these issues.

        American beliefs about social class, especially as portrayed by commercial advertising, are not always reflective of reality. Up to half of Americans claim to belong to the middle class, and many believe in an egalitarian society where social mobility is possible. These misperceptions, I feel, are attributable, in part, to the mass media.

       There are, today, to a greater extent than ever before, very visible examples thrown at minimum-wage and even middle-class workers which serve to accentuate and exacerbate this sense of class difference. Many of these are media-driven and intended for commercial purposes, without any consideration of the message as it reaches less fortunate Americans.

       Examples: Every Christmas we are bombarded with automobile ads featuring couples buying not just one automobile, but one for each other. This represents, at a minimum, $70,000 in spending. I can only imagine how the average driver of a six- or seven-year-old car views this. Certainly, it must instill a sense of, “What have we done wrong?” or “How have we failed?”

        What actually triggered this essay is the current ad for Peloton, which shows a fit young woman fitting 20 miles on the exercise bike, plus the accompanying subscription electronic programming, into her hectic daily routine. The commercial ends with the assertion that, at $58 per month, it's "for everyone." That $58 per month is a 3½-year commitment totaling over $2,200 for the bike and access to the app. While this may seem reasonable to many of us, it's a stretch for lower-income families who have other concerns like, say, food, clothing and gas for their car.
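       For the record, here is a minimal back-of-envelope sketch of that commitment, taking the ad's $58-per-month figure and this essay's 3½-year term at face value (actual Peloton financing terms may differ):

```python
# Back-of-envelope arithmetic for the Peloton commitment described above.
# Assumptions: $58/month (from the ad) over 42 months (3.5 years, per the essay).
monthly_payment = 58
months = int(3.5 * 12)                # 42 months
total = monthly_payment * months      # 58 * 42 = 2,436

print(f"Total commitment: ${total:,}")   # -> Total commitment: $2,436 ("over $2,200")
```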

       Many automobile ads target perhaps a tenth of the viewing audience as potential buyers. The same is true of ads for "wealth management" companies, which feature couples simply "adjusting" assets to afford college, vacation homes, etc.

        So, from a historical perspective, how did this develop? Immediately following WWII, the United States experienced an economic expansion, spurred in part by the GI Bill, which (brilliantly) allowed avoidance of a post-war recession.

        Additionally, with many world industrial economies in ruin, the US, untouched, was exporting worldwide. This was a boom in which all social classes participated. Yet, by the 1970s, this growth had stalled and begun to reverse in what has been termed by some economists "the Great U-turn." Beginning in the late 1970s, American society began to transition from roughly equal (on a percentage basis) real-income growth across socio-economic strata to the current historic inequality in wages, earnings, and consumption.

       Real middle-class incomes declined and both inflation and recessions battered the middle class.  However, during this same stretch, the top fifth of income earners, and particularly the very top, took a greater share of income. 

       As a point of interest, this was facilitated by significant decreases in the highest marginal income tax rates: from 90% (1953-1962), to 70% (1966-1980), to less than 30% (1987), and back up only slightly to around 37% today. It is worthy of note that regardless of these reductions in tax rates on upper-income earners, there were still recessions which, predictably, adversely affected lower-income Americans far more than the top "10 percenters." In fact, the vaunted "Reagan tax cut" was promptly followed by the "Reagan recession."

       Perhaps the most concise and cogent statement describing the current effects and mindset of mass-media advertisers comes from Gregory Mantsios' essay "Media Magic: Making Class Invisible." His thesis: "The United States is the most highly stratified society in the industrialized world. Class distinctions operate in virtually every aspect of our lives, yet remarkably, we, as a nation, retain illusions about living in an egalitarian society. We maintain these illusions, in large part, because the media hides gross inequalities from public view. In those instances when inequalities are revealed, we are provided with messages that obscure the nature of class realities and blame the victim."

        Analysis of all that is inherent in that statement is the stuff of a doctoral thesis, not just this essay. Suffice it to say, as briefly as I can manage: advertising tends to portray upper-class individuals as benevolent and sharing in most cases, and working-class individuals, if they are "hard" workers, as having a folksy, aw-shucks, "we're here to help" attitude.

       Remember "Madge," the manicurist, her working-class status evident through her occupation and her name embroidered on her uniform, telling her client a secret: you don't need a fancy, high-priced product to soften your hands; Palmolive dishwashing soap is perfect for the job, and it costs just pennies? Madge provides valuable knowledge that the client supposedly does not and would not possess. The common sense, authenticity, and folksy wisdom of the working class are leveraged. (Of course, today, "Madge" would most likely be Asian.)

       By contrast, we almost never see non-working, "working class" persons in commercials, reserving that coverage, instead, for public service spots or the nightly news. This isn't new; even in Elizabethan times the term "deserving poor" was used to societally segregate those in need who were unable to work because they were too old, disabled, or too sick. The "undeserving poor" were people who didn't want to work, and it was often assumed that all able-bodied unemployed people fit into that category. Little has changed in the world as relates to this dilemma.

        This is compounded in the here and now by political efforts of the current administration to "brand" Republicans and Republicanism as inclusive and representing (and "protecting") the welfare of the middle and working classes, as long as they are white, Anglo-Saxon and Protestant. The truly devious part of this tactic is that hard work matters a lot less if you're Brown, Muslim, and/or undocumented. All this, of course, is driven by the ultimate elitist and top 1 percenter in the White House.

       Mass media in the form of Twitter and other means of mass communication have enabled the conveyance of this most deceitful of messages. Political advertising has stolen a page from commercial adverts in several ways. First, with cheap hats, tee shirts and slogans, Donald Trump's machine has allowed lower-middle-class and downright poor folks to feel as if they "belong" to a movement which favors them, headed by a leader who "likes" them, when the inverse is true.

       The media magic here is the lack of any apparent Republican rank-and-file outrage, even in the face of Trump's pardons of several devious felons, one of whom was Michael Milken, who invented the junk bond market and promoted its use in hostile corporate takeovers that destroyed businesses, labor unions, and working-class job security while enriching a tiny corporate elite. No matter, the Red Hat middle class and middle-class wannabees swallow it hook, line, and sinker.

        Isn’t advertising truly the great American invention?      

Monday, February 17, 2020

One way or the other, not both!




       Some time ago, my beloved (and very bright) younger grandson posted (to Facebook) a picture of a 4th grade "science" test. It isn't clear whether this is from one of the (too numerous) "Home School" packages produced by religious ignoramuses, or curriculum from a "Christian School." Because of the resurgence of lunatic Evangelical conservatism and science denial under the red ball caps, and since this "test" is indicative of their mindset, I have reworked this.



      Although administered in 2013, one supposes little has changed other than that the 4th grader should now be a graduating high school senior. Unless there has been a reality intervention, this would be one of the most scientifically illiterate grads ever. In the first place, this is a totally true/false test, which is code for either home-schooled or "Christian" school "easy A." In the second place, it is rife with supposed "science" questions which were to be answered based on a Bible perspective and are devoid of scientific validity. The test is shown below.

       



       Another reader, unknown to me, then proceeded to defend this test, and the thread spiraled into a debate of science v. religion, in which the reader began to discuss the "brilliance" of religious thinkers and the "rational thought" which had been applied to create what we, today, properly refer to as "dogma."

       Dogma, defined, is: "A principle or set of principles laid down by an authority as incontrovertibly true." "Dogma", the movie, makes far more sense and is much funnier.

       At this point, my brother-in-law, an actual scientist and very bright guy, entered the conversation and made several concise, valid points, receiving a somewhat scathing rebuke from the original writer, essentially ridiculing him for using logic to show the flaws in the writer's argument. Of course, I felt compelled to respond, not that my bro needed any assistance in dealing with this mental lightweight, but, rather, because the responder's lack of civility and lucidity astounded me. What I wrote is below:

       (Name of the "other guy"), you truly exhibit the messianic zeal of the true believer. In fact, as S**** so eloquently put it, no one has ever debunked any scientific assertion with religion, yet our history is full of the opposite. His point was that Copernicus and Galileo made the observations available to them and theorized possible reasons for those observations. When those theories ran counter to dogmatic teachings of organized religion ("the Church") Galileo was persecuted, and Copernicus had conveniently expired, having waited until the end of his life to publish.

       As to your comments regarding the 2000 year old institution of Christianity and the "brilliant minded" men:  


      Superstitious adherence to a dogmatic position, unfortunately, can infect even the brilliant. Isaac Newton, while brilliant, was also hyper-religious and believed in alchemy. As Peter Medawar states, "I cannot give any scientist of any age better advice than this: the intensity of a conviction that a hypothesis is true has no bearing on whether it is true or not."

       Let me simplify that for you. The fact that you desperately want, or have an emotional need, to "believe in" a myth makes it no more true. Yes, Jethro, the sincerity and degree of zealotry associated with any religious or pseudo-scientific belief simply has zero bearing on its being true. If simply believing made it so, many of us would have gotten that pony on our 4th or 5th Christmas.

       The contrast in these two positions is that, when additional scientific study shows a theory to be incorrect or incomplete, it is replaced by the rational thinker with the newly revealed reality. Science doesn't claim immutability. Sadly, the inverse is true of religious doctrinal belief.

       No matter how many brilliant scriptural experts analyze documents and rethink the validity of allegedly scriptural writings describing "quotes" from Jesus made when he was alone, and no matter how obvious it has become to most modern scriptologists that many of the letters attributed to the Apostle Paul weren't written by him, no matter how ludicrous it now seems that the Crusaders were a bunch of moral Christian knights who only wanted to free the "Holy land" from the Infidel, there are those of the faith who refuse to believe such things.

       Two more quotes and then I'll leave you to continue your novena for the souls of unbelievers who choose the Scientific Method over superstition:

       “One must state it plainly. Religion comes from the period of human prehistory where nobody—not even the mighty Democritus who concluded that all matter was made from atoms—had the smallest idea what was going on. It comes from the bawling and fearful infancy of our species, and is a babyish attempt to meet our inescapable demand for knowledge (as well as for comfort, reassurance and other infantile needs). Today the least educated of my children knows much more about the natural order than any of the founders of religion, and one would like to think—though the connection is not a fully demonstrable one—that this is why they seem so uninterested in sending fellow humans to hell.” - Christopher Hitchens

       The late Isaac Asimov's remark about the infantilism of pseudoscience is equally applicable to religion: 

       
       "Inspect every piece of pseudoscience and you will find a security blanket, a thumb to suck, a skirt to hold.' It is astonishing, moreover, how many people are unable to understand that 'X is comforting' does not imply 'X is true.”

       A final challenge for the "test" designer responsible for the drivel above: if you, as you obviously do, believe in a 6,000-year-old earth with coexisting species which modern science now places eons (millions of years) apart, riddle me these simple questions.

If, as you insist, the earth is a mere 6,000 years old, then either a) Earth was all one huge land mass, and therefore all the species were on one huge continent, or b) the continents were "created" as they are today, separated by vast expanses of water. (Note: many Evangelicals now "hedge their bets," staunchly clinging to the idea that the split of the continents happened "spasmodically" as a result of the Noah Flood, in other words after creation, when all that lives or has ever lived existed in one place.)

       If a) is true, then shouldn't all present-day species be present on all continents? They are not. Jaguars and tigers (for very simple examples) are absent from Africa, Europe and Australia. So are poison dart frogs, which Noah must have handled very carefully. This implies, as well, that all the species we now have must have evolved (gasp) from the originals. There are no fossils or other prehistoric remains of jaguars, llamas, groundhogs, monk seals, pronghorn antelope or almost all Amazon rain forest basin creatures outside the New World. The same is true of tomatoes, potatoes and maize.

If b) is true, that explains why T. rex fossils are found only in North America, but it leaves the question: if God created all the critters in one place and they were all coexistent with Adam and Eve, why are there so many variants which exist variously all over the world, but not in the "Holy Land"? Moreover, how did they, and the humans with whom you insist they coexisted, get there? Did kangaroos swim to Australia with koalas on their backs?

Once again, one wonders.

Saturday, February 15, 2020

On Patriotism and Justice


               

       There's an article in the WaPo today regarding a student who refused to stand for the national anthem in a classroom. The teacher told her to "go back to your own country." I suspect the teacher is, like so many who labor in ignorance, one of those who believe that the National Anthem and the Pledge of Allegiance were somehow spontaneously delivered by God on stone tablets to George Washington's front yard on July 4, 1776.

       
           In fact, "In God We Trust," the Pledge, and the National Anthem are all much younger than the nation, and, as much as anything, represent efforts by the United States government to instill patriotism by rote, rather than by good example.

          Although "In God We Trust" was first stamped on a two-cent coin in 1864 (a wartime plea to the supernatural?), it wasn't until a law, passed as a Joint Resolution by the 84th Congress and approved by President Dwight Eisenhower on July 30, 1956, that its presence on American currency was required. In 1957, the phrase was used on paper money for the first time, on an "updated" one-dollar silver certificate that entered circulation on October 1, 1957.

       At this point, I'd like to insert an example of real patriotism, and I'll use Dwight Eisenhower for two such. First, the night before the D-Day invasion, he penned a letter taking complete responsibility for the failure of the invasion, should it happen. That's a leader demonstrating accountability, absolving all who might have died on his orders and putting mission before self. This is essentially the most un-Trump action of which I can conceive.

        In like fashion, when first apprised of the horrors of the concentration camps being liberated by the advancing Allied armies, he insisted that they be photographed by Army photographers, stating, in essence, that "Years from now some will attempt to say that this didn't happen." Publicizing the horrors of the camps wasn't pleasant, but a true patriot, Ike understood the necessity, even while understanding that some, even in the US, weren't all that concerned.

       The Pledge of Allegiance was written in August 1892 by a socialist clergyman, Francis Bellamy. It was originally published in The Youth's Companion (magazine) on September 8, 1892. Bellamy had hoped that the pledge would be used by citizens in any country. As he wrote it, “The Pledge” reads:

       "I pledge allegiance to my Flag and the Republic for which it stands, one nation, indivisible, with liberty and justice for all." That’s all folks. Not “The” flag, or “under God.”

       The “Under God” part, much like “In God We Trust,” stems from the age of Joe McCarthy, when “Godless Communism” was the brush with which arch-conservatives painted any who disagreed with them. Accordingly, in 1954, in response to the perceived Communist threat, President Eisenhower encouraged Congress to add the words "under God," creating the 31-word pledge we say today. I find it worthy of note that Reverend Bellamy's daughter objected to this alteration.

       Another feature of Bellamy's original construct was (as he wrote in the magazine article): "At the words, 'to my Flag,' the right hand is extended gracefully, palm upward, toward the Flag, and remains in this gesture till the end of the affirmation; whereupon all hands immediately drop to the side." Shortly thereafter, this morphed into the pledge beginning with the right hand over the heart; then, at the words "to the Flag," the arm was extended straight out toward the Flag, palm down.

       This was so patently identical to the Nazi salute that, by a sort of national agreement around 1942-43, the pledge was changed to keep the hand over the heart. In spite of this, I vividly recall being taught by Sister Hermes, in Catholic kindergarten in 1947-48, to extend the hand, à la the Nazi salute. I "unlearned" it in first grade of public school.

       Finally, the National Anthem. The lyrics come from an 1814 poem, "The Defense of Fort McHenry," written by Francis Scott Key during the War of 1812. On a strictly personal note, my maternal great- and great-great-grandfathers were in Fort McHenry at the time. Coincidentally, Mr. Key is buried about 50 feet from my maternal grandmother in Frederick, Md. Cool, huh? The first "official" use of the poem, now set to the tune of an old English men's social club (pub) song called "To Anacreon in Heaven," was by the US Navy in 1889. It didn't become "officially" the US National Anthem until a Congressional resolution in 1931, in the depths of the Great Depression. Again, in crisis, let's mandate a little shot of patriotism so they'll forget the economic disaster brought on by unmitigated (and grossly under-regulated) greed.

        As seen in the preceding paragraphs, there seems to be a note of enforced patriotism involved in the timing of these events becoming "official" vice traditional. Likewise, the playing of the National Anthem during sporting events has become a traditional, but not mandated, occurrence in America. Until after WWII, no NFL crowd ever heard the national anthem pre-game.

       By USSC decision, a student in a public school may refuse to stand or to say the pledge, but it wasn't always so. In 1940, in the case of Minersville School District v. Gobitis, the Court held that a public school could force students who were Jehovah's Witnesses to salute the flag and say the Pledge.

      Only three years later, however, the Court changed its course in West Virginia State Board of Education v. Barnette, where the majority reversed the Gobitis decision and held that “the Free Speech clause of the First Amendment prohibits public schools from forcing students to salute the American flag and say the Pledge of Allegiance.”

       Note that this specifies, “Public Schools.” Nowhere else in law is any other venue involved in any Pledge (or national anthem) decisions. The majority decision in “Barnette” states, in part:

       “If there is any fixed star in our constitutional constellation, it is that no official, high or petty, can prescribe what shall be orthodox in politics, nationalism, religion, or other matters of opinion or force citizens to confess by word or act their faith therein.”



       So the next time anyone, President, red-cap wearer, or just plain bozo, screams about or calls for the firing of an athlete taking a knee to protest real social injustice, remember: the athlete has the absolute right to do so, and if you think love of country, or the wish to see our nation be the best it can and should be, hinges on a slogan or a song, maybe you don't understand the word "liberty."



Note specifically the beginning words, "patriotism on command." This is not an unpatriotic statement.

       But what is patriotism, anyway? Is it a genuine love for country, a belief in its ideals even at personal cost, a desire to honor those who have sacrificed to keep it safe, a willingness to help it move closer to its ideals? Or is it weaponizing the symbols of your country in order to trample over human beings, and becoming very, very upset whenever a black athlete engages in political speech? Ask Donald Trump.

Wednesday, February 12, 2020

Why Not Bernie? An Electoral History Lesson



Lots of flak today re: "Bernie." In response to an earlier question re: my opinion on a realistic Democratic ticket: I think a Warren/Buttigieg (or vice-versa) ticket could win, but I feel Sanders would condemn us to four more Trump years.

       Remember, we're not talking to a politically sophisticated or informed country as a whole, and Trump supporters are at the bottom of that barrel. They will never be won over by any non-rightist candidate; therefore, any split in the Democratic ranks will be tragic.

       Sanders is a self-styled Socialist for whom only one of his Senate Democratic colleagues has voiced support. Even those who feel they do "understand socialism" have little or no real-world political basis for that belief. Exemplary of this is the belief that the Green New Deal is a panacea for "what ails us," indicating profound ignorance of how we got here and of how deeply entrenched a market economy is in Americans, even at the small-businessman level.

Using the term Socialism, as Sanders does, by implication and definition means (Oxford English Dictionary):

       "A political and economic theory of social organization which advocates that the means of production, distribution, and exchange should be owned or regulated by the community as a whole." "Regulated" somewhat reflects the concept of a regulated market economy, but most Americans see "owned" and reject that out of hand.

       We have too long a history of entrepreneurial business to simply reverse course. There is way too much money in far too many hands. The best course would be agreement that, as Wilson said, regulation in the public interest is appropriate. Theodore Roosevelt, himself a Republican, but also the first to use the Sherman Act to legally limit the public abuses by businesses of the age of robber barons (see Rockefeller, John D.), had said earlier, "When we control business in the public interest we are also bound to encourage it in the public interest or it will be a bad thing for everybody and worst of all for those on whose behalf the control is nominally exercised." I think there may well be a great truth in that idea. It is possible to encourage entrepreneurialism while limiting its abuses. Dodd-Frank was a good-faith attempt to do that. You saw how quickly Trump backed us out of Dodd-Frank.
 
       Without massive social upheaval (and I mean literally), the neutral ground would be to fairly regulate what private industry can and can't do. This is what Trump has tried, to a great degree, to dismantle. Sanders verges on being an out-and-out Socialist and is, in my opinion, almost as reactionary to the left as Trump is to the right. In point of fact, based on anecdotal accounts from those who have actually worked (or tried to work) with him, he is abrasive and doesn't play well with others. Democratic "centrists" would probably not support him.

        To begin, let's just examine Socialist "electability" today and, because it's frequently an accurate bellwether, the historic appeal of Socialism, as most Americans (mis)understand it, in the 20th century.

       In 1924, styling himself a "Progressive," Robert ("Fighting Bob") La Follette, having served as Governor, Representative and Senator from Wisconsin, ran for President. Although nominally a Progressive, his was, admittedly, a Socialist politics far to the left of Theodore Roosevelt, the Republican originator of the term ("Progressive") and the 1912 candidate for president on that party's ticket. (T.R. actually supported universal healthcare!)

       La Follette stated that his chief goal was to break the "combined power of the private monopoly system over the political and economic life of the American people," and he called for government ownership of railroads and electric utilities, cheap credit for farmers, the outlawing of child labor, stronger laws to help labor unions, and protections for civil liberties. His diverse coalition proved challenging to manage, and the Republicans rallied to claim victory in the 1924 election. It is noteworthy that La Follette, while actually winning one state (Wisconsin) and gaining 13 electoral votes in one of the best third-party performances in U.S. history, lost big. He died soon after, a political footnote to all but us history geeks. An addendum: while La Follette himself was essentially unelectable, parts of his program resonated and were eventually enacted. The lesson here is that although rigid doctrinairism usually loses, cooperation with the ability to compromise frequently succeeds.

       Jump ahead to 1932. If there was ever a time when Socialism should have "looked good," this was it. The US economy was in disarray, unemployment at 18% and climbing. Many families were financially bereft of basic needs. Husbands even left their families and became hobos so that the mother could qualify for such "welfare" as there was.

       In the midst of this misery, one might be expected to find a "share the wealth" Socialist ticket attractive. One who did was Presbyterian minister Norman Thomas. He got a rousing 2.23% of the popular vote and no electoral votes, which was better than his 1928 result of 0.73% and no electoral votes. Although he ran four more times, he never again came close to 1% of the popular vote.

       Why go into all this? Because we can learn from history. As sad as you may consider it, there are two facts of political life in the United States that are historic absolutes. 1) Third parties don't do well in American elections. Even the immensely popular and progressive Theodore Roosevelt failed as a third-party candidate. In 1992, Reform Party candidate H. Ross Perot spent multi-millions of his own and others' money proving the same thing. He won exactly no electoral votes; yet, had even half his supporters voted for Bush senior, Clinton would have lost badly. 2) Well worth considering dispassionately, as many still refuse to do in the rubble of the 2016 catastrophe, is the fact that, just as in 1912, when TR's third-party campaign split the Republican vote and ushered Woodrow Wilson into the White House, third-party voters frequently accomplish what in many cases they may live to regret.

      While it's difficult to figure out what Jill Stein voters thought they might accomplish, it's a pretty safe bet that what they had in mind definitely wasn't to place Donald Trump in the White House... but that's what they did. In Wisconsin, Stein voters cast more than enough votes which, had they gone to Clinton, would have given Mrs. Clinton that state's electoral votes; the same was true in Michigan and Pennsylvania, and thereby the election. Ralph Nader's Florida "Green" voters, by not voting for the (really, really "green" himself) Al Gore, took 97,488 votes from Gore and allowed George W. Bush to eke out a 537-vote squeaker in Florida, which Gore might well have won by more than 97,000 votes, keeping deceased shithead Antonin Scalia and the rest of the USSC the hell out of it. Think about that. No Iraq War; what would Dick Cheney have had to occupy himself?
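       For the arithmetic-minded, here is a minimal sketch of just how thin that Florida margin was against the Nader vote. The two figures are the ones cited above; the percentage is simple division, not a claim about how Nader voters would actually have broken:

```python
# Florida 2000: how small a share of Nader's vote would have flipped the state?
nader_votes = 97_488      # Nader's Florida total, as cited above
bush_margin = 537         # Bush's certified statewide margin over Gore

# Each Nader voter who votes Gore instead adds one vote to Gore's column,
# so Gore needs just bush_margin + 1 of them to win Florida outright:
needed = bush_margin + 1                 # 538 voters
share = needed / nader_votes             # ~0.55% of Nader's total

print(f"{needed} voters, about {share:.2%} of Nader's Florida vote")
```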

       So, I hope I have outlined two different but related takeaways. The first is that the tag "Socialist" hasn't ever played well in America and would become even more pejorative in the morally bereft hands of the Trump campaign, especially if the other candidate (Sanders himself) declares it to be his standard. The second, a corollary, is that, as in 1912 (Taft/Roosevelt), a split party runs far worse than one unified. It should also concern possible Sanders supporters that if he were to be elected, at his inauguration he would already be older than Ronald Reagan, in the early clutches of Alzheimer's, was when he left office! And spare me the "ageism" bullshit. I'm just a year younger than Sanders and healthier, but I would never consider running for a four-year term of anything.

        Finally, once the Democratic national convention selects a candidate (and Sanders is, I believe, the least viable, as in the least electable, of them), we must all close ranks in support of whomever that may be, since failing to do so all but guarantees four more years of madness. To me, just the thought of a Donald Trump in a second term, with no reelection to worry about, is frightening. Remember, too, that the 33 Senate elections are critical in 2020. A Senate Democratic majority would castrate Trump. I like the sound of that.

Monday, February 10, 2020

They Did It Again



       The section of our newspaper containing the Crossword, Jumble, Sudoku, and Bridge column also runs the daily wine reviews. Though I try not to look, I seem to be inexorably drawn to them. It's not that I am all that interested, or that I need someone else's advice. Rather, it's the historical word-mangling and purple prose the reviewers use. I don't think these are locally generated, else I would have sought out and crippled the writers a while ago.

       Think I'm overreacting? Here's today's drool-bucket load of verbal vomit. Warning: this is verbatim, therefore some of the comments may be sane, but I think you'll know when you get to the parts that make me wonder, "What the F**K?"

“2017 Rex Hill Pinot Noir, Willamette Valley, Oregon, $35: The wine has scents of rich, dark, berries, pomegranate, rhubarb, caramel, mulling spices and tobacco with secondary hints of balsa wood, flint, slate and graphite. The palate is expansive and energetic with moderate tannins.”

      After reading this one, I wondered: did this guy eat his model planes to master the nuanced flavor palette that balsa wood brings? Or maybe he has licked a wide variety of stones so as to differentiate the subtleties of slate and flint. The last time I even saw graphite, it was as a black powder used to lubricate locks. Graphite is an allotrope of carbon, so the writer might try licking the inside of his chimney for the same effect; oddly enough, graphite is also sometimes used as a nuclear reactor moderator!

      As for the actual edible substances referenced (pomegranate, rhubarb(?), caramel, mulling spices and, just marginally, tobacco)? That mix sounds a bit more like really cheap potpourri. It's difficult to grasp all that this bozo says is happening in this bottle. I think he's being paid by the adjective.


       Now the rest of the story: In researching this wine, I was struck by several facts. First, this wine is available many places at about $26 per bottle. I wondered about that, so I went to the website of the vintner and, wow!

     Lo and behold, only Rex Hill's website lists the wine at that $35 price. Furthermore, the review (by the winery, of its own product) is, verbatim, the one in today's paper. Yep, this isn't even a review, it's a friggin' commercial!

       I would never be so bold as to style myself an oenophile, but I have some (make that a lot of) experience with wines, ranging from the very bad (remember Mateus Rosé, Boone's Farm or that Chianti bottle with the straw basket?) to the very good, and price has never been an absolute indicator of quality. For example, although there is an almost instinctive urge to skip the "Naked Wines" ads, their products, all very drinkable and made by small producers with no (other) national distribution, are generally worth significantly more than they cost.

Just one example, and this by way of amplification of information, not a commercial for Naked Wines:


        F. Stephen Millier is a California winemaker, growing in stature, but originally one of Naked Wines' small producers. His very nice 2017 Amador County Zin (no, you can't buy it anywhere near here) retails in local California outlets in the vicinity of $19.99 per bottle. Naked Wines retails the same bottle at $13.99 (30% less, for the math-challenged). The difference? No national advertising or wine-merchant markup. But wait, it gets worse. Order a bottle in almost any restaurant and bend over. Restaurant markups average two and a half to three times wholesale cost. A bottle priced at $10 wholesale might sell for $15 retail, but $25 to $30 in a restaurant. What does the restaurateur do to justify this price gouge? Nuthin' I don't do at home. Open, pour... that's all, folks.
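       For the math-challenged among us, here is a minimal sketch of that markup arithmetic, using only the illustrative prices from this essay (not data from any wine list or database):

```python
# Wine markup arithmetic, using the example prices cited above.
wholesale = 10.00                                 # example wholesale bottle price
retail = wholesale * 1.5                          # typical retail price -> $15.00
restaurant = (wholesale * 2.5, wholesale * 3.0)   # $25.00 to $30.00

# The Millier Zinfandel example: $19.99 local retail vs. $13.99 direct.
local, direct = 19.99, 13.99
discount = (local - direct) / local               # ~0.30, i.e. about 30% less

print(f"retail ${retail:.2f}, restaurant ${restaurant[0]:.2f}-${restaurant[1]:.2f}")
print(f"direct price is {discount:.0%} below local retail")
```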

      A group of academics (of all disciplines) at the University of Minnesota held more than 6,000 blind tastings. They found that "the correlation between price and overall rating is small and negative, suggesting that individuals on average enjoy more expensive wines slightly less." Yep! That's what they found. Perhaps balsa, rhubarb and graphite are overrated?

As the old saying goes, and it relates to many personal choices, not just wine: "I'm no expert, but I know what I like." This is especially true where wine is concerned.

Saturday, February 8, 2020

An evening with "Old Yeller"




Lewis Black


        Our son and son-in-law, David & Scott, gave Emily and me tickets to see "Old Yeller" (as he styles himself), Lewis Black, at the Bob Carr Theater in Orlando last night. Much more on the show in a moment.

       The Carr, completed in 1926, was for decades the venue in Orlando for touring shows. Over the years we lived in Orlando, we saw a fair number of excellent stage productions there, ranging from the sublime (Book of Mormon) to the lyrical (Les Mis, Wicked, Phantom) to the mundane and banal (Chitty Chitty Bang Bang, my personal nominee for the "musical which should never have left tryouts" award).


       Fortunately, O-town now has a real world-class facility in the Dr. Phillips Center for the Performing Arts. The Carr, for its part, has a lovely atrium, courteous staff and a decent bar, and the audio system seems to have been reworked recently as well. These attributes make it a nice second-tier venue for the city.

      Unfortunately, the Bob Carr also has orchestra rows 60 seats across with no center aisle. Let me make this clear: if you are in good seats (we were, thanks, guys!) in orchestra left (seats 16 and 17), there are at least 14 more seats which can only be reached by people crawling past you. This is made worse by the paucity of space between rows, front to back. It's a case of either stand up multiple times or remain seated and get ass in your face. The auditoriums in every high school in Orlando are superior in seating design.


       Please understand, this isn't a rap on the entire facility, which benefits from great staff, great parking accessibility and a stream of good attractions, one of which Mr. Black certainly is! While the City of Orlando has obviously spent significantly to update the rest (new easy entrance, nice lobby, etc.), it has been remiss in failing to address the seating issue, which has remained the same since the first time we went to the venue, about 43 years ago.

        This may sound a bit nit-picky, but I assure you, we were far from the only persons who found the lack of a center aisle a serious pain in the butt. We were again treated to facefuls of ass at intermission as several Lizzo-sized people in the center clambered out to the lobby and back. This nice, classic venue needs new, modern, accessible seating.

      That off my chest: the pre-show was comic Jeff Stilson, with whom we were unfamiliar. Stilson, while not exactly a household name, has killer creds as a writer and producer for the likes of The Daily Show, Letterman, and Chris Rock. He is also a multiple Emmy winner and a nominee across half a dozen Emmy and Oscar awards shows. His 30 minutes was far from a "warm-up." I'd have paid to see him headline!

       Lewis Black was, as expected, brilliant, outrageous and hilarious. He also dealt deftly with several unrequested audience "shout-outs" of the embarrassing kind that make you reflect, "Yeah, I remember my first beer."


        His was a rant for all seasons, with material ranging from the current tour's first stop (Rapid City, South Dakota, and why the f**k would anyone choose to live there?), I-4 traffic, drug costs and catchy adverts, to aging parents (his mom is 101!) and on, as expected, to impeachment. The last was all new material which he used last night for the first time. Not unexpectedly, Mr. Black is less than fond of the President. All in all, a great night of standup... until.

       One feature of Black's tours for several years involves Stilson, who acts as emcee as well. After his "open" and before Black begins, during a 15-minute intermission, he urges audience members to feel free to text their own "rants" to him for use after the routine. Once Black has done his standard routine, he leaves the stage for a moment and Stilson reintroduces him for a live-streamed podcast, entitled "The Rant is Due," during which Black reads the audience's submitted rants with his own commentary. One can see how this might be funny, weird or both.

      
      To put this in perspective, the first submission read, "To the slob sitting in front of me: you stink, and the cheap perfume you have on doesn't begin to cover it. Take a bath." (Ouch!) Another was a rant about Orlando traffic, on which Lewis amplified appropriately. So far, so good.

 
      Then Mr. Black read one from a Staff Sergeant who was, as am I, less than complimentary to the current C-in-C vis-à-vis the military. In truth, he was downright derisive but made his points well. Still OK... until an audience member stood up, interrupted Mr. Black in mid-sentence, announced he was a former military guy, and began to yell. And here it became a classic example of a "But Whaddabout?"

       For the uninitiated, a "But Whaddabout?" is a fatally flawed discourse device in which, when presented with something negative about (anything), an individual attempts to deflect by diverting attention to something irrelevant. It goes like this (as an extreme example): "Adolf Hitler sure was a wicked man." "Oh yeah? But whaddabout that Idi Amin?"

       We saw this all during the Trump impeachment proceedings, usually as Republicans responding to factual statements regarding Trump's extortion of Ukraine by whining, "But whaddabout Hunter Biden?" It's that old childhood ploy grown large: "But, they did it!"


       Lewis was, initially, very patient, repeatedly urging the man to "write it down," but the guy was unstoppable. He then resorted to the whaddabout, yelling that Obama had "reduced the authority of battlefield commanders" and that he'd had friends killed as a result; then he yelled something to the effect of "Has anyone else in the audience ever had that experience?"


        First, the allegation is blatantly false. Obama had differences of opinion on priorities with various overall commanders in Afghanistan but never modified the rules of engagement in ways that endangered troops. In fact, it was Obama who urged increased use of drones, vice manned aircraft, to reduce personnel risks. On the other hand, Trump, after proudly proclaiming a troop drawdown in Syria to allow his Turkish friends to slaughter Kurds (which they did with alacrity), has since sent thousands more troops to the region.

      As this guy raved on, undaunted by Mr. Black's repeated requests for him to "write it down," audience members, us among them, began to get up and leave, so I have no idea how the incident resolved, but there is no news of another mass shooting, so I will assume he was unarmed.


       This use of the whaddabout was ill-timed, as most of the audience, on the same page with the sergeant and Mr. Black, were embarrassed to be in the same room as the aforementioned jackass.

Coda:


       In looking at Trump's relationship with the military, one would be forced to find an off-the-record source, because it is against the individual's and the Pentagon's interest to opine for publication, given Trump's well-known penchant for vindictive response to any truth not fitting his narrative. (Just ask Lt. Col. Vindman!)

        That said, what follows is excerpted from an article in The Atlantic of November 2019. The author, Mark Bowden, who has written on military issues for over 20 years and has written 13 best sellers (including "Black Hawk Down") involving military issues, acknowledges this difficulty and therefore discussed the subject with service members known to him personally, on condition of anonymity.

       “In 20 years of writing about the military, I have never heard officers in high positions express such alarm about a president. Trump’s pronouncements and orders have already risked catastrophic and unnecessary wars in the Middle East and Asia and have created severe problems for field commanders engaged in combat operations. Frequently caught unawares by Trump’s statements, senior military officers have scrambled, in their aftermath, to steer the country away from tragedy. How many times can they successfully do that before faltering?”

- Mark Bowden, The Atlantic, November 2019


https://www.facebook.com/thelewisblack/videos/626643527906822/

The link above is to the actual "The Rant is Due" segment referenced above.