Monday, November 23, 2020

Μῆνιν as the first word of Western Civilization?

In the Fall of 1979, I entered UNC-Chapel Hill as a freshman. My first class was in Greek and was, appropriately enough, the study of the Iliad taught by the wonderful and learned professor H. Kenneth Snipes. There I was on the first day of college reading what was arguably the first line of the first work of all Western literature, written by the oldest and deadest of famous white men. Last night I pulled from my shelf that same Greek textbook of the Iliad, prompted by an upcoming study of it in English with my Great Books classmates from the Fall of 1980.

The first word of that first line is "Μῆνιν" and the line proceeds famously with "ἄειδε, θεὰ, Πηληϊάδεω Ἀχιλῆος." What a start! The key to the poem's meanin' may lie in its "μῆνιν." If so, what is its best translation?

Most translators have rendered "μῆνιν" as "anger" or "wrath." The rendering "wrath" was adopted by the greatest of all translators, Alexander Pope, in recognition that "μῆνιν" is used more frequently of gods and goddesses. "Μῆνιν" is clearly not a normal anger and, as the poet's story unfolds, we see how abnormal the anger is. As I meditated on "μῆνιν," two events from my own life came to mind.

When I first moved to my home, the neighborhood was pretty rundown and crime was noticeable. One night, I heard the sound of my ancient Suburban starting. I jumped out of bed and ran down the stairs and out of my house. There I saw three men with my "Hellen" (ironically for the discussion here, her name) and went, well, crazy. I don't know the right words to describe how I ran straight at the group of men. Two of them fled immediately to their own truck, but the third tried to drive off in MY vehicle. However, barefoot and in my boxer shorts, I gave pursuit, successfully grabbed the luggage rack and hung on while shouting at the driver what I was going to do to him. He swerved to lose me, went off the road, hit a stop sign and then swerved again. This time I lost hold of the luggage rack, but with a quick sprint of four or five strides, I reattached myself. As I shouted through the driver side window while hanging with one hand and pounding with the other, he looked at me with a mixture of astonishment and fear. We went another couple of blocks before he crashed Hellen into a bridge and jumped out the passenger side. I ran after him, but slipped because the soles of my feet were covered with blood. The value of my Suburban? Maybe $5,000. What was that?

My children were crying when the police dropped me back at the house. We often discussed my response. While it was clearly "crazy," it was a certain kind of crazy. My response had been connected with a deep sense of injustice, of losing something of "mine." I had personalized everything - it was "Hellen," and it was in my neighborhood and it was people I could see. In my response, I did not think about my safety, the value of the car or the challenges of the thieves. The intersection of indignation and anger and justice was rooted viscerally - much like I see in David Attenborough videos of animals protecting their domain. "Wrath" and "anger" don't describe the moment, nor does the English cognate "manic," but "μῆνιν" somehow fits my situation as well as the story of Achilles.

Later I had another similar experience. One night I recognized that my briefcase had been stolen while some workers had been at my house. I called the workers and got a description. The following day, after going to the bank to close my account, I came home to see a person of a similar description walking on the street. I went up to this powerfully built man and told him that he had stolen my briefcase. When he denied it, I opened my flip phone and told him to explain it to the police. With that he took off running and I took off after him. After about three blocks of running, he looked back and stopped. He told me, "you must be a warrior for Jesus." I told him that I just wanted my briefcase back. He said that he knew who stole it. We went back to my Suburban (some things never change!) and drove over to a housing project. We went into an apartment where about six men were sitting in the living room. I announced "I don't want any trouble, I just came here to get my briefcase" and, incredibly, one of the men got up, went to a bedroom and retrieved it.

Again, the elements are the same as in the prior experience, but the comment "a warrior for Jesus" highlights the profundity of the conviction. The clarity of this conviction is what characterizes "μῆνιν." This situation was not "anger," not "wrath," not "mania" but something deeply religious, something deep in my value system. It felt as though a society that permits this is no society at all, and I was a piece of that societal fabric. It's this sense of "μῆνιν" that I believe defines the Iliad and defines those two experiences of mine, but continues to defy translation beyond "a certain kind of crazy." What is a better way to define Western civilization?

Monday, November 16, 2020

Architecture as Expression

For high school, I attended The Hill School in Pottstown, Pennsylvania. As a result of reacquainting myself with some of my classmates, I pulled out some histories of the school. Prior to its current alumni-run non-profit status, The Hill was family-owned and family-run for three generations of the Meigs family. Each generation of family leadership was distinctive in outlook, personality and architectural expression.

Rev. Matthew Meigs was the founder and was not much of a fan of overdoing things. The Reverend founded the school in 1851 with 25 students, and when he turned leadership of the school over to his son in 1876, the school still had 25 students. The buildings he built were family residences that were simply expanded for school purposes.

In contrast, his son, Dr. John Meigs, called "Professor" by many, took a school with 25 students and within 25 years had multiplied enrollment nearly tenfold to 228 students, and by the time of his untimely death in 1911 at age 62, the school had an enrollment of 348 boys. This rapid growth created financial strains, and Professor responded by issuing stock to a small group of investors (family and faculty) as well as getting alumni support. The buildings he constructed were beautiful in the sense of charming and intimate and are epitomized by the chapel.

After his death, the family used an interim headmaster while Dr. John Meigs's son Dwight was readied for leadership, which he assumed in 1914. Dwight had been a scholar at The Hill, Yale and Oxford. Unlike the term "Professor" used for his father, Dwight was called the "King" to indicate his autocratic manner. While he did not dramatically expand the student enrollment, he did expand the faculty and the buildings - a combination that had dire financial results and forced him to sell the school to the alumni in 1921. Sadly, he committed suicide a few years later, leaving behind a beautifully written but heart-breaking letter. Yet the buildings he had constructed moved me more than any others.

So profound was that impact that when I left school, I wrote of my first day: "Out of the silent, soft grey air, I walked into Memorial Hall, a dark, dank hall with leaded glass windows, dark Gothic wood carvings and ancient tapestries barely visible. It was morning, and as I stood still on the wet flagstone floor, I could faintly make out the names of the men who had died, names which were painted in gold under the small Gothic arches delicately carved." Memorial Hall was initially priced at $80,000, but ultimately cost $400,000. For scale, adjusting for inflation and building costs, that is roughly $16 million and $80 million in today's dollars.

The King's biggest project cost him his position and, ultimately perhaps, his life. But in his buildings, I found an inspiration embedded: that life, its thoughts and actions, is not about charm and warmth, but is an eternal presence of beauty and austerity infused with a melancholic awareness of its evanescence.  

The buildings constructed since the King's death have been functional and useful, but not evocative or moving. The expression of private ownership has given way to public functionality. Yet the King's other great work, the dining room, not only carries on as the soul of the school, but is also featured on the cover of a book titled "Old School" by a well-known Hill School alum, Tobias Wolff. Even the dining room chairs purchased in 1914 have not been replaced. As they say, "the value is remembered long after the price is forgotten."

Friday, November 13, 2020

Meditation as Brain Damage?

I have accumulated enough hours meditating on a free app called Insight Timer to be considered an expert. But a sufficient number of those hours have been spent snoozing, so the label "expert" is unwarranted. I might use the term "avid fan" and have described the effect as "restoring my ability to be fully present by bringing balance to my capabilities." Part of this journey is to become meditatively silent, to embrace being "an ant on a log floating down the river." A recent article on brain damage intrigued me.

Apparently there is a syndrome called Auto-Activation Deficit (AAD). With AAD, a person is absolutely unresponsive to any interior motivation and, in fact, may be devoid of these motivations. The person is present with a "blank mental state." This nirvana-sounding state is the result of brain damage to the basal ganglia - a region in the base of the brain which deals with motor activity. The unusual part of AAD is that when stimulated by another person, the "damaged" individual is responsive and appropriate in behavior.

The parallels to the experience of achieving some meditative state are intriguing. In meditation, I experience an awareness of self that differentiates the clamoring of my ego state from an observing self. As I situate within the observing self, I am not motivated internally but am responsive to others - such as a phone call or a knock on the door. AAD seems to indicate that part of meditation's powerful effects occur within the basal ganglia, while I would have assumed much more of the impact was within the prefrontal cortex.

Monday, November 09, 2020

Help From Hillel the Elder

Recently a close friend of mine has been taking a course in which personal experiences and resulting value shifts are translated into the public sphere with words (or narratives) and actions. The course framework is based on a famous quote of Hillel the Elder (sometimes anachronistically known as "Rabbi Hillel") - "If I am not for myself, who will be for me? And being only for myself, what am 'I'? And if not now, when?"

The ingenious concept of this course is to process transformative moments of one's development, such as a traumatic childhood experience, into a fully engaged public life. Applying the course's framework to myself, I discovered that I had encapsulated a couple of childhood traumas in such a way as to highlight my own sense of agency. I presume this was to offset a sense of helplessness. However, this encapsulation was faulty and needed to be reworked.

My own traumas had created two powerful modes which I will call Captain Justice and Captain WinLove that consistently interrupted my natural process. While these two modes created tremendous skills, they inevitably brought the attendant miseries of fantastical thinking. To rework them, I focused on the causal experiences and re-experienced them as Not My Fight and I'm Good.

None of this is going to make me Abraham Lincoln or Martin Luther King Jr., but as Rabbi Zusya told his students on his deathbed, "in the coming world, they will not ask me, 'Why were you not Moses?' Instead, they will ask, 'Why were you not Zusya?'" If I can heal the wounds driving unnatural trajectories, then I have a better chance of being me.

Friday, October 23, 2020

"End Period Dominance?"

Years ago, I was discussing an investment result with a business partner. The investment had been lackluster, but suddenly shot up and the impact seemed disproportionate. His comment was that it was "end period dominance." It was a striking term and since he had just gotten an MBA, I assumed it was part of the MBA lingo.

Years later, I referenced "end period dominance" and he looked at me strangely. I retold how I had become familiar with the term. He chuckled and told me that he had just made up the term so that we could move on to another topic (I can be obsessive - as my Spinoza studies reveal).

This week new research was published in the Journal of Neuroscience that confirms "end period dominance." Studies show that two different parts of the brain are activated, and compete with each other, when we make decisions based on past experience. They can cause us to overvalue experiences that end well despite starting badly, and undervalue experiences that end badly despite starting well - even if both are equally valuable overall.

The two parts of the brain are the amygdala and the anterior insula. The amygdala works out the 'objective value' of an experience, much like a sum-of-the-parts valuation. Meanwhile, the anterior insula was shown to 'mark down' our valuation of an experience if it gets gradually worse over time, much as happens with trend-following investors.
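
To make the effect concrete, here is a toy sketch of my own (not the model from the study, just an illustration of the bias): two experiences with identical total value, one improving and one declining, valued once with equal weight on every period and once with weights tilted toward how the experience ends.

    # Toy illustration of "end period dominance" (my own sketch, not the study's model).
    # Two experiences have the same total value; one improves, one declines.

    def objective_value(returns):
        """Equal weight on every period - the 'whole experience' view."""
        return sum(returns) / len(returns)

    def recency_weighted_value(returns, decay=0.5):
        """Overweight later periods - a stand-in for the bias toward endings."""
        weights = [decay ** (len(returns) - 1 - i) for i in range(len(returns))]
        return sum(w * r for w, r in zip(weights, returns)) / sum(weights)

    improving = [-3, -1, 0, 2, 4]   # starts badly, ends well
    declining = [4, 2, 0, -1, -3]   # starts well, ends badly (same total)

    for name, seq in [("improving", improving), ("declining", declining)]:
        print(name,
              "objective:", round(objective_value(seq), 2),
              "recency-weighted:", round(recency_weighted_value(seq), 2))

Both sequences score 0.4 on the equal-weight view, but the recency-weighted score is strongly positive for the improving one and strongly negative for the declining one.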

Not only a likely culprit for an obsession with the endings of movies, this "end period dominance" pattern can also degrade the quality of decision-making and is likely an objective reason why political candidates are willing to spend enormous funds on advertising right before an election. It may be that a "happy ending" disguises an "unhappy experience."

Thursday, October 15, 2020

Self-Serving Bias Strikes Again!

Recently I read a blog post on Dr. James Fallon, a neuroscientist who discovered that he had the brain imaging pattern of a "full-blown psychopath." After his discovery, he checked in with family and friends to confirm interpersonal patterns that would indicate psychopathy. As a happily married man, father, grandfather, friend to many and successful scientist, he doubted that he had such traits. However, over the next few years, he did identify that he lacked certain empathetic traits, but concluded that these had been offset by a warm and nurturing childhood. As a result he coined a term for his condition - "pro-social psychopath."

This post caused me to reflect on my own self-assessments. Our "self-serving" biases give us a blind spot with regards to our own characteristics. Dr. Fallon took years after his discovery to become comfortable with the image in the brain imaging mirror. I don't think any of us are immune to the same issues.

When I was in college, I took my first Myers-Briggs personality test as part of a leadership program at the Center for Creative Leadership in Greensboro, NC. The test is certainly not scientific, because it is based on ideas of how you perceive yourself and your interactions with others. I tested as I have tested ever since - an Extroverted, iNtuitive (Big Picture), Thinking and Judging (Seeking Closure) personality. However, the personality described as The Commander never really resonated with me (although others have at times agreed with the description!).

Since quarantine, I have had increased time for introspection and decided to retake the test with my increased level of awareness as well as a willingness to remove my "self-serving" bias. When I retook the test, I did test differently. I shifted to an Introverted, iNtuitive, Feeling, and Judging personality. That personality is described as The Counselor, which is a much more fitting description of my offline interactions. So how did I get it wrong for nearly 40 years?

I believe that I have consistently defended myself against perceived weaknesses. I think that I believed people who were extroverted were more likely to be successful and popular. In addition, I think that I believed that people who were feelings-based were weak. As a result, I thought myself into these other roles which I could enact. But in my down time, I would naturally gravitate to my true patterns. Was this my personality "preference" or my "orientation"? I have no idea, but clarity on this issue does provide me a greater sense of internal harmony.

Friday, September 18, 2020

Thinking about Singularity: Nominalism Sets Up Statistics

Since the quarantine has lasted for a longer time than anticipated, I have successfully moved on from my first project of reviewing the basics of all of Latin grammar (see post Completion!) to another project of reviewing the fundamentals of statistics and probability. 

As suffering readers of this blog know, I have been digging into the depths of Spinoza's Ethics in order to explore and articulate Spinoza's promised provision of "continuous, supreme and unending joy." Perhaps the answer is simply doing anything other than reading the Ethics after attempting to read the Ethics.

As a result of studying the Ethics in Latin, I have been exploring the medieval (6th - 14th centuries) philosophical traditions in Latin which precede those of Spinoza. These readings have not only been excellent for encouraging sleepiness at bedtime, but have also shown the meandering development of language. However, last night's study had a wakeful impact.

In the late 11th century lived a theologian and philosopher named Roscellinus. He is considered the founder of nominalism. Nominalism is a school of thought that holds that universals, such as brown, chair or wood, are not real. Instead, concrete singular items are the only reality, and these universals such as brown, chair or wood are simply abstractions which function as mental tools, but are not realities. So radical were the implications that Roscellinus had to retract most of what he wrote. Political correctness has always been a central part of the academic world.

This nominalism debate is interesting to me because the same issue arises for Spinoza as he grapples with the role of language, belief and Aristotelian categories. Without wading into those areas in this post, I have found myself agreeing with the idea that the most detailed description of an item is the most real and thus, regardless of my self-serving bias, the most perfect.

As I come to statistics, I find that the same issues come up. Statistics is a mode of reasoning which attempts to observe, describe and summarize without a loss of "meaning." This is precisely the challenge that nominalism presented to medieval thinkers. Nominalism challenges Right and Wrong in the same way as statistics does. The central idea of nominalism is singularity, while the central idea of statistics is randomness. Both modes of reasoning reject a transcendence of Right and Wrong. If everything is a singularity, what modes of thought, language and belief are relevant? Statistics holds methods for reasoning within singularities. As nominalism set up statistics, the modern age could establish itself in the face of Right and Wrong.

Monday, September 14, 2020

Wisdom for The Times

In a recent conversation with a long-time mentor, he wisely observed - "a closed mouth gathers no foot."

Monday, August 31, 2020

Grammar in Context?

I came across an article discussing the appropriate uses of punctuation for texts. The author argued that texts are short bursts of ideas, not extended discussions. As a result, he commended the different punctuation strategies used by younger cohorts. For example, a period is unnecessary in a text because the text itself is the complete thought. If a period is used, he sees it used for emphasis as in "Oh. My. God." 

These alternate uses initially irritated me, but I have jumped on the bandwagon. I think that these innovations in punctuation, spelling and grammar empower the new modes of communication. At one time, I didn't understand why people even texted and now I use texts at many multiples of phone call usage. I've noticed that I've already applied these rules to other domains.

For example, I discovered long ago that my writing is dramatically different from my speaking. It was not always so. Right after college, I was engaged in a conversation and used a word most suitable for written works. My acquaintance asked me what the word meant. After I explained it, he asked me "well, why didn't you say that?"

On the other side of this, I don't recall ever using profanity in a written document. But conversations laced with profanity are regular fare. I recall that a former Baylor University president once told me that "profanity represents a lack of vocabulary." I only really agree with that somewhat pompous statement as it relates to written expression. Profanity does things verbally that punctuation does in written form. (That's my story and I'm sticking to it!)

Of course, all of this has a "self-serving bias." As someone who started out going to speech class for a speech impediment, I welcomed my move to the South with its conversational idiosyncrasies that aligned with my innate tendencies and have never looked back to an indistinct second person plural.

Sunday, August 23, 2020

Morals Versus Ethics

In a recent conversation with a friend, we discussed the role of morals. Afterwards, I found myself reflecting on the difference between morals and ethics - especially since I am working through Spinoza's Ethics. To distinguish two closely related words, I decided to look at their linguistic origins and then their contemporary usages.

Morals comes from the Latin word mos, which means "manner (of behaving)" or "custom." In contrast, ethics comes from the Greek word ethos, which means "character." In the Greek world, ethics was related to character as an area of virtue and its impact on human happiness. It was also a category of rhetoric, indicating what kind of character the speaker projected. It seems to me that both words are rooted in patterns of behavior, but morals have to do with internal drivers while ethics have to do with externals.

This linguistic alignment seems to work well with contemporary usages. When we speak of morals, we are often talking about an inner sense of right and wrong. Ethics, on the other hand, are often a strict code of conduct, as in an ethical standard. As a result, there can be conflicts between morals and ethics. For example, in my world, it may be considered morally wrong to invest in oil & gas or tobacco, but it is (at least so far) ethically proper. In contrast, it may be considered morally right to avoid investing in oil & gas or tobacco, but it may be ethically improper if it provides inferior investment results.

One of my favorite resolutions of the conflict between morals and ethics, in favor of ethics, was a non-drinking client's response to owning an alcohol stock: "well, if it's going to make money, let's buy it. The devil's had that money long enough." Displaying my preference here, I'm generally inclined to go with the ethical consideration, but I do believe that the moral consideration is important.

Warren Buffett has always avoided tobacco investments, even though it has cost him. The decision was a moral one - in my definition here. He stated that it was highly profitable and legal to own, but addictive and injurious. I find his stand honorable, but difficult to see clearly. There is a much brighter line between ethical and unethical than between moral and immoral. Linguistically this vagueness is confirmed. Ethical, like legal, is opposed only by unethical. Moral, on the other hand, is opposed by immoral, but moderated by the word amoral.

Monday, July 13, 2020

Grammar of Life

During my quarantine, I have undertaken a thorough review of basic Latin grammar. In my early days, I was in too much of a hurry to let these grammar lessons really sink in. (It's interesting that I was in such a hurry at 14, but am now in slow motion at 58.)

There are two primary structures that rely on the subjunctive mood in Latin. The subjunctive mood allows an exploration of a hypothetical situation (if I were...). The two primary structures are purpose clauses and results clauses. The distinction between the two has highlighted an underlying distinction in the dynamics of thought and action.

I can use either a purpose or result clause to answer the question, "Why are you headed to the refrigerator?" My answer of "So that I might get some food" is appropriate and uses a purpose clause. The other answer of "So that I can satisfy my hunger" is equally appropriate and uses a results clause. While both are appropriate, which is better?

In Spinoza's framework, a results clause is less likely to lead to cognitive illusions and a lack of power because it connects to true causal connections rather than artificial teleology. By emphasizing hunger, we focus on the real power or potentia (in Latin) that drives the process. In contrast, by emphasizing the contents of the refrigerator, we focus on that which we control or potestas (in Latin).

It appears that the ego prefers control and seeks the grammar of purpose clauses. It is a linguistic structure that shames the true self. Instead, the grammar of results clauses honors the true self, gains wisdom about causality, opens up more options and alleviates shame.

Friday, June 19, 2020

Status

In my review of Latin, I have started to notice the implications of Latin-based words. One of the first verbs I ever learned was "sto," meaning "to stand." It is a first conjugation verb and the perfect participle of the verb is "status," with a passive meaning, "to have been stood." This struck me as brilliant.

When I was choosing a college, I was fortunate to have a number of good alternatives and was completely undecided. To help me make a choice, I went to some of my high school teachers who had attended the potential choices. In retrospect, I am surprised by the factor that guided their recommendations: status. Each teacher recommended a school that had nothing to do with me or my interests, but was simply a maximization of "status."

Over time, I have come to understand that "status" works exactly as the Latin suggests. The idea of a college with status was that it would stand me up. The passivity conferred by status does not bring competence, joy or wisdom. It might increase income. But over time, it tends to strengthen the ego and weaken the individual.

Monday, June 15, 2020

"Coining" A Term?

All of the U.S. coins bear a Latin phrase "E Pluribus Unum." Even non-Latin readers know that it stands for "one from many" and that it refers to the founding of the country as Thirteen Colonies came together as one. But the phrase caught my attention recently.

Some of my quarantine time has been dedicated to a review of basic Latin grammar. I was studying comparative adjectives and adverbs and came across "plures," the Latin word for "more." The positive form of the comparative "more" is "many," translated by "multi." So, the accurate translation of "e pluribus unum" is "one from more." To render "one from many" would require "e multis unum." So why the error?

"One from more" seems strange as it would imply that the one or "unum" is added to by others. In fact, that phrasing seems appropriate in an earlier usage by Cicero where he discusses friendships and family. He hold that the love of others (plures) is added to the love of oneself (unus). This sense of "more" added to oneself seems lovely in its comparative lift.

So why was the comparative and not the positive used? One idea is that Pierre Eugene du Simitiere, the person who introduced the term, was an artist and not skilled in Latin. However, the fact that other founders were skilled in Latin and yet agreed to the motto does not seem to support this idea. Then I realized that "e pluribus unum" has one special quality that "e multis unum" does not have - it is formed by thirteen letters.
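
For the curious, a quick throwaway check of the counts (my own sketch, nothing more):

    # Count the letters in each motto, ignoring spaces.
    print(len("e pluribus unum".replace(" ", "")))  # 13
    print(len("e multis unum".replace(" ", "")))    # 11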

Tuesday, April 21, 2020

Virgil and Trauma Recovery

One of my favorite courses in high school was Virgil's Aeneid in Latin, taught by an excellent teacher named Robert Iorillo. Part of his theatrics was standing on his desk and cringing or throwing an eraser at you if your rendering of Virgil's beautiful lines was inadequate.

One of Dr. Iorillo's translations stuck with me. In Book 1, line 203, the Aeneid famously reads "forsan et haec olim meminisse iuvabit" which he rendered as "perhaps one day it will even please you to remember these things." It was an encouraging statement because, like Aeneas, "Doc I" was assuring us that one day we would look back with pleasure on our current sufferings. (I even considered making it my yearbook quote, but chose a much worse quote to honor Doc I with.)

Recently I have been using some of my quarantine time to revisit the fundamentals of my Latin with my Anderson and Groten textbook. In chapter five of a 70-chapter book, the text lists the verb parts and translation of "iuvo, iuvare, iuvi, iutum" as "help, aid." That set me to thinking that his translation, although encouraging, did not seem correct.

This line comes at a time when Aeneas's men had suffered extraordinary wartime trauma. His speech was meant as a help to his men, but the difference in translation is important. Is Virgil having Aeneas say that this memory will be like college friends who get together, laugh and recollect exploits that landed them in trouble, or is this memory something more important?

Three years ago, I studied a book by Bessel van der Kolk titled The Body Keeps the Score. In my characteristic enthusiasm, I recommended the book to all and even bought it for some. One of my friends termed it "the Granowski bible." The premise of the book is that recovery from childhood trauma is a challenge because trauma needs to be processed, but the mind, being merciful, typically suppresses the memory of the trauma. The long term result of the suppression is physical and mental illness. It seemed to me that Virgil might be pointing in that direction.

I began looking up other translations on Google and almost all were in Doc I's translation camp. But I then found a wonderful article on this topic by a high school Latin teacher, Dani Bostick, titled "Forsan Et Haec Olim Meminisse Iuvabit: Will Remembering Help or Please?" She comes down on the side of help and even narrows it down to trauma. I would love to audit her class!

I think that a rendering of iuvabit as "help" forces a deeper interpretation of "meminisse." "Iuvabit" is the future tense "will help" and is paired with the perfect infinitive "to have recalled or remembered," implying an action that has to be done at a prior time. With this verb, Latin generally uses a perfect tense form with a present tense sense. However, I think that it is better to render this as a perfect tense that implies past work completed. So a better translation might be "perhaps one day it will help to have processed even these things."

Why has everyone gotten this so wrong? I think it might be rooted in generations of glorifying war. For years, van der Kolk struggled to get anyone to acknowledge PTSD. There was a societal unwillingness to acknowledge the true horrors of war. I believe that Virgil profoundly understood the horrors of war and that a real purpose of the Aeneid was to critique the high price of empire, but carefully worded so as not to offend the emperor. I think my translation honors that critique of empire - relevant to our current-day struggle as we try to help vets returning from long campaigns in the Middle East.

Who says the classics are irrelevant? Maybe it's been the translations that have been irrelevant.

Tuesday, April 07, 2020

COVID - 19: End Game

In Stephen Covey's well-known The Seven Habits of Highly Effective People, he advocated "begin with the end in mind." Given the complexity of the current pandemic, I thought it might be helpful to think about the end game.

If the virus continues to spread, even at a low rate, it seems highly likely that everyone gets it. Only if its transmission completely goes away through isolation does the virus drop out of existence. Given the global connectedness, the past transmission rates and the current populations affected, it is difficult for me to see how this does not result in all of us getting the virus. But this full contagion has a variety of possible outcomes.
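
As a rough way to put numbers on that intuition, the textbook SIR "final size" relation (a standard epidemiology formula, not something from my own analysis) estimates what fraction of a population is eventually infected if spread continues unmitigated, for assumed values of the basic reproduction number R0:

    # A minimal sketch using the standard SIR final-size relation
    # z = 1 - exp(-R0 * z), where z is the fraction eventually infected.
    # The R0 values below are illustrative assumptions, not estimates.
    import math

    def final_attack_rate(r0, iterations=200):
        z = 0.5  # starting guess for the infected fraction
        for _ in range(iterations):
            z = 1 - math.exp(-r0 * z)
        return z

    for r0 in (1.5, 2.5, 3.5):
        print(f"R0 = {r0}: roughly {final_attack_rate(r0):.0%} eventually infected")

Even at the low end, unmitigated spread reaches well over half the population - which is the sense in which "everyone gets it" unless transmission is driven to zero.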

First, it is possible that a much greater number than we realize have already contracted this virus and are effectively immune to its impact. This group of people either have genetic characteristics or have attained antibodies that protect them from its often horrendous impact. If this number is sufficient, then it is possible that we may be able to chart a path for the rest of us that would allow full exposure. It seems this solution is months away at the earliest.

Second, it is possible that treatments may be developed that diminish the impact of the virus on the body. It appears that the combo of an anti-malarial and a z-pack could be effective as an immuno-suppressant, as the worst effects come from the body's own attack on the virus. This would seem to be the earliest solution that science can deliver, measured in months.

Third, it is possible that a vaccine is developed faster than the several years that the history of vaccine development would suggest, given the urgency at hand. In that way, we could simply generate global supplies of vaccines and dramatically lower the fatality rate.

In all cases, it appears that we need more time. To gain this required time, we have to practice three simple steps: 1) social distancing, 2) washing our hands and 3) not touching our faces. The economic impact would seem likely to fade as these social practices are implemented. The practice of non-activity cannot go on for prolonged periods, as people will develop "safe work" practices that do not impair the economy.

Clearly the parts of the economy that were first to go - travel and luxury goods - will be the last to heal. These areas are trading at low prices, especially when accompanied with debt.

Saturday, March 21, 2020

A Modest Proposal

I've been reflecting on our current situation. I think that the President has captured the essence of it: a wartime effort. However, we continue to muddle along with a lack of focus and clarity. I believe that by grasping the machinery of past war efforts, we could "defeat" this enemy.

First, during wartime, we repurpose citizenry and companies. We could divide the entire population into two components: active and reserve. The reserves would be people over 70, those with compromised health and pregnant women. The actives would be everyone else over 18. All, like military members, would receive monthly or weekly wages. The actives would be divided into two parts: those currently employed in critical areas and those who are available to be repurposed. For example, cable, food and health workers would stay put. Hotel and restaurant workers would be reallocated to other positions. Incomes would be set around prior tax records, excluding stock options as well as those who are paid according to the following section.

Next, our corporations would be used for the war effort. Software companies would be reallocated to development efforts as needed. Income for corporations would be handled the same way, with profits to shareholders limited to the prior year's taxable income. Those who haven't been paying taxes (we know who they are) would not receive any profits, but could maintain infrastructure, debt service and employment.

Our wartime provisions are set up for major disruptions to the economy. To fight a full war effort and expect the economy to trudge along with full funding from the Federal Reserve is to do nothing more than highlight the limits of central banking. I don't see the need to invent a new set of structures; let's just repurpose WWII processes. At the same time, we need to maintain preparedness. What if a terrorist attack occurred now?

Wednesday, March 18, 2020

Accepting the Pandemic

I have observed that "acceptance is the answer to ALL my problems." This is different from acquiescence. Acceptance is a core component of mental health, as "sanity is accepting reality at ANY cost." The current responses to the COVID-19 pandemic have revealed a remarkable lack of acceptance and, thus, of sanity.

The journey to acceptance is outlined by the five stages of grief, which can serve as a good indicator of where we seem to be. As a reminder, the five stages are: denial, anger, negotiating, depression and acceptance. Further, as another reminder, these stages are not strictly linear, but involve moving back and forth towards acceptance.

When the pictures showed up of China rapidly building massive hospitals, my journey of acceptance began. I don't trust their words, but I do trust their actions. Their systemic response started with denial. Quickly their government responded vigorously as they sensed the anger stage. No negotiation in their system. Their rapidly forced journey to acceptance has been remarkably effective.

In our system, the early journey seems similar to China's - at some point of infection, the larger population moves to anger. The result seems to be driving some long overdue bipartisanship. Politicians are clear that anger is simmering and the blame process is escalating. If we are fortunate, then perhaps rapid change can occur in our fragile economy. But we are also clearly in a negotiating stage, where monetary and fiscal policies are being tried. The stock market seems to be picking up on these in a back and forth pricing mechanism.

If policies are ineffective, which may occur given our supply chains and delayed response, then we would enter the next stage - depression - and would likely see massive government intervention in the provision of basic supplies. This would likely be accompanied by a complete stock market capitulation far below today's prices. (For a view of what this stage would look like, the oil patch provides a much better picture.)