Gongol.com Archives: November 2021
One of the most appealing characteristics of Benjamin Franklin is how such a lofty figure from American history had so many observations about life that, aside from the obvious idiosyncrasies of technology and social mores of the time, sound utterly contemporary to modern ears. Franklin had thoughts about lies spreading through the media, about preachers who strayed from religion, and about the scruples of business partners and rivals. ■ One of the most intriguing ideas Franklin recorded in his autobiography was entered under "Chapter 9: Plan for Attaining Moral Perfection". ■ This plan, which starts as an exercise in applying discipline to his own self-improvement, culminates in something that can only be described as a rough outline for a religious organization -- a church, really -- having only the loosest affiliation possible with what usually looks like religion. ■ Franklin's plan for "The Society of the Free and Easy" didn't get very far; he wrote that "I communicated it in part to two young men, who adopted it with some enthusiasm" before he put the idea on the back burner, never to be reanimated. But his purpose in outlining the plan -- "containing, as I thought, the essentials of every known religion, and being free of everything that might shock the professors of any religion" -- was, in a sense, to establish a civic religion in America. ■ Even in admitting its failure to gain traction, Franklin still endorsed it: "I am still of opinion that it was a practicable scheme, and might have been very useful, by forming a great number of good citizens". Considering that about a quarter of Americans are not affiliated with any religion, and that among those who are, many hear overt political exhortations from the pulpit (despite the ban on engaging in campaign activity under the tax code), it's an intriguing theoretical exercise to wonder: What if Franklinism had produced a real church? ■ Would it have survived the last three centuries? 
Would it have encountered schisms? Would it have produced fundamentalists? Or would it have produced Americans "free from the dominion of vice", as Franklin had hoped?
In the abstract, Americans are pretty attentive to celebrating harvest season. We celebrate Halloween and Thanksgiving with harvest themes, and more than a few distinctly harvest-themed events are put on as well. But in the particular, most people don't have much direct exposure to what harvest season actually means. ■ It's worth noting a peculiarity of the harvest season in places where row crops are collected for energy use -- particularly corn and soybeans in the Upper Midwest. In addition to their uses as food and feed, both of these crops are widely converted into combustible energy sources as ethanol and biodiesel. ■ What makes that conversion particularly interesting is how much it differs in its production cycle from that of other fuel sources. Oil wells pump oil 365 days a year. Coal is extracted from the ground year-round. Nuclear power knows no seasons. Even renewable sources of power like wind and solar may have seasonal fluctuations, but they operate twelve months out of the year. ■ That's where the row crops collected for fuel are distinctive. Soybeans and corn can really only be harvested in a brief window of time -- after maturity, but ideally before winter sets in. It results in a few weeks of intense work by people who, for the most part, have other things to do. (They have to: The median farm operation isn't a money-maker.) This makes the harvest a particularly interesting case of surge labor being put to work. ■ It's not uncommon to venture out into the rural parts of the Upper Midwest and see people harvesting long after the sun has gone down, the giant lights on their combines illuminating the way as a year's work is pulled from the fields before winter comes. Lots of these farmers are showing up after doing a full day's work at something else. Even those who farm full-time generally have a full slate of things to do, especially if they're raising livestock. 
■ The harvest represents an intense period of collecting an entire growing season's worth of solar energy and a rush to put it all into a storable format for use later. Elevators fill up and giant mounds are left on the flat ground outside. The scale of it all is so vast that it really can't be compared to anything else. While fields of corn and soybeans lack the magnitude of a cooling tower at a nuclear power plant or the sheer scale of an offshore oil rig, it's an effort on a scale visible from space. Punctuated occasionally by small towns and larger cities, the whole affair is epic as it stretches for hundreds of miles in every direction. ■ And while the work is facilitated by giant machines with labels like John Deere and Fendt, it still has to be conducted by human beings. And so, the rest of us owe them a tip of the cap as they work to complete an entire year's worth of energy collection in a window of time most of us might easily overlook.
America's higher educational system is reasonably good at channeling a lot of high-achieving young people into professions where quick thinking is highly valued. Lots of smart students find themselves pointed towards professions like medicine and the law, where professionals like emergency-room physicians and trial lawyers are required to think quickly on their feet. ■ One of the things our system is much poorer at doing is channeling high-performance thinkers into solving long problems -- challenges that aren't solvable on a short timeline, but that require sustained, long-range, expert attention. Find a good student who doesn't want to be a doctor or a lawyer (or some similar profession), and there's a non-trivial chance that their college professors and other advisors are directing them towards a research-type degree. Yet, as has been widely noted, the job market for tenure-track Ph.D.s has crumbled to the point where only a small handful of doctoral-level scientists end up in tenure-track academic roles. Many end up in the private sector -- which can be great -- but it isn't immediately obvious that the academic training is really matching the desired occupational outcomes efficiently. ■ Even where job security isn't a problem, the drive to take high-achieving individuals and place them into ever-tightening areas of focus often results in people who are profoundly expert in an extremely narrow field of inquiry but who are reluctant to say anything outside their immediate authority. This tends to reflect a general deference to other individuals who may be even more expert in that particular narrow field -- but as much as that can reflect professional courtesy, it can also become a hindrance to having well-informed conversations about important public issues. ■ This can be hazardous if it takes talent and attention away from long-term problems and issues that need multidisciplinary thinking. 
We have lots of people who are extremely good at laser-like focus on very narrow areas, but in many cases, our big problems need broad-based thinkers who can synthesize imperfect information and remain comfortable with the resulting uncertainty. ■ The field of education has responded to this gap with the development of the educational doctorate, a degree frequently awarded based upon "action research" -- a practice that rewards live experimentation on real-world applications rather than laboratory-style testing. Whether other fields will find themselves open to the same kind of practical doctorates remains to be seen, but one can imagine fields like economics or technology where engaged practitioners may be able to advance the state of the art without a Ph.D.-like focus on theory. ■ Evolution in the way that advanced degrees are granted wouldn't necessarily result in placing more attention on long problems. That requires a reward mechanism -- some way to pay for it. But as problems seem to emerge and grow at relatively faster paces than they did in the past, it's essential to find ways to think even farther ahead. Surprises will always emerge, but if we really do think that the pace of change has accelerated, then thinking about systematic ways of looking farther down the road isn't a luxury; it's a necessity.
One of the telltale signs of an unserious thinker is the demonstration that they can't tell the difference between "things I don't like" and "things that are objectively evil". And Charlie Munger's proposed dormitory for the University of California at Santa Barbara has brought lots of unseriousness to light: People are proudly putting their names to assessments like "nightmare", "grotesque", and "a jail". ■ The alarm -- whether serious or not -- is due to the lack of windows in the individual rooms. 94% of the rooms won't have them. Anyone is free to find fault with that design, but only as a matter of personal tastes and preferences. Calling it an "unsupportable" "psychological experiment" is simply too much. ■ People live without exterior windows in a number of environments -- submariners, for instance. And untold numbers of workplaces lack exterior windows, too, from factories to cubicle farms. Objecting to them as a matter of taste is fine, but the real crime isn't whether people are able to have exterior windows in their residences, but whether they are free to learn and to make choices on their own. The proposed dorm is expected to increase the on-campus housing supply by 50%. Imagine the choice: Live in a private, individual room attached to a shared social space but sacrifice an exterior window, or live in a car. ■ Or, even further, imagine the choice for an international student coming from an unfree country: One might have all the windows they could ever want in an apartment in Xinjiang, but it would be objectively better to have a windowless room at UCSB and the freedom to live away from an oppressive government. One columnist calls the artificial light "dystopian", but the fact is plain: It's perfectly humane to have the freedom to read John Locke or John Stuart Mill by the light of an artificial window. ■ The world isn't perfect, and sensible adults know that trade-offs must be made. 
It's hard to fathom the thought that the lack of exterior windows would send so many people into apoplexy -- when people willingly pay handsome sums to have interior staterooms aboard cruise ships, and when the point of a collegiate experience is to broaden the mind more than to accommodate a preference to bask in the sun. If a plan like Munger's is what it takes to get more willing minds into a good school, then it's objectively better than the alternative, to deny them the opportunity to grow.
The lyrics to R.E.M.'s "Radio Free Europe" don't make any real sense. The chorus, of course, returns to the phrase again and again, but otherwise the band's debut single never really had anything to do with the international broadcasting agency of the same name. It might have been a missed opportunity. ■ The purpose of Radio Free Europe (the broadcaster) "is to promote democratic values and institutions and advance human rights by reporting the news in countries where a free press is banned by the government or not fully established". And in several of the countries behind the Iron Curtain, Cold-War-era Radio Free Europe reached 30% to 60% of adults every week. ■ There is a nobility of purpose to the handful of international broadcasting agencies that have used the capacity of radio signals to cross borders in order to communicate with people whose governments wanted them to remain in the dark. ■ Humans have a powerful urge to know what's going on. If there's a loud noise, heads turn. If a crowd starts moving, others want to know why. If the lights go out or if storm clouds emerge on the horizon, people are compelled by our nature to look for new information. It is the lack of human interaction that makes solitary confinement so psychologically consequential, and even much milder forms of isolation from "what's happening" are painful to the human psyche. ■ Suppose someone were to apply the mission "to promote democratic values and institutions and advance human rights" not to people living under oppressive regimes, but rather among the public at large in the United States. It is undoubtedly a worthwhile goal: Americans perceive a dearth of faith in democratic institutions generally, and surveys of specific principles of democracy reveal that troublingly large minorities aren't firmly committed to the freedoms of civil society, opposition parties, speech, or the press. ■ In other words, it is sensible to ask what a Radio Free Europe for ourselves might sound like. 
It would be unlikely to sound like either most syndicated talk radio (in which openly anti-democratic voices are overrepresented, especially at the top) or the traditional model of public radio (which is gradually learning to broaden its appeal beyond older, left-leaning, and affluent white listeners, but still has work to do). ■ It would need to sound not only broad of mind and curiosity, but also relentlessly live. This is one of the appeals of cable news programming: To have the television tuned to CNN, Fox News, or MSNBC is to feel as though the window is open to the world all the time -- even though those networks tend to have the effect of isolating their viewers inside politics-saturated echo chambers. Life is far more vast than just politics, and it's certainly more than the two-dimensional perspective most often served up in "reporting from Capitol Hill". ■ The mission itself is simple, and more important than we likely give it credit for being. It's a common mistake to think that people are attracted to programming on television and radio because of the content, when in fact most of the appeal -- perhaps 80% of it -- is in the delivery. People are not attracted to Tucker Carlson or Lawrence O'Donnell or Sean Hannity or Rachel Maddow because they are saying anything new -- they're tuning in for the equivalent of mental comfort food. ■ If civic-minded funders and organizers really got behind it, a truly vital (as in, both lively and important) programming stream could be created to feed that intense public hunger to feel connected to what's happening right now, while promoting democratic values by talking about the many non-partisan aspects of life, from science and health to money and technology to entertainment and even just the weather. It takes thinking about what ought to be said first, then finding the right fit for how and by whom it could be said in an engaging way. 
(The BBC has found a way to make a popular show about math; Americans are capable of doing the same.) ■ That we lack such a conversation is a shame, and it's one we once had the initiative to address where it was missing abroad. The evidence is mounting that Americans need to think about looking inward with the same eye to promoting worthwhile values.
There is a nugget of wisdom that says "If you only study the last battle, you won't win the next war." That doesn't mean it isn't worthwhile to study the past -- indeed, a thorough knowledge of history is essential in almost every worthwhile field of human endeavor. But circumstances change, and consequently so must the ideas brought to bear on present and future problems. ■ Nobody wants to imagine a shooting war with China. Any kinetic exchange of ordnance has the potential to be unfathomably costly, perhaps on a scale we've never seen before. The United States is a wealthy and technologically sophisticated country, but China's government has increasingly devoted both funding and technological resources to its armament, too. ■ And the potential for crossed signals and other instigators of conflict is vast: Territorial ambiguities are many, tests of those differences are primed to occur at jet speed, and at least some portion of America's policy in the region depends upon "strategic ambiguity". ■ Against this backdrop, it is alarming (even if not especially surprising) to see that China's military is building targets modeled on US aircraft carriers and destroyers in an area where ballistic missiles have been tested. ■ Dwight Eisenhower advised in his first inaugural that "[W]e Americans know and we observe the difference between world leadership and imperialism; between firmness and truculence; between a thoughtfully calculated goal and spasmodic reaction to the stimulus of emergencies." That was nearly 70 years ago. The principle, of course, ought to remain valid today. Yet we shouldn't fall prey to the tunnel vision that would tell us we only have physical targets and kinetic weapons to worry about. ■ It has long been a fiction that China's regime has a long-term plan for the future. 
It certainly has objectives, but its actual behavior all too often reveals a lack of understanding of the difference between international cooperation and a sort of modern incarnation of mercantilism. It's one thing to have end goals; it's another to have principles that can be trusted to lead to the right destination. ■ We may well be several years into a dangerous game in which we have largely sleep-walked. There is no reason to wait any longer to wake up and think both clearly and broadly about the principles of friendly cooperation and strategic creativity that are urgently needed for our own security.
One of the most exceptional documentary series of the modern era was "The Commanding Heights", which came out in 2002. The short series laid out the essence of the 20th Century contest between free markets and Communism, and identified the long, consequential path to a largely globalized economy. ■ In particular, one of the most notable moments featured in the documentary is Margaret Thatcher's visit to Gdansk to meet with the leaders of Poland's Solidarity movement. In a moment she had most likely considered in advance, Thatcher implored the Solidarity leaders not only to know what they wanted, but how they expected to get there. ■ Thatcher's words make for unbeatable advice: "How do you see the process from where you are now to where you want to be? [...] It's not only what you want, but how -- the practical way you see it coming about." She trails off into a recommendation phrased as a hypothetical event: "[W]rite down the ten steps from where you are now to where you want to be." ■ What Thatcher understood deeply was the old maxim: "A dream without a plan is just a wish." And she knew that the difference between those mattered, not only for those with power, but for those around whom power hadn't coalesced yet. More than most other things masquerading as aspects of leadership, the essence of leadership is the plan -- whether ten steps or a few more or less -- and the ability to transmit that plan as a vision which others can share. ■ Thatcher may be out of vogue among those who spend their time railing against the purported "neoliberal consensus", but her essential advice really ought to prevail in some of our most vital debates today. ■ As the UN Climate Change Conference proceeds in Glasgow, there will be talk of achieving climate-change goals like "accelerat[ing] the phase-out of coal" and "accelerat[ing] action to tackle the climate crisis through collaboration between governments, businesses and civil society". 
Those may be fine goals, but the events of "COP26" will remain esoteric and inaccessible to the general public unless translated into those clear, practical "ten steps from where you are to where you want to be". ■ Determining those steps -- and making them sufficiently achievable that people can actually recognize the changes and measure the progress toward them -- is up to individual leaders in particular countries. It's up to national-level leaders to outline a clear set of achievable steps and then relentlessly beat the drum on the march to their achievement. ■ The same goes for problems like Covid-19. Part of the fatigue that has set in -- and some of the bad blood that has developed -- is because we in the United States really haven't coalesced around a well-articulated vision of what it means to achieve incremental victories along the way to making the disease retreat from "life-altering emergency" to "persistent nuisance". It's obvious that eradication isn't likely to occur (hence the talk of shifting from "pandemic" to "endemic"), but we need to have mileposts along the way. ■ Those mileposts -- like achieving a 60% adult vaccination rate or developing an antiviral drug to make the disease survivable under most conditions -- are crucial for telling people that progress is being made, and for rewarding their commitment to sacrifices along the way. Scientists may need to be cautious, but public leaders need to be able to set achievable goals around the science so that issues like "forever masking" don't become flashpoints that sap the initiative to keep moving ahead. ■ Michael Bloomberg once wrote that "Humans need to see results in time frames they can handle." He was talking mainly about business advice, but it turns out the advice applies to other aspects of life, too. We intrinsically need to know not only where we're going, but how we're going to get there. And, especially on a long path, we need to know that we're making progress along the way. 
Politicians don't have to agree with Margaret Thatcher's political philosophy to take a page from her playbook. Most certainly, they should.
Americans don't really fall in love with corporations, but we do adopt some of them -- less like pets, more like barn cats. We accept their utility and don't mind seeing them remain a little hungry, but otherwise fed well enough to keep doing their work. We don't want them close and cuddly, but a sort of friendly symbiosis is welcome. ■ General Electric is one such barn cat. From having its logo on everyday lightbulbs to its once-giant presence in the Manhattan skyline, GE has long been the default "industrial" name in America. ■ News that GE is breaking up seems like dirt on the grave of the conglomerate structure. The three resulting companies will independently focus on health care, energy, and aviation, which seems to be the kind of thing that financial managers reward today. ■ The conglomerate form has been in retreat for some time -- ITT has broken up twice: First in the 1990s, then again in 2011. Gulf + Western spun off and slimmed down in the 1980s. United Technologies is no longer united. ■ And yet, it's not the conglomerate format itself that is fundamentally flawed. If a corporation can be managed by people with a well-tuned skill for capital allocation, then it makes all kinds of sense to capture the profits of individual companies under a broader corporate umbrella and deploy them where they can earn high returns. Matters like the cost of capital and the prevailing tax policies of a country can make big differences in how much incentive exists to spur the formation of conglomerates, but let's not kid ourselves: Conglomerates exist, even if they don't appear as known companies in the stock market. ■ When a money manager claims to be able to set up an actively-managed mutual fund, a private-equity fund, or even a private "wealth management plan", they're promising that they can assemble groups of businesses under a conglomerate-like umbrella in order to maximize returns. 
It's not that Americans are opposed to conglomeration -- it's that one form is unpopular and another happens right under our noses without so much as a nod of recognition. If you don't think the $130 billion Fidelity Contrafund is a conglomerate, you're just not using your imagination. ■ A handful of skilled capital-allocators will always have a chance to shine in this world -- people like Royal Little, Bob and Larry Tisch, and Jay and Robert Pritzker. Warren Buffett may be the only household name among them living today, but he won't be the last. As Buffett's partner Charlie Munger said in 2014, "We think the conglomerate model works very well when you do it right." ■ GE may have decided to pull the plug on conglomeration, but don't think that the sun has set on that corporate form altogether. In a sense, some of today's high-tech corporations (like Alphabet) are the likeliest candidates to become the next generation of true conglomerates. But new ones will be formed out of old companies, too. And the better we realize that it isn't the fault of the form that GE and others have broken up, but rather the fault of the managers at the helm, the more likely it is that we'll adopt a few more useful barn cats along the way.
The under-appreciated science of psychology has learned quite a lot about meaningful work, particularly in the last decade. Even though there is vastly more work to do in this field, there is considerable evidence to suggest a widespread desire to feel like constructive, contributing members of human society. This knowledge is particularly illuminating -- and potentially alarming -- in light of the accelerating pace of retirements among the Baby Boomer generation. ■ Nobody is really surprised by the retirement wave; it's been obvious from even a cursory glance at a population pyramid. If there was a surprise involved, it was only that the Covid-19 pandemic came along and accelerated the pace of departure. ■ Surprise or not, as a country and as a culture, the United States needs to consider novel ways of helping people to cultivate that sense of having meaningful work to do. "Unretirement" is a real thing, too: Sometimes out of necessity, but also sometimes out of psychological need. A Pew Research Center review of the data found that, on average, Americans over age 60 spend half their waking hours alone. For many, it's even more than that, especially if they live without a spouse in the home. ■ Everyone needs at least some time alone, but too much time without others can lead to social isolation -- and we've witnessed the consequences of people having too much time to sit by themselves, with social media and mass media filling the connection gap with low-quality pseudo-interaction. ■ Age is often nothing more than an artificial barrier to doing productive things, either at work or in a volunteer environment. Norman Borlaug was awarded a Nobel Peace Prize in 1970 at 56 years old, and went on to work into his 90s. Betty White has acting credits in her late 90s. Benjamin Franklin was, at 81, the oldest delegate to the Constitutional Convention -- and no slouch. 
■ What we don't do very well is find roles for our sages -- both within our working environment and outside it (where much of life is lived). It's hard for firms and organizations to grow if people cannot see a viable path to senior leadership, which is why companies often have executives step aside in their 60s. But there really must be sensible ways to help people engage in meaningful productive activity even when it's time for the next generation to take the reins. (Prince Charles, about to turn 73, would likely agree.) ■ Senator Ben Sasse has spoken eloquently of the need to address the matching of people to meaningful work, which is a challenge already in an economy that is growing more skill-dependent all the time. We need to be conscious, too, of the generational dimension at play: Not only will people need to find ways to adapt and grow while they are in the conventional workforce, many will seek meaningful work (and work-like) things to do after reaching conventional retirement age. It is a double-sided coin, too: In order to remain a valuable contributor, most people will have to continue learning new skills along the way. Outside of hereditary monarchs, most occupations inevitably evolve with time. (Good news for Charles, but bad for everyone else.) ■ There has to be something better than just tacking the word "emeritus" to a person's last career and having them fade out. This is not something easily resolved by a big bill passed through Congress; it's likely to be more responsive to a bottom-up approach. But finding a place for our sages that doesn't look like a permanent Spring Break trip could well be one of the most useful things American business thinkers could do, not just for the economy, but for the well-being of society.
Even decent institutions sometimes cave to the pressure to generate buzz by putting clickbait on the Internet, and there's no surer way to get engagement than by promising that people can discover something about their own identities by taking a quiz. The Pew Research Center -- an honorable outfit -- has offered just such a purported window for self-discovery with their political typology quiz. ■ The quiz itself is neither particularly good nor bad; it is, as all such quizzes are, fairly reductionist. The first question asks nothing more than whether the quiz-taker would rather have a "smaller" or "bigger" government. Vast enlightenment does not follow. In its reporting on the use of this instrument to survey a sample of American adults, Pew says it can identify nine "typologies" of American voters, arranged on a conventional left-right spectrum. ■ The problem with bunching people as "Faith and Flag Conservatives" and "Democratic Mainstays" and "Stressed Sideliners" isn't that people are immune to this kind of bunching; it's obvious that American political parties are as coalitional as their European counterparts, with the difference being merely that Americans form coalitions before our general elections rather than after. ■ No, the real problem is that the typologies aren't very illustrative. A far more interesting taxonomy of American politics would survey who among us are "Wilsonian Activists", "Jacksonian Populists", or "Madisonian Federalists". ■ A joke? Not at all. American politics really don't change as much as we think they do. The individual issues may vary with the times, but people tend to align with certain consistent themes: Whether they want an activist government that tells them all of the ills from which it will offer to free them, or a limited government that leaves them free to make their own decisions. 
Whether they want to cast their lots with the will of a majority, or to stand up for pluralism as a good in and of itself that sometimes trumps a popular vote. Whether they want an America that looks after itself and its own regardless of the world around it, or one that engages with the international community in the interest of buttressing a favorable order in the long run even when it comes at a short-term cost. ■ Politicians and parties change positions, sometimes on a dime. But finding the deeper instincts, predispositions, and beliefs that animate how a person relates to the very idea of politics would actually tell a lot more than whatever this year's passing flavor of "typology" happens to be. ■ Just for example, a pro-trade, limited-government internationalist would have been on the right up until a hot minute ago, but those views hardly square in a coalition with those the Pew survey calls the "Populist Right" and "Faith and Flag Conservatives". "Right" and "left" are not only relative terms, they're so malleable that we can hardly agree upon what "conservative" and "liberal" even mean. ■ It may be a little too easy to slip into seeing everything that happens in America today through the lenses of Alexis de Tocqueville and the Federalist Papers, but they probably have more resonance in describing the "how" and "why" of what many American voters actually think -- even if the voters themselves are not conscious of the influences -- than anything that seeks to reduce ideas down to a simplified left-right spectrum. ■ Humans, it turns out, are animated by human nature. And human nature changes far less than the winds of technological progress and current events. It wouldn't hurt us to be more conscious of anchoring our understanding of identities in those things that remain steady, if not permanent, about who we are. Not that useful as clickbait, but far more predictive of behavior in the long run.
If vaccines merely reduced the amount of total human suffering in the world, that would be enough to commend them to widespread use. But they don't do just that: For communicable diseases, they can be exceptionally cost-effective measures. If a Covid-19 vaccine costing less than $50 a dose prevents just one $42,200 hospitalization per hundred doses administered, that's still a huge net return of more than 8:1. ■ But vaccines aren't the only preventative medicines. While we're paying acute attention to one particular public-health event, it's worthwhile to examine whether we ought to be looking to other circumstances where small investments in prevention could have highly leveraged returns. The fluoridation of public water, for instance, pays off 20:1 in dental-treatment savings. ■ We ought to give serious consideration to the possibility that society could end up with an attractive return on investment if we were to institute a prevention-first approach to mental wellness. ■ The consequences of the pandemic have been far more than just physical. The social and emotional consequences have been such that the American Academy of Pediatrics and other professional organizations published a statement to "declare a National State of Emergency in Children's Mental Health". ■ Because we are social beings, we have a greater interest in one another's mental well-being than in almost any other aspect of health. Another person's broken ankle, root canal, or kidney disease represents a misfortune, but it rarely has broader consequences for other people. The health of our brains, by contrast, can affect others quite a lot. ■ As Americans, we have an utterly terrible track record of approaching mental health with the seriousness and objectivity it deserves. The terrible and nearly universal consequences of the pandemic are such that, at long last, perhaps we are gaining some ground on realizing that mental wellness isn't an on/off switch, in which everything is fine unless diagnosed otherwise.
Everyone lives on multiple continua of mental wellness, from conditions that emerge from the chemical structure of the brain to those shaped by environmental factors, and from the permanent to the temporary. And all of them are worthy of respectful attention rather than stigma. ■ The brain is a powerful organ, responsible for 20% to 25% of a normal adult's metabolism and 100% of their consciousness. The proportional physiological demand is even greater in children. And since the overall state of the brain cannot be as easily measured as blood pressure, we ought to be prepared to invest accordingly in seeking out thoughtful ways to ask that most basic of human questions: How are you doing? ■ A stethoscope can't really answer that question, so it's well worth considering what kinds of specialists we ought to be cultivating -- and perhaps even funding -- so that we don't accept as "normal" a serious shortage of necessary professional care. If it's sensible for people to go in for a physical checkup at the doctor's office once a year or to visit the dentist for semiannual cleanings, then wouldn't it also be sensible for everyone to have a therapeutic check of their mental wellness at least once a year, too? Just as we consider the "physical" a routine part of life, so we ought to make a "mental" just as commonplace -- long before we reach a state of "national emergency".
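As a footnote to the vaccine arithmetic that opened this entry, the back-of-envelope return is easy to verify. (All inputs below are the essay's illustrative assumptions, not actuarial data.)

```python
# Rough check of the essay's vaccine return-on-investment figure.
dose_cost = 50                 # dollars per dose (illustrative assumption)
hospitalization_cost = 42_200  # dollars per hospitalization averted
doses_per_averted_case = 100   # one hospitalization prevented per 100 doses

spend = dose_cost * doses_per_averted_case  # $5,000 spent to avert one case
roi = hospitalization_cost / spend
print(f"Return: {roi:.2f}:1")  # ~8.44:1 -- "more than 8:1", as stated
```

The same framing applies to the 20:1 fluoridation figure: any prevention whose averted costs dwarf its price clears the bar comfortably.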
It seems never to fail that whenever the word "inflation" enters the contemporary conversation, the challenge of the Federal Reserve's "dual mandate" comes to the forefront. This dual mandate is an assignment from Congress to simultaneously seek maximum employment and price stability. ■ Critics find some easy historical references to support their case, like Matthew 6:24 ("No one can serve two masters") and the laundry list of two-front wars that exhausted the states fighting them. ■ Yet the world is full of dual mandates (and even multifactor mandates), and nothing about them is necessarily impossible to execute. They do, however, demand some modesty about how we approach them and how much we can expect of the outcomes. ■ Purchasing a vehicle, for instance, isn't something that comes down to just one dimension: The choice is usually a balance of capability (carrying the right number of passengers and the right type of cargo), fuel efficiency, resale value, and comfort, among other values individuals are free to weigh. If a single purchasing decision requires such a dynamic evaluation, surely so does the economy -- the aggregate of all discrete purchasing decisions. ■ Milton Friedman famously hypothesized that the money supply would ideally be set impartially and dispassionately by a computer responding to obtainable economic data, delivering the stable, predictable rate of increase he envisioned. Yet even a perfectly divined program for calculating the size of that money supply would require constant human intervention: Someone would have had to program it to analyze inputs and outputs in the first place, and then recurring interventions would be necessary to update it to reflect new judgments about how to value those inputs and outputs. The price of leaded gas no longer matters, but the price of a smartphone does. ■ In other words, a dispassionate computer may seem ideal, but it is impossible to remove the humans from the system.
We create the data, measure it, and decide how important it is -- even artificial intelligence can't be trusted to do that for us. And what the Federal Reserve is being asked to do right now is a gargantuan task. The money supply grew in extraordinary ways to get past the pandemic shock of 2020, and Americans have been behaving strangely with our money ever since 2008, moving our dollars slower than at any time in modern history. ■ Any change to that velocity of money is going to have a hugely magnified impact. As with most money matters, leverage counts. And in this case, the leverage of a big supply of money moving anywhere close to historical rates (rather than at the extremely abnormal snail's pace of the last 13 years) could make it hard even for a highly-skilled Federal Reserve to pull away the punch bowl just as the party gets warmed up. ■ That's neither an enviable position to be in, nor a condition anyone can be expected to navigate deftly. And it all takes place against that second part of the dual mandate -- maximum employment -- which is tough to measure against a long-term decline in the labor force participation rate (in no small part due to generational turnover) overlapping with a one-time shock adjustment downward, from which it does not appear that a full recovery is going to take place. Some people left the workforce in 2020 and just may not ever come back. ■ Someone will soon be nominated to chair the Federal Reserve for the next term, and it could be the incumbent, Jerome Powell. Whoever it is will likely have to navigate one of the most complex (and highest-stakes) economic experiments of all time, balancing the interests of a big population living on investments and fixed incomes (and thus historically hypersensitive to inflation) with a large wave of early-career workers with high expectations for career opportunities and economic expansion. 
■ If the Federal Reserve had only a dual mandate to navigate, it would be a challenge -- but to respond to both in the midst of competing intergenerational interests and unprecedented changes in fundamental data will take a great deal of wisdom and more than a little bit of luck. The more closely the Fed can emulate the steadiness of Friedman's computer while making necessary adjustments to the programming along the way, the better for all of us.
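Friedman's computer-set money supply, mentioned earlier in this entry, is usually formalized as his "k-percent rule": grow the money stock at a fixed annual rate regardless of conditions. A minimal sketch (the starting figure is hypothetical, purely for illustration):

```python
# Minimal sketch of Friedman's k-percent rule: steady, rule-based growth
# of the money supply, with no discretionary intervention along the way.
def money_supply(initial: float, k_percent: float, years: int) -> float:
    """Money stock after `years` of compounding at k% per year."""
    return initial * (1 + k_percent / 100) ** years

m0 = 20.0  # hypothetical starting money stock, in trillions of dollars
for year in (1, 5, 10):
    print(f"Year {year:2d}: ${money_supply(m0, 4, year):.1f}T")
```

As the essay argues, even a rule this simple still needs humans: someone must choose the starting measure, the rate k, and what counts as "money" in the first place.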
If it is indeed true that a changing climate is likely to cause a greater frequency of extreme or severe weather events, then we ought not only to take steps to try to reduce the causes but also to mitigate the effects. (Besides, unless China's output of carbon dioxide is set to be radically curtailed, even if the United States achieves net-zero emissions, the situation will still get worse.) ■ One (relatively) small investment that could help is to increase the number of National Weather Service radar installations, particularly in the Midwest and in other locations prone to the most extreme events, like tornadoes. It's perhaps surprising but true that large portions of Tornado Alley (and adjacent regions) remain far outside the effective low-elevation reach of modern Doppler radar installations. ■ The problem isn't one that can be overcome with technology, because its cause is the curvature of the planet. Earth is round, which means that any straight line projected laterally from anywhere close to the surface in one location will ultimately end up elevated well above points far away. This is a particular problem when it comes to severe weather, because we care less about what's happening way up high than we care about what's happening close to the ground. ■ And while it's possible to infer some things about what's happening close to the ground from the signals bouncing back from high up in the clouds, it's better to get the data than to "read between the lines". There's good reason why some of the most sophisticated tornado researchers have portable radar units they can deploy right up close to the action. But while this research is fascinating and has great potential, it's not practical to hope that a portable Doppler will be in the right place at the right time to capture an emerging storm. They're tools for research, not for 24/7 surveillance. 
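The curvature problem described above has a standard back-of-envelope formula: under the common 4/3-Earth-radius refraction model, the center of a radar beam at tilt angle θ sits roughly r·sin(θ) + r²/(2·(4/3)R) above the antenna at range r. A sketch (a simplification; real beam heights vary with atmospheric conditions):

```python
import math

# Approximate height of a radar beam's center above the antenna, using the
# conventional 4/3-Earth-radius model to account for atmospheric refraction.
EARTH_RADIUS_M = 6_371_000
EFFECTIVE_RADIUS_M = (4 / 3) * EARTH_RADIUS_M

def beam_height_m(range_m: float, tilt_deg: float = 0.5) -> float:
    """Beam-center height at a given range, at a typical lowest tilt angle."""
    return (range_m * math.sin(math.radians(tilt_deg))
            + range_m ** 2 / (2 * EFFECTIVE_RADIUS_M))

range_m = 100 * 1_609.34                     # 100 miles, in meters
height_ft = beam_height_m(range_m) / 0.3048  # convert meters to feet
print(f"Lowest beam at 100 miles: ~{height_ft:,.0f} ft above the radar")
```

At roughly 100 miles, the lowest beam is already on the order of 10,000 feet up -- which is why getting coverage down to low elevations requires filling gaps with additional installations rather than better antennas.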
■ Much is said about the consequences of climate change on communities that are, for one reason or another, regarded as disadvantaged. At present, your chances of having good coverage from a radar system are pretty good if you live in a big population center, but you're more likely to be left with lesser coverage if you're in a more remote location. Improving the radar coverage for those outlying areas would be one way to offer a form of equity that may be growing in importance if we truly are on the verge of seeing a more extreme weather environment. ■ It wouldn't seem to cost all that much in the grand scheme of things -- certainly not against the backdrop of a trillion-dollar infrastructure plan. In Iowa, the gaps in near-surface radar coverage could probably be filled just by adding installations at Storm Lake, Mason City, and Vinton (midway between Cedar Rapids and Waterloo) -- places each about 100 miles away from the nearest installations, and located where they could serve populations suitably large to make the coverage helpful. Similar sites could be named for other states with high severe-weather potential. ■ Plenty of other technological advances are either here or on the way that will help to make forecasting and real-time weather surveillance better. But filling some of the gaps in our coverage -- and getting all of the most tornado-prone parts of the country covered down to the 3,000-foot elevation -- seems like the kind of investment to begin taking seriously now, before the worst of the long-range forecasts comes to pass.
The atmosphere of the Internet Age means that it's almost impossible to go through a week (or even a day) without leaving behind some kind of digital footprint. Text messages, Facebook comments, Tweets, Instagram captions, emails, Reddit comments, blog posts, and seemingly countless other vectors make it possible to put a comment on the record, even unintentionally. Even those who aren't trying to leave behind a digital record can easily and inadvertently do so by making an announcement in a church bulletin, responding to a public official in a format subject to FOIA, or speaking up at a PTA meeting where minutes are taken. What happens in the analog world still ends up being digitized at a high rate. ■ It wouldn't matter quite so much if it weren't for the compounding factors of persistence and searchability. That offhand blog post from Y2K could easily have been archived by the Wayback Machine or cached by Google, and if you don't know how to ask, it may never be removed. Searching through the past may have been daunting before, but decades-old newspapers are available from any smartphone or laptop, and can be searched en masse in an instant. No longer is the past lost to old file drawers full of microfiche. ■ So not only is it easier to unintentionally (or unthinkingly) create new content in this age than at any time in the past, that content can be replicated and stored forever and searched with increasing ease. Artificial intelligence will soon swallow the world's massive archives of audio and video recordings, translate them into text, and make them infinitely searchable, as well. ■ As a result, we need to become, by default, more forgiving of the things that people say and write. One ill-considered or malformed comment on Twitter could easily be seen by more people (and, even if deleted, live indefinitely as a screenshot) than a letter to the editor in the New York Times a mere generation ago. 
It's so easy to stumble into 15 minutes of fame that there is a stock response to the experience of going viral: "Check out my SoundCloud", itself a reference to even further grasping at celebrity. But the strange alchemy of this chronic drive to "create content" and the strange way that content sticks around practically forever is leading to a whole cottage industry in the weaponization of words. The internecine warfare of the College Democrats of America, fueled by text messages and tweets from its combatants' childhood days, is merely a taste of the emerging status quo. ■ Put simply: If you expect anyone to grow up now or in the future without having left anything regrettable on the record in their past, you're bound to be disappointed. Either they will have been carefully groomed by their parents from birth (think modern-day versions of Joseph Kennedy's clan), or they will have been so milquetoast and unremarkable as youths that no one will have paid them any attention (and thus would have been unlikely to find their way into interesting roles). If we intend not to become a nation led by weirdos, we're going to have to make a cultural habit of turning a blind eye to more of the indiscretions of youth. It's unlikely that the rising generations will actually do more stupid things than their predecessors -- they'll just be documented digitally. ■ None of this means we should give a free pass to anything a person says under the cover of youth, but it does mean we ought to recognize that the entire point of isolating juveniles from some of the consequences of the adult world is to help them learn to navigate choices and ideas. Not only are youthful brains not fully developed, but they simply haven't had the time to cultivate wisdom. An idea may seem appealing merely because it's the first one to which the young person has gotten exposure -- whether it's from Karl Marx or Ayn Rand. It takes time to seek out, digest, and draw conclusions from competing points of view. 
If we didn't intrinsically believe that, then we'd repeal the portions of the Constitution that set increasing age thresholds for serving in the House, Senate, and Presidency. If we want well-developed leadership in the future (and even right now), then easing our expectations of the words of others would be a decent start.
It's easier to joke about uncomfortable topics than to address them directly. For most people, watching "Weekend at Bernie's" or "Six Feet Under" is preferable to having a frank conversation about death. Using scatological words to express frustration or as a punchline is far more common than speaking directly about the value of sanitation. As a result, it's hard for a message like World Toilet Day to break through. The natural defense mechanism against discomfort gets right in the way of the gravity of the matter. ■ Yet still: Out of a global population of 7.8 billion, a full 3.6 billion people live without safe sanitation facilities. ■ Sometimes it helps to reframe a problem to give it proper perspective. Instead of a chronic problem for some of the world and almost never a problem for the rest, what if the entire world had sanitation facilities, but they failed to work 46% of the time? Any air traveler knows that you wouldn't want to take even a 10% chance of being stuck next to a malfunctioning lavatory, much less a 46% chance. The very notion would be intolerable. ■ One of the main obstacles to taking the problem seriously, at least in the United States, is that we usually punt the issue of water quality to authorities whose mission falls under the banner of "the environment". Water quality is supervised at the Federal level by the EPA -- the Environmental Protection Agency. And in many states, the regulatory authority for those services is named a Department of Natural Resources (DNR) or a Department of Environmental Quality (DEQ). But the plain fact is that Americans don't really care about the environment. We pay it considerable lip service, but we don't treat it as a priority because it falls into the tragedy of the commons. ■ What we do care about is our health. And the simple proof is to look at virtually any household budget.
The amount spent on health (whether for health insurance, doctor's visits, gym memberships, fitness equipment, diet foods, nutritional supplements, or any other related goods and services) is virtually certain to outstrip the amount spent on the environment. This gap is why the safe supply and disposal of water must be addressed as a matter of public health, not the environment, if it is ever to be taken seriously. ■ Rare American encounters with malfunctioning sanitation systems often become the fodder for jokes in late-night monologues and radio morning shows, like the 2013 Carnival Triumph debacle. But well-functioning sanitary systems protect people from dreadful diseases like cholera and dysentery, not to mention making it possible to wash our hands and clean our dishes. The World Health Organization estimates that the death toll is over 800,000 a year from the diseases left behind when waste isn't safely conveyed away. ■ Progress is being made on a global basis, but not very quickly. Surely not as quickly as one might expect for the cause of more than 800,000 deaths per year. The enormous, double-decker Airbus A380 carries as many as 575 passengers. If one of those jumbo jets were crashing every six hours and killing everyone aboard, the world would take action immediately. Yet that same toll is being taken by the world gap in sanitation, and the best we can seem to muster is to label it "World Toilet Day". ■ Safe, reliable sanitation is clearly a poverty-linked problem, but taking it seriously demands that the wealthy countries of the world see it as it is: A matter of health, not of the environment -- and a matter of real consequence, rather than one to avoid with the help of punchlines. Words matter. Awareness of the problem is clearly necessary -- but a health-centered, pro-sanitation mindset is the only path ahead, and that change must begin at home.
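The figures in this entry are simple arithmetic, and they check out: 3.6 billion of 7.8 billion people is about 46%, and a full A380 crashing every six hours matches the scale of the annual sanitation death toll.

```python
# Checking this entry's two quantitative claims.
unserved = 3.6e9 / 7.8e9    # share of humanity without safe sanitation
print(f"{unserved:.0%} of the world lacks safe sanitation")  # 46%

passengers_per_a380 = 575   # maximum listed A380 capacity
crashes_per_day = 24 // 6   # one crash every six hours
annual_deaths = passengers_per_a380 * crashes_per_day * 365
print(f"{annual_deaths:,} deaths per year")  # 839,500 -- above the WHO's 800,000+
```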
When people use the phrase "Orwellian", they often intend for it to suggest a surveillance state -- one in which "Big Brother is watching you". While there are plenty of reasons to be concerned about the creeping surveillance state -- especially as it is imposed both at home and abroad by authoritarian regimes like the one with power in China -- that isn't the only way in which circumstances can be Orwellian. ■ The surveillance state is indeed an instrument of terror, but so is the other aspect of the Orwellian condition: One in which the language itself is bent so as to destroy meaning and vex understanding. It is in this latter sense -- perhaps even more than the former one -- that we should be alarmed by the "Metaverse". ■ While the corporate name change at Facebook to "Meta" is largely interpreted as a matter of smoke and mirrors, the name itself is Orwellian in the latter sense. Mark Zuckerberg wrote in his announcement of the name change that "I used to study Classics, and the word 'meta' comes from the Greek word meaning 'beyond'." ■ The notion is that this "metaverse" (in which Facebook/Meta wants to be a major stakeholder) will be a sort of one-stop shop "to do almost anything you can imagine", a virtual enhancement -- or perhaps a surrogate -- for the world we currently recognize as reality. ■ But in the sense it is being pitched, the "metaverse" is less the removal of arbitrary limitations and more of a constraint around an experience of reality. It's really a "microverse" -- like a snow globe, or a ship in a bottle. ■ The distinction is important. Like a ship in a bottle, an online environment can be crafted with precision. It can be made nearly perfect. It can even represent an entirely mythical or fantastical place. But it cannot be the reality a person inhabits completely. It is not "beyond" -- it can only be "in addition to", and only then, within constraints. ■ Some connections facilitated by the Internet have a wonderful place in our world. 
The ability to share interests, to overcome artificial barriers (whether of geography, physical abilities, or brain differences), or to maintain contact with acquaintances is quite good. ■ But no "metaverse" can take the place of meaningful human connections that are untethered to whatever some computer programmers in Menlo Park are willing to hack. It is generally good to have more people who will celebrate your triumphs and commiserate in your sorrows. But there is no number of acquaintances that can fully substitute for having people who would be willing to donate a kidney if you needed one. No metaverse is required to sustain those relationships -- nor can it create them at scale (the aid and influence of dating apps notwithstanding). ■ So, while we should be cautious about the surveillance risks of our online interactions, we ought as well to be skeptical of the other kind of Orwellianism, too: The one that says technology can substitute for reality by going beyond it. Like a ship in a bottle, a digital microverse may be a thing of beauty -- but it is inevitably smaller than the real thing.
No matter how left-wing a person's politics might be, one of the most illiberal things they can do is refuse others the right to be wrong. This distinction is important, especially because Americans far too often substitute the word "liberal" when we actually mean "left". Real liberalism -- that is, open-mindedness and willingness to tolerate the opinions of others -- is not a mindset that fits tidily into a left-right spectrum. There are illiberal right-wingers and illiberal left-wingers. And, unfortunately for our times, there are many of both. ■ Lots of people change their minds over the course of time, and it's important to give them room to have been wrong in the past so that they can correct themselves going into the future. Regrettably, there is an entire genre of online commentary devoted to haranguing people over past choices and taking the time to insist that they are too tired to argue. ■ The failure to give others the room to be wrong -- and, more importantly, to correct themselves and acknowledge that their minds have changed -- is a matter of mental fixity. If we aren't capable of changing our minds, then what is the point of individual liberty? People have to be ready, willing, and able to persuade one another and be persuaded. The notion that all it should take to convince our fellow Americans is a 280-character tweet or one Facebook meme shared with the "praise hands" emoji is utterly insufficient to the notion of self-government. ■ The essence of the American experiment is just that: It is an experiment. Because we have no certainty about the way things will turn out, we have to be able to make changes over time. Those revisions, changes, and improvements are exactly what have marked most of our most important national milestones. America was born with errors -- including grave ones, like slavery.
Yet the country was also born with the capacity to self-correct: The inclusion of the amendment process in the Constitution was a statement of humility by its authors. They knew some things would turn out to be wrong, so they included a process for institutionally changing our minds. And while the power of the pen alone wasn't sufficient to overcome some of those original obstacles, it was most certainly necessary. ■ America owes important steps in its growth and improvement to epic acts of persuasion. The Federalist Papers, the Lincoln-Douglas debates, the "Cross of Gold"; speeches made by the likes of Frederick Douglass, Sojourner Truth, and Martin Luther King, Jr.; Presidential addresses, from Washington's legendary farewell address to FDR's fireside chats, and from John F. Kennedy's exhortation for America to go to the Moon to Reagan's demand that Gorbachev "tear down this wall". ■ Every speech, essay, pamphlet, and editorial in this vein is itself a declaration of understanding that others are rational individuals, capable of being persuaded when they see the error of their ways or when exposed to persuasive new facts and arguments, and that they are capable of coming around to the truth. ■ If people don't give others the room to be wrong, in the past and even in the present, then what do they actually believe? Most minds are not fixed in amber: They are malleable, as they should be. We can anchor ourselves to bedrock principles while remaining open to the power of new ideas and perspectives. As much as any of us believes it for ourselves, so must we insist on believing it of others, too.
One of the strange practices of the United States Congress is the naming of bills with contorted titles that can be compressed into snappy acronyms: The CARES Act ("Coronavirus Aid, Relief, and Economic Security") or the USA PATRIOT Act ("Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism"). The practice has an obvious origin, but it still results in linguistic contortions. In general, though, even the titles derived from backronyms at least seek to remain moderately faithful to the bill's intent. Words matter, and so does the accurate representation of what government does. ■ The Energy Department has been ordered to release 50 million barrels of the nation's 605 million barrels held in the Strategic Petroleum Reserve. The amount would exceed any of the notable previous sales by a large margin. The intention of the sale is to put some weight behind a coordinated effort to put pressure on OPEC and check some of the substantial growth in energy costs affecting the economy. (Energy prices have risen by 30% in the last 12 months, and gas prices have risen by 50%.) ■ Putting 50 million barrels into global circulation may not seem like much in contrast with 90 million barrels of global production each day. But marginal economics are funny things, of course: Depending on how the supply and demand curves are intersecting (and at what slopes), a release can have a disproportionate effect on average prices. That could be pleasing for consumers. ■ But if price stabilization is the new de facto purpose of the Strategic Petroleum Reserve, then perhaps a name change is in order. One would expect from the word "strategic" that the reserve is meant for use against major global events that would disrupt the flow of oil, like the 1991 Gulf War. ■ Being straight with our language is important, especially in the context of major government policies. 
The Defense Department still uses something like 85 million barrels of fuel a year, and has been working on a plan to make biofuels a dependable source. Until we reach that point, how much reserve is enough? Now that the United States is a net petroleum exporter, for how long should we continue to view a reserve as necessary in case of a global supply shock? ■ These are all questions that deserve clear thinking -- particularly because there is already so much ambiguity involved in considering them. There doesn't have to be any malice involved in a name that outlives its usefulness. But if we're going to use the Strategic Petroleum Reserve for something other than a strategic purpose, then we ought to revise its name to reflect reality. ■ Strategic thinking is valuable and indeed necessary for a superpower. Tactical steps are considered on a much smaller scale (thus, the distinction between strategy and tactics). Having a ready supply of fuel stored to make a country immune to being disastrously cut off from energy supplies is a decidedly strategic choice. Trying to nudge prices by releasing stored oil into the market is a tactic. That doesn't mean it's the right decision or the wrong one -- but it does call us to use the language carefully. If everything is strategic, then nothing is.
He didn't intend it as a Thanksgiving prayer, but Dwight Eisenhower offered a humble benediction at his first inaugural, entirely fitting for a gathering of Americans on Thanksgiving 2021: "Give us, we pray, the power to discern clearly right from wrong, and allow all our words and actions to be governed thereby, and by the laws of this land. Especially we pray that our concern shall be for all the people regardless of station, race or calling. May cooperation be permitted and be the mutual aim of those who, under the concepts of our Constitution, hold to differing political faiths; so that all may work for the good of our beloved country and Thy glory. Amen." ■ Gratitude, we are told, is good for the brain: Expressing thanks makes grateful people happier. Thus, literally giving thanks may be its own reward -- as surely many Americans feel rewarded after enjoying the traditional holiday feast. ■ Those who approach the Thanksgiving holiday as a minefield of potential conflicts with family members, especially over politics, may well need to heed not only Eisenhower's words about finding cooperation, but to look all the way back to George Washington's proclamation of the very first Thanksgiving holiday. ■ In addition to commending Americans to give thanks to "that great and glorious Being" for "his kind care and protection of the People of this Country previous to their becoming a Nation", Washington added a prayer "To promote the knowledge and practice of true religion and virtue, and the encrease of science among them [other nations] and us". Washington heralded "the means we have of acquiring and diffusing useful knowledge" and asked God "to enable us all, whether in public or private stations, to perform our several and relative duties properly and punctually". ■ Both Washington and Eisenhower asked Americans to petition the supreme being -- not for wealth or power, but for practical wisdom and better judgment. 
Abraham Lincoln's 1863 Thanksgiving proclamation similarly asked for prayer "to guide the counsels of the Government with wisdom", and John Adams petitioned God to "enlighten them [officeholders] to a just discernment of the public interest, and save them from mistake, division, and discord". Calvin Coolidge implored Americans "to render thanks for the good that has come to us, and show by our actions that we have become stronger, wiser, and truer". ■ That is, perhaps, the highest form of prayer: Not just to thank the Creator for past prosperity, nor to plead for greater abundance, but to ask for the capacity to do more good, be more just, and act more wisely. Articles like "How to talk to your Republican uncle at Thanksgiving" or "How to fact-check your family at the Thanksgiving dinner table" are parts of a familiar genre, but they are outside the traditional spirit of the holiday. The real wisdom is with those who use a gathering of family to aspire to learn more and serve better.
Certain pieces of advice, through repetition and familiarity, gain a veneer of respectability they don't deserve. "Live each day as if it were your last" has been said so often it is treated like a profundity. And it has a certain pedigree, through Horace's "Carpe diem" and Seneca's "The one who puts the finishing touches on their life each day is never short of time." ■ But it is flawed advice, both on its literal surface and deeper down. If a person were literally to live each day as if it were the last, then no plans would be needed for tomorrow: Forget paying the mortgage or flossing your teeth. Living each day like that would be its own brand of insanity. ■ Deeper down, it remains faulty advice. A grudge carried to one's dying day is nothing but deadweight, but there are plenty of disagreements and frustrations that deserve time and healing. To tell someone to forgive and forget prematurely (merely because this day could be their last) may well deprive them of a healthy, unhurried process of healing. Achieving emotional and psychological balance in life -- under the assumption that you probably won't die tomorrow -- is more likely to pay off in total life satisfaction than rushing to balance the metaphorical books on life before going to bed each night. ■ It's all too easy to put too much weight on how things end, rather than on the whole of the experience. It's a temptation within everything from vacations to relationships, and most certainly with how we treat life itself: The focus on how a person died ("peacefully in her sleep" or "after his brave fight with cancer") often takes up far too much of the obituary or the eulogy in relation to how they lived. ■ Rather than over-valuing the end, perhaps the better advice would be to "Leave nothing to the custody of your last day". It's true that everyone will have a last day, and that most of us won't actually know when that day will be. It could come as a surprise tomorrow, or it could take 111 years.
But if it's the latter, that means living through more than 40,000 days -- meaning that 39,999 of them were lived as part of a continuum, each of which was an opportunity to do something good, even if it wasn't fully "seized". ■ Every person is a work in progress, as is every relationship and most every worthwhile project. The best advice isn't to attack each day with the spontaneity of the very last, but to see each one as a step towards doing the things that shouldn't be entrusted only to the end. We already endure too many temptations to put too much weight on the finish.
In passing the Build Back Better Act, the United States House of Representatives has taken an unusual turn: It is offering a "payroll credit for [the] compensation of local news journalists". The proposal still requires Senate approval and a Presidential signature before it can become law, but the provision is worthy of attention. ■ There's really no escaping some hard truths about the state of local news outlets. For instance, the National Association of Broadcasters said in a comment to the FCC that "Local radio stations' OTA [over-the-air] ad revenues fell 44.9 percent in nominal terms ($17.6 billion to $9.7 billion) from 2005-2020". And the estimated total newspaper circulation in the country is well below half of what it was in 1990. Everyone is aware that alternatives like social media and streaming platforms are tough competitors for mass media to face. The story is even worse for ad revenues than for circulation, if that can be believed. ■ If the Build Back Better plan were to become law, it would deliver a subsidy of almost $1.7 billion to those local news outlets over the course of ten years. While not a "Brewster's Millions" type of windfall, it certainly would be met with approval among some media owners, both large and small. ■ In the long term, though, neither a few years' worth of subsidies -- nor any other intervention the government could likely imagine -- is going to be sufficient to change the grander dynamics of news. Some outlets (like terrestrial radio and television broadcasters) have always subsisted on advertising revenues rather than subscriptions, and their emergence had an effect on traditional publishing. Digital publishing has had an effect on everything that came before it, as well. ■ There are still bitter fights ahead over ownership of legacy media companies, including the immediate contest over Alden's attempt to take over Lee Enterprises. And those have the potential to vastly out-scale anything that happens around marginal tax bills.
But in the end, American citizens will get the kind of news coverage we choose to value. ■ Whether that takes the form of conventional media under benevolent ownership (that is, rich owners who don't insist on turning a significant profit), startup outlets that seek to crowd-fund or otherwise support local journalistic coverage, or even local news cooperatives (the reporting equivalent of credit unions or mutual insurance companies), imaginative new forms have the potential to steer local news into the future. But it's also entirely possible that if interested citizens don't see the value and pay up, the only local media left standing will be increasingly partisan outlets that serve mainly to pick fights rather than to document reality. Which path lies ahead remains to be seen.
Americans of a nit-picky sort sometimes engage in disputes over whether the United States is a "democracy" or a "republic". Of course, the national business is not conducted via direct democracy, but the literal definition of "democracy" is simply government by the people. And despite its antiquarian association with the Roman Republic, the definition of a "republic" is virtually the same: One in which supreme power lies with the citizens. A monarchy could be democratic, and a republic could be undemocratic, but at least in the case of the United States, sovereignty lies with the people, and the people choose the government. ■ Let it be noted, though, that a new republic has come into the world: Barbados has declared itself a republic, removing Queen Elizabeth II as its head of state and replacing her with a president. Interestingly, one of the complaints regarding the transition is that it was declared without conducting a democratic referendum. (Barbados is already governed by a democratically-elected legislature.) ■ It is a bizarre artifact of history (and institutional inertia) that dozens of countries still acknowledge hereditary monarchies -- including Canada and Australia, which still bow to the Queen. While there may be no urgent need to depose the House of Windsor (or any of its cousins), the act of declaring a republic really shouldn't seem objectively shocking in 2021. Most of the remaining monarchies are constitutional or parliamentary in form anyway, and the fact they retain hereditary heads of state is often only because those heads of state behave well enough to retain the consent of the people. (Not to mention the interest of the heirs in the line of succession -- Japan's Princess Mako and Britain's Prince Harry have decided it's more interesting to move to America.)
■ But the fundamental symbolism of declaring a republic is this: It says that supreme authority is organic, and that governments are only legitimate when they are "deriving their just powers from the consent of the governed" -- not because anyone can make a claim to the bloodline of some long-ago warlord. To the extent that a head of state is generally a person tasked with embodying the symbolism of a country, it's not really much of a step to take that duty away from a hereditary monarch and to place it in the hands of someone elected to do the job -- in fact, Barbados simply re-titled the governor-general (the designated representative of the Queen) and swore her in as president. But symbols matter, and to label the state as the people's thing is a healthy decision. ■ If there is one lesson America should learn from newer republics, perhaps it is that we ought to be open to separating the roles of "head of state" and "head of government". The President, inasmuch as he or she is functioning according to Article II of the Constitution, is the head of government (though in a shared role with the leadership of both the Congress [Article I] and the Supreme Court [Article III]). But those who function most effectively as heads of government may well be poorly suited to the more symbolic role of the head of state. We might find ourselves better served by electing a mainly ceremonial head of state who could symbolize the national zeitgeist, perhaps for just one year at a time and just one term in a lifetime. It might give the country an outlet to express popular feelings in a way that would contain them away from the processes required of a deliberative government limited by checks and balances. ■ An elected, ceremonial head of state -- call it the "Citizen of the Year" -- could offer us a focal point for those many feelings that don't really need to pump greater animal energy into politics.
Sometimes we might feel like Betty White, and sometimes we might feel like Lewis Black. But giving someone the de facto title of "national id" might be good for us, especially if it were to divert status-seeking celebrities away from politics and maybe even let some culture wars be fought away from where laws are made. ■ Demanding that celebrities take popular political positions and converting ex-politicians into celebrities are both bad habits, and giving them all a path to a role with an oversized profile and an undersized level of responsibility might just clear the field a bit for the more serious and duty-driven among us to be selected for the work of governing the country. Choosing people as official symbols -- like the decision to name Rihanna as a "National Hero" of Barbados -- might well be a task best performed with the help of a gentle firewall from governing politics. (But we could still let the people decide.)