Gongol.com Archives: January 2023
Benjamin Franklin, a person of legendary self-regard, offered sterling advice in his 1755 edition of Poor Richard's Almanack: "Be at War with your Vices, at Peace with your Neighbours, and let every New-Year find you a better Man." Franklin's advice isn't why people form New Year's Resolutions today (it's a practice that predated Franklin by thousands of years), but his words have undoubtedly contributed to cementing the practice in the general American vernacular. ■ The problem with New Year's resolutions isn't the quest for self-improvement, but rather their widely-cited rate of failure. "Happy Planet Fitness Day for all those who celebrate", teased one Anthony DeRosa on Twitter as his clock struck midnight to welcome the new year. Gym memberships spike in January, but decay has usually wiped out the gains by April. ■ Gym membership isn't the only marker, of course; about 40% of Americans make New Year's resolutions, and they're not all about pumping iron. Quitting bad habits, managing money better, and eating more wisely are all popular choices. ■ The problem with turning a single event each year into the moment to turn over a new leaf is that the high rate of failure only discourages people from undertaking self-improvement in ways that will work. There's clearly some survivorship bias at play in the figures that say two-thirds of people will keep their resolutions all year; the people who don't keep this year's resolutions will be less likely than the successful ones to make new resolutions next year. ■ Franklin himself built a whole system for keeping his own resolutions under the lofty title of a "plan for attaining moral perfection". It was his tacit acknowledgment that resolving to do something once is incomplete without deliberate follow-up. If we really value self-improvement (which we should), then resolutions ought to burrow their way deeper into the culture. 
■ The idea of the New Year's resolution gets perpetuated because it's easy to remember when to make it, and it gives us something common to talk about. But more effective than one ambitious list of resolutions to uphold starting January 1st would be six discrete resolutions, taken up one at a time, every two months. In other words, a single New Year's resolution, then another on March 1st, on May Day, on July 1st, on September 1st, and on November 1st. ■ Resolutions are often about habits, and no magic rule exists for how long it takes to form a habit. But some research suggests that the median length of time to make a behavior into a habit is 66 days, or just a smidge longer than two months. Thus it would probably make us better if the people who talk about New Year's resolutions around January 1st (news reporters, radio and podcast hosts, self-help advisors, and others) were to revive the question two months later and ask, "What's your March 1st resolution?", then again every two months after that. ■ Higher-frequency, lower-stakes resolutions would be a great cultural achievement, if we could make them as routine as the changing of the sports seasons. The philosopher Maimonides wrote, "[H]e should attend to the defective moral habit in himself and continually seek to cure it, for a man inevitably has defects." The wisdom is in the word "continually". But it is often easier to undertake something new if others are trying, too. The lesson to take away from Franklin and Maimonides alike is that people need more on-ramps for that trying.
The House of Representatives is formed by the Constitution in Article I, Section 2 -- prior to the Supreme Court (Article III), the Senate (Article I, Section 3), or the Presidency (Article II) -- and should accordingly be expected to act as the primary locomotive of the entire Federal government. And it is having trouble selecting a leader. On three separate ballots of its first day in session, the House delivered no candidate a majority of the votes for Speaker. That hasn't happened in a century. ■ The Speaker of the House is named by the Constitution before the Chief Justice or the President; so foundational is the job supposed to be. It has always been a political job, of course. But while the Chief Justice must be appointed by the President and confirmed by the Senate, and the President must be chosen through the Electoral College, the House is free to choose whomever it wants to serve as Speaker. ■ This raises an interesting thought exercise: How might things run differently if the House, which is so narrowly divided on party grounds that the swing votes could fit in a Kia Sportage, chose a fair-minded outsider as Speaker? ■ Justin Amash, who not long ago left the House, has offered himself as an outside candidate for the job. His would be an unlikely candidacy, for sure -- he already bears deep partisan scars from his time in office. But he does make the valid point that Congress needs some process reforms to function better: Unencumbered bills, committee order, and an open amendment process. ■ America would probably benefit from having a charismatic technocrat as Speaker of the House: Someone likable, a bit nerdy, earnestly committed to fairness, and above all, obsessed with the long-term good of the House as an institution. We have people like that in America; many of them are former governors, including several Republicans. ■ It's extremely unlikely to happen, of course, bordering on the impossible. 
But given the small group of holdouts keeping the Republican majority from electing their presumptive candidate, it wouldn't take that many Democratic votes to put a mild-mannered technocrat into the role, if mainstream Republicans could find a consensus outsider to put forward. Could the House of Representatives function like a world-class deliberative body with someone like Mitch Daniels or Rick Snyder holding the gavel? Perhaps in a parallel universe somewhere, they're about to find out.
Unless changes are made this year to the nation's intellectual-property laws, Disney is set to lose the copyright protection on "Steamboat Willie", the earliest Mickey Mouse cartoon, next year. It doesn't mean you'll be able to slap mouse ears on your corporate logo in 2024, but the enormous sway Disney has held over copyright law is in no small part due to the outsized significance placed on Mickey Mouse as a cultural icon. ■ Yet, for as sentimental as people get about Mickey Mouse, the most American of all cartoon characters will forever be Bugs Bunny. A canonical rule in Bugs Bunny cartoons is that he never starts a fight. Someone else is always the antagonist: Elmer Fudd on a hunting adventure, Daffy Duck seeking to cheat his way to security, or Marvin the Martian out to destroy Earth. Bugs Bunny's job is to outwit his adversaries -- sometimes humiliating them in order to put them in check -- but never to start a fight. ■ America hasn't always held to that same canonical rule, but we generally aspire to it. And there is no time like the present to revisit the Bugs Bunny doctrine. ■ The United States has become by far the largest material supporter of Ukraine (in absolute terms), not because we desired to engage in any kind of fight, but because Russia initiated an unprovoked war of aggression against what a majority of Americans recognize as a victim state. Drawn into supporting a fight not immediately related to our self-interest, Americans overwhelmingly still see that there is a right side and a wrong side to the war, and that supporting the right side remains our duty. ■ Likewise, we have no interest in starting a fight in the South China Sea or anywhere else in the Pacific. But if China is going to engage in wildly unsafe behaviors like attempting to intimidate American pilots by flying too close in international airspace, then by no means should the United States back down from asserting its right to maintain a global footprint.
■ It may seem quaint or even silly to enlist a cartoon character as a metaphor for a nation's security posture, but it's entirely apt. America has little or nothing to gain from initiating fights with Russia, China, or any other country. Our long-term prosperity and welfare have benefitted from a well-ordered world peace and vibrant international trade. ■ This is no time to be picking fights among great powers (or even among lesser ones). But if other powers are going to start fights, we have to bring our cleverest ideas for direct and indirect deterrence to the fore: Backing down only invites further provocation. It is costly to remain strong: $1.6 trillion was the bill in fiscal year 2022. ■ We should gladly prefer to spend that money on almost anything else. But that's not the choice we are presented, nor is it in our self-interest to look the other way as others undermine a more peaceful order. We should remain congenitally opposed to engaging in a fight unless provoked, but no one anywhere should doubt American commitment to supporting partners and putting antagonists in their place. That doesn't always call for punching them directly in the nose -- but it does require that the outcome of any spat never be in doubt.
For as much as people wisecrack about living in a "post-truth" era, human beings haven't really surrendered our basic impulse to try to grasp a sense of what's really happening in the present moment. We are social creatures, and being social requires knowing what other people are doing, thinking, and feeling. If someone in a crowd suddenly turns and points at something overhead, it's a sure thing that almost everyone else will turn and look, too. ■ That is the essential attraction of news: News is whatever materially changes our understanding of the status quo. Lots of other things try to masquerade as news, but many "also-rans" in the world of news coverage are merely items of information or documentation of events. ■ In the free world, we often voluntarily subject ourselves to non-news because it has entertainment value. There are not actually four hours of news contained in one "Today" show. There's a little bit of news, and then a lot of other stuff. But people like to watch, so on it goes into its 71st year. ■ Americans generally have the luxury of taking news less than seriously. But people elsewhere aren't so fortunate. The United States invested in international broadcasting throughout the Cold War as a means of achieving public diplomacy -- reaching people living under Communism, so that they could learn what their governments wanted to suppress. ■ Totalitarian and authoritarian regimes are good at filling "news" time with non-news; witness the fawning domestic coverage of North Korean autocrats, the propagandistic efforts of China's CCTV, or the utterly distressing content being broadcast by Russian state television. But people living under those regimes still need and deserve to be told the truth. ■ Taxpayers in the United States should be proud to fund outlets like the Voice of America, Radio Free Europe/Radio Liberty, and other arms of the US Agency for Global Media. 
In seeking to tell the truth and report on what legitimately matters as news, especially in places where that coverage is inconsistent or even prohibited by local authorities, these outlets serve a vital purpose for building a better world. ■ Reality is the best friend of liberty; people who know the truth don't voluntarily choose to be oppressed. People can generally sense when they're being told lies, but that isn't the same as being told the truth. America has taken our international broadcasting agencies for granted for too long. In a complicated world with altogether too many bad actors eager to deny people their natural freedoms, the cost of making sure people can get real news everywhere is a small price to pay.
A certain level of mayhem is probably inherent to the lower houses of most democratically-elected legislatures. Anyone who thinks legislative hijinks are an exclusively American phenomenon needs to spend some time watching the Prime Minister's Questions from the British Parliament (or just a super-cut of former House of Commons Speaker John Bercow screaming "Order!" at his colleagues). ■ That doesn't make the marathon to elect a Speaker of the House any less embarrassing, but it does raise a point worth considering. The United States House of Representatives hasn't grown since 1929, even though our current population (at 334 million) is approaching three times the roughly 122 million it was then. The House wouldn't really have to triple in size to be more appropriate, but it really must be made larger than it is. ■ The Founders imagined much more representation than we put to use now. In Federalist Paper No. 56, either Hamilton or Madison wrote that "[I]t seems to give the fullest assurance, that a representative for every thirty thousand inhabitants will render the latter both a safe and competent guardian of the interests which will be confided to it." We're at a representative for about every three-quarters of a million people today. Enlarging the membership of the House to come into line with a principle like the Cube Root Law might have merit for a number of reasons. ■ First, in shrinking the population of the average Congressional district, we would make it easier for individual voters to have access to their representatives. Smaller districts would also, quite likely, be easier to win with relatively smaller campaign budgets (thus diminishing the much-hated influence of money in politics). Smaller districts might also discourage the sort of gerrymandering which Americans have learned to hate. ■ But a bigger House could also do two favorable things regarding the qualities of its members. 
First, a broader population could well have the effect of drawing in people from a wider range of occupational and other backgrounds -- we don't have many engineers or computer programmers in Congress, and maybe we should have a few in a technologically sophisticated world. ■ Second, a bigger House of Representatives should tend to dilute the power of individual cranks. Any given sample of a hundred or more people is likely to contain a few bad actors. But if we generally trust the public to select away from nuts, then their relative impact should be lessened if the individual bad apples who get through the process are diluted among many other more-normal individuals. ■ Notably, expanding the House of Representatives would be an almost frictionless way to make the Electoral College more proportional, since a larger number of electoral votes assigned via House seats would reduce the relative weight of electoral votes arising from Senate seats. No Constitutional amendments or National Popular Votes required -- just a statutory change. ■ And the cost of adding House members would be trivial; each member's office costs about $2 million a year, between salaries and the operating expenses of their offices. Adding members would barely move the needle from a budgetary perspective, since the same number of constituents could be served by more or less the same number of offices and staffers who serve them now. ■ Expanding the House wouldn't solve every problem, but it does have the potential to mitigate some of the chamber's most unflattering aspects. In the past century, the country's population has nearly tripled, and our national problems have grown in complexity by a factor of much more than three. Holding steady at the same number of Representatives who served when Calvin Coolidge was President is only a way to invite tiny factions to hold everyone else hostage.
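The Cube Root Law mentioned above is simple enough to check for oneself: it suggests a lower chamber sized near the cube root of the national population. A minimal sketch (the function name is just illustrative, and the population figures are the approximate ones cited above):

```python
# Cube Root Law sketch: chamber size ~= population ** (1/3)
def cube_root_seats(population: int) -> int:
    """Suggested chamber size under the Cube Root Law."""
    return round(population ** (1 / 3))

print(cube_root_seats(334_000_000))  # today's ~334 million -> about 694 seats
print(cube_root_seats(122_000_000))  # the ~122 million of 1929 -> about 496 seats
```

Under that rule of thumb, the 435-seat House frozen in 1929 would grow to roughly 694 members today: a meaningful expansion, but nowhere near a tripling.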
When faced with a complex problem and limited access to the truth, a sound course of action is to look at the flow of resources. It's easy to increase the supply of rhetoric -- talk is cheap, after all. But the truth is usually found in the actual commitments that people make with resources that are scarcer than mere words. ■ It is credibly believed that Russia has drafted 300,000 men into the military since late October, and in some corners, it is believed that another 500,000 may be called up soon. ■ Pressing 800,000 conscripts into service to fight a war would be an enormous commitment under any circumstances. For comparison, the Allies in World War II committed 160,000 troops to invade Normandy on D-Day. That operation was enormous in its own right, and it ultimately resulted in victory on the European continent. What the Allies sought to do on D-Day was just and righteous, and Eisenhower rightly grieved the lives lost. ■ But sending five times that number to fight a mad war of aggression against Ukraine is a dreadful confirmation of something Dwight Eisenhower wrote after WWII. Eisenhower noted, "Americans assess the cost of war in terms of human lives, the Russians in the over-all drain on the nation." Many of those whom Russia will send into action will be killed or injured. Many others will be damaged or broken by the experience, likely to return home with new demons or worse. Russia would be making an extremely large and hazardous gamble with its own domestic future by taking such a risk, to say nothing of its appalling crimes against the people of Ukraine. ■ Someday, we will know the resolution of the Kremlin's awful war. Perhaps it will mark a historical low point and the last moment before a peaceful liberalization, with Russia's present autocrat deposed and the country on a path towards economic and political harmonization with the free world. It happened in (West) Germany, after all.
■ But for now, we can only try to recognize what patterns are being repeated, and Russia's leadership is making choices that show no moral growth over the ones its dictators made eight decades ago. It's troubling enough that the people of a peaceful neighbor are being forced to suffer from an unjust invasion. It compounds the tragedy to imagine that Russia's soul may emerge even sicker at the end than it started.
Things got stopped by a serious computer system outage. It's like we keep having all of the technological breakdowns that we were warned were going to happen on Y2K.
Never read too much into anyone's assessment of culture that assumes consistent behavior across large numbers of people. But don't be surprised, either, when common patterns emerge in the data describing people in large groups. Take, for instance, the amusing case of a poll asking people of Generation X and older whether they take and share "selfies" on social media. ■ The answer -- after nearly 2,000 votes and 20,000 views -- was that for every person who said "I take a lot and share a lot", there were 88 who said "Rarely or never". Nobody ought to read too much statistical rigor into a self-selected set of responses to a poll on Twitter, but that's an extreme imbalance that would seem to be hard to fake. ■ And it's an extreme imbalance that wouldn't seem likely among younger generations. The presence of self-image (and images of the self) online is of such importance to some youth that the Seattle Public Schools have sued social media companies -- Facebook parent company Meta, Snapchat, TikTok, and Google-owned YouTube, among others. ■ The school district paints the social-media tools as a public nuisance with contributory responsibility for a crisis in mental health in their state. Whether the suit has merit will be up to the legal system to decide, hopefully with fairness and sobriety. ■ But the very act of initiating such a case shows that the schools realize there is a lot to teach those young people, and there isn't a lot of foundational work to build from. Whether it's called "digital citizenship" or "digital literacy" or something else, there's no hiding from the need to communicate skills and behaviors to the students in the classroom. That's especially the case if the people of, say, their parents' generation are 88 times more likely to avoid taking selfies than to share them. Adults need to put in lots of extra effort to make sure that their reactions are prudent, not just impulsive.
Of the various named weather phobias, iridophobia seems the least necessary. Who fears the rainbow?
Whatever further cements the broadcaster's presence in the region is probably for the good
Seva Gunitsky, a professor of political science, shares the pithy and cutting observation that "Every generation, upon reaching middle age, begins to see the world around them in decline as a way to habituate themselves to their own decay." It's hard to find fault with Gunitsky's hypothesis; history is cluttered with variations on a theme of "Kids these days are terrible", yet most of the meaningful measures of human welfare have trended toward the good for centuries -- and the pace of improvement on many measures is accelerating. ■ Something has to explain the gap between perception and reality, and the dread of one's own decline may well be it. Notable are the exceptions to the pattern, who have recorded their faith that perhaps the kids will indeed be all right. Benjamin Franklin was one of those; he offered the reassurance that "[D]iscretion did not always accompany years, nor was youth always without it." ■ The pessimist, for instance, looks at censorial behavior on college campuses and concludes that the young are bound to turn themselves stupid and servile. These critics may be right to recognize that a learning environment, by design, cannot be made free of uncomfortable exchanges, and that a robust dialogue about meaningful questions (of the kind we easily associate with the idea of "going to school") will invariably push some people out of "safe spaces". ■ The optimist, though, looks at the environment and concludes that the behavior that gets people labeled as "snowflakes" could be attributed much more charitably. Youthful eagerness to behave more inclusively and with greater sensitivity to the identities of others probably deserves to be regarded as a sign that society at large is becoming wiser and more open. But the perpetual hazard of eagerness is that of overreach. ■ In so many cases, the youth-driven movements that are prone to taking things too far are crying out for the guiding hand of those who have seen pendulum swings in the past.
That takes a balance between openness to the new and a leavening sense of caution about how to put the new into practice. ■ The human world is only rarely in decay, but it can be disorienting when new ideas are adopted with inexperienced "drivers" at the wheel. It may not always be obvious (even to them), but young people with big ideas are often, if not usually, in search of guidance about how to enact the changes they desire. To the extent that their ideas contain even the germ of a good thing, it's important to avoid reacting as though all change is suspect, and to instead ask "How much of this would really be a good thing, and how can we help shape how (and how much of) it comes into being?"
Chances are good that when her identity as a political figure has passed, Condoleezza Rice will be regarded as one of the most valuable thinkers to emerge from the contemporary age. It's hard for people to dissociate her from her roles as National Security Advisor and Secretary of State under President George W. Bush, who is not a neutrally-regarded figure even now. But it's worth trying. ■ Rice has taught and written since leaving government service, and her 2017 book, "Democracy: Stories from the Long Road to Freedom", has particular resonance in the present. Inside, Rice warned that "In today's interconnected world, the creeping and subtle authoritarianism of illiberal elected leaders is a greater threat to democracy than if they were to crush it with tanks in the city square." ■ Given what just happened in Brazil, with a failed apparent coup attempt, and what could of course happen elsewhere, it's timely to remember that adherence to rules and norms is essential to keeping elections within the boundaries of intellectual disagreement. The contest of ideas must be kept from coming to literal blows, and the provocateur shouldn't have a veto against any democracy, no matter how young. ■ The former Secretary of State also noted, "America's Founding Fathers understood that liberty was the necessary condition for citizens to find fulfillment. It is not, however, sufficient." No form of government would be worth its salt without a commitment to liberties, but liberties have to coexist with democratic processes and with civic responsibilities if they are to endure.
Ukraine says at least 25 people were killed by a Russian missile attack against an apartment building in Dnipro. If it happened in a country at peace, this would be labeled instantly as an act of terrorism. Russia's deployment of the corrosive, relentless terror of war will be a stain on its history a century from now.
At least for a good share of the Upper Midwest, WHBF in the Quad Cities is the record-holder as the television station with the longest stretch holding the same channel number, call sign, and network affiliation. It's been the same (Channel 4, carrying CBS programming) since 1950. That would be a long relationship in any business -- maybe some implement dealers and insurance agencies have gone on with their principal producers longer, but not many. ■ That longevity raises a question worth pondering: How much will affiliation ties matter to media outlets in the years to come? Radio stations once proudly trumpeted their network relationships, but that has faded into almost complete obscurity as once-renowned names like Westwood One and NBC Radio Network have become identities in name only with no connection to their past incarnations, and networks like the storied Mutual Broadcasting System have gone defunct entirely. ■ Television networks still have brand identities (driven in no small part by their newsgathering arms), but which streaming service carries a program is becoming more important than which network airs it. Finding NBC programs on Peacock versus Hulu may be more relevant to many viewers than identifying the network with a local channel number. ■ This transition, from placing lots of weight on institutional and team identities to placing relatively little, is evident throughout media. Newspaper and magazine columnists have far less incentive to remain loyal to their institutions when there's always the option to strike out on their own, possibly for much greater compensation. In the light of extraordinary downsizing at publications both big (like the Chicago Tribune) and small (like the St. Cloud Times), being a career institutionalist may no longer remain an option.
■ Local radio and television personalities, pushed by management to "build audience engagement" via social media and other outreach, may find that their affiliation with a local station is best used mostly as a springboard to other high-profile occupations (a choice for which they can hardly be blamed, given the dismal state of the broadcasting job market). ■ Longevity is still to be admired, to be certain. When someone can last 40 years at one outlet, it's worth recognition and applause. But something is decidedly different in the media from what it was just a decade or two ago, and the macro-scale forces seem aligned to keep it that way for a while to come, at least unless and until consumers grow so weary of trying to follow their favorite content producers in so many places (and under so many subscriptions) that they reconstitute themselves into bundles much like the old ones -- albeit perhaps more digital this time around.
Late-morning thunderstorms. Tornadoes by afternoon. It's an extreme outlier for Iowa weather in January.
How can you not cheer for a people so determined to make things work as normal, despite a brutal war being waged against them?
Something to remember next time anyone tries to tell you the Communist Party has grand plans to do X, Y, or Z. Despite holding the unchecked coercive power of the state, they couldn't work out the basic arithmetic of human procreation until this was already inevitable.
Imagine the horror when they discover that modern office buildings have things like ergonomic chairs inside, rather than a Louis XIV chair within every cubicle.
Gauzy reminiscences about the past are easy to find. There's never been a shortage of people willing to wax poetic about whatever they consider the "good old days". But it's worth training ourselves to dismantle the flawed arguments (and they are almost always flawed) by realizing that although human progress is anything but linear, it has many incentives to keep improving in the aggregate, even if certain creature comforts are lost along the way. ■ Take, for example, the fawning over the supposedly glamorous days of regulation-era air travel. When airlines were told that they couldn't compete on price, it's no surprise that they turned to competing on accoutrements, like the quality of in-flight dining. Between some hazy recollections by grandparents and the yellowing pages of carefully-staged magazine ads from the time, it's easy to get the impression that something has been lost in today's era of bags of tiny pretzels and Biscoff cookies. ■ But even setting aside the really important differences -- like the vast improvements that have come about in both affordability and safety since deregulation in 1978 -- the in-flight experience today is markedly better, even if you're one of the few who wishes smoking were still allowed onboard. ■ To be fair, the basic coach seat is smaller than its predecessor. (In any case, for those to whom seat size is a priority, there's always the upgrade to business class, by whatever name it is found. Everyone else just implicitly chooses to pay less and endure the smaller seat.) But once stationed in the seat, the modern traveler has, for one thing, infinitely more choices than in the past. ■ It wasn't that long ago that in-flight entertainment was limited to perhaps ten channels of audio, a mediocre in-flight magazine, a well-worn copy of the SkyMall catalogue, and (for the lucky passengers) a pre-selected in-flight movie.
Now, even the passenger sitting in the very last row next to the lavatory more often than not has the option to watch or listen to an unlimited range of content. Many airlines now offer free or inexpensive WiFi access to a broad array of streaming services (and the savvy traveler already knows to download some preferred content before departure). Seatbacks now often include not just power outlets, but also device holders to make the viewing experience more convenient. ■ We have migrated away from the one-size-fits-all model that used to predominate (with a single in-flight movie shown on unreliable projection screens that were too close for those sitting right behind a bulkhead and too far away for almost everyone else) and towards virtually infinite customization. Giving everyone almost exactly what they want is vastly better than giving everyone the same milquetoast offering. ■ Entertainment is only one aspect of the total experience, of course, and not every aspect has gotten better. But over time, the system has optimized around those things people actually care about the most, not what they merely say they want. You can still get carved roast beef when you're traveling, but it will generally involve ordering from a sit-down restaurant inside an airport terminal between connections. ■ Anytime we are presented with an over-simplified assessment of reality that assumes the best about the past while dismissing the state of the present, it's worth remembering that the reality we inhabit today is the result of an evolutionary process shaped by the past. Not all things that are worth keeping survive, and not all things that survive are truly worth keeping. But in general, the burden of proof ought to rest upon those who argue that something different from the present state is what's worth keeping -- and that applies equally whether they think there is progress yet to be made or that the progress achieved thus far ought to be rolled back.
A non-trivial aspect of this news: "Microsoft’s workforce expanded by about 36% in the two fiscal years following the emergence of the pandemic, growing from 163,000 workers at the end of June 2020, to 221,000 in June 2022." That doesn't make the cuts hurt less for anyone among the laid-off, but it does go some way towards indicating whether this is really evidence of an existential risk to the company.
And in an effort to get them, the country's defense department has produced a video with all the throwback flavor of a 1992 Chevrolet truck ad. Is this public diplomacy? Yes. Is this defense policy? Yes. Is this postmodern comedy? Oh yes. Is this absolutely brilliant? Beyond question, yes.
Someone, somewhere, has apparently labeled it "energy privilege" if a person doesn't need quite as much sleep as other people. The appellation is probably just one person's weird way of trying to make a point, but it's not the first time someone has slapped the "privilege" label on something where it plainly does not apply. Sometimes differences among people are just differences. Framing every difference in terms of power structures only dilutes the meaning for those cases where "privilege" actually does apply. Words matter.
If you're in a place where you must shovel snow (and don't have a snowblower or face a snow type that isn't appropriate for one), then you must learn the herringbone approach: Shovel straight down the middle of the driveway to clear a walking path. Then, shovel diagonally outwards from that centerline, pushing downhill. It minimizes the amount of wasted energy and general shoveling ennui.
The artist was quite talented; the style should make a comeback.
Large technology companies have been laying off a lot of workers over the last three or four months, with Microsoft and Google together releasing 22,000 workers in the last week. Those are large numbers for two of the most dependable blue-chip firms in the technology business, and the announcements certainly give people reason for concern. ■ "Layoff" is a word we should always treat with caution. It's a euphemism, and not a very good one at that. Yet we don't have a good alternative word to convey an essential connotation to the act: A layoff is the employer's fault, not the employee's. It's not a release for cause nor for underperformance. And it's an involuntary departure on the part of the worker. Layoffs happen because something has gone wrong at the strategic level of the company. ■ Jobs in high technology often seem like they ought to be beyond the reach of those kinds of ebbs and flows. A business degree or a tech-friendly computer or scientific degree is often seen as a virtual guarantee of employability. But such credentials, too, are obviously not immune from economic forces beyond the employee's control. ■ For all the debate that has raged around college debt and which majors are or are not "worthwhile", the best solution is probably for every student capable of the challenge to go after a double major: one from a practical field, and one from the liberal arts. The former should make the graduate productive, the latter should make them adaptable. ■ That adaptability is going to be all the more important over time. If even the big growth industries (like high technology) are going to be susceptible to big shifts, while others (like journalism) can find themselves in employment freefall, it's only responsible to try to send graduates into the world with both the tools they will need in the short run as well as the ones that will keep them from becoming hidebound in the long term.
■ Changes are coming for most workers, and often the worst of those changes will come through no fault of their own. You could have done all the right things in becoming an automotive engineer starting in 1993 and not have foreseen that electric vehicles would someday decimate the need for your kind of high-skill work. Preparing people to earn an honorable living, both now and in the future, is a challenge every educational institution must show itself capable of meeting.
Three cheers for this development, which is great news for "place-bound" learners -- people who can't just pick up and move somewhere else to start or finish a degree. UNI has a very highly-regarded accounting program, and Des Moines is a city where lots of accountants can be put to good use. Getting people who might have limited choices otherwise on a path towards a high-demand career through a reputable in-state university is a genuine win in serving the public.
Bryan McGrath on the ongoing saga of classified documents found where they do not belong: "Over-classification is a problem, but it is not THE problem. THE problem is people who think the rules do not apply to them."
It's hard to believe just how long some of the misdirections and fibs that parents tell their kids go on to linger in the child's mind. Always tell kids the truth. Someday you'll be old and mystified by a cultural development or some advanced form of technology, and you'll want your offspring to give it to you straight. They'll remember if you messed with them for your own amusement when they were kids, and they'll be within their rights to serve up their revenge cold.
A small band of dyspeptic commentators runs about on the English-language Internet, decrying various aspects of modernity as having lost touch with the supposed beauty of the past. Under anonymous handles, they point to small samples of art and architecture from the present, selectively lining them up against small samples of historic works, or just point vaguely at great past achievements and weigh them against a vague assumption that people today are too lazy or unambitious to match up. ■ It would all be amusing in an anachronistic way -- like someone who refuses to watch any movie filmed in color -- if it weren't so very likely that the accounts waging their holy war against modernity were serving a truly backwards ideology. ■ The problem is that it is so easy to tap into a human instinct to mistake ornamentation for beauty. If something looks elaborate, it often looks like it required a lot of work, and that effort commands an instinctive respect. ■ But ornamentation isn't equivalent to quality. The Las Vegas Strip is chock-full of ornamentation, but nothing there is permanent, nor is it classically beautiful. It is merely decorated. Lavishly decorated, but that is all. ■ Human tastes change, both individually and at scale. That's a blessing, not a curse. The architecture of skyscrapers alone has gone through at least four major schools since World War II. ■ Some looks have endured. Others have not. But ornamentation hasn't really made any lasting difference, any more than wallpaper can salvage a poorly laid-out room. Fully valid schools of thought have idealized the simplification of design, from Frank Lloyd Wright's Prairie School of architecture to the Streamline Moderne design movement to Internationalism in tall buildings. ■ What makes the Internet commentators so troubling is the ease with which they seek to recruit people to a broad and unsubstantiated dissatisfaction with the modern world.
It's a very old playbook -- representing the modern as corrupt or bereft of "true" beauty, needing replacement by a revival of "traditional" beauty -- which, suspiciously, is always defined by what its advocates oppose. ■ It's gross and backwards, but more significantly, it invites alliances with those seeking conscripts to a war against this straw-man of "modernity". The most important tradition is the continuous evolution of human tastes and standards to match the new things we've learned. Human civilization must conserve what works, while adapting to change and adopting what works better than what came before.
No rite of passage should escape periodic re-examination. And it would seem that, at least in some quarters, the rite of obtaining one's first driver's license is undergoing a revival of scrutiny. ■ There is an obvious case to be made that obtaining a driver's license is a relic of America's engine-obsessed past. Some would even argue that we are doing the next generation a favor by focusing their time on practices that will better serve them in a world full of mass transit and walkable spaces. Time not spent in driver's education class, the thinking goes, is time that can be spent on an extracurricular activity with better returns in the college admissions process. ■ But unlike certain specific driving-related practices (like learning how to drive a stick shift or fix a carburetor), the general need to know how to drive a vehicle has not been obviated entirely by technology. Driving is a skill that is not only widely useful in personal life; it is still often applicable in professional life and in a wide range of trades. Whether one ends up as a long-haul trucker, a corporate attorney rushing to meet filing deadlines, or a shipping department supervisor running a forklift, the basics of a steering wheel and an accelerator pedal must be known. ■ Sure, you may choose to live in a place like New York, with lots of mass transit. But that doesn't mean you won't travel to other places -- for work or for leisure -- that will require you to rent a car or take a long-distance road trip. For now at least, it remains inescapable that driving is a part of basic personal functioning for the vast majority of Americans. Sure, you might be able to virtualize many of your interactions, but there's no enduring substitute for being in the same time and space with your friends. ■ Even with self-driving vehicles, we may remain a car-dependent nation for some time to come.
And there is no reason for anyone within a car-dependent nation who is physically and mentally capable of driving to be completely dependent upon others to do the driving. ■ Adolescence is a time for trying new things, learning how to fail safely, and figuring out how to be both resilient and self-sufficient. Parents have to encourage their children along that road: As a parent, your unconditional love and support should be both obvious and unquestionable. But everyone needs to go through a process to earn self-respect, and it's for the best if that starts by trying new things (and sometimes failing) when the stakes are low. That time is in childhood and adolescence. ■ It's a fallacy for parents to try to pack adolescent resumes in order to impress others, instead of training and guiding those adolescents to become self-sufficient and capable adults. Like it or not, your kid becomes an adult in the eyes of the law on their 18th birthday. That's some heavy, heavy reality. And at least for now (and for some time to come), American adults will need to know how to drive.
New Zealand's prime minister, Jacinda Ardern, just stepped down after a surprise announcement that she had reached a state of personal burnout. Given that she has been the target of relentless online harassment, perhaps the bombshell announcement shouldn't have been that much of a bombshell. It's too easy to paint political leaders as caricatures, rather than as real people. Sure, maybe there's something more to her story...but there doesn't have to be. It's been an exhausting time to be a political leader, and anyone who doesn't freely admit that is posing as something they're not. We would do well as a society to de-normalize this caricaturization of political figures, and to treat them as the human beings they are. If done properly, that would cut both ways: We'd stop tolerating the dehumanization among our own aimed at our perceived opponents, but we'd also stop expecting them to go on doing their jobs forever.
According to the Census Bureau, the median American was born around 1982. This means that anyone older than a "Geriatric Millennial" is now in the upper half of the age distribution. (Sympathies to Generation X.) With that realization occasionally comes the knowledge that one's lived experience is now history. ■ Among the most obvious of social changes from one generation to the next is the extent of consumer technology. Those who attended high school or college in the late 1980s, the 1990s, or the early 2000s didn't realize it at the time, but they occupied a brief and unique window of history when "media" wasn't social, but technology itself was. ■ Thanks to the overwhelming utility of Internet and database connections and to the obvious advantages of word processors and spreadsheets over typewriters and paper notepads, people of that early-computer generation had tremendous incentives to become early adopters of personal computers. But even the slow, under-powered desktops of those days were expensive. After tuition, room and board, and book fees, there wasn't always enough left over in the student budget for a modern computer. ■ And thus the campus computer lab emerged. Colleges (and some well-funded high schools) invested in hardware, installed the machines in centralized locations, and provided (usually) supervised sites for work. At the time, going to the computer lab was often a social occasion just as much as an academic one. It was, in its way, an early edition of the "co-work space". Everyone might have been working on something different from everyone else, but gathering around the technology was defined nearly as much by the gathering as by the devices. ■ The experience was, to a degree, an extension of (and an overlap with) the era of scheduling the use of a school's mainframe. That experience, though, tended to belong more to people in the STEM fields (before they were called that). 
The computer lab experience of the nascent Internet era was much more democratic: Everyone on campus had papers to write. ■ Computer labs may still exist, but the social aspect of making a pilgrimage to go to the technology is fading fast. The Northern Iowan, the student newspaper of the University of Northern Iowa, reports that on-campus computer labs are languishing because so many students now depend upon laptops and wireless network connections to get things done. The labs still exist, but increasingly their remit is to provide equity of access to students who don't have their own devices, or whose devices lack the expensive software needed to accomplish particular tasks. ■ Is something lost along the way? Yes and no. Generations of people attended higher education before computers became widespread, and they undoubtedly left feeling fulfilled. The computer lab experience was a consequence of scarcity, and that scarcity belongs more and more to historical memory as Moore's Law continues its march. But those who lived through it should know that what they experienced, much like trans-Atlantic crossings via fast ocean liners, airships, or the Concorde, was a rare moment in history. ■ It's rarely obvious at the time, but those moments belong to a different class than most histories. They often seem too unremarkable to record at the time, and often the only things left behind are tangential artifacts. They aren't worth resurrecting for their own sakes, but they are often worth documenting for the historical record and observing for what they might have to say about adaptation to change. Something, sometime, will become the new "going to the computer lab".
Artificial intelligence will most likely soon be able to replicate many if not most pop musicians. But it would be shocking if AI were ever able to sound quite like the wonderful and weird Beck. He's going to be our musical Turing test.
The high-tech layoff figures keep on rising
OpenAI has implemented a robot test for access to Dall-E -- even before a password is required. It makes sense for them to limit login attempts, considering the incentives for people to try to over-use their system. It's probably a sign of things to come more generally online.
A ham-fisted execution of a good intention can be frustrating to people who sympathize with a concern but who cannot defend an overreach or an inelegant declaration. The Associated Press has demonstrated just such an overstep via an update to its widely-used Stylebook. ■ Within reason, it's a good practice to center descriptions of people with their personhood first. Calling people "slaves" makes the unjust condition of slavery the noun, rather than the people. A subtle change -- to "people who have been enslaved" or "enslaved people" -- adds words, but reminds the reader that the subjects are people, not animals or inanimate objects. Dehumanization is essential to the evil of slavery, so anyone who rejects that evil (which should include any right-thinking person) should recognize that using "people" as the noun and "enslaved" as the modifier is a reasonable linguistic accommodation. ■ Centering on particular nouns can go too far. The AP has pronounced, "We recommend avoiding general and often dehumanizing 'the' labels such as the poor, the mentally ill, the French, the disabled, the college-educated. Instead, use wording such as people with mental illnesses. And use these descriptions only when clearly relevant." In the case of "people with mental illness", it's probably a reasonable accommodation -- recognition of their very personhood has mattered a great deal to their treatment, both in the past and in the present. ■ But "the French"? Under what possible set of conditions could it be considered "dehumanizing" to call the people of a nation by the name they have chosen for themselves? It's not like calling Germans "Huns" in the First World War. "The French" is a demonym; nothing more, nothing less. There's no dehumanizing connotation to it. ■ Should writers be careful how broadly they paint with any brush that describes large groups of people? Certainly. 
But it is a huge jump from categories of people whose treatment has all too often been defined by how others have deliberately or even unintentionally dehumanized them (like "the disabled") to people who made choices to join a group ("the college-educated") or who are members of a human nation. Care should always be taken with the language: Words matter. ■ But over-extending a good intention so much that it contorts the language into absurdity risks causing people to miss the point entirely. The difference between a medicine and a poison is often in the dose. Writers, too, need to know when to stop themselves before going too far.
Those who haven't worked in public-facing occupations may not realize just how unbelievably entitled some people feel. But even those with a lot of practice may find this one surprising: Someone actually asking the National Weather Service for a forecast specific to a particular stretch of road, 18 hours in advance.
Between Meredith and Cowles, Des Moines was once a major publishing powerhouse. Times have changed dramatically.
There's a certain type of blockhead who seems to imagine that "capitalism" is to blame for nearly every challenge in life, or even every shortcoming from a utopian imagination of the world. While it is difficult to launch an intellectually honest refutation of the advantages of market economics, it's possible for a person to hold an authentic opinion that some other type of system is better. What is impossible for a person to do with any intellectual honesty is to make out all the various troubles of life as the exclusive fault of capitalism. ■ Consider the following argument, apparently made in earnest: "[C]an you imagine being a human during the paleolithic age just eating salmon and berries and storytelling around campfires and star gazing ... no jobs no traffic no ads no poverty no capitalism-caused traumas just pure vibes" [punctuation, capitalization, and all other errors from the original]. It's difficult to pack that much fantastical intellectual dishonesty into so few words. ■ There have always been jobs to do, whether the prevailing economic system was capitalism, socialism, feudalism, or hunting-gathering. It has always been that way, because the world has always been a place where scarcity of resources has forced human beings to compete -- with each other, with other living things, and with the forces of nature. "No jobs"? "No poverty"? "Just pure vibes"? Nonsense. ■ Blaming anyone's current lack of comfort on "capitalism" is impossibly dumb, if capitalism is defined (correctly) as the idea that people should freely exchange things of value and be free to quantify those things as "capital". Nothing in history has ever worked so effectively to improve the material condition of human lives as capital-based market economics. Compare South Korea to North Korea. Compare Hong Kong or Taiwan to Communist-controlled China. Compare the historical West Germany to East Germany. ■ Markets don't create the scarcity that people blame for their problems. 
More than anything, they help to peacefully resolve scarcity. It's nonsensical to rely on the boogeyman of "capitalism" as the reason people don't enjoy "eating salmon and berries and telling stories around campfires". Those are choices, and they are enhanced by the production of more salmon, more berries, and more leisure time. Markets do just that. ■ It's ludicrous, too, to romanticize the past: It was often sickly, painful, and extraordinarily violent. That doesn't mean the present isn't too often full of troubles and violence, but it is vastly easier to make the case for peaceable coexistence when one starts from the premise that each individual owns themselves and the fruits of their own labor, whether produced by the hand or by the mind. If you, your life, and your work are valuable, then you have a right to protect them, and the state has a duty to help you do that. ■ Benjamin Franklin phrased it artfully a quarter of a millennium ago: "Is not the Hope of one day being able to purchase and enjoy Luxuries a great Spur to Labour and Industry? May not Luxury therefore produce more than it consumes, if without such a Spur People would be as they are naturally enough inclined to be, lazy & indolent?" ■ Franklin wasn't defending Goldman Sachs when he wrote that; he was identifying a common thread in human nature: That everyone has a hunger to consume, and that it must be satisfied by production. Most alternatives to markets depend upon coercion to get that production done, whether it's the feudal lord compelling work, a Communist government starving 20 million people through forced industrialization, or the chief of a small tribe deciding where and when to hunt. Nothing real in this world comes from consuming "vibes".
Americans are often quite reasonably frustrated by the endless growth in health-care expenses. It's a problem we sometimes would like to believe is all our own, due to the unusual way that health insurance is handled in this country. But in no small part, what frustrates Americans also frustrates people in other countries -- it just manifests itself differently elsewhere. ■ The root cause is that everyone has an incentive to pursue the maximum available coverage for their own health, and the resources to deliver health care are limited by a variety of real constraints. Take, for instance, the NHS in the United Kingdom: It's widely admired and, generally, free to the user. But demand has outstripped supply to the extent that ambulance response times have stretched to shocking levels, well in excess of the NHS's targets (not to mention patient expectations). ■ And nurses and ambulance staff are going on strike, saying they aren't paid enough. Fundamentally, it is the same problem as anywhere else: Health care is an area of almost uniquely limitless demand, and somewhere along the line, that demand runs up against some kind of limitation in supply. ■ High technology often doesn't help the supply side as much as intuition suggests it should, but we may be on the cusp of something new -- if and only if the regulatory and professional environments are prepared to accommodate the possibility. ■ Consider this: For as much as medical professionals criticize patient reliance on "Dr. Google", it seems inevitable that self-service diagnostic tools built on AI platforms and delivered by credible providers will be offered as alternatives to urgent care and walk-in clinics, perhaps at kiosks in drugstores and grocery stores. Lots of tests require facilities (like the ability to draw blood) that most people don't have at home, but which could be provided in fairly frictionless ways at establishments already connected to health care.
■ A human professional would probably still need to review and sign off on the diagnosis, but most of the heavy lifting would be automated. Just as AI is already showing up in radiology, so too could it have a lot to contribute elsewhere in medicine. But a human's guidance still makes enormous sense, just as it remains logical to keep human pilots in the cockpit even though an autopilot can handle most phases of a flight. Humans working in concert with computers are better than either working alone. ■ The trends involved are so powerful that the helping hand of automation seems impossible to ignore. Between known workforce shortages in the medical field, the quest for cost containment, and the rising health demands of an aging population, all of the pressures causing troubles now are set to make things worse unless there is some kind of fundamental change to the systems of provision. More burnout is unfortunately likely to beget even more burnout. ■ We're already growing familiar with telemedicine. Like a lot of other things, it was forced on a lot of people due to the onset of the Covid-19 pandemic, but now it seems far less unusual or far-fetched than it once did. Medical care driven mainly by machines -- call it automedicine -- is almost certainly coming fast behind. It's up to lawmakers to start thinking right now about what "automedicine" will need to look like. If the people in state legislatures and the halls of Congress don't appear to be up to the task, then perhaps patients need to consider the consequences.
Despite its occasional lapses into the ridiculous, the AP Stylebook serves a useful purpose. As a guide for journalists in newsrooms all over North America and beyond, it creates a set of rules that resolve the inevitable conflicts posed by a complex language like English. In standardizing and, to an extent, codifying the use of language, the AP Stylebook makes it possible for people with different backgrounds to find common agreement about what they're discussing. ■ Even when we disagree with others, it's important to have a common base of language from which to differ. That's a value that has become harder to recognize as more and more written conversation takes place not in centralized locations, like Associated Press news articles, but across the vast reaches of the unregulated Internet. ■ On one hand, it's a wonderful thing that so many individuals have basically unfettered access to their own "printing presses". But the absence of a common set of standards for how words are to be used makes it difficult for people to come to reasonable conclusions about the debates that inevitably emerge. ■ Conducting heated debates without open agreement about the rules of the language is like debating the speeds of race cars without resolving the units of measurement first. If one person is using miles per hour, another is using kilometers per hour, and a third chooses meters per second, there's no way they can reach an understanding on the merits. ■ No single source needs to have a monopoly on setting the standards for language, but it would be useful if people could identify themselves based upon the stylebooks to which they adhere. The New York Times has a stylebook. US News, too. There's a Washington Post version, and a [University of] Chicago Manual of Style. Or one could consult the BBC News Style Guide or The Economist Style Guide, for an overseas flair. 
■ The point is that it's too easy to talk past one another when we don't even agree on what words are supposed to mean. Dictionaries can only go so far, saying little or nothing on important matters regarding connotations, weighted language, and the boundaries on appropriate use. Those just aren't the kinds of conflicts that Merriam-Webster can resolve. ■ But it would be useful if people could come to common agreement about what they mean by things like "capitalism" or "socialism", "public health" and "balanced budgets", "democracy" and "authoritarianism". What we need is not one style book for the Internet age, but a few of them, all of which could compete with one another for legitimacy. ■ And the more that people adopted them openly, the more legitimacy they would obtain. Lots of people try to signal some of what they mean with their words by identifying with a party or a creed in their social-media profiles, but it might do more good if they'd only tell the rest of us which stylebook they're using.
National Houseplant Day (January 10th each year) doesn't have the same cachet as most holidays. It's not even a close second to Arbor Day, which is big enough to have its own foundation. But it might be time to consider not just appreciating houseplants for their aesthetics, but expecting a little more work from them, too. ■ While it appears true that plants are capable of extracting volatile organic compounds from the air, their capacity is limited. Most of the successful experiments with using plants to improve indoor air quality have been just that: Experiments. And while the evidence has pointed in the right direction, the problem is one of scale. The houseplants we have now just don't work through enough air to act as indoor air purifiers. ■ Yet a few other things are clear: Trees are big enough to do the job of phytoremediation, removing hazardous chemicals from the soil and water. Lots of people already are predisposed to believe that houseplants can purify indoor air. And it appears possible to genetically engineer plants to do more detoxification than they might without human help. ■ Perhaps what we really need is some innovation to bring together some of that genetic engineering along with interior design and architecture. Green walls aren't particularly commonplace, but there's no reason they couldn't be, if given the right aesthetic and maintenance characteristics. And what about ceilings? What if, in place of popcorn ceilings, builders installed green ceilings composed of plants that would require minimal maintenance? ■ Ceiling space is mostly wasted today; ceilings usually have to be treated just so they provide some acoustic deadening. Tiny houseplants -- or maybe even their rootless cousins, the mosses -- could, hypothetically, be optimized through hybridization or genetic modification to perform always-on, energy-free air purification.
■ Maybe it seems unlikely now -- but it also would have seemed unlikely just a short decade or two ago that we would be building our indoor habitats to feature giant, feather-light televisions and gigabit-speed wireless Internet connections. Some of the same energy and creativity that has gone into making consumer technology better might be put to good use making consumer biotechnology better, too. If walls and ceilings could be making life healthier, shouldn't someone be thinking about trying?
Americans like our acronyms a little too much. We like them so much that people often contort themselves to instill a word with meaning by creating retroactive acronyms ("backronyms", in the words of some). ■ When it was new, the "Tea Party" movement adopted its name in honor of the Boston Tea Party. Later, people retroactively instilled the word "Tea" with a backronym: "Taxed Enough Already". ■ While the movement itself went through a predictable life cycle and has mainly been subsumed into other identities, the "taxed enough already" attitude has persisted. Unfortunately, the claim at its heart is entirely untrue. If anything, the American public is plainly not taxed enough already -- certainly not enough to cover the costs of the things we demand that our elected officials deliver at the Federal level. ■ States and local governments are different: Without the power of the printing press, they're forced to observe a lot more fiscal discipline than Congress is. Virtually all of the states have balanced-budget requirements. Not so at the Federal level, where the budget deficit is a trillion dollars. ■ Accumulated deficits result in debt, which is now at more than $31 trillion. And because the government has a statutory debt ceiling, we're in a dangerous political spot: Congress has already run up the spending, and now it has to account for the fact that deficits create debt, and the world expects us to pay for what we borrow. ■ Reasonable people can disagree about particular spending choices, but it's plain that the far right wing is daffy when it claims we can fix the deficit by cutting discretionary spending (we cannot; the biggest spending categories are basically untouchable entitlements), and the far left is nuts when it claims that we just need to tax "millionaires and billionaires" more (most tax revenues come from ordinary people paying income and payroll taxes, and there aren't enough of "the rich" to soak enough to make up the gaps).
■ There's nothing wrong with persistent budget deficits of a small amount that remains less than the rate of economic growth. We could easily get by with a Federal deficit of 1% or 2% of GDP, especially if some of that spending goes towards productive activities that could enhance economic growth, like building infrastructure. It's the chronic habit of overspending by 5% of GDP that gets us into trouble. ■ And we need to keep some borrowing capacity in reserve for occasional surprise events that require emergency spending -- like wars and pandemics. An emergency is no time to get stingy, especially if the consequences could be existential. And we have been faced with more than a couple of those emergencies, even in recent memory. ■ Nobody wants to admit it, but the truth is that we either have to radically rein in what we expect from our government, or we need to show the maturity to realize that there isn't a magical pot of gold at the end of any rainbow, just waiting for us to raid it. Persistent (and preferably accelerating) economic growth is the best possible way to put our debt to rest, but that growth will be badly threatened if we don't demand an adult resolution to the problem. ■ The debt exists because the deficit already created it. We need to pay what we owe, then set about correcting course and talking about our budget like adults. And unfortunately, we have shown by our voting choices that we are not quite taxed enough already.
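The distinction between a tolerable deficit and a chronic one can be checked with back-of-the-envelope arithmetic. The sketch below is a deliberately simplified, hypothetical model (the function name, the starting debt ratio, and the growth and deficit figures are illustrative assumptions, not official projections): when the deficit runs at a fixed share of GDP and the economy grows at a constant rate, the debt-to-GDP ratio drifts toward the deficit share divided by the growth rate.

```python
# Illustrative sketch only: hypothetical deficit and growth figures chosen
# to show the dynamic described above, not actual fiscal projections.

def debt_to_gdp_path(start_ratio, deficit_share, growth_rate, years):
    """Track the debt-to-GDP ratio when each year's deficit is a fixed
    share of that year's GDP and the economy grows at a constant rate."""
    gdp = 1.0
    debt = start_ratio * gdp
    path = []
    for _ in range(years):
        debt += deficit_share * gdp  # new borrowing this year
        gdp *= 1 + growth_rate       # the economy grows
        path.append(debt / gdp)
    return path

# Starting from debt equal to 100% of GDP, with 3% annual growth:
modest = debt_to_gdp_path(1.0, 0.02, 0.03, 50)   # 2% deficits: ratio falls
chronic = debt_to_gdp_path(1.0, 0.05, 0.03, 50)  # 5% deficits: ratio climbs
```

In this toy model, 2% deficits against 3% growth pull the ratio down toward roughly two-thirds of GDP, while the chronic 5% habit pushes it up toward roughly 167% of GDP, which is the difference between a manageable burden and a growing one.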