Gongol.com Archives: May 2021
Microsoft has made it official: As of June 2022, Internet Explorer is history. While there's very little reason to weep for the soon-to-be-departed browser itself, its sunset does bring to mind what things were like at its dawn. ■ When Microsoft first introduced Internet Explorer (MSIE) in 1995, we were a long way from the contemporary Internet as we know it. 54% of American adults used computers at the time. Not the Internet, not particular devices, not some social-media platform: They used computers at all. By comparison, in 2021, 69% of American adults use Facebook, and 81% use YouTube. The percentage who used any online platform in 1995 barely cracked double digits. And, in the words of a 1995 Pew survey, "Only one in five of all online users (3% of Americans) have ever signed onto the Web." It's hard to communicate how much of a difference a single quarter-century has made. ■ At the time, it was widely heralded as the Information Age. It was more like a little-bit-of-information age. But the prospect, at least, seemed grand. ■ Not long after MSIE came along, it became the subject of a bitter antitrust battle. Indeed, it almost seems quaint now that so much effort went into a fight over browser usage, but it was a massive battle at the time. ■ The battles today are fought not over browser usage, but over issues like user time, attention, and personal data. It's no longer Bill Gates in the hot seat; it's Mark Zuckerberg and Jack Dorsey. But in the quarter-century of MSIE's existence, something else has happened: We've landed in a confused economic state. Nobody really knows what to call it -- a problem explored at some depth in Ben Sasse's book, "Them: Why We Hate Each Other and How to Heal". His lament is that our inability to name the thing has the same effect as Frankenstein's monster: frightening in its haziness. In Sasse's words, "A powerful entity with no name is unnerving." ■ What we should call it is the "Teach-Yourself Economy".
It's not the arrival of abundant information, per se, that dramatically changes the economy. This may be an "information age", but information alone isn't doing the work. Instead, what makes this era particularly disruptive is that there is a lot to learn, a deficit in our systems and frameworks for teaching it, and a significant amount of turnover in who's doing the work. As a result, there's far more autodidacticism in far more places than there ever was before. ■ Periodically, American policymakers have made big moves to help people teach themselves. The most prominent example was the rise of the land-grant university and its offshoot, ag extension. But that model far predates the Internet. Furthermore, it's hard to communicate just how much workplace knowledge and experience is going out the door as the Baby Boom generation retires en masse. The oldest members of Generation X are now in their mid-fifties, which means they are filling senior management and leadership positions -- but the numbers are inescapable: There aren't nearly as many Gen-Xers as Boomers, so even if institutional knowledge were being formally handed down, there would be far more prospective teachers than pupils. ■ Thus, of all the skills that might be valuable in the modern world, the one with perhaps the highest payoff is the ability to teach yourself. It might be easy to learn basic activities from a YouTube or TikTok video (and, indeed, satisfying those emerging curiosity gaps is a business model for some) -- but anything teachable in an "I was today years old" tweet or an "adulting" video isn't something the market is eager to compensate highly. The really valuable things that need to be learned are complex and often intersect multiple fields at once.
■ And, paradoxically, the way in which the Internet has developed means that much of what happened (or even what was written or recorded) before about the year 2000 is simply not to be found online -- even if it contains hugely valuable knowledge. Despite the best efforts of the good people of the Internet Archive, there are massive troves of proven, detailed, and authoritative information that are orphaned because they appeared in print instead of pixels. ■ We're fortunate that standards overcame the early browser wars and delivered us from a world in which websites were functional only on Mozilla or MSIE (you can thank the W3C for that). But as we prepare to send Internet Explorer off into that good night, it's worth grappling with the reality that the Teach-Yourself Economy is here to stay. The future belongs not to the country with the cleverest app developers (for their skills come and go), but to the society that learns and adapts quickest without forgetting what its people once knew. We've had enough time being enamored with the technology itself; it's well past time to return it to its proper place as a tool.
Given the option, it's best to pursue a double major in college, preferably in two very different fields. The really interesting stuff happens in the overlaps -- like where lawyers have to figure out statistics or meteorologists have to understand sociology.
It's a rare political breed, but that shouldn't stop us
It's unusual to see that much of the country under one contiguous hazard area
Alas, em/en dashes don't copy and paste well into HTML, so those who write for digital publication of any sort are often hamstrung by the incompatibility if they're also trying to remain W3C-standards-compliant.
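One workaround -- a minimal sketch in Python, with the function name and entity table being illustrative assumptions rather than any standard tool -- is to convert the dashes to their named HTML character references before publishing, so the characters survive regardless of how a page's encoding is handled:

```python
# Named HTML character references for the dashes that tend to get
# mangled when pasted into digital-publishing tools.
DASH_ENTITIES = {
    "\u2014": "&mdash;",  # em dash
    "\u2013": "&ndash;",  # en dash
}

def escape_dashes(text: str) -> str:
    """Replace literal em/en dashes with their HTML entities."""
    for char, entity in DASH_ENTITIES.items():
        text = text.replace(char, entity)
    return text

print(escape_dashes("standards\u2014and the W3C\u2013era web"))
```

Run through a filter like this, the output validates cleanly against W3C standards because `&mdash;` and `&ndash;` are defined named references in HTML.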
An extraordinary testament to the vaccines.
AFP: "Building collapses are not rare in China, where lax construction standards and breakneck urbanisation over recent decades has led to buildings being thrown up in haste." ■ Related: Mario Salvadori's "Why Buildings Stand Up" is a delightful book.