Obfuscation Mitigation (Lexicon):


Best Practices (link)

Making sure your blunders are popular ones. Rationally, this term should mean "methods that meet professional standards of competence and due care", but tends instead to be a managerial code phrase meaning "If anything goes wrong, I want to escape being a specific target of blame by pointing out that our hapless cock-up was the same one countless others made, too."

Betteridge's Law (link)

"Any headline that ends in a question mark can be answered by the word 'no'."

British technology journalist Ian Betteridge made this observation in a February 2009 article, reasoning that journalists resort to that style of headline when they know the story is probably bullshit, and don't have the sources and facts to back it up, but still want to run it.

Bike Shed Effect (link)

Social dysfunction syndrome first noted by C. Northcote Parkinson in his 1957 book Parkinson's Law, and Other Studies in Administration, whereby getting permission to build a billion-dollar atomic power plant is easy, but a proposal to build a cheap bicycle shed will founder under the weight of endless discussion. Parkinson noted that, because an atomic plant is so vast, expensive, and complicated that people cannot grasp it, rather than try, they'll fall back on the assumption that somebody else checked details before it got that far. However, everyone knows all about bicycle sheds, and feels no inhibition against debating their pettiest details without limit.

E.g., most technical mailing lists spend at least half their time drowning in redundant posts, trivial quibbles, and offhand opinions of no conceivable public interest. Why? Because people can — and because it's a way for them to demonstrate involvement with near-zero effort.

(Our FreeBSD-community friends have observed "bikeshedding", too.)

Bradbury's Defence (link)

The term is from a droll story author Ray Bradbury used to tell:

One dreadful boy ran up to me and said: "That book of yours, The Martian Chronicles?"
"Yes", I said.
"On page 92, where you have the moons of Mars rising in the east?"
"Yeah", I said.
"Nah", he said.
So I hit him.
Cakeism (link)

The delusion that you can pursue incompatible goals (as in "have your cake and eat it, too"). This coinage originated in 2016 debate around the (disastrous) UK Brexit referendum, where proponents had falsely claimed "leaving" would have both all the advantages of EU free trade and all the advantages of freedom from EU regulations. British voters ended up with neither, having been rendered more isolated, less influential, and poorer by "no-deal" Brexit.

Cakeist thinking prevails in populist rhetoric, on political and economic matters, e.g., the claim that tax cuts for the wealthy will "pay for themselves". Governance and policy often require hard choices. Anyone pretending otherwise is selling something.

The cake is a lie.

Cheeto Factor (link)

Semantic death-spiral, first identified by Fred Clark (1, 2), in which each incrementally attached modifier renders a noun less true, e.g.:

  • "Cheese": real cheese.
  • "Processed cheese": cheese, sort of.
  • "Processed cheese food": cheese, sort of, plus other stuff that's not cheese.
  • "Processed cheese food snack product": the food in question is vaguely orange and squishy, but contains no actual cheese.

The term has obvious broader application, e.g., when Linux Gazette editor Ben Okopnik pointed out that the term "commercial open source" most commonly denotes "not actually open source at all".

Cole's Law (link)

Thinly sliced cabbage.

Compromise (link)

Concept touted by American commentators as an inherently desirable approach to solving other people's problems. (By contrast, all disputes touching on those commentators' own interests are exempt — as clearly entailing "important principles" that must be defended.)

This guideline's Solomonic wisdom can be seen in the hypothetical example of you, the reader (unless, of course, you're American) being attacked by some thug attempting to kill you: A typical American observer might recommend a "fair compromise" of you being left half-dead.

Computer Associates (link)

Place where formerly useful software companies go for zombification after dying. Also seen in verb form, as in "Apologies for the new support and upgrade policies, but our firm was recently Computer Associated."

To elaborate: Computer Associates, Inc. (founded 1976 and bearing several variant names over the years) is one of several US software industry firms best known for eating up established firms in financial distress, in order not so much to continue and improve their established product lines as to "milk" end-customers for maintenance and upgrade fees, while offering degraded product and hollowed-out services. I first noticed this effect when Cheyenne Arcserve enterprise backup software and associated company services became generally dreadful following Cheyenne Software's 1996 acquisition. The pattern continued with other CA acquisitions, of which there were about 200 by year 2000.

Perhaps proving shoddy knows shoddy, in 2018 CA was itself acquired by Broadcom Inc., followed closely by mass layoffs (the scavenger scavenged). If looking for the zombified zombifier these days (but why?), look under its latest variant name since 2015, "CA Technologies, Inc."

Although the same joke could have been told about other zombifiers, such as McAfee, Inc. (née Network Associates, Inc. and whatever), NortonLifeLock, Inc. (née Symantec Corporation), etc., Computer Associates is, as the saying goes, the trope-namer.

Deirdre Saoirse Moen's Law of Management (link)

"Bad managers always outlast good employees."

Management should be expected always to terminate the staffer and protect the manager. Therefore, if you find yourself under such a manager, move sideways to elsewhere, but never, ever file complaints with the company instead.

Why do companies consistently do this, strongly contrary to their long-term self-interest? Some guesses:

  1. Inherently, managers are given more trust than the employees they manage. Thus, in a conflict, upper management will typically back the manager.
  2. Many managers have contractual arrangements that make them very difficult (and slow) to terminate.
  3. Management typically cares not at all about who's the reasonable party, but just wants the immediate problem to vanish. Firing the peon is less troublesome; peons tend not to sue, and the embarrassment and organisational disruption costs are lower.
  4. Relevant to the prior point, firing the manager would implicitly reflect discredit on higher-ups who vouched for him/her.
  5. In cases where the manager might have committed torts or crimes, management often fears terminating his/her job might, ironically, increase company liability for those acts.
  6. Short-term thinking is the rule in most businesses, rather than the exception.

Doctorow's Law of Platforms (link)

Canadian/UK tech journalist and novelist Cory Doctorow observed in January 2023:

"Here is how platforms die: first, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die."

This tendency of commercial Internet services, which inevitably make their profits as middlemen between end-users and business interests, to suck benefits from both ends as they ruin the service for greater ROI, has turned out to have much broader application than just TikTok: to Amazon Marketplace, Instagram, Facebook, Google Search, Twitter, and its wannabe vampire-capitalism replacements (every competitor except Mastodon). In every case, after users get hooked, rapacious monopoly owners gradually degrade and ruin the platform, for the reasons Doctorow cites.

However, this is not inevitable; people just need to learn from the mistake of trusting monopolies to behave like a public utility, stop outsourcing to them, and favour local-first computing that stresses interoperability, the lesson the open source movement has relied upon all along. And, also, get serious about anti-trust, again. And focus on strengthening competition, regulation, self-help (e.g., ad-blocking), and organised labour, the only proven defences against this syndrome.

Dueling Banjos Effect (link)

Term coined by Jim Penny for self-perpetuating Internet prominence caused by feedback loops between search engines and Internet discussion fora. Refers in particular to bizarre and perplexing instances of such freaky fame.

The eponymous example was triggered by one Martin Eldridge's deeply mistaken query on the Debian Linux distribution developers' mailing list (debian-devel), in July 2000: "Could you please send me the sheet music for Dueling Banjos, Regards Martin". Which in turn lent that mailing list high prominence on all subsequent Google searches for "sheet music Dueling Banjos", which led to other people's (completely inappropriate) queries much like Eldridge's, and so on.

To head off the inevitable queries I would otherwise get: No, I don't have that sheet music. Neither does the Debian Project, despite porting efforts.

Duverger's Law (link)

"The simple-majority single-ballot system favours the two-party system." This was French sociologist Maurice Duverger's much-quoted point about voting systems in his 1951 book Political Parties: "simple-majority single-ballot [plurality aka first-past-the-post] voting favors the two-party system", whereas "simple majority with a second ballot [dual-ballot or runoff] or proportional representation favors multipartyism". The latter point about proportional-representation election rules favouring a system with many small parties is often dubbed Duverger's Hypothesis.

Duverger's rationale was that single-ballot plurality rules create incentives for voters to engage in a particular pattern of strategic voting: You perceive an incentive to vote for, and contribute money towards, candidates (and parties) you think can win, in preference to ones you might greatly prefer, thus giving the two most popular candidates (and parties) disproportionate sway, and tending to shut out all others, as voters try to avoid "wasted votes". Decades of study have confirmed Duverger's analysis.

Ways to mitigate the two-party system's characteristic stagnation and rewarding of mediocrity include: approval voting, instant-runoff voting, top two voting, runoff voting, mixed member representation, and (most effectively) proportional representation.
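Of these mitigations, instant-runoff voting is the easiest to show mechanically. A minimal Python sketch (the function name `instant_runoff` and its tie handling are illustrative only; real election codes add explicit tie-breaking and exhausted-ballot rules):

```python
from collections import Counter

def instant_runoff(ballots):
    """Each ballot is a ranked list of candidate names. Repeatedly
    eliminate the candidate with the fewest first-choice votes until
    one candidate holds a majority of the remaining ballots."""
    remaining = {c for ballot in ballots for c in ballot}
    while True:
        # Count each ballot toward its highest-ranked surviving candidate.
        tally = Counter(
            next(c for c in ballot if c in remaining)
            for ballot in ballots
            if any(c in remaining for c in ballot)
        )
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > sum(tally.values()) or len(remaining) == 1:
            return leader
        remaining.discard(min(tally, key=tally.get))

# Four voters rank A first, three B, two C; C's voters prefer B next,
# so after C's elimination B wins 5-4 despite trailing in round one.
winner = instant_runoff(
    [["A", "B", "C"]] * 4 + [["B", "C", "A"]] * 3 + [["C", "B", "A"]] * 2
)
```

Note that the round-one plurality leader (A) loses: exactly the difference between this and first-past-the-post, and why "wasted vote" strategic incentives weaken under such rules.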

Edwards's Law (link)

"You cannot apply a technological solution to a sociological problem." Nobody seems to know who Edwards was, but pretty much the entire system administrator profession rests on the implicit assumption that he/she was egregiously mistaken.

This plausible-sounding but empty-headed dictum is most often referred to as "Edwards' [sic] Law" — by the depressingly huge mass of semi-literates unable to correctly write possessives of singular nouns ending in "s".

ENOPATCH (link)

Laconic expression meaning "You have failed to include substantive, useful content, among all that verbosity."

The expression was (to the best of my ability to tell) originally coined by Alan Cox in private mail copied back to the Linux kernel mailing list, on 2000-10-16, simulating for humorous effect a parser error ("Error: no patch") — thus advising a correspondent he'd omitted his source code patch. Some subsequent posters have used it in its current, figurative sense, e.g., Randy Dunlap's sardonic response to Luke Leighton's meandering advocacy post.

(No doubt related is Alexander Viro's similar creation: "Backwards compatibility is nice, but preserving every undocumented quirk that nobody sane would use... Sorry, but we really need an addition to errno.h: EBITEME. Exactly for such cases.")

Fisher Principle (link)

"Any national government is an insurance company with an army." This sage observation was first made by US Undersecretary of the Treasury Peter Fisher, in 2002, recognising that the great bulk of Federal spending is on health care, retirement, and the military.

When national politicos, such as the UK's PM Liz Truss and Chancellor of the Exchequer Kwasi Kwarteng, forget that they're CXOs of an insurance company with an army, and figuratively fly the plane into terrain, bad things ensue.

This quip may have originated with Michael Holland at OMB, circa 1999, as he is credited with (at some point) having said "It helps to think of the government as a very big insurance company with an army."

Frogery (link)

Yet another new and fabulous Internet invention, a "frogery" (alternatively, "froggery") is a forged Usenet posting (or, by extension, e-mail or Web site) whose address was crafted to be visually as indistinguishable as possible from that of its intended victim — substituting, e.g., "1" for "l" or "0" for "O" — to either slur the victim by association or troll him/her into complaining to the froger's Internet provider or a public forum, and thereby look stupid.

The term originated on the Usenet newsgroup news.admin.net-abuse.usenet in the late 1990s. The best known frogery episode was occasioned by an obscene January 13, 1997 soc.culture.thai post from "Lawrence Godfrey", leading the better-known Dr. Laurence Godfrey (whose e-mail address was used in the frogery) to file a defamation action against Demon Internet Ltd. for failing to remove it from the company's news spool when so requested.
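The look-alike substitutions described above are trivial to mechanise, which is part of why frogeries worked so well. A short Python sketch (the `HOMOGLYPHS` table, the `froggify` name, and the sample address are all illustrative, not a real tool or a real address):

```python
# A few of the classic look-alike swaps: digit one for ell, zero for oh.
HOMOGLYPHS = {"l": "1", "I": "1", "O": "0", "o": "0"}

def froggify(address):
    """Return a visually near-identical spoof of the given address."""
    return "".join(HOMOGLYPHS.get(ch, ch) for ch in address)

spoof = froggify("laurence@demon.example")  # "1aurence@dem0n.examp1e"
```

In a proportional font, the spoof is nearly indistinguishable from the original, which is the whole point of the trick (and a good reason to examine suspect headers in a monospaced view).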

Goodhart's Law (link)

"When a measure becomes a target, it ceases to be a good measure."

Managers (both public and private) love metrics, and often forget that, the moment simple-minded metrics (test scores, quantity of trouble tickets closed per week, stress-testing numbers for financial institutions, etc.) get written into policy and used to decide who gets promoted or rewarded, achieving good metrics rather than good results becomes the prime objective, as gamesmanship interferes with the original intention.

This adage is attributed to British economist Charles Goodhart, who, in a 1975 article about monetary policy, observed "Any observed statistical regularity will tend to collapse, once pressure is placed upon it for control purposes."

Grossman's Law (link)

"In time of crisis, people do not rise to the occasion. They fall to the level of their training." (This was inscribed on my Operations Department's whiteboard — to remind staffers how important conditioned response and advance preparation are.) This dictum slightly adapts what former US Army soldier, US Army Ranger School graduate, and West Point professor Lt. Col. David Allen Grossman (ret.) wrote about soldiers in combat, inside his well-regarded 2004 book On Combat: The Psychology and Physiology of Deadly Conflict in War and Peace, where he attributed that saying to an unnamed USMC gunnery sergeant ("gunny").

(The original of this observation may or may not lie with a semi-legendary Greek lyrical poet and soldier named Archilochus, who lived in the 7th Century BC, the pre-Classical Archaic Period, on the island of Paros: Attribution to Archilochus of "We don't rise to the level of our expectations; we fall to the level of our training" is widely asserted but without sourcing. In any event, an observation as basic as Grossman's/Archilochus's probably has a murky origin.)

Another keen observer, Louis Pasteur, commented in a lecture at University of Lille (Dec. 7, 1854) that "In the fields of observation, chance favors only the prepared mind". («Dans les champs de l'observation, le hasard ne favorise que les esprits préparés».) However, Grossman's point, beyond that, is that if you want to reliably do the right thing in any stressful, exigent situation, you need to practice for it.

Hawthorne Effect (link)

Initial improvement in a process of production caused by obtrusive observation of that process. Industrial researchers studying Western Electric's Hawthorne Works in Cicero, Illinois (in 1927 - 1932) noticed that, when they raised lighting levels as part of a visible study of worker activity, worker efficiency went up. They lowered lighting levels: Efficiency went up again. They changed the humidity: Efficiency rose. And so on. The lesson seemed to be, narrowly speaking, that workers do better when they think somebody actually cares about and notices achievement — or, more broadly speaking, that work is strongly influenced by social factors.

IMVAO (link)

In My Very Arrogant Opinion. This is a gentle parody of the ostentatiously aw-shucks Internet expression "IMHO" = In My Humble Opinion (as if people actually needed to continually apologise for alluding, on the Internet, to their personal views). See also Moen's Second Law of Debate.

Internet Chat Room (link)

A journalistic jargon term used to vaguely denote an archetypal bogeyman threat to society claimed to lurk at some unspecified location on the Internet. Generally, the reporter neither knows nor cares what specifically he/she is referring to: The citation is symbolic; facts don't matter. They're (unspecified) places about which nothing else is ever said except to imply that they morally imperil the nation's youth, function as virtual dark alleys for pornographers and criminals, etc.

As a reference to a specific on-line medium, the term "chat room" originated with America Online, referring to that non-Internet-based private service's real-time discussion feature. Reporters immediately began speaking of these (typically very down-market, not very literate) forums as "Internet chat rooms", despite their having nothing to do with the Internet.

Much more recently, some actual Internet services have been packaged as "chat rooms" to appeal to wandering AOLers: instant messaging conference services from AOL, MSN, and Yahoo, Web sites' Java or Javascript-based real-time discussion functions, etc. Other real-time or asynchronous Internet discussion media simply aren't called "chat rooms" by their users, but that doesn't stop journalists: IRC channels, Jabber conferences, Web forums, Web bulletin boards, e-mail mailing lists, Usenet newsgroups, and many others all get smeared with that rather disreputable and insulting (not to mention vague) label.

Next time you see a reporter refer to an "Internet chat room" as (inevitably) where something evil happened, wager $20 that he/she can't cite its specific location and communications protocol. Ninety-nine times out of a hundred, you'll make money.

MCSE (link)

Acronym standing for "Moen Coffee Service Engineer". Early in my consulting career, two of the more penurious firms for whom I subcontracted would assign feckless (and Microsoft-centric) youth to "help" me — though these particular gentlemen knew little about network technology and, if anything, perennially slowed me down. Probably, those firms expected to have the young'uns observe, and then replace me at a small fraction of my hourly billing rate.

Since they weren't any use for client work, I'd always just send them out for coffee.

(Yes, thank you, I have indeed heard the hoary Must Consult Someone Experienced variant.)

Moen's Corollary to Shapiro's Law of Communication (link)

"Communications companies never communicate with one another." This tendency comes forcefully to the attention of anyone doing Internet provisioning for business. (Elise Shapiro's Law of Communication says: "The more methods people have for getting in touch with them, the more difficult they are to reach.")

Moen's Law of Bicycles (link)

"Good customers make for good products". This is my explanation for why an ignorant customer base causes merchandise quality to decline, on account of unhealthy market dynamics, e.g., in retail computer hardware and software. In the mid-1970s, bicycles suddenly became very popular in the USA. The masses suddenly entered the market, few knowing anything about bicycles. Many could distinguish poorly if at all between good equipment and bad; good customer service and bad. Consequently, poorly made bicycles (which cost less to make) undercut well made ones (and poor customer service out-earned the good variety), because superior value ceased to be perceived. Over time, overall quality of available bicycles declined considerably, almost entirely because of this dynamic with customers, recovering only after the fad ended, years later.

Quality thrives only when people can tell the difference. When they haven't a clue about products and how they work, schlock merchandise prevails. One can see this process at work in retail computing gear and software: People who know least about computing always insist most on achieving bottom dollar. In a way, this is understandable: You want to exercise control over the process, and, if you're dirt-ignorant about computing, the only place to exercise control is over price. Gradually, this effect tends to drive good merchandise out of the market entirely, leaving a generous selection of cheap crud.

This "law" originated in a November 1995 editorial I wrote for the San Francisco PC User Group, where I attempted to explain why, by becoming willfully ignorant computerphobes and thereby ignoring the birthright of all computer user groups, they were particularly responsible for the decline of quality in available hardware and software.

Objections to Moen's Law of Bicycles:

  • "You've merely restated Gresham's Law." No. In its literal sense, Gresham's Law says that less-base (higher intrinsic value) metal coins tend to disappear from the market, because people preferentially retain them, and instead use in commerce baser-metal ones bearing the same face value. It is also sometimes cited metaphorically to claim that "Bad X inevitably drives out good X" — but this is mere empty-headed nihilism, absent some explanation of by what mechanism inferior examples of X will allegedly predominate. Moen's Law of Bicycles differs by detailing one such mechanism, under what conditions it arises, and, equally important, how to prevent it. E.g., bad 35mm cameras have never driven good ones off the market (or rendered them artificially expensive specialty gear), because a large fraction of wise purchasers continue to make a point of understanding what they're buying.

    (George Akerlof ably described the psychology and economics entailed with the end-result of such market changes, in his 1970 paper "The Market for Lemons: Quality Uncertainty and the Market Mechanism", as an example of asymmetrical information theory, but didn't give the syndrome a specific name.)

  • "But if I just want a bike for the grandkids, why should I have to buy a 2lb. $3,000 aluminum racer?" This misses the point in two ways: 1. I never recommended the high end. 2. The 1970s' influx of ignorant customers made good cheap bicycles rare, every bit as much as the more-expensive ones: An unhealthy market dynamic has corrosive effects at all price levels.

Moen's Law of Comedy (link)

"Dollar for dollar, irony is still your second best entertainment value, right after Usenet kooks."

Moen's Law of Corrections (link)

"Any post critical of spelling, grammar, or punctuation errors will acquire a few. If Great Murphy is feeling particularly cruel, you will end up misspelling 'misspelled'."

Moen's Law of Clueless Newbiehood (link)

"The whiniest new users, and especially the ones who claim your sole purpose in life should be to help them, always post using MS Outlook (or Outlook Express)."

I first noticed this correlation on Linux mailing lists and newsgroups. (The point, in part, is the irony inherent in demandingly seeking intensive personal assistance on a public Linux forum, using a particularly notorious Microsoft application.)

Moen's First Law of Debate (link)

"No matter what the issue, someone will try to turn it into a personality dispute." In part, this is because complex issues become more comprehensible if you convert them into soap opera, however inaccurately. Additionally, it's common for someone who's losing an argument to misbehave in hopes of debasing the discussion, scoring a draw in the eyes of casual observers, and tarring the opposition by association.

Moen's Second Law of Debate (link)

"There's nothing quite as sublimely silly as publicly professing one's humility." I get the giggles when I see people making a big deal in public of how self-effacing they are.

Moen's Law of Documentation (link)

"The more you write, the less they read."

Although any piece of writing can be improved, even the best examples, especially of technical writing, no matter how excellent, will garner requests for more detail — far past the point of reason. Why? Because, most often, a questioner's immediate reaction (to not instantly understanding) is to claim that insufficient information was provided, whether such is true or not. The longer and more detailed any subsequent, further explanations are, the more difficulty target readers will have in finding what they need, and the more they'll demand an even thicker forest of explanations to get lost in.

Thus, greater conciseness often does much more good than do longer & more detailed explanations. Or, what might be needed is better indexing, or following the classic journalist's inverted pyramid format, or the short answer / long answer format I often use — or just a polite suggestion to Read the Friendly Manual (or Search the Friendly Web).

Moen's Law of Hardware (link)

"Use what the programmers use."

After years of helping people with hapless computer-hardware woes, especially trouble-prone categories such as Linux on laptops, exotic peripheral interfaces, etc., it occurred to me to wonder why I never had such problems. It was mainly because of instinctive avoidance of dodgy, exotic, new, and/or badly designed components — which happens to track strongly with programmers' characteristic prejudices. There's a logic to that, which may not be immediately apparent to many:

Drivers for hardware don't emerge like Athena from the head of Zeus: Someone has to create them. Especially for open-source OSes such as Linux, this involves a chipset being brought to market, for it to be out long enough for coders to buy and start using it, and for them to (if necessary, absent manufacturer cooperation) do the hard work of reverse-engineering required to write and debug hardware support. Then, the resulting code filters out to various OS distributions' next releases, and thus eventually to users.

It follows that, if you blithely buy what's new and shiny, or so badly designed or built that coders eschew it, or so exotic that coders tend not to own it, it will probably have sucky software support, especially in open source. (Proprietary drivers can be written under NDA, often before the hardware's release, while manufacturer help is routinely denied to the open source world.) Conversely, if you buy equipment that's been out for a while, doesn't suffer the (e.g., Diamond Multimedia in the early 2000s) problem of chip-du-jour, is bog-standard and of good but not exotically expensive quality, it will probably have stellar driver quality, because coders who rely on that hardware will make sure of that.

Thus, it's very common for slightly ageing but good-quality gear to outperform and be more reliable than the latest gee-whiz equipment, because of radically better software support — not to mention the price advantage.

Ergo, in 1999, instead of buying a current-production laptop to run Linux on, I bought, used, a Sony VAIO PCG-505TX, because I knew several Linux kernel coders had been using those as primary machines. Performance and stability have been exceptional.

More broadly, if you can identify the types of gear programmers would favour — and avoid — you'll be ahead of the game. Coders would avoid winmodems / winprinters, brand-new 3D video chipsets, cheesy and unsupported SATA "fakeraid" chipsets, low-end scanners reached through parallel ports ganged to ATAPI ganged to SCSI chipsets, cheap multifunction scanner/printer/fax boxes, hopelessly proprietary USB aDSL bridge cards, etc. They would favour parts of standard interface, command-set, and chipset design and high enough quality that they might be reused in multiple machines over a long service life.

Moen's Law of Inefficient Immolation (link)

"Murder-suicides never quite seem competent to do it in the right order." I first noticed this effect in its strict literal sense, concerning some miscreants who murdered my congressman. However, in its main, metaphorical sense, this law describes screw-ups' tendency to wreck as many other people's affairs as possible before, themselves, flaming out.

(People who suddenly realise that they're misbehaving, especially when cornered, sometimes seem caught in a trap of escalation intended to somehow justify themselves. My implication is that it may be better for all concerned to quash the syndrome, prior to its climax.)

Moen's Law of Licensing (link)

"Any MyCompanyName Licence is defective and dangerous until proven otherwise — honourable exceptions being Affero, Mozilla, ISC, Apple, and some others too obscure to list."

A friend had been using in personal YouTube videos digital images he selected from Pixabay in the good-faith belief that they were Creative Commons licensed. It turns out, prior to January 9, 2019, they were CC0 licensed, but are no longer. On that day, Pixabay switched all images to its new, bespoke Pixabay License, which states that it "does not allow [...] distribution of Content as digital Content", which is precisely what my friend and pretty much every Pixabay image user does, and what the images are there for. Shooting its own users in the pedal extremities was presumably not Pixabay GmbH's aim, but they did so anyway, something all too common with these "crayon licences" (to use Bruce Perens's term).

Moen's Law of Littering (link)

"People tend to litter where there's already litter." Has obvious application to any instance of socially reinforced standards, e.g., netiquette, S/N ratio on technical forums, etc.

Moen's First Law of Security (link)

"It's easier to break in from the inside." E.g., many Internet break-ins result from masquerading as a legitimate user to gain user-level access, e.g., with sniffed passwords. The attacker then has a dramatically wider selection of system weak points he/she can attack, compared to penetrating from outside.

Moen's Second Law of Security (link)

"A system can be only as secure as the dumbest action it permits its dumbest user to perform." Your users are often your weakest link; smart bad guys will attack you there (e.g., via social engineering or stolen tokens). Smart admins will try to compensate for this tendency, e.g., by using multifactor authentication instead of only passwords, and other measures.

Moen's Third Law of Security (link)

"Malware is not a security problem; malware is a secondary after-effect of a security problem."

People who focus on particular exploits against particular vulnerabilities (or worse, software packages like "anti-virus software" that do so) have already lost the security battle, because they aren't focusing on what's important — which is correcting their own strategic errors that make those recurring vulnerabilities possible (and inevitable). Marcus Ranum described what is important perfectly, in his essay "What Sun Tzu Would Say":

  • Run software that does not suck.
  • Absolutely minimize Internet-facing services.

If you have to keep chasing after holes in the same hopelessly bad software (PHP, Acrobat Reader, Adobe Flash, Java Web applets, WordPress, AWstats, wu-ftpd, lpd, etc.) — or, worse, paper over that underlying cause with anti-malware software — then you're addressing the wrong problem.

The computer-security advice Ranum attributes to Sun Tzu bears repeating, too:

If you are fighting a losing battle, it is likely one of three things:
a) You are continuing a trend in a losing war — and therefore should not be surprised.
b) You have chosen to fight the wrong battle.
c) You are stupid.

Moen's Fourth Law of Security (link)

The way most people use the word, "secure" has exactly the same semantic value as "minty fresh" (i.e., none at all).

The concept of something being "secure" or not is nonsensical: Realistically, security is a heuristic estimate of probable exposure to particular risks within a particular threat model. (Secure against what? With what configuration? Under what operating conditions and with what usage modes?) Therefore, you cannot speak meaningfully about security without a proper understanding of the software/hardware and situations, and the underlying threat model.

Even then, the concept is probabilistic, and relative. People who talk about something being "secure" or not as an absolute property are selling something, are seeking implied permission to turn their brains off, or both.

(The word is sometimes used as a synonym for "encrypted", e.g., in "secure HTTP": That is a bad habit, as the usage hides assumptions about integrity of the endpoints, crypto implementation, and authentication that may be unjustified.)

POTS (link)

Plain Old Telephone Service, as opposed to the fancier digital stuff I partly earn my living from. Regular analogue telephone calls, in other words.

Pournelle's Iron Law of Bureaucracy (link)

"In any bureaucratic organization, there will be two kinds of people:

First, there will be those who are devoted to the goals of the organization. Examples are dedicated classroom teachers in an educational bureaucracy, many of the engineers and launch technicians and scientists at NASA, even some agricultural scientists and advisors in the former Soviet Union collective farming administration.

Secondly, there will be those dedicated to the organization itself. Examples are many of the administrators in the education system, many professors of education, many teachers union officials, much of the NASA headquarters staff, etc.

The Iron Law states that, in every case, the second group will gain and keep control of the organization. It will write the rules, and control promotions within the organization."

(Jerry Pournelle (1933-2017) was an American scientist, award-winning science fiction writer, essayist, journalist, technology columnist, and one of the first bloggers.)

Powers's Law (link)

"Truth is far, far stranger than fiction." Fantasy & historical fantasy author Tim Powers, known for his painstaking research, mentioned discovering that the classic 1960s hippy-slang term "groovy" actually originated not in the swinging Sixties but rather in the Roaring Twenties, and was merely revived 40+ years later — but that nonetheless writing "groovy" into 1920s historical fiction dialogue would feel so anachronistic that it'd jar readers right out of the narrative. Powers observed at the 2011 Worldcon in Reno that, at minimum, he'd need to annotate said dialogue with a footnote saying "Uh-huh!"

Prescriptivist / Descriptivist (link)

Code phrase meaning "I don't understand grammar, punctuation, or usage, but want to have an opinion on the matter anyway."

Copyediting is all about clarity, and avoidance of (unintended) ambiguity, in written prose. Good usage is clear, as precise as intended, and difficult to mis-read. It is helped in that by good punctuation, which "fences off" possible misinterpretations and misparsings, while at the same time avoiding calling attention to itself. Editing professionals know those things intuitively, and tweak prose accordingly — not wasting time on irrelevant ideological battles or dumb question-begging about whether our always-evolving language(s) should be "allowed to change".

Used in lexicography, the terms have some meaning in theory, but in practice tend to be just name-calling phrases meaning "I don't like these people's policies."

Rumplestiltskin Effect (link)

The error of merely giving something a name and imagining that one thereby understands it. (You may recall that guessing that dwarf's name, in the eponymous Grimm Brothers fairy tale, had magical effect.)

I coined this term, as a boy, after hearing entirely too many technical terms offered by way of alleged explanation that, upon examination, explained nothing. The field of medicine (along with many other sciences, such as biology) is chock-full of examples, e.g., "hyperemesis gravidarum": A pregnant woman who visits a doctor because she is suffering near-constant vomiting is often knowingly told "Ah, well, you turn out to have hyperemesis gravidarum" — but that is merely Latin for "throwing up excessively while pregnant". (In fairness, relatively recent medical science has suggested this is a result of hormone fluctuations, especially in the first trimester, but the point is that the name itself explained nothing, and for ages doctors offered no credible mechanism, just the name.)

At the age of nine or so, I thought my coinage was wonderfully clever because it exemplified itself: Calling a psychological phenomenon "Rumplestiltskin Effect" assigns it a name implying a degree of insight that, upon examination, isn't really there. (I'd never heard of recursion, let alone recursive humour, and thought I'd invented the concept.)

Second Idiot Effect (link)

The syndrome of being quoted/cited as an equal and opposite loon, to "balance" the speaker/writer's citation of some other loon.

This term was coined by (my late friend and) professional magician James "The Amazing" Randi, who noted that reporters would seek quotable comments from him or some other skeptical critic on some outlandish claim, not in any hope of resolving the factual question, but rather to make the reporter seem "objective" by giving time to wackos on both sides. (In fact, Randi noticed that his and other skeptical commentators' remarks that did tend to clear up mysteries got left on the cutting room floor.)

In a time when reporting and analysis often go no deeper than "Colourful Person A claims [outlandish thing], while Colourful Person B says 'Rubbish'", thus reducing the factual claim to a display of ideological posturing, it's well to remember that you may have been consulted for no better reason than serving as someone's second idiot, and invent some means of breaking out of that assigned role.

Self's Law (link)

"Large, low-entropy pools are inherently dangerous."

Karsten M. Self originated this observation in the 1990s. Here's an example of his comments on this syndrome, following the attack that destroyed the NYC World Trade Center: "Firm belief that large pools of low entropy are inherently dangerous: tall buildings, large crowds, nuclear power, comprehensive databases, absolute power, monopolies. Seek the mean, keep energies and potentials balanced. Bipolar constructs are inherently more stable than monopolar (hegemonical) ones, and multipolar (diversified) structures better than both. That's not total anarchy — nexuses of power or control within a larger pool are OK, and virtually requisite. Should probably add universal networks and software monocultures to the list, as well."

Vodafone Greece furnished, in 2005, a fine example with its large, invisibly tappable digital access to all cellular telephone traffic in Athens.

Someone to Sue (link)

Euphemism used in business, referring in reality to the concept of having (or lacking) "someone to blame".

For decades, business critics have debunked the wrong-headed notion of, e.g., eliminating as candidates any otherwise reasonable software solutions that lack "someone to sue" behind them. For one thing, businesses never sue publishers of (non-custom) software for product inadequacy, and, given the disclaimers and immunity clauses typical in software EULAs, could not prevail if they did so. So, critics would point out, eliminating from consideration highly standard and competitive open-source options for merely lacking "someone to sue" made no sense — and separately would point out that warranty coverage can and should be arranged on an "a la carte" basis, if desired for such options.

However, critics have failed to grasp the finer meaning intended when that term is used in business: The speaker actually means, but would be embarrassed to declare outright, that he/she needs a plausible target of blame, to use in disposing of complaints: "Oh, that's just standard with [buggy proprietary product], and can't be fixed." See also: Best Practices.

Stepwise Disaster (link)

A common but seldom-discussed failure mode of organisations, in which each of several people innocently performs a task he/she believes is the right thing, none of them seeing the overall picture, resulting in adverse results for all. For example, I have instructions to put all our most valuable parts in an old container in the storage room, for safekeeping. The next day, you have instructions issued by someone else to dispose of the accumulated old containers in the storage room. Jointly, our actions will cost the firm dearly.

Stepwise disasters are pernicious because each party concludes, afterwards, that someone else blundered, because clearly he/she carried out his/her own part just fine. Therefore, they tend to recur.

Related but not quite the same thing is the notion of cascade failure, in which a stressed system, when it fails, causes breakage of multiple key components in sequence, as the stresses shift from each breaking piece onto the next key one. (As an organisational metaphor, this can refer to hot-potato problems wrecking people, groups, and other resources as they get bounced around.)

A truly epic stepwise disaster, if augmented by assistance from (say) middle management, might escalate into a goat rodeo (q.v.), or, as Urban Dictionary put it:

A chaotic situation, often one that involves several people, each with a different agenda/vision/perception of what's going on; a situation that is very difficult, despite energy and efforts, to instill any sense or order into.

Tactical Stupidity (link)

Sudden, total inability to comprehend a datum one finds inconvenient — frequently employed by organisational staffers to defeat or delay change. E.g., "I just don't understand these SSH and IMAP-SSL settings. Can't we just go back to telnet and POP3?" Or: "This LibreOffice office suite is too difficult; I just can't get my work done. Why can't we have Microsoft Office back?" The tactic tends to be successful at conning technical administrators and support people, because they cannot imagine someone deliberately feigning incompetence (since, from their perspective, that would be both shameful and unprincipled).

The tactical variety can be distinguished from genuine stupidity by vanishing, the instant those supposedly unable to cope start getting replaced.

Related observation from 2006: Al Gore (quoting Upton Sinclair) observed in "An Inconvenient Truth": "It's hard to understand something when your paycheck depends on your not understanding it."

TINLA (link)

This Is Not Legal Advice. Generally paired with "IANAL" = I Am Not A Lawyer.

In every country of my acquaintance, anyone other than a lawyer accredited in the relevant jurisdiction and acting in a (mandated) attorney-client consultation context is prohibited from giving individualised legal advice about a querent's real-world legal problems — that being a regulated monopoly. Lawyers will essentially never answer such questions on the Internet, a setting whose circumstances prevent compliance with professional ethics, and where (typically) the required attorney-client relationship doesn't exist at all (not to mention payment).

Despite these background facts being really obvious, hordes of Internet denizens continually ask free legal advice of random Internet amateurs. One's prudent response, if any, is instead to give general comments about the shape of the law & likely legal procedures — something anyone may do, which fortunately differs from giving legal advice and is arguably also a conversation vital to citizenship. Hence the formulation "IANAL, TINLA", clarifying that "No, seriously, I'm not crazy enough to give you legal advice in violation of Unauthorized Practice of Law statutes, even if you're unwise enough to ask for it. Nonetheless, here are thoughts about general legal provisions as I understand them; it's up to you to decide how they apply to your situation, and maybe you should actually pay to see a lawyer."

Whitaker's Corollary to Hanlon's Razor (link)

"Never attribute to malice that which can be adequately explained by the actions of an overstressed under-caffeinated system administrator in a hurry." (Words of wisdom from Mike Whittaker.)

Zebra Hunting (link)

Diagnostic failure mode, in which investigation goes wrong through failure to consider obvious, simple explanations first (as suggested by Occam's Razor). As celebrated medical researcher Theodore E. Woodward, Chair of Medicine at University of Maryland's School of Medicine, used to advise interns: "When you hear hoofbeats, think of horses, not zebras." When helping computerists diagnose problems, I must sometimes intervene to halt energetic, futile hunts for imaginary zebras.

Tip of the hat to Katherine Ottaway, MD, for her light verse poem Catching Zebras, which also bears, if mischievously, on this matter.


Last modified: Oct. 27, 2023, by Rick Moen, rick@linuxmafia.com.
Copyright © 2000-2024, Rick Moen. (Quotations are copyright by their respective authors.)


-end-