- Best Practices (link)
Making sure your blunders are popular ones. Rationally, this term should mean "methods that meet professional standards of competence and due care", but tends instead to be a managerial code phrase meaning "If anything goes wrong, I want to escape being a specific target of blame by pointing out that our hapless cock-up was the same one countless others made, too."
- Bike Shed Effect (link)
Social dysfunction syndrome first noted by C. Northcote Parkinson in his 1957 book Parkinson's Law, and Other Studies in Administration, whereby getting permission to build a billion-dollar atomic power plant is easy, but a proposal to build a cheap bicycle shed will founder under the weight of endless discussion. Parkinson noted that, because an atomic plant is so vast, expensive, and complicated that people cannot grasp it, rather than try, they'll fall back on the assumption that somebody else checked details before it got that far. However, everyone knows all about bicycle sheds, and feels no inhibition against debating their pettiest details without limit.
E.g., most technical mailing lists spend at least half their time drowning in redundant posts, trivial quibbles, and offhand opinions of no conceivable public interest. Why? Because people can — and because it's a way for them to demonstrate involvement with near-zero effort.
(Our FreeBSD-community friends have observed "bikeshedding", too.)
- Bradbury's Defence (link)
The term is from a droll story author Ray Bradbury used to tell:
One dreadful boy ran up to me and said: "That book of yours, The Martian Chronicles?"
"Yes," I said.
"On page 92, where you have the moons of Mars rising in the east?"
"Yeah," I said.
"Nah," he said.
So I hit him.
- Cheeto Factor (link)
- "Cheese": real cheese.
- "Processed cheese": cheese, sort of.
- "Processed cheese food": cheese, sort of, plus other stuff that's not cheese.
- "Processed cheese food snack product": the food in question is vaguely orange and squishy, but contains no actual cheese.
The term has obvious broader application, e.g., when Linux Gazette editor Ben Okopnik pointed out that the term "commercial open source" most commonly denotes "not actually open source at all".
- Compromise (link)
Concept touted by American commentators as an inherently desirable approach to solving other people's problems. (By contrast, all disputes touching on those commentators' own interests are exempt — as clearly entailing "important principles" that must be defended.)
This guideline's Solomonic wisdom can be seen in the hypothetical example of you, the reader (unless, of course, you're American) being attacked by some thug attempting to kill you: A typical American observer might recommend a "fair compromise" of you being left half-dead.
- Computer Associates (link)
Place where formerly useful software companies go for zombification after dying. Also seen in verb form, as in "Apologies for the new support and upgrade policies, but our firm was recently Computer Associated."
- Deirdre Saoirse Moen's Law of Management (link)
"Bad managers always outlast good employees."
Management should be expected to always terminate the staffer and protect the manager. Therefore, if you find yourself under such a manager, move sideways to another part of the organisation, but never, ever file complaints with the company instead.
Why do companies consistently do this, strongly contrary to their long-term self-interest? Some guesses:
1. Inherently, managers are given more trust than the employees they manage. Thus, in a conflict, upper management will typically back the manager.
2. Many managers have contractual arrangements that make them very difficult (and slow) to terminate.
3. Management typically cares not at all about who's the reasonable party, but just wants the immediate problem to vanish. Firing the peon is less troublesome; peons tend not to sue, and the embarrassment and organisational disruption costs are lower.
4. Relevant to the prior point, firing the manager would implicitly reflect discredit on higher-ups who vouched for him/her.
5. In cases where the manager might have committed torts or crimes, management often fears terminating his/her job might, ironically, increase company liability for those acts.
6. Short-term thinking is the rule in most businesses, rather than the exception.
- Dueling Banjos Effect (link)
Term coined by Jim Penny for self-perpetuating Internet prominence caused by feedback loops between search engines and Internet discussion fora. Refers in particular to bizarre and perplexing instances of such freaky fame.
The eponymous example was triggered by one Martin Eldridge's deeply mistaken query on the Debian Linux distribution developers' mailing list (debian-devel), in July 2000: "Could you please send me the sheet music for Dueling Banjos, Regards Martin". Which in turn lent that mailing list high prominence on all subsequent Google searches for "sheet music Dueling Banjos", which led to other people's (completely inappropriate) queries much like Eldridge's, and so on.
To head off the inevitable queries I would otherwise get: No, I don't have that sheet music. Neither does the Debian Project, despite porting efforts.
- Edwards's Law (link)
"You cannot apply a technological solution to a sociological problem." Nobody seems to know who Edwards was, but pretty much the entire system administrator profession rests on the implicit assumption that he/she was egregiously mistaken.
This plausible-sounding but empty-headed dictum is most often referred to as "Edwards' [sic] Law" — by the depressingly huge mass of semi-literates unable to correctly write possessives of singular nouns ending in "s".
- -ENOPATCH (link)
Laconic expression meaning "You have failed to include substantive, useful content, among all that verbosity."
The expression was (to the best of my ability to tell) originally coined by Alan Cox in private mail copied back to the Linux kernel mailing list, on 2000-10-16, simulating for humorous effect a parser error ("Error: no patch") — thus advising a correspondent he'd omitted his source code patch. Some subsequent posters have used it in its current, figurative sense, e.g., Randy Dunlap's sardonic response to Luke Leighton's meandering advocacy post.
(No doubt related is Alexander Viro's similar creation: "Backwards compatibility is nice, but preserving every undocumented quirk that nobody sane would use... Sorry, but we really need an addition to errno.h: EBITEME. Exactly for such cases.")
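For readers who haven't met the convention the joke rests on: Unix error codes are small integers with E-prefixed names declared in errno.h, and kernel-style code conventionally returns them negated on failure. A minimal Python sketch of that convention (the ENOPATCH and EBITEME values below are, of course, fictitious, invented purely for illustration):

```python
import errno
import os

# Real errno codes, as exposed by Python's errno module:
print(errno.ENOENT, os.strerror(errno.ENOENT))
# e.g. "2 No such file or directory" on Linux

# The joke codes appear nowhere in errno.h; their numeric values
# here are invented purely for illustration:
ENOPATCH = 1001  # "Error: no patch" -- fictitious
EBITEME = 1002   # Viro's proposed addition -- likewise fictitious

def review_post(contains_patch: bool) -> int:
    """Mimic the kernel convention: return 0 on success,
    or a negated errno-style code on failure."""
    if not contains_patch:
        return -ENOPATCH
    return 0

print(review_post(False))  # -1001
```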
- Frogery (link)
Yet another new and fabulous Internet invention, a "frogery" (alternatively, "froggery") is a forged Usenet posting (or, by extension, e-mail or Web site) whose address was crafted to be as visually indistinguishable as possible from that of its intended victim — substituting, e.g., "1" for "l" or "0" for "O" — either to slur the victim by association or to troll him/her into complaining to the froger's Internet provider or a public forum, thereby looking stupid.
The term originated on the Usenet newsgroup news.admin.net-abuse.usenet in the late 1990s. The best known frogery episode was occasioned by an obscene January 13, 1997 soc.culture.thai post from "Lawrence Godfrey", leading the better-known Dr. Laurence Godfrey (whose e-mail address was used in the frogery) to file a defamation action against Demon Internet Ltd. for failing to remove it from the company's news spool when so requested.
- Hawthorne Effect (link)
Initial improvement in a process of production caused by obtrusive observation of that process. Industrial researchers studying Western Electric's Hawthorne Works in Cicero, Illinois (in 1927-1932) noticed that, when they raised lighting levels as part of a visible study of worker activity, worker efficiency went up. They lowered lighting levels: Efficiency went up again. They changed the humidity: Efficiency rose. And so on. The lesson seemed to be, narrowly speaking, that workers do better when they think somebody actually cares about and notices achievement — or, more broadly speaking, that work is strongly influenced by social factors.
- Internet Chat Room (link)
A journalistic jargon term used to vaguely denote an archetypal bogeyman threat to society claimed to lurk at some unspecified location on the Internet. Generally, the reporter neither knows nor cares what specifically he/she is referring to: The citation is symbolic; facts don't matter. They're (unspecified) places about which nothing else is ever said except to imply that they morally imperil the nation's youth, function as virtual dark alleys for pornographers and criminals, etc.
As a reference to a specific on-line medium, the term "chat room" originated with America Online, referring to that non-Internet-based private service's real-time discussion feature. Reporters immediately began speaking of these (typically very down-market, not very literate) forums as "Internet chat rooms", despite their having nothing to do with the Internet.
Next time you see a reporter refer to an "Internet chat room" as (inevitably) where something evil happened, wager $20 that he/she can't cite its specific location and communications protocol. Ninety-nine times out of a hundred, you'll make money.
- MCSE (link)
Acronym standing for "Moen Coffee Service Engineer". Early in my consulting career, two of the more penurious firms for whom I subcontracted would assign feckless (and Microsoft-centric) youth to "help" me — though these particular gentlemen knew little about network technology and, if anything, perennially slowed me down. Probably, those firms expected to have the young'uns observe, and then replace me at a small fraction of my hourly billing rate.
Since they weren't any use for client work, I'd always just send them out for coffee.
- Moen's Corollary to Shapiro's Law of Communication (link)
"Communications companies never communicate with one another." This tendency comes forcefully to the attention of anyone doing Internet provisioning for business. (Elise Shapiro's Law of Communication says: "The more methods people have for getting in touch with them, the more difficult they are to reach.")
- Moen's Law of Bicycles (link)
"Good customers make for good products". This is my explanation for why an ignorant customer base causes merchandise quality to decline, on account of unhealthy market dynamics, e.g., in retail computer hardware and software. In the mid-1970s, bicycles suddenly became very popular in the USA. The masses suddenly entered the market, few knowing anything about bicycles. Many could distinguish poorly if at all between good equipment and bad; good customer service and bad. Consequently, poorly made bicycles (which cost less to make) undercut well made ones (and poor customer service out-earned the good variety), because superior value ceased to be perceived. Over time, overall quality of available bicycles declined considerably, almost entirely because of this dynamic with customers, recovering only after the fad ended, years later.
Quality thrives only when people can tell the difference. When they haven't a clue about products and how they work, schlock merchandise prevails. One can see this process at work in retail computing gear and software: People who know least about computing always insist most on achieving bottom dollar. In a way, this is understandable: You want to exercise control over the process, and, if you're dirt-ignorant about computing, the only place to exercise control is over price. Gradually, this effect tends to drive good merchandise out of the market entirely, leaving a generous selection of cheap crud.
This "law" originated in a November 1995 editorial I wrote for the San Francisco PC User Group, where I attempted to explain why, by becoming wilfully ignorant computerphobes and thereby ignoring the birthright of all computer user groups, they were particularly responsible for the decline of quality in available hardware and software.
Objections to Moen's Law of Bicycles:
"You've merely restated Gresham's Law." No. In its literal sense, Gresham's Law says that less-base (higher intrinsic value) metal coins tend to disappear from the market, because people preferentially retain them, and instead use in commerce baser-metal ones bearing the same face value. It is also sometimes cited metaphorically to claim that "Bad X inevitably drives out good X" — but this is mere empty-headed nihilism, absent some explanation of by what mechanism inferior examples of X will allegedly predominate. Moen's Law of Bicycles differs by detailing one such mechanism, under what conditions it arises, and, equally important, how to prevent it. E.g., bad 35mm cameras have never driven good ones off the market (or rendered them artificially expensive specialty gear), because a large fraction of wise purchasers continue to make a point of understanding what they're buying.
(George Akerlof ably described the psychology and economics entailed with the end-result of such market changes, in his 1970 paper "The Market for 'Lemons': Quality Uncertainty and the Market Mechanism", as an example of asymmetrical information theory, but didn't give the syndrome a specific name.)
"But if I just want a bike for the grandkids, why should I have to buy a 2lb. $3,000 aluminum racer?" This misses the point in two ways: 1. I never recommended the high end. 2. The 1970s' influx of ignorant customers made good cheap bicycles rare, every bit as much as the more-expensive ones: An unhealthy market dynamic has corrosive effects at all price levels.
- Moen's Law of Comedy (link)
"Dollar for dollar, irony is still your best entertainment value, right after Usenet kooks."
- Moen's Law of Corrections (link)
"Any post critical of spelling, grammar, or punctuation errors will acquire a few. If Great Murphy is feeling particularly cruel, you will end up misspelling 'misspelled'."
- Moen's Law of Clueless Newbiehood (link)
"The whiniest new users, and especially the ones who claim your sole purpose in life should be to help them, always turn out to be using MS Outlook (or Outlook Express)."
I first noticed this correlation on Linux mailing lists and newsgroups. (The point, in part, is the irony inherent in demandingly seeking intensive personal assistance on a public Linux forum, using a particularly notorious Microsoft application.)
- Moen's Law of Documentation (link)
"The more you write, the less they read."
Although any piece of writing can be improved, even the best examples, especially of technical writing, will garner requests for more detail — far past the point of reason. Why? Because, most often, a questioner's immediate reaction (to not instantly understanding) is to claim that insufficient information was provided, whether such is true or not. The longer and more detailed any subsequent explanations are, the more difficulty target readers will have in finding what they need, and the more they'll demand an even thicker forest of explanations to get lost in.
Thus, greater conciseness often does much more good than do longer & more detailed explanations. Or, what might be needed is better indexing, or following the classic journalist's inverted pyramid format, or the short answer / long answer format I often use — or just a polite suggestion to Read The Friendly Manual (or Search The Friendly Web).
- Moen's Law of Hardware (link)
"Use what the programmers use."
After years of helping people with hapless computer-hardware woes, especially trouble-prone categories such as Linux on laptops, exotic peripheral interfaces, etc., it occurred to me to wonder why I never had such problems. It was mainly because of instinctive avoidance of dodgy, exotic, new, and/or badly designed components — which happens to track strongly with programmers' characteristic prejudices. There's a logic to that, which may not be immediately apparent to many:
Drivers for hardware don't emerge like Athena from the head of Zeus: Someone has to create them. Especially for open-source OSes such as Linux, this involves a chipset being brought to market, for it to be out long enough for coders to buy and start using it, and for them to (if necessary, absent manufacturer cooperation) do the hard work of reverse-engineering required to write and debug hardware support. Then, the resulting code filters out to various OS distributions' next releases, and thus eventually to users.
It follows that, if you blithely buy what's new and shiny, or so badly designed or built that coders eschew it, or so exotic that coders tend not to own it, it will probably have sucky software support, especially in open source. (Proprietary drivers can be written under NDA, often before the hardware's release, while manufacturer help is routinely denied to the open source world.) Conversely, if you buy equipment that's been out for a while, doesn't suffer the (e.g., Diamond Multimedia) problem of chip-du-jour, is bog-standard and of good but not exotically expensive quality, it will probably have stellar driver quality, because coders who rely on that hardware will make sure of that.
Thus, it's very common for slightly aging but good-quality gear to outperform and be more reliable than the latest gee-whiz equipment, because of radically better software support — not to mention the price advantage.
Ergo, in 1999, instead of buying a current-production laptop to run Linux on, I bought, used, a Sony VAIO PCG-505TX, because I knew several Linux kernel coders had been using those as primary machines. Performance and stability have been exceptional.
More broadly, if you can identify the types of gear programmers would favour — and avoid — you'll be ahead of the game. Coders would avoid winmodems / winprinters, brand-new 3D video chipsets, cheesy and unsupported SATA "fakeraid" chipsets, low-end scanners reached through parallel ports ganged to ATAPI ganged to SCSI chipsets, cheap multifunction scanner/printer/fax boxes, hopelessly proprietary USB aDSL bridge cards, etc. They would favour parts of standard interface, command-set, and chipset design and high enough quality that they might be reused in multiple machines over a long service life.
- Moen's Law of Inefficient Immolation (link)
"Murder-suicides never quite seem competent to do it in the right order." I first noticed this effect in its strict literal sense, concerning some miscreants who murdered my congressman. However, in its main, metaphorical sense, this law describes screw-ups' tendency to wreck as many other people's affairs as possible before, themselves, flaming out.
(People who suddenly realise that they're misbehaving, especially when cornered, sometimes seem caught in a trap of escalation intended to somehow justify themselves. My implication is that it may be better for all concerned to quash the syndrome, prior to its climax.)
- Moen's Law of Littering (link)
"People tend to litter where there's already litter." Has obvious application to any instance of socially reinforced standards, e.g., netiquette, S/N ratio on technical forums, etc.
- Moen's First Law of Debate (link)
"No matter what the issue, someone will try to turn it into a personality dispute." In part, this is because complex issues become more comprehensible if you convert them into soap opera, however inaccurately. Additionally, it's common for someone who's losing an argument to misbehave in hopes of debasing the discussion, scoring a draw in the eyes of casual observers, and tarring the opposition by association.
- Moen's Second Law of Debate (link)
"There's nothing quite as sublimely silly as publicly professing one's humility." I get the giggles when I see people making a big deal in public of how self-effacing they are.
- Moen's First Law of Security (link)
"It's easier to break in from the inside." E.g., many Internet break-ins result from masquerading as a legitimate user to gain user-level access, e.g., with sniffed passwords. The attacker then has a dramatically wider selection of system weak points he/she can attack, compared to penetrating from outside.
- Moen's Second Law of Security (link)
"A system can be only as secure as the dumbest action it permits its dumbest user to perform." Your users are often your weakest link; smart bad guys will attack you there (e.g., via social engineering or stolen tokens). Smart admins will try to compensate for this tendency, e.g., by using multifactor authentication instead of only passwords, and other measures.
- Moen's Third Law of Security (link)
"Malware is not a security problem; malware is a secondary after-effect of a security problem."
People who focus on particular exploits against particular vulnerabilities (or worse, software packages like "anti-virus software" that do so) have already lost the security battle, because they aren't focussing on what's important — which is correcting their own strategic errors that make those recurring vulnerabilities possible (and inevitable). Marcus Ranum described what is important perfectly, in his essay "What Sun Tzu Would Say":
- Run software that does not suck.
- Absolutely minimize Internet-facing services.
If you have to keep chasing after holes in the same hopelessly bad software (AWstats, wu-ftpd, lpd, etc.) — or, worse, paper over that underlying cause with anti-malware software — then you're addressing the wrong problem.
The computer-security advice Ranum attributes to Sun Tzu bears repeating, too:
If you are fighting a losing battle, it is likely one of three things:
a) You are continuing a trend in a losing war — and therefore should not be surprised.
b) You have chosen to fight the wrong battle.
c) You are stupid.
- Moen's Fourth Law of Security (link)
The way most people use the word, "secure" has exactly the same semantic value as "minty fresh" (i.e., none at all).
The concept of something being "secure" or not is nonsensical: Realistically, security is a heuristic estimate of probable exposure to particular risks within a particular threat model. (Secure against what? With what configuration? Under what operating conditions and with what usage modes?) Therefore, you cannot speak meaningfully about security without a proper understanding of the software/hardware and situations, and the underlying threat model.
Even then, the concept is probabilistic, and relative. People who talk about something being "secure" or not as an absolute property are selling something, are seeking implied permission to turn their brains off, or both.
(The word is sometimes used as a synonym for "encrypted", e.g., in "secure HTTP": That is a bad habit, as the usage hides assumptions about integrity of the endpoints, crypto implementation, and authentication that may be unjustified.)
- POTS (link)
Plain Old Telephone Service, as opposed to the fancier digital stuff I partly earn my living from. Regular analogue telephone calls, in other words.
- Prescriptivist / Descriptivist (link)
Code phrase meaning "I don't understand grammar, punctuation, or usage, but want to have an opinion on the matter anyway."
Copyediting is all about clarity, and avoidance of (unintended) ambiguity, in written prose. Good usage is clear, as precise as intended, and difficult to mis-read. It is helped in that by good punctuation, which "fences off" possible misinterpretations and misparsings, while at the same time avoiding calling attention to itself. Editing professionals know those things intuitively, and tweak prose accordingly — not wasting time on irrelevant ideological battles or dumb question-begging about whether our always-evolving language(s) should be "allowed to change".
Used in lexicography, the terms have some meaning in theory, but in practice tend to be just name-calling phrases meaning "I don't like these people's policies."
- Rumplestiltskin Effect (link)
The error of merely giving something a name and imagining that one thereby understands it. (You may recall that guessing that dwarf's name, in the eponymous Grimm Brothers fairy tale, had magical effect.)
I coined this term, as a boy, after hearing entirely too many technical terms offered by way of alleged explanation that, upon examination, explained nothing. The field of medicine (along with many other sciences, such as biology) is chock-full of examples, e.g., "hyperemesis gravidarum": A pregnant woman who visits a doctor because she is suffering near-constant vomiting is often knowingly told "Ah, well, you turn out to have hyperemesis gravidarum" -- but that is merely Latin for "throwing up excessively while pregnant".
At the age of nine or so, I thought my coinage was wonderfully clever because it exemplified itself: Calling a psychological phenomenon "Rumplestiltskin Effect" assigns it a name implying a degree of insight that, upon examination, isn't really there. (I'd never heard of recursion let alone recursive humour, and thought I'd invented the concept.)
- Second Idiot Effect (link)
The syndrome of being quoted/cited as an equal and opposite loon, to "balance" the speaker/writer's citation of some other loon.
This term was coined by professional magician James "The Amazing" Randi, who noted that reporters would seek quotable comments from him or some other skeptical critic on some outlandish claim, not in any hope of resolving the factual question, but rather to make the reporter seem "objective" by giving time to wackos on both sides. (In fact, Randi noticed that his and other skeptical commentators' remarks that did tend to clear up mysteries got left on the cutting room floor.)
In a time when reporting and analysis often go no deeper than "Colourful Person A claims [outlandish thing], while Colourful Person B says 'Rubbish'", thus reducing the factual claim to a display of ideological posturing, it's well to remember that you may have been consulted for no better reason than serving as someone's second idiot, and invent some means of breaking out of that assigned role.
- Self's Law (link)
"Large, low-entropy pools are inherently dangerous."
Karsten M. Self originated this observation in the 1990s. Here's an example of his comments on this syndrome, following the attack that destroyed the NYC World Trade Center: "Firm belief that large pools of low entropy are inherently dangerous: tall buildings, large crowds, nuclear power, comprehensive databases, absolute power, monopolies. Seek the mean, keep energies and potentials balanced. Bipolar constructs are inherently more stable than monopolar (hegemonical) ones, and multipolar (diversified) structures better than both. That's not total anarchy — nexuses of power or control within a larger pool are OK, and virtually requisite. Should probably add universal networks and software monocultures to the list, as well."
Vodafone Greece furnished, in 2005, a fine example with its large, invisibly tappable digital access to all cellular telephone traffic in Athens.
- Someone to Sue (link)
Euphemism used in business, referring in reality to the concept of having (or lacking) "someone to blame".
For decades, business critics have debunked the wrong-headed notion of, e.g., eliminating as candidates any otherwise reasonable software options that lack "someone to sue" behind them. For one thing, businesses never sue publishers of (non-custom) software for product inadequacy, and, given the disclaimers and immunity clauses typical in software EULAs, could not prevail if they did so. So, critics would point out, eliminating from consideration highly standard and competitive open-source options for merely lacking "someone to sue" made no sense — and separately would point out that warranty coverage can and should be arranged on an "a la carte" basis, if desired for such options.
However, critics have failed to grasp the finer meaning intended when that term is used in business: The speaker actually means, but would be embarrassed to declare outright, that he/she needs a plausible target of blame, to use in disposing of complaints: "Oh, that's just standard with [buggy proprietary product], and can't be fixed." See also: Best Practices.
- Stepwise Disaster (link)
A common but seldom-discussed failure mode of organisations, in which each of several people innocently performs a task he/she believes is the right thing, none of them seeing the overall picture, resulting in adverse results for all. For example, I have instructions to put all our most valuable parts in an old container in the storage room, for safekeeping. The next day, you have instructions issued by someone else to dispose of the accumulated old containers in the storage room. Jointly, our actions will cost the firm dearly.
Stepwise disasters are pernicious because each party concludes, afterwards, that someone else blundered, because clearly he/she carried out his/her own part just fine. Therefore, they tend to recur.
Related but not quite the same thing is the notion of cascade failure, in which a stressed system, when it fails, causes breakage of multiple key components in sequence, as the stresses shift from each breaking piece onto the next key one. (As an organisational metaphor, this can refer to hot-potato problems wrecking people, groups, and other resources as they get bounced around.)
- Tactical Stupidity (link)
Sudden, total inability to comprehend a datum one finds inconvenient — frequently employed by organisational staffers to defeat or delay change. E.g., "I just don't understand these SSH and IMAP-SSL settings. Can't we just go back to telnet and POP3?" Or: "This LibreOffice office suite is too difficult; I just can't get my work done. Why can't we have Microsoft Office back?" The tactic tends to be successful at conning technical administrators and support people, because they cannot imagine someone deliberately feigning incompetence (since, from their perspective, that would be both shameful and unprincipled).
The tactical variety can be distinguished from genuine stupidity by vanishing, the instant those supposedly unable to cope start getting replaced.
Related observation from 2006: Al Gore (quoting Upton Sinclair) observed in "An Inconvenient Truth": "It's hard to understand something when your paycheck depends on your not understanding it."
Last modified: July 31, 2013, by Rick Moen, firstname.lastname@example.org.