When I came to Earth, I of necessity adopted a human form — in order to be less conspicuous. Little did I know what a mess caring for the human body would be.
The worst part about the tasks required to keep the body from deteriorating too much is that they take so much time. All of these mostly unpleasant activities could — if I let them — gobble up 1-2 hours of my day. Unfortunately, what I've found is that putting off some of these tasks merely means spending more than 1-2 hours when the deterioration has become more annoying than the tasks themselves.
So, what unpleasant and annoying tasks does the biological human body require? Here are the worst, from my perspective (in no particular order):
- Emptying bowels. On Mars, our bodies do this quickly and cleanly, merely by ejecting a small, shiny egglike object when necessary.
- Trimming nails. What a bother! And so prone to error, hangnails being the worst.
- Brushing teeth. Seriously, there's no reason why human teeth should require so much care and expense to maintain. When was the last time you saw a cat brush its teeth?
- Washing hands. Not unpleasant so much as annoying. Yet without frequent washings during the day, the body is vulnerable to attack by malicious microbes — and what a disaster that can be!
- Cutting hair. Some people, I've noticed, actually enjoy this activity. But to me it's merely an annoying time-waster.
- Trimming facial hair. Same as hair-cutting, except I don't think most men actually enjoy the activity.
- Treating fungus. It seems that once you are invaded by fungus, it never goes away. It flares and fades and requires outrageous expenditures on a variety of products, none of which offers a permanent cure.
- Minimizing body odor. Again, some humans enjoy showering, bathing, and the rest ... but to a Martian, this problem is best controlled by other, less time-consuming, means.
Now, granted, there are at least two biological functions that I find enjoyable — even though they both take a good deal of time: Eating, and orgasm. The former is a necessary part of maintaining the biological body, but the latter is not. It's merely a fun option... and one of the best things about living in the human body.
In recent days, I've been barraged by friends back on Mars inquiring about what psychological effects the recent spate of tornadoes in the South and Midwest United States must have on the humans there. Their interest got me to thinking, and I suddenly had an insight, which I'm sure has brightened the intellectual glow of many beings (both Martian and human) before me.
The insight encompasses the sociological effects of hurricanes as well, since the two devastating natural phenomena share some common traits... the most obvious being their furiously spinning winds and clouds.
My Martian theory also explains why tornadoes and hurricanes affect humans in ways that volcanoes, tsunamis, and earthquakes do not.
For brevity in the following paragraphs, I'm using the term "Recurring Events of Mass Destruction" (REMD) to refer to tornadoes and hurricanes, and the term "Unpredictable Events of Mass Destruction" (UEMD) to refer to volcanoes, tsunamis, and earthquakes.
REMD. The distinguishing characteristics of REMD include:
- They happen every year.
- Though their frequency and severity vary from year to year, their geographical incidence is constant and encompasses huge areas of the country.
- Though you know they'll occur every year, you have no idea where exactly they'll strike.
- When they do strike, they invariably cause severe damage at the strike site.
UEMD. The distinguishing characteristics of these include:
- They happen whenever. Completely unpredictable.
- Most of the time, their effect on human life and property is minimal. Catastrophic events do occur, but their incidence is rare compared with REMD.
- The geographic range is much more limited and static than for REMD. Only a few States are affected by UEMD.
These widely differing attributes lead me to theorize the following psychological effects on humans who live in areas prone to REMD.
- Dread. The certainty of mass destruction casts an unconscious web of dread over people in REMD areas. Ongoing dread bends the psyche toward irrational fear of the unknown, as well as paranoia.
- Envy and Schadenfreude. Although humans like to think otherwise, those who live in an area where one's town can be devastated while a town close by goes unscathed invariably feel envy when this occurs. Likewise, those in the spared town will feel the opposite: schadenfreude. Although this affects victims of UEMD too, the sociological effect is lessened by the lower incidence and narrower geographical confines of UEMD. Over time, envy and schadenfreude become ingrained in a community's collective psyche. These widespread and continuous feelings can heighten suspicion of "outsiders" and lead to an intolerant, parochial view of the world. They can also make humans more stingy toward those beyond their immediate community. Such people are more likely to adopt the philosophy "Every Man For Himself," and they become incapable of seeing "beyond their own back yard" in terms of understanding people different from themselves.
- Religiousness. Throughout human history, people have been driven to religion to explain natural phenomena, both good and bad. If REMD are caused by God, then perhaps fear, dedication, and prayer to God will help. Religion in REMD areas would therefore be expected to emphasize the fear factor of God, as well as a greater awareness of God's opposite: Satan, Evil.
People living with UEMD are also affected by these feelings, but to a much lesser degree. Because of the infrequency and lack of predictable recurrence of UEMD, the emotions do not grip entire communities or regions. The primary psychological effect on humans of UEMD is a sort of Stoic Fatalism, which can be summed up in the philosophy "What Will Be Will Be." Stoicism tends to make humans more tolerant of others and more broad-minded about ideas.
Given these characteristics, it's now worth mapping their effects geographically, in order to see how the incidence of tornadoes correlates with the incidence of humans with those characteristics. Since the psychological makeup of Republican (conservative) humans aligns pretty well with the attitudes of those prone to REMD, I am using voting patterns as a proxy for sociological differences between areas affected by REMD and UEMD. The maps below adopt the paradigm of "blue" and "red" States to view the correlation.
Notes: The data in Map 1 are derived from statistics published by NOAA's Storm Prediction Center and cover the five years from 2000 to 2004. To improve focus, the map doesn't show incidence numbers lower than 10. The number of States of a particular color on Map 1 is the same as the corresponding number on Map 2 (Source: Wikipedia Commons).
This is the color legend for Map 2:
- The Republican candidate carried the State in the last four presidential elections (1996, 2000, 2004, 2008)
- The Republican candidate carried the State in three of the four most recent elections.
- The Republican candidate and the Democratic candidate each carried the State in two of the four most recent elections.
- The Democratic candidate carried the state in three of the four most recent elections.
- The Democratic candidate carried the state in all four most recent elections.
Although not a perfect predictor, tornado frequency correlates pretty closely with election results: The States that lie in the tornado "alleys" of the South and Midwest are almost all populated by Republican-leaning voters.
I should stress that this theory does not postulate that tornadoes are the sole predictor of a State's sociological makeup. For example, the characteristics this theory predicts for humans in REMD areas are also found in small-town and rural areas, which are heavily populated by parochial and intolerant humans. Red States that do not correlate with the tornado data but whose population predominantly lives in small towns and rural areas include Montana, Wyoming, Idaho, and Alaska.
Among the blue States, the biggest outlier is Illinois, which maps as solid red in tornado frequency but solid blue in voting pattern. Though this result seems surprising—as well as contradictory to my theory—it can be explained by noting that the incidence of tornadoes is primarily in the southern part of the State, which is also heavily Republican and has long considered itself part of the South.
It's worth noting that many of the red and pink States on the third map are also those that suffer the most REMD by hurricanes—South Carolina, Georgia, Florida, Mississippi, Alabama, Louisiana, and Texas. Adding hurricanes to the tornado data above would undoubtedly also turn North Carolina red.
And this, my fellow Martians, is my explanation of how tornadoes affect the sociology of the human populations they afflict.
One of the truly bewildering traits of human beings is their ability—and even carefree willingness—to ignore facts that conflict with their current worldview. I touched on this topic in an earlier article, and find it manifested in numerous ways in this most viciously anti-rational political climate.
This article looks at data for a timely topic that's a favorite target for fact distortion: Has the U.S. Federal Government workforce grown too large, or not?
The "Tea Party" politicians, in particular, appear to be masters at the art of selling people willful ignorance, perhaps partly because they themselves drink from that cup religiously. Among the false ideas they consider common knowledge is the idea that the Federal workforce needs to be cut—presumably because it, like the Government as a whole, has grown too big. While they're at it, they'd also like to make sure Federal employees don't have a benefits package better than members of their own congregation do.
Recently, a Republican from Texas, Rep. Kevin Brady, submitted a legislative proposal to cut the Federal workforce by 10 percent. According to a Washington Post article, Brady's reasoning goes like this:
There's not a business in America that's survived this recession without right-sizing its workforce, without having to become more productive with fewer workers. The federal government can't be the exception. We're going to have to find a way to serve our constituents and our taxpayers better and quicker and more accurately with fewer workers. I'm convinced we can do it and we don't have a choice.
Including its overall premise, Brady's short statement includes several fallacies, and on Mars we find it alarming to realize that this guy is chairman of the Joint Economic Committee and a senior member of the House Ways and Means Committee. Where I come from, those are pretty big britches! When someone with authority over such enormously important Government functions gets his facts wrong, one has to wonder whether he is deliberately lying for political reasons, or whether he's maliciously failing to determine the facts—instead shaping them to fit his policy goals.
On Mars, such behavior is almost unheard of. When I first revealed it, my fellow Martians had trouble believing that sentient beings could behave this way. And even if someone were to deliberately distort reality, surely Earth's legal systems would be constructed to punish the act.
Apparently, however, this behavior is not only tolerated, it's rewarded by the mere awareness that it's tolerated. After all, if a lie—or deliberate ignorance—by someone in authority isn't challenged, it clearly achieves its purpose. And achieving one's purpose obviously counts as a success. (On Mars, we believe that this is one of the perverse lessons Americans learned from President Richard Nixon's downfall: If you're going to lie, cheat, embezzle, or otherwise commit illegal acts, be sure you aren't caught doing so.)
So, what fallacies does Mr. Brady disseminate in his statement? Here are two obvious ones:
- "There's not a business in America that's survived this recession without right-sizing its workforce." How can this be true? Clearly, as has always been the case, the economic downturn produces not only losers, but winners as well. Yes, the losers will have had to lay off workers, hence the rise in unemployment. But companies in growth sectors will not have done so, and they may even have continued to expand. In this downturn, for example, employment in the oil mining industry increased from 143,000 to 159,000 from 2007 to 2009. A better example is the computer services sector, where employers added 400,000 jobs.
- "The federal government can't be the exception." Someone like Brady, who is in charge of National economic policy, undoubtedly understands that reducing employment in the Federal sector is never a good thing during a period of slow economic growth. Even economists who aren't sold on Keynesian economics realize that the Federal Government should remain a stable economic player during times like this. Stating otherwise must be a deliberate deception.
That leaves the notion that the U.S. Government must "right-size" its workforce in order to "become more productive with fewer workers." First of all, what does "right-sizing" a workforce mean? If you read Wikipedia's article on the subject, you come away believing that "right-sizing" is merely a euphemism for "layoffs" or "downsizing."
Some dictionaries, on the other hand, suggest there's a nuance to the term that differentiates it from "layoffs." Webster's, for example, defines the term as follows:
To reduce (as a workforce) to an optimal size
"Right-sizing" (or "rightsizing") is a term first uttered on Earth in 1989, when it was really just jargon to justify the downsizing that became de rigueur during the waning years of the first Bush administration. One of the main reasons companies downsize is that their workforce has bulged after a major merger with or acquisition of another company. And as you may recall, starting in the 1980s corporations did a heckuva lot of merging and acquiring. For a while, even "rollups" were all the rage on Wall Street.
After a merger or major acquisition, it's pretty standard to eliminate inherited workers who do redundant tasks, or those who have a record of poor performance. Companies that downsize for any other reason do so because they're performing poorly, as measured by revenue and profits. In this case, companies downsize to reduce their production costs and make their products or services more competitive.
So, there are two big problems with even suggesting that the Federal Government engage in "right-sizing":
- Governments are nonprofit institutions, and therefore notions such as competitiveness, profits, and product pricing are meaningless.
- Governments don't merge with or acquire other governments. Well, unless you're talking about conquests, which surely is a special case. Occasionally, governments do split up... for example, when a U.S. State secedes from the Union, or when a country declares its independence from another. In this latter case, of course, the split governments will find the need to "upsize" their workforce rather than downsizing them.
Ah, but what if you believe, as lawmakers such as Brady do, that the cost of the Federal workforce is a major reason why the Federal deficit is ballooning? Well, then I suppose the suggestion does make sense.
As it turns out—and here I'm finally getting to the crux of my argument—the Federal workforce has not been a contributor to the growth in Federal spending. If you're picking up an axe to cut the budget, hacking at the workforce is not only missing the target, but it will actually increase costs in the long run.
What evidence do I have to support such assertions? Consider the following facts for the 40-year period from 1970 to 2009, as illustrated in the accompanying charts:
- Real (adjusted for inflation) Federal consumption spending increased 56 percent, while total Federal employment fell about 30 percent. Most of the reduction in Federal employment came in the defense sector, but the number of nondefense employees stayed basically flat during this 40-year period while nondefense spending shot up 150% (Chart 1). (Note: The measure of spending shown in chart 1 includes only "current expenditures," which basically counts spending required "to keep the trains running"—that is, to carry out basic agency missions.)
- From 1970 to 2009, total Federal employment shrank from 6.1 million to 4.2 million—again, mostly in defense. The nondefense Federal workforce was 1.96 million in 1970, and 1.95 million in 2009 (Chart 2).
- During these 40 years, Federal employment as a percentage of total U.S. employment dropped from 8.6 percent to 3.5 percent (Chart 3).
These facts make it obvious that the Federal Government has been engaging in "right-sizing" for a very long time. How could Federal employees not be a great deal more efficient and productive today if their numbers haven't changed in the last 40 years, while their workload and output have doubled?
Despite continuous calls for less Federal "intrusion" into taxpayers' lives, taxpayers have simultaneously been demanding and expecting more and more of their National Government. As anyone who has been even marginally observant knows, Federal responsibilities have expanded greatly since 1970. Among its new and expanded assignments are:
- Occupational Safety and Health. The Occupational Safety and Health Administration was created in 1970 to "ensure that employers provide employees with an environment free from recognized hazards, such as exposure to toxic chemicals, excessive noise levels, mechanical dangers, heat or cold stress, or unsanitary conditions."
- Environmental Protection. The Environmental Protection Agency was also created in 1970 and charged with "protecting human health and the environment, by writing and enforcing regulations based on laws passed by Congress."
- National Security. The agencies responsible for ensuring the safety of U.S. citizens have increased employment substantially during this period, especially since the September 11, 2001, attacks by radical Islamic terrorists. The attacks resulted in a reorganization of security functions from various agencies into a new cabinet department, the Department of Homeland Security. The number of Federal security personnel at U.S. airports has also increased, of course.
- Natural Resource Management. In 1973, Congress passed the Endangered Species Act, which requires Federal agencies to ensure that their activities "do not jeopardize the existence of any endangered or threatened species of plant or animal or result in the destruction or deterioration of critical habitat of such species."
- National Park System. Numerous Acts and Executive Orders have expanded the responsibilities of the National Park Service since 1970, including the General Authorities Act of 1970, the National Parks and Recreation Act of 1978, and the Alaska National Interest Lands Conservation Act of 1980.
- Drug Abuse. The Comprehensive Drug Abuse Prevention and Control Act of 1970 expanded and optimized the Federal Government's ability to control use of illegal drugs. Among other components, the legislation included the Controlled Substances Act, which established drug "schedules," into which various substances would be classified and for which misuse penalties would be defined.
- Many other functions, including Immigration Control (yes, we have been spending more money and hired more people for this), Education, Technology Infrastructure, and Information Dissemination.
Regarding Information Dissemination, consider the huge cost and workload involved in building all the great Federal websites we now have—including the many channels to obtaining customized information from Federal databases never before available.
For example, the charts and data shown in this article come from the Bureau of Economic Analysis (BEA), the Commerce Department agency responsible for collecting and analyzing statistics on the U.S. economy. BEA is the organization that produces estimates of Gross Domestic Product, personal income, and much more. Their data is now available through an easy-to-use, customizable web interface that generates data in a variety of formats, including tab-delimited, which can be imported into spreadsheet software.
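As a small illustrative sketch of what working with such an export looks like (the column names and figures below are made-up placeholders, not an actual BEA extract), a tab-delimited download can be pulled into Python with pandas just as easily as into a spreadsheet:

```python
import io

import pandas as pd

# Stand-in for a tab-delimited BEA-style export; the column names and
# values here are illustrative placeholders, not real BEA series.
sample = (
    "Year\tFederalEmployment\n"
    "1970\t6.1\n"
    "2009\t4.2\n"
)

# sep="\t" tells pandas the file is tab-delimited
df = pd.read_csv(io.StringIO(sample), sep="\t")
print(df)
```

The same `pd.read_csv(..., sep="\t")` call works on a saved download file in place of the `StringIO` wrapper.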
Yes, the Government does much less printing now than it used to, but as one with first-hand knowledge of Federal publishing, let me assure you it costs much more now to publish on the web than printing ever did. For one thing, many agencies were encouraged to—and did—charge fees for printed publications. Obviously, they collect nothing from use of their websites. For another, nearly all Federal printed documents were required by law to use only black ink, or black and one other color. A tiny fraction used the four-color process that's standard for commercial printing.
However, Federal web publishing has been under no such constraints, and so agencies have spent as freely as they thought necessary to make splashy, flashy, and sexy websites that could have been—and often are—designed by a Madison Avenue ad firm. Such sites look nice, but besides being expensive they too often make usability a secondary consideration to appearance. Where once a small agency might spend $500,000 a year on printing, it's now common for it to spend $1 or $2 million on its website, while still printing some material. (Note: BEA remains a big exception to the norm. Their website eschews expensive graphics and other flashy flourishes, and is mostly easy-to-navigate textual content.)
OK, so it's undeniable that Federal employment has shrunk in the last 40 years, while spending has grown. Doesn't that suggest that Federal employees are much more productive than they were 40 years ago?
Given the data in Chart 1, it's clear that productivity in the Federal sector has risen considerably. However, something must be missing, because it's nearly impossible for an organization to boost output by 50% while cutting its workforce by 30%. In fact, if you lay these data beside analogous ones for the private sector, it appears that the Feds have been using some secret productivity weapon that they should now share with the private sector, so that it can downsize as the Feds have done. (Oops... no, that would cause a huge recession, actually.)
Since 1970, output of private industry has shot up 200%, but this was accompanied by a 70% increase in employment (Chart 4). This means that the gain in private output required 70 percent more workers over this period. If you apply that relationship to the public sector, Federal employment should have increased 15-20 percent to support its 50% growth in output over these 40 years.
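That back-of-the-envelope application of the private-sector ratio to the Federal sector can be checked in a few lines, using the article's rounded figures (not fresh BEA data):

```python
# Sanity check of the private-vs-public comparison, using the article's
# round numbers rather than actual BEA series.

private_output_growth = 2.00      # private output up 200% since 1970
private_employment_growth = 0.70  # private employment up 70%

# Employment growth per unit of output growth implied by the private sector
ratio = private_employment_growth / private_output_growth  # 0.35

federal_output_growth = 0.50      # Federal output up roughly 50%

# Employment growth the Federal sector "should" have needed at that ratio
implied_growth = ratio * federal_output_growth

print(f"{implied_growth:.1%}")  # → 17.5%, inside the 15-20 percent range
```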
So how did they do it? How could the Federal sector manage to increase output by 50% while actually reducing employment? The truth is, they couldn't have, despite what the data show. For even though the data are correct for what they do measure, they are missing a big component of the puzzle, as you'll see.
The Missing Employment Data
Since Jimmy Carter was elected in 1976, every President except for George H.W. Bush has called for either cuts in or freezes on Federal hiring. This explains why Federal employment has remained flat for 40 years... it has been continuously downsized.
Given this history, today's calls for cuts in Federal employment are either dishonest and politically motivated, or they are misguided and made by ignorant politicians who have no business being in charge of the Nation's business.
The ugly truth is that for every Federal worker who hasn't been hired since 1970, one or two private-sector employees have been. For most of these 40 years, both the Executive and Legislative branches of the U.S. Government, whether led by Republicans or Democrats, have bought into the notion that "contracting out" (or "outsourcing") Federal jobs was a good way of stretching precious Federal dollars.
But this is simply not the case, for two simple reasons, which I plan to take up in a future article on Federal contracting:
- Inefficiency. Outsourcing to private companies is often much more expensive than retaining work inhouse. Briefly, this is the result of:
- Additional Overhead. Most large contracts are subcontracted, and even subcontracts are subcontracted. Each layer adds to the overhead cost of every dollar spent.
- Inflexibility. Getting rid of bad Federal contractors can be as difficult as getting rid of a bad Federal employee.
- Incompetence or dishonesty. Scrutiny of the background and expertise of companies hired by the government is much less exacting than that of potential employees. Too often, companies overstate their qualifications for a particular type of work, overstate costs, or both. Even when the private enterprise is at fault, the government agency loses time as work must be redone, and typically must shell out additional funds for the privilege.
- Lack of continuity. When a company is newly hired to assume an existing task, it's far too easy for them to claim that the outgoing contractor had been "doing things wrong." Without continuity, Federal managers can face unmeasured duplication of costs merely because the new contractor has a different way of doing things. Sometimes a change is warranted, but too often it is not. This kind of waste can also occur when Federal managers change, but that happens far less frequently.
- Conflict of interest. Private contractors are motivated by profit rather than by public service, and therefore should never be in charge of making policy or spending decisions that affect taxpayers. This is a clear conflict of interest situation, where the private company's goal is to make as much money as possible, and the Government's goal is to serve the public as best it can within its limited means.
Even if you don't see it the way we do on Mars, you will surely find it strange—and disturbing—that the Federal Government has absolutely no idea how many employees it has in the private sector.
If you walk through any Federal office today, you won't be able to tell which employees are contractors and which are on the Federal payroll. For all appearances, everyone there is a Federal employee. Yet they're not, and nobody keeps tabs on the ones who aren't, except to make sure they have the appropriate network accounts, desks, computers, and security badges. The Labor Department, which is responsible for collecting the Nation's employment data, has never included this information as part of its surveys.
Among other management consequences of this irresponsible lack of data is that it's impossible to know whether the Federal workforce is "right-sized" or not. It's also impossible to measure relative employment costs, or to compare productivity for the two groups.
And why do we not have these necessary data on private contractors?
First, the Paperwork Reduction Act of 1980—one of a series of misguided deregulation moves in the 1980s designed to get the Federal Government "off the backs" of private companies—made it extremely difficult for Federal agencies to add new questions to their existing surveys. And second, the lack of knowledge has been a mutually beneficial "wink" among cash-strapped Federal managers, cash-hungry private companies, and dishonest/ignorant legislators who want to claim they're cutting costs by keeping a lid on Federal employment.
Only in the last few years has the superiority of outsourcing public jobs been openly questioned, and that's been spurred mainly by concerns about the propriety and cost of contracting by the State and Defense Departments to support the War in Iraq. Yet all through the George W. Bush years, Federal agencies were under extreme pressure to "privatize" or "contract-out" any functions that weren't "inherently governmental in nature."
Now, I know what "privatizing" means, ugly word though it may be. But no one—including those pushing hardest for it—can explain what an "inherently governmental" function is. If they were honest, such advocates would admit that any public function that becomes the object of lust by some industry group's lobbyists could not possibly be "inherently governmental," and therefore could be a candidate for privatizing or outsourcing.
To hear these people talk, the only "inherently governmental" jobs are those that make and administer budgets and contracts. That means no jobs for:
- Computer specialists
- Audio/Video specialists
- Public affairs specialists
- Security specialists
- Meeting planners
- Travel planners
- Systems designers
- Budget analysts
This leaves jobs only for:
- Budget officers
- Contracting officers
- Personnel officers
Myth of the Coddled Federal Worker
One final piece of the puzzle behind the recent calls for Federal downsizing, workforce attrition, and worker pay caps is the myth that Federal workers cost more than their private-sector counterparts, because of their great benefits. Legislators like Brady love to stick this one in their speeches because it's a guaranteed applause line, especially during great recessions.
Trouble is, it's not true.
I'm going to sidestep the whole debate about whether Federal salaries are higher or lower than those for comparable jobs in the private sector, because it's too complicated for a few paragraphs and perhaps even for an entire book. There are numerous problems with this analysis, including the difficulty of finding consistent data that track all the relevant variables, including worker age, education, experience, location, and job descriptions.
Under President George H.W. Bush, Congress passed legislation that granted Federal workers additional pay under a system of "locality adjustments." President Clinton more or less mothballed that system, and then set up one that was a pale shadow of the original.
Since the Civil Service Retirement System (CSRS) was mothballed in 1986, all new Federal workers have been in the Federal Employee Retirement System (FERS). FERS does offer a small pension, but it's nothing like the one CSRS retirees enjoy. In addition, FERS workers pay a much higher portion of their salaries for that pension than CSRS workers did.
Instead, a FERS retirement is heavily dependent on the Federal Thrift Savings Plan, which is nothing more than a 401(k) program for Federal employees. (Federal workers don't have 401(k) plans.)
Federal employees have health care, sick leave, vacation leave, and other benefits that are comparable to those in any large U.S. company. I'll never forget moving from a Federal job at BEA to Citibank back in 1996, and finding that Citibank's benefits were superior to those I'd had in the government. Not only that, my pay was almost double, and I didn't have any onerous supervisory responsibilities. Citibank's pension system wasn't as generous as that from CSRS, but it was comparable to that of FERS.
Are Federal benefits better than those of your typical small company? Yes, very likely they are. And, given the vast difference between a Federal agency of 100,000 and your typical small company of 50, the difference is appropriate.
In any case, very few Federal contracts are awarded to your typical small company. At least, not directly. Any small companies that share in contract spending get work only through some "prime" contractor, not directly by some Federal manager.
CSRS was abolished not only to reduce the pay of Federal retirees, but also to add the Federal workforce to the Social Security pool. Under CSRS, Feds neither paid Social Security nor received its benefits on retirement. Under FERS, they do both in the same way that private sector workers do.
Another reason why Federal employees still have a decent package of benefits is that they are represented by a Labor Union, the National Federation of Federal Employees. If workers in U.S. companies get desperate enough, perhaps they'll recall that having a Union on your side is a good thing in the fight for decent pay and benefits. That's a lesson that's been lost over the years, especially since President Reagan started kicking Unions in the butt back in 1981.
However, just because workers don't have the pay, benefits, and pension they should have doesn't make it OK for them to demand cuts for those who do.
And politicians like Mr. Brady should know better.
Mike Kelly arrived from the tiny town of Butler, Pennsylvania, as part of the new freshman class of Angry Republican Congressmen. After all the feting and touring that greeted him in Washington, he was asked who had impressed him the most.
"Nobody," he said.
To be impressed by "nobody" must mean this guy is hugely impressed with himself, one would surmise. Well, yes and no:
"I hope I don't sound arrogant about this, but at 62 years old, I've pretty much seen what I need to see."
Today's article in the Washington Post doesn't explore what exactly Mr. Kelly has seen in his 62 years, but from his attitude and statements, I would venture to guess it isn't much.
You see, Mike Kelly came to Washington because he is angry that the Federal Government "intruded" on the running of his General Motors car dealership, where he'd spent 56 years of creative energy. (I guess that means he'd been working on the business since he was 6. Just kidding.)
And exactly how had it intruded? Why, it was making him sell Chevrolets instead of Cadillacs.
And exactly why was it ruining his business this way? Well, you see, Obama had (personally) taken over General Motors and was (personally) requiring dealerships to restructure as part of an effort to save the company.
"This is America. You can't come in and take my business away from me. . . . Every penny we have is wrapped up in here. I've got 110 people that rely on me every two weeks to be paid. . . . And you call me up and in five minutes try to wipe out 56 years of a business?"
This is a reasonable attitude if you believe that tiny, parochial self-interest should be the motivator of those elected to run a National Government. However, tiny attitudes from Big Men In Their Local Communities have no place in Congress. Indeed, those with tiny, uninformed beliefs who fail to see the big picture are precisely the ones inclined to take actions that will fail the interest of the public they're elected to serve.
They are also the most vulnerable to corruption, since if you believe that self-interest is the highest good, then you are likely to be impressed by visitors who flatter your ego and your opinions... and then offer to pay you huge sums to ensure your reelection or to sway your vote on an issue that serves your own interest.
A lot of Big Men in Tiny Bubbles like Mr. Kelly were frightened and outraged when the Obama administration offered to buy a 61% stake in General Motors in the summer of 2009. After all, wasn't this a "Government Takeover", or worse, a "Nationalization" of a private company?
If you were inclined to take a narrow view, it was. However, if you bothered to take the big view, it clearly was not.
Obama was a reluctant participant in the process of saving General Motors, and his sin was that he insisted that the taxpayers have some control over the process. Rather than just handing $50 billion to a company that had proven itself incapable of turning a profit and had driven itself into bankruptcy, he stipulated that outside ("Government") experts have a say in how that money was used. The restructuring that resulted is what caused Mr. Kelly such pain in his private bubble.
An article in The Economist—a business journal with no reputation for supporting Government intrusion into the workings of Capitalism—ended up apologizing to Obama for having shared the view that his action was a mistake:
August 19, 2010. Americans expect much from their president, but they do not think he should run car companies. Fortunately, Barack Obama agrees. This week the American government moved closer to getting rid of its stake in General Motors (GM) when the recently ex-bankrupt firm filed to offer its shares once more to the public (see article).
Once a symbol of American prosperity, GM collapsed into the government’s arms last summer. Years of poor management and grabby unions had left it in wretched shape. Efforts to reform came too late. When the recession hit, demand for cars plummeted. GM was on the verge of running out of cash when Uncle Sam intervened, throwing the firm a lifeline of $50 billion in exchange for 61% of its shares.
Many people thought this bail-out (and a smaller one involving Chrysler, an even sicker firm) unwise. Governments have historically been lousy stewards of industry. Lovers of free markets (including The Economist) feared that Mr Obama might use GM as a political tool: perhaps favouring the unions who donate to Democrats or forcing the firm to build smaller, greener cars than consumers want to buy. The label “Government Motors” quickly stuck, evoking images of clunky committee-built cars that burned banknotes instead of petrol—all run by what Sarah Palin might call the socialist-in-chief.
Yet the doomsayers were wrong. Unlike, say, France’s President Nicolas Sarkozy, who used public funds to support Renault and Peugeot-Citroën on condition that they did not close factories in France, Mr Obama has been tough from the start. GM had to promise to slim down dramatically—cutting jobs, shuttering factories and shedding brands—to win its lifeline. The firm was forced to declare bankruptcy. Shareholders were wiped out. Top managers were swept aside. Unions did win some special favours: when Chrysler was divided among its creditors, for example, a union health fund did far better than secured bondholders whose claims should have been senior. Congress has put pressure on GM to build new models in America rather than Asia, and to keep open dealerships in certain electoral districts. But by and large Mr Obama has not used his stakes in GM and Chrysler for political ends. On the contrary, his goal has been to restore both firms to health and then get out as quickly as possible. GM is now profitable again and Chrysler, managed by Fiat, is making progress. Taxpayers might even turn a profit when GM is sold.
GM's payback to U.S. taxpayers has already begun, and as The Economist notes, the total repayment over time will likely exceed the original $50 billion investment.
Yet Mr. Kelly probably doesn't believe any of this. Why? Because he doesn't want to. It's not in his interest to do so. It's more convenient for him to believe it's all a lie.
After all, to change his mind would invalidate his reason for popping in to Washington. Given his arrogant attitude that he is the most impressive person in D.C., he is hardly the sort to question himself, let alone to burst the tiny bubble that brought him here.
Just days after I opened an exploration of the way humans view conflict of interest, and of how their personal self-interest complicates their understanding of the way this topic is approached in different contexts, the Washington Post has published a front-page article that exposes exactly the kind of conundrum I'm planning to look into.
The Senate, you see, has no laws restricting the investments its members can make in companies whose fortunes their votes may affect. In particular, they may freely invest in companies that are major players in specific industries overseen by Senate committees. In the Post article, the industry is defense, and the committee typically has "inside knowledge" of which defense systems will be built, and thus of which companies will benefit from its votes.
This seems strange enough, but as the Post article points out, the Congress has passed laws that prohibit such investments by those appointed to run the agencies — such as Defense — that will let the contracts to carry out the Senate's decisions. Not only that, but such laws have long been on the books to regulate investment behavior by rank-and-file Federal employees.
For several years now, I've been troubled by how humans define the concept of "conflict of interest." My concern has grown as I've realized the importance humans seem to place on avoiding "it", or, at times, even the "appearance of it." The more thought I've given to the topic, the more confused I've become. My confusion stems from the observation that whether or not someone has a conflict of interest seems to depend on who is asking the question, what the context is, and whether the answer is in that person's own interest.
Even more confusing is the paradox whereby humans believe that allowing a conflict of interest can be wrong in case A but right in case B. Again, the paradox may only be resolved if one assumes that the perspective of the believer is what determines the judgment of right or wrong.
Let me be a little more specific.
In most situations where humans raise the spectre that someone may have a "conflict of interest," the implicit notion is that having such a conflict is bad and should be avoided. Examples here are cases where a judge may issue a ruling that is in his own interest but not necessarily that of the conflicting parties. Or where a public official makes spending decisions that stand to benefit himself—or his friends, family, supporters, etc.—but not necessarily those who are supposed to benefit from the spending.
Most people I've talked to seem to think that this notion is obvious—that weighing such conflicts of interest in one's favor is wrong and should be avoided. As will become plain later in this essay, I certainly do not disagree with this notion.
On the other hand, either consciously or unconsciously, most humans in modern, West-European-modeled societies entertain notions of conflict of interest that, to my Martian mind, seem antithetical to the one they espouse publicly. In this less-than-conscious notion, acting in one's own interest is something that society, instead of outlawing, should actually encourage, since acting in one's own interest is a natural human tendency that can't be legislated away. Not only that, but acting in one's own interest is viewed as ultimately the same as acting in everyone's interest.
This belief is the very basis of the dominant economic system of what are called "Western" societies. Capitalism would be far less effective, it is argued, if people were encouraged to consider anything other than their own interest in making personal choices, such as purchase and investment decisions.
In reading literature that explains the rise and rationale of Capitalism, texts keep returning to a writer called Adam Smith, whose 1776 book, The Wealth of Nations, was particularly influential. Phrases from that book are frequently quoted to explain why the motive of self-interest is so beneficial to a strong Capitalist system. For example:
It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest. We address ourselves, not to their humanity but to their self-love, and never talk to them of our own necessities but of their advantages.
The most famous quote from Smith's book on this subject puts the notion of self-interest in a macro foundation he famously labeled "The Invisible Hand":
By preferring the support of domestic to that of foreign industry, he intends only his own security; and by directing that industry in such a manner as its produce may be of the greatest value, he intends only his own gain, and he is in this, as in many other cases, led by an invisible hand to promote an end which was no part of his intention. Nor is it always the worse for the society that it was not part of it. By pursuing his own interest he frequently promotes that of the society more effectually than when he really intends to promote it. I have never known much good done by those who affected to trade for the public good. It is an affectation, indeed, not very common among merchants, and very few words need be employed in dissuading them from it.
Smith's promotion of self-interest as a core virtue in economic transactions became one of the central concepts of Capitalism. Unfortunately, the most influential modern spokesmen for Capitalism seize on self-interest as their rallying cry, neglecting the various other central ideas Smith expounded in building his argument. Clearly, it is in the self-interest of the wealthiest and most powerful members of a Capitalist society to argue that greed (which itself relies on blind self-interest) is a virtue (or, euphemistically, a "necessary evil"), but it seems surprising that even humans of modest means agree with them. And none of those who subscribe to this argument perceive the central Martian concept that making the pursuit of one's personal interest the foundation of a society's culture is ultimately—and, apparently to most humans, counterintuitively—counter to one's own interests.
Again, though Smith is pilloried by many humans who oppose laissez-faire Capitalism, he is hardly the demon of self-centered greed that most of his ardent followers are today. In his first major book, The Theory of Moral Sentiments, which he "always considered ... a much superior work to his Wealth of Nations," Smith explains that the pursuit of wealth and power is not a worthy goal in itself. Referring to the universal human desire for respect and acclaim from one's peers, Smith writes:
Two different roads are presented to us, equally leading to the attainment of this so much desired object: the one, by the study of wisdom and the practice of virtue; the other, by the acquisition of wealth and greatness. Two different characters are presented to our emulation: the one, of proud ambition and ostentatious avidity; the other, of humble modesty and equitable justice. Two different models, two different pictures, are held out to us, according to which we may fashion our own character and behaviour; the one more gaudy and glittering in its colouring; the other more correct and more exquisitely beautiful in its outline: the one forcing itself upon the notice of every wandering eye; the other, attracting the attention of scarce any body but the most studious and careful observer. They are the wise and the virtuous chiefly, a select, though, I am afraid, but a small party, who are the real and steady admirers of wisdom and virtue. The great mob of mankind are the admirers and worshippers, and, what may seem more extraordinary, most frequently the disinterested admirers and worshippers, of wealth and greatness.
When Smith was formulating his philosophy in the late 18th Century, Christianity defined the dominant moral code in Western Europe—in both religious and political spheres of society—so his ideas naturally reflected that influence. And the idea at the core of Jesus Christ's teachings is that humans should reject self-interest and embrace an affection for one's neighbors and fellow planet dwellers as the highest virtue. At least in his writings, Smith states a contrary, more truly Christian view, which clearly counterbalances the promotion of self-interest in his overall life view:
And hence it is, that to feel much for others and little for ourselves, that to restrain our selfish, and to indulge our benevolent affections, constitutes the perfection of human nature; and can alone produce among mankind that harmony of sentiments and passions in which consists their whole grace and propriety. As to love our neighbour as we love ourselves is the great law of Christianity, so it is the great precept of nature to love ourselves only as we love our neighbour, or what comes to the same thing, as our neighbour is capable of loving us.
That the core teachings of Christianity run so contrary to human nature explains how so many humans can call themselves Christians while simultaneously worshiping at the altar of self-interest, working feverishly to accumulate personal wealth and power—seemingly to the exclusion of all other concerns. Many vocal leaders of the Christian churches provide an easy, self-interested rationale to justify this hypocrisy. In particular, a recently deceased evangelist called Oral Roberts, often cited as the founder of televangelism, preached that there is no moral conflict between the pursuit of wealth and a belief in the teachings of Christ. Roberts apparently based his misguided philosophy on this passage from the Bible's Third Epistle of John:
I wish above all things that thou mayest prosper and be in health, even as thy soul prospereth.
Roberts wasn't shy about sharing the story of how, as a struggling 29-year-old pastor, he read this passage, decided it meant that being rich was a worthy goal, and, as if to celebrate, bought himself a new Buick the following day.
How could any Christian take Oral Roberts seriously? On Mars, it's clear that a philosophy such as Roberts' is nothing but a self-serving misdirection from the teachings of the religion's founding prophet. What I didn't understand until recently is that humans who choose to follow ministers like Roberts and his many emulators are simply not interested in being Christians, except in name. Such people are delighted to realize that their religion spares them the agony of always keeping their self-interest in check.
This essay is the first of a series that will explore some specific cases where Western societies legislate to prevent "conflict of interest," and perhaps more interestingly, where they do not. The cases will be examined in the light of the way self-interest is perceived by individual humans, as well as by humans grouped into various, possibly overlapping, personal and business relationships.
In reporting these ideas to my Martian peers, I am particularly interested in trying to sort out how humans rationalize the conflict between their own personal interests and the interests of broader layers of society. Is there a boundary that defines the point at which a human will give up pursuing his own self-interest and throw his lot in with the interest of a larger group? If so, there are probably different boundaries for the human clusters of increasing size that radiate outward to encompass the entire planet.
Is there a point beyond which the majority of humans will not pass if it means abandoning their own interests? Or does everyone eventually perceive the point at which personal interests become irrelevant, and mutual interests merge?
Some humans reading this will undoubtedly argue that all of this is perfectly obvious, and such an exploration a waste of time and an unworthy intellectual pursuit.
To those I say, please understand how we Martians think. On a fundamental level, Martian culture reflects some notions that are viewed as "naive" by humans who express their thoughts charitably, or as "sucker-bait," "gullible," or "dupable" by those who don't.
- Before making any decision, from the personal level on up, Martians are expected to consider the decision's possible repercussions on fellow Martians and on the planetary resources on which they depend. The idea of having to make laws to restrain the runaway pursuit of one's self-interest is quite foreign.
- Outside of the pursuit of pleasure, knowledge, and family harmony, the primary motivation of Martians is finding a life's work that suits one's personal gifts, and then working as hard as possible to make sure that the products of one's labor reflect the highest quality one can achieve. It is believed that in this way, one will naturally be rewarded by success and by enough wealth to ensure happiness.
Possible future sources of inquiry in this series include:
- Consider a case where a private company is awarded a contract by a national government agency to help fulfill part of its basic mission. Clearly, the agency's interest is in fulfilling its mission as best it can within the constraints of its budget and its spectrum of resources. The agency's interest, however, is not the same as that of a private contractor, whose primary interest is in maximizing profit.
- When lawmakers in the U.S. Congress make laws, whose interests are they serving? If their expensive campaigns were financed by certain private groups, companies, or industry associations, isn't it in the lawmaker's interest to promote the interests of these financiers? If so, what impact does the interest of the broader mass of the legislator's voters have on his decisionmaking? What if the lawmaker disagrees with the views of those who financed his campaign? And to what groups does the lawmaker refer when he inveighs against the "special interests"?
- Suppose you're a Congressman who is asked to vote on a law that would reduce your income opportunities, while also restricting your access to fundraisers and lobbyists? If the majority of legislators were to make self-interest the guidepost of their decisionmaking, such a law would never be passed.
- Is it appropriate for a profit-motivated company to be responsible for activities whose purpose is to promote the general welfare? This is a fairly common arrangement in the United States, as far as I can see, but it strikes me as a potentially disastrous conflict-of-interest situation. Obvious examples are private companies engaged in providing basic health care or education to the public.
- What about the widespread situation where a monopoly company, or an oligopoly of companies, fulfills society's basic needs for infrastructure—such as electricity, inter-networking, water, and services distributed by radio waves? How about roads, bridges, airports, rail systems, and air travel? To what extent can these infrastructure requirements be compromised if fulfilled by companies motivated solely by profit?
That's it for now. More later.
For a long time now, I've been explaining why the world would have been better off if Apple's computers had come to dominate homes and businesses. I've focused almost exclusively on the virtues of Apple's software, even though Apple has for most of its existence been primarily a hardware company, like Dell or Hewlett-Packard. Why? Because it's clear to all us Martians that what makes or breaks a computing experience is the software. To paraphrase one of your ex-Presidents, "It's the Software, stupid!"
I've also come to believe that humans are genetically predisposed to self-deception, allowing them to talk themselves into whatever point of view is most convenient, or is perceived as being in their best self-interest. Thus, argument over the relative worth of one technology or another is pointless, because no carefully researched and supported set of facts will ever be enough to persuade someone with the opposite view. Indeed, the truth of this axiom is encapsulated in the common human phrase of folk wisdom,
"You can lead a horse to water, but you can't make him drink."
I've noted that when someone conjures this phrase to explain a colleague or acquaintance's intransigence about something, those listening will nod to each other knowingly and somewhat sadly aver, "So true."
And yet, how many humans really think they're as "stupid" as horses?
The only time a change of opinion occurs is when some circumstance in a person's life changes sufficiently that what was highly dubious before is now patently obvious. This is why you read so many stories of former PC users who, when confronted with the necessity of using a Mac for a period of time, invariably come to understand how far superior the Mac operating system is compared with Windows.
I spend little time using Windows nowadays, but my wife is still forced to use a PC for her job. As we both work at home, I have become her de facto Help Desk support for tasks that her remote technicians can't handle. So it was that today I managed to raise my green blood pressure far too high for sustainable health, all in the cause of trying to get a scanner to work with her Dell laptop.
Working with Windows is a lot like trying to communicate with automated phone systems. One menu will explain a variety of choices. Then, you find that either none of them are helpful, or some of them promise more than they deliver. For example, in this case Windows let me know that I had attached a new piece of hardware. (Duh!) Then it offered options to (a) let it try to find the driver on its own or (b) insert a CD that contains the driver. I was skeptical of option (a) but decided to try that. Well, of course Windows came back almost immediately to tell me it couldn't find the driver.
On a Mac? Apple keeps hardware drivers current with all of its OS releases, including incremental updates, and I've almost never had to go searching for a driver for common hardware like scanners and printers. (A Windows user at this point will self-deceptively point out how much more hardware is available for the PC, etc. All I can say is, Mac users have more than enough choices in hardware peripherals, thanks.)
Step two was so infuriating that I refuse to explain it in detail. This involved finding and downloading Canon's driver and software. The finding part was easy as pie thanks to Google and Canon's easy-to-use website. The downloading and installation parts, however, were beyond maddening. The experience exposed so many obvious weaknesses in Windows usability that I had to again wonder how PC users put up with it. I said I wasn't going to go into detail, and I'll try not to. But here are a few observations:
- Clicking download doesn't just download the file, as it does on a Mac. Instead, it spawns a dialog box that requires a choice: Download, or "Run". So, I ran. (Again, a Windows guru would say, "But you can avoid having to make that choice each time by..." And I say, "Yes, but you forget how clueless most computer users are. Even though you can do this, it's not the default experience that it should be.")
- So, after running, nothing happened. Nothing. I thought I'd done something wrong, so I downloaded again. My wife noted that Canon's site suggests saving the file rather than running it, so I did that. But where to save it? In the file browser, it took far longer than it should have to locate the Desktop, which I assumed would be the default location. Even if it's the default, I had to choose it manually. *Groan*
- So once the file was downloaded, I just wanted to click it on the desktop. Guess what? There's no obvious way to expose the desktop. My wife, a 20-year PC user, says she always minimizes all the windows to get there. Good grief. Think of all the lost time in corporate America with clueless users trying to find their desktop. Scary.
- Having installed, I then had to go through another wizard that wanted to help me help Windows connect the hardware with the driver. To get to the wizard, I had to find the control panel for scanners, another task that all on its own makes using Windows look hard from a Mac perspective.
Why does this seem ridiculous to Martians? Simply because, using Mac OS X, you just plug your scanner in and... there's no step two. The Mac's built-in Twain driver can typically pair with the scanner even if the manufacturer's own driver is unavailable. And since this is a core service of the operating system, it works with any Twain-aware software. Isn't that an obvious approach?
This lengthy and agonizing task (don't even get me started on the Windows user interface, and I'm not talking about its relative beauty) reminded me of another tragedy of modern computing, which I've written about before. Namely, the institution of a "Help Desk" in every company today is not one of the inevitable costs of having computers on every desk. It is quite obviously the result of having IBM PCs running DOS or Windows on every desk.
The process of setting up a scanner should be in the skill range of every computer user. In the Mac world, it is. In the PC world, it isn't. It's as simple as that. And you can extrapolate that observation to nearly every other aspect of office computing we have today.
The Help Desk is a huge revenue drain that every PC user simply assumes is necessary, because it has evolved to be so. Today, Help Desks are self-perpetuating organizations, typically driven by contract companies with a clear incentive to make themselves seem indispensable. These folks (or at least, the companies they work for) are at the forefront of the anti-Mac coalition devoted to doing whatever it takes to keep Macs out of the enterprise.
And who at the company that hires the Help Desk is going to question what the "experts" say? After all, these are the guys who keep the computing environment running day in and day out. Business managers simply aren't qualified to make decisions about their computing infrastructure, so they rely on outside contractors for recommendations. And guess what? Those are the same guys who regularly argue for expanding the Help Desk and who regularly explain why it would be a mistake to let employees start using Macs at the office. (For more on this subject, refer to the third section of my earlier article, Protecting Windows: How PC Malware Became A Way of Life. The third section is called "Change Resisters In Charge.")
In this case, the advocates for the Help Desk aren't deceiving themselves. Many of them fully understand that if Macs came in, many of their jobs would go away. But somehow, the business managers and computer users continue to spend most of their time struggling with simple tasks rather than actually getting work done, all because they're convinced they have no choice. And having to use Windows, the average user continues to perceive their PC as this unpredictable, inscrutable, frustrating device whose only virtue appears to be access to the Web and to iTunes.
I'll never forget my highly intelligent disk jockey friend who purchased a high-end PC with all the bells and whistles for recording and editing audio and video. Not only did it cost more than an iMac with the same basic capabilities, but it sat in his house for over a year before he had the nerve (and time) to figure out how to use it to do the things he bought it for.
I tried to explain to him that... But you know how it goes. Tell a PC user how simple something like recording and editing audio is on a Mac, and either their eyes glaze over or they start to look at you suspiciously. And that's if they're already a friend!
But I'm done with trying to persuade humans of anything. They'll either figure it out, or they won't. Unfortunately, another observation I've made isn't good news for any human figuring out that they're wrong about something:
Changes in human understanding, and the policy implied by that understanding, only occur through crisis.
This observation is directly related to the original premise, because if it's impossible ever to "prove" an idea or even a set of facts to another human or group of humans through cogent argument, how do you manage to change awareness of the virtue of alternative perspectives? I'm taking back to Mars the theory that such changes are only possible after a human undergoes some life-changing crisis, or after a community of humans does the same.
In a followup essay, I'll discuss several other current controversial topics that have quite obvious answers, yet which humans, quite often on both sides of the debate, keep viewing from obviously kooky perspectives.
Well, obvious to any Martian I know, anyway.
This being that most political of years, serious issues of national significance have been on my mind. Sadly, judging from the typical discourse I see Americans engaged in, I can only conclude that most humans seem to think it's best to just ignore serious issues. Why is it that people read body language more seriously than they do written language? And why is it that, after so many years of evolution, a pretty face or the color of one's skin is more influential than what a candidate has to say about, oh, you know, energy policy, health care reform, global warming and environmental concerns, economic insecurities, abortion, and so on?
There was an article in the Washington Post recently that finally expressed what has been obvious to me for many years now: Humans have become so cynical that they honestly believe everything is an opinion. There are no facts. If you don't like a particular fact someone presents you with, you simply respond, "Oh, you think everyone should just agree with you!" And likewise, if someone presents you with a lie that you like, you are quite willing to take it as gospel.
There's no facing reality... no desire to really debate issues using facts. Heck, I'm beginning to think that too many Americans don't even know what a fact is. Here's a simple definition:
Fact: The truth about events as opposed to interpretation.
Ah, but now we enter a realm that, for many humans, presents great difficulty: What is Truth?
It is a question that has reverberated throughout the Western world ever since Pontius Pilate asked the question of Jesus. Jesus had referred to a truth, and Pilate's question suggests that he doesn't believe there is such a thing.
But of course, there is. That my cat ran away the day we moved to our new house is a fact. That my wife and I have been married now for almost 25 years is a fact. I have two sisters. That is also a fact.
Extending these to more difficult lines of inquiry, it's clear that changes in Earth's atmosphere are causing global temperatures to rise, the Arctic ice cap to melt, glaciers around the world to disappear, and the incidence of hurricanes and droughts to increase. These are facts, and nearly all scientists today agree with the inference from these facts: that Global Warming is a fact. It is the truth, even if it's extremely inconvenient.
On Presidential Lies
Likewise, it is a fact that the Republican candidate for Vice-President, Sarah Palin, did not oppose the "Bridge To Nowhere," as she claims. She ran for Governor on her support for the bridge, as a matter of fact. Only after Congress tabled the earmark Palin wanted for the bridge did she switch sides. Can you say "disingenuous?" She also didn't sell the Governor's jet on eBay, as John McCain has claimed.
In fact, Sarah Palin and her fellow candidate, John McCain, are going down as the most dishonest folks who ever ran for the Presidency (in my lifetime, at least). Think that's hyperbole? I'm sorry to say that it's not. Every day, more evidence of their willingness to bend the truth backwards is showing up, resulting in nearly daily outcries in U.S. newspapers:
- Running on a Lie (Washington Post, 9/16/08)
- The Odd Lies Of Sarah Palin II: The Bridge To Nowhere (Atlantic Online, 9/15/08)
- Campaign check: Lies and half-truths outed (San Francisco Chronicle, 9/13/08)
- Campaign of lies disgraces McCain (St. Petersburg Times, 9/14/08)
- McCain has become a serial liar (SeattlePi, 9/15/08)
- Press picks over litter of lies on the Palin trail (Sydney Morning Herald, 9/16/08)
- New election low: distorting the fact-checking (Los Angeles Times, 9/12/08)
- McCain: Mr. Straight Talk? (MSNBC, 9/12/08)
- Ringing Untrue, Again and Again (New York Times, 9/17/08)
- True Whoppers (Washington Post, 9/17/08)
And the list goes on and on... just search through Google News some time, and you'll see what I mean. Even though this year's persistent falsehoods are the worst yet, there have been plenty of the same by previous Presidents and their staffs. In fact, I'd argue that it's Presidential Lies that got us where we are in the first place. It all started with President Johnson lying about the Vietnam War, followed closely by Richard Nixon lying about the Vietnam War. And then the real whopper that really made Americans suspicious of their leaders: Watergate. But those observations lead to a huge digression that I should leave for another time.
Here are a few examples of lies told by recent U.S. Presidents and presidential candidates:
- John McCain: He continues to repeat the plain untruth that Barack Obama's tax plan would raise everyone's taxes. This scare tactic usually works, whether it's true or not. In this case, McCain knows it's a lie, yet he keeps saying it. As a matter of fact, Obama's tax plan would raise taxes only on the richest 1% of Americans. For every household that makes less than $250,000 a year, Obama's plan makes quite substantial tax cuts, whereas McCain's plan does not. As with Bush's deficit-busting tax cuts early in his term, McCain's cuts would benefit only the very rich and the corporations they run.
- George W. Bush: Hmmm... Let's see, there have been so many lies, told so well, that on Mars we've determined he's lied more than any President in U.S. history. Everyone knows by now--as a fact--that Iraq never had "weapons of mass destruction," nor did Saddam Hussein have anything whatsoever to do with the terrorist attacks of September 11, 2001. There are hundreds of documented lies by Bush and his administration in support of the larger one, but here's a good one. On October 22, 2002, as the public relations effort to sell the Iraq war to U.S. citizens was heating up, The Washington Post published an article whose title says it all: "For Bush, Facts Are Malleable," which cited two lies in two paragraphs:
In the president's Oct. 7 speech to the nation from Cincinnati, he introduced several rationales for taking action against Iraq. Describing contacts between al Qaeda and Iraq, [Bush] cited "one very senior al Qaeda leader who received medical treatment in Baghdad this year." He asserted that "we have discovered through intelligence that Iraq has a growing fleet" of unmanned aircraft and expressed worry about them "targeting the United States."
Bush's statement about the Iraqi nuclear defector, implying such information was current in 1998, was a reference to Khidhir Hamza. But Hamza, though he spoke publicly about his information in 1998, retired from Iraq's nuclear program in 1991, fled to the Iraqi north in 1994 and left the country in 1995. Finally, Bush's statement that Iraq could attack "on any given day" with terrorist groups was at odds with congressional testimony by the CIA. The testimony, declassified after Bush's speech, rated the possibility as "low" that [Saddam Hussein] would initiate a chemical or biological weapons attack against the United States but might take the "extreme step" of assisting terrorists if provoked by a U.S. attack.
- Bill Clinton: "I did not have sexual relations with that woman, Miss Lewinsky." Yes, that was a lie, unless you don't consider oral relations "sexual." And wow, did Bill pay for that one! Indeed, he was actually impeached for that lie... which, unlike the lies of the Presidents who preceded and succeeded him, had zero impact on the health and welfare of the Nation. From my perspective on Mars, it's inconceivable that one President could waste $500 billion on a war whose rationale he brazenly lied about, and yet receive no punishment whatsoever, while another President had a brief sexual liaison with another consenting adult and lied about it, a sin that led to his being impeached, for only the second time in U.S. history.
- George H.W. Bush: George H. W. Bush's best-known lie is, of course, "Read my lips: no new taxes." He said that during his campaign for President, and then proceeded to break that promise. But a far more serious lie is the one he repeatedly told the American people about negotiating with terrorists:
Today I am proud to deliver to the American people the result of the six months effort to review our policies and our capabilities to deal with terrorism. Our policy is clear, concise, unequivocal. We will offer no concession to terrorists, because that only leads to more terrorism. States that practice terrorism, or actively support it, will not be allowed to do so without consequence.
The only problem is, even as he made such pronouncements, he and the Reagan administration were secretly selling arms to Iran in exchange for the release of hostages. They then turned around and supported the Nicaraguan Contra rebels with the profits from the secret Iran sales. This is all a matter of public record... it is fact, and yet, perhaps because of the complexity of the issues, or perhaps because of the popularity of Ronald Reagan and the transition of his administration to that of George H.W. Bush, the lie and the secret deal managed to wash over Americans' minds without really registering.
- Ronald Reagan: On Mars, we found it hard to believe that anyone so misinformed could rise to become President of the most powerful country on Earth. We debated among ourselves whether Reagan's many factual errors were truly lies, or whether they reflected a fundamentally weak brain. Ultimately, we determined that Reagan was actually smart, and that he told lies in such an aw-shucks manner that the average American not only would believe him, but would never suspect he was lying. Besides his lies about the Iran-Contra affair, like Bush's, there are so many examples that it's hard to narrow the list down. Here's one of our short favorites:
"All the waste in a year from a nuclear power plant can be stored under a desk." --Ronald Reagan (Republican candidate for president), quoted in the Burlington (Vermont) Free Press, February 15, 1980.As a matter of fact, even taking an estimate from nuclear industry sources, the typical nuclear power plant produces about 500 pounds of waste each year. I don't think that would fit under Reagan's desk, do you?
This year, the spreading of lies and innuendo about the candidates--particularly, as usual, about the Democratic candidate--has become more brazen than ever. An example came to my attention a few weeks ago when someone on a mailing list I (used to) follow sent everyone a column spreading demonstrably false information about Barack Obama. This is only one of many rumors spreading virally through the web in attempts to smear Obama. Perhaps the dirtiest is the assertion that Obama is a Muslim, because his middle name is Hussein. Well, no. If you read Obama's biography, that will be clear, but the folks who make this up aren't trying to spread facts... they're trying to spread fear. I only wish more humans would see through their lies and punish any candidate who abides such evil, to an extent that future candidates will think twice before adopting lies as a method of campaigning for President.
All of which leads me to the lies taking place in 2008, nearly all of which are directed from the Republican candidates toward Barack Obama, the Democratic candidate for President. A particularly inflammatory lie that has been making the rounds of right-wing religious groups on the Internet is the one that passed through my email recently. As part of an anti-Obama, viral web campaign, someone called Matt Barber of the Liberty Counsel published an article called "Obamacide." (Nice title, don't you think, for someone claiming to be a Christian?) The article accuses Obama of supporting "partial birth" abortion, using language that refers to Obama's "love affair" with the practice, which even Pro-Choice supporters do not condone. Here's an excerpt from Barber's libelous diatribe:
While serving in the Illinois state senate, he led the fight against a state version of Born Alive that was substantively identical to the federal BAIPA. In 2002, BAIPA passed the U.S. Senate with unanimous, bipartisan support; yet, Obama vehemently opposed its Illinois twin. This places him on the furthest fringe of pro-abortion extremes. The man's devotion to the pro-abortion industry is so fixed that he would rather allow the murder of newborn babies than give an inch to the sanctity of human life.
And Barber was just warming up at that point... there's much more in the article.
If I may say so, this is precisely the kind of character assassination that Barack Obama is trying to eliminate from our national discourse.
First of all, to oppose Obama strictly on the issue of abortion is to ignore the many other important issues the United States faces. My impression is that even many anti-abortion Christians are beginning to understand that.
Most important, the charges made in Matt Barber's opinion piece are false. They are being spread around the web simply to make Obama look bad, and unfortunately many otherwise intelligent Americans are buying this baloney without question.
After researching the source documents for what really happened, I determined the facts are as follows:
- Obama's Illinois senate votes in 2001-02 were on legislation that was radically different from the U.S. Senate's Born Alive Infants Protection Act (BAIPA) in 2002. How?
- Illinois already had in place a ban on partial birth abortions, dating back to 1975. If you don't believe me, check out the Illinois code on this law.
- Therefore, the Illinois bills were not, in fact, geared to providing the protection against partial-birth abortions that the Federal law was. To compare the two is simply deceitful on the part of writers like Barber.
- The Illinois votes, which were joined by 40% of the Senate including several Republicans, were against proposals that would:
- Provide damages to the family equal to the cost of raising the child, an obviously irresponsible, open-ended liability. This included punitive damages against the hospital that delivered the child. Keep in mind that we're talking about a live birth that is protected under the law... not an abortion.
- Redefine "live birth abortion" to include any non-viable fetus that showed signs of life after abortion. This would effectively ban abortion, since it's not restricted to the third trimester of pregnancy. The Federal BAIPA law requires that the "live birth" be a viable child... one that is capable of living outside the womb.
In reaction to this obviously false information from Matt Barber, another reader of the email thread reacted this way:
The last times we had the Communist party try and put a person in the white house, she was black. Now they are doing it again with a man of color and the blessing of the media. We've gotten worse than Sodom and Gomorra with trying to get someone to represent us in the white house who is left of left with a Marxist attitude and no respect for life.
In a followup email, this person clarified his belief that "Communist = Socialist = Liberal."
For a few minutes, my mind was in a state of serious cognitive dissonance as a result of that paragraph, because it so flagrantly violated my understanding of U.S. history and of its political realities. Here was someone I had exchanged numerous emails with opining that Obama and the Democrats were Communists.
The notion that Obama is a Communist or Marxist is clearly a lie, a propaganda-motivated scare tactic at its very worst. It's the same sort of charge that was leveled against Franklin D. Roosevelt by Republicans during his term(s). To the hatemongers who spread this kind of crazy talk, folks from Obama's and Roosevelt's side of the aisle are guilty of one major, unforgivable sin: They believe that U.S. citizens deserve better living conditions than most of you have, and they believe the Federal Government can actually do something to help. There's nothing whatsoever Marxist, or Communist, about this approach. If those who spread such lies would ever bother to read Karl Marx, they would understand that. But they never will, because (I think) they believe Marx's writings were the work of the Devil himself.
Chief among Obama's "Marxist" ideas is that the U.S. should join the rest of the developed world in having universal health care. If you haven't seen Michael Moore's Sicko, you should. Even if you don't like Michael Moore or think he's some kind of left-wing nut, anyone who wants to form an opinion of him and of movies like Sicko should really take the time to see them and judge for themselves. Don't just take some right-wing nut's word for it.
Obama also believes the Federal Government should be investing heavily in alternative (non-carbon) energy in order to become free of dependence on foreign oil in 10 years. Ridiculous? Perhaps... but at least it's a goal worth pursuing. One historical parallel you should recall in judging the idea's ridiculousness is the Federal Government's investment in the Interstate Highway System.
Can you imagine our country without such a vital transportation network? And yet, if the government hadn't built it, who would? It was a huge investment, championed by Republican President Dwight D. Eisenhower, that changed the lives of all Americans for the better.
By investing in solar and other alternative technologies, the Federal Government, as the nation's largest consumer, has the ability to vastly expand that market, thereby lowering prices, increasing production, and improving the technology. Isn't this worth doing in order to make the United States less dependent on foreign oil? After all, would you have invaded Iraq if it weren't one of your biggest oil suppliers? Just consider how much money that war has cost, and the many better ways the government could have spent that money. I don't know about you, but the thought makes us Martians feel not only incredulous, but also incredibly angry.
Another major goal of Obama's is to address the issue of Global Warming. Don't you think this problem trumps abortion on the list of the Earth's ills? By abandoning carbon fuels, and enacting legislation like the one Bush vetoed recently that would have required reduced emissions, humans might have a chance to turn this around.
Regarding Christianity and one's position on abortion, it's clear that Christ would not have approved of the practice, and neither do the Catholic and Protestant churches. However, it's important to understand and respect the fact that one of the singular, founding principles of the United States was--and is--the separation of Church and State.
This means that the Government cannot pass laws that enforce the views of any particular religious group, and since opposition to abortion stems primarily from religious beliefs about the point in time when a zygote/embryo/fetus becomes a "person," anti-abortion laws such as those supported by Right To Life groups are clearly unconstitutional.
The Religious Right, however, does not believe that the Separation of Church and State was ever a constitutional principle. In some ways, such an argument is in the same league as a belief that the Holocaust never occurred, or that the Earth is flat, or that humans have been abducted by Martians for scientific experiments. Poppycock! We would never do such a thing, my friends. Ours is a thoroughly peace-loving, generous, and thoughtful society. If we needed to learn something from a human's body, we'd ask them to participate in one of our studies: By choice, not by deceit.
That said, there is certainly a great deal of interpretation that goes on in understanding when and how the principle of Separation became part of the U.S. Constitution. It's true that the literal phrase 'separation of church and state' does not appear in the Constitution, but clearly the concept is there, ingrained in the American psyche through its founders' strong belief in religious liberty. The First Amendment says "Congress shall make no law respecting an establishment of religion or prohibiting the free exercise thereof...."
Two of the great thinkers at the time the Constitution was enacted had this to say on the subject of religious liberty. In making these declarations, the United States became the first country in the history of the world to propose religious liberty as a founding principle of a Nation:
- In 1776, Thomas Paine wrote, in his famous pamphlet, Common Sense:
- "As to religion, I hold it to be the indispensable duty of all government, to protect all conscientious professors thereof, and I know of no other business which government hath to do therewith"
- In 1779, the Virginia Statute for Religious Freedom, written by Thomas Jefferson, proclaimed:
- "[N]o man shall be compelled to frequent or support any religious worship, place, or ministry whatsoever, nor shall be enforced, restrained, molested, or burthened in his body or goods, nor shall otherwise suffer, on account of his religious opinions or belief; but that all men shall be free to profess, and by argument to maintain, their opinions in matters of religion, and that the same shall in no wise diminish, enlarge, or affect their civil capacities."
- Later, as President, Jefferson wrote a letter to the Danbury (Conn.) Baptist Association that is the origin of the controversial phrase itself:
- "Believing with you that religion is a matter which lies solely between man & his god, that he owes account to none other for his faith or his worship, that the legitimate powers of government reach actions only, and not opinions, I contemplate with sovereign reverence that act of the whole American people which declared that their legislature should make no law respecting an establishment of religion, or prohibiting the free exercise thereof, thus building a wall of separation between church and state. "
I think it's also important for Christians, as well as all humanity, to consider the many other forms of violence that occur in the world. There are many examples I could choose from -- human rights abuses, genocide, starvation, eviscerating the environment -- but a very pertinent one to the debate over abortion is the rate of child abuse in the U.S.
One could easily argue that the violence inflicted on living children as a result of parental abuse inflicts far more damage on society than the violence of abortion. Not only are these children themselves the victims of often horrifying violence, but the impact of this violence on the children's personality development will live on for years. Studies show that nearly all adults who victimize others--their wives, husbands, children, or others--were themselves the victims of violence as children. Thus, beating up your child causes a chain reaction of events that reverberates far beyond that single incident. According to Health and Human Services (HHS) statistics, there were 3.6 million referrals for incidents of child abuse in the U.S. in 2006--about 3 times the number of abortions. After investigation of all the referrals, it was ultimately determined that about 1 million children were the victims of abuse that year.
There are many causes for outrage among those who abhor violence in this world. Abortion is one of them, child abuse another. But there are many others as well. My original point was that as Christians it's a mistake to make political decisions based on abortion alone, since it's not the only thing Christ cares about. I'm sure Jesus Christ would also abhor the alarming economic disparity between rich and poor in this country and many others throughout the world. As an anti-materialist philosophy at heart, Christianity has too often been mute to the economic violence meted out by greed, and has failed to join the battle against those who celebrate greed and become successful from it. Do Christians who stand by in the face of such injustice assume that the meek won't get their due until the Second Coming? If so, it's hard for us on Mars to understand how such seekers of monetary wealth can call themselves Christians. We fear that these are the same hypocrites who can't recognize a lie when they hear one, or, on learning it's a lie, continue to believe it true.
I fear for Mankind, and for all of God's glorious creation, if these are the people who will inherit the Earth.
I’m drowning in ideas I have no time to pursue…! I think this is what some people mean when they complain of “information overload.” In my case, it’s more like “idea overload.”
I recently tried some of the “getting things done” software tools I’ve downloaded in an attempt to get my idea-log under control… but none of them really helped. I’m leaning toward iGTD, since it’s free, full-featured, and actively under development, but honestly, the work of compiling my list of projects and trying to prioritize and schedule them merely made me even more aware of how swamped I am, and how far behind I am in the things I want to be doing!
I will certainly be getting some of these “things” done eventually, but it sure is harder as the projects pile up. And my wife keeps wondering why I’m killing myself over work I don’t even get paid for…! Now, there’s a conundrum that simply could not have existed before the web came along. Questions keep flying past me as I ponder my situation:
- Am I having more ideas for interesting projects now because there’s a potential audience that might be likewise intrigued?
- Or is my “idea center” being overstimulated by the vast number of other fascinating projects now so readily at my disposal?
- And is ADD merely a byproduct of living with the web? That is, are we more distractible today because web browsing can lead us onto so many irresistible, multi-nested, looping digressions?
- If I complete a project that arose as a digression from one I still haven’t completed, did I get anything done?
Heck, with so many interesting avenues to pursue, no wonder so many of us have trouble getting anything done!
Still, hope springs eternal, as they say, so I tried doing a GTD list. However, once I had finished, the exercise merely confirmed precisely why I’m feeling so overwhelmed. And my preliminary list only includes the large projects, not the routine ones that consume so much of my day. The irony of this situation is that all of these are projects that have sprung from this blossoming well of creativity I seem to have fallen into, which I would normally view as pure “fun.”
Having new ideas is great, but only if you have more brains to explore them with. Sadly, I still have only the one that came with my Martian birth.
Just for fun—and because I haven’t got anything else done today—I thought I’d jot down the ongoing projects I’d love to finish, as recorded in my GTD list:
Classic 45s Site
Enhance Classic 45s Features
Better search, customization options, etc. Numerous ideas from way back.
Enhance Classic 45s Interface
Convert from tables to CSS, add Ajax improvements to UI.
Update Classic 45s core software
E-Commerce package in use is old code, hack-prone. Need better reporting and stock-management tools.
Finish crystal clear
Try to get it working in Leopard in addition to finishing planned features.
Update Crystal Albook
Need a whole raft of icons for apps, and I’d like to add some for documents and file types too.
Musings from Mars: Articles
Article on Best Software
I’ve long planned my own take on the “essential Mac software list” category, but I keep waiting in vain for me to finish my software inventory…
Article on CSS 3.0
Did the first part of it (yay!), but I need to finish the job…
Update Ajax/DHTML article
At one time, this was the most popular article on Mars, but I haven’t updated the info there since September 2006. I did a little updating earlier in the year, but I haven’t published it yet…
Update PC/Mac Price article
This article from May 2005 was prescient, but I’d dearly love to analyze for myself what the relative prices look like today.
Write second half of the “Why buy a Mac” article already… the first half is almost 2 years old now!
I’ve shied away from this because I get so hammered every time I bad-mouth Microsoft. Not that I don’t think it’s worthwhile… I just don’t think any of the walking brain-dead who actually need to listen to why Microsoft doesn’t deserve their dollars, will in fact listen.
Article on SAM
This is one of the top priorities right now, and it’s a topic several readers have been asking for. SAM is my acronym for “single application mode,” a method of managing your Mac desktop and application windows that I’ve found highly effective at eliminating window clutter. It’s also one of the best ways to get the most out of a transparent theme like Crystal Clear.
Musings from Mars: Site Development
I’ve got this function about half written, if I recall correctly. It’s one of my Ajax experiments.
Finish article archives
I finally got the archives done and eliminated that horrible month list, but I now want to let readers view the articles by category as well as month.
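For what it's worth, the category view the archives need boils down to one grouping step over the post list. Here's a minimal sketch; the post objects and field names (`title`, `category`) are my own illustrative schema, not Mars's actual data model:

```javascript
// Group an array of posts by category so an archive page can
// render one section per category. Posts without a category
// fall into an "Uncategorized" bucket.
function groupByCategory(posts) {
  const groups = {};
  for (const post of posts) {
    const cat = post.category || "Uncategorized";
    if (!groups[cat]) groups[cat] = [];
    groups[cat].push(post.title);
  }
  return groups;
}

// Example: three posts spread across two (hypothetical) categories.
const archive = groupByCategory([
  { title: "Why Buy a Mac", category: "Apple" },
  { title: "CSS 3.0, Part 1", category: "Web Dev" },
  { title: "Crystal Clear Update", category: "Apple" },
]);
```

The same pattern works for the existing month view: swap the `category` key for a year-month string and the rendering code stays identical.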
Finish updating heatmaps
The category/tag system on Mars has gotten way out of hand… I started on a fix some time ago but need to finish it. Unfortunately, it’s all tied up with separating out the Software Addicts site (a separate project).
Fix slow code on Mars
This is an ongoing effort that will never end…
Fix Comment form and block spam
This has been waiting in the wings for aeons… I tried one spam solution late last year, but though it’s certainly eliminated spam, it also made submitting comments to Mars a more-than-frustrating experience for readers. That wasn’t my intent. Since removing that roadblock a few weeks ago, I’ve been having to wade through 150-200 spam messages a day, with only about 1 message a day that could be approved. I finally put up a new spam filter and some Ajax interactions to the comment form today. Fingers tightly crossed.
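One common lightweight approach to this problem (a sketch of the general technique, not necessarily what Mars's new filter does) is a hidden "honeypot" field plus a minimum-time check, both of which are invisible to human commenters:

```javascript
// Sketch of a comment-spam check: bots tend to fill every field,
// including a hidden "honeypot" input that humans never see, and
// they submit forms inhumanly fast after page load. The field
// names and the 3-second threshold here are illustrative only.
function looksLikeSpam(form) {
  const MIN_SECONDS = 3;
  // A filled honeypot field almost certainly means a bot.
  if (form.honeypot && form.honeypot.length > 0) return true;
  // Reject submissions made suspiciously soon after page load.
  const elapsed = (form.submittedAt - form.loadedAt) / 1000;
  if (elapsed < MIN_SECONDS) return true;
  return false;
}

// A human submission: honeypot left empty, took 20 seconds.
const humanComment = looksLikeSpam({ honeypot: "", loadedAt: 0, submittedAt: 20000 });
// A bot submission: hidden field filled in.
const botComment = looksLikeSpam({ honeypot: "http://spam.example", loadedAt: 0, submittedAt: 20000 });
```

The appeal of this style of filter is exactly what the paragraph above asks for: legitimate commenters never see a challenge, so nothing about submitting a comment gets more frustrating.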
Adapt Crystal Clear to Uno Install
I’d like to make Crystal Clear available to folks who don’t have or want to have ShapeShifter. This takes on an increasing urgency as I look ahead to Leopard…
Finish Classic 45s Jukebox widget
I’ve got this widget about 60% done… I just need to devote a day or two to wrapping it up!
Modify SetAlphaValue for background colors
I’m also working on this one, which I think is a really exciting enhancement to Crystal Clear. Again, the changes in Leopard are spoiling my party a bit…
CoreUI is the new user interface framework Apple is building for Leopard. It looks really cool, but it’s hell trying to figure it out without any documentation…
Software Addicts: Site Development
Review Demo Software folders
I’ve stopped adding demo software to Mars because my backload is already so daunting. The list already on Mars shows 319 software titles-in-waiting. Yet that’s not even all of the ones I’ve got in my Demo Software folder over here. I know this can’t be right, but one of my file inventory tools shows 778 applications in my Demo folder. I desperately need to get this under control before I go any further!
Review existing Applications directory
Same is true with my “approved” applications. I need to know what I’ve got and get rid of what I don’t need.
While I’m doing the above two things, I need to figure out what my naming system is going to be. I keep calling a category “Developer tools” one place and “Programming tools” another. I’ve got a taxonomy in draft form, which I spent a couple of days on a few months back. But it needs to be tested in practice—such as by organizing my applications and demos!
Software Addicts: Reviews
Add to Ongoing Software “Roundups”
Roundups to continue:
Write roundup reviews
Roundups that could be written include:
Write software reviews
Focus on software I’ve purchased licenses for and those I’ve clearly rejected. Probably 10 or more software packages in the “Maybe” category have already been approved, but I haven’t had time to write up my review and pros/cons list.
"At some point [in life] I realized I didn't really care about this species"... "That I didn't really care about this country deep down. That I was in them but not of them. And I found myself in a divorce from my species and my culture."
With growing interest and amazement, I read the back-and-forth argument between two long-time, highly respected Mac nerds yesterday on the subject of Mark Pilgrim’s decision to abandon Mac OS X for Ubuntu Linux. John Gruber is simply one of the best Mac writers there is, and regardless of what he has to say on a particular subject, you have to admire the elegance, precision, and logic of his writing. So when Gruber raised questions about the wisdom of Pilgrim’s move in a recent blog post, his large readership weighed in, and Pilgrim responded, you can be sure that a great many Mac users like me paid attention.
As usual, I agreed with nearly everything Gruber had to say, and the couple of niggles I have are not worth mentioning here since they would distract from the purpose of this article. And what is that purpose, you are wondering? Before I get to that, let me briefly summarize (if I dare) the exchange so far between Gruber and Pilgrim.
- Pilgrim has become fed up with Apple’s “closed”-ness. After 22 years as a sophisticated, high-end user, he’s decided Apple’s “closed” ecosystem of software and hardware is too closed for him. His primary concern is that the integrity of the data he stores in that ecosystem is at risk, because Apple doesn’t always document its data formats and doesn’t stick for long with the proprietary formats it develops for storage. Pilgrim feels jerked around from one closed format to another and is tired of the data conversions, and consequent data loss, they inevitably entail.
- Gruber is surprised and a bit incredulous that Pilgrim would suddenly be bitten by this bug. He agrees that closed formats aren’t good for long-term archival purposes, but questions whether lost iTunes metadata and other format problems are worth chucking his expertise with the Mac operating system for something completely different. He points out that a good backup strategy is part of the solution to preserving precious content. He also devotes a large part of his response to criticizing the Mac blog writers who had knee-jerk reactions against Pilgrim’s decision, and who cited old “Mac is better than Windows because…” arguments without acknowledging the advances Windows has made since Windows XP (or 95, or whatever). Gruber argues against black-and-white thinking in general and for the very reasonable position of respecting other people’s choices even if you don’t agree with them.
- Pilgrim replies that Gruber missed his point and reemphasizes that his feeling “closed in” by proprietary formats has been coming on for a long time. Apple’s decision to abandon the widely used and understood mbox format for Mail was just the last straw. He feels betrayed that Apple switched formats in Tiger without informing its users, without providing them a way to back out, and without documenting the new format.
So why do I want to wander into this disagreement between two Macintosh heavyweights I don’t know, but greatly admire and respect? As I read their separate articles, I saw something with my Martian eyes that may not be clear to them. What I saw wasn’t an OS switch story, but rather a love story.
I’m coming to believe that the human brain just isn’t very reliable. Long ago I concluded that humans would never understand their own behavior, simply because they’re not capable of analyzing behavior without affecting it. The mind is too complex, there are too many variables that define behavior, and how can you stand outside human-ness and study it without reflecting your own beliefs and preconceptions? It just can’t be done, which is why we’ve made so little progress in psychotherapy and instead are becoming more and more dependent on the “objective” injection of drugs (which we also don’t fully understand).
So the prospect of someone as intelligent and knowledgeable (those are different things) as Mark Pilgrim abandoning an OS as highly evolved as Mac OS X over a file format issue is incredible. This is clearly an emotional response, and that’s the only way I can understand it. As a heavily invested Apple user myself, I have become incensed at the way Apple often treats its customer base. I’ve only been a Mac user for 10 years–less than half of Pilgrim’s 22–but I definitely feel Apple “owes me” something for my “loyalty”, especially for not having abandoned the platform in the sad years of the late 1990s when Apple lost its way. Like Pilgrim–and Gruber–I have many criticisms of the Apple platform and software, and if that Automator action interrupts my work one more time I’m gonna scream! (Why can’t it work in the background when it doesn’t require any input from me?) I have tried–and abandoned–and tried–and abandoned–using a local iDisk with .Mac so many times it’s not at all funny. Each time Apple says they’re improving WebDAV for .Mac, my hopes go up, and I try again. I’m currently trying again, in fact… we’ll see how long it lasts.
One of the advantages of being a Martian, though, is that I do have the ability to stand outside myself and see when I’m being silly. (It’s the antennae.) So I would recognize when my “fed-up”-ness was wresting control of my good judgment and rein it in. In this case, I think Pilgrim has failed to recognize that his beef isn’t with Mac OS X or the physical entity he calls his Mac, but rather it’s with the faceless corporate entity called Apple, run by its arrogant and righteous managers and programmers.
As Gruber points out, the question isn’t whether Apple is open or not, but whether they’re open enough. One of the things I value about Apple and its products is the new ideas they bring to computing and their willingness to take risks with new approaches. I’m a “love new stuff” kind of guy, so I welcome anything new with open arms. This can be a problem when the new thing turns out to be a skunk in disguise, but with Apple that’s pretty rare. As a New-loving guy, though, I realize that nothing lasts forever. New things always displace old things, and they’re only new for a short time.
As I reminded a web developer colleague of mine recently, when you’re in the business of building websites, you simply have to accept that whatever you know today is not going to be sufficient 2 years from now. APIs change, languages change, programming techniques change, graphics technology changes, browser technology changes, hardware capabilities change… you name it! It’s all malleable, and you have to be able to roll with it.
This constant “newness” in computing means that if you want to play and create in the digital world, you have to be prepared to convert from one format to another many times over in the course of your lifetime, or risk leaving valuable creations behind, locked in some old file format (or hardware format) nothing can read anymore. (The alternative, of course, is to stick with easels and canvas and pencil and paper.) I can easily identify with Pilgrim’s concern here… all creative individuals can. What we make we want to keep (the good stuff, anyway), and we want to be able to enjoy it again 5 years from now, or whenever the mood strikes. (I’d better do something about that large reel of magnetic tape with the original copies of my 1978 recordings on it before it’s too late!) Like Pilgrim, I worry about this constantly. I try not to go into any new technology or tool without understanding how I’ll get my content out again. If I find I can’t, I abandon the technology before I’ve invested more of myself than can be easily migrated manually.
A couple of recent examples from my world… Bloglines is a great RSS service, and after having used NetNewsWire for about a year, I switched because Bloglines had this very cool “clip” function. In Bloglines, you can easily “clip” an article and assign it to a folder. Sounds like a great way to archive content you’re interested in, no? No. It turns out Bloglines provides no way to present that content except through its administrative interface, and no way to export the metadata the clips represent. So, bye-bye Bloglines.
Del.icio.us is another terrific Web 2.0 service, and I still use it. But when a series of service interruptions left me with no access to my bookmarks for a few days last fall, I panicked. I used a free WordPress plugin called Mysqlicious to import all of my Del.icio.us content and metadata into a MySQL database, and I now keep a mirror of all my Del.icio.us bookmarks there.
In fact, these two conversions are what led to the evolution of Musings from Mars. The “library” of News, Resources, and Software I keep here is nothing more than my bookmarks, all grown up with a great deal more content and metadata than I could store in Del.icio.us or Bloglines. I’m completely at ease now, because relational databases and SQL are standard, well understood data stores and methods of retrieval. The content itself is simply Unicode text, in some cases tagged with HTML. As a web guy, I’m very comfortable with HTML as an archival storage method, and with standard graphics formats like GIF, JPEG, PNG, and TIFF to store the images.
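To make the point concrete, here’s a minimal sketch of that kind of “all grown up” bookmark store. I use SQLite rather than MySQL just to keep the sketch self-contained, and the table names and columns are entirely my own invention (not the actual Musings from Mars schema), but any relational database does the same job:

```python
import sqlite3

def open_library(path=":memory:"):
    """A minimal bookmark 'library': plain SQL tables, plain Unicode text.

    Hypothetical schema for illustration only.
    """
    conn = sqlite3.connect(path)
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS bookmarks (
            id    INTEGER PRIMARY KEY,
            url   TEXT NOT NULL UNIQUE,
            title TEXT,
            notes TEXT          -- free-form commentary, HTML allowed
        );
        CREATE TABLE IF NOT EXISTS tags (
            bookmark_id INTEGER NOT NULL REFERENCES bookmarks(id),
            tag         TEXT NOT NULL
        );
    """)
    return conn

def add_bookmark(conn, url, title, tags=(), notes=""):
    """File a bookmark with as much metadata as you care to keep."""
    cur = conn.execute(
        "INSERT INTO bookmarks (url, title, notes) VALUES (?, ?, ?)",
        (url, title, notes))
    conn.executemany(
        "INSERT INTO tags (bookmark_id, tag) VALUES (?, ?)",
        [(cur.lastrowid, t) for t in tags])
    conn.commit()

def by_tag(conn, tag):
    """Everything filed under a tag, ready to present however you like."""
    return conn.execute(
        "SELECT b.url, b.title FROM bookmarks b "
        "JOIN tags t ON t.bookmark_id = b.id WHERE t.tag = ?",
        (tag,)).fetchall()
```

The point isn’t the particular schema, it’s that SQL and Unicode text will still be readable long after any one service’s export format is forgotten.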
So, what about some of the conversions Pilgrim has had trouble with? Mail, for example? I’ve had lots of fun with mail formats over the years, but it’s possible that the solutions that work for me just wouldn’t do it for him. First, regarding Apple’s new “closed” format that replaces mbox. Is this such a tragedy? I’m of the mind that the engineers at Apple are a lot smarter than me about this kind of thing, and if they felt the need to change mbox in order to optimize Spotlight, I say, go for it! Being able to search across all of my past email is the most important thing anyway, isn’t it? What’s the good of having your mail in an “open” format, if it’s not easy to search? Pilgrim’s beef seems very strange to me, especially after I did a couple of quick experiments this morning with the .emlx format. (Actually, it looks to me like Apple still uses mbox for the mailboxes themselves, and emlx for the individual mail items. In any case, .emlx refers to the mail items, not to the mailboxes.)
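In fact, the .emlx layout is simple enough to pick apart yourself. The sketch below is based on the commonly observed structure–a byte count on the first line, then the raw RFC 822 message, then an XML property list of Mail’s metadata–which Apple doesn’t officially document, so treat the layout as an assumption:

```python
import email
from email import policy

def read_emlx(raw: bytes):
    """Split one .emlx file into its message and its metadata plist.

    Assumed layout (commonly observed, not documented by Apple):
    first line = byte count of the RFC 822 message, then the message
    itself, then an XML property list of Mail's flags and metadata.
    """
    count_line, _, rest = raw.partition(b"\n")
    length = int(count_line.strip())
    message = email.message_from_bytes(rest[:length], policy=policy.default)
    plist = rest[length:]  # Mail's per-message metadata, as XML
    return message, plist
```

If that layout holds, your mail is only ever one `partition` call away from a standard RFC 822 message–hardly a data-hostage situation.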
First, I tried a new (to me) piece of shareware called File Juicer just to see what would happen when I fed it a folder-full of .emlx files. I was pleasantly surprised to find that everything converted very neatly to .txt, .html, .gif, .jpg, etc files. File Juicer put each file type in its own folder at the end. Lo and behold, all those ads I’d trashed still had their HTML files gloriously preserved! All of the attachments were neatly dumped out for me. Now, this isn’t an email archive, but if it’s the content you want to get at, it’s certainly an easy way to do that.
Second, I used a tried-and-true piece of shareware called Emailchemy, which I had previously used when archiving my Exchange mail to Apple Mail last year. Emailchemy makes it wonderfully easy to convert from just about any email format to another. You just point it at your Mail folder, and Emailchemy will preserve the directory structure, setting up mbox files, Eudora files, Thunderbird files, and more, which you can then import into another mail program. If you want to use a Mail client interface for accessing your archived mail, this is a very easy way to do that. It didn’t take long to import my Apple Mail mail into the terrific Opera mail client this way.
Third, I went ahead and tried a piece of freeware I’d downloaded earlier this year called MHonArc, which is specifically designed to convert email formats to HTML as an archival utility. Now, this is thinking outside the box, folks! MHonArc is a command-line utility, but there’s a Mac OS X interface (also free) that makes it easy to use without having to learn the command syntax. You just point the software at your mail files, and it converts them to linked HTML. After I combined my “sent” and “inbox” folders in the same archive, I really saw the wisdom of this approach. Since the software preserves threads, I could easily find my replies and my receipts in the same thread! And honestly, I think data in HTML is pretty darn safe!
A last option I didn’t bother to try, but which I also find very compelling, is a tool like MailSteward, which converts your email to a relational database. Now honestly, Mark, doesn’t this sound better than switching to Linux? For $50, you can sock your email away in a safe database from which you can output text files, SQL files, mbox files (yes!), and print (PDF) files, and which you can search much more flexibly than any mail client can.
See, Mark… it’s not Mail, and it’s not iTunes, and it’s not AppleWriter, or whatever. Slowly, over many years of frustrations, you’ve developed a negative attitude toward Apple, and the bough has now broken. Apple is a lucky company in that they instill intense loyalty, verging on worship, in their users. But like love, loyalty has a dark side. Like someone we love who has betrayed us, Apple has a way of pissing off its loyal customers through neglect and indifference. Everyone who has used the Apple support forums has found them useful, but also quite cold. I have never ever seen an Apple employee step in to one of them to help people out. In fact, they seem to deliberately avoid doing so. Not good PR, in my view.
Apple and PR
A lot of Windows tech writers think Apple is great at PR, and that we all love Apple products because we love the packaging, or the advertising, or whatever. But Apple is actually pretty lousy at PR, except the very quiet kind at a very safe distance. Their website is wonderful… easy to use, inviting, attractive, informative, filled with tutorials and videos, and other fun stuff. The support site is great, too, but you never sense there’s anyone at Apple actually at home when you visit.
As an introvert, I can understand this: I like to provide interesting things and hope people enjoy them or find them stimulating. But I don’t want to shake hands with anybody or stand in a room filled with readers and give a talk. Still, you gotta admit it’s not a great PR approach.
When Apple’s ads do get noticed–its latest TV commercials, for example–they’re flashy and interesting and well made, but they aren’t going to change anyone’s mind. I don’t think the iPod ads ever sold an iPod, actually. What sold the iPod was friends meeting friends who showed them their iPod. It’s such a great product it “advertises itself.” If iPods were priced like PCs, this would not have happened, but once Apple got the iTunes music store going on Windows, and an iPod for under $300, that’s when the gates really opened up for the iPod economy.
After having spent $7,000 in one year at my local Apple store, I became livid at a manager there who refused to give me my Federal employee discount, simply because I’d forgotten to bring it to the store with me. I said, “Just look me up in your database.” He said, “We don’t have access to that information.” “What!? How can you provide top-notch customer service when you have no way of getting to know your customers?” He just shrugged his shoulders and acted like he didn’t know and didn’t care.
If Apple’s customers sometimes feel like they have a personal relationship with the company, then it can be a particularly bruising relationship for guys like Pilgrim who are digitally gifted and technologically savvy. It would be like being married to a spouse you admire because she’s a little bit smarter than you. She does many wonderful things and makes many fine decisions. But lots of decisions get made without consulting you. You argue about this, and she promises to get your opinion the next time before going off in a different direction. But then she doesn’t. Never does, in fact. Even when you think she’s brilliant, you bristle at her superior attitude. Finally, it’s too much for your ego, and off you go.
For “the rest of us,” Apple is simply brilliant so much of the time that we just forgive the few blunders. I personally stand in awe of so much about Mac OS X that I couldn’t give a rat’s ass about the mailbox format it uses. My issue is, “Make Mail play nice with Exchange 2000!” I’m delighted that Apple uses XML as a core data format, including the format for my iTunes metadata. I’m in love with QuickTime because it’s completely interoperable with open video and audio standards, and it’s extremely easy to convert from one format to another. Unlike Microsoft, Apple hasn’t tried to develop its own formats for the sake of controlling the standard and locking up the market. At least, I don’t believe that’s their motivation. This is why there is no proprietary Apple video codec, or audio codec, or graphics format. Apple uses the PDF imaging model–a descendant of PostScript–as the basis for its graphics engine rather than developing one of its own. Isn’t that pretty darn open?
No, the beef isn’t with Apple’s openness or about conversion issues, which are generally very easy to work out. Actually, one of the risk factors for data that Pilgrim leaves out is the degree of support the format has from developers. To that extent, as Gruber says, the era of being at risk by using a Mac is now over, and if anything the pendulum is swinging the other way. There are so many developers building quality Mac OS X applications nowadays, that Cocoa is well on its way to being a factor that converts users to the Mac all by itself.
On the other hand, a format like mbox is used less often nowadays. Thunderbird doesn’t use it, and neither does Outlook, or Opera mail. Eudora has a shrinking slice of the pie, and Apple Mail’s slice is rising. I’m not saying mbox is at risk, but I wouldn’t count on it being around forever. It’ll be around only so long as there are developers willing to support it, which requires customers who are demanding it. Moving from one minority platform to an OS with even smaller support–especially when the platform you’re leaving supports everything the minority platform does–seems a little odd. Especially when the platform you’re leaving is more open than it has ever been before, as Gruber points out.
So odd, in fact, that I think it can’t be explained logically. None of the reasons Pilgrim gives make any sense. I’m not arguing for Mac OS X or against Linux here, I’m just saying a switch like this takes a great deal of effort, and why turn your world upside down over a change in mail formats? Especially when all you really have to do is switch mail clients. That, I could understand. I’ve done it often enough myself.
No, what we are witnessing is the end of a love affair. And when someone falls “out of love” with a company like Apple, there are no counselors the couple can turn to for help. So they do the only thing possible: make a clean break, and get away from everything that reminds them of the great things they accomplished with that beautiful spouse over the last 20 years.
You write up your reasons for breaking up, and everyone but you realizes you can’t see the full picture. But certainly I, like many Mac users in a similar position vis-à-vis Apple, mourn the breakup and wish it didn’t have to be so. Powerful emotions can close minds as surely as any proprietary format. And unlike that closed format, prying open a closed mind is nearly impossible.