December 05, 2008
|
I've been meaning for a couple of years now to post a little ditty here about one of my favorite authors, Roger Ebert. He is one of the most important 'men of letters' writing in America today.
You may remember him as one of the thumbs up/thumbs down reviewers from TV's movie-review programs "Siskel & Ebert" or "Ebert & Roeper". But he's been a newspaper man for a few decades in Chicago, and his work has been online for the last several years. I visit his site regularly (rogerebert.com) and catch up as much as I can from the archives in addition to the new stuff that's posted. Mr. Ebert and I often disagree in matters of taste as well as politics, but I enjoy reading his work immensely.
Ebert has a phrase that he likes to employ when discussing a movie that has a particular bent; he'll remind us that "a movie is not about what it is about; it is about how it is about it." This is a profound thought, and it bears consideration. For the sake of argument, think about the movies "The Godfather" and "Analyze This". Among other things, both are ostensibly about how hard it is to be a crime family. But how they go about showing that is what makes them the (very different) movies they are.
This is all by way of saying that similarly, Ebert's movie reviews are not about the movies he's reviewing; they are about how they are about them. This is what makes his writing so much fun. His reviews and essays are multi-layered.
Take, for example, his review of "The Aristocrats," a movie featuring a large number of different comedians telling the same joke. Ebert starts his review by commenting on the nature of different kinds of humor (the quick and the slow build-up), then takes a series of quick snips at pieces of the movie. His review then ends with a killer punch line that reveals that the whole review was a slow build-up all along. Brilliant!
(As a former producer of a comedy radio show, I was put on guard when, early in his review of "The Aristocrats," he said words to the effect of "I know something about humor." Echoes of "Good Morning, Vietnam" rang through my head. But when I got to his punch line, he had me convinced.)
Alas, my post today is not really about why I enjoy Ebert's writing so much. But take my word for it; you should go read him.
Mr. Ebert spent a couple of years recuperating from surgery complications that nearly ended his life. He is still unable to eat or speak (so yes, his television days appear to be behind him at present and may well stay that way, and yes, he is much thinner than the guy you might remember seeing on TV), but he is back at work, writing as if there were no tomorrow. During his medical ordeal, a movie called "Expelled" came out that, in Michael Moore-ish fashion, creatively mixed fact and fiction to claim that Intelligent Design proponents were being unfairly treated by Big Science.
As Ebert resumed writing, he was frequently pestered to write a review of this pseudo-documentary. He recently posted his response, not within his formal movie reviews on the Chicago Sun-Times sponsored site, but on his personal blog here.
In my opinion, the blog entry stumbles out of the gate, but once he picks up steam, he hits it out of the park. (How many metaphors can I mix in one sentence? My record so far is tied at three.) You should read it. Go ahead. I'll wait.
As you'll notice, he allows comments to be posted. What inspired me to write tonight was a series of comments that appeared below this particular entry. Keep in mind, his essay was about the movie Expelled and, in particular, the intellectual dishonesty that Ben Stein and the movie's producers employed in claiming that Intelligent Design was anything other than a cover for religious dogma. That was the whole point of Ebert's essay.
He often notes the use of the "excluded middle": the failure of the movie's producers to entertain the notion that some people can be religious and still believe that evolution works the way scientists describe. What I found fascinating was how several commenters (commentors?) completely missed the point of Ebert's essay and went straight to the same "excluded middle" assumptions by asking, "What is the meaning of life if we're all just the result of a bunch of chemical interactions?"
Understanding biological evolution has nothing to do with resolving philosophical or, for that matter, religious conundrums. Ebert's review did not take a position on religion (although, if I recall correctly, he has stated elsewhere in his writing that he believes we are more than just a bundle of chemical reactions). Religion wasn't the point. Intellectual dishonesty in a movie that claimed to be a documentary was the point.
But I feel compelled to address the little philosophical conundrum that those commenters posed, because I hate, hate, hate crimes against logic. The commenters in question assume that subscribing to the concept of biological evolution necessarily means believing that we are nothing more than a bundle of chemical impulses. From this faulty assumption, they further deduce that people who accept evolution have no ends worth pursuing; "no heart to love / no evil to rise up above," etc. In their telling, if we accept the theory of evolution as demonstrated, then our lives hold no value and we hold no faith but greed.
These responses completely missed the point of Ebert's take on the movie, completely missed the point of scientific inquiry, and insultingly missed the point of logic. They also assume that atheists (as if everyone who understands evolution must therefore be an atheist) don't feel emotions, engage in morality, or hold values. Which raises a question that's interesting to ponder:
If this is all there is -- if we get one shot at life, and there's nothing left of our consciousness once our brain stops working -- then isn't this life that much more precious than if we instead assumed that life is never ending? Isn't the atheist who dies for a cause more noble than the believer who expects that there will be rewards in the Great Beyond? Isn't triumphing over evil all the more urgent if we know that there are no second chances to get it right? And likewise, aren't we less likely to strap a bomb to our chest or commandeer an airplane on a murder/suicide mission if we are assured that what waits for us "on the other side" is not our own personal paradise, but instead... nothing?
It seems to me that life is precious, no matter which side of the philosophical or theological fences you find yourself standing.
September 09, 2008
|
As I write this, our son Nolan is three years old. He is a middle child, with three years separating him from his older brother, Alexander, and there's another three between him and baby brother Andrew. He has just started pre-school, and is racing past the milestones that all three-year-olds approach at around this time: potty training, independence, asserting his personal preferences, etc., etc.
This is not his story.
Oh, sure, I posted that picture of him playing on the school playground because it's a darn fine photo, and it shows off his enthusiasm as well as his beauty. But, more to the point, I posted it because I like it. I took the photo, and I'm proud of the way I captured him. And I'm proud of him. This is my blog, and even when I'm writing about my kids, or politics, or pop culture... I'm writing about me. The photos here are not just about the subjects of the photos; they are also about the photographer.
So riddle me this, Batman: at what point does a story (or a photo, or what have you) cease to be mine to post? I've commented on this dilemma previously, but it's becoming increasingly relevant now. For example, I've noted before that you don't see me mention certain family members here because they prefer to keep any details about themselves private. But when their details are also my details, and I want to go public... where is it appropriate to draw the line?
While that's already a bit of an issue with regard to adult friends and relatives, what about the people in my life for whom I currently make those decisions, but who will eventually be making their own? I have devoted an entire section of my website to each of my three sons... but what happens as they get older, and assume more responsibility for their own image?
When Andrew reaches the age where most of his friends have relatively unfettered access to the internet, how much of his life's story is it fair for me to have posted online? When Nolan starts dating, how easy should it be for his prospective paramours to discover the details of his potty training? Am I exposing Alex to teasing down the road in high school because today, while he is six, I broadcast that my son wants to compete in the Olympics?
When I was in high school, I was simultaneously an extrovert and shy. "Shy?" you ask. "Is that possible?" Well, what I mean is, I was shy when it came to romance. I was never very forward when it came to girls, and I always kept any budding romances pretty much to myself. Sure enough, when given the chance to meet one of my girlfriends, what did my mom do? Pull out the old photo albums and show pictures of me from prehistoric times. A regular Cringeosaurus.
But this is different. I've been keeping a blog for all these years as a means of keeping in touch with old friends and new; a way to let y'all know how things are going in my life, for those who might find it interesting. This is *my* story (and, for all that, it's only the part of my story that I currently choose to broadcast... and it's highly edited, at that). My story necessarily includes the fact that I have three brilliant, beautiful, athletic boys. (And I'm not biased in any way about them.)
At what point, though, do I back off from sharing stories here that include them? If one of them faces a particularly challenging problem that is of concern to me, at what point do I say, "One of my children..." instead of saying, "Alex..."? At what point do I stop sharing publicly the photos I take of them? (And, instead, save them somewhere, only to be hauled out at family reunions or when new significant others are introduced?)
I love my boys.
They are growing so fast.
How long can I hold onto them? And how long can I share my experience of them with you?
June 12, 2008
|
Around this time last year (and again this year), traffic to my site increased partially because there are a lot of folks out there searching Google, Yahoo!, and other search engines for ideas regarding "Valedictorian speeches." An essay that I posted here a couple of years ago ("Worst Valedictorian Speech Ever") ranks high among the search results. I guess there aren't a lot of us former Valedictorians who have posted our speeches, even though I *know* that there are many, many, many better examples out there of the species than my own feeble creation.
That said, I've received a couple of different kinds of reactions, neither of which I expected. One was a series of responses (both in public comments on the site and private e-mails) from people who were there, giving their [favorable] reactions (over two decades later, granted). The other kind of response has been from kindred spirits currently facing the same kind of dilemmas I faced back then: given an opportunity to deliver a speech to the entire graduating class, its teachers, and its administrators... now what?
"Kevin" posted the following just a few days ago:
Hi Allan, i am in a similar situation and seek some advice. i am valdictorian at my school and i've written about 3 first drafts of my speech by now, but every time it gets shut down [by] the principal because it is too negative and will make the school look bad. all i want to do is speak the truth about the injustices taking place and i am still debating whether or not so speak about the injustice. well, i would like to know if you feel it was worth it?
Well, now you've done it, Kevin. You've asked an old fart to give you advice. Here goes.
First, congratulations on earning the top spot in your class. You certainly must have done an awful lot of busy-work homework assignments well in order to have filled that slot. My heart goes out to you. You will never be able to get those hours of your life back. But, then again, you wouldn't have been able to get them back if you had spent that time playing Nintendo, either.
Second, when you send out your college apps and resumes, be sure to capitalize your 'I's. [Sorry, Kevin, I couldn't resist.]
You ask if I feel it was worth it, delivering the speech that I did. Your question brings up dozens of thoughts, often conflicting, so here are a few in no particular order:
* As I mentioned in my essay, I'm embarrassed by that speech now, and by how bitter it makes the younger me out to have been. Never mind that I'm even more bitter now. Possibly. But bitterness isn't attractive. It doesn't get you the girls. Trust me, I know. From cruel, bitter experience.
* I gave that speech in 1986. I'm now forty years old. The events that led up to my writing and delivering that speech, and the fallout afterward, have had no lasting impact on my life that I'm aware of.
* I am not aware of my little speech having had any lasting impact on anyone else, for that matter. I got on stage and said my piece. As one teacher had remarked, I rained on some people's parade. Did the teachers and administrators kiss and make up as a result? No. The administrators continued the path that they were on, of favoring discipline over academics, and my alma mater's educational scores plummeted. Some very good teachers left the school, while other very good teachers stayed and did the best they could. Did any of the teachers change their approach to teaching or their relationship with the administration or how they handled their students? I don't imagine so, and I've never heard anything to imply otherwise.
* As The Man said so much better than I ever could (granted, he was speaking about other things, but the words are just as true now): "The world will little note, nor long remember what we say here...."
* I'm proud that the speech mostly holds up after all these years, a few awkward phrases notwithstanding. If you agree it holds up, then I suspect it's because the speech was not specific about incidents, but about particular issues. Those issues apparently still resonate, even though the incidents that brought them to light are long since moot. [I told you my feelings are conflicting. Embarrassment and pride at the same time?]
* As a result of my speech, all of the valedictorians who came after me had to have their speeches vetted by the principal. So, I guess there was that one legacy. Since I'm opposed to censorship, I'm not proud of that legacy. Then again, what's the point of free speech if you aren't free to say something that makes the Establishment uncomfortable? Kevin, it's individuals like you who have individuals like me to thank for the fact that your speech has to be vetted. I'm sure the same would be true if our chronology were reversed.
There's more, but it's late, and I have [paying] work to do before I call it a day. More to the point is your own situation: unlike mine, your speech has to be vetted by the Establishment. Your principal is either hoping to make sure the event goes smoothly for all concerned, or he's covering his behind, or both.
You speak of injustice at your school, but of what nature? Racism? Favoritism? Socio-economic classism? Religionism? Were people expelled because they wore black trench coats [as has happened at some schools] or wrote violent essays? Was someone offended because they saw a Christmas tree and they don't believe in trees?
What was the severity of the injustice? Were the victims made to feel bad about themselves? Were they given lower grades or fewer privileges? Were they drummed out of school? Were they physically harmed?
Who is your speech aimed at? Your fellow students? The administration? Parents? The janitorial staff?
These are important questions that will shape your approach. The thing is, your message can be expressed in either positive or negative terms, and you have the choice of being humorous or serious.
Comedy is much, much, much more subversive. It is also harder to pull off. I've spent most of my "adult" life (including my time at high school) absorbing and practicing comedy. But as I learned recently, I can still miss the mark badly. I wrote a self-deprecating piece about my Irish heritage a couple of weeks ago, and managed to offend family members in the process. This is the exact opposite of what I wanted to do (and I'm still working on the apology -- a well-crafted apology is even harder to employ successfully than comedy, which is hard enough). I knew when I wrote my valedictorian speech that I was neither in the mood nor did I have the chops necessary to write humor for that particular audience at that particular time. Your mileage may vary.
If your target audience is your principal, you've already delivered your message, and it has been rejected. So, move on. If your goal is to upstage your principal, you can always have him vet one speech and then deliver another. Note: I am *not* recommending this. Unless you see your principal as a villain, it's reasonable to assume he has good reasons for steering you in the direction he is. Allow me to suggest you work with him to address his concerns but still address yours, as well.
If your goal is to give a memorable speech, allow me to recommend stand-up comedy rather than the aforementioned pointed humor or deadly-serious approach. There are several examples of this approach available on the web.
[Then again, they come up higher in the search engines than my speech, so since you found me, I'll assume you've already decided to consider other-than-stand-up-comedy options.]
BTW, nobody will remember your speech. Sorry. But you can post it online later to remind them. When you do, send me a link.
I knew my speech would never have been approved if it had to go through serious vetting. As was pointed out in the comments section of my post on the subject, I wrote the speech at the last minute with the collaboration of my writing partner of that time (and with whom I later went on to edit a college humor magazine). Had I written my speech in collaboration with the school principal instead, I couldn't begin to guess what form it would have taken. It might have been a fine speech. But it certainly would have been different.
But my goal wasn't to upstage the principal; nor was it to rain on my fellow graduates' parade. I knew that was going to be the result, but that wasn't the point. I truly wanted to say something worth saying. In retrospect, I'm still not sure that I did.
You ask my advice, but here I am yammering "It depends! It depends!" Here's the deal, Kevin:
- eighty percent of communication is non-verbal. Tone of voice and body language convey more meaning than the words being said.
- sometimes, some well-placed silence can speak volumes more than words ever can. [Along those lines: if I had had more time, this essay would have been much, much shorter.]
- if at all possible, work with your principal in good faith on the message, especially if the principal is working with you in good faith.
- practice, practice, practice the delivery.
May 04, 2008
|
I don't drink. There's no particular reason; I simply never got into it. The taste of most alcohols just doesn't appeal to me, although I will cop to occasionally enjoying a milkshake made with Irish Cream and coffee-flavored Häagen-Dazs. Several of my favorite recipes call for cooking with alcohol (take a look at the recipes I've posted, like Jambalaya, for a tasty example). But that said, drinking isn't my thing.
The fact that I don't drink is somewhat unexpected, given my Irish heritage. Here's how Irish my heritage is: my grandparent whose surname at birth was McMahon died of liver failure, resulting partly from her penchant for beer. No kidding.
When my cousins and sister and I were kids, the biggest honor we could imagine during those summer weekends at our grandparents' cottage was to be allowed to carry the beer pitcher from the tap to where the adults were sitting in the yard.
[For long-time readers of my blog, I'll point out that these grandparents are not the ones who were Methodist ministers. Here's how NOT-Irish my other grandparents were: when administering communion, they used grape juice instead of wine. No kidding. ]
This is what it means to grow up as part of an Irish family: the tap I mentioned above jutted out from the side of a refrigerator that resided on the front porch of the cottage, with a keg inside. The fridge contained nothing else. I'm not making this up. The aforementioned cottage was in Canada, where the national bird is the Molson Golden. Okay, I made up the bit about the national bird, but really, what else has Canada contributed to American culture but hockey, beer, William Shatner, and beer?
The extended family that populated my summer visits to Canada were consummate storytellers and avid card players, and beer was ever present in the background, no doubt helping to facilitate both. Given that I soaked up all the storytelling and card playing, I find it an interesting quirk that I never had any interest whatsoever in appreciating so-called adult beverages.
[I will also acknowledge that another aspect of my Irish heritage involved being exposed to Irish cuisine, which consists of boiling "food" until it has no flavor and no nutritional value. Salt to taste. "Food" consists of some combination of potatoes, cabbage, and meat. I have also sidestepped that aspect of my Irish heritage.]
Later, in my grad school days, I made it a point to learn what wines go best with the meals I would prepare for my paramour at the time. She came from a family that had some means, and I occasionally felt like my blue-collar background colored (unfavorably) their opinion of me. On occasions when I was not feeling particularly charitable about an upcoming visit with her family, I'd contemplate asking them what meal they would be preparing so that I would know what kind of beer to bring.
But for all that I was steeped in the couture of wine and the culture of beer -- ha! "Steeped!" There's another drink I don't drink: tea -- I've simply never acquired the taste.
A few years ago, I tried explaining to someone that I never could get into the taste, and she pointed out, "Allan, people don't start drinking for the taste." [This someone has, in the ensuing years, become quite the wine snob, so she might or might not give the same response in her older, wiser frame of mind.] While I know that this is not necessarily true, it does bring up the valid point that some people don't drink for the flavor, but for the effects.
I have long suspected that my lack of interest in drinking might be related to my innate desire to maintain self-control. But I have added a few data points in recent years that make me wonder about another possibility.
As I mentioned a few years ago when it happened, I required oral surgery that involved reconstructing my gum line -- a gingiva graft. During one of the procedures, I was offered nitrous oxide to augment the anesthetic, and I decided to try it. As soon as they started, I had to wave them off to tell them to stop.
"This feels terrible. I'm all light-headed, and I feel like I might throw up."
"We told you it would make you feel a little like you've been drinking." For a second, I was afraid they wouldn't turn it off; the person controlling the gas seemed genuinely surprised that anyone would not want to feel that way. This was a truly frightening moment for me. Then she eased up on the gas, and the terrible feeling evaporated with it.
As I may or may not have mentioned in my posts about my oral surgeries, I was prescribed a small amount of Vicodin/hydrocodone to use as a pain killer. This drug did absolutely nothing for me. Nothing. I have long wondered why something so useless could be such a hot commodity. My painkiller of choice remained Advil, even though it presumably has more serious side-effects (stomach bleeding, anyone?).
Which brings us to a few days ago. I've been recovering from an ear infection, and saw my doctor on Wednesday to have him check on my progress and to discuss pain management. My approach has been: when it hurts, take lots and lots of Advil. Alternate with Tylenol. Repeat as necessary.
Talking to your doctor can sometimes be a good thing. He pointed out that I was taking a toxic amount of Tylenol (notorious for potential liver damage) and prescription-level doses of Advil. He recommended a short course of Vicodin to help manage the pain ("which should go away in a few days anyway"); the Vicodin would do less damage to my body in the meantime.
So I filled the prescription. I immediately noticed something different: unlike the other times I'd been given hydrocodone (the generic equivalent), these pills were large enough for a horse. Since this medicine had never had an effect on me before, I took one right away (this was during a break at work) with lunch, unconcerned that I'd be driving a few hours later.
Horse tranquilizers.
An hour or three later, I noticed that I was sleeeepy. Then I made the connection: bigger pill might mean an actual effect. Then I noticed: my ear still hurt! When I'd had my oral surgery, the doctor who prescribed the Vicodin said that I'd probably still feel pain, but I just wouldn't care. I thought about that. Did I care that I was still in pain?
$%*!, yes, I cared! Ouch!
So, there I was, sleepy but still in pain. *And* I had some driving to do. And, come to think of it... I was just as uncomfortable as I'd been when I'd briefly tried that nitrous oxide.
Looks like I picked the wrong week to give up caffeine.
Twenty ounces of Dr Pepper (have you ever noticed that there's no period in the "Dr" part of Dr Pepper?) and four Advil later, and the effects of the hydrocodone were again rendered moot. I had been worried it would take longer for the hydrocodone to wear off (unlike the nitrous oxide, where the effects disappeared immediately), but I guess my body just didn't have much use for it.
So, what have I learned from all this? Well, for starters, I won't be taking Vicodin / hydrocodone ever again. It just plain doesn't work for me, and makes me feel anxious and sleepy, to boot.
I've learned (or, perhaps, reaffirmed) that it's very, very difficult to give up Dr Pepper.
...and I'm wondering if maybe, just maybe, one of the reasons I've never developed any interest in alcohol has something to do with my body already sensing that it simply has no use for depressants. I realize that narcotics and alcohol are chemically different, so it's possible that I'm over-generalizing with this guess. Then again, nitrous oxide is a depressant, and it is neither an opiate nor an alcohol.
Whether my aversion to alcohol and other depressants is psychological, physical, or both, I do know this: it has nothing to do with virtue, and it has nothing to do with fear. The concept of temptation holds no meaning when one is not even interested.
True to my Irish roots, I may die of liver failure. However, it would be the result of an accidental Tylenol overdose, and not of beer.
June 29, 2007
|
As I mentioned in my previous post, I suffered a bit of downtime last week with a gastrointestinal bug that, well... let's just say it forced much more stuff out of my system than it was allowing in. I lost four pounds in the course of one day, and I'm still not fully recovered.
(That said, I've tightened my belt a notch and haven't had to let it back out again in the week since this all happened. Even this nasty cloud had a silver lining.)
Does the fact that I was a victim of a GI virus qualify me as an expert on GI infections? Seriously. Does my experience now make me an expert on maladies of the gastrointestinal tract? The mere fact that I know it's a GI issue instead of the flu (did you know that nausea is not a symptom of the flu?) certainly must count for something, right? The fact that I listen to a science/medicine podcast only adds to my knowledge of such matters. Combine that with my first hand experience, and I'm an expert... right?
No?
You wouldn't consult with me regarding matters of GI infections? You wouldn't trust me to advise public policy on the treatment of GI viruses?
Of course you wouldn't. Being a victim doesn't qualify me as an expert. Having seen the virus attack both of my sons before it hit me also doesn't make me an expert. I don't even qualify as an expert on how my body reacts to that kind of a virus; I'm only an expert on how *I think* my body *reacted*.
A close relative of mine was killed in a nasty car accident a few years ago. Does that make me an expert on automobile safety? I'm now well read on statistics regarding auto fatalities... but am I competent to advise public policy on highway design or automobile design or DUI laws? Well, perhaps more so than your average bear, but certainly less so than a qualified expert -- for example, someone with a degree and a career in mechanical engineering or physics or civil engineering or, for that matter, law or public policy.
While I was out of town recently, I was annoyed to see this headline in my complimentary nation-wide newspaper: Families skeptical of Va. Tech panel. The lead paragraph read:
Relatives of Virginia Tech University shooting victims challenged the credibility of a state panel investigating the massacre on Monday, demanding that a family representative be appointed to join the eight-member committee.

According to the article, the mother of one of the shooting victims has said that if the victims' families are not represented on the committee, the panel could reach conclusions "that may not be accurate."
The author of the USA Today article, Kevin Johnson, notes that a spokesman for Virginia Governor Tim Kaine said that each member of the panel was appointed for their special expertise... and he put the words "special expertise" in quotes. As if there's something dubious about their qualifications, or something suspect about being an expert.
The purpose of the panel, as the governor's spokesman is quoted, is to (and I'm quoting the article again here) "help determine what went wrong and how to prevent a future tragedy."
So, then: how does being the family member of a shooting victim qualify one to be an expert in the prevention of similar tragedies? How are they competent to help determine what went wrong? What qualifications do the family members have that will help them make sure the committee doesn't reach conclusions "that may not be accurate"?
Please don't get me wrong. My heart goes out to those who lost loved ones in this tragedy. To whatever extent our society can reasonably move to prevent future incidents like this, however, I'm going to have to put more faith in the counsel of individuals with "special expertise" than in individuals whose primary qualification is that they've been harmed.
By all means, let the families of victims consult on the best way to memorialize their loved ones. But do not allow sympathy for the victims' families to cloud better judgment when it comes to improving public safety.
September 08, 2006
|
Confidentiality is not just a problem at the highest reaches of the government. It seems to me that once any modicum of fame is involved, the entire concept of confidentiality goes right out the window.
During the course of my professional career(s), I have held a few different positions at a few different companies. In most cases, when I left one position at one company to take another position at a different company, I entered into a "confidentiality agreement" with my former employer(s). The essentials of these agreements boil down to a simple arrangement: I won't tell anybody the nature of my departure from company X (nor divulge any company trade secrets) and, in exchange, the company will also not tell anybody the nature of my departure from the company (nor divulge any other personnel-related information about me).
This is Standard Operating Procedure for most organizations, especially larger ones, and it stands to reason: it protects the company as well as the individual from a number of possible problems down the line. It protects the individual because it establishes what is essentially company policy: the company will never say anything bad about you to potential future employers who choose to check your references. A potential future employer can confirm that you once worked for company X, but not what your salary was, or why you left, or whether anybody at company X had problems working with you. There's nothing left to interpretation. They can't say *anything* about you (other than to confirm that you once worked there), so there's nothing they can say that could possibly be misconstrued (or, for that matter, correctly construed) as a reason for the potential new employer to not take you on.
It also protects the company. You agree not to say anything derogatory about your former employer, or to otherwise give potential job applicants, stock analysts, or other industry professionals any reason to be concerned with how things are going at company X. More importantly, if there was any kind of a severance or other financial arrangement that was part of the deal, current employees should not hear from you what the terms of those arrangements were. For obvious reasons, your former employer wouldn't want everyone to know how much you were getting paid, if anything, as part of your separation arrangement.
As a former manager, I can assure you that there are often financial components involved in separation arrangements. And no, I won't give you specifics.
The heart of the matter is this: when employer and employee part ways, both agree not to bad-mouth the other. This is a contract. A binding, legal commitment. And yet, we read examples of confidentiality being betrayed seemingly every day.
I'll skip the obvious examples of how this happens in the higher levels of the government. The Bush, Clinton, Bush, Reagan, Carter, et al, administrations seemed to be plagued with bigger leaks than the Titanic. One such leak killed the Nixon presidency, and another has caused some harm to the current administration.
However, the problem plagues the civilian ranks, as well. The best (most public) example I can think of in recent memory is the departure of corporate executive turned television star, Carolyn Kepcher. She's got good looks, brains, a book deal, a well-defined public persona, and she and her former employer parted ways. So what happens? A "person close to the situation" told the New York Post that she was fired because she wasn't taking her job with the Trump organization seriously.
Further, according to the Associated Press article linked above, this sentiment was "echoed for The Associated Press by a person close to the situation. The person insisted on anonymity because it was a personnel matter."
That's right! It's a personnel matter! That means they are not allowed to talk about it. There's confidentiality involved. These individuals are betraying a very real, very important confidence. The Trump organization could lose a lot of money if Ms. Kepcher were to choose to sue over this (assuming, of course, that the betrayal came from within their ranks rather than from hers.) Since confidentiality and integrity actually matter to me, I'd like to see the Trump organization either find the source of the leak and fire that person; or, if the leak can't be found, fire everyone in the department who *could* have been a source of the leak.
(I've often felt the same way about leaks from within Presidential administrations. How ironic -- and pathetic -- if one of those leaks should actually have been approved by the President himself. I'm not just talking about the current administration, by the way, with regard to the Plame Game. The leak of the Stealth bomber project under Carter's administration comes to mind, among many, many others....)
I realize that this particular example may not elicit a great deal of sympathy. There is a general preconception that the rich and powerful play by different rules (read: dirty), and therefore when they break their promises to each other (even if it's a lowly minion who is casting the stones without the approval of his or her boss), the only people being harmed are, well, rich and powerful and therefore they can handle it.
Bullshit. Integrity matters, whether you're the boss or the employee, the elected or the appointed or the electorate, the wealthy or the aspiring.
In the voting district where I used to live, a candidate for State Representative had previously left the employ of a large, local company (where I, too, had formerly been employed) in order to run for office. Word got out that her performance at said employer was not quite up to par. This, from a source "close to the situation."
As a former News Director at a commercial radio station, I recognize that sources must occasionally be protected. But these cases, like so many that we read about on a regular basis, involve sources very clearly breaking the law and/or violating confidentiality in order to share information that is not theirs to share, and whose disclosure serves no public interest. When news organizations coddle such sources, and corporations (or government organizations) fail to act to cauterize such leaks, our society as a whole receives the message that confidentiality agreements will neither protect you nor bind you.
This is a shame because confidentiality, like any social convention, is part of the glue that helps hold our society together. We erode it at our own peril.
June 20, 2006
|
Subtitle: Do People Change? Part I
It's that time of year again. The time of year often referred to as "Dads and Grads" -- when Father's Day and Graduation Day collide. What better time of year for me to trot out the Worst Valedictorian Speech Ever?
There are a number of reasons that this should come up right now; several different conversations between and among colleagues of mine, past and present, converged upon my recent discovery of a copy of the Bennett High School (Buffalo, NY) valedictorian speech for 1986. It is a crude document, and I don't even know if this copy is a first draft or the piece as it was delivered. I do know that starting the following year, Bennett's valedictorians had to run their speeches by the principal before they were to be delivered.
By way of background, I'll tell you how my thinking led up to this particular speech. [I'd considered posting this speech anonymously, but I'll cop to it. I wrote it. I'm embarrassed by it now, but I wrote it.]
Valedictorian addresses tend to be 1) long, 2) boring, 3) filled with homilies about how "we are the future" and all that nonsense, and 4) otherwise devoid of a point. I therefore set out to write a speech that was: 1) short, 2) not boring, and 3) offered no pat epigraphs nor advice for the future and 4) made a point.
That said, I could have gone the comedy/humor route and accomplished those goals, but since the point I wanted to make was not funny, I ended up going down the crabby route instead.
Also by way of background: the teachers and the administration were actively and openly fighting each other during my last two years at the school, which had some very direct and very personal consequences for a few of my classmates.
I am not proud at all of this speech or my choices in making it. But it is what it is, and I was who I was at the time. I can be every bit as crabby these days as I was back then (although, to be fair, I'm not *always* crabby), but I'd like to think that I have a more delicate touch these days, when I choose to use it.
Allow me to set the scene: it's 1986. Summer in Buffalo. Hot. Sticky. The graduating class of 300 or so adolescents is rowdy. Each grad having been allowed up to four guests (and many finding a way to sneak in more than that), the auditorium is packed. I took the stage. I waited for everyone to quiet down. After I stood there for a few moments, they did quiet down. Silence. Then I read a short note that went something like this:
So ends four years of high school. What can I say? There are many things I'd like to say, but I don't know where to begin. Some people have said they think my speech should be positive while others think I should talk about the negative side of Bennett. The fact is that there are both positive and negative aspects that we should consider . . . about Bennett, and about leaving Bennett.
When I decided to come to Bennett, I thought that high school would be a place where administrators and teachers worked together to raise the level of education of the students . . . an institution where creative thought was fostered and intellectual and athletic pursuits were encouraged. Well, I didn't find quite that here at Bennett, but I did find several experiences which will serve me well in my future endeavors. None of us are leaving Bennett without an education, although much of that education was received outside the classroom. In fact, most of the knowledge we have gained here is based upon our experiences with the politics of a high school culture. It has become clear to me that the students who pursued knowledge were able to find it. Keep in mind that even though we are graduating, we should still pursue an education.
To my fellow graduating students, I wish you farewell. There is no warning I can give you that you haven't already heard; no advice that hasn't already been offered; no profound thought that would make a difference at this time. I have come to know some of you and found friendship with a few of you.
And so, here I am, with a great opportunity to say all of the things I've been wanting to say, but I'm leaving most of it unsaid. I am concerned about too many things. If I told you everything that bothered me, nothing positive would be accomplished and it would give you an inaccurate view of my opinions of Bennett. If I talked about Peace, Love, and Kindness, it would no doubt make you throw up in those silly little hats they make us wear at these ceremonies. Yes, I'm leaving a lot of things unsaid.
So ends four years of high school.
When I finished, you could hear a pin drop in the auditorium. I don't recall there being any applause. A teacher later mentioned to me that after I left the stage, she leaned over to a colleague and said, "If I ever hold a parade, remind me to invite Allan over to rain on it." Or words to that effect.
Did I really say "throw up in those silly little hats they make us wear?" I shudder to think that I may have.
But if I was bitter at the time, I will note that history vindicated my displeasure. At the time I entered BHS, it had only recently been the spawning grounds of the City Honors school. After a few years under the reign of Principal W., it became one of the worst rated schools academically in the state of New York -- a dubious distinction that it continues to maintain, despite the departure of the aforementioned principal a couple of years ago.
BTW, I like Ms. W. as a person. She was kind and supportive of me, and certainly presented a laudable attitude toward the school. I just thought at the time (and still think) that her priorities for running the school were contrary to providing a sound education.
As another side-note, I will also mention that my dearest friend and academic rival from my high school class has offered a credible claim that a math error in calculating our class standings falsely reversed her (salutatorian) and my positions within the ranks. In other words, she has a compelling case that she deserved the valedictorian position and I the salutatorian. [Our respective GPAs, adjusted for giving honors classes a stronger weight, were a statistical tie, with naught but a sliver of a sliver of a percentage point separating us. It could easily have gone either way. The official results gave me the edge. My friend's contention is that the official results are based upon an ever-so-slight math error in the calculation of her adjusted GPA.]
If her argument is true (and I suspect that it is), it throws my acceptance into Cornell (and later, UPenn for grad school) into doubt, not to mention any subsequent edge I may have enjoyed in employment opportunities because of my degree(s), cascading into a domino effect that could mean that I *should* be a very different person today than I am. [How do you like that lead-in to my "Do people change?" subtitle?]
I am certain that my high school rival's speech, had she the opportunity to have written one, would have been far more eloquent than mine. BUT... would she have had the guts to rain quite so hard on our graduation parade?
Look for more thoughts on these and other questions in an upcoming post...
March 22, 2006
|
As regular readers of my journal will note, I do tend to visit the recurring theme of "unintended consequences." In a recent post, I noted the irony of how well-intentioned rules can, in many systems, occasionally thwart the very people that the system is theoretically supposed to help.
We have yet another example today from the headlines, this time involving criminal justice.
I have noticed that when it comes to the cases reported in the national media, statutory rape committed by male teachers (in secondary school) against female students tends to result in the male teacher being prosecuted vigorously, found guilty, and thrown in jail for a very long time. This is as it should be.
I have also noticed that when it comes to the cases reported in the national media, statutory rape committed by female teachers (in secondary school) against their male students tends to result in the female teacher being slapped on the wrist and told not to do it again.
These are trends in what I've observed and, the national media being what it is and my time & attention being otherwise directed, I don't really know whether these reports represent the majority of cases where teachers get caught in flagrante delicto with one or more of their students.
Each case is special, and the reasons given by the respective sentencing judges vary, but the trends reported in the mainstream press seem rather clear. Male teacher? Monster who must be punished. Female teacher? Well, typically, we find that the woman is misguided, confused, disturbed, a victim, or has some other extenuating circumstance that mitigates her offense. Convicted? Typically, yes. Punished? Well, maybe not so much.
In the case of Debra Lafave, as reported by the Associated Press and elsewhere, the judge rejected a plea deal because he was horrified that the plea deal did not include jail time. Go, judge! But without the plea deal, the case would have to go to court, and the 14-year-old student victim did not want to testify. A psychiatrist told the court that the child suffered from anxiety as a result of the media attention. The state attorney's office dropped the charges, saying that it was unwilling to jeopardize the well-being of the victim in order to prosecute the case.
It is certainly understandable that a prosecuting attorney would not want to further harm a victim in such a sensitive case. But, was the best plea deal he could come up with one with zero jail time? I'm guessing so, since prosecution in another county led to a plea deal for the same defendant that also included zero jail time. Three years of house arrest was pretty much it. Inconvenient, sure, but it's hard to imagine a similarly light sentence for a man convicted under otherwise similar circumstances.
The media reports that the accused is being treated for bi-polar disorder. Are such mitigating circumstances reasonable to entertain if the perpetrator is a man? Oh, and by the way, this young woman is almost as photogenic as Mary Kay Latourneau, and I think that's a point worth noting. (Seattle media *love* to show the former Mrs. Latourneau's pics whenever they can. She's just *scrumptious*.)
Is there a double standard when it comes to statutory rape cases perpetrated by men as opposed to those perpetrated by women? A quick search through recent news articles reveals:
- Nicole Long in Ayersville, OH (who is referred to as "Melissa Long" in some news reports), was convicted of having sex with a student. She could have been given five years in prison, but instead, she learned in court this past Monday that she was sentenced to 90 days.
- Alas, Toni Woods, who is not so photogenic, pleaded guilty to several counts rather than pursuing a deal. She's going to jail for a minimum of four years and could possibly remain there for a full twenty.
- Last week in Philadelphia, Sara Singley served two days of her supposedly 30-day minimum sentence for her conviction of having sex with an underage student. [Note: the linked story indicates that the student is male, but other stories reference a female victim. See comments for additional discussion.] She was to serve the rest of her sentence under house arrest, because she'd had no prior record.
- ABCNEWS.com reports that Dang Van Dinh of Lafave's home state, Florida, was sentenced earlier this month to five years in prison for having sex with one of his underage female students.
- Last week, Stephen Sherman didn't get quite as good a plea deal offer as the above-mentioned Lafave. In exchange for pleading guilty, the prosecuting attorney in Brown County, WI, will recommend to the judge a mere 15-year prison term. Oh, and the judge isn't bound by this plea deal! He can sentence Mr. Sherman to up to 91 years in prison, if he so chooses. Stephen Sherman is 29 years old, by the way. Debra Lafave is 25, which pretty much makes them age-peers. Like Lafave, Sherman is being charged in two counties (and the other county hasn't weighed in yet for Sherman). Unlike Lafave, Sherman's life is pretty much done.
- In Nashville, local media look at how bail is set in such a way as to indicate a possible gender bias. (In the two cases that were compared, the man's trial has yet to conclude, so the cases can't be compared by their ultimate outcome just yet. That said, the woman's bail was set very low -- she pleaded guilty, but then was released after the first year of her eight-year sentence -- while the man's bail was set substantially higher, even though he had fewer counts brought against him.)
- Here's an interesting one from Washington State. Robert Swalstad married his then 15-year-old student after getting her pregnant. The family of the child bride initially did not cooperate with prosecutors, but prosecutors did not give up the case (unlike what just happened in Florida). They eventually did convict him, and he was sentenced to six months in jail. The community petitioned to have him given a harsher punishment. (Since the charges were dropped in the Lafave case, I don't know what form community outcry could take, if there will even be any.) Teacher and bride have relocated their residence out of state, although Swalstad *is* currently sitting in jail.
I couldn't easily find many recent convictions of male teachers having sex with their students. Then again, what *is* the ratio of men to women in the teaching community? (You'd think that as a former teacher myself, I'd know; but my experience was both brief and very local to one specific school.) This ratio would have to be taken into account in any serious study of conviction rates and sentencing of sex offenders by gender.
Are the Lafave and Sherman cases truly comparable? I don't know. The victim won't testify against Lafave, but the victim in Sherman's case plus another, previous victim *were* willing to testify. The prosecutor had little else to go on in Lafave's case; Sherman, on the other hand, was dumb enough to videotape his crimes. The victims were the same age in both cases (14 years old), but then again: Lafave preyed on a boy, while Sherman preyed on two girls -- and there may be just as much of a bias about the gender of the victim as the gender of the perp. Lafave had a well-groomed legal team to defend her; Sherman can't even post $20,000 bail.
And what if, when all of the evidence is collected and compared, it turns out that there *is* a gender bias when it comes to teachers having sex with their students? Is that appropriate? Apologists for unequal treatment will point out that female victims could get pregnant, while male victims are exposed to fewer life-altering risks. That women like Latourneau and Lafave are just confused or unbalanced women misguidedly looking for love (Latourneau was married when she committed her crime, by the way), while men like Sherman and Van Dinh are depraved perverts defiling the innocent flowers under their care. That women and men should be treated differently under the law because women and men are different by nature.
I strongly suspect there is a double standard, that the double standard is not entirely fair or reasonable, but that the double standard is also not entirely unjustified.
There was a judge in Florida who rejected such a notion. He said, in fact, that he was appalled by the notion that a woman should be able to plead guilty and yet serve zero time for such a serious crime. That a grown-up teacher having sex with a child under their care is a crime, regardless of the teacher's or the victim's gender.
Thus the law of unintended consequences was called into play once more.
February 28, 2006
|
There is power, I believe, in recognizing reality, even if one does not have the power to change it. There's a saying that "you can't change the wind, but you can adjust your sails."
This is the rationalization I shall adopt in making the observation below, knowing full well that there is no perfect world in which certain imbalances can ever be redressed.
Today's observation is this: that those who need help will be punished, and those who don't, won't. And as Roxie Hart noted, "That's showbiz, kid."
I've been babbling a bit of late on the subject of free speech versus censorship versus discretion. One of the many reasons this suite of topics is ever-present on my mind these days is because I've found myself so often refraining from talking here (on this website) about what is uppermost on my mind. Discretion always demands that one consider carefully before making a potentially adverse observation about a former employer (and one would never, ever comment on a current employer in any but the most glowing of terms), or expressing too many concerns about the demands on one's time, or openly disagreeing with the prevailing decisions within the political party of which one is an active member, or about the interesting choices that various friends are making in their business or love lives.
A friend of mine, when faced with a similar dilemma of not feeling able to speak freely on a public blog, chose to go the "subscription" route. He sends out his weekly missives via a private listserv, over which he controls membership. This has given him the freedom to say what he wants to, knowing he is among like-minded (or even respectfully disagreeing, but nonetheless supportive) friends.
I have long admired the candor with which he addresses topics in his journal that we might feel free to talk about on a one-on-one basis, but would tend not to broadcast to the world. Like when he admitted that depression was setting in even though everything seemed to be going well. Like the trials and joys of relocating for the sake of a job.
The fact is, one is generally not well advised to talk openly about being depressed. [Sidenote here: I began writing this missive shortly after having come out of a rather profound stretch of unhappiness, but was not at the time, nor am I now, feeling blue. I find it safer not to bring up such topics if I'm feeling down, at least in public, just as I don't comment on being out of town until I've returned.] This could lead co-workers or bosses or friends to be wary of trusting you. Likewise, one should not be too glib about how one's employer, generally speaking, *has* to put up with your eccentric choices because, hey, you *are* the best person for the job by a far sight. It may be true [although, I'm not sure it ever has been so in my own case], and all parties might agree that it's true, but you still don't necessarily want to be glib about it in public.
But it's the depression thing that has resonated the most. I've known a great many people who have suffered from various kinds of depression and/or mood swings, but they have always had to be careful about how and with whom they broach the subject. The irony of it being that, in many cases, they'd feel better if they could just *talk* about it.
...to someone other than a $150 per hour pair of ears. (Or however much therapists charge these days.)
This brings me to the topic of today's missive: how those who need help and ask for it must be punished, and those who need help but struggle silently with their burdens get to punish themselves. Blessed are the needy, for they shall be punished.
I love irony. If that hasn't become obvious to anyone reading these pages for more than a few entries, let me state it here now: I love irony. And so, I embrace the notion that the very systems we have set up to help us (medical insurance, for example, or financial credit and loans) are actually designed to punish us when we need the help they are designed to provide.
When do you get the big credit card offers? When you don't need to borrow money, of course. If you need to borrow money, you are de facto a bad risk. When I apply for mortgage re-fies (re-fi's? re-fis? REE-fies?), and if my income is high enough, they don't need to see proof that my income is high enough.
Huh?
But if my income is below a certain level, the lender wants to see my most recent pay stubs and bank statements. So, the more I need the money, the more I have to prove I need it. And the less I need to borrow it, the more they want to lend it to me.
Go ahead, go see a doctor and ask to be tested for Fragile X or some other genetic disease. See if you can ever get appropriate health insurance (or life insurance) after that.
True story: I have a friend whose mother is suffering from some mental and physical deterioration that is known to be passed down genetically. The disease usually manifests itself sometime in late middle-age, if I understand correctly, and things from that point only get worse, never improve. No, I don't recall the name of this ailment, but it sounds most unfortunate. And my friend dares not get tested to see if he/she has it. Sure, this would enable him/her to make preparations now, if need be, for what the future may hold in store. But if he/she gets tested, and it turns out that he/she has the disease lying in wait, then it becomes a "preexisting condition" and switching to a better health insurance plan will never be an option again. Nor would be increasing his/her health insurance coverage. Ergo, the health insurance programs would punish him/her for trying to determine the current (and possibly future) state of his/her health.
I've been told by non-married-yet-non-celibate friends of mine that the prudent course of action for them is to occasionally get tested for particular STDs, especially AIDS, but that they don't dare do this through their health coverage because doing so automatically results in premiums going up and/or coverage being cancelled altogether. The system encourages risky behavior when that is exactly the opposite of what it should encourage.
And as for mental health... ignore the fact that, as with AIDS tests, you don't want to raise that red flag on your health insurance. *Especially* employer-provided health insurance. But, here's another true story: someone I know was feeling down and went to see a licensed therapist about it. The psychologist (or psychiatrist or social worker or whatever they were) told the person right up front something along the lines of "if you tell me that you feel self-destructive or that you might be destructive to others, I'm required to inform the state." How's that for encouraging an open dialog?
It's like: "If you need help with feelings so bad and so desperate that you might even consider hurting yourself, don't come to a professional about it, because then it will go on your Permanent Record... and we might have to lock you up." Who goes to see a professional for just a mild case of the blues? (Outside of New York City, I mean.)
These are but a few examples, but there are many, many systems that are set up to punish the people who need them the most. I know that these are not the intended consequences of these systems. I understand that these are the unfortunate side-effects of regrettably necessary policies.
We do this on a personal level, as well. It's not just big systems and big institutions that short circuit themselves with this kind of irony. But that's a topic for another day.
In the meantime, let the healthy have health insurance, the mentally stable have therapists, and the wealthy have big loans. Let the Eskimos have refrigerators and the Southern Californians have fires. As for those who need: the beatings will continue until morale improves.
January 17, 2006
|
Let me see if I understand this correctly. I don't have much time to read the news these days, so all of my information comes to me via intermediaries of intermediaries. So when I heard this, I doubted it could be true.
Some guy writes a novel that is loosely based upon, among other things, exaggerated stories from his own life. His publisher agrees to buy his book and sell it as a memoir instead of selling it as fiction. It becomes a bestseller as a memoir, one of the very top sellers of 2005 thanks to an endorsement from a certain famous book club persona and talk show host. Then a website dedicated to uncovering such things publishes its evidence that the memoir is, in fact, a work of fiction. Pandemonium ensues.
Do I have the gist of that right?
The book club persona/talk show host as well as the publisher have defended the book in question, saying that if it is not, strictly speaking, true in any real sense of the word, it is nonetheless emotionally true.
In doing research for a novel I am working on, I have read a great many books on certain New Age topics (both defending and deflating). One such book defended its subject matter in the face of contrary evidence as being nonetheless "emotionally true". The subject in question was past life regression -- I don't want to get too into that topic on this site, since it's relevant to my work-of-fiction-in-progress (and, I've learned from my Do Over project what a bad idea it can be to publicly discuss works-in-progress). But... if past life regression's "emotional truth" helps in therapy to resolve real patients' real issues, then that's great. If, on the other hand, it is being presented as an actual truth as opposed to a mental construct, then it must hold up to certain standards for establishing what is real and what is not.
Since readers of this site are generally well read and above average intelligence (and, may I say, damn good looking, as well), I probably don't have to go into the rules of evidence and scientific method that should be employed to establish that which is real/true and that which is not real/not true.
The New York Times opined that the book in question should have been advertised as Fiction rather than Non Fiction. While I agree, them's still mighty strong words coming from the Times, which is becoming increasingly notorious as peddlers of Fiction in the guise of News.
But what of it? Why doesn't the Times use this as their Get Out of Jail Free Card (tm) and tell the world that Judith Miller's and Jayson Blair's fabrications and misinformations were "Emotionally True", which is why they didn't bother -- nor need to bother -- with anything so mundane as "facts." (Those pesky facts again.)
For that matter, the current Presidential administration has been operating under an Emotionally True doctrine ever since taking office in 2001. That Al Qaeda was based in Afghanistan was not only emotionally true, but also actually true. Weapons of Mass Destruction in Iraq, on the other hand... well, maybe that wasn't so much factually true as it was emotionally true. Likewise, North Korea's nutcase leadership rushing headlong into a nuclear confrontation with the US -- factually true, but it doesn't have the emotional resonance, so let's ignore it and maybe it'll go away.
In fact, isn't that the whole problem with Kim Jong-Il? He's not feewing the wuv, so maybe a big ol' nuclear tantrum will get some attention.
If "Emotional Truth" is enough of a justification to absolve fraud (and, after all, isn't deliberately mislabeling fiction as "non-fiction" and "news" exactly that?), then by all means let's embrace this New World Order. Let's embrace the teaching of Intelligent Design as "science" and the idea that States' Rights trump the Feds (except where the States disagree with the Right). Let us further embrace the idea that all criminals are victims, all victims are righteous, and whenever something goes wrong, it must be the government's responsibility.
Let us embrace the notion that we all deserve more pay, but that prices should never rise. That Walmart is pure evil, except when we find a really good deal on a plasma screen TV there. That the millionaire ball players (and managers and owners) of the Boston Red Sox were "cursed" until they won the World Series. That it's wrong for male teachers to take advantage of their female students, but it's okay for female teachers to fuck their male students.
If it appeals to us emotionally, let's embrace it. If it rings true, let's believe it. Life is too short to bother with the real truth. As with aspartame and other substitutes: Emotional Truth now; consequences later.
September 18, 2005
|
So, my wife and I bought the TV show "24 - Season Two" on DVD and spent a couple of weeks watching an episode or three each evening after the kids were put to bed. Having now seen the first two seasons this way, I must heartily recommend "24". Wonderful fun, with an emphasis on plot reversals:
In a "reversal", the plot or action suddenly veers off in another direction from what was expected. The reversal can be good *or* bad. It doesn't always have to be bad. A really good reversal changes the goals/questions for the characters involved.
If you are a writer or an aspiring writer, you could do worse than to take in how 24 approaches plot reversals (regardless of how you evaluate the plot holes).
As a friend of mine commented recently, watching a couple of seasons of 24 back-to-back can give one an acute attack of paranoia. These episodes are all about conspiracies within conspiracies, and they can make you a bit jumpy.
Inspired by the gleeful paranoia-euphoria of being fresh off of season two of "24", and thinking of a couple of very dear friends of mine who live their lives in such a state, I pounded out my little tidbit, "Choose Your Own Conspiracy". It was a lark, intended to mock how quickly and irrationally we can sometimes resort to blaming conspiracies when simpler, more credible forces are likely at work.
One such friend (ie, one of my friends who sees conspiracies within conspiracies as being rather pervasive) posted a response chiding me for being naive. I'm going to repeat her comment here because it deserves some elucidation:
Much like a child who is completely unaware that he is, in fact, the reason why his parents got divorced, you are happily clueless. You are blissfully unaware of what is going on around you and your own culpability therein.
You won't even acknowledge a conspiracy that was so clearly pointed at you!
It is arguably amusing, but very, very costly.
Now, this sounded to a couple of other faithful readers like an "insane" slam from "the angry left". At first blush, it certainly seems nasty.
It was none of these.
Like many shouting matches that pretend to be reasoned debate on the talking head news shows, the conversation here is falling apart due to lack of context. Let's back up a little bit and provide that context.
Jehan and I used to work together for a well known national brand that she occasionally refers to as "thatplace.com". She and I have spoken often and at great length about the different kinds of conspiracies that may or may not be plausible in the realms of politics, racial profiling, and the day-to-day grind on the job.
I've never been public about my reasons for leaving thatplace.com except in the vaguest of terms -- and I intend to keep it that way -- but it is not perhaps much of a secret that before I left, my successful team was reorganized out of existence, much to the dismay of my team and myself.
Jehan was a member of that team, and remains one of the most talented devs I've ever had the pleasure to work with. Like most of my former team (and myself), she eventually left thatplace for much the same reasons that the rest of us did. She and other members of my former team showed an amazing amount of loyalty to me and to each other, for which I will always be profoundly grateful.
Jehan's and my on-going conversation has included reflections upon things that happened to me during my last few months at thatplace. It has always seemed to me that those things were obviously part of the larger reorg (and aftermath) that engulfed our entire division of the company. There were, it seemed to me, sound business decisions behind the reorg, however much I may not have agreed with them.
My friend and former co-worker believes otherwise. She believes that the events that unfolded were designed not for business reasons, but for personal and political reasons. To be blunt, she believes that I and my team were not collateral damage, but deliberate targets.
Our (hers and mine) long-running conversation on the subject gets further complicated by this: my position is the reasonable default, requiring no evidence, whereas her position is the less reasonable one, requiring evidence -- and yet she nonetheless has enough evidence to make a compelling case.
Now, re-read her comment above. See how context changes everything? She's not raving about vast right-wing conspiracies (which is what I believe some readers have come to think). She is mocking me for mocking conspiracy theorists. Here, I was mocking those who would be so paranoid that they would see a conspiracy in the destruction following a hurricane. She counters that I would be so blind as to deny an obvious conspiracy that targeted me directly and personally... insofar as she believes this is exactly the case.
Did this clear anything up? I hope so. Now, let's get down to business.
One of my faithful readers is Allen, another friend, whom I met in a completely different context. Since very, very few readers of my blog could know the circumstances to which Jehan is alluding, it is only reasonable that her remarks should be misinterpreted by many of my readers. But Allen went so far as to label her response as being from "the angry left".
Allen, you're a good man and I love you like a brother. (You know, the brother who moved away to Canada like some commie-symp blue-stater, so we don't talk about him so much at the dinner table; that kind of brother.) But just as the "angry left" was being ridiculous to keep crying about some phantom "vast right-wing conspiracy", so too is it ridiculous to cry about some phantom "angry left".
Not all who oppose us are necessarily part of a unified enemy. Sometimes, we are opposed by our dearest allies. Not all who disagree with us oppose us. Intelligent people will disagree about the best way to accomplish common goals.
It's true that Jehan's remarks did read a little harsh, and I appreciate your standing up to defend me. But, well, your remarks were a little harsh, too.
Can't we all just get along?
May 11, 2005
|
The economy, as it works in the Western World in the early 21st Century, is pretty much a pyramid scheme. I have no doubt that this is the logical, albeit unintentional, result of the convergence of various financial instruments and institutions with mass culture.
Naturally, if you understand how the pyramid scheme works, you have a chance to protect yourself somewhat... although the best protection from a pyramid scheme is to avoid it altogether, and it would be difficult in this day and age to successfully avoid the economy.
Here's how it works:
Take any major economic market of your choice. The stock market and the housing market are excellent examples.
What determines the value of an item in that market? Economic value is determined where the price someone is willing to pay for the item meets the price someone else is willing to sell that item for. If supply of the given item is smaller than the demand for that item, then that price point will necessarily rise.
An interesting phenomenon creeps into any market system when the supply and demand are influenced primarily not by need or desire for the item as such, but by the perception of how much *others* need or want that item. For example, if you buy a house not because you need or want a house, but because you see that the value of houses is going up and you want to buy now and then sell later simply to cash in on the anticipated rise in value, then you are contributing to the prices going up because you are adding to the level of demand.
Likewise, if housing values are falling, you might want to sell now so that you can get out of your house before it loses any more value, and you thereby contribute to the falling prices because you've added to the available supply.
This phenomenon is an economic feedback loop, and it distorts the actual value of the item in question.
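The feedback loop described above can be sketched in a few lines of code. To be clear, this is a toy illustration, not an economic model: the intrinsic value, the 10% pull toward it, and the momentum factor are all invented numbers chosen only to make the dynamic visible.

```python
def simulate_feedback(periods=20, intrinsic=100.0, momentum=0.8):
    """Each period the price is pulled 10% of the way back toward
    intrinsic value, while trend-chasers push it further in whatever
    direction it last moved."""
    prices = [intrinsic, intrinsic * 1.02]  # a small initial uptick
    for _ in range(periods):
        trend = prices[-1] - prices[-2]        # recent price movement
        pull = 0.1 * (intrinsic - prices[-1])  # buyers/sellers valuing the item itself
        chase = momentum * trend               # buyers chasing the trend
        prices.append(prices[-1] + pull + chase)
    return prices

calm = simulate_feedback(momentum=0.0)   # no trend-chasing: price settles back at value
manic = simulate_feedback(momentum=0.9)  # trend-chasing: price overshoots well past value
```

With trend-chasing switched off, the small uptick dies out and the price settles at intrinsic value; with it switched on, the same uptick gets amplified into a run-up far beyond it. That's the distortion in miniature.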
A pyramid scheme requires new money to keep coming into the system in order to sustain its growth. Thus, if Farmer Bob owns a share in a pyramid scheme, he can only make money if new people come into the system after him and throw their money into the scheme.
When the economic feedback loop I described above influences a major market, it turns that market into a kind of pyramid scheme. Farmer Bob sees that stock prices are rising faster than inflation, so he buys stock (and thereby drives the prices a little bit higher, himself). Once he has purchased that stock, though, the only way the value of his stock can go up is if someone else comes into the system after him to put even more money into propping up the stock prices.
Before you argue, "But a pyramid scheme involves the selling of something that has no intrinsic value," let me remind you that in the feedback loop I've described, there is a disconnect between actual value of the item in question and the economic value people are willing to pay.
Examples abound: the high tech stock race in the late 1990's involved people buying "shares" in "companies" that produced nothing but debt, and were therefore useless. The housing market in southern California now, likewise, involves people paying millions of dollars for houses that are arguably worth only a couple hundred thousand dollars worth of materials and labor on land that is in imminent danger from landslides, sinkholes, and earthquakes. The intrinsic value of the house has nothing to do with the prices people are willing to pay. And many people are buying simply because they reason they can turn around and sell for even more, thereby driving up prices even further. These "investment properties" make the bubble ever larger.
A pyramid scheme collapses when The Last Guy has put in his money, and there's nobody behind him to do the same. With no more cash flowing in, the value evaporates.
Likewise, in a major market, the bubble that was fueled by the economic feedback loop bursts when The Last Guy puts his money down and nobody stands behind him to do the same. With no more money coming in, values level off. With values being level, there's no point for investors to continue keeping their money in the system (and, in the case of a market like housing, there's real cost in the form of mortgage payments that are even more discouraging), so they start to sell. Demand drops, available supply increases, and the feedback loop now feeds itself in reverse. Prices plummet, and so on.
The difference between markets like the stock and housing markets and a typical pyramid scheme is that those markets do, ultimately, trade in items that have *some* intrinsic value. Stocks with no intrinsic value, of course, must necessarily disappear with the wind during a downturn in the market, but stocks that are backed by real companies that produce real value survive. Houses, of course, remain houses, even after the housing bubble bursts.
And how does that happen? Well, in the frenzy of everybody selling at a loss in order to not lose any more, someone eventually realizes, "Hey, that house [or company, or whatever] still has actual value, and I can buy it at these cheap, cheap prices now because the prices are currently below that intrinsic value."
Then The Last Guy, who has held on to his stocks or investment properties or whatever in the hopes that things would turn around, finally sells his investment because he reasons that the situation can never be salvaged. When the Last Guy finally sells, available supply is no longer a glut. Prices level off. The recovery begins, and the feedback loop renews (slowly, at first) its uphill climb.
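The whole cycle -- new money flowing in, The Last Guy, the burst, the overshoot below intrinsic value, and the leveling off -- can likewise be sketched as a toy simulation. Again, every number here (the size of the inflow, the momentum factor, the pull toward intrinsic value) is an assumption chosen to make the shape of the curve match the story, not a claim about real markets.

```python
def last_guy_market(intrinsic=100.0, inflow_periods=15, total_periods=40):
    """New money pushes the price up each period until The Last Guy
    (inflow stops); trend-chasers amplify whichever way the price is
    moving; a weak pull toward intrinsic value stands in for sellers
    cashing out of an overpriced asset and bargain hunters buying an
    underpriced one."""
    prices = [intrinsic]
    trend = 0.0
    for t in range(total_periods):
        inflow = 2.0 if t < inflow_periods else 0.0   # stops with The Last Guy
        pull = 0.1 * (intrinsic - prices[-1])         # value investors / cash-outs
        price = prices[-1] + inflow + 0.7 * trend + pull
        trend = price - prices[-1]
        prices.append(price)
    return prices

prices = last_guy_market()
# While money flows in, the price climbs well above intrinsic value.
# Once the inflow stops, momentum reverses, the price overshoots
# below intrinsic value, and then levels off near it: the recovery.
```

Run it and the curve tells the essay's story: a bubble well above intrinsic value while the inflow lasts, a crash that briefly dips below intrinsic value, and a leveling off once the bargain hunters step in.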
I first explained my Theory of The Last Guy to a friend of mine in the late '90's when he explained to me that the high tech stock bubble was going to have to burst imminently (like, within days). I told him that I didn't agree, because I still knew people who were reluctant investors who were just then deciding that maybe they should get into the frenzy. Until they actually put their money in, I reasoned, The Last Guy hadn't spoken. And until The Last Guy puts his money down, the market continues in its uphill (or downhill) trajectory.
Sure enough, however, about a year and a half later, The Last Guy finally ponied up, and then the high tech stock bubble burst. My friend and I have joked since then that we know The Last Guy, because we actually know a few people who put their money in just *one day* before their stocks began to plummet.
So why am I telling you this now?
I'm seeing more and more articles in the newspaper talking about the housing market bubble in parts of the country, and there can be no doubt that such a bubble exists. But the very fact that newspapers are cautioning us about the nature of this bubble means that the Last Guy hasn't spoken. The Last Guy is the most reluctant investor. He doesn't put his money down until the market continues to defy his expectations by going up, up, up, up. The Last Guy has money to invest, but is playing it safe until, finally, all indications are that this isn't actually a bubble, this is a permanent state of affairs, and he may as well get in on the game.
When he does, of course, the jig is up.
That's why the time to worry about the economy is when the economists all agree that, well, they were wrong before that the economy had to eventually slow down, so it must all be up, up, up, up for the foreseeable future. When they all agree that things are only going to get better and better, that's when the Last Guy gets in on the action, and that's when all hell breaks loose.
We haven't reached that point yet. But it's coming. The next economic shockwave that will rattle America will come from the housing market. It must happen soon (within the next couple of years). But not until The Last Guy puts his chips on the table.
So, what do you do with this helpful piece of information? First, you keep your eye on the Last Guy. If you *are* buying a house, buy it for its intrinsic value to you (my wife and I, for example, bought our house because we needed a bigger house, and not for investment concerns). This is the most dangerous time to get into the game. Don't invest in real estate right now.
But when The Last Guy who bought at the top of the market starts whining that it's time for him to sell that investment property, THEN is when you take a serious look at investing in real estate again.
The Last Guy always buys high and sells low. Study what he does, and then bet the other way.
September 23, 2004
|
For a very long time -- since high school at the very least -- I've thought of the human brain as a rather sophisticated pattern recognition engine. Wherever possible, we seek to establish direct causal relationships among events. When that's not possible, we infer more subtle relationships at work. This enables us to make connections that are not at first obvious but which nonetheless help us to survive.
Example of the obvious: Me throw sharp stick at charging lion. Charging lion fall down, bleed out eyeballs, stop bothering me. Me survive to throw another sharp stick another day.
More sophisticated: Sun come up over here. Sun go down over there. Sun must be circling around Earth to come up again over here tomorrow. Me get cave facing over there so that me sleep in tomorrow morning.
Of course, the more subtle models where we make connections with limited information may prove to be incorrect down the road, but they are sufficient for the time being and thereby enable us to get along. Once they no longer work, we adopt even more sophisticated models. For example, we figure out that in order to explain the way stars and planets move across the sky over time, the sun couldn't possibly be going around the Earth, but perhaps the Earth goes around the sun. Etc.
During my high school days, this was my theory for explaining away the concept of intuition/precognition and the practice of religious belief. I figured that the experience of intuition (a preferable word, for me, to 'precognition' or 'premonition' or 'clairvoyance', which imply new age cosmic woo-woo) was simply a sophisticated pattern recognition model that generated correct answers on the basis of incomplete information. When somebody gets a strong hunch that a certain event is going to happen, and then it doesn't happen, they shrug it off as a bad guess. When that strong hunch pays off, they call it premonition. Someone who is able to consistently experience good hunches on the basis of limited information could be considered 'intuitive' or 'psychic', depending upon your preferences, but it all came down (as far as I was concerned) to having some good pattern recognition going on somewhere in your brain.
Likewise, when we have incomplete information about how or why things happen the way they do, our brain finds comfort in (or actually develops and enhances belief in) the subtle and sophisticated models that we call religion: a belief system that explains the otherwise inexplicable connections among objects and events. Before we can explain rainbows scientifically, we ascribe them to an invisible Being that paints the sky with colors to remind us of a promise (as in the story of Noah in the Judeo-Christian religions). Later, when we figure out that raindrops can form prisms that separate out the colors of sunlight, we no longer need religion to explain rainbows. (And, importantly, some of us choose to keep the religious explanation, while others write off the religious explanation as fanciful stories.)
The story of Noah is more charming than the story of raindrops acting as prisms that separate out the colors of sunlight. But understanding how light works enables us to create CD players and computers, which in turn allow me to write goofy essays and beam them to you on something called a 'web site'. The story of Noah does not allow us to manipulate the physical world thusly. The story of Noah *does* allow us to convey social memes and to disseminate ethical ideologies. But the pattern explained by the story of Noah is simply not sophisticated enough to build televisions, and so we move on.
This, as I said above, was my interpretation of the world at the age of 16 or thereabouts. I was struggling to explain events that had occurred in my life, without sacrificing my emerging sense of *reason*. If you've read the essay that immediately precedes this one (Pattern Recognition, Part I) -- the one I wrote five years ago or so -- you've no doubt noticed that I'm still fascinated by the concept of brain-as-pattern-recognition-engine.
So while I was visiting a friend's house recently, I was pleasantly surprised to discover a book called something like Why People Believe Weird Things. In it, the author (editor of Skeptic Magazine, or something along those lines) asserts his own similar views about the brain-as-pattern-recognition-engine concept, and uses it to explain why people believe in astrology (a subject for another essay of mine, I expect), UFOs, the Green Party, and other flights of fancy. (Okay, just kidding about that last one. Nobody believes in the Green Party.)
The author includes a chapter on near death experiences and comes to the conclusion that, well, there is no conclusion. The scientific evidence is simply not there, one way or the other, to sufficiently (for the author, at least) explain why so many people seem to experience so many common events as part of the dying process. The author manages to successfully debunk both the common scientific explanations as well as the common woo-woo new age ga-ga explanations, leaving the question of life after death and the status of the immortal soul open. Of course, since it isn't explained one way or the other by science, this remains in the realm of religion for most people. We are then left with the agnostics' dilemma (which the author acknowledges): if we don't have a basis for believing in the certainty of life after death, and by extension the immortal soul, then how do we find comfort in living what may well be, by extension, a meaningless life?
He offered an answer that I found interesting... until I gave it some serious consideration. His answer may be paraphrased like this:
History is a long, long continuum of events that impact other events. We know this as a fact. We also know that one small event at one point in the continuum can have very powerful repercussions farther down the line. We know this, too. There are innumerable examples of people who died in seeming obscurity but who later became famous because something they did ended up having a huge impact on society down the line. Since we do not know at what point of the continuum our lives are played out, it is entirely possible that there is still a great deal of history yet to be written, and we may very well end up making that small difference now that could have a huge impact in the future. Ergo, we should take comfort in the fact that even if we don't know it now, what we do with our lives may prove to be extremely important / influential in the years that follow us.
So went the author's reasoning, and I found it to be a comforting thought at first. But then I gave it more consideration, and realized that it suffers a bit of a fatal flaw:
The author says that the person who does not know whether to believe in a life after death can find meaning in the idea that we may, for all we know, end up leading an influential life anyway.
"Wait a minute," my brain says. "In other words, that means if we can't find comfort because we don't know our place in the cosmos, then we should find comfort in the fact that we don't know our place in history."
Now THAT doesn't make any sense at all. What the author presents as the agnostics' solution is really just restating the agnostics' problem: "Sure, you don't know if you matter; but at least you don't know if you matter."
So I see a pattern in the problem of the pattern-recognition-engine. Recognizing patterns can be a source of comfort because, in general, it is recognizing patterns (correctly, usually) that helps us to survive. But when we reach the limit of our ability to recognize the patterns, we are left with... what? Filling in the blanks with "To be determined?" Or filling in the blanks with a leap of faith that the religion you've picked (or that has picked you) will have the best answers? Neither option is intellectually satisfying. (And, perhaps, we have more options available than just these two....)
That said, the question of which to choose comes down to this: which option helps you to move forward? Which option helps you to answer the questions that need to be answered in order for you to make progress? When a person has hit the limits of his or her ability to solve the puzzle at hand, is it better to keep banging away at it until you've revealed enough information to guess at the pattern, or is it better to simply leap, and have faith that the net will appear?
September 20, 2004
|
I originally posted the essay below to this site on November 18, 1999 -- roughly a year before I went to a content engine, which is why it does not appear in the archives. I am repeating it here because it leads into another essay I'm about to write, regarding pattern recognition and how the brain works.
This essay was originally published as "Why a Little Insomnia is a Dangerous Thing":
I bought some more cool decks of cards last night. This time, they were special "Bicycle" versions of Canasta and Pinochle decks. You know, it seems to me that Pinochle decks used to be almost as common as poker decks at the check-out counter of any grocery store; but, now, I don't think I've seen them for years. I also picked up a Mille Bornes deck last night; I've been looking for that for a while.
What is it with me and playing cards?
When I was in my teens, my compulsion of choice was books. I had to "complete the set" of certain authors -- Heinlein, Bradbury, Ian Fleming, Stephen King -- and I even went so far as to buy "collector's editions" of early paperbacks of these authors. This meant I sometimes ended up picking up several copies of the same book, if only to get ever closer to that "First Time in Paperback!" copy.
I don't do that quite so much, anymore, although I still pick up each new James Bond or Stephen King book as they come out. The days of trolling around for old paperbacks have gone, however.
In college, music was my vice. If I heard a song and liked it, I had to get the album. Certain artists also required "complete the set" action: Dire Straits or Pink Floyd (which I never actually did finish, come to think of it) or Billy Joel, for example. Other particular faves -- Concrete Blonde and Suzanne Vega -- I tracked from the beginning, snapping up each EP single as well as each new album as they were released.
I don't listen to the music quite as intently as I used to. I still buy lots of new releases, but I don't listen to them as much, and I've recently discovered just how much I miss that. I'd like to return to some kind of work environment where music is a part of it. (Note: I used to work at a radio station.)
While none of these -- and other, similar -- compulsions have completely died away, they've certainly abated. In the meantime, one of my lesser hobbies has grown: my fascination with playing cards.
The cards thing, like most "collect-the-whole-set compulsions," started when I was a kid. As a kid, when you go places, your adult companions want to buy you souvenirs. Very gradually, whether by my choice or by the accident of what other people chose for me, a trend emerged where I ended up selecting souvenir playing cards more than any of the other common items.
Keep in mind, adults who went on vacation would also pick up souvenirs for the kids back home. Somehow, as a preference evolves, people pick up on it and use it to make their souvenir purchase decisions easier. "Oh, Billy likes 3-D Viewmaster slides, and Janey likes little license plates with her name on them." Next thing you know, they are reinforcing the preference, and so on.
For my mom and my maternal grandmother, the collectible of choice was and is souvenir spoons. My mom even has a souvenir spoon from the European city where I was born, with my birth date etched into the back. Both my mom and grandmother have racks upon racks of various collectible spoons.
For my sister, the collectible of choice is shot glasses. And, of course, for me, the souvenir of choice is playing cards.
However, I've since developed a deeper interest in "igralniye kartiy". The beauty of everyday playing cards from Soviet Russia encouraged me to look beyond just the standard souvenir backs. Since being exposed to those wonderful decks, I've developed a keener sense of the aesthetics of playing cards in general.
(Author's note: this is going somewhere. Trust me.)
Independently of my aesthetic appreciation, I became more involved in actually playing card games. I began to sit as an alternate at a friend's poker table many years ago, eventually becoming a regular. Another friend of mine and I created our own table in Boston, as well. Since then, I've joined a few tables and made a stab at starting a few new ones of my own. I've even played poker in Reno (and held my own, I might add). I'm not a great player, by any stretch. I enjoy playing, nonetheless.
One thing I've been coming to recognize with all of this card playing is something that applies throughout all areas of my life: all of my life has been centered around pattern recognition. In this way, I'm not so sure that my life is any different from anyone else's, but it's becoming clear just how pervasive this is for me. Most of my introspection revolves around identifying patterns.
In many games I enjoy, mastery comes with pattern recognition. Chess, poker, cribbage -- all start with basic rules, then moves, groups of moves (gambits), then series of gambits, series of games, and so on. Trends, and trends of trends.
A deck of cards is loaded with patterns. There are patterns of face value (suits and ranks) and patterns of design (where pips are placed; the drawings of the face cards, design of the backs, the fact that backs within are uniform while faces are unique, etc.). Most magic tricks, incidentally, rely upon recognition and then violation of these patterns. (And, yes, I've picked up a few magic tricks along the way, too.)
Sets of decks have patterns, too. Decks may differ in back design from one another, but each maintains that one back design throughout the entire deck. Each may place pips of different sizes and shadings, but there's always a pip in the upper left-hand and lower right-hand corner of each card -- even if some decks have additional pips in the other two corners or along the sides (like the decks I just bought). The four suits are always hearts, diamonds, clubs, and spades -- the first two being reddish in color, the last two being black or another very dark color. Face cards are drawn with different styles from deck to deck, but the queens inevitably hold flowers while certain jacks and kings carry weapons.
Sets of sets of decks have patterns, as well. Trends among US Playing Cards brands (Bicycle, Aviator, Bee, Aristocrat) are distinct from the trends found in Hoyle brands (Hoyle, Maverick), or Gemaco or Liberty, ad infinitum. There are even greater "meta" trends: US manufacturers vs. European, Eastern Bloc, and Asian manufacturers.
Then, there's playing cards vs. Tarot cards vs. other game cards (like Mille Bornes, Pit, Uno, Wizard, Rook, etc.).
There's cards as games vs. "stone games" (backgammon, mah jong, chess) vs. ball games (from billiards to baseball) vs. board games (Monopoly, Scrabble, and so on).
And, there's cards as introspective or forecasting tools vs. astrology vs. palm reading.
There's even cards as building materials vs. match sticks vs. toothpicks vs. Lincoln Logs vs. Legos.
Lest we forget: cards as noise-makers (putting them in your bicycle spokes) vs. bike horns vs. bicycle bells.
There are cards as promotional products or souvenirs vs. spoons vs. pens vs. shot glasses vs. key chains vs. t-shirts.
All sorts of patterns, and patterns of patterns, and different ways to classify and re-group. This isn't just a statement about cards, of course. The same "patterness" is found among foods, cars, houses, clothes, relationships, political systems, biological systems, religions, music, literature, languages, etc.
All life is pattern recognition.
The late, great Canadian comedy troupe "The Frantics" has a sketch that asks: "Is the idea of this game show to find out the idea of this game show?" We are posed with the same question here: is the meaning of life to find out the meaning of life?
The answer is in the cards.
I am confident that the answer is a resounding "Yes." The problem, as Douglas Adams pointed out in The Hitchhiker's Guide to the Galaxy, is not with understanding the answer. The problem is with understanding the question.
July 26, 2004
|
Below is a slightly modified version of an essay first posted to my site August 6, 1999:
I am blessed to have friends and family who manage to travel along all walks of life. One of my dearest friends is a noted journalist/activist who has written some of the best work you'll ever read about the destruction of the environment in Upstate New York; he's had stories of his own appear in many of the major papers, and has even been the subject of a few stories himself. (The New York Times, in particular, comes to mind.) In fact, he was the editor-in-chief of Generation when I first started there. That's where I had my first real taste of the finer points of journalism-on-a-deadline.
Like many of my friends, Eric is about as multi-talented as you can get. One of his many current careers, though, is that of professional astrologer.
I happened to have a chance to get together with him a couple of weeks ago in Manhattan. Although he and I have been working on a few things together here and there for the past couple of years, it had been seven years or so since the last time we'd actually seen each other in person. Ah, the wonders of the Internet!
Spending some quality time with Eric (insofar as eating at the Stage Diner can be considered "quality time") led me to think about some cold, hard realities... and some not-necessarily-realities. Such as, for instance, astrology.
What if Astrology were the real deal, and my skepticism ill-founded? What if, someday, the results of the double-blind studies came in, and Astrology won out as being a valid indicator of personality, behavior, and destiny?
I'd like to present for you the year 2020, if Astrology were proven to be a valid science:
- Colleges would, naturally, use the birth dates of applicants to determine not only whether they should be accepted for admittance, but what majors they'd be allowed to choose.
- Presidents of the United States of America would be required to be 35 years of age, and a Leo. Secretaries of State, likewise, would be Aries. Voters would tend to choose Pisces for legislative positions, just to get them out of the food service industries.
- Police would require a breathalyzer test only of non-Geminis. Geminis, themselves, would be presumed guilty automatically.
- "The stars told me to do it," would become a legally justifiable excuse for misdemeanors. For felonies, the plea "Not Guilty by Reason of Astrology" could be entered, but each side in the case would inevitably bring in fifteen Astrologers to argue the true meaning of having been born at 10:15am GMT on some particular date.
- Pop Astrology would take up the debate about whether consumer products (like cars or software) were "born" the day they were released to manufacturing, or the day they first went on sale.
- "You must be this tall and a Libra to ride this roller coaster."
- The legal drinking age would be 21, except for Aquarians, for whom the drinking age would be 19.
- Budweiser would stop using "Born On Dating" on their cans and bottles after they discovered that many batches couldn't be sold because they were "born under a bad sign".
- OJ Simpson would *still* be looking for the real killers.
When you get right down to it, is the world I just described any less scary than the one in which we currently live?
April 03, 2004
|
Constitutional amendments aside, here is how best to preserve the sanctity of marriage.
-- Insofar as "sanctity" means to be held sacred or holy, the major faiths must resume treating marriage as if it were, well, important. This means that getting married must again require some amount of effort on the part of those who are getting married: going through the various rituals and/or training that used to be required in some churches/religions, paying dowries, and so on. And, of course, honoring the tradition of having marriages pre-arranged by a morally upright, disinterested third party whose job it is to bring together two people who have never met before. Hey, it worked for our great great grandparents, right?
-- It also means that getting out of a marriage must be, well, difficult. To make divorce an exception rather than a rule, make the cost of divorce high. Excommunication from the church used to be a big deal. But civil penalties can be imposed, as well. You need very special legal grounds, argued in a court of law. No quickies. Divorcees lose certain legal privileges, like the right to vote or otherwise take part in government, since we wouldn't want people who violate a sacred trust to be in any way entrusted with the affairs of state. Oh, and divorce would be heavily taxed.
-- The penalties for violating the oaths of marriage would also have to be severe. Infidelity by either partner? Stoning. To death. I'd also suggest similarly harsh penalties for spousal abuse, even though there's not much of a tradition of punishing spousal abuse.
Ah, but if we want sanctity for marriage, we need more than to make it costly to get into and out of. We also need to provide some holy benefit. Some religions, like the Mormons, allow women into heaven if they get married (as part of a family deal -- if the husband ain't going, neither is the wife). But what incentive is there for the men? Can someone familiar with the world's major religions help me out here? What are the blessings accorded to married people that makes marriage better than singlehood (or non-married couplehood, or non-married polygamy, etc., etc.)? Once I know that, then perhaps I can make suggestions for changes to our civil policies to incentivize marriage, as well.
But, in the meantime, let's bring back arranged marriages and stoning for breaking the vows. That should go a long way toward restoring the former glory of marriage.
March 18, 2004
|
So, I guess the mayor of San Francisco decided that he (or the city he manages) was above state law and decreed that the city would recognize gay marriages. I don't follow the news as much as I used to, so I'm a bit hazy on the details, but that's about the gist of it, right? And the state of California said, "No, buddy, a marriage is defined as a civil union between one man and one woman, so there's no such thing as a gay marriage." Am I following the story so far, even if only in general terms?
As a result of all of this, the U.S. President says he wants an amendment to the Constitution to codify what the Congress has already legislated, and what the states have already legislated, defining marriage as a one man, one woman arrangement. The Defense of Marriage Amendment, or something like that, right?
Now, I'm not sure why a constitutional amendment is needed, insofar as the laws are already on the books, unless one is worried about the laws being overturned by the Supreme Court. But, that said, the rationale I'm hearing for such an amendment is this: that we need to preserve the sanctity of marriage.
Of course, given that there is *supposed* to be a separation of church and state in this country, it seems rather odd to me that the government should be in the business of preserving the sanctity of anything. It's up to the various religions to determine what is sanctified and what is not, right?
Now, before I go too far down that road, let me also acknowledge that yes, this country was founded upon Christian ideals and that, additionally, the government does have legitimate reasons to regulate the legal status of marriage, a civil union with peculiar property rights issues and child guardianship matters and which is much more than merely a public proclamation before the church and any God or Gods concerned.
It stands to reason that our nation would regulate the legal status of marriage in accordance with the Christian traditions that have informed so much of our nation's governing principles. Still, to do so in the name of preserving sanctity is a dubious claim, especially when sanctity is a church issue, and some churches define marriage (and divorce and annulment and so on) so differently from others.
But whether you agree that the government should or should not get into the sanctity business, and whether you agree that the default concept of sanctity should or should not be based upon the traditional Presbyterian (or other Protestant, non-Mormon church of your choice) definitions of marriage, I am struck by the idea that the gravest threat to the "sanctity of marriage" is the idea that women want to "marry" women and men want to "marry" men.
The alleged sanctity of marriage has already been completely and utterly undermined by the trend of men not wanting to stay married to women and women not wanting to stay married to men. In practice, the notion of "Until Death Do Us Part" has been replaced by "Until I Don't Feel Like It Anymore." According to a February 2002 report by the U.S. Census Bureau, 50% of first marriages in the United States are likely to end in divorce. According to this report, as of 1996, a mere 55.9% of first marriages that began in the early 1970's even made it to their 20th anniversaries.
One of the hallmarks of marriage is supposed to be commitment. It is exactly that commitment that is lacking in the modern American definition of marriage, while it is exactly that commitment that gays and lesbians say they desire for themselves. And thus, we arrive at the irony of preserving the sanctity of marriage: that we don't honor our own commitments while at the same time we refuse to recognize the commitments that others (gays & lesbians, polygamists, etc.) would like to make to each other.
Allowing gay and lesbian civil unions, by whatever term you wish to call them, does not cheapen monogamous, heterosexual marriage. Divorce, infidelity, jealousy, annulment, and spousal abuse cheapen marriage.
Protestant Christian America has left marriage in a ditch along the highway of history, nearly road-kill and barely clinging to life. How does only allowing straight, allegedly monogamous men and women to degrade it do anything toward preserving its sanctity?
In my next post, I'll describe the most effective way to preserve the sanctity of marriage.
MORE...
March 05, 2004
|
As many of my faithful readers know, I'm not very good at remembering names or dates -- which makes my choice of being a history major in college something of a mystery even to me.
So I can't remember if it was Attorney General Edwin Meese or Senator Jesse Helms, but *someone* up there in the federal government during the 1980's, when unable to actually define what constitutes pornography, borrowed Supreme Court Justice Potter Stewart's famous words: "I know it when I see it."
The Supreme Court, equally decisive in condemning offensive material and vague in defining exactly what constitutes the same, favored the notion of relying upon "community standards" to determine what is, and what is not, offensive.
The FCC at the time was less vague. I worked in broadcast commercial radio at the time, and we had very clear guidelines on what was acceptable. The "seven dirty words" (the ones from George Carlin's comedy routine, whose broadcast prompted an earlier Supreme Court decision allowing them to be banned from the public airwaves) were never appropriate. Innuendo was fine any time of day, but any overt sexuality (such as the Dr. Ruth show) was to be saved for after 10pm.
For a couple of years, I ran a two-hour comedy show each week on Sunday nights at 11pm. We set and followed our own guidelines, whereby material we deemed to be risque would be held until after midnight.
Certainly, accidents happened, both at the station as a whole and during my comedy show in particular. These mistakes could take the form of a miscued bleep of one of the aforementioned dirty words, or somebody making an error and swearing while his or her microphone was accidentally left on. One idiot at our station referred to Aretha Franklin as "Urethra Franklin" on air by mistake because he'd gotten into the habit of doing so off the air.
There was a procedure for handling these kinds of situations. We'd log what happened in our daily FCC log, and we'd prepare to face the music if anyone ever complained to the FCC.
As it so happens, no one in Ithaca, New York ever complained about such mistakes, which were not common but not unheard of.
Flash forward fifteen years or so. Half-time acts during the Super Bowl regularly make spectacles of themselves by grabbing their crotches and undulating on stage, draping flags around themselves and singing about punching out cops and bonking their fuck-buddies. This has been going on for several years, and I guess the lines have been getting blurry. With an apparent lack of guidelines as to what is and is not appropriate for the public airwaves, the ambiguity of "community standards" when talking about a national audience, and the problem of who knows what when they see what, the lines got so blurred that the notion of offensive material was all but forgotten by those waltzing along them.
Then, this year, Janet Jackson flashed a pasty (pastie?) covered boob in a choreographed routine that positively exuded sex with a hint of violence, and someone at the FCC jumped up and said, "That's it! I see it! I knew I'd know it when I saw it! That's offensive!"
I'm not making this up: I actually heard a sound clip of the FCC chairman referring to the Super Bowl halftime show as a "sacred moment", the enjoyment of which was permanently soured when his family was so unexpectedly exposed to this . . . this . . . boob.
So, like, Mr. FCC Man: what freaking planet are you living on? The Super Bowl is a popular sporting event. It is not sacred. Get over it.
And where were you during all the crotch grabbing?
Where were you when Kid Rock danced on the stage wearing an American flag that had been torn in the middle and turned into a poncho?
Where were you during the songs about punching out cops?
And why was any of the sexual suggestiveness on the stage any more suggestive than the freaking *cheerleaders* who shake their groove thang in front of the cameras going into every single commercial break? Mr. FCC Guy: how did you explain cheerleaders to your young and impressionable progeny?
Why are you more afraid of a boob than you are of rows and rows of heavy thugs lining up time after time on opposite sides of a pig-skin with the singular purpose of pummeling each other into the ground?
Americans are more afraid of sex than of violence. I acknowledge this fact intellectually, even though I don't understand it. (As a history major, I can give you all kinds of reasons, stemming from our Puritan roots. It's still insane.)
Let me go on record as saying that I prefer sex to violence, and I'd rather see a shapely breast than a boxing match. (And, let me also concede that, having said this, I was watching the Super Bowl nonetheless with the expectation of seeing a football game rather than a peep show.)
Janet and her buddy Justin, though, combined sex with implied violence, which I guess makes it a little worse than even just sex.
So the American public was all atwitter about what happened during the Super Bowl, and the media couldn't stop talking about it for weeks. Nor could the rest of us. Often I'd go out to various meetings, only to have the issue come up. Some folks thought Janet's performance was obscene. Some thought the rest of the halftime show was obscene. Others thought football was obscene. Still others thought there wasn't a problem at all.
Janet revealed more than a little bit of skin that afternoon. She revealed that community standards are not. She revealed that while we all "know it when we see it," we all see it differently.
Obscenity is in the mind of the beholder.
I'm not the first to make this observation. Even in Genesis, Adam and Eve's reaction to nudity was all in their minds. Before they became "enlightened", nudity was no problem. But after eating from the tree of knowledge, boy did they become uptight. Get me a fig leaf, quick!
Okay. So obscenity is all in the mind, and we all have different minds, so we are all offended by different things. Are we all on the same page?
Janet has been forgotten. But the FCC has not. The FCC is on the prowl. It feels it has let the American public down (and, in many ways, it has), and it wants to atone. So it's going after that most dreaded den of obscenity: talk radio.
Congress has not adequately defined obscenity. The Supreme Court has dodged behind community standards. But the FCC sure knows it when it sees it. Or hears it. So it is fining stations that carry talk radio shows that say things that it finds offensive. But it has not issued guidelines as to what counts as offensive and what doesn't.
It's an effective strategy. The government won't define it, but it *will* take violators of the unwritten rules to court. And the government *will* fine violators of these unwritten rules. The result? Terror. Radio stations are muzzling their talk show hosts, telling them to lay low for a while while they try to figure out what kind of policies they should follow in order to best avoid getting fined.
As a tactic for keeping broadcasters on their heels, it's brilliant. Of course, it doesn't produce better (or even, necessarily, less offensive) programming. But it *does* produce *nervous* broadcasting.
Long before we bestowed the term "terrorist" upon rogue elements who sought to earn sympathy for their political causes by murdering people (a stretch of logic I still don't quite understand), historians singled out a particular kind of government tactic as rule of terror. Here's how it works:
First, ban some behavior using vague terms.
Next, enforce this ban haphazardly, seemingly on a whim, and make the punishment excessively harsh.
The result? A scared, scared population.
This is exactly the road down which broadcast radio and television are currently heading.
There is a great deal to be said in favor of regulating standards of conduct among public broadcast frequencies. (Private broadcasting mechanisms, such as cable television, are another matter, and one for another discussion.)
But what Janet revealed is that those standards need to be specific and well-defined. They must not be left up to the whim of whomever happens to be watching from the FCC that particular day. They must not be left up to the whim of what a given judge in a given court finds offensive on a given day.
This is partially a question of favoring rule of law over rule of terror . . . I, for one, prefer that the United States not slide down that slippery slope that has engulfed so many other democracies which have relied upon rule of terror instead of the rule of law.
But it's also a question of accomplishing your stated goals in the first place. The best way to make sure that standards are adhered to is to publicize exactly what those standards are and enforce them consistently. Don't leave it up to "you'll know it when you see it." The producers at MTV have different standards from the producers of PAX. (And quite frankly, I find both offensive, but for different reasons.)
If *I* set the standards, Beyonce Knowles would have had the center stage for the entire halftime show (she did an amazing rendition of the national anthem at the start of the game, don't you agree?), there would be none of those fireworks or laser light shows or any of that nonsense, and the cheerleaders would have been allowed to perform topless during the game. But only if they wanted to.
--Allan
PS: if you want to read a funny story from the point of view of a cheerleader, check out this story by my friend Joseph Paul Haines.
February 06, 2004
|
One character trait that has been dogging me for years is a tendency toward overcommitment. I'm not what some people refer to as a "joiner" -- I don't go around joining clubs just so that I'll be a member of a lot of clubs. Rather, I'll commit myself to performing various tasks or roles to the point where I don't have the time to do them all.
[In case any of my new co-workers are reading this: this negative trait of mine is only in my personal life, and it doesn't apply to my work habits. At work I'm very careful not to overextend mysel-- wait a minute. That doesn't sound too good, either. Hmmmm.]
Overcommitment is a different kind of insanity from being a joiner, but not by much. These days, I emcee a monthly open mike (open mic?) night at a local coffee shop, I'm on my homeowners' association board, I'm webmaster for a couple of non-profits, there are writing workshops and critique groups, I'm trying to be active in my local political party of choice -- and never mind regular (and firm) commitments with Alexander (doctor's visits, lessons, playgroups) and the daily commitment to my employer.
Additionally, I have writing goals I'm trying to make and chores around the house that require regular attention. And so on, and so on.
Some of these commitments come about out of necessity, but many come about either because I'm passionate about them (writing; public performance) or because I have some sense of "should" about them (civic participation, and taking a shower *at least* once a week).
Then there's watching ER on Thursday nights, which isn't a formal commitment, but it just works out that way.
I frequently entertain the (false) notion that I used to not be overcommitted -- that I used to live up to all of my obligations. If I were to be honest with myself (it happens, but only rarely), I'd acknowledge that I've been overcommitted since at least elementary school. Cello practice? Who has the time!? Yearbook staff meeting? I'm too busy to make it!
I used to think that I wanted to "be a writer", until I finally wised up to the fact that what I really wanted was to have written. I didn't want to write a novel; I wanted to have written one. Well, I wised up, and decided to become a writer, and then I wrote a novel.
A lot of my commitments are going south because many of them are things I want to have accomplished, rather than because they are things I want to do. Worse, there are a number of things I *should* accomplish that I'm not doing because I'm spending so much time on commitments that I neither should nor want to do anymore. I have stuck out of a sense of duty rather than out of any real need or desire.
If I learn how to quit some of these commitments -- just walk away from them -- then I can take the newfound free time and... blow off my other commitments with less anxiety.
A few months ago, in a rare moment of insight (and free time), I wrote in my private journal that I needed to quit a few of my commitments. I chanced to pick up my journal again recently, and noticed that from that long list of expendable commitments, I'd released myself from exactly one of them. How pathetic.
Clearly, I'm not committed to quitting my commitments.
So, what do I do? When I commit myself to quitting, the first commitment I quit is the commitment to quit commitments. Ack!
I believe there's some organization like a "joiners anonymous." Although, by its very nature, wouldn't all the members really just be posers? I mean, by joining such an organization, aren't you defeating the whole point of getting that joining monkey off your back? So, by extension, there's probably no *valid* sort of "overcommitters anonymous", because the very idea of going to meetings regularly would defeat the purpose of trying not to commit any more.
[sigh.]
I should just be committed.
February 05, 2004
|
For the past year or so, I've been posting rather infrequently to this here website, which is funny (not funny ha-ha, but funny weird) because traffic to my site goes up every month. I guess the less I write, the more popular I become. Or something.
But whereas I had only the lame excuse of "gee, I'm busy" to keep me from posting here, I now have a more coherent reason for my relative silence. I've started work at a new employer.
When I was first getting to know him, a friend of mine named Allen claimed to be so bored one day that he read through my entire website. He found it odd and interesting that he had to read an awful lot before discovering any mention of my wife (let alone her name), and he was also curious as to whether I was still working for my previous employer (I was not) because my blog has generally only hinted at my employment situation, as well.
I wrote an essay a while back about the conflicting interests that surround freedom of speech. My contention was (and still is) that, while we enjoy the freedom to say what we will, we also are obliged to deal with any consequences that may result -- and that there are often consequences.
My primary concern in that essay was about the annoying (to me) error of referring to consequences as censorship, or even more strongly put, McCarthyism. The Dixie Chicks certainly have a right to say they don't like the current President of the United States. Radio stations in the Bible Belt, likewise, have a right to not play Dixie Chicks records. Both the Chicks and the radio stations are making a point about what they believe or what they are against. But the radio stations are not censoring the Chicks. They are, rather, selecting their own messages just as carefully as the Chicks did.
In a more recent example, Janet Jackson's choices regarding her freedom of expression (which, while not Constitutionally guaranteed, is considered by the Supreme Court to be Constitutionally implied) have led to her being uninvited to be a presenter at the Grammy Awards later this year. Is CBS censoring her? Or are they choosing, instead, to select performers with a public image that is more copacetic for their intended audience?
On the other hand, is the FCC censoring CBS and/or Miss Jackson by threatening and/or imposing fines for what happened during the Super Bowl half-time show this year? Arguably, yes, they are. Censorship is pressure brought to bear by the government regarding what one says or how one expresses it.
Now, then, what does this have to do with me having a wife or changing employers?
Quite a bit.
Paulette, my wife, has a life and a set of interests of her own. She tends to not be as public with her stories as I tend to be with mine. I believe she prefers I not say too much about her in such a public forum as my web site, for fear that I might say something that she'd be uncomfortable having broadcast.
I have a choice, of course. I can put everything out there for the world to view, or I can just shut up about anything that concerns Paulette. Or I can walk a tightrope somewhere in between. Alas, since we are married, and our lives are so interconnected, there are very few things that are a part of my life that aren't also a part of hers.
Is this a case of censorship? Hardly. But anything I say can and will be used against me.
It's reasonable for Paulette to want her privacy. It's reasonable for me to want to share my stories with the world. It's also reasonable for me to respect her privacy. So I do what I can to say what I want to say without pulling her out on display with me too much.
Our son, Alexander, is another matter. My preference is to say enough to tantalize those parties who are interested -- maybe even give a photo or two -- but not say so much as to have Child Protective Services pay us a visit for being bad parents.
Likewise, there has rarely been much for me to say, nor any benefit in saying it, about changes in my employment situation. Usually, all the interesting stuff happens during one's employment, not afterward. (Your mileage may vary, of course.)
Shortly, I'll be posting the story of how I came to get my current job -- it was most unusual, even by my standards -- but, for the time being, I'm simply too busy during the day actually *doing* my job to tell *stories* about it, and I'm generally too tired in the evening to even look at the computer.
I'm sure to post it soon, however. I don't want the server to break down under the strain of all the increased traffic I'll get if I *don't* post. :-)
June 14, 2003
|
Another installment in my gingiva graft saga! A movie review! Existential angst! Haircuts! All this, and more, in one essay! The mind reels!
In case you haven't been following my gum surgery story, or in case you'd like a refresher course on where we are, here is a brief summation of what has gone before:
- Around Christmas time, a recession in my gum line (lower jaw) split open, leaving two flaps of gum material just sorta hanging out at the base of one of my lower teeth. Icky.
- Saw a periodontist, who recommended a gingiva graft: take gum material from roof of mouth, insert into exposed area, and sew it all up.
- Had the procedure, but some of the gum material escaped, so I...
- Had a second procedure, in which the remaining transplanted gum material was more securely fastened. Alas, this didn't quite heal right, so I...
- Had a third procedure, in which other gum material (from upper side) was transplanted to the base of the previously exposed tooth, to act as a barrier to further decrepitude.
That third procedure was due to happen about a month ago, as I mentioned in a previous essay. Sure enough, I went in and the procedure itself went as perfectly as it can go -- just like the other two had -- but there was something very unsettling that I happened to notice as they were showing me the work after it was done.
[By the way, my stories about my gum surgery might be considered a little graphic by my readers who are a little squeamish... you have been warned.]
There was nothing wrong with the work they did. What was unsettling was that my mouth didn't look like my mouth anymore. Specifically, it was my lower lip. Most folks have some vertical tissue that connects their lower lip to their lower gums. I'm sure this has a name, but I haven't a clue as to what it is. Some people have two strands of tissue, others have one. It's funny, the things you notice after you've had gum surgery.
Anyway, I had one strand of tissue that rose up in the middle of my lip, rather high, connecting to my lower gums. Because there was a lot of tension on my lower gums (they were very tight), the periodontist kept cutting back that connective tissue, lowering it with each procedure. By the end of this third procedure, the connective tissue was so low as to be not even visible to a casual inspection of my lower mouth.
So here I am, looking into a mirror at my lower mouth, and the gum work is picture perfect. A fine looking set of gums on these ol' choppers. But it's not my mouth! That one little change -- the apparently missing connective tissue -- completely messed with my concept of what I should expect when I look into a mirror at my mouth.
This was not the first time this spring I'd looked into the mirror and seen someone else.
A couple months earlier, I'd gone in to have my hair cut. This was the second time I'd seen this particular stylist, and so we had to talk about kids and all that obligatory introductory stuff that you have to talk about when you and your hair stylist are getting to know each other. She was washing my hair (prelude to a cut) when I told her that I had a son at home, and she asked what color his hair was.
"Blond," I said. "Like mine."
"What do you mean, 'Like yours?'"
"What do you mean, 'What do I mean?'"
"You're not blond."
Well, my hair was wet, so certainly it must have been darker than when it's dry, but when I sat down in the chair for my haircut, I noticed that, well... my hair was brown. She cut my hair, and it continued to dry. It stayed brown. I got home, and looked in the mirror. Nice haircut. Brown hair.
Ahhhhhhhhh!
Now, the area where I live happens to enjoy rather short days during the fall and winter (and early spring). Shorter days than anywhere else in the US, except for Alaska. Plus, the area where I live tends to be overcast for much of the winter, which is the rainy season. (Winter forecast: drizzle, 45 degrees. Every day. Summer forecast: partly sunny, 75 degrees. Every day.) Like many blonds, my hair tends to get lighter with exposure to sunlight; darker without.
Okay, okay, but this was ridiculous. My blonditude was in doubt, which meant my entire self-concept was in doubt. Who am I? Hair color isn't just about hair color. It's about identity. You identify people by their appearance, and that includes hair. How long their hair is, how it is styled, whether it's curly or straight... and what color it is. I began to understand why there is such a big money industry surrounding hair-loss products for men and hair styling products in general. When we look in the mirror, we want to see ourselves looking back. That, or we want to see a *better* ourselves looking back. This is why some people dye their hair, because doing so changes their identity to something they'd prefer. This is why they fight baldness, because they want to retain the identity they've grown accustomed to.
When my hair-line receded at the temples ("widow's peaks" is the term for this kind of AWOL hair, but I don't know why), it didn't bother me all that much because it had happened gradually, and it was minor. I still had hair and, hey, I was still a blond.
I have an uncle who is a cop. One week, while his wife was out of town, he and his fellow cops did what cops whose wives are out of town are wont to do: they got drunk, and they shaved their heads. My uncle used to have (thinning) red hair. Very Irish. When he shaved his head, he looked, well, like a cop. A tough cop.
Then his wife came home. For the sake of this story, let us say that she was not amused. He let his hair grow back. It grew back brown. No kidding.
(For the record, let me state that I have considered the "shaving your head to change your image" idea, but it wouldn't work for me. There is a photo of me after a skiing session where I'd worked up a sweat, and my hair was all matted down so as to make me look bald. I looked like Uncle Fester, of the Addams Family. Not the image I'd want to adopt. [shudder])
So. Throughout the months of March and April, I felt my identity slipping away. I wasn't a blond anymore. Who was this stranger looking back at me in the mirror? I don't know. Somebody with brown hair. Maybe, like my uncle, the change was permanent. For my birthday, Paulette got me a card in which she had written, "You'll always be blond to me." I'm not sure if I was supposed to find that reassuring.
I've gained a few pounds over the years. Let me rephrase that. Every year since college, I've gained a few pounds. I graduated 13 years ago. A few pounds every year means... oh, brother.
And now, the inside of my mouth is completely different from what I'd become accustomed to over the last twenty years or so. Who the hell is this fat, brown-haired guy with the unfamiliar gums?
If you've read about my first two gum surgery experiences, you know that it's important to take some time off and relax just after you've had the procedure. For me, this means staying away from home, since Paulette and I work from home, and the kid is an added distraction (and he *is* work). So after my third procedure, I went to see a movie and sipped on a big gulp of Sprite. Tried to forget about my mouth for a little while. What movie did I go see?
In the movie, ten people are stranded at a hotel during a rainstorm. The roads are out, the phones are out, and one by one, people start dying. At the same time, a convicted killer is being considered for clemency by the men who put him away. The movie aspires to be Hitchcockian, and it comes close. The acting is superb, and the direction is well done. There are some very nice touches, especially surrounding how the two stories relate to each other (for example, the weather in one story line is always the same as the other story line, which is a very nice detail). Both stories are self-contained and interrelated at the same time. This is part of what makes the movie work, but it is also part of why the movie didn't quite realize its aspirations for me. I'll explain why below, so that you can skip that part if you don't want to see spoilers about the movie.
The key to the movie, to nobody's surprise, is the title. The movie isn't just about the identity of the killer, it's about the killer's Identity writ large. It's about *each* character's identity. Anyone who has seen the Twilight Zone as many times as I have will figure out the mystery before the movie reveals it, but that doesn't detract from the mystery as it unfolds.
As distractions from physical discomfort go, this film was a fine way to spend the first couple hours of recuperation from my most recent gum surgery. But spending a couple of hours in the Twilight Zone of someone else's imagination did nothing to rescue me from my own private Twilight Zone.
It's a big ol' world, and there are a lot of nasty things going on. Just a couple days ago, NBC News showed me, during the dinner hour, a man in Louisiana getting shot fifteen times by police. The guy f'ing *died* right in front of me while Tom Brokaw blathered on about the investigation. In the grand scheme of things, changes in hair color or how my lip is attached to my gums are hardly Earth-shattering. I'm fortunate enough to be in a position where I can afford the luxury of a minor identity crisis.
Which is all by way of saying, the shock of seeing a different mouth in the mirror has worn off. It's still weird, but not shocking. I'm more sensitive than ever to my hair color (strange, but true), but as the days have been getting longer (longer than anywhere else in the US, outside of Alaska) and I've been taking Alexander on daily strolls through the neighborhood, I see encouraging signs that my blondness is returning. Whew.
Trivial concerns? Absolutely. But that doesn't make them any less real. I'm surprised that I would even react this way to things as minor as these cosmetic changes. But as I mentioned earlier, an entire industry is doing booming business because of these very concerns. Even you, dear reader, have been concerned about your appearance once or twice in your life. Before this little episode, though, I hadn't been so overtly aware of how much I have invested in my appearance, sloppy though it has always been. That investment includes a piece of my very identity.
May 22, 2003
How does the saying go? "Always a pallbearer...."
Dead people are starting to catch up with me. They're taking more and more prominence in my life.
RICHH is dead. I was planning to tell you about him anyway, but now that Mr. Feinstein is dead, I have to tell you about RICHH. Can't put it off any longer.
Rich Halberstein was co-editor of the Cornell (TM) Lunatic humor magazine in 1987 when an accident forced him to take time off and thereby thrust the burden of keeping the magazine alive onto the shoulders of his co-editor -- who, in turn, ultimately handed the magazine over to a friend of mine and me. Rich was, by all accounts, a very funny man to be around and his humor in the magazine was simultaneously brilliant and tasteless. I mean, really tasteless. Like, "Missing Children Playing Cards -- Collect the Whole Set."
I found out about Rich dying a few months ago (about a year after his death, in fact). I did a little digging and discovered that he and I have been (sorry, had been) living in the same town for the past several years. A little more digging revealed that Rich was a bit of a legend on Usenet during the early years. (For those of you who don't know what Usenet is, think of it as the global bulletin board analog to the web or to e-mail. If that doesn't help, then just think of it as a geeky computer gnurd thing that involves lots and lots of people who aren't you.) There are *fan sites* dedicated to collecting his various posts. (Examples here and here.)
Typically, Rich would post under the user name RICHH, although he did occasionally use pseudonyms, as well. He posted treatises on philosophy and pop culture, he posted humor, he posted raunchy porn of a particularly literary bent. Much of what he wrote way back then reads kinda like a modern day blog. The guy was ahead of his time. And now he's dead.
Weird things have been happening lately. Like, soon after finding out RICHH was dead, I received a lifeline from past Lunatic editors who are trying to organize alumni who worked on the magazine. I mentioned RICHH being dead. They -- you know, they -- asked if I'd like to write a few words about him for an upcoming newsletter. The more I researched Rich's life in an effort to say something interesting about him, the more fascinating his life became to me.
At my suggestion, and I know this sounds tacky, we're going to put his obit under the "Where are they now" section, featuring (I hope) a photo of his tombstone. Since he was buried in Florida (nowhere near any of my friends or family in Florida, either) and not in Seattle, it'll be hard to get that photo. But the more I read of his writing, the more I think he would have wanted it this way.
The thing about RICHH is, he was a downright funny as well as tasteless guy who lived hard and died young. I feel like it's possible to be irreverent in my obituary for him in the Lunatic alumni newsletter without being disrespectful. But there's also an undercurrent in any irreverence I may choose to employ regarding his death. Any dime-store psychologist would recognize such humor as a defense mechanism.
Earlier today (Wednesday, not Thursday -- I'm writing this after midnight, but today is still Wednesday to me), I received the word that Paul S. Feinstein died a few months ago.
It's getting a bit late in the evening now for me to adequately eulogize this man. He was my honors English teacher for junior and senior years in high school. He was one of the best teachers I've ever had. He had a cool and reserved sense of humor, a firm sense of what was right and wrong, and he demanded that we do our best to live up to our potential. He died all too young.
I wish I could sum this all up with some profound observation about life and death. I don't have any. Rather, I tell you all this by way of simply saying that death is increasingly on my mind these days. Like most of you, I've lost close family members throughout the years. But the rate at which I'm losing peers is now accelerating noticeably. A few years ago, an old high school friend with whom I had lost touch passed away. He was the first of my peers to go (not including my cousin Mark). Rich and I never got to know each other, but his death is nonetheless relevant to me right now. But losing Mr. Feinstein is bringing it all home.
Once upon a time, my peers started getting married. Later, so did I. Then, they all seemed to be having kids. Later, so did I. Now, my peers are starting to die. And eventually...
That's a signpost up ahead. Welcome to Midlife Crisis, USA.
March 31, 2003
I have too many things to say just now. I can't sort 'em all out and make them coherent. So, instead, I'll offer you a guest essay.
Several years ago, this site would feature about one guest essay per month, but I haven't posted any in quite a while. Shortly before the war broke out in Iraq, I heard the following item read at RASP, a monthly coffee-shop open mic, and I particularly liked it. Fred Jessett is a regular there, and I have always enjoyed his fiction and his essays. I asked Fred if he would be kind enough to let me post this latest essay of his, and he consented. Here it is:
"LOVE IT OR..."
by The Rev. Fred Jessett
It was only a bumper sticker, and it was many years ago, but it still haunts me. I was living and working on the Rosebud Sioux (Lakota) Reservation in South Dakota in the early 1970s, the days of battling sticker slogans. "Make Love, Not War" vied with "America: Love It Or Leave It."
One day I saw a particular bumper sticker that hit me hard. It felt as if every Native American from the past 500 years, living and dead, was speaking the words on that car: "America: Love It, Or Give It Back." Or maybe it hit me because it also felt like the voice of God.
My first reaction was to realize more deeply than ever that we are not the owners of this land, we are its stewards.
And a question arose in my mind. Do our actions show that we love this land as well as its original inhabitants did, and do?
Then another question came up: what does it mean to truly love this country? Is it a lump in the throat when the national anthem is played, or saluting when the flag passes in a parade? Yes, but also much more.
Here's what I think it means.
It means that we strive to live up to the vision of the Declaration of Independence. That vision said "...all men are created equal...endowed by their Creator with...unalienable rights: life, liberty and the pursuit of happiness..."
It is a vision of a people who each take responsibility for the common good of the community, and of a community that safeguards the rights of each person.
It is a vision far greater than the signers of the document could themselves fulfill. Some of them owned slaves; few thought men without property should vote or hold office. None of them thought women should. Their vision enabled them to see further than they themselves could go.
The process of fulfilling that vision is a long one. It took "four score and seven years" for slavery to be abolished. It took much longer than that for women to gain the vote. Today we continue working to fulfill that vision.
To love our country means knowing that the flag and the national anthem are not the possession of any political party nor of those holding one particular point of view. These symbols belong to all of us. They should never be used to divide us.
They call us not only to defend our rights and responsibilities but also to exercise them, and to respect the rights of others to do the same.
It means knowing that political dissent is not disloyal or unpatriotic. That's what this country is all about: the right of citizens to differ with the government and speak their minds freely. It is the most basic right we defend.
It means no matter how much we may disagree with the policies of our government, we will not take that out on the men and women who serve us in the military. When there is war, we will pray for them, welcome them home, heal the wounded, and mourn those who have given their lives.
It means we pray for all those who have been entrusted with the power of governing whether we agree with them or not.
It means we do not prosecute or persecute any other Americans because of their religion, race or national origin. We will resist all threats of violence or discrimination toward people for these, or any other reasons.
In a time of national crisis, we must remember what we are striving for, not just what we are fighting against. If our love for this nation, and all her people, is true, then we will never cease working for the fulfillment of the vision on which we were founded.
With all my heart, I pray that we will do that, for those words still haunt me, "Love It, Or Give It Back."
(c) copyright 2003, Fred Jessett. Used by permission.
March 16, 2003
Should the United States go to war with Iraq? Some say we should; others say we shouldn't.
Back when I was in the sixth grade, one of my teachers, Mr. Z, sat us all down and told us there was only one thing in the world that we ever *had* to do. "Nuh-uh," was the general response. He said he didn't think we even knew what that one thing was.
"I have to take out the garbage on Thursday nights."
"No you don't."
"I have to do my homework when I get home from school."
"No you don't."
And so on went the conversation, each child holding up his or her hand to volunteer the one thing he or she had to do. Mostly, we started with chores. Then there was the occasional, "I have to breathe," or, "I have to wear a coat in the winter."
But Mr. Z kept responding that we didn't have to do those things.
So what was the one thing we *had* to do? He let us in on it: we had to pay the consequences for everything we did or didn't do.
We didn't have to wear a coat in the middle of winter. But we had to pay the consequences for that choice. We didn't have to do our homework. But we had to pay the consequences for that choice. And so on and so on. You get the picture.
This was a very liberating and a very troubling idea for a sixth-grader to behold. It gave us -- those of us who chose to think about it, anyway -- an immense sense of... responsibility. We could make any decision we wanted. It was okay. But we had to pay the consequences. Responsibility, as I've learned in the years since then, is a very powerful thing. It can be used to shape your life in any number of ways. When you accept responsibility for your life, you own it all. Success and failure alike.
Taking this principle, it is a truism that we all have the right to say whatever we want. But we also have to pay the consequences. In Soviet Russia, you could criticize the government in public. Of course, the consequences were pretty severe... severe enough to probably prevent you from being physically able to do so a second time.
Should the United States go to war with Iraq? Some say we should. Others say we shouldn't.
Happily, I was born a citizen of a country where the law says that the government shall not interfere with my right to speak one way or the other on that, or any other, issue. My friends and I have discussed this issue in public and in private. We are sometimes agreed, and sometimes we disagree. Sometimes, we raise our voices. Or, in e-mail, we might TYPE IN ALL CAPS. If I wanted to, I could even broadcast my views on the possibility of a US war in Iraq right here on this web page, where literally *dozens* of people could read it.
The only thing I would have to do is pay the consequences.
As I said, my government is prohibited by law from interfering with me for expressing my views, even if said opinions should run counter to the current administration's views. But that doesn't mean there wouldn't be consequences.
Take Martin Sheen, for instance, who has a higher profile than I do (if only a little). He has stated publicly that he disagrees with our current administration's stance on war. His language has been more colorful than that, but you get the idea. He's been rather adamant in expressing his opinions.
Now, coincidentally, this actor happens to play the President in a popular television series. The network that carries that show has expressed some concerns about the publicity surrounding Sheen's comments. Visa has stopped airing commercials featuring Sheen. And now some Hollywood folks are expressing concerns that this could escalate into a rebirth of McCarthy-era blacklisting.
Visa denies that they pulled the commercials for political reasons. Let us suppose, however, that their decision may have been at least partially influenced by the controversy surrounding Sheen's remarks. If so, does this mean that they are resorting to McCarthy-era blacklisting? I argue that the answer is, "No."
If the *government* were to step in and say, "Sheen should not be allowed to work in this industry because of his stated opinions," then that would be McCarthyism. That would be a violation of the first amendment. If an individual advertiser says, "Hmm, do we want to continue to have a controversial critic of the government representing our product," that's different. Visa, in such a case, is defending its own freedom of speech.
Speech involves more than just the text of the words. Speech includes how they are said. When an organization picks a spokesperson -- be it a rock star, an actor, a sports celebrity, or a cartoon camel -- that spokesperson becomes a part of the message. It's all fine and well to say that Martin Sheen should be allowed to speak his mind. With that, I wholeheartedly agree.
But it is also appropriate for Visa to exercise its own freedom of speech. When they present their message, it is appropriate for them to evaluate whether the message is diluted because it is presented in a controversial form or through a controversial medium -- or, in this case, by a controversial spokesperson. When Visa delivers their message ("our credit cards make your life easier"), they want you to think about their message rather than think about war, the government, actors who insert themselves into the political arena, or whether you admire or hate the spokesperson for his outspoken political views.
The Dixie Chicks, during a recent concert in England, reportedly announced to the crowd that they were ashamed of the current administration in the United States. The Dixie Chicks are from Texas and, according to the report, they said they were ashamed that the President came from their home state.
In Texas, some people who hold a differing view called up radio stations and asked them to stop playing the Dixie Chicks. Some radio stations have made the decision to remove the Chicks from their playlists. Are the Dixie Chicks losing their right to speak? No. They continue to enjoy the right to express their opinions. But it is also within the purview of the radio stations to choose what message *they* want to convey. If they don't want to be identified with the Chicks' opinions (or, for that matter, if they wish to give the message that they actively disagree with the Chicks), then it is entirely reasonable for them to decide to remove the Chicks from their playlist. It is even reasonable, as was the case with one station, for them to announce that they'd rather destroy Chicks CDs than play them and encourage others to do likewise. Should that station be allowed to say such things? Should the Chicks be allowed to say what they said? *MY* opinion, of course, is that yes, they should. In both cases. The right to free speech unhindered by government intervention applies to those on both sides of any given issue. Even if they be boneheads.
Martin Sheen, the Dixie Chicks, Visa, and Dallas radio stations have the right to speak their message. You and I have the right to agree or disagree with any of them, and to express our views publicly or privately, as we see fit. But there's one thing the Constitution of the United States simply cannot address: while the state is not allowed to abridge your speech, it also is powerless to save you from the consequences of your speech.
When Sheen's chosen speech is at odds with Visa's chosen speech, the two will part ways. Both parties will suffer or enjoy consequences for their decisions, both leading up to and following these events. Perhaps Sheen's decisions will lead to world peace. Perhaps they will lead him to new acting roles that he will get simply on the basis of his principled action. Perhaps they will lead to loss of work because potential employers wish to avoid controversy. Perhaps Visa will gain or lose customers on the basis of their decision to drop the Sheen ad. Perhaps the consequences for either party will be trivial.
An advertiser's aversion to controversy is not the same as McCarthyism. And while Sheen's rights should not be abridged, nor should his responsibility.
Should the United States go to war with Iraq? Some say we should, and others say we shouldn't. Some say nothing at all. But regardless of what we say or don't say, the only thing for certain is that we will all have to face the consequences of our action or inaction.
What say *you* on the topic of freedom of speech? Feel free to enter your comments... or pay the consequences for your silence!
November 27, 2002
First of all, let me apologize for the lameness of my posts lately. I'm not only not posting very often, but I feel that my posts these days don't say very much. I assure you, it's not a function of having a kid in the house. I think it's the result of a number of things, including (but not limited to) being in a generally crabby mood these past couple of months, being overworked, underfunded, etc., etc.
One contribution to my crabby mood in the past two weeks in particular has been an ear infection. I blame my last post on the ear infection. (I mean, really, all I wanted to do was post a picture of the kid, but I found it necessary to speak vaguely about "family" without actually saying anything meaningful. Sad, sad, sad.) The earache was painful. So painful that it hindered my enjoyment of talking (something you know I love to do). Chewing was a problem. Even eating M & M's was problematic.
No, not problematic. It HURT.
The doctor prescribed ear drops. The pain got worse. He prescribed pain killers. The infection continued to worsen. He prescribed antibiotics and steroids. Things have gotten better.
But you might get a kick out of the "Cautions" for one of the drugs he prescribed for me. After reading this, I wasn't sure if the cure was better than the problem:
DO NOT STOP TAKING THIS MEDICINE without checking with your doctor. Stopping this medicine suddenly may cause serious side effects. KEEP ALL DOCTOR AND LABORATORY APPOINTMENTS while you are using this medicine. BEFORE YOU HAVE ANY MEDICAL OR DENTAL TREATMENTS, EMERGENCY CARE, OR SURGERY, tell the doctor or dentist that you are using this medicine. THIS MEDICINE MAKES YOU MORE SUSCEPTIBLE TO ILLNESSES, especially if you take it for an extended period of time. Prevent infection by avoiding contact with people who have colds or other infections. If you are exposed to chickenpox, measles, or tuberculosis (TB) while taking this medicine or within 12 months after stopping this medicine, call your doctor. Report any injuries or signs of infection (fever, sore throat, pain during urination, or muscle aches) that occur during treatment and within 12 months after stopping this medicine. Your dose may need to be adjusted or you may need to start taking this medicine again. CHECK WITH YOUR DOCTOR BEFORE HAVING IMMUNIZATIONS (VACCINATIONS) while you are using this medicine. BEFORE YOU BEGIN TAKING ANY NEW MEDICINE, either prescription or over-the-counter, check with your doctor or pharmacist. [pregnancy warnings omitted]
...and this was before listing possible side effects, which included: difficulty sleeping, mood changes, nervousness, increased appetite, indigestion, swelling of feet or legs, unusual weight gain, black tarry stools (whatever that means), vomiting material that looks like coffee grounds (I'm not making this up), severe nausea or vomiting, headache, muscle weakness, and prolonged sore throat, cold, or fever.
I'm pleased to say that I feel much better now. Today should be my last day on *that* particular drug, which is also a good thing.
November 17, 2002
Many people I know spend a great deal of time lamenting the deterioration of our society. The news has shifted from reporting to opining and entertaining. Politicians are sleazier and sleazier. Crime is up. Education is down. And our popular culture is dumbing America noticeably.
As one who was trained as an historian, I often find it necessary to point out that these things come and go in cycles. That the so called "news" today may be bad, but the same kind of scandal-centric infotainment was all the rage back when Hearst's papers inspired the term "yellow journalism." That Clinton was hardly the first President to be accused of inappropriate liaisons while residing in the White House... nor the first to be re-elected with that reputation. That crime is always going up... and down... and up... and down. That Johnny, by and large, can read. That our pop culture is just as varied in its quality today as it ever has been... but that the good selections from the past have survived in our memories while the inane selections have been conveniently forgotten.
I stand by these observations. By and large, the world is a better place today than it was ten, twenty, fifty, a hundred, a thousand years ago. A hundred years ago, the average life span in America was what, forty-eight years old? It's now in the seventies. Sure, AIDS is bad and cancer worse, but so were polio and TB and smallpox back in the days of our grandparents and great-grandparents. The world political situation is a bit edgy these days (is that a gross understatement?), but do you remember the cold war and fears of nuclear armageddon a not-too-distant decade-and-a-half ago? Not so long ago, we were taught to "duck and cover" because we lived in a world gone mad. The world may not be sane right now, but my point is that not all things are always getting worse. We simply don't always acknowledge to ourselves where things have gotten better or are getting better.
Still, every once in a while, I find something to remind me that in some respects, we are in a "trough" for various quality cycles. Take television writing, and sitcoms in particular. Sure, there have always been bad shows and good shows, relatively speaking. But the writing for the past ten years has been arguably awful, and there's little sign of improvement (for now).
I want to take a moment here to talk about the Dick Van Dyke Show.
What is the best written sitcom today? I'm going to go with "Frasier." Formulaic, certainly, just like any sitcom must be. But, there's a lot of cleverness that manages to come through even within the constraints of the formula. Do you think there's better writing in a sitcom today? Please comment below, as I'd love to know.
During a recent trip along the West Coast, my family and I were staying at a hotel and we chanced to watch some television one night. We don't have a television feed at home (long story), and haven't had one for about three years. There is something very liberating about not having television at home. Something isolating, as well. So, for the first time in a while, we surfed through what cable had to offer, and found the Dick Van Dyke Show on Nick at Night.
The episode involved a golf outing where Rob (Dick Van Dyke) encountered a fellow who used to date Rob's wife Mary (Mary Tyler Moore) back in college. Unbeknownst to Rob, the fellow is now a priest. The priest doesn't realize that the Mary he talks about is the Mary who is married to Rob. As the episode unfolds, Rob confronts Mary about the priest (neither one knows that he's a priest, remember), Mary invites the priest over for dinner, Rob invites his female officemate to dinner as a blind date for the priest, and much hilarity ensues.
This is sitcom plot number five. There are only seven, I'm told. This plot is the comedy of insufficient information and incorrect assumptions.
I was expecting the withheld information (the priest's identity, Mary's identity, et al) to be kept from the participants for the duration of the episode, which is a common ploy these days. But instead, the characters figured out the errors of their respective ways pretty quickly, which was both MUCH more believable and MUCH more funny. Everyone copped to their various mistakes, and moved forward while still providing a great deal of laughs at a ridiculous-but-plausible situation. The writing was positively brilliant.
The episode then threw me for another loop in the epilogue, when Mary brings out an old shoebox of letters and poems that the priest had written her back in college. She reads Rob a sonnet. Here I was expecting the sonnet to be particularly bad or humorous. Instead, it was... beautiful. Touching. A completely non-funny, totally romantic love poem. And Mary makes a very interesting observation about the sonnet that is also not funny, but appropriate. The result? A sitcom episode that was both hilarious and deep. It was moving as well as entertaining.
And this was a typical episode of the Dick Van Dyke Show. This wasn't a "Very Special Episode, in which Rob Discovers He Has the Disease of the Week." While Frasier (or the sitcom of your choice) may have writing that is above average for today's television drivel, the characters are all caricatures. They react neither the way we would react, ourselves, nor the way we would hope we would react. As a result, they don't engage us. Without engagement, there is no tension. Without tension, the humor is forced.
(Why do I hear the voice of Yoda in the back of my head just now, saying "pain leads to anger, anger leads to hate, hate leads to suffering, suffering leads to pain, pain leads to codependency," etc., etc.?)
I do not subscribe to the philosophy that everything is getting worse all the time. Nor will I go so far as to say that television writing is on a one-way slide into oblivion. Except when it comes to Saturday Night Live. Nonetheless, I think television humor has become substantially less sophisticated in recent years. "Edgy" or "cynical" is not the same as sophisticated.
One thing about being in the trough, though... things will get better. Someday soon, this may even be said of the Great American Sitcom.
November 11, 2002
|
Wayback (not "way back," but "wayback") when I was in college, a good friend and I enjoyed watching a television show called The Wonder Years, which focused on the coming-of-age of a fellow named Kevin and his friends and family during the 1960's. The story was told like one big flashback, narrated by actor Daniel Stern as the adult Kevin, even though we only saw the young Kevin (played by Fred Savage) on screen.
My friend pointed out on some rainy Tuesday many years after we'd started watching this show that every single episode seemed to involve the narrator saying something along the lines of, "I knew then that things would never be the same."
Kevin kissed his girlfriend Winnie for the first time, and he "knew then that things would never be the same." Winnie's brother was killed in Vietnam, and Kevin "knew then that things would never be the same." Kevin played hookey from Coach Cutlip's gym class, and he "knew then that things would never...."
Well, you get the picture.
It was sort of a funny formula, the kind that drinking games are made of. "Next time Kevin says he knew then that things would never be the same, everyone drinks a shot." Whatever. Despite this predictability, the show was fun to watch. Even as I type this, I realize that there may even be a little bit of "Wonder Years" that was lurking in the back of my mind as I began exploring the good and the bad of 1980's Buffalo in my recently completed novel.
But that's not why I bring this all up.
It seems that most days with Alexander are producing in me the same kind of "and I knew then that things would never be the same" response that seemed to fill up the fictional Kevin's life. Ferinstance, Alexander (three and a half months old at this point) completely rolled over from lying on his back to resting on his tummy all by himself yesterday. More than once. After rolling over, he started trying to crawl. He moved around a bit, but didn't quite manage to get anywhere. But you could see he was figuring things out.
Once he rolled over the second time, I knew then that things... you know.
Allow me to point out that we don't currently have a television feed in our house. We rent movies, borrow DVDs, etc., to pickle our brains as necessary, but we don't have cable or satellite or anything like that. And yes, this is a little odd, given that my current project (near completion!) is a pilot for a television series being written on spec. It's also a little odd, given my role as some sort of pop culture consumer type guy. I'm catching up on my pop culture reading though. :-)
Anyway, this all means that Alexander hasn't been spending much time plopped down in front of the television. In fact, he hasn't been spending *any* time in front of the TV.
Until recently.
Now I must also point out here that there's this little device called a "pacifier" which is a pretty magical gizmo. You place the little rubbery thingy in his mouth when he's crying, and he stops crying. If he seems tired and you want him to sleep, you give him this wonderful invention, and he goes to sleep. I knew from the first time we gave him a pacifier and he took it that, well, things would never be the same.
Recently I was watching a video course (this is like an audio course, only it's... oh, you know) from the Teaching Company about detective fiction. I no longer get my pop culture the old-fashioned way; now I watch videotaped college lectures about pop culture. (Actually, I'm learning more about the form of the detective novel because I think I can learn from these kinds of thrillers as I put together my next novel.) As I was watching this very dry presentation by a rather high-pitched professor, I noticed that the previously antsy Alexander had moved around on the floor where he was babbling so that he could see the screen. He was fascinated. Completely drawn in. The television was acting as uberpacifier. He watched until I was done with my lecture.
We are not using the television as a baby-sitter for Alexander, and we have no intentions of doing so. But now that I've seen the immense power of the television on our child, I can't unlearn that knowledge. Things will never be quite the same.
...I gotta say, though, that the television makes the pacifier look like a much less controversial choice than it once seemed. :-)
October 09, 2002
|
Paulette recently sent some friends and me a link to an article on abc "news" dot com about research into the "World's Funniest Joke." While I'd hardly call this news, it certainly fills the infotainment genre that ABC, CNN, and others call news. I was very infotained, as several of the jokes listed were quite fun.
The winning joke, as quoted by ABC:
Two hunters are out in the woods when one of them collapses. He doesn't seem to be breathing and his eyes are glazed. The other man pulls out his phone and calls emergency services. He gasps to the operator: "My friend is dead! What can I do?" The operator in a calm, soothing voice replies: "Take it easy. I can help. First, let's make sure he's dead."
There is a silence, then a shot is heard.
Back on the phone, the hunter says, "OK, now what?"
I like it. This would probably work better as a radio sketch than it does as a written joke, but I still like it. However, a friend who was on this discussion thread said she had heard about the contest results on a local (New Jersey) radio station, and that the station had said that the winning joke was about New Jersey. Alas, looking at the ABC "News" article reveals nothing about New Jersey.
Then, on a lark, I checked the site of the actual contest, which revealed that the winning joke *does* mention New Jersey:
A couple of New Jersey hunters are out in the woods when one of them falls to the ground. He doesn't seem to be breathing, his eyes are rolled back in his head. The other guy whips out his cell phone and calls the emergency services. He gasps to the operator: "My friend is dead! What can I do?" The operator, in a calm soothing voice says: "Just take it easy. I can help. First, let's make sure he's dead." There is a silence, then a shot is heard. The guy's voice comes back on the line. He says: "OK, now what?"
Notice that the New Jersey reference is not the only change made in the ABC "News" article.
What galls me the most, insofar as anyone can be galled by an infotainment piece about the World's Funniest Joke, is that ABC "News" presented the winning entry in quotation marks and then paraphrased it, rather than quoting it.
And, why? Why? Did ABC's rewording of the joke make it any funnier? Any less offensive to New Jersey hunters? Any less shocking to the squeamish, with the original joke's reference to his eyes being rolled back? I mean, what gives?
I'm very accustomed to the "news" getting it wrong. Misquotes are a fact of life, and always have been in infotainment. But what gives when you have the original text right in front of you to cut and paste into quotation marks? What?
Censorship isn't funny.
Say, that reminds me of a joke. How many feminists does it take to change a lightbulb? Oh, wait....
August 13, 2002
|
In response to Part I of this essay, my friend Tyrean notes that the changes that come with having a baby are more impactful than the changes that come with getting a driver's license, et al, because they affect your life 24 x 7. There is something to be said for this.
Once you have a baby (and choose to provide for its care), each aspect of your life changes. Your daily activities may or may not change, but they need to be planned for with a new element in mind. Your prior method of scheduling time goes right out the window, and basic tasks such as eating, working, and especially sleeping are profoundly altered by the requirements of your new responsibility.
When I had been told this kind of thing, as an expectant parent, my internal response was, "Yes, I know this." And I also knew that living it would be different from knowing it, and that has also proven to be true. Knowing beforehand that your daily routine is about to be altered forever has a different quality of experience from dealing with the reality when it happens.
All well and good. But the same profound shifts have occurred several times in my life, as I'm sure they have in yours, dear Reader. When I went away for college, I left my home town and my parents' house. Everything in my life changed profoundly: planning meals, managing my time for even the simplest tasks, and even my sleep schedule were impacted *every day*. Gradually the newness wore off, of course, but I still eat-sleep-act-think without parental interference. (This, some may argue, may not be such a good thing, given the photographic evidence.)
Getting married profoundly affected my life, as it may have yours. I don't think I went through the emotional swings that many of my friends have described -- rather, I did all that before making the decision to commit. Still, all kinds of decision-making were impacted because most decisions I made/make would/will have the added dimension of how they affect this other person in my life.
Moving to Russia, moving back, working full-time for a small business, going to grad school, working full-time for a behemoth corporation, working full-time for a high-profile dot com, going to Clarion West, etc., all marked major changes in my life. Certainly, some have been more profound than others, and some have had more long-lasting effects than others.
My central thesis, however, is that *whatever* the nature of the change, any well-balanced person requires these profound changes from time to time. Some changes are thrust upon us (like when we are forced at gunpoint to attend kindergarten), some are accidental outgrowths of our own decisions (like becoming a paraplegic because you chose to get behind the wheel one day while under the influence), and some are planned and desired landmarks (like deciding to go to college, get married, get a job, have a baby, whatever).
There is also a necessity for our lives to experience *retrenchment*. A prolonged period of stability, where we deliberately reduce risks and expose ourselves to less likelihood for change. This is certainly a psychological necessity, although we all obviously have different thresholds for how long and how stable such periods must be before we feel ready to push ourselves toward change again. The point is, we cannot live effectively if everything always changes. But we nonetheless require periods of change in order to advance our lives.
[I have a theory about the interrelations between fear and ennui as biological imperatives to encourage change and, thereby, growth and advancement. Would you like to hear it?]
I admire my grandparents, who embraced change in their lives not only as they advanced throughout their careers and their family life, but also beyond retirement. They are constantly trying new things, expanding their horizons, and staying involved in their lives. I think it is their ability to embrace change that has helped to keep them alive and alert and active this long.
A friend of mine picked up stakes and headed for New Mexico and (shudder) grad school after a profitable career at a major software concern where he was able to work without having completed his undergraduate degree. He has left the known (with all the good and bad that it includes) deliberately and chosen to make a change. That change now involves major house reconstruction, dinosaur digs, and developing patentable radar technologies.
Change is not for everyone at all times; it's certainly not for *me* at all times. But profound change is necessary from time to time for those who want to grow; who want to participate in their own lives. Having babies or dropping your career to go to grad school or getting married or even moving out of your parents' house may not be something you, my kind Reader, are interested in doing. But if you find yourself feeling a little bit of ennui, if your life seems to be just futile, allow me to suggest that you make a change.
You may discover that the change you made needs to be refined a bit (read: mistakes will happen), but at least you'll be participating in your life.
[end soapbox]
August 05, 2002
|
This two-part essay about "Changes" is not, ultimately, about having a newborn in the house, but that *is* where the essay begins. I'm starting off with an illustration, as it were, of one kind of life changing event, but there will actually be some content here that is not baby-related.
Allow me to start off my illustration, however, with a few baby pictures, since visuals are always fun.
In addition to the pictures I posted here when Alexander was first born, I've been sending along photos to Alexander's paternal grandmother who has been posting these additional photos on her site. I particularly like the batch at the bottom of page three, which I had taken when Alexander was only one week old.
I haven't had as much time as I'd like to scale the full versions of the photos I have down to a manageable size on the web, so I've been kind enough to allow Alexander's grandmother to take care of that. However, I *am* taking photos, and I simply *must* call attention to a couple that I'd taken yesterday, the day after he turned two weeks old.
It's amazing how much can change in a mere two weeks. The changes in Alexander's appearance only capture part of it; there are changes in how he vocalizes, changes in how he sleeps, changes in how he interacts. Naturally, we're still figuring things out. When he's awake, he's a very alert baby; when he is not happy with the world situation, he's very vocal about it. Each day is different in terms of how awake, how happy, how upset, and how hungry he is.
Alexander's mood, like anyone's, is prone to changing frequently and often. In a newborn, however, those mood changes do not appear to be terribly subtle or sophisticated. As adults, our emotions might shift several times within an hour, but the shift is rarely profound enough to be noticeable to outside observers... or even to ourselves. For example, in the mail, Paulette and I receive a gift for the little one from a friend, and I am happy. In the same pile of mail, there's the new car payment bill. I'm concerned. I drop the mail onto the kitchen counter and realize I'm hungry.
Little Alexander's shifts are a little more abrupt. He is set down on a favorite couch, as in the photo above, and he is happy. He remembers that the Dow Jones Industrial Average is off by several hundred points, as in the action shot below, and he is concerned.
This change took place within about ten seconds, in a photo shoot that lasted about, oh, thirty seconds.
When Paulette and I first started telling people that we were expecting, the most frequent response was "this will change your life forever." And of course, my most frequent thought about this response was, "Well, duh." Getting a puppy changes your life. Getting a driver's license changes your life. Forever! Anything that shifts your responsibilities and your capabilities has some profound effect on the quality and shape of your life.
Has Alexander changed my life? Certainly. But the whole idea that "having a kid changes your life" is trivial. It's a tautology.
Try this one on for size: life is all about change. Change *is* life. Once you stop changing, your life is ostensibly over.
*That* will be the focus of the second part of this essay.
...to be continued...
June 03, 2002
|
Is the following essay a coherent expression of an interesting concept, or just a pep talk to myself? I'm not sure.
I mentioned earlier that I'm discovering (rediscovering?) the power of commitments. While we might generally agree that it's a good thing to keep your commitments to other people -- you know, show up on time, do what you say you're going to do, etc., etc -- there is also a concept that I was introduced to a couple of weekends ago that goes a little deeper. That by establishing that your word is good, you give new power to your word. Your word becomes a tool to create. Since I'm working on my writing and speaking careers, this concept carries a few interesting entendres.
The concept as explained to me -- or rather, as I interpreted it -- goes like this: when I give you my word that I'm going to do something and I fail to follow through, I injure us both... even if only slightly. You are likely to feel let down, even if only a little, by my failing to follow through. I'm late to our appointment, for example. Not the end of the world, certainly, but you're a little put out. Bothered. It doesn't enhance your day, and can only serve to detract from it. Likewise, I feel a little bothered. Disappointed in myself. Rushed or frustrated. It doesn't matter *why* I didn't meet my commitment. The reason may be just as trivial or large as the promise itself. Either way, though, I've made a negative impact on our respective situations.
Likewise, if I fulfill a commitment I've made to you, both of us benefit. Again, you may not end up jumping for joy because I managed merely to meet you at the agreed upon time or followed through on some small favor, but by doing what I'd said I would, you at least feel good that I followed through and we can move forward from this point. I, likewise, feel reliable. I've invested some worth into my word.
Now, I realize this may sound like a load of New Age, touchy-feely nonsense. I've long subscribed to the concept that keeping a contract (however you choose to word it) is of the utmost importance -- whether from the Objectivist Epistemological arguments of Ayn Rand, or simply from the aphorisms my grandfather used to recite to me over and over again ("If the appointment is worth making, it's worth keeping," et al). When you get right down to it, this is the Randian contract as seen from a psychological viewpoint instead of a moral or social context.
But let's take it one step further. When you make a promise to yourself, you are doubling the stakes. When you break a promise to yourself, you are injured both as the promisor who failed to follow through and as the promisee who was failed. Likewise, when you keep your commitment to yourself, you benefit both as the promisor who followed through and as the promisee who was valued.
Now, none of us can keep every single promise we make. For many of us, most of the transgressions tend to be small. "Sorry I was ten minutes late, but traffic was awful." This may be because most of our commitments are actually small in nature. Ultimately, the damage or the benefits of breaking or keeping your word accumulate over time.
If you tend to make your word good, if you make a point of honoring your commitments, then when you give your word, you are more likely to be moved to make it happen. If you keep your promises to yourself and others, you are more likely to keep more of your promises. In other words, you develop a cycle of reinforcement.
(It's fun to note that in the ancient tradition of the Judeo-Christian model, God and the Word were one and the same. Here we have a being whose Word *is* Good, and therefore when the Word is issued, what is said simply... is. "Let there be light!" Lo and behold, there is light.)
So why do I bother mentioning all this namby-pamby mush? Because it provides me with a lever with which I can begin to move myself in the direction I want to go. Because by rededicating my word, I am making more distinctions, better decisions, and stronger commitments. I'm making fewer promises, now, but I'm making those commitments stick. As my word gets better, not only to others but also to myself, I'm finding it less effort to move forward.
A few entries ago, I mentioned that I'd followed through on a commitment to begin writing a new short story and to send out another story for consideration by a publisher. I then made a new commitment: to finish the story I'd started and to have it out by this past Friday. I have to admit, I wasn't feeling terribly moved by this commitment. It was a half-hearted promise to myself, at best.
But I decided that if my word is to have any weight, I have to do what I can to follow through with my declarations. I didn't even decide this with a great deal of deliberate thought. If I had, I think there would have been more of a "chore" aspect to following through. Instead, I simply... did it. Recent habits helped carry me forward. I began Friday with about 1,300 words or so written. By 11pm, I'd written a total of 3,600 words, trimmed 200 back out, and sent it off to my critique group. I'm now intent upon sending it out for consideration by June 15th.
As I develop the habit of keeping these small commitments, I expect to be able to follow through on the larger ones. Like becoming a published novelist. Like becoming a great father to my child/children. Like building a life that matters.
This may all be mumbo jumbo, or it may be the most profound concept ever devised. I'm inclined to think that it falls somewhere between the two extremes. I can, in fact, think of several counterexamples of people whose word was worthless but who nonetheless managed to do big things. However, I do know that renewing this concept of commitment is helping me right now to go where I want to go more effectively than I'd been managing before. So, whether it's legit medicine or just a placebo, I think I'll see how far this concept takes me.
--
Side note: is this what those "Promise Keeper" groups are all about? Is there some kool-aid I should be drinking? I wonder, but I'm not sure if I really want to know. :-)
October 24, 2001
|
1. Do the goals/ideals of feminism mesh well with the genre of science fiction? Why/why not? Does science fiction offer any special opportunities for feminist writers? Or, does it present any special difficulties?
The first part of the question assumes that there is a coherent set of goals/ideals associated with the term "feminism" -- an assumption that I think is dubious. The term is generally considered to describe "the doctrine advocating social and political rights for women equal to those of men." (This is the definition found in the Random House College Dictionary) I favor this definition.
However, Gloria Steinem and Camille Paglia, among many others, show just how divisive the moniker "feminism" can be. There are several major schools of thought pertaining to the advocacy of social/political equality for women, and they are often bitterly opposed. The legality of abortion, for example, is both fought and defended by camps claiming to defend feminist ideals. Some feminist camps deride the choice that some women make to become mothers or housewives, while other camps maintain that women do not have to pursue careers to the exclusion of family in order to become "equal".
Since (unlike my friend in grad school) I am not a student of feminist theory and am therefore not certain which aspect of feminism is being favored as the "true" school of thought, I'll simply refer to feminism as defined by Random House, above.
There is also the problem of defining science fiction. There is a very long and hard-fought disagreement among those who discuss this field as to whether a story must rely exclusively upon scientific principles in order to count as sci-fi. For example, since several of Ray Bradbury's stories in The Martian Chronicles do not *have* to occur on Mars in order to still be coherent, do they count as sci-fi? Again, I'm going to defer to the definition I find in my dictionary, rather than go into this argument here. Random House defines science fiction as "a form of fiction that draws imaginatively on scientific knowledge and speculation." I read this to include the works of Ray Bradbury and Ursula Le Guin, even though others may disagree.
The ideals of social and political equality for women clearly mesh well with the genre of science fiction. The genre encourages authors and readers to consider not only what life and human nature are like now, but what life *could* be like, given any number of opportunities, environments, or histories. It allows us to speculate on the good and bad results of living in a world where equality is supported or denied. It affords us the chance to consider "What if...?" As we imagine these different possibilities, it also allows us to imagine that they are possible, and that we might well pursue and attain them.
In general, stories in the genre tend to favor the ideal of a society in which women and men are socially and politically equal.
That said, while the genre meshes well with the ideals of feminism, it does not always conform to the ideals of feminism. Because this is a literature of speculation and free-thinking, it also includes stories that endorse or advocate views opposed to those of feminism. Given the definition that feminism is an advocacy for equality among the sexes, the fact that science fiction includes some works that do not share that point of view reveals that the goals of this genre *can* mesh well with those of feminism, but that doesn't mean they always do.
Science fiction does, however, offer many special opportunities for feminist writers. Like other genres of literature, it enables authors to tell stories that embody or challenge ideals of human relationships -- political, social, and otherwise. But, what is unique to this genre is the ability to extrapolate behaviors from settings; to distill ideals to their purest forms and tell stories that evoke much more vividly the concepts that are being presented.
While there are historical fiction stories that may display the grit and resourcefulness of a female protagonist, or mainstream novels in which equality is shown to be preferable for all concerned to inequality, science fiction can challenge our assumptions on a more basic level. For example, Ursula K. Le Guin's classic The Left Hand of Darkness takes us to a society where members are inherently equal with regard to gender because they do not express/embody gender except during mating season, and even then, they may change from one gender to another as they move from one mating season to the next. In a society where gender is not a given, we look to other cues to explain characters' behavior.
When I'd begun writing this essay, however, it seemed to me that science fiction presents a particular difficulty to the feminist author that other genres do not. There has long been a general precept in science fiction that something has to happen -- that action must take place -- in order for the story to move forward. This is not a requirement imposed by other genres, where it may suffice for a story or novel to simply describe a setting or a society without much activity on center stage.
Milan Kundera's literary novel The Unbearable Lightness of Being explores the social and sexual roles of men and women (and has a very strong female protagonist) against a backdrop of invasion and war... but, the characters never do much of anything. They talk a lot, and it is through these conversations that their gender roles are explored. But action? Forget it. I even learned recently that in the movie version, the director had to substantially cut back on the battle footage montage because it stole attention away from the non-action of the rest of the film. There could never be a science fiction equivalent to Kundera's work.
Science fiction, conventional wisdom states, requires action. This is not to say that lizards need to eat their way out of our favorite characters' bodies, or that kickboxing robots are necessary to blow up large buildings. Nonetheless, characters need to be going places and doing things.
The more I've considered this idea, however, the more I realize that it is not entirely accurate. There are counter-examples. Flowers for Algernon, one of the genre's best examples of an intensely personal exploration of the meaning of identity, is hardly action-packed. The conflict is ultimately, as it is in Unbearable Lightness, internal to the characters.
That said, I suspect it is nonetheless harder for a writer to present the "people talking" style of story within science fiction than in the more mainstream genres. Is this a "special difficulty"? Perhaps not. This tendency toward action within sci-fi has not discouraged writers from "talking" at length in their stories about the points they are trying to make (Robert A. Heinlein and Ayn Rand leap to mind).
---
Sheesh. I sure can leap into that stuffy old academic tone of voice when I want to, no?
Tune in tomorrow, when we address the second question in the series. :)
October 05, 2001
|
The following essay was originally posted here on November 5, 1998. I am reposting it now partly to explain my little outburst in yesterday's entry, and partly because I can't think of what to write tonight, and partly to justify an upcoming essay. This essay may also be of interest with the recent launching of yet another new Star Trek series....
* * *
Do you follow NFL football? There's an interesting story brewing in Minnesota and Buffalo, where two old "has beens" are turning in extremely strong performances at quarterback. Randall Cunningham of the Vikings and Doug Flutie of the Bills are two former stars who have returned from recent obscurity to just eat up the attention of football fans everywhere, leading their teams to the tops of their respective divisions.
What's that? You don't care? You haven't been following this uplifting story? You weren't aware that Doug Flutie came back to the NFL -- refusing a million dollar contract from the Canadian Football League to take a paycut of 75 percent -- simply in the hopes that by playing in the higher-profile NFL, he might raise awareness of autism? (Flutie's son is autistic) Or that Randall Cunningham credits his resurgence to a newfound faith in God?
You weren't paying attention to the fact that these guys are over 32 years old and can still play this kids' game better than the $25 million kids who are supposed to be the best?
It doesn't thrill you to follow the story of how these grown men put on brightly colored costumes and then go run into each other in the hopes of carrying an oddly-shaped ball across an arbitrarily set "goal line"?
Well. Lemme tell you something. I didn't use to follow sports, either. But, lately, I've gotten more into it... to the point where I actually not only watch the games on TV when I can, and attend a few in person -- on occasion -- but, I even read the articles in the sports pages. Not just the scores... the actual articles!
This has been a gradual change in me. But, the question has come up from time to time: why? Why do you care about what's going on in professional sports? Recently, I had a chat with a friend who posed this question yet again. Only, this time, I stumbled upon an answer.
Competitive sports are, like novels or movies or television sit-coms, a particular kind of entertainment. Like the daytime soap operas, they are serials -- each episode building upon the previous to tell a story line that spans several months, with recurring themes year after year.
Football is like Hill Street Blues -- a soap opera with more violence and less romance. Baseball is like Dick Francis novels... the story always follows the same formula, but the details of each story vary. And, let's face it, some endings are more satisfying than others.
In fact, the best comparison that I can think of is to view professional sports as a kind of live-action equivalent of the Star Trek novels, movies, or TV shows.
What?
No, really.
First, there's the formula. Each sport consists of a league of teams which are composed of characters who fill particular roles (the quarterback/pitcher/captain, the running back/designated hitter/engineer, the receiver/catcher/science officer, etc.) that play out their drama within a certain set of goals (get the ball into the endzone, run to home base, spread peace and harmony throughout the galaxy) within a certain set of rules (try to make 10 yards within four downs, try to score a run before three outs, try to seduce the alien spy before the show is over).
As in Star Trek, pro sports have a code of conduct which may or may not result in penalties... it all depends upon whether you get caught (no holding, no stealing, no interference with the development of a civilization's culture).
But, as with Star Trek, the whole is greater than the sum of its parts in professional sports. Although one episode/game contains a great deal of drama, intrigue, and even a little character development, a series of episodes/games strung together in a season tells a more sweeping story. It is precisely this sweep that grabs the interest of the sports enthusiast and the Star Trek geek. As a football season progresses, we begin to notice certain teams emerging as contenders for domination of the league... just as a season of Star Trek may begin to reveal certain races/species/alliances vying for domination of space/time/whatever.
Sometimes, such contenders may suddenly fall apart. The Denver Broncos, in the season before they won the Super Bowl, were eliminated from the playoffs by Jacksonville in the first round. The Borg, just as it's shaping up to be a real menace one season in Star Trek: Voyager, gets practically wiped out by "Species 8472".
Then again, sometimes the underdogs struggle back from near annihilation to virtual dominance. There was that year the Buffalo Bills came back from a pathetic opening of a season to just barely get into the playoffs, stage the greatest comeback in an NFL playoff game ever, and eventually even make it to the Super Bowl as a Wild Card team. Just like the Cardassians, who, after being beaten into submission, formed a surprising alliance with the Founders and ended up wiping out half of the Federation fleet.
Of course, the Bills lost that Super Bowl, and the Cardassians are having troubles of their own in the Star Trek universe.
Here's a similarity with a twist: both pro sports and Star Trek have good guys and bad guys. Heroes like Mark McGwire, Joe Montana, and Captain Kirk. Villains, as well... Charles Barkley and the Evil Romulans. However, in sports, villains are usually identified as powerful adversaries to your particular favorites. So, if you're a 49ers fan, Green Bay or the Cowboys might be your villains. In Star Trek, alas, the villains are a little more universal. We know when to boo the Klingons or the Cardassians, because we are told in no uncertain terms that, at any given time, they unequivocally represent evil.
As with Star Trek, sports' sweep extends beyond single seasons. The Klingons evolve from season to season, changing from dishonorable enemies to wary allies to brothers-in-arms. The Broncos dominate the AFC but lose every time they reach the Super Bowl... until, near the end of John Elway's career at quarterback, they finally win the Big Game. Traditions and records span the seasons; some changing, some not. Vulcans are traditionally logical. Yankees are traditionally jerks. Kirk is often referred to as a history maker in the Star Trek mythos. Likewise, Joe Montana or Babe Ruth. Remember the Curse of the Bambino!
Ah, which brings me to the real draw of sports as entertainment. Depth. The more you follow the story, the more details you discover that subtly enhance the story and give it flavor.
If you are intrigued by the story line of a Star Trek series, you can get into other series... or, the books, the movies, short stories, interactive computer games, technical manuals, collectible toys.... The Star Trek universe is rich with detail.
The same is true with sports. You can follow the careers of specific players, teams, divisions, coaches. There are stats, records, and scores to track for a game, season, career, or even the entire history of a team, league, or the sport itself.
As I mentioned earlier, the ending isn't always satisfactory. One episode/game may be poorly written/played, or have an outcome you don't like. The bad guys sometimes win. Luck sometimes has more impact on the outcome than ability. Sometimes, you cheer the good show of the good guys, you appreciate the development of a dynasty; but, sometimes, you also see cynicism win out. Florida Marlins, anyone? Star Trek V: The Final Frontier?
Like soap operas, Hardy Boys mysteries, and other forms of serial entertainment, neither Star Trek nor pro sports shows signs of ending. The story is open-ended and ongoing. This, too, may be part of the draw. Fans get upset -- very upset -- when a soap is threatened with cancellation. Days of Our Lives, anyone? Witness the fan reaction to the cancellation of the original Star Trek, or the various baseball/football strikes. Look at the current NBA lockout. Competitive sports are as much an opiate for the working class as soaps used to be for the traditional homemaker or Star Trek is for geeks. Because they endure.
And, that's what keeps us coming back.
August 26, 2001
|
This year's World Con -- the World Science Fiction Convention, which is the largest annual assemblage of professionals, semi-professionals, and fans in the industry -- is being held in Philadelphia starting on Wednesday, August 29th. Dubbed "Millennium Philcon," this year's event is described at their website, http://www.milphil.org. Each year's event tends to take on the name of the city in which it is held. Last year's World Con, for example, was held in Chicago and was therefore known as "Chicon".
Each year, attendees get an opportunity to "vote" on where the convention will be held in three years. Thus, at Chicon last year, the attendees cast their ballots for where World Con 2003 would be held (the winner was Toronto). This year, we will have a chance to decide between two final candidate cities for the site of World Con 2004: Boston, MA and Charlotte, NC.
In order to vote, you must essentially buy a supporting preliminary membership to that year's convention. In other words, you have to put your money where your mouth is. I'm not sure if the minimum you can put down is $50 or $100, or what, but I'll be finding out soon. This year, I intend to vote.
If you, dear reader, happen to be an attendee at this year's World Con, I strongly encourage you to vote for Charlotte, NC in 2004.
Reasons to vote for Charlotte instead of Boston:
* Boston's convention facilities and airport are currently a disaster because of a massive public works project called "The Big Dig". There is no realistic reason to believe that this work will be completed by August of 2004. Because of the Big Dig, traffic into and out of the airport is a nightmare; renting a car involves a hellish journey into the bowels of Revere, MA (many miles from the airport itself), which takes you further away from the convention facilities and places that much more construction and traffic between you and your World Con. The facilities themselves are not conveniently located all in one easy-to-navigate area, and have a run-down quality that I would hardly deign to call "charming".
* Charlotte does not suffer from the ills of the Big Dig, and its facilities are newer and more modern. The city is easier to navigate. The airport is easy to manage.
* The people. Simply put, the locals in Charlotte are pleasant; the locals in Boston would just as soon you go away, which they make painfully clear in every encounter.
* The traffic. Boston drivers are aggressive to the point of being homicidal. If you dare use your turn signal, they will immediately move to cut you off... even if it means missing their exit or turn. I've seen it happen. I lived there for several years, and was reintroduced to this sad fact the last time I visited the area (I made the mistake of using my turn signal, and was rather rudely reminded that I had to relearn all of my old, nasty Boston driving habits if I was to survive). Drivers in Charlotte, in my experience, are reasonable; quick, without being rude.
* The weather. The natural unpleasantness among the Boston drivers and the shopkeepers in the area is exacerbated by the brutally muggy and hot summers. If weather is a factor in your voting, let me tell you: Charlotte has no disadvantage when it's compared to Boston in the summer. Both will be hot; Boston will be unbearably muggy.
* Hospitality taxes. Okay, I have to admit something here: I don't actually know what the hospitality tax situation is in Charlotte. All I do know is that it is patently absurd in Boston. Want to rent a car? There's sales tax. Excise tax. Massport (airport usage) tax... even though the rental car companies are not actually located at the airport during the Big Dig. On my last visit to Boston, the taxes added roughly 35% to the total bill. Want to get a room at a local hotel? There, too, the taxes are simply outrageous. Again, I can't say whether this is the case in Charlotte. They certainly must have *some* taxes upon the hospitality industry. I intend to do the research. But, let me warn you, friends: the Commonwealth of Massachusetts has worked hard to earn the moniker "Taxachusetts".
Okay, I'm ripping into Boston a little bit here. Believe it or not, I love to visit Boston. It's one of my favorite cities in the country to visit. But, not in the summer; not in August. I've done too many conventions in Boston at that time of year not to know better. (Living there for several years has also informed my thinking on this subject.) Visit in the fall. Visit in the spring. But not in February. And not in August.
Charlotte is also a fun town to visit. Clean and friendly, with many interesting sites to see and new facilities to enjoy. It's a town that knows how to beat the heat... it has to. :)
So, if you're going to be at World Con this year, allow me to encourage you strongly to vote. And, if you do vote, allow me to encourage you to vote for Charlotte.
Either way, I look forward to seeing you at World Con 2004... as well as at World Con 2001!
Your humble Science Fiction correspondent,
--Allan
May 10, 2001
|
beginrant
So, people are still making snide comments in e-mails and web postings about "the stolen election" and how the Supreme Court "gave him the office". They do this apropos of nothing, discussing topics that are in no way otherwise related to politics or government. I see it repeatedly on any number of listserves I'm on and websites I track.
Now, I have to confess that our Fearless Leader is not impressing me thus far. Aside from his general ungoodspeakeness and his dubious handling of certain foreign affairs issues (the one area where his father particularly outshone the eight-year interim office holder), I'm most bothered by el Presidente's insistence upon making faith-based charity organizations into yet another government welfare baby. When some administration down the line chooses to cut this particularly dangerous cord -- and this will happen, someday -- these organizations will suffer the same withdrawal symptoms from the crack cocaine known as Federal Subsidies that so many other local- and state-based organizations have suffered when their own supply was cut. (Remember what happened when President Reagan finally pulled the plug on those ill-advised educational welfare programs in the mid-80's, anyone? Now, *that* was painful... and, totally avoidable had the crack not been handed out so gleefully by previous administrations.)
But, all that being said, the problem remains that whether y'all like the facts or not, our current President was selected by the very same system that has been in place (with a few tweaks from time to time) since the Constitution was adopted. You can bang your drums about how just one more recount might have changed the results, or how the Florida ballot unfairly penalized idiots who couldn't remember to read the bloody directions (the form, interestingly, was designed by a member of the losing political party, was approved by a bipartisan panel, and had been used, in various incarnations, for decades both in certain Florida counties and in other counties throughout the country), but the facts remain these:
1) the vote was a statistical tie
2) supporters of the losing candidate were going to be bitter about the results, regardless of who eventually "won"
3) in the end, this country determined the results of a bitterly contested and pretty much evenly-divided election through legal institutions and not through more nefarious means.
So, please, for crying out loud: Get over it!
We survived Bubba; we'll survive Dubya. Now, stop your whining.
And if it bothers you that much, get involved in your local elections later this year. The reality of the situation is that your local and state legislators have a much more dramatic impact on your daily quality of life than any yammerhead in Washington. If you don't believe me, spend some quality time in Buffalo, Boston, Seattle, and San Francisco all in one month. Same country, same Federal programs. Very different economic and cultural climates. Why? Local politics.
I know, I know. It's easier to whine about how things didn't go the way you think they shoulda down in some backwoods southern districts than it is for you to get off of your lazy butt and try to do something that might actually make a real difference in your life. Quite frankly, I was more bummed about the results of the national primaries last year than I was about the results of the general election. But I'm tired of hearing about it. It's over. Let it go. Please.
endrant
April 19, 2001
|
Hmmm. Had an epiphany about my employer today regarding the direction things are heading within my department. It's a three-part epiphany, which I will summarize forthwith:
1) My group is stratifying along functional lines rather than business sector lines. At first blush, this is obviously less efficient for each product line when it comes to attending to their specific business needs, but it has the potential of being *more* efficient from a company-wide perspective. Why? Because, if all Web Devs or Program Managers or Catalog Specialists are interchangeable, then you can shrink or grow headcount as needed.
So far, this is hardly interesting. Having a re-org in order to accommodate layoffs or massive expansion is to be expected. However, I'm coming to see -- with each new 'process' and 'workflow' -- that we are adopting the McDonald's model of reproducibility. (Sorry for all of the potential spelling errors in here, by the way. It's late, and I won't be running this through a spell checker tonight.)
Once you have functional uniformity, and each functional unit interacts within a clearly established framework, then you invite the opportunity to franchise off sets and subsets of your operations. As goes Amazon, so goes the Borders.com/Amazon.com deal, and so goes Amazon.co.uk, and so on. Work will not get done terribly quickly on a store by store basis and store-specific innovation will become practically unheard of, but company-wide initiatives and innovations will be more easily and effectively propagated.
Thus, big-picture-wise, this should be a good thing.
2) That said, current employees have come to the realization that they are, nonetheless, "training their replacements". This was the big outcry from the latest round of layoffs at my employer: the Customer Service team was sent out to build a new team working on the other coast of the country, only to return to Seattle and be handed pink slips. This was a rather surprising reward for being so loyal to their company.
Alas, alack, from an objective position, one can recognize that this is simply a business decision that will necessarily have growing pains. C'est la vie, and don't let the door hit you on the way out. Truly, there's no need to take it personally... the company owes the employee wages in return for the laborer's efforts, and no more. Loyalty -- by the company toward the employee or vice versa -- is neither required, rewarded, nor appropriate.
So, knowing this, I and my fellow employees can choose to accept the reality for what it is and stay until our run is through, or we can mosey along now while the moseying is good.
But.
3) Then there's the movie "Memento". In this movie, the story begins with the last scene and then works its way backwards. The story is told from the point of view of what writers lovingly refer to as "the Unreliable Narrator." This Unreliable Narrator suffers from a kind of brain damage that has kept him from forming new memories ever since he took a rather nasty blow to the head. The only way for him to follow a line of continuity toward his stated purpose (which, as revealed in the very first scene, is to kill the man who raped and murdered his wife) is to leave himself notes, Polaroid pictures, and other clues/reminders about what he has discovered and what he needs to do next.
From a story-telling standpoint, the technique is terribly fun to watch. But, from a story standpoint, you quickly learn an inherent problem: he who has no immediate history is apt to magnify the foibles of his immediate past.
My employer has this kind of condition. My employer, like the Unreliable Narrator of Memento, apparently is unable to make new memories. And so, it keeps covering the same ground, not realizing that it has tried certain approaches before that have led it astray from its stated goals.
Centralization along functional lines may aid in replication (the franchise formula), but it will never aid in increased efficiency among business units. Amazon.com's stated goal is *profitability*, and its stated intention is to do this with the existing business (and not by selling itself off as a franchise). To attain profitability, the company must enable its most profitable (and/or best-margin) stores to immediately react to changes in the marketplace. Thus, a decentralized model is the most likely candidate. Layoffs, which are easier in a centralized world, are not a ticket to profitability. Ever.
My employer has vacillated between the centralized and decentralized models several times. Is the problem one of ever-changing goals? I'm not so sure. More likely, I think it's a case of having no short-term memory. It conducts experiments and then forgets the results.
This is too bad, because if this is, indeed, the case, then we are looking at an Unreliable Narrator which will ultimately lead itself, inadvertently, far away from its desperately sought-after goals. It's always a shame to see any person or organization with so much potential end up totally burning itself (himself/herself) up. It's even more of a shame to be a party to the situation. I'm a passenger in a car that is running a red light, and I don't know how to affect the driver or the vehicle and thereby avert the imminent wreck.
April 04, 2001
|
We all train each other on how to behave. Every day.
Habits form at the outset of any relationship, and they tend to reinforce one another and evolve over time.
Take the customer/vendor relationship. Customers say they want good service, but when it comes to putting their money on the table, they often grant their patronage to stores with bad service because, well, it's cheaper. So, the cheaper-but-you-get-bad-service behavior is rewarded, and it gets reinforced.
When you visit sites on the web, they sometimes send instructions to your browser to open up a new window with some advertisement or other. This is very irritating. One major online retailer has also discovered that when *they* pop up a window promoting a special sale, more people end up buying.
The result is that now this online retailer pretty much *always* puts up that annoying pop-up. It won't be long before the other major online retailers do the same. We, the customers, are rewarding them for their bad behavior.
Now, I should also point out that the major online retailers track where you come into the site and at what point you leave. They do this to find out what's working and what isn't.
If, like me, you are annoyed by unsolicited advertisement windows popping open on your browser whenever you visit an online retailer, my advice for you is to simply close all of your windows related to that store and wait a few minutes before reentering. If enough people do this, then the stores will stop this behavior. I know this for a fact: I (so far, at least) still work for one.
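For the curious, the gist of that entry/exit analysis can be sketched in a few lines of code. To be clear, this is a toy of my own invention -- the session format and the "popup" event name are made up for illustration, and none of it reflects any particular retailer's actual system:

```python
# A "session" here is just the ordered list of pages a visitor hit.
# The (hypothetical) "popup" event marks the moment the promo window opened.

def abandonment_rate(sessions):
    """Fraction of pop-up sessions where the visitor left right at the pop-up."""
    shown = [s for s in sessions if "popup" in s]
    if not shown:
        return 0.0
    bailed = sum(1 for s in shown if s[-1] == "popup")
    return bailed / len(shown)

sessions = [
    ["home", "popup", "cart", "checkout"],  # annoyed, but bought anyway
    ["home", "popup"],                      # closed everything and left
    ["home", "search", "item"],             # never saw the pop-up
    ["item", "popup"],                      # left right at the pop-up
]

print(abandonment_rate(sessions))  # 2 of the 3 pop-up sessions ended there
```

If enough sessions end right at the pop-up, that number climbs, and somebody in a meeting notices. That's the lever you'd be pulling.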
March 27, 2001
|
There's no shortage of news like this throughout the country these days, but I'm amazed at this news item, nonetheless. In Buffalo, NY, the Powers That Be (read: the idiot lawmakers) have decided to try an education experiment that will be funded with federal money.
They are going to pay students $5.00 per hour to attend summer school -- specifically, students who require the summer session in order to advance from 8th to 9th grade. That's right: students who are not meeting the state minimum requirements to be admitted into high school are going to be paid to attend summer school.
What are these nitwits thinking? They are going to financially reward students for failing to meet statewide minimum standards. This is as perverse a system of educational incentives as any I've ever heard.
In school districts around the country (including the one in which I briefly taught eighth grade math), honors and "advanced" classes are being scrapped for fear that their very existence might hurt the self-esteem of those students who are not selected. Being ahead of the intelligence curve (or simply applying one's brain at all) is not being encouraged or fostered. That's already bad.
But rewarding sub-par performance? This is somehow going to improve the "outcome-based" results of public education?
I guess the theory behind the new program is that requiring students to attend summer school is not enough, and we should provide added incentives for them to attend. I, for one, am in favor of a more traditional incentive: let's *really* not let them into the high school until they have legitimately fulfilled the requirements of entry. (There are another few essays in me regarding why students are promoted without having met the minimum requirements, but those will have to wait for another day.)
There is an old -- and rather ironic -- Russian phrase that says "people will get the government that they deserve." While we may agree or disagree with this sentiment, the fact is that when the government engages in social engineering -- and any and every policy regarding the education of its citizenry or future citizenry is, by definition, a social engineering project -- the government does end up with the citizenry it deserves.
We have seen numerous examples of how, when the population is rewarded for bad behavior, the result is an increase in the undesirable results. The welfare system in New York State (and other states, as it so happens) that rewards pregnancy and punishes marriage has resulted in a disproportionate number of unwed mothers among the poor in New York State. This, in turn, has resulted in a number of societal ills: single-parent families in poverty are more likely to stay in poverty than two-parent families; children in single-parent families are more likely to be abused; children in single-parent families are more likely to engage in drug use, crime, and the like.
What, then, can we expect of a system that pays our society's children to perform poorly? What can we expect of any system that reinforces any behavior? We can expect to see an increase in that behavior over time, until it is endemic. In this case, we can expect to see a stellar increase in poor performance.
Let's not reinforce poor educational practices. Let us, instead, reward excellent performance. Let's recognize those who do well, and give children across the board unequivocal incentive to excel.
As for Buffalo; if they enact this policy as they are currently planning, the performance of its children will decline significantly in the coming years. And that is a crying shame.
March 18, 2001
|
Seems these days all I do is carp (karp?) about my job or politics. My plan today was to take on a lighter subject and write about "Quotable Underpants" (you'll see what I'm talking about when I get around to writing that essay), but a friend of mine called me twice this morning about what he saw on TV, and it brought me right back. I keep trying to get out, but they keep pulling me back in.
Seems that on this morning's "This Week with Sam Donaldson", Jeff Bezos came on and Sam grilled him about what it means to become "pro-forma profitable". My friend was incensed. "Where were these guys last year? Why didn't they hold Jeff's feet to the fire last year instead of making him Time's Man of the Year?"
My reply: "Last year, the stock price was high and Amazon was still promising to *lose* money. As long as you promise to *lose* money, it's really not important which accounting method you use."
Anyway. I'll karp (carp?) more about work in another essay. My friend went back to watching TV, and then called me again a half an hour later. "George Will was just on. He says that Bartlett's Familiar Quotations is coming out with a new edition, and it will contain only three quotes from Bill Clinton. Guess which three."
Now, this is a fun game. The first one was easy. "I did not have sexual relations with that woman, Miss Lewinsky."
"Yup. Next?"
The second one was also easy. "That depends on what your definition of 'is' is."
"You're two for two. Next?"
I must confess that I had to think about it. It took me almost five seconds. But, I finally came up with, "I didn't inhale."
My friend told me that, indeed, those were the three Clinton quotes that made it into Bartlett's. He said that George Will then went on to compare these quotes to the many Kennedy quotes that appear in the book.
After our conversation, I thought about this. What are three memorable quotes from Bush? Reagan? Carter? Ford? Nixon? Let alone Kennedy and Johnson. I also realized that, truthfully, comparing Clinton to Kennedy is a little disingenuous... even though Clinton has long maintained that he wants to be considered the modern JFK. Observe:
The three quotes that come immediately to mind for George Bush are not all that wonderful.
"Read my lips: no new taxes." A broken promise.
"A thousand points of light." A vague campaign analogy.
"Voodoo economics." A slam against Reagan's proposed economic plan when the two men opposed each other for the Republican nomination in 1980.
(My copy of Bartlett's does refer to all of these. It is a 1992 edition. Bartlett's also reminded me of one that didn't make my initial three: "I want a kinder, gentler nation.")
If we grant Bush "kinder, gentler nation" and drop one of my other three, then I guess we get a mix of good intentions, but still not terribly strong stuff.
Well, I started having fun with this. Name the first three quotes that come to mind for a recent President, and see what Bartlett's recorded.
You may want to try this before you read what I came up with (and what my 1992 edition of Bartlett's came up with). It's fun.
Reagan: I didn't have to think long at all to come up with three quotes from this man. First, there's "Mr. Gorbachev, tear down this wall." Interestingly, this doesn't appear in my copy of Bartlett's. I can only hope they add(ed) it in a later edition.
The second one that popped into my head was "I didn't leave the Democratic Party. They left me." This one also doesn't appear in my copy of Bartlett's.
My third quote from Reagan (or, rather, the third one that came to my mind) was his reference to the Soviet Union as "the Evil Empire." This one did make it into Bartlett's.
After I perused Bartlett's (there's a good one about "Government is like a big baby -- an alimentary canal with a big appetite at one end and no responsibility at the other."), I was reminded of another one that didn't make my initial list of three but should have, and which also isn't in Bartlett's but should be. It was a gaffe; Reagan was performing a microphone test prior to a radio address, and someone had recorded his joke test message and sent it to the media. It caused quite a stir.
"I am pleased to announce that we have just passed legislation outlawing Russia. The bombs will be flying in ten minutes."
So. The quotes that come immediately to mind about Reagan convey power of conviction, if nothing else. Bush's echo with unfulfilled good intentions. Clinton's are defensive nonsense designed to confuse, not to clarify.
What about Carter? I'm sorry to say that the only quote that came to mind was from an interview when he admitted to having lusted after other women in his heart. This was hardly strong stuff, but Carter was a born-again Christian, so I guess it made waves in that context. (According to Bartlett's, he said "I've committed adultery in my heart many times. This is something that God recognizes I will do -- and I have done it -- and God forgives me for it.") There are other quotes attributed to Carter in Bartlett's, but none of them sound either familiar or important.
Ford? Again, I come up short. There's only one that sticks in my mind: "Our long national nightmare is over." (This was in his first address to the nation after Nixon resigned.)
Bartlett's also includes "I'm a Ford, not a Lincoln" and a gaffe from a debate with Carter. It does not mention his "Whip Inflation Now" slogan. Okay, so that's two I came up with.
Nixon? Ha!
"I am not a crook." (in Bartlett's)
"Peace without dishonor." (not in Bartlett's -- I'm thinking that he said something along these lines with regard to pulling out of Vietnam)
"You won't have Nixon to kick around anymore..." (this one is in Bartlett's)
Nixon also coined the phrase "silent majority", which is a great term. I'd forgotten that was him. But, I *did* remember the famous Checkers speech, in which he successfully deflected accusations of an illicit slush fund by saying that the only potentially inappropriate contribution he'd received was a puppy named Checkers, and by golly, he and his family were going to keep that puppy.
I'm going to skip to Kennedy now. Each of the above mentioned Presidents only has a few quotations listed in Bartlett's. Kennedy has a couple dozen. I don't necessarily recognize each of these allegedly familiar quotations, and I don't think the man was any more quotable than Reagan, but I'll let that go for the moment. Kennedy certainly resonated for a generation in a manner that no President has since.
Here's my top three for Kennedy (all of which appear in Bartlett's):
"Ich bin ein Berliner." (Bartlett's points out, correctly, that this translates literally to "I am a jelly donut." But, it also notes, correctly, that the Germans understood the point he was trying to make... even if it did raise a few chuckles at the same time.)
"Ask not what your country can do for you. Ask what you can do for your country." (Bartlett's also notes that this sentiment appears in speeches by three other prominent statesmen: Oliver Wendell Holmes in 1884, LeBaron Russell Briggs in 1904, and Warren G. Harding in 1916. Bartlett's further notes that Kennedy had been dwelling upon this idea for some time; a quote from Rousseau appears in his early private papers that expresses the same sentiment.)
"I believe this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to earth." (I needed Bartlett's help in getting that one exactly right, but I've always linked this famous sentiment to Kennedy.)
Kennedy's familiar quotations are about goals; about getting off our collective butts and accomplishing something. Even if you disagree with his statist positions ("ask not what your country can do for you..." at first sounds like a repudiation of the welfare state, but then "but what you can do for your country" keeps the state firmly at the center of individuals' lives...), there is a motivational and unambiguous quality to Kennedy's familiar quotes. In this regard, I think that he and Reagan are particularly similar. Reagan vocally advocated a space-based defense initiative; he proclaimed that the United States would never yield to terrorism; he stood up to the "evil empire" and then boldly negotiated nuclear arms reductions with the Soviet Union.
Most who admire one of these Presidents tend to find many faults with the other, but I think the case can be made that both were men of action who spoke of goals and of attaining those goals. Ford and Bush also spoke of goals, but were vague about how to attain them. Ultimately, they proved to be ineffective.
And, Clinton? If you look at his familiar quotations, he comes across as most similar to Nixon -- a man who also would have been impeached, had he not stepped down. Their most familiar quotes center upon the self: "I am not a crook" and "I didn't inhale." Their most famous speeches concern defending themselves against accusations of impropriety.
Both men were obviously smart. Both men were obviously quite capable. But, both men also were blind to their own fallibilities, and they blamed the media and the public for the problems they brought upon themselves.
Clinton expressed many brilliant thoughts; he also expressed many terrible ideas. This is true of any man to hold the office of President. Nonetheless, when we look at *familiar quotations* of these men, we come to the inevitable conclusion that Kennedy (involuntarily) and Reagan left the office in such a way so as to allow us to remember the bright and powerful things they said. Clinton, like Nixon, managed to leave the office in such a way so as to only remind us of his terrible foibles and his wasted potential.
March 08, 2001
|
In the aftermath of the recent shooting at Santana High School in Santee, CA, four students have been prohibited from returning to school.
The four students were friends or acquaintances of the alleged shooter and had not taken him seriously when he boasted that he would take a gun to school to shoot kids who had been taunting him. Because the alleged shooter was known to be a bit of a joker, these acquaintances apparently assumed this, too, was a joke, and didn't warn anybody.
These four students were initially barred from the school because "the investigation is still on-going." Later news reports say that they're barred from the school "for their own protection." In a recent town meeting, residents said they blamed these individuals for what happened, because they should have told somebody.
So, I would just like to set the record straight, here. We all make judgment calls on a daily basis; we all do the best we can. These four kids, recognizing a pattern of behavior, assumed that what they saw fit into the pattern they had come to know.
But, when it comes to assessing blame, we get back to the same problem as the Columbine shooting and so many others like it. Don't blame the neighbors. Their friends. The music the shooters listened to. The books they read. Their parents. The movies they watched. The video games they played. Images in the media. The bullies who taunted them. The girls (or boys) who turned them down for dates. The internet. The bomb-making materials. The pistols.
Accountability starts at home. It starts with the person who pulled the trigger.
Everyone who has ever been to high school -- anyone who has ever had a pulse -- has had to deal with bullshit. Has been taunted or teased or laughed at or disagreed with. Has had bad days. Has had things stolen. Has had problems with parents. Has been denied something. Has been surrounded by idiots with a different world view.
Shooting your fellow classmates (or co-workers, as in several other recent incidents) is not a legitimate form of expression. Accountability starts with the perp, first and foremost. If you must assess blame, blame the shooter.
So, I guess I should get mad here at ol' Rev. Jesse Jackson. He got caught with his hand in the cookie jar again; this time, for paying his mistress $120,000 as an employee of one of his non-profits and failing to declare her on the tax disclosure forms.
Whatever. Whether this "oversight" was intentional or unintentional is generally irrelevant... until Jesse spouts out with quotes like this:
"There is no evidence that there is any inconsistency or impropriety."
This kind of nonsense just pisses me off. The Rev. is not asserting innocence, but is claiming virtue by way of an alleged *lack* of evidence to the contrary. Not "I didn't do it," but "You can't prove I did it."
This is not a new tactic; Jesse did not invent the "There is no evidence, so there must be no crime" shtick. Everyone knows that Al Gore invented that (shortly after he invented the Internet).
I say that tongue-in-cheek, but let's acknowledge that when Al was caught up in the Buddhist temple fundraising scandal, he didn't protest that the contributions were proper. He said there was "no controlling legal authority" that applied in such a case. Ergo, no crime was committed, technically speaking.
President Clinton, likewise, used technicalities to obfuscate meaning when he claimed, "I did not have sexual relations with that woman." When challenged later, he argued the definition of every word, even going so far as to say, "That depends upon what your definition of 'is' is."
Now, I realize I'm going off on a rant here, and it's taking me toward a generally recurring theme that you've seen on these pages before: the use of language to communicate meaning versus the use of language to obfuscate meaning.
This generation did not invent the use of language to confuse. Neither liberals nor conservatives; Democrats nor Republicans nor Socialists nor Communists; politicians nor citizens nor corporations nor academics nor lawyers -- none of these can lay claim to inventing or cornering the use of language to confuse. (Well, Al Gore can claim he invented it, but he'd be exaggerating.)
And, quite frankly, I don't think it's getting worse. Or better. But, it nonetheless rankles me. Just like crime rankles people in Detroit who nonetheless refuse to move.
But, just as the folks in Detroit may have recourse, of sorts, to try to at least curb the problem of crime (even if they can't eliminate it), there must certainly be *some* recourse to curb this doublespeak that is so steeped into our culture.
The first step, I believe, is to call bullshit where bullshit needs to be called. I am only one man; but, I can at least refuse the bullshit on a microsocietal level. So, here's my tiny public message to the Rev. Jesse Jackson:
"If the glove don't fit, I don't give a shit. Pay your taxes and shut up."
March 02, 2001
|
Seattle experienced its first (and, likely, *only*) major snow of the season a week ago. People dialed into work from home on their computers, sending out e-mails and cancelling meetings. "Can't make it, too much snow, blah, blah, blah."
I *did* make it into work, and had a jolly good time poking fun at my colleagues. "Why, back where I come from, we wouldn't even close the public pools for this measly few inches of snow!"
Now the California transplants are making similar jokes about Wednesday's big event. "You call *this* an earthquake?" :-)
I actually found the recent earthquake in Seattle rather exhilarating. It came without warning, gave us all a helluva good ride for about fifteen seconds or so, and then left us to our own devices. When is the last time a tornado, hurricane, blizzard, flood, or other nasty weather-related imposition dropped in for a visit and then left so quickly? While I'm not a big fan of natural disasters, I have to say the weather-related ones have a much nastier tendency to hang around. Mr. Earthquake said "boo" and then left. It was shocking, thrilling, scary, and adrenalizing.
By and large, I think the folks of Seattle and the surrounding areas handled it all rather well, and it's even cooler to realize that, with all of the potential for calamity (it was, to be fair, a big 'un), there were few serious injuries and no directly related deaths. The same cannot be said for the Mardi Gras festivities in Seattle the night before, which had similarly resulted in a lot of property damage (on a smaller scale, to be sure) but, sadly, also cost many folks some time in the hospital and even one fellow his life.
Since many of you have dropped a line to ask how things are going or how they went, here's my Seattle earthquake experience in brief: I was in a meeting on the 7th floor of one of the new downtown office buildings when it hit. By a freak coincidence, my group had recently been the beneficiaries of some emergency-related training, and the whole situation unfolded for me in a surreal state of "No problem. Everything's under control." I heard one of the big metal beams start to twist, and my first thought was that the construction that had been going on in our area was getting out of hand again. (They are building a new stadium across the street, and their work often shakes our building.) A pause, and then another squealing sound from the building, and I began to think those construction workers were trying to break into our room. A rather funny thought, since the construction work was going on across the street, but that's pretty much how things played out. Someone said, "Is this an earthquake?"
Yours Truly, in "everything's under control" mode, told everyone to get under the table and grab onto the legs. (That's to keep your cover from getting away from you, don't you know.) With four of us in the room, and with the quick thinking on my part and the quick acting on their part, this meant that there was really no room under the table for me. :)
So, the building shook and rocked like a cruise ship that had just hit hard seas (been there, done that) and after a particularly nasty lurch, I suddenly felt the adrenaline hit. Wow. Then, the building began to settle into more routine shaking and rocking before it finally calmed down.
I had "sea legs" for the next hour or so.
Lots of rooms sustained lots of damage (bookshelves and monitors tipping, falling, breaking, bursting, etc.), but in the end, it was mostly superficial. There were the occasional "safety czars" giving us conflicting directions ("Get out of the building now!!!" "No! Stay in the building! It's unsafe out there with the transit tunnels!" Etc., etc.).
Everyone went to their cell phones. None of them worked because the circuits were overloaded instantly. I went to my office (after walking all the way down the stairs, and then walking all the way back up, following the various instructions I'd been given) and used the land line. Got in touch with Paulette. She was okay. Then, I made plans for getting over the lake to check on our house.
QED. End of story. No structural damage to our house that we can see, and not much in the way of disarray with the contents. A few picture frames askew, but that was about it. In fact, the class at the University was still on for that evening. Far out.
The corporate headquarters for my employer is closed for a couple of days while they repair *flood* damage caused by bursting sprinkler systems. My own building escaped that fate, so it was back to work and back to business as usual today. Just like that.
The quake did a lot of damage. Our building, like many others downtown, is still structurally sound, but it will nonetheless require a lot of repairs. Any good conspiracy theorist will tell you that this was all a plot arranged by the unions to make sure that there will be good jobs for construction workers even in the midst of the dot com bust that is leading to a decrease in demand for new buildings and houses. Thus, the local economy will continue to do well, taking money out of the insurance pools that it has been funding all these many years, and life will go on.
Unlike many of my peers here, I did not find this event to be life-changing. It was interesting; an experience worth having, certainly, and I highly recommend it as long as you can arrange to live through it unscathed, as most of us did on this particular occasion. It's pretty wild when terra firma becomes terra jello. Nonetheless it was, after all is said and done, just another interesting day in the already topsy-turvy world in which we live.
February 27, 2001
|
When Everett and I were at grad school together, we often tossed about the idea of working on a paper comparing the parallel evolution of American Science Fiction movies and the prevailing political attitudes of the day.
The argument was pretty obvious, but we hadn't seen anybody address it in the academic press, and we thought it might be fun. Here's the obvious:
Fear of nuclear bomb testing was obvious in such cheesy grade-B movies as Them!, Godzilla, Attack of the 50 Foot Woman, and so on.
Worried about communist perversion of the American ideal? There were scores of invasion flicks that highlighted that theme, but the best by far had to be Invasion of the Body Snatchers.
For fear of nuclear war, look no further than the parable in The Day the Earth Stood Still or the more literal Fail Safe and Kubrick's Dr. Strangelove.
We began to feel a little bit more optimistic about the power of our ingenuity in 2001: A Space Odyssey, as well as the Star Wars and Star Trek sagas that came a decade later.
Concurrently in the 70's and 80's, the popular sci-fi movies presented growing concerns about technology getting us in over our heads in Alien, Logan's Run, and Mad Max -- and, later, Terminator and its many rip-offs.
My thesis stopped there; this was, after all, 1991 at the time I contemplated writing this scholarly work.
I've been reminded of this little idea, though, as I've been preparing to host a get together of some friends to watch a movie. This group gets together on a monthly basis with the members taking turns hosting. The host can assign homework that pertains to the movie that the host intends to show.
I decided, for various reasons (mostly pertaining to the fact that certain members of the group are big into conspiracy theories), to show The Parallax View. I assigned as homework for the members of the group to watch either The Conversation or Three Days of the Condor.
These three movies came out in 1974 and 1975, and each is about conspiracies and the use of very plausible, very real technology in carrying out those conspiracies. Having now seen all three quite recently, I have to confess that I don't think Parallax holds up as well as I remembered. It feels a little dated, and the conspiracy is simply too far-fetched... but, then, that's quite possibly the point. Alas, all three films have their flaws. In the end, though, I think Conversation holds up the best. Francis Ford Coppola is expert at making every scene count.
The fact that all three films came out at the same time is no coincidence. The assassinations of JFK, King, and RFK had started to take their toll on the American psyche, and the revelations of Watergate fueled a national mood of distrust -- both of the government and of technology.
This distrust was echoed again and again in the mid-70's, in mainstream films like All the President's Men as well as in the science fiction of the day. Aside from Logan's Run and others, there was the remake of Invasion of the Body Snatchers. This one is particularly telling. In the original, 1956 version, the G-men save the day at the very last minute. In the 1978 version, the government has already been co-opted. Authority cannot be trusted. In the end, no one can save us.
Getting back to my three conspiracy movies of 1974 and '75: it's been fun for the past week to watch these movies and pick apart their similarities and their differences. But, in the interim, I happened to catch up on a movie I've been meaning to see for some time: The 13th Floor.
Interestingly, this movie came out at around the same time as three other movies with the exact same theme. If The Conversation, Three Days of the Condor, and The Parallax View are all representative of a culture that is increasingly paranoid about conspiracies, what should one make of the period of 1998 and 1999 producing four movies that focus on the idea that our reality is merely a construct by some outside power?
I maintain that The Truman Show, The Matrix, eXistenZ (David Cronenberg's entry), and The 13th Floor are representative of a new undercurrent in American political thought. As a nation, we are in the midst of an incredible identity crisis, completely uncertain about what is real -- what is true. In Truman and Matrix, the message seems to be that we are at least partly culpable for our part in confusing reality with make-believe... willingly participating in, if not actively encouraging, the deception.
Do these movies resonate with the public because they ultimately forgive the pop culture for its lack of moral conviction? I'm inclined to think not. Rather, I'm inclined to believe that these movies have tapped into a growing ennui that must, eventually, lead to an awakening. We laugh at the conceit of The Truman Show even though we know the joke is on us. But as the nation contemplates, in its own politicorganic way, the nature of reality, I have a sneaking suspicion that the wake-up call is not too far behind.
February 19, 2001
|
Maybe, as posited by the ubergovernment in George Orwell's 1984, changing how people speak really does change how they think and, in turn, changes the reality in which we live.
For example, we see doublespeak like this in the financial papers:
"According to First Call/Thomson Financial's research analyst Ken Perkins, of the 137 retailers monitored by First Call the sector overall is expected to show negative growth of about 5.4 percent year-over-year, which is down slightly from the 6.5 percent recorded in the third quarter."
Retail sales are expected to show negative growth? Negative growth? Hello? There used to be a term in economics that described "negative growth": recession.
Can you say "recession" boys and girls? I thought so.
While my employer has been right-sizing to optimize for our negative growth scenario -- which is double-plus ungood, if you happen to be on the unright side of the right-sizing -- I've become increasingly sensitive (a good, healthy American word if ever there was one) to the manipulations of meaning being broadcast by our decision makers.
I would say that my employers are, in fact, lying to my face, but I'm being constantly reminded by my peers that this is an unright way to look at it. They are not lying to us. They are not even telling us "untruths". They are simply assuaging the negative growth in our expectations with non-truths because that is completely appropriate in an environment such as this.
Language, in theory, is a tool for communicating meaning. Lately, however, it is increasingly being used as a tool for obfuscating meaning. From the former President ("That depends upon what your definition of 'is' is," and, more recently, "[sure she gave me lots of money, and sure I pardoned her husband, but] there was absolutely no quid pro quo."*) to the captains of industry who tell us "We all need to be in this for the long term" while they take $26 million out of the company as the stock price continues to plummet.
My favorite nontruth was recently uttered by a Vice President (my employer now has an organization that goes three Vice Presidents deep. Three! There are three VPs between me and the President of the company. How can we possibly need that many VPs?) when a fellow employee asked point blank, "Are there plans for any more layoffs this year?" and the VP said with a straight face, "No, there are no plans for any more layoffs this year."
Meanwhile, I'm being told to figure out how to manage my team with at least one fewer person on my staff by this summer. (BTW, in corporatespeak, people are not people. They are "headcount". In national security terms, layoff casualties are "collateral damage." Thus, I am not actually losing people... I'm decreasing headcount.)
My staff now has a better bead on the truth here than I do, because the rumors they hear are often more accurate than the official line I'm told by those higher up the food chain than I am. I think this is partly because the folks on the front lines don't bullshit each other the way upper management bullshits their staff.
Did I say bullshit? I meant to say "lie through their teeth."
Telling the truth doesn't make reality any more palatable, but it *does* make it more likely that you'll be able to negotiate reality's treacherous waters successfully. But, neither our news media nor our captains of industry seem to think we can handle the truth.
---
*note: the second quote above [with my paraphraseology in brackets] is attributed to Clinton by ABCNews' account of the incident in this online article. ABCNews claims to quote the former President's statement in an Op-Ed piece which appeared in the New York Times, but I have not seen the original article.
February 09, 2001
|
Lately, I've taken to writing the beginnings of these magnum opus essays on this site, which I have then never gotten around to finishing. I finally got called on it.
A long and thoughtful e-mail took me to task for the part of an argument I'd left unfinished. And so, allow me to continue my thoughts about comedy and context. I offer no promises that this completes my thoughts on the subject, but at least I can get into it more now that I know where the dialog is heading.
The reader's e-mail begins: "You seem to imply that 'The Homecoming Queen's Got a Gun' was only funny pre-Littleton."
My essay does imply this, but the implication comes from an omission on my part. Rather, the events at Littleton changed the context in which I (and others, I'm sure) receive the song, and *that* changes the nature of the humor with which it is received.
Pre-Littleton, the song is funny because it is an absurdist fantasy. High school punishes all who enter its doors -- students and faculty alike. But to the typical student, the Homecoming Queen (or Prom Queen, or Captain of the Cheerleading Squad, or whatever) appears to be the one little darling least affected. This song's humor lies in the fact that it tweaks our recognition both of the frustration that leads to such a seemingly unlikely event, and the casting-against-type of the actual perpetrator. We recognize and empathize with both the antagonist and the protagonists in the song. It's ludicrous. Impossible to imagine... and yet, it's perversely satisfying at the same time. A Homecoming Queen raining destruction upon the previously celebratory event.
Post-Littleton, the scenario is not so absurd; not so foreign to the imagination. I agree with the reader that any reasonably intelligent person would have deduced when this song was first released in the '80's that a Littleton-style event was not only possible, but even *probable*, eventually. But, it was nonetheless outside the realm of our actual experience. The schoolyard shootings leading up to, including, and following Littleton banished that little false sense of "it can't happen here."
And, so, anyone who is familiar with the school shootings (and related events) that have taken place in the '90's receives "Homecoming Queen's Got a Gun" with a different context: the situation itself is no longer absurd; only the particular angel of vengeance.
(I will remind the audience that back in the '80's, high schoolers who felt particularly frustrated with their situations tended to commit suicide rather than homicide. That, or they played Dungeons and Dragons. I'm not sure which was worse...)
I was picking apart the structure of the song to myself as I sat at the concert hall listening to it, and it really is an excellently constructed bit of humor. I won't bore you with my analysis (I'll bore you with my rant about context instead), but I agree with the reader's e-mail that the song is still funny. *However*, because the context has changed, so has the nature of the joke.
The reader goes on to state (and, I think this is the heart of the matter):
"All this being said, I probably wouldn't have bothered to write except I think the idea that context is everything is rather offensive if not mildly dangerous.
"I remember years ago I was telling you about an episode I liked of 'Homicide: Life on the Street.' I actually agree with you about what you found offensive, but I still liked the writing and presentation. Anyway, the plot revolved around some clean cut kid who committed a murder. He got his hands on a gun, and once he held it he felt it had power over him and he had to shoot someone. That's really simplifying but it's the basic idea. You were very right in that it played to the anti-gun lobby's contention that it's guns that are bad, and the shooters aren't responsible.
"In a sense I see the same sort of danger in ideas like 'songs about molesters are only funny until you know someone who has been molested.' This implies an inability to reason from the abstract to the specific. It also gives credence to the idea that only those who have suffered from a gun crime should be allowed to have an opinion on gun laws. Or, to speak to another of your recent essays, the idea that only those who have suffered from racism should be allowed to have an opinion on affirmative action or other laws."
[snip]
While I see the point, I believe there are two distinct issues here. The songs "Kinko the Clown" and "Homecoming Queen's Got a Gun" remain the same as they ever were, before and after the potential listener becomes involved in an outrageous event such as the ones that serve as the setting for these songs. The outrageous event in the song is absurd. The outrageous event in real life is tragic. (The same can be said for Olivia Newton-John's "Physical," I suppose.)
But, the listener may well interpret the songs differently after having actually experienced an event such as those depicted in these songs.
Our tastes in humor necessarily change over time, and I contend that this is largely because of our expanding library of context. Many people I know find the old Warner Brothers cartoons much funnier once they're adults than they did when they were children, because they had more context in which to fit more of the jokes. Alas, just as context can enhance the meaning of a joke, it can also sometimes detract from a joke's effectiveness.
I, for one, have outgrown scatological humor, but I've found an increasing love of puns. Go figure.
But there's a different, underlying issue that the reader points to, and it is one of politics, not aesthetics. Here, we come back to my original title, "Censorship and Context".
We may agree or disagree as to whether it is appropriate to play a song for a wide public audience that attempts to be funny against a backdrop of violence (or some other potentially tragic setting). As I stated in my last essay, I agree with Dr. Demento's decision not to play "Homecoming Queen's Got a Gun" on his radio show, given the events at Littleton. And if I were still hosting a radio show of my own, I would make the same decision.
I neglected to say in my previous essay, however, that I nonetheless believe that this is and should be a matter of taste -- to be exercised by the host (or performer), and not to be imposed by the government-appointed arbiters of the airwaves.
Dr. Demento willingly refrains from playing "Homecoming Queen", although I suspect he looks forward to the chance to play it again on the radio one day. No doubt, his decision is as much motivated by business concerns as it is by any sensitivity on his part. Nevertheless, I would find it particularly offensive to have the government dictate his playlist by banning this song... just as I am offended that the government does see fit to dictate that certain other songs are stricken from the airwaves.
One of the many ironies here is that Dr. D can play a funny song about an absurd school shooting, but chooses not to, while he is prohibited from playing a lovely little ditty called "Sit on My Face (and Tell Me That You Love Me)" -- set against a pleasant backdrop of mutually consensual gratification -- but you can be certain that he'd play it if he were allowed.
How long will it be before the FCC finally regulates the thoughts we choose to express on the Internet (either on the web or via e-mail)? I shudder at the idea.
February 06, 2001
|
As many of you know, I used to host a radio comedy show called A Night at the Asylum at WVBR-FM in Ithaca, NY. The show was largely inspired by Dr. Demento, only we focused more on comedy and less on novelty records.
Recently, one of my fellow former producers of said comedy show discovered that someone she knew was wanted by the police for child molestation. The culprit was caught, and as the facts about his predatory practices were revealed, it became clear that this very sick individual had messed up a great many people's lives... including friends who were very near and dear to her.
As we discussed this traumatic chain of events, my fellow former comedy show producers and I came around to the question of a routine we used to play on the show: Kinko the Clown, by Ogden Edsl. None of us could remember ever really liking this particular song, and we all wondered why we'd ever played it. It didn't have any particularly funny lines, and its treatment of a nasty subject is rather insensitive.
But... I've been thinking about this more and more lately. I think that, in fact, we *did* find it funny at the time; we've simply forgotten why. Our context has changed.
The reason I believe this to be the case is because I happened to see Dr. Demento in a live performance this weekend. Focusing on "things [he] can't play on the radio", the syndicated radio show host played songs and videos of a number of bits that don't (currently) pass FCC muster. Some of these items would never, ever make it, but were very funny (including an extremely rude Mick Jagger tune that he recorded with the *intent* of being so bad that the record company would never release it, simply to fulfill a contract that he wanted out of). Others used to be playable on the radio, but have since elicited fines from the FCC. This collection surprised me, in particular, because it included a number of routines we used to play all the time: Monty Python's "Sit on my Face", for example.
Then, the good doctor showed us a music video and prefaced it by saying, "This song used to be one of the most requested on the Dr. Demento show, but I haven't played it in a couple of years, given the aftermath of the shootings at Columbine High School in Colorado." The video was for Julie Brown's "The Homecoming Queen's Got a Gun."
Wow. I was stunned. This song was frequently featured on our show. And, as the video unfolded, it was so patently clear why playing it now would be so beyond the bounds of acceptable taste. Given the events that transpired in Littleton, there was no way to interpret this song as anything other than a sick and depraved acting-out.
But, the thing is... this was recorded *years* before Littleton, and it was mocking high school homecoming pageantry; it was not advocating violence. The song and video were so clearly cartoonish; the humor so obviously a coy swipe at high school's culture of popularity. Yet, in the context of a post-Littleton world, it is both mean and savage; an indictment of a culture of violence.
Watching this video on Saturday, I completely agreed with Dr. D: even if the FCC had no reason to fine you for playing it, this was one routine worth dropping from the playlist. And, yet...
And yet the fact is that, in its day, this piece was actually quite funny. It still is, in its own juvey way, if you can overlook Littleton.
But Littleton did happen.
And there really are maniacs who go around molesting little children.
And context is everything.
January 18, 2001
|
So, back when I was self-employed or worked for small companies, I would often be confronted by economic choices. For example, if I or someone on my team wanted or needed a new piece of equipment -- let us say, hypothetically, a new monitor -- the decision to purchase would often boil down to the business case.
For example, I might ask "How many more widgets must I/we sell to offset the cost of this monitor?" There's also the quintessential "What would it cost me if I *don't* purchase this item?" Even though the second question is more important, the first question always helped to put things into perspective and create incentive. Usually, it would cause me or the member of my team to think in terms of "What can I do today that will help to drive up sales by X widgets?"
But.
What if you work for a company that loses money on every sale? What if you work for a dot com? THEN what do you do? It's like being in a bizarro world. Selling more means... losing more. So, if you want to clear the cost of a piece of equipment, do you try to sell more? Or, do you try to sell less?
Are you better off encouraging your friends to shop with your employer when you know that every dollar they spend brings your employer closer to bankruptcy? I don't get it. I just don't get it.
I think I'm beginning to understand why my essays are getting dumber and dumber. It's because *I'M* getting dumber. Spending time in the land of dot coms is hurting my brain. Decision making here has absolutely no basis in reality. This must be what it's like to work for the government.
January 05, 2001
|
So, I understand that I risk looking foolish by exposing my ignorance and my only half-formed ideas on the subject, but I nonetheless need to explore this issue. I do this like I explore any issue -- by throwing it out there and seeing how it looks, then rearranging as appropriate. It's just the way I'm built; some internalize. I gotta get it out there. Better to risk looking foolish now than to not examine the issue and risk *doing* something stupid later.
I think.
There's also the painful reality that my many friends and family who happen to understand The Race Thing first-hand will be uncomfortable seeing me make a fool of myself like this. Let's face it: this is embarrassing. I'm making an academic exercise about one of the most emotion-laden issues around. By exposing my ignorance to my dearest of friends and family who happen to have a different background than mine, well... I hope you'll understand that *I'm* just trying to understand. And, I'm starting this out by trying to understand just how much I *don't* understand.
First of all, let me state something that can only sound obnoxious, but I believe it to be true, nonetheless. When I meet people, *yes* I notice their appearance (including their skin color, et al), but I honestly believe that I don't *assess* them on the basis of their physical traits. In this day and age, that can only sound like bullshit (and like self-serving denial), but let me try to explain.
Let's put this in the grossest of terms, because I think you *will* understand. When a heterosexual man encounters a woman he has never met before, he will react to a number of attributes he encounters. He may, for example, find himself physically attracted to her bust, her butt, her face, her neck, her hands, her hair. He may go crazy (or not) with lust over her voice. Her eyes. And yet, another man might not even give a second thought to these very same attributes. So, Mr. Smith meets Jane Doe and immediately notices her full, shapely breasts. Mr. Smith is a breast man. He can't stop thinking about the large and inviting bust-line of Ms. Doe. Mr. Jones walks up and meets Mr. Smith and Ms. Doe. Mr. Jones is not a breast man. He likes butts. Breasts don't really do it for him (even though Ms. Doe believes that every man she meets is only interested in her breasts), and Ms. Doe's bust in particular is of no interest to him. Since her figure otherwise has nothing terribly attractive to him (her butt being somewhat not his type), he does not end up focusing on her as a sexual being. She's just another woman he is meeting. That's all.
This isn't an essay about sexual attraction (now that I've alienated another large segment of the audience), it's about perception. Ms. Doe, because she has been physically endowed with a chest that gets an awful lot of attention, can't quite grasp the idea that NOT ALL MEN ARE INTERESTED IN HER OVERSIZED BREASTS. And, yet, some men simply don't care. Doesn't faze them at all. And, note, I'm still talking about heterosexual men, in this metaphor. In this example, Mr. Jones can talk with Ms. Doe and not have a single thought about sex. At least, he's not thinking about sex with her.
Like anybody else, I make assessments of the people I meet based upon any number of attributes. And, I *do* notice skin color, shape of face, voice, eyes, language, weight, all that stuff. But, for whatever reason, I'm just not interested in most of that stuff. Grooming habits probably register more deeply in me than skin color. Eyes matter a lot. They reveal a lot. A disheveled, shifty-eyed white guy will always worry me more than any black man in a suit. (Except for Don King, ha, ha.)
Now, I could go on for another thirty paragraphs about why I think this might be the case for me (parental upbringing; unique experiences in my high school, mental defects, whatever), but this isn't about "look how non-racist I am." Rather, it's a starting point for understanding just why it is I *don't* understand.
I want to tell you about my Uncle Philip. Phil is great. He's only five years older than me, and we grew up in close proximity for many years of my youth. He sharpened my chess game, let me use his computer (remember the TI 99/4?), tried to explain the Theory of Relativity to me. He's a great guy.
Now, he *also* happens to have been born with cerebral palsy. So, this affects his speech and his motor control. Talking with him is difficult, at first, until you get used to his speech. Anyway, Phil came to visit me my senior year at Cornell, and we went to a Cornell hockey game. I drove us to the parking lot by Lynah Rink and started looking for a place to park.
"Allan. What are you doing?"
"I'm looking for a place to park."
"Park there!"
"But, that's a handicapped spot!"
I am such an idiot.
In my mind, I'm not thinking "Ooh, I have a handicapped person in my car." It's just my Uncle Phil. One of the most brilliant minds I know.
This is an example of how patently stupid I can be when it comes to keeping in mind very obvious physical realities of someone else's existence.
A similar incident occurred more recently, when I had the pleasure of joining my friend Harry from Cornell at a little soiree at his house. Harry was News Director when I first began working in the News Department at WVBR, and he was one of my first and most enduring mentors there. Harry went on to become a reporter for the NPR station in San Francisco. Very cool dude with a very sharp intellect.
Anyway, many cities later, he and I both live in the same town again, and he invited me over to his place. As it turns out, I was one of only two white people at the event. Everyone else there was Asian-American. I didn't even notice it at the time. But, we all ended up settling into this excellent discussion about the radio business and the software industry, and the subject of discrimination came up. I was surprised, at first, when they started talking about how NPR doesn't have any minorities in its upper ranks, etc., etc. What surprised me wasn't the facts that they brought up; what surprised me was... they were talking about themselves. I.e., this topic was immediately relevant to *them*. And, I'm thinking to myself, "But, Harry, what are you talking about? You're not a minority. You're Harry!" (Unlike my conversation above with Philip, I actually didn't say this out loud.)
So, you see, I'm an idiot. (Sorry, Harry. Sorry, Philip. I hope you both can forgive me. I can only hope I have other redeeming qualities.) And, this particular kind of idiocy has led me to completely not get The Race Thing. The concept of "minorities in executive positions" has always been an academic subject for me. But, it *isn't* an academic subject. Real people are facing real glass ceilings on the basis of physical attributes that have nothing to do with their abilities.
More to the point, *some* people's *entire lives* are shaped by the fact of their ethnic background. And, this fact is leading to huge injustices on both sides of the racial divide.
Tune in for the next installment, wherein Allan the Idiot brings up O. J. Simpson and the criminal justice system over lunch with some co-workers of multiple ethnic backgrounds, and watch as the fun ensues.
January 03, 2001
|
For a few weeks now, I've been intending to write an essay here called "The Race Thing". The upshot is this: I don't get it. I don't get the race thing. I don't understand racism and I have no tolerance for racism. At the same time, I haven't been exposed to the kinds of racism that many of my friends experience on a daily basis. I don't know to what extent racism pervades our society today; I've simply never seen it in the computer industry and I haven't been terribly active in those sectors of the population where it allegedly prevails.
Don't get me wrong; I *know* that it exists. A relative of mine who is a cop makes that obvious in the stories he tells. And, sadly, I do know several people who have expressed unflattering opinions about people based upon their skin color. I can only chalk this up to ignorance and frustration, and I've seen it happen with people of all different ethnic backgrounds.
Just because I haven't seen it in the computer industry doesn't mean it doesn't happen, of course. But, nonetheless, because I'm not reminded of *my* skin color every day, I guess that can make it difficult for me to imagine that *some* folks *are*. When I get into a conversation on the topic, I am therefore constrained to intellectual observations rather than any real first hand data.
(perhaps when I get around to writing this essay, I'll mention my experiences as a "minority" at Bennett High School, but I'm getting ahead of myself...)
Nonetheless, I find the recent news that a major employer in the computer industry is being sued for discrimination to be particularly hard to fathom.
The lawsuit alleges that the company in question maintains a "plantation mentality" when it comes to its African-American employees. When I read this, my first thought was: "Well, Duh, assholes! They have a plantation mentality toward ALL their employees!" I have known people to have to seek psychiatric help over their working situation with this particular employer. The suicide rate seemed rather fantastic while I was there: pretty much every other week, the corporate newsletter mentioned the passing of some co-worker from some undefined cause.
This wasn't a race thing. This was an everything thing. You either "drank the kool-aid" or you were an outsider. If you allowed the borg to assimilate you, then congratulations, you were eligible for promotion... and, you could do well. But, if you clung to a life that was outside of the corporate culture, you surely would not succeed there. I have many brilliant friends of all ethnic backgrounds who are doing well there; but, their lifestyle choices are more amenable to that style of working situation. The plantation life ain't so bad, I guess, if you like that kind of work.
It was clear that if you kissed The Man's ass, you got promoted, and if you didn't, you didn't. HOW IS THIS DIFFERENT FROM ANY OTHER CONTEMPORARY WORKING ENVIRONMENT? The folks filing this lawsuit are seeking a class action remedy because they (the seven plaintiffs) were "passed over" for promotions that were given to others (whites) who were "less qualified". I have some news that may shock some: LOTS OF PEOPLE GET PASSED OVER FOR PROMOTION IN FAVOR OF TWITS WHO ARE LESS QUALIFIED.
Note to all y'all who feel oppressed because of your gender, race, religion, or whatever: the key to success in this corporate world is to learn what to kiss and when to kiss it. If you really want to be "equal" to the straight white male who got that promotion, learn to kiss ass like he does.
If you're above kissing ass, then you're above being promoted. Whoever thought that being promoted was glamorous missed a class somewhere.
Now that I've insulted all of my former colleagues who have ever gotten promoted, I think I'm going to take a breather. I'm getting worked up.
In my next installment, I'll insult several ethnic groups, deride America's educational system, and further expose my raw, naked bitterness (even more fully than I already have here) before I finally capitulate and admit that I really don't know what I'm talking about, apologize to my former overlords, and beg for mercy from my new masters.
Copyright (c)1998 - 2010 by Allan Rousselle. All rights reserved, all wrongs reversed, all reservations righted, all right, already.