Several Years On

I have several posts a’workin’, reader, on such topics dear to this space as dance and books. But I had to drop them to write a piece on How I Feel Now, more than two years after starting this blog, now that life has taken an alt-ac shape, and even something of a post-ac one.

I was inspired to write this by reading posts from one of my favorite post-ac writers, Walking Ledges. He wrote a post a few days ago about how good it feels to sign a contract (for teaching) and know that the search is no more. (See it here.) He no longer has to run around and get used to new things virtually every year.

Yes. One of the most difficult parts of transitioning from graduate school is the search and the uncertainty.

My life includes a bit of teaching also. But as my transition out of graduate school has gone on, I’ve realized how nice the life of a freelance writer can be. Now, this is not really news. If you count this blog, I’ve been writing for more than two years.

But a robust freelance career does not spring full blown from the head of Zeus. It’s taken me more than a year to fully transition to the point where I have good clients, interesting work and enough dinero to pay the bills.

I think what I want to write about, though, is the fact that I HAVE DONE IT. I’m successful! I’m happy! I’ve been so focused on the push that it’s only been in the last few months that I’ve realized I slowly did climb the mountain. At least part of a mountain. I’m here where I can see the valley!

I sometimes use the “climb the mountain not straight up but with switchbacks” analogy with students. That is, do something (in their case, study) steadily and methodically rather than trying to cram it in at the last minute, and eventually, as in the case of hiking with switchbacks, you’ll achieve a good altitude without muscle strain or broken bones.

Here’s what I really love about the life of a writer. I do what I want when I want. I have flexibility — as much flexibility as I used to have studying and teaching.

I used to spend research summers reading and writing in shorts and a t-shirt with the fan blowing. I loved it. When I realized how tight the academic job market really was, I had many thoughts. But chief among them was: I can’t stand to lose my research summers.

And I feel that I haven’t.

I think the whole short-and-t-shirt-with-fan is a beatific vision of being present in the mind, actually, and not having to be present in (to cite one alternative) a 9 to 5 office. It’s not that you can’t be present in the mind there, but present in the mind is (in my experience) rarely the focus. Present for a set of tasks and a social world that is sometimes Kabuki-esque. So my dream scenario, which I have now made the real scenario, combines present in the mind and comfortable in the body.

Can’t be beat, right?

It isn’t that there aren’t still things I want. I need some boost in income. I love working in archives, so I’ll be looking for a way to work that into the writing life. I’ll keep up this blog as a think space.

But life is good. Alt-ac’ers and post-ac’ers out there, light is at the end of the tunnel. Keep a’traveling on.

Why Don’t Trust the B— in Apartment 23 Is Like The Hunger Games

I don’t keep up with television, and, frankly, only watch many television shows when they make it to Netflix—which I do watch. Recently, I saw one that relates so much to the current moment—including presidential politics—that I had to write about it.

I recently watched the first several episodes of a show that aired several years ago entitled Don’t Trust the B— in Apartment 23.

Reader, it seemed to me like The Hunger Games of television. The latter, both book and movies, are something of a parable about competition’s role in times of economic and social uncertainty. Young people are loosed to kill each other in a series of bread and circuses televised for everyone’s enjoyment. That’s the game.

The premise of Don’t Trust the B— in Apartment 23 is simple. A young Midwestern woman, June, comes to New York City dewy-eyed, with a great job in finance, a great medical-student fiancé, and a company-paid place to live. All cylinders of her economic and social life firing happily, in other words. Love and work, and a nice cushion.

She no sooner arrives in the city than she finds that the company—which paid for her move and her apartment, and would pay her salary—has had its assets frozen over CEO fraud. When she walks in on her first day, law enforcement officials are cleaning the place out. Needless to say, that means the company-sponsored apartment is moot. She is desperate to find a place to live in a highly expensive city.

So, she becomes prey. Now, this is a comedy, mind you. But, after finding a number of predatory/weird situations, she comes upon a more urban young woman, Chloe, who needs a roommate. Our Midwestern woman is ecstatic.

However, viewers know that Chloe has a scheme. She takes the deposits and first month’s rent, and then drives the roommate out by her behavior. In the meantime, she’s a scam artist, with a number of grifter-like techniques to obtain food and clothes.

Then, Chloe sleeps with the fiancé, thereby helping to unmask him as a major philanderer.

June’s engagement broken, she retaliates against Chloe by behaving outrageously as well. In the end, sitcom style, they have become friends.

Which is just as well, because June doesn’t have any place to fall back on. Her parents, seen on Skype, tell her that, to pay for her MBA tuition, they have skipped their mortgage payments.

Oh! And she takes an unpaid internship which she loses to a woman with a tipped uterus. The boss likes that she can never have kids. She starts working in the local independent coffee shop.

I know this is a long synopsis, reader. It’s long for a reason.

The plot coordinates have to line up to let us see how starkly this is a parable about predatory behavior in the age before Bernie Sanders put on the table a discussion of how predatory the environment looks to young people. Actually, a lot of people, young or not.

Now, what happens is that Chloe is still predatory—she makes a living scamming people in various ways—but the two women bond. Significantly, June pays her back in kind by stealing all her furniture and holding it (with the help of one of Chloe’s ex-roommates) until she gets her money back.

So, if we go with The Hunger Games as metaphor, June learns not to be just a nice, sunny, smart person with a plan, but someone who retaliates in kind. She is a strategist with buoyant optimism when we meet her. She has to learn to become a warrior.

More centrally, however, the fact that Chloe uncovers June’s boyfriend as a no-goodnik bonds them. They become friends.

Oh, let me add one other detail. June’s would-be mentor at her job is, of course, downsized when the firm closes. He ends up as a coffee shop manager. June needs a job. He hires her, saying words to the effect of, “Oh, I fired someone for absolutely no reason to make room for one of my friends.”

I find this very grim. In the moral universe of the show, predatory behavior, stealing and dishonesty are all ok. They are a kind of lingua franca, actually. People bond by doing them, and then move on to friendship. But it means that the bedrock is untrustworthiness.

There’s a whole part of academia that examines the philosophical meanings of television shows. There is, for example, a book called The Big Bang Theory and Philosophy.

Contrast this with the universe of Friends, a similar “people come to New York and hang in apartments and coffee shops” kind of show. In the Friends moral universe, they actually are friends—supportive. It’s important there to have a social network that is not dependent on economic need, and economic need is hardly ever mentioned. (They are never in danger of being the baristas at Central Perk.) It’s the product of a good economic time.

It disturbs me that the predatory moral universe of Apartment 23 is played for laughs. And I hope Bernie Sanders and his campaign—president or not, elected or not—change the cultural climate in which it took place.

What’s Represented in Representation?

Today, I want to talk about how different representations can change what we see and how we interpret it.

Why do I want to talk about this? Well, as part of my work, I have been looking a lot at offerings in and around the California Institute of the Arts, an art school in southern California. Poking around their web site, I saw iterations of their logo that I’d never seen before, and they seem worthy of comment.

California Institute of the Arts is often referred to as CalArts. It was started as a kind of “Caltech of the arts” by, among other people, Walt Disney. Therefore, it makes sense to think of the logo as the equivalent of that, CalArts, running together the abbreviation for the state and the abbreviation of its main offerings, just as California Institute of Technology does in its shortened name, Caltech.

Except that, poking around, I saw another logo that is often used: a black-on-white version, shown right below.

This logo is so different from the one above. It doesn’t seem to be making the case that it’s abbreviating California Institute of the Arts. It seems to be, um, saying “ca,” “la,” and then a bunch of letters. It seems to be emphasizing the primacy of Los Angeles (assuming that’s the “la”) in California, and the mutually reinforcing nature of the two. (CalArts is about 30 miles outside of Los Angeles proper. Interestingly, though, one of the foundational schools that eventually became CalArts was the Chouinard Art Institute, which taught the early animators of Walt Disney studios. CalArts was in part started so that animators, and the rest of the industry, would have a state-of-the-art art school on the West Coast.)

So, since it is well known that the Disney animators like to make visual puns in the Disney films, I couldn’t help but wonder if this were one. If we read the ca and la as mutually referential and reinforcing, the rest of it is “rts”—which could be pronounced “arts.” At a minimum, it seems to be making the point that CalArts is in L.A., not San Francisco (the other California center for the arts) or New York. Is it a response to the fact that Disney was dissed in New York, and his films not considered real art? (This according to the recent PBS documentary on Disney, which is fascinating.) Or, is it just making sure that LARTS is the most dominant part of the logo, with “ca” the smaller brother? Is “ca” a smaller reference to Chouinard Art, even?

I think this is an intriguing example of how letters and their arrangement can work as art. In the Caltech logo, the “ca” is not set off as much—it reads as Caltech, just the way it is said. And, in other representations of CalArts, as in the orange logo above, it looks that way too. But in the black and white CalArts logo above (or should we call it calarts?), the “ca” seems to inhabit a different universe from the rest of the letters.

I think this is some artist, somewhere, making the point that representation matters, not only the letters, the building blocks of words.

I’d love to know the history of the logo, so if anyone knows, please contact me!

Bernie Sanders’s Hair: or, Marked Men

So, having argued in my last post that professional women are less marked than ever, here’s what I think of Bernie Sanders’s hair. Sanders’s hair is commented upon not less than Clinton’s, I think, but more. (Admittedly, this is a personal sample based on what I read and hear, but I think it’s true.) The fact that his hair is sometimes not neatly combed, or blows in the wind, is alluded to and often serves as a frame for his ideas, particularly on television. (For a perfect representation of the icon of his hair, and one that also ties in the upcoming Halloween holiday, see here.) He’s not neatly coiffed according to some abstract presidential candidate standard, I guess. (Although it looks ok to me.) The sense that he doesn’t look right also underlies some of the Saturday Night Live debate satire, which focused on his having one set of underwear (!). He himself played into this, telling reporters that when he was first elected to office in Vermont, he only had one suit.

And I think it’s commented on more because Bernie Sanders himself is marked. Marked isn’t specifically a gender category, I think; it’s a category having to do with marginality or outsiderness. (Tannen’s article was written when only so many women were in the boardroom, so their presence just attracted heightened scrutiny.) Because of his Vermont-ness (rural state, mix of liberal politics and gun culture, and so forth), his espousal of economic issues few people were talking about before he brought them to the table, and his accent, he isn’t just any generic guy running for president.

And this brings me to a third central point. The exchange between Bernie and the Times is a debate of sorts. On the one side, he assumes that hair is a trivial thing to be asking a presidential candidate about. On the other, the reporter says it’s a valid gender issue. I think both are correct if we delete the word gender.

The problem, really, is that the Times relied on shopworn observations and analysis, because it is Sanders who is the marked candidate in the race, not Clinton. The stance also illustrates the perils of relying on outworn social observations as much as on outworn political or economic ones. Social observations change and become outmoded. It takes constant testing of the wind to see what is really happening, not reliance on decades-old truisms.

In that sense, Sanders’s incredulity about being asked what he seemed to interpret as a fashion question when he was trying to get out an important economic message was a breath of fresh air.

Bernie Sanders’s Hair: or, Marked Women

Happy Rockober, readers!

I want to talk today about Bernie Sanders’s hair. (Whenever I have started to write this, I keep thinking about the fact that the cadence of “Bernie Sanders’s hair,” especially if you write it “Bernie Sanders’ hair,” sounds like the improvised Native American chant “John Wayne’s Teeth.” It was written by Sherman Alexie and featured in his movie Smoke Signals. It’s a big digression, but before we start, here is a relevant clip.)

So. Bernie’s hair.

A few days ago, I was looking up the Saturday Night Live skit spoofing the Democratic presidential debate earlier this month. I enjoyed the debate, and I wanted to see particularly the Larry David version of Bernie Sanders. Which can be seen here.

Ok. I ran across yet another clip. (Reader! Perhaps the secret subject of this post is clips!) This one had two young men talking about a conversation Bernie Sanders had with a New York Times reporter in which he scolded her for asking him (and I paraphrase their clip-within-a-clip (!)) whether it is fair that Hillary Clinton’s hair gets more scrutiny than his does.

He responds incredulously, asking her to verify that she is actually asking about hair when he is trying to talk about massive income inequality in this country. Yep, she responds, I am, and then says something to the effect of “I can defend that—there’s a gendered reason.”

At that point, my meta really swung into action.

First, yes, for the past several decades, analysts have been pointing out that women’s appearance is often the focus of comment and criticism, whereas the appearance of public men (politicians, corporate leaders, etc.) goes unremarked on, no matter what they look like. It’s been a gendered issue, yes.

But second, the larger point is that women were subjected to this extra scrutiny because they were what the scholar Deborah Tannen calls “marked.” (If you’ve been anywhere near English or Women’s Studies departments, you have very likely read her essay “Marked Women, Unmarked Men.” The text, helpfully, can be found here.)

Basically, Tannen pointed out that the styles women choose mean something, while the styles men choose, if they are basic office wear, don’t. They mean “basic guy” with no inflection. When clothes meant something—when the meaning had to be decided upon—the women could be subject to judgment or criticism. At the least, they faced a higher level of scrutiny: a gender inequality, since people simply don’t think about the way men look.

So as part of my discussion, I want to focus on two things. The first is that it is arguable that women today are not nearly as “marked” as they were when Deborah Tannen wrote her article in the early 1990s. Yes, commentators sometimes mention Hillary Clinton’s penchant for bright colors…but the firestorm over her wearing head bands, also in the early 1990s, was exponentially greater.

The proof of the “less marked than ever” theory, as I guess I’ll call it, is that Donald Trump tried a version of “markedness” on Carly Fiorina by deriding her looks. And, basically, he lost. People were generally much more with her. They supported her dignified response, regardless of what they felt about her being a presidential candidate. She dresses and wears her hair like a generic, regular-type corporate woman, and so, to a large degree, does Hillary Clinton. There is a recognizable female professional style, and they both recognizably walk within its boundaries.

In the sense that anyone who follows that style is no more marked than their respective male counterparts, that’s a victory for women. Trump’s remark may have been the last gasp of being able to deride professional women for looking any particular way, and the beginning of acceptance of their looks just as we accept the looks of every male presidential candidate without serious comment.

A Stepford Too Far

Hi, readers!  Today’s post is occasioned by something seen in the grocery store checkout line.  Namely, before and after pictures of Renée Zellweger.

The ones I saw were in Us Weekly, but similar juxtapositions have been all over the media, since, apparently, she was nearly unrecognizable at a fashion preview a month or so ago.

I find the change really chilling on several levels.

First, although her face clearly looks quite different than it did before, she is apparently maintaining that the differences are due solely to health-related diet and exercise changes.

I’ve always liked her persona in films as she (often) plays women with humor, sass, and common sense. Having plastic surgery to this degree is a dreadful negation of these things, and trying to erase the effects by a not very convincing story is even more so.

Second, if Zellweger had plastic surgery to hide the effects of age (she’s 45), that in itself is a sad commentary on ageism in contemporary society. Indeed, the New York Times devoted a recent “Room for Debate” to her plastic surgery as an example of prejudice against age. (It can be read here. “Room for Debate” is a recurring feature in which 5 to 6 commentators opine on a subject of contemporary interest.)

But an even sadder commentary is the fact that her original looks were completely sandblasted away. The plastic surgery is not obvious just because—or even primarily because—she suddenly looks younger; it’s obvious because she suddenly looks profoundly different.

Isn’t it possible that age was secondary, and a misguided attempt at looking some criterion of “better” the primary reason? Renée Zellweger has always been nice looking, but someone to whom the adjective “cute” or “attractive” would likely be applied, rather than “beautiful.”

It’s equally dreadful if a woman with perfectly acceptable looks comes to feel that they are not acceptable unless they adhere to an extremely narrow spectrum, and I’m almost more afraid of that than of the aging motive. (I want also to say that we don’t know, obviously, unless she advances her reasons, but we can speculate about the cultural pressures.)

The whole thing reminds me of Scott Westerfeld’s young adult novel Uglies, a science fiction tale in which every member of society, at 16, undergoes plastic surgery to become pretty: beautiful, with perfectly symmetrical features. Westerfeld is a master at crafting his dystopian fiction to be only a few turns different from the contemporary world. After their surgery, for example, young people congregate in New Pretty Town. Parents are “middle pretties” and the generation before them are “old pretties.” One can easily imagine an actress thinking she is an “old pretty” who needs to be rejuvenated.

But, if I really get my meta on, I have to say it also reminds me of Jean Baudrillard’s simulacra. Let me quote the introduction to Baudrillard in The Norton Anthology of Theory and Criticism: “simulacra seem to have referents (real phenomena they refer to), but they are merely pretend representations that mark the absence, not the existence, of the objects they purport to represent.” An example Baudrillard himself gives is Disneyland–indeed, it is the example most frequently given in explications of his work. Frontierland and the pirates in Disneyland are fantasies, not the real thing or even images of the real thing (as photographs, for example, would be). Unfortunately, Renée Zellweger’s face is now an example of simulacra as well.

Happy National Novel Writing Month!

Today, I’m going to celebrate the opening of a new month by writing about a cultural phenomenon that takes place within it: National Novel Writing Month, or, as the shortened form is known to cognoscenti, NaNoWriMo.

National Novel Writing Month is a challenge to write 50,000 words of a novel, either by yourself, in concert with the Web site dedicated to it, or in tandem with writing buddies. Hundreds of thousands of people worldwide participate. While the statement on the Web site talks about 50,000 words, you can really set any goal for yourself. The point is to generate words every day for the entire month.

And then pat yourself on the back at the end! And publish, revise, or whatever your heart desires.

I’m going to link this to elements in my graduate study, as I love to do. One of the first courses I took talked about the distinction between modern clock time and the festivalization of time that preceded modernity. Modernity is (among other things) about the institution of clock time: a standardized, regimented span of days, continually beginning and ending at designated times. Older eras were defined by feast days, festivals, and so forth. One of my professors argued that, in the contemporary world, widely celebrated holidays (think Thanksgiving, also coming up this month) were one of the few retentions of festival time (a continually replenishing, continually consumed table over the years, containing ritual elements).

Interestingly enough, it can be argued that the academic year also contains elements in common with festival time (very broadly defined, of course). Why? Well, rather than being a series of standardized days of roughly equal length, semesters have periods of waxing and waning, bookended with time that is celebrated as (first) a beginning (think welcomes and invocations) and (second) as ends (think holiday parties and breaks, which are unregimented time).

You can see the components of “festivalization” most clearly, I think, by comparing the waxing, waning, and punctuation of beginning and ends with corporate life. In the latter, one may have a vacation or holiday time off, but it is not celebrated as a beginning or end (certainly not in common), and while there may be busy periods or slow, it is not felt as a waxing in the way that the semester goes uphill, uphill, and then down (final grading!).

Well, I’m going to add NaNoWriMo to contemporary iterations of festival time. First, it has a specific time dedicated to it in which a huge community out there celebrates. It is kicked off with a celebration (there are write-ins that gather on October 31 and begin as the chimes of midnight herald the month of November). There are numerous mini-celebrations within it (you can get badges and prizes for writing a certain number of words). There are communal write-ins throughout the month, including all-night events. (Talk about unregimented!)

And I think it is no accident that this custom happens during the bleakest month of the year. (I know many people would nominate December for this honor. Not me. Whereas the daylight in December increases after the 21st, the daylight in November only goes downhill.) It’s a shared ritual of harboring the light within, I think, and making sure that you are producing a kind of internal, creative warmth. I find it very encouraging to be part of such a team on these cold and dark mornings. So, all hail, NaNoWriMo!

Independent Day

In my pursuit of alt-ac-hood, I am a great fan of Versatile PhD, a web site devoted to forums and information for those transitioning away from the tenure track as a goal. The other day, someone posted a comment on one of its open forums to the effect that she was dissed when registering for a conference as an independent scholar.

The person querying had published a book, and had been invited to talk at the conference because of it. (She also works at a nonprofit, but that position apparently isn’t related to the research done for the book.) In other words, you would think she had a position inviting respect. Instead, she was told that her specific affiliation for the conference—independent scholar—was a euphemism. She wanted suggestions, from the Versatile PhD’ers, about how she could respond graciously.

Euphemism for what, it didn’t say. (Party-crasher? Suppliant? Poseur? Mystery Guest?) First of all, it isn’t a euphemism if you’ve actually published your research, it seems to me.

Second, as it happens, the John D. and Catherine T. MacArthur Foundation recently gave one of their awards to an independent scholar, Pamela O. Long. (The MacArthur awards are frequently referred to in the media as “genius” grants; awardees this year include a generous helping of academics and artists.) Ms. Long is 71 and has worked as a successful independent historian most of her life. Read more of her story here.

The whole issue of independent scholarship is a very vexed one for alt-ac’s and post-ac’s just because of this prejudice. It’s part of a larger set of practices that devalue any work not specifically linked to an institution of higher education and, even within that, any work not done as a tenure-track appointee within a department. This prejudice persists even as more and more people flee the tenure track, or abandon it before obtaining it, because of the paucity of positions or the almost complete lack of free time for a fledgling academic, and even as entire conferences are devoted to the alt-ac world (such as the recent one at Penn State). It is increasingly frustrating. There are a scant number of openings in the current job lists, measured against hundreds of applicants.

It seems to me that one solution is simply to be more inclusive about what constitutes thought and research. Some of this takes place in institutions; some of it takes place in conjunction with both institutions and private studies (past recipients of the MacArthur have included people like Anna Deavere Smith and Junot Diaz, who have university appointments but are better known for their artistic activities), and some takes place in private studies. The pudding—Pamela Long has a number of books to her credit—should be the proof.

Part of my own transition out of academia has included believing that perhaps there are a number of good models, not just one.

I’m happy to report that Pamela Long has written an informative and generous article on being an independent scholar that foregrounds the idea of the work being central and also provides an intriguing parallel with the lives of artists. To quote: “there are plenty of us out here, so it seems reasonable that we should claim some cultural space….it would be good for our profession to move a bit closer to this [by artists] focus on the work as opposed to the position.” I’ve only known about this essay for a couple weeks (since the MacArthur announcements were made), but I really think this should be required reading for alt-ac and post-ac folk. The whole thing can be found here.

Me ‘n’ Mysteries

Here’s the thing about being in transition from graduate school, for me: I just can’t get enough of mystery novels.

I’ve always liked mystery novels. But in the past, they formed a minor part of any reading pie I was in the middle of. I’ve always read pretty voraciously, but pre-graduate school I was a great aficionado of what bookstores call literary fiction and nonfiction. I was never a huge fan of genre fiction: not, at that point, mysteries, and not, then or now, any other genre form (fantasy, science fiction, and certainly not horror). From the early 2000s until about a year or so ago, once I began re-entry and then was fully in graduate school, I was seldom outside the realm of required-and-if-not-officially-required-then-you-need-to-know-about-this academic reading.

Then, a halcyon moment arrived: while still paying attention to the field and academic writing, I could once more make an opening for plain old reading. (I find both academic reading and plain old reading pleasurable, mind you. But few things are as wonderful as realizing that I could once again wander through the library stacks unimpeded and pick whatever I wanted to read. For a similar moment in the post-ac blog Walking Ledges, see here.) For me, it was a gigantic moment of feeling knit together: the old pleasures with the enlargement of the new.

But what I want to read are mysteries, mysteries, and more mysteries. In the past year, I have finished catching up with Sue Grafton’s Kinsey Millhone and Walter Mosley’s Easy Rawlins (both old favorites) and gone on to find S.J. Rozan’s Lydia Chin-Bill Smith dually narrated series, Laura Lippman’s Tess Monaghan (and Lippman’s really good realist non-mystery novels as well), Laurie R. King in Britain and San Francisco, Susan Dunlap’s series in Berkeley, and Harry Dolan’s in Ann Arbor. (I’m fond of university towns and regionalism, and both if I can get them.) Plus I’ve recently discovered Tana French in Dublin and Rachel Howzell Hall in Los Angeles. (Some of these I’ve found through word of mouth, but others are discoveries made in National Public Radio’s cool “Crime in the City” series.) In fact, I’m feeling very sad that I’ve finished most of these, although the discoveries of French and Hall make me believe that great series in the genre come on a-comin’.

Moreover, while I keep trying to pursue my old habits, they basically refuse to be captured. An example? My bedside table is occupied by Linda Gordon’s Dorothea Lange: A Life Beyond Limits. In the old days, that would have represented an incredibly nice cup of tea: a fascinating person, an intriguing social world, and a wonderfully written biography. Yet when I picked it up recently, I literally thought “…but I know what happens and how this ends.” !!

So why is the burning desire to read at least an hour of mystery a day upon me now when it was pretty dormant before? Well, I think because the transition out of graduate school is a mystery. It’s an arc when what happens and how it ends isn’t fully known. I’m working part-time jobs until I arrive at the ultimate place I will be post-PhD (and still completing a PhD). So whatever the ultimate dénouement of the graduate school scene is for me…I don’t know it yet.

As a genre, mystery novels start with one of the journalistic w’s (what…happened), and the plot is the unfolding of every other journalistic w (why, when, where, and who. And we also usually get the journalistic h, how). Now, I have been holding off on this post because…hey, mystery novels are kicked off by a bad thing (murder, theft, kidnapping, rape…something not good). So I was a bit hesitant to make the analogy: while yes, the post-grad job-market experience kind of sucks, for me the experience of learning more in an environment dedicated to it was the opposite of murder, theft, and kidnapping. It felt like being restored to treasures I considered the most valuable. My inner self made also outer. It was a setting right, not a going wrong.

And then I realized two simple facts. First, the plot device that kicks off mysteries is a change in the given order. For them, the change is the wrong thing. But there is no law against the change being a right thing. (Hmmm…quite an idea for a novel itself!) It can be just a change, which graduate school certainly represented for me.

The second is…in strongly written series like these, one identifies with the investigator(s), not the person to whom X has happened. Indeed, writers on mysteries posit an ethical role for the PI: s/he is the one who sets right the rending of the social fabric. In some sense, being in the midst of completing a PhD and figuring out what comes after makes you the investigator of your own life.

So, all those mystery series out there…bring ‘em on!

Leaning In and Its Discontents, Part II

Yesterday, I talked about Sheryl Sandberg’s book Lean In: Women, Work, and the Will to Lead, focusing on some elements I found depressing—mainly the relentless exhortation to, basically, work all the time if you want to succeed. (I am not putting down work; I am myself very hard-working and organized, with the very focus Sandberg describes.) But I think the work, work, work ethic is ultimately neither a desirable nor a sustainable lifestyle, and I worry that, because of that, this model of high achievement for women is going to explode in the future. Or come crashing down.

So what I want to focus on today is that fear. What scares me is this: are children brought up by leaning-in parents—told not to scratch lice because their parents have to review notes for a meeting and don’t have time to look up from their iPads, or told to sleep in their clothes so the family can get out the door 15 minutes earlier—likely to repudiate the feminism of their mothers simply because events in their own childhoods were unpleasant and they don’t want to repeat them in their adult lives?

Will the children of lean in be a- or anti-feminist because they equate feminism with an antiquated (as it might be seen then) work-all-the-time-and-be-in-the-C-suite mode, one they see as old-hat, unworkable, and thus slightly ridiculous?

This is the analogy I have in mind: children raised in communes of the 1960s and early 1970s. Some of them, when they became adults, wrote memoirs of the experience in which they said that, far from liking the wheat grass served there, they just pined for mac and cheese or a bologna sandwich.

Communes and have-it-all feminism are very different, you say? Well, both are examples of movements that 1) had an au courant moment and 2) stood for certain things once outside mainstream culture (vegetarianism, say, for communes; larger scope and voice for women, for feminism). Then both 3) saw some elements largely adopted by the larger culture (soy milk and organic food in your local grocery; high-level education and careers for women). Finally, 4) in the case of communes, they were ultimately pretty much abandoned. They returned to the margins.

Sandberg herself apparently bought into the idea that second-wave feminism was old hat at one point. She mentions that the actual Gloria Steinem is “the absolute opposite of my childish image of the humorless feminist” by being “funny and warm.” So stereotypes already lurk among women that—had the second (and third and fourth) waves been less massive—might have silenced feminism not only as a movement to be avowed or disavowed, but as a concept. It is not entirely impossible that in a generation or two the idea of more scope and ability for women might become unfashionable, and the fashionable model become one in which women have only the domestic sphere.

It has happened before.

Some examples? Students in the mid-twentieth century were routinely taught that most writers were men. The big book on American literature through the 1950s and 1960s was arguably F.O. Matthiessen’s American Renaissance: Art and Expression in the Age of Emerson and Whitman. It discusses only men—with an underlying assumption that all artists of fine quality will be male. Part of second-wave feminism (and later) was discovering the multitude of women writers in this period who had literally been written out of history.

Along with being fascinated with how much the twenty-first century replicates the nineteenth, I am fascinated with the vanishing process. How does this stuff get written out of history?

One reason might be the association with particular trends that follows not simply the content of the idea, but its fashionability. The first and second waves were tagged (unfortunately) in the public mind with shorthand versions that did a disservice to the many fronts on which feminist issues were fought. The first wave (nineteenth and early twentieth centuries): all about the vote. The image of women striking for the vote, the suffragettes, later became an unfashionable model for younger women. The second wave, about more education, power, and careers for women, became tagged with humorlessness, or with needing a man like a fish needs a bicycle (if you believe the young Sandberg). And younger women in turn repudiated those images.

More importantly, perhaps, mainstream culture simply adopted central tenets of both waves. Think of someone seriously arguing against the major victories of the first wave, for example: that women shouldn’t have the right to vote, or that women should not be allowed to pursue college degrees. It doesn’t happen, because those values have become fully assimilated in middle-class life.

What mainstream culture does not adopt becomes associated with the fringe, and it returns to the margins.

So on the level of fashionability, there is a troubling pattern of generational disavowal of earlier models of feminism. And the possibility of that disavowal developing again in the future because of examples in books like Sandberg’s haunts me.