
Technology Can Make You Younger! or how not to act old around a computin’ machine

April 18, 2010

No, this isn’t about those new anti-aging face creams with micro-currents. But it is related to how technology can make you look younger — figuratively at least.

Ever noticed how when young people are trying to do something on a computer that takes several steps — such as sending an email with a link or attaching a document — they move their cursor quickly all over the screen, searching and clicking, clicking and searching? They go by instinct and if they click on the wrong thing, they just abort and move on.

My IT guy does this, too, while I usually stare slack-jawed.

Me? I look around for just the right icon, usually while arching my neck so I can peer through the correct prism in my progressive lenses. I’m not about to waste all that energy.

Then, one day, while a student was looking over my shoulder as we worked on a class project, I found myself sweeping across the screen as though I were a video game master, clicking here, closing and opening things there, hovering for just a second, click, click, nope, yep, well that didn’t work, here it is, yes, that got it. I was showing off, I realized later. I was acting. In a flash, I had been transformed from Angela Lansbury in Murder, She Wrote to Angelina Jolie in Lara Croft: Tomb Raider. Or at least that’s how it felt, as I wielded my mouse like a rapid-fire weapon.

That’s right, a big part of being up on the new technology is swagger — being able to pretend you’re up on the new technology. Haven’t you caught on yet that the first thing your IT gal does is re-boot the machine? That should tell you that she really doesn’t know what’s going on but hopes the computer will resolve the issue on its own. The next step is clicking around and seeing what happens (and never being able to replicate whatever it was that fixed things).

The second biggest part of looking like you’re technologically younger and hipper than your years also involves acting, but it’s more in the style of Angelina’s better half, Brad Pitt. It’s all about the cool factor — pretending that whatever gut-quivering error message you just received when you went to save a day’s work was really no big deal. It’s part of the secret handshake of binary code, which sees operating systems as a bit organic. The IT folks are forever saying the network isn’t feeling well or has been temperamental today, poor thing. You want to see temperamental? But, now, now, if you want to seem young, the appropriate response is one of laid-back nonchalance with an overwhelming confidence in your abilities to recover your lost document.

Ahem. Well, the point of this is not to rant at the patient and talented IT people but to reveal the social and generational interactions that surround our use of technology. Truth be told, it has been my generation that has led the way in digital media, and my mature colleagues know more than the young people they teach about electronic networks, code languages, media platforms and social media, as well as the social, political and historical ramifications of the current information revolution.

But young people, never having lived in our once static, totally analog world, are more comfortable not knowing, if that makes sense. They explore technology at will and move forward without angst, understanding that if they click around enough they’ll either figure it out or they won’t.

So, if you want to instantly look 10 years younger, use digital technology. It works. All it takes is a bit of bravado and cool detachment. Oh yeah, and lose the bifocals.

a field guide to now

March 18, 2010

The Blood-Brain Barrier or No Wonder I Can’t Think Straight

March 2, 2010

You know how you’ll hear or read something unusual that seems clever, interesting or different and you’ll notice it, maybe without realizing it. Then, a few days later you’ll hear or read a reference to that same thing, and then again a month or two later?

This probably happens most to people who use their analytical brains a lot for a living — teachers, writers, psychologists, etc. It’s how catchy names and phrases like “Google” and “think outside the box” go viral, I suppose.

This has happened to me lately with something I knew nothing about previously — the blood-brain barrier.

It came up in a couple of books I’ve been reading, though I can’t remember which ones, which might say something about my blood-brain barrier. It was also mentioned in a health segment on cable news not long ago. Then, last night it surfaced again on House, the Fox doctor show, in regard to a patient’s cancer treatment.

Looking up the concept is fruitless because of all the medical-speak, but I found a site called “Neuroscience for Kids” that was helpful. It turns out that not everything in the bloodstream can cross from the brain’s blood vessels into the brain itself, because the brain knows that it is the most important organ in the body.

It reminds me of something I read many years ago that portrayed the brain as a character in a drama supremely interested in self-preservation, which is why we usually duck our heads first and ask questions later when physical danger approaches.

The brain also dodges substances in the blood that other organs blithely absorb despite their dangers. One caveat: lipid-soluble molecules, such as barbiturate drugs, cross rapidly into the brain — an indication perhaps that even the obsessive cerebellum needs an escape now and then.

I suppose I’ve been tuned in to this knowledge about the brain because of an evolution in my thinking toward the idea that the brain has a mind of its own, so to speak. I’m sure my brain doesn’t like that I’ve discovered that I can argue with it, but that’s too bad.

For example, our brains are geared toward finding the negative. Happy, positive events are nice but since the brain’s major goal is self-preservation, it is more attuned to anything that might be a threat. That means our brains can make us believe there is an impending negative event when there is none.

Also, some of us have brains that constantly seek stimulation, so they introduce drama and pathos where there is actually mostly peace and harmony, or at least lots for which to be grateful. Our brains tend to survey the landscape and look for anything that doesn’t fit. The brain has a bias toward dualities — good/bad, black/white, smart/dumb.

It’s good to know this about the brain because life is really more complicated than that, and less dramatic. I call it thinking in mutualities. It’s not easy to do; it goes against the brain’s natural tendencies. But it breaks through the psychological blood-brain barrier that can keep us locked in our own negative thinking.

Things usually aren’t as bad as they might seem.

Women Prevail in History of Women and Media

February 22, 2010

My first major act as a feminist was to ask for the same pay as a male colleague in my first job out of college — as a reporter for the Monroe Morning World in Monroe, La.

He had been hired two weeks after me and had exactly the same amount of professional experience: none.

He was hired at $10 a week more than my salary, which would have been about a 12 percent increase and would have made the difference in whether I could pay my monthly rent with one week’s paycheck.

I got the “raise,” even though my boss seemed incensed that I asked.

My second major feminist act was attending the National Women’s Conference in Houston in 1977. I didn’t know anyone else who attended, and most people I knew thought it was weird that I wanted to go.

I’ve been a feminist since I first saw my father give my mother an allowance and I sensed that his word meant more than hers.

I never advertised my feminism, but I never hid it either. Most women I knew were in favor of “equal rights” in a general way, and we all had fun making our way into the world of work in the 1980s, with our skirt suits and sensible heels.

This post isn’t intended to go over all that, but to say that I’ve noticed lately that it’s quite okay to talk about feminism — not as “equal rights” — but as an everyday part of the social fabric.

I’ve been noticing that for a while, but it’s particularly true with women in the media, who probably avoided labeling themselves for fear of not appearing neutral or objective.

Now, there are activist media women all over — in places like the International Women’s Media Foundation, the Women’s Institute for Freedom of the Press, and on and on.

And there are individual women, Gail Collins and Joan Walsh, for example, who are out there in the mainstream media, representing themselves as feminists openly and unapologetically.

The uninformed don’t even know that feminism has transformed society and is alive and well.

At the same time, a lot of feminists aren’t satisfied with the current state of affairs and are chagrined that women still face discrimination.

But I, for one, feel comfortable for the first time in a long, feminist life regularly identifying myself as a feminist in everyday conversation and action. Sad, perhaps, considering how long I’ve been around, but liberating nonetheless.

Pulitzer and YouTube in the Same Sentence?

January 26, 2010

If you’re an old-school journalist and the thought of “Pulitzer” and “YouTube” in the same sentence doesn’t jar you a bit, well, then, you’ve become fully digitized. Congratulations!

I must admit this sliver of semantics got my attention when the announcement of the second Project: Report contest arrived in my in-box and I saw the cash prizes, the corporate sponsorship and the increasing ambition behind it.

While the declining news industry elite cuts staff, frets over business models and loses advertising dollars — Surprise! — innovation and money are flowing in other directions, and at the hands of people whose main concern is not profits, but journalism.

Those people include YouTube executives, the leadership at the Pulitzer Center on Crisis Reporting and legions of would-be documentary journalists who want to tell stories on which the mainstream media have seemingly given up.


Word of the Year

January 4, 2010

It’s time to turn a new page on a new decade. This is an opportunity that won’t come along again for another 10 years. Instead of making a New Year’s resolution, I’ve been challenging myself to think in terms of decades. When I think back to 2000 I’m astounded at all that has changed in my life.

My children were small, in elementary school. Before the clock struck midnight on Dec. 31, my oldest had been accepted to college.

I was floundering in graduate school. As we enter 2010, I have had a doctorate for seven years and have spent five wonderful years as a journalism professor.


Late to the Academic Game

January 1, 2010

I was a late bloomer. But I turned out to be a perennial. I got a Ph.D. late in life and decided to pursue a career in academia after spending several decades in journalism.

As it has turned out, I’ve entered the new world of higher education but kept one foot in journalism. I hadn’t planned that, but it’s actually been a blessing.

Even though I sometimes go crazy with my split personality of a life, I think people might actually be jealous, not of me in particular, but of the fact that I get to run around on different playgrounds, playing tag with the athletes, the nerds AND the preps.
