Contact Us

PO Box 3201
Martinsville, VA 24115
United States

Stephen H. Provost is an author of paranormal adventures and historical non-fiction. “Memortality” is his debut novel from Pace Press, set for release Feb. 1, 2017.

An editor and columnist with more than 30 years of experience as a journalist, he has written on subjects as diverse as history, religion, politics and language and has served as an editor for fiction and non-fiction projects. His book “Fresno Growing Up,” a history of Fresno, California, during the postwar years, is available from Craven Street Books. His next non-fiction work, “Highway 99: The History of California’s Main Street,” is scheduled for release in June.

For the past two years, he has served as managing editor of an award-winning weekly, The Cambrian, and he is also a columnist for The Tribune in San Luis Obispo.

He lives on the California coast with his wife, stepson and cats Tyrion Fluffybutt and Allie Twinkletail.

On Life

Ruminations and provocations.

Is Twitter's downfall imminent? I sure hope so.

Stephen H. Provost

Twitter lost 2 million monthly U.S. users in the latest quarter – 3 percent of its total.
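
(For anyone curious about the back-of-the-envelope math, that figure implies a monthly U.S. user base of roughly 67 million. A quick sketch in Python, assuming the 3 percent figure is exact rather than rounded:)

    # Rough check of the implied user base; assumes the reported
    # 3 percent figure is exact rather than rounded.
    lost_users = 2_000_000
    share_of_total = 0.03
    implied_total = lost_users / share_of_total
    print(f"Implied monthly U.S. user base: about {implied_total:,.0f}")  # ~66,666,667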

I’m not exactly doing cartwheels over this, primarily because, at my age, attempting one would be downright dangerous. It did, however, make me smile.

There are things you do because you want to, and there are others you do because you have to.

For me, Twitter has always fallen into the second category. I pretty much have to have some presence there because I’m part of the communications business. Journalist. Author. If you’re in either game these days, you need all the exposure you can get.

But Twitter is, to me, what eating my veggies was to my 7-year-old self. It’s something I do while holding my nose to avoid the bitter taste, because I’ve been told, “You must do this because it’s good for you.” Needless to say, that imperative makes it all the more unpalatable.

Veggies have grown on me but, unfortunately, Twitter hasn’t.

I’m not alone in my disdain for Twitter, even among writers and journalists, some of whom have dumped the platform altogether. For these folks, it’s just not worth it:

Last year, a fellow journalist, New York Times deputy Washington editor Jonathan Weisman, quit Twitter because he got sick of dealing with anti-Semitic attacks on the platform. It had become, in his words, “a cesspit of hate.”

Lindy West, an author and columnist, also bowed out, declaring Twitter to be “unusable for anyone but trolls, robots and dictators.” She concluded her piece in The Guardian with the words, “Keep the friends. Ditch the mall.”

CNN’s Alisyn Camerota realized she was “hanging out with people who find satisfaction spewing vitriol, people who spread racism, misogyny and anti-Semitism.”

The medium frames the message

Should we blame the messenger?

As Marshall McLuhan once said, “The medium is the message” (or “mess age,” as he sometimes quipped). I’m not sure I’d go that far, but the medium certainly frames the message, and Twitter’s 140-character format does just that … in such a way as to discourage people from thinking. Or analyzing. Or conducting any kind of in-depth dialogue.

Why does Twitter attract the kind of people who ultimately alienated Weisman, West and Camerota? Maybe because it encourages hit-and-run attacks rather than reasoned discourse. Sound-bite politics does the same thing – and is, unsurprisingly, dominated by similar attacks. If you don’t like negative campaigning, you probably won’t care for Twitter, either, because Twitter is all about campaigning.

The platform is dominated by celebrities and wannabrities (along with their fans and sycophants), who are there to promote their name or their brand. Donald J. Trump, celebrity turned politician, is the ultimate creature of the nexus between politics and celebrity that Twitter has become.

Trump’s ubiquitous presence on – and reliance upon – Twitter has confirmed my opinions of both: of Trump as a simpleton who’s deluded himself into thinking he can tackle complex policy issues in 140 characters, and of Twitter as the platform that empowers him (and people like him) to perpetuate such delusions.

High anxiety

This isn’t to say everyone who uses Twitter is a simpleton or a troll. My point is that the platform’s format attracts such folks, and like many others, I’m not comfortable in the kind of environment that creates.

As someone who’s generally unimpressed by celebrity, I don’t find that appealing. Besides that, there’s research indicating that using a large number of social media platforms just isn’t good for you. A study published Dec. 10 in Computers in Human Behavior found that the risk of depression and anxiety among people who used the largest number of platforms was more than three times that of people who used two or fewer.

That’s the last thing I need. At last count, I was active on Facebook (my primary platform), Instagram, Twitter and my blog. If I were asked to drop one, it would be a no-brainer to eliminate the one that seemed the most superficial, the least user friendly, the least interesting and the most, well, just plain mean.

That would be Twitter, folks. Where anxiety-inducing trolls and bullies are perhaps most prevalent.

Maybe other people are coming to the same conclusion, and perhaps that’s why Twitter’s user base – never remotely close to Facebook’s in the best of times – is starting to shrink. Maybe another part of it is Trump fatigue. Either way, I’m hoping users are sending a message by abandoning ship: It’s long past time for Twitter to change, and fundamentally, or die.

The Internet is our Matrix, and it's killing us

Stephen H. Provost

You take the blue pill, the story ends. You wake up in your bed and believe whatever you want to believe. You take the red pill, you stay in wonderland, and I show you how deep the rabbit hole goes.
— Morpheus to Neo in "The Matrix"

The Matrix has you.

Whether you know it or not – and if we stick with the film’s analogy, there’s a good chance you don’t – you’re in the process of becoming dependent upon a form of virtual reality that could just drive you nuts.

So, here’s your red pill.

If you’re reading this, you’re online, which means you’re hooked up to our 21st century approximation of the Matrix, a mechanism that has supplanted our traditional sources of … you name it: shopping, news, entertainment, information. You get the idea.

You’re more likely to find a newspaper – minus the paper – on your computer screen than on your front doorstep these days. You find movie times online now, too, along with the movies themselves. Netflix, anyone? YouTube? Who needs a TV weatherperson when you’ve got weather.com? And who needs a book when you can download the text to your Kindle?

Despite pockets of resistance, the Internet has become so pervasive that it’s starting to look like Standard Oil at the turn of the 20th century. That company became so powerful, and society so dependent upon it, that the Supreme Court ruled it was a monopoly and broke it up into 34 separate companies.

We can’t do that with the Internet, which, unlike Standard Oil, isn’t a single company. And it doesn’t work exactly the way a monopoly does: unlike a traditional monopoly, it hasn’t limited our options, it’s broadened them exponentially, providing access to more streams of information and entertainment than ever before.

Growing dependence

I love that about the Internet, and a lot of other people do, too, which is why it’s become so successful.

Yet in doing so, it’s also become nearly indispensable, and there’s the rub. Even as it has broadened the number of options at our fingertips, it’s narrowed our means of accessing them. The more brick-and-mortar stores close, the more we’re reliant on Amazon and its brethren. The more newspapers shift their focus online, the more we have to shift our focus there, too. The more “streaming” video becomes a thing, the more we rely on it for our entertainment. And so it goes, right on down the line.

National security experts long ago started worrying about our growing dependence on the Internet. Back in the days when MySpace was still a thing, they began warning that even warfare would shift from traditional battlefields to online cyber-skirmishes involving black hats, white hats and a whole new form of espionage.

Turns out they were right. Russian interference in our political process is merely the most blatant example of a problem that’s been simmering for a long time involving hackers on the one hand and security experts on the other, each trying to stay one step ahead of the other.

This involves continual – and rapid – change, something human beings aren’t always comfortable with.

Information overload

Yes, change is good, but constant rapid change puts people in a continual state of anxiety, slaves to a fight-or-flight response that feels like it’s always on the verge of kicking in.

Ever wonder why so many people resist moving away from fossil fuels and toward alternative forms of energy? It’s not because they like pollution or want climate change. It’s not even just about jobs or industries, although that’s a part of it. Fundamentally, it’s about security. We develop habits and, no matter how much we strive to embrace innovation, there’s a part of us that resists it for no other reason than “we’ve always done it this way.”

More to the point, we know how to do it this way.

There’s a tendency to dismiss resistance to change as backward or ignorant, but there’s far more to it than that. It’s a natural human defense against the kind of upheaval we’ve experienced as we’ve become more and more dependent upon the Internet – where rapid change is the rule rather than the exception.

We’ve moved out of the information age and into the age of information overload. I’m not just talking about the proliferation of choices the Internet has offered us. Those are, after all, still choices. There might be millions or even billions of websites out there, but we tend to find those we like and stick to them (insulating ourselves in the process from opinions that don’t jibe with our own, but that’s another story).

Not-so-brave new world

Still, we can’t always choose to shield ourselves from information overload, or the anxiety it causes.

One simple example: The demand that we continually change (and remember) multiple passwords as a means of shielding ourselves from identity theft, computer viruses, etc. It’s not like the old days, when you taught your child to remember his home phone number, which never changed unless you moved to a different city.

That’s stressful, and it’s just the beginning.

Add to that the stress of staying on top of Internet marketing techniques, whether you work for a major company or are in business for yourself. Google, Facebook and so forth are continually tweaking their algorithms, so marketers have no choice but to respond. A generation or two ago, you took out an ad in the newspaper, on radio or TV, then measured the results in terms of how many shoppers turned out and how big a sales boost you got. Simple cause and effect.

Now, you aren’t limited to those three marketing options, which are largely secondary anyway. Online, you have to market your product via Facebook. And Twitter. And Instagram. And Snapchat. And Pinterest. And LinkedIn. And Amazon. And Goodreads. And on and on and on. Each of these platforms has different rules, different systems to learn and different ways of maximizing page views.

(As an author active in promoting my work on all those platforms except for Snapchat, I know what I'm talking about.)

Once you’ve mastered those rules, you’ve got to test them by figuring out where those clicks are coming from, along with the demographics of real and potential customers.

You’ve got to use the proper metadata and keywords. Then, once you’ve got all that in place, you’ll need to measure the performance of text vs. images vs. videos at attracting a user’s attention within a milieu of never-ending options. Are users staying engaged? Are they returning? You’ve got to measure those things, too.
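
(To give a sense of what that bookkeeping looks like in practice, here’s a minimal sketch in Python. The post data and field names are hypothetical, standing in for whatever a given platform’s analytics export actually provides:)

    # Hypothetical example: comparing content formats by click-through rate.
    # Real numbers would come from each platform's own analytics export.
    from collections import defaultdict

    posts = [
        {"format": "text", "impressions": 1200, "clicks": 36},
        {"format": "image", "impressions": 1500, "clicks": 90},
        {"format": "video", "impressions": 800, "clicks": 64},
        {"format": "image", "impressions": 600, "clicks": 30},
    ]

    totals = defaultdict(lambda: {"impressions": 0, "clicks": 0})
    for post in posts:
        totals[post["format"]]["impressions"] += post["impressions"]
        totals[post["format"]]["clicks"] += post["clicks"]

    for fmt, t in totals.items():
        ctr = t["clicks"] / t["impressions"]
        print(f"{fmt}: {ctr:.1%} click-through rate")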

We’re not built that way

Suddenly, you’re light years away from relying on folks to page through the Sunday paper at their leisure, providing solid customer service when they visit your establishment and hoping they’ll spread the word.

Pretty soon, nearly all your time is being taken up adapting to ever-changing systems, processing information and analyzing the results. Then starting all over, virtually from scratch, when some algorithm-writer changes the rules.

In an atmosphere where marketing is king, queen, prince and pauper, there’s little time left for actually creating the product you’re supposed to be selling in the first place. The process is king, and the product takes second place. Heck, just getting through the process is a challenge – one that often demands a greater degree of multitasking.

Despite what this word might suggest, our brains aren’t built to multitask. If they were, texting and driving would be no big deal.

"If you have a complicated task, it requires all your attention, and if you're trying to spread your attention over multiple tasks, it's not going to work," David Meyer, cognitive scientist at the University of Michigan, said in an article by Joe Robinson titled The Myth of Multitasking.

The fact is, we can’t think straight when we try to process too many things at once. Our memories suffer. Our ability to think creatively – which involves things like daydreaming, brainstorming and joyfully exploring the world around us – is stifled. We're actually 40 percent less productive.

We become stressed-out automatons, more prone to breaking down thanks to hypertension, fatigue and burnout. But we've fashioned a world for ourselves where we seem to have little choice. It's a world of diminishing product value, increasing health problems and rising frustration – a world where style hasn’t merely surpassed substance, it’s supplanted it.

Welcome to the assembly line. Welcome to the future.

The Matrix has us all.

You ought to be setting aside large chunks of time where you just think. Einstein was not multitasking when he was dreaming up the special and general theories of relativity.
— David Meyer, University of Michigan cognitive scientist

Bernie Sanders no slave to the McGovern Effect

Stephen H. Provost

Some Democrats are still scared of George McGovern. They look at Bernie Sanders, and they see someone “too far to the left” to win the general election.

That’s the conventional political wisdom. But keep in mind that this same “conventional wisdom” all but guaranteed that Hillary Clinton would be the nominee in 2008 and dismissed the notion of Donald Trump being anything but a flash in the pan this year.

Even Nate Silver’s analytics-driven FiveThirtyEight was flat wrong (along with a lot of other pundits) in predicting that Clinton would win this year’s Michigan primary handily – probably the most badly bungled prediction since “Dewey defeats Truman.”

Political punditry isn’t exact, and it’s not a science.

Sometimes, it’s nothing more than spin: advocacy disguised as analysis.

Other times, the pundits are so full of themselves they believe their own “infallibility” hype. They get cocky, and they get it wrong.

And often, they’re wrong about the future because they’re wrong about the past. Certain assumptions are just repeated ad nauseam on cable TV until they become a sort of political gospel.

This is where the McGovern Effect comes in.

Ever since the Democrats nominated “peace candidate” George McGovern in 1972 – only to watch Tricky Dick Nixon annihilate him in the General Election – they’ve been deathly afraid of history repeating itself.

Nominate someone too far to the left, and it’ll be another massacre. So the conventional wisdom says. Just look at liberal Mike Dukakis, who failed to work any Massachusetts miracles against George Bush I.

It’s the gospel truth.

And because a lot of Democrats today have accepted that gospel, they look at Bernie Sanders and see George McGovern staring back at them. They look in the other direction, at Hillary Clinton, and they see a last name that’s shared by a relatively moderate two-term Democratic president.

No brainer, right?

Go with what works.

Except they’re so worried about history repeating itself that they’re ignoring a more recent, more telling precedent. All they have to do is look across the aisle.

Reagan's revolution

Four years after McGovern lost in that landslide, a Republican challenged the incumbent president from the right and nearly beat him. That challenger was, of course, Ronald Reagan – who scared establishment Republicans out of their wits. He was too conservative, they thought. They remembered what had happened to Barry Goldwater in ’64 when he won the nomination from the far right: LBJ had destroyed him in the general election, just as Nixon later buried McGovern.

The GOP establishment breathed a sigh of relief when they saw incumbent Gerald Ford hang on by the skin of his teeth to defeat Reagan … only to watch him lose to Jimmy Carter in the general election.

We all know what happened four years later: Reagan won the nomination on his second try and defeated Carter for the presidency.

Historically speaking, Sanders resembles Reagan a lot more than he does McGovern. Or Goldwater. Or Dukakis.

For one thing, like Reagan, he’s generating the kind of excitement his primary opponent can’t match. Hillary Clinton is about as exciting as Gerald Ford was – without the clumsiness but with a whole lot more political baggage. Would Reagan have carried enough enthusiasm into the general election to beat Carter in ’76? We’ll never know. But we do know he beat him four years later.

By then, Ford was out of politics and Carter was a wounded president, crippled by a sluggish economy and the Iran hostage crisis.

That made him vulnerable – in much the same way the Republicans are vulnerable this year. Will the Republican nominee be Donald Trump or Ted Cruz? It hardly matters. In either case, the Democrats will face someone with the kind of anemic approval ratings that resemble Carter’s a lot more than Nixon’s.

The opposition

This is where the McGovern Effect breaks down even more.

In Nixon, McGovern faced an incumbent who was highly popular at the time among everyone except the far left. Naturally, the far left voted for McGovern, and everyone else chose Nixon.

The same held true for Goldwater and Dukakis, both of whom were victims of strong opposition far more than their own ideology. Goldwater was up against the heir to a charismatic president whose death was still being mourned a year after his assassination. And Dukakis’ opponent, the first George Bush, was Reagan’s chosen successor. Kennedy and Reagan: the two most iconic presidents of the second half of the 20th century.

Somehow, the names Trump and Cruz just don’t have the same gravitas.

On top of this, Sanders also has an advantage in social media that McGovern could never have conceived of.

Does this mean Sanders’ nascent revolution is destined to repeat the Reagan revolution’s electoral success?

I’m not going there.

What I will say is that anyone who dismisses Sanders as a viable Democratic candidate based on the McGovern Effect is ignoring some powerful evidence that points in the opposite direction.

“Destiny” and “inevitability” are the language of pundits who crow about their predictions and then end up eating it. The crow, that is.

A sparrow might just tell another story.

We’ll have to wait and see.