Sunday, March 18, 2012

I'm Surrounded by Sweets -- and I Know it

From the Web comic xkcd.com
Nope, I am not that person in the xkcd comic above. I am one of those Internet users who is more apt to be distracted by a good LOLcat than by a troll's inflammatory statements on a blog somewhere.

I am one of the people Evgeny Morozov's book talks about -- someone more likely to use the Internet for entertainment than for furthering some set of social beliefs. I generally think of myself as a positive person, but this class has shown me that I fall on the pessimistic side of the fence more often than not. Is this because I'm gaining all that wisdom they said I'd get in my 30s? Somehow, I don't think so.

This time, however, the reading for class is written from a skeptical or dystopian standpoint. And, big surprise, I agree with Morozov's assessment that the Internet is not the ultimate democratizing tool a lot of Western thinkers believe it is. His isn't an argument against closing the digital divide by any means, nor is it an argument against globalized access to the Internet, as some may read his pessimism. Instead his view, in my words, is summed up as "it is what it is."


What I found most interesting was an obvious tactic that I'd never thought about -- an authoritarian or repressive government not censoring its Internet, but instead actively creating content for it (or not limiting access to it) to distract the masses. I had never connected the prevalence of questionable or pornographic websites in otherwise pretty restricted societies with an effort to further those restrictions; I always just chalked it up to that society's fragmented foundation. It's a pretty ingenious way of pitting human nature against the humans - don't take anything away from the people, just give them more of the bad stuff; most won't seek out the healthy food if they're surrounded by sweets.

What I wish Morozov had done was offer more than broad conclusions. I'd like his take on what the government should do (if anything), or who should be part of the Global Network Initiative and what he thinks its main goals should be. He addresses some of these questions in a talk he gave (along with Clay Shirky) at Brown University. It was a quick answer to a student question at 1:05:30 (a long talk, so scroll ahead), but it's the kind of thing I'd like more of.

I know that one of my biggest shortcomings is apathy -- I would not stay up and respond to that person who was "wrong" on the Internet the way the subject of the comic does. I'd merely shut my laptop, mutter "idiot" under my breath, and move on. I'd like to think that's what my kindred spirit Morozov would do, too, but considering he already had responses to his critics ready to publish in the paperback version of The Net Delusion, I'm sure he'd sock it away to use as an anecdote in a future book.





Sunday, March 11, 2012

New Hope for the Digital Future

I may be changing my opinion. How open-minded and scholarly of me, no? I credit Chapter 4 of John Hartley's "Digital Futures for Cultural and Media Studies," entitled "The distribution of public thought." You'd have thought that Chapter 3, "Journalism and popular culture," would have spoken to this recovering newspaper person, but it was Chapter 4's discussion of public thought, what it means, and why having more of it isn't akin to societal regression that made me perk up and say "well... yeah."

Another thing from this week that immediately came to mind while reading this book was the whole Kony 2012 business. The video, which began circulating widely on social media outlets early last week, prompted a response in me. And when it was over, I began to think about that response. I had reacted exactly how the video wanted me to react, which made me suspicious.

Hartley contends that more public thought will likely lead to "better" times ahead (p. 96):
"In terms of history, more of anything worthwhile has never meant worse -- more education, healthcare, affluence, freedom, comfort, intellectual, or entrepreneurial activity, or whatever, has consistently resulted in, well, more. . . . Extending once priestly or royal privileges to everyone benefits . . . everyone."
So extending this idea to public thought of course means that, yes, we'll have more people thinking publicly (in Internet forums, blogs, online news content, etc.) to wade through, but with that comes more quality thought to stumble upon.

So, enter the viral Kony video. Twitter and Facebook exploded, as was the intent, with sympathetic shares and retweets. But what was most refreshing -- even comforting -- to me was that just as quickly as word of the video spread, so did the fact-checking and other-side-of-the-story shares.

Hartley suggests that we may be entering the next "Gutenberg" era in communication -- a change so big it only happens once every 500 years or so -- and I'm inclined to believe him. As Hartley says, it took a while for the full effect of the printing press to be seen; it will likewise take a while for the implications of social and digital media to be felt. Right now, the populace is learning how best to deal with them. Out of all the noise, a smaller number of clear, focused voices will rise above the rest. I think we caught a glimpse of this over the past week, with the Kony story being wildly shared and then the "other side" being just as wildly broadcast.

So, as much as I still bristle at the phrase "citizen journalism," I've softened to the idea of any idiot being able to publish anything on the Web. It'll take a while for us to learn how best to navigate this new world and all of its opinions, but I'll hang my hope on the "practice makes perfect" hook and reserve my cynicism.

If more than a fleeting awareness of the myriad complicated issues in Africa comes from the viral spread of #Kony2012 -- wonderful. But I hold on to enough cynicism to think that fleeting might be just the term that applies. At the very least, Cultural Studies scholars will have a new term to discuss.



Wednesday, March 7, 2012

Here, by Popular Demand

Some of you requested screenshots of "Slick Willie," a game I remember from my college years during the second Clinton administration. Here you go, loyal readers: not just a screenshot, but a video courtesy of Classic Mac Gameplays:

The original "Slick Willie." Clinton's body-less head appears toward the top left, avoiding Ross Perot.

And here you can see the game in action:


Sunday, March 4, 2012

Puzzles and Ammunition

It's funny how each week brings a new book and a new perspective for me. You know that phenomenon where, say, your friend buys a new car and you suddenly begin noticing other folks driving that same type of car? Of course that model hasn't exploded in popularity; you've just been exposed to it and are now more aware. I think that's what's happening to me.

I came home from Knit Night (admittedly more socializing than knitting) last week to find the husband attached to our TV via headphones and other computer accessories. It was gaming night with "the boys," a relatively recent restart of a weekly ritual. "The boys," as only I lovingly call them, used to work together in the same IT department at a university back home. They've mostly gone down different career paths now, but can still come together in the camaraderie of shooting each other over the Internet and requesting backup, no matter where they live.




As I sat down that night and began reading "How to Do Things with Videogames" by Ian Bogost, I couldn't help but laugh at the timing.

Bogost's book, which breaks down the discussion of games into aptly named chapters such as "Reverence," "Pranks," "Texture," and "Habituation," gave me a new way to look at games, but didn't act as a spotlight on a previously unknown world to me. After all, I've lived with someone who programs (and who is sometimes connected by wires to various machines) for close to 12 years now. I've seen how videogames bring friends together as a group, and how they can suck an otherwise social personality into a storyline of single-player-ness. I suppose I am the Jane Goodall of gaming, always observing but never being a true "gamer." I'm what Bogost would call a casual gamer, one who doesn't want to read directions and wants to be fairly successful from the start of the game. Tetris, that's my kind of game. I can shut it off and not return for years. It's addictive and forgettable, and I don't have to spend time developing a character or learning a storyline.

As Bogost concludes his book, he ponders a world where the term "gamer" isn't used anymore; a world in which it won't be a special enough hobby or activity to warrant its own term. I think this is just like any other medium. Take books, for example. Some read trashy romances as an escape while others reach for nonfiction to learn and gain historical perspective. I wouldn't place readers of the "Twilight" series in the same camp as those who pick up the latest from David McCullough, but both are simply called readers. (Full disclosure: Both of these examples are on my bookshelf at home. I won't bother defending myself on the "Twilight" matter; you wouldn't believe me anyway.)

I think we're closer to that day than Bogost's book might make it seem. True, when you say "gamer," the vision of a kid with a backward baseball cap and unfortunately bleached or spiked hair comes to mind, but when you press a little further, each of us has been lost to a game of Solitaire or Nintendo's first Super Mario Bros. As we can look back on books we've loved with nostalgia, so too can we look back on videogames. From my generation on, videogames are going to be a new medium by which we can mark periods of our lives (childhood of the '80s: the NES's Mario Bros. and Duck Hunt; college in the '90s: "Slick Willie," which featured Bill Clinton's head flying around eating cheeseburgers while trying to avoid various Capitol Hill personalities. I'm not kidding).

I don't think this is an especially surprising or controversial idea. I think it simply is what it is. Videogames allow us to escape, become engaged, solve puzzles, and even interact with our friends while doing it. Just as we decide how we will enjoy books, or TV programs, or music, we'll decide what kinds of games we like to play. As technology progresses, we'll have more items to add to this list; videogames are just the most recent. As I look to the future, I see myself continuing to play puzzle games, while Larry -- to use that archaic term, the "gamer" -- will likely be talking to "the boys" about weapons and arrests well into his silver-haired years.


Sunday, February 26, 2012

Back on the Psychiatrist's Couch

Nights like tonight make me look back at my life as a newspaper person and let out a long, slow sigh of relief. The Daytona 500 was supposed to take place today, in addition to the Oscars and an NBA All-Star game in Orlando. Last post I dazzled you with Twitter screenshots. Well, dear readers, today I will use my Facebook friends' posts to illustrate my points.
I used to work with Bruce and commenter Dave in Daytona. We all left the paper at different times for greener pastures, but have since reconnected on Facebook. At least some of us have sympathy for those left behind.


The social media scholar danah boyd points out in "Can social network sites enable political action?" that most people use social media to connect with people they already know. That's what happened tonight. I didn't have the race on, nor will I tune into the Oscars most likely, but I'm connected through Twitter or Facebook to friends who will be tuned into these things. How did I find out the race was postponed? Facebook (see?).

George is a good friend who still works in the Sports department in Daytona. It's nights like tonight that I look to the wonders of Social Media to reconnect me to my old stomping ground.

Our readings for this week ran the gamut of Social Media issues, but I've loosely placed each in one of two camps:
  1. Who uses it, how often, and why? 
  2. Can it be used for change or to change minds?
What I wish more of the articles had addressed were the implications of what they'd studied. Granted, most were a few years old (a few were 2010-ish), but after reading them, I was often left with the feeling of, "Well, yeah. Of course." I wondered why most of the studies thought that human nature might change when technology is put into the mix.

For example, "The Network in the Garden: Designing Social Media for Rural Life," by Eric Gilbert, Karrie Karahalios, and Christian Sandvig, looked at rural vs. urban social media users. It turns out that rural users matched the researchers' expectations on most counts: they had fewer "friends" (as in Internet contacts, not actual in-person friends), those "friends" were geographically closer to the users, more users were women, and more profiles were kept private.

Another example is "Dynamic Debates: An Analysis of Group Polarization Over Time on Twitter," by Sarita Yardi and Danah Boyd (you'll see her name a lot in social media research - sometimes traditionally capitalized, other times all lowercase). They studied whether Twitter users of a particular slant or viewpoint would engage mostly with users of similar views or would seek out alternate viewpoints. Not surprisingly to me, they found that the old saying holds true: "birds of a feather flock together."

As I sit here tonight thinking about social media and its users, I just have to look at my own profile pages to see the results of these studies. And I can't help but think these articles studied the wrong things. Social science research has been going on for decades now, and the human condition is fairly well researched. Why is it so surprising that we'd "follow" or "friend" people who share similar interests and beliefs? Or that rural users would be more cautious or private?

I don't think we should study social media as this new and strange thing, putting its users under microscopes or back on psychiatrists' couches; technology is an extension of humanity - it's not going to change how we behave. Rather, I think it will capture our behavior like a snapshot and keep it for posterity. Those are the implications that should get some attention.

The readings for this week essentially concluded that people use social media for various reasons, including narcissism and news dissemination. Who we "friend" and "follow" tells a lot about a person. Tonight, social media tells me that I didn't need to watch the Oscars (or the 500, had it not been postponed); that's what I have friends and followers for. And they're telling me Christopher Plummer is having a good night.


Sunday, February 12, 2012

Reconciling skepticism and optimism in a post-paper life

Dan Gillmor would be proud. Or maybe he'd just roll his eyes and say that I used common sense, which should be expected and not applauded. Whatever the case (I'll find out Monday when he comes to our class to chat), when I first heard that Whitney Houston had died, it was on Twitter -- and I doubted it. I first saw it in a tweet from a former newspaper coworker -- someone who's certainly no dummy, much more social- and digital-media adept than I, and a person from whom I'd definitely trust a report.







I still went to three other news sources and checked the originating AP tweet myself before adding my own two cents to the celebrity death 'verse.







I suppose I was Mediactive, a term Gillmor coins in a book-blog-website hybrid release that aims to push the public to be more involved in and educated about the news it digests and creates. I read a tweet this morning claiming that Twitter broadcast the news of Houston's death 27 minutes before other media. Is that true? I didn't research it much, but from what I know of Twitter and its propensity to spread both real and fake news at lightning-fast speed, I at least give it credence.

This morning brought more news from another former newspaper coworker via Twitter:






As soon as I saw it, I knew what Michael meant and that he, too, was skeptical of this news -- news I hadn't seen yet. I quickly checked other news sites. No, Keanu Reeves is not dead, but he was for a while to a number of folks on Twitter. I also checked the #KeanuReeves hashtag itself, which was already rife with the word "hoax." Whew.

This showed me that in my little circle of former newspaper folks, we were skeptical of some news and spread other news quickly through a nearly instant medium. And when I think of that circle of what I call journo-friends or N-J peeps (from our time at the Daytona Beach News-Journal), I realize most of us are officially out of the business these days (but that's a whole other story about the newspaper industry...). Reading "Mediactive" raises all sorts of questions about that idea. Are we ever really "out" of the business? It doesn't look like it.

While I read this book, I could hear the jeers, sighs, and fist-pounding with which my newsroom colleagues would have greeted his ideas: "Online media reports can't be trusted and they certainly have no place in print!"; "Give away our content for free?!"; "Let the readers contribute!?" Crazy ideas. Ideas that I now have to consider integral parts of digital media's future.


Admittedly, I'm still trained to bristle at the mention of "citizen journalism." But now I'm that citizen, and I must force myself to look at the media from this new vantage point: my couch, with all the other folks. I'm fairly certain that I'm slightly more intelligent than those "other folks," but isn't that the thought that got parts of the media into the quandary they're in now -- hubris? As a working-in-the-industry journalist, I had training that most people didn't, access that most people didn't, and power that most people didn't. I had the power to write a headline, to decide what story ran from a practically bottomless pool of possibilities, and, maybe more importantly, to decide what didn't run. Now my power lies in my ability to research and filter good news from bad. I'd like to think I can do that better than the average non-journalist Joe, but I bet I'm wrong.

Gillmor puts forth an optimistic view of the media landscape; he's confident that, together, we can bring out the best of our knowledge and journalistic instincts. I'm working on being an optimist. Ten years in a newsroom is a lot to work through.

Sunday, February 5, 2012

Lots of Information

We read "The Information" by James Gleick for this week. The book's sheer size matches the sweep of its content; from drum beats to Wikipedia, it covers the rise and transformation of information through the ages.

Like most readers, I like a good story. I wish Gleick had sewn his ideas together a bit more and told a story instead of presenting us with history lesson after history lesson. However, I bet the latter was his intent. By constructing the book the way he did, he made his point: there's a lot to take in and to understand, and it's a complicated journey from drum beat to byte.

An interesting question to ask after digesting this read is, "What are we going to do with all of this information?" I don't mean the book itself, though it is daunting and even dubbed "aspirational" by a New York Times reviewer. I mean with everything being in the cloud now, or at least on its way, what's next? Sure, we can catalog all that is for future generations, but then what? Will they continue to do the same? Will Wikipedia never end, having every page on every gas station archived for eternity? I envision a time when there's no cloud because everything is the cloud - TV, mail, all our personal records, banking transactions - all that stuff. I suppose all that information will still be stored then, too, but in a way I can't even fathom.

But that's the point of "The Information" -- to show me that science and technology will answer my question for me. A new theory, invention, or idea will come along to make me realize why all of this information is a good thing and to show me what future generations can do with it.

I wish Gleick had talked about the economic impact of all this information (perhaps it will be in the sequel?). It costs a lot of money (for now) to store huge quantities of anything -- one reason cloud computing has become so popular. Granted, as technology progresses, so will its availability, but what does that mean for the moment? Or for countries that don't have the infrastructure to catalog their "right now" as tomorrow's history?

Maybe Wikipedia will catalog the important things for those who can't type them in right now. When it comes to that anyone-can-edit online encyclopedia, I'm an inclusionist (though most of this post may make me sound like a deletionist). Let's write it all down and let technology catch up with us. Let's acknowledge that we have too much information right now and see what the mathematicians, philosophers, and scientists of today and tomorrow can invent for the next big step.