Sunday, April 22, 2012

Not What, But How

I still believe that the Internet is a tool. Yes, it allows us to do things differently, more quickly, or more visually, but it still allows us to do what we would have done anyway.

While reading for Monday's class discussion, two of the articles -- "News & the news media in the digital age: implications for democracy," by Herbert J. Gans, and "Twitter: Microphone for the masses?" by Dhiraj Murthy -- made me think back to a very unrelated event: my mother's only comment to me on Facebook. Ever.

While I'm sure the goal was to think of lofty ideas about technology helping foster democracy around the world, or giving a voice to millions who would otherwise have had no way to publish their ideas, I think about how technology hasn't really affected how I communicate. And then I began to ponder whether it ever would change how anyone communicates.

What sparked this was my mom's only post ever on my Facebook status. She's not at all sure about Facebook, having accidentally signed up through both her work and home email addresses (but we worked that out over Christmas break one year). So to my surprise, one Veterans Day when I posted a little shout-out to all my veteran friends and my spouse, out of nowhere Mom chimed in and said not to forget "your dad's brother, Harry, who served in the Air Force." Wow. Well, first: I never met "your dad's brother, Harry." He was killed when my father was only 12. Second, how in the world did my mother decide that that was the day to finally log in to Facebook and start reading her news feed?

(Side note: I was going to track down this post and put a screen shot of it here, but with the horrid new Facebook timeline, I can't find it. So instead, you get this picture of my cat Lou. More on him later.)


So while I'm now armed with a Twitter feed where, in 140 characters, I can spread news, thoughts, ideas, and musings, it doesn't really provide me the opportunity to do or say something I wouldn't have said before. And while I agree that it gives a platform to people who need to make political statements or news events known, it doesn't give them the impetus -- that's something we're born with, and it comes from a much more human source. Twitter doesn't give a voice so much as it gives a microphone to the masses. The same goes for Facebook. I don't post anything I wouldn't say to most friends just because I'm armed with an Internet connection; it just allows me to interact more easily with a group of friends that is geographically diverse.

My mother's post proved to me that our intentions and inclinations as human communicators likely won't change with the medium. No matter what story I was telling, Ma would be sure to pipe in with an element she thought I was missing, or elaborate on a family connection I didn't explain fully. Even in her one foray into Facebook Newsfeed-land, she responded as she would have if we were in the same room.

Facebook has done one thing for me -- it's helped point out my failings as a pet owner. Lou had to have a tooth pulled on Friday, something I feel I should have noticed a few weeks ago. Thursday night I noticed that his always-present snaggle tooth looked shorter, which prompted the early-morning vet run. Looking through Facebook pictures on Friday while he was at the vet, I could see it had been that way for a while. In Facebook pictures, I had evidence that I was an unobservant pet owner. Lou's fine now. He's getting his medicine, kitty "Advil," and a lot of soft treats. Thank you, Facebook. Thanks for allowing us to pass along guilt -- be it in the form of motherly asides or pet-owner failures.

Sunday, April 1, 2012

A Lawrence by Any Other Name

Score one for Amanda.

After living with a computer-inclined person for more than 12 years now, I've let my knowledge of the subject take a back seat to the more capable hands of Lawrence, "Larry" the Computer Guy. It's not that I'm not technically inclined, I'm just not -as- technically inclined as some. So, as ducks take to water, I drift toward the creative and Larry drifts toward bits of code and hardware.

But today, my friends, while armed with "Remix," by Lawrence Lessig, I found out something the "computer smart one" didn't already know. Mind you, it was something that could be classified as trivia, but still: I was the first in my household to know how Apache got its name (see page 164 if I've piqued your interest). Sometimes it's the small victories that keep us going.

Let me change gears slightly from self-congratulating mode.

Never before had I thought about copyright laws for so long. It's been the big assumption for most of my professional life: Make sure you have permission to run that photo, to run that story, to run that image. Granted, I was working at a newspaper, so a lot of Lessig's ideas don't necessarily apply, since we were for-profit. The future I could imagine with Lessig's proposals to overhaul our current system actually inspires me. The lightbulb on the creative side of my brain lit up while reading this book -- yes, people should be able to share and create in the spirit of culture rather than the spirit of capitalism. For once this semester, I'm actually optimistic about the effect the ideas in this book could have.

Culture, technology, ideas -- these all grow from sharing and communicating. InnoCentive knows that more minds and contributions are better than fewer, and it has the results to prove it. Those results have brought real changes to the world, so why not apply the same mode of thinking to books, music, and art?


Mind you, there's still that tiny voice in the back of my brain telling me that greed and the profit motive will get in the way, but it's a smaller and softer voice than the one my kindred spirit Evgeny Morozov stirred. I remember how angry and litigious Metallica became at the height of the Napster scandal in the early 2000s. That's certainly going to happen again.

But I think of Lessig's example of Harry Potter. Because kids (or anyone, really) were allowed to take clips, artwork, and so on from the books, the brand became more valuable. It was shared as much as it was embraced by its audience. This could happen to any creative endeavor, if it's worth sharing. Yes, there will always be the issue of compensation, but I believe people will pay for the good stuff. Misters Hammett and Hetfield, I never stole your music. And there was a good reason for that.


Sunday, March 25, 2012

Privacy in the Dappled Shade

When thinking about privacy this morning, my thoughts went back to my old house in Florida with my beloved Chickasaw Plum tree in the back yard. Despite being in the city, the back yard always felt like a private nook.


It's funny to start this day thinking about privacy. It doesn't seem like something I should type out, but rather words I should scribble into a journal I keep tucked securely beneath my pillow. I suppose I define privacy as my ability to control what I keep close and what I choose to show to the sunshine. And it's that dappled sunlight, as the sun shines through trees and windows, that makes up the different shades of what is shared; the bright spots where the sun is unfiltered are more akin to a Tweet, while the filtered light coming through a window is a dinner conversation, or possibly a Facebook post.

danah boyd and Alice Marwick talked to more than 100 teens in 20 states over a four-year period to get their thoughts on privacy. Apparently my view on the subject isn't that different from those of Jeromy and Meixing, ages 14 and 17, respectively, which boyd and Marwick sum up here (from page 4):
"Both of their approaches to privacy highlight how privacy isn’t simply binary – access or no access – but, rather, control over how information flows or, in other words, control over the social situation."
I had to go back a few years, but this article reminded me that the appeal of Facebook and Twitter to teens is nothing new. Teens themselves are worried about privacy, just not in the same way an older adult might be. I found it interesting that one teen, Hunter (page 18), became annoyed when people he wasn't "addressing" would comment on his status updates on Facebook. To him, the content of his status update and the tone in which he wrote it signaled the audience he was targeting -- and if you weren't in that audience, he thought it was rude of you to comment on it. That makes little sense to me, as the post was made publicly on the Internet. I think teens assume there's more privacy on Facebook because of the norms of their real-life social network, and forget that -anyone- (depending on the user settings) can read posts and flip through pictures regardless of whether it's appropriate behavior.

As boyd and Marwick point out, teens have a set of norms and mores within their social circles -- and it seems those norms stay constant even as the way teens interact changes over time. When I was in high school, you wouldn't go up to a group of people who weren't in your "circle" and start a conversation, or jump into one you'd been eavesdropping on. The same applies now, in a digital way. Teens have always sought out places where they can get together in groups, and now social networking sites are rivaling the mall as a place to hang out and socialize.

The trouble now is, you don't just have to worry about that group of catty girls hearing what you're saying, you have to worry about unknown groups reading or "digitally overhearing" everything you've posted or photographed without you ever knowing about it.

For now I will continue to Tweet as I normally would (it seems I'm very much like the Twitter users in this article, also by boyd and Marwick, who have definite topics they don't touch) and post to Facebook in my usual fashion. Yes, there are differences in how I use the two (and if you clicked on that article in the last sentence, you'll see most people do use them differently), but I'm comfortable with my name being beside what I put out there. And I think that's because I don't put everything in the sunshine. I prefer most of my communication with friends to be of the dappled variety.


Sunday, March 18, 2012

I'm Surrounded by Sweets -- and I Know it

From the Web comic xkcd.com
Nope, I am not that person in the comic from xkcd above. I am one of those Internet users who is more apt to be distracted by a good LOLcat than by a troll's inflammatory statements on a blog somewhere.

I am one of the people Evgeny Morozov's book talks about -- the kind more likely to use the Internet for entertainment than for furthering some set of social beliefs. I generally think of myself as a positive person, but this class has shown me that I fall on the pessimistic side of the fence more often than not. Is this because I'm gaining all that wisdom they said I'd get in my 30s? Somehow, I don't think so.

This time, however, the reading for class is written from a skeptical, even dystopian, standpoint. And, big surprise, I agree with his assessment that the Internet is not the ultimate democratizing tool a lot of Western thinkers believe it to be. Morozov's isn't an argument against closing the digital divide by any means, and despite how some may read his pessimism, it isn't an argument against globalized access to the Internet either. Instead his view is, in my words, summed up as "it is what it is."


What I found most interesting was an obvious tactic I'd never thought about -- an authoritarian or repressive government not censoring its Internet, but instead actively creating content for it (or not limiting access to it) to distract the masses. I had never connected the prevalence of questionable or pornographic websites in otherwise pretty restricted societies with an effort to further those restrictions; I always just chalked it up to that society's fragmented foundation. It's a pretty ingenious way of pitting human nature against the humans -- don't take anything away from the people, just give them more of the bad stuff; most won't seek out the healthy food if they're surrounded by sweets.

What I wish Morozov had done was offer more than broad conclusions. I'd like his take on what the government should do (if anything), or who should be part of the Global Network Initiative and what he thinks its main goals should be. He addresses some of these questions in a talk he gave (along with Clay Shirky) at Brown University -- a quick answer to a student question at 1:05:30 (it's a long talk, so scroll ahead), but it's what I'd like more of.

I know that one of my biggest failings is apathy -- I would not stay up and respond to that person who was "wrong" on the Internet as the subject of the comic does. I'd merely shut my laptop, mutter "idiot" under my breath, and move on. I'd like to think that's what my kindred spirit Morozov would do, too, but considering he already had responses to criticism ready to publish in the paperback version of The Net Delusion, I'm sure he'd sock it away to use as an anecdote in a future book.





Sunday, March 11, 2012

New Hope for the Digital Future

I may be changing my opinion. How open-minded and scholarly of me, no? I credit Chapter 4 of John Hartley's "Digital Futures for Cultural and Media Studies," entitled "The distribution of public thought." You'd have thought that Chapter 3, "Journalism and popular culture," would have spoken to this recovering newspaper person, but it was Chapter 4's discussion of public thought, what it means, and why having more of it isn't akin to a societal regression that made me perk up and say "well... yeah."

Something else that happened this week immediately came to mind while reading this book: this whole Kony 2012 business. The video, which began circulating widely on social media outlets early last week, prompted a response in me. And when it was over, I began to think about that response. I had reacted exactly how the video wanted me to react, which made me suspicious.

Hartley contends that more public thought will likely lead to "better" times ahead (p. 96):
"In terms of history, more of anything worthwhile has never meant worse -- more education, healthcare, affluence, freedom, comfort, intellectual, or entrepreneurial activity, or whatever, has consistently resulted in, well, more. . . . Extending once priestly or royal privileges to everyone benefits . . . everyone."
Extending this idea to public thought, of course, means that yes, we'll have more people thinking publicly (in Internet forums, blogs, online news content, etc.) to wade through, but with that comes more quality thought to stumble upon.

So, enter the viral Kony video. Twitter and Facebook exploded, as was the intent, with sympathetic shares and retweets. But what was most refreshing -- or even comforting -- to me was that just as quickly as the word spread about the video, so too did the fact-checking and other-side-of-the-story shares.

Hartley mentions that we may be at the next "Gutenberg" era in communication -- a change so big that it only happens once every 500 years or so -- and I'm inclined to believe him. And as Hartley says, it took a while for the full effect of the printing press to be seen; it will also take a while for the implications of social and digital media to be felt. Right now, the populace is learning how to best deal with it. With a lot of noise will come a smaller number of clear and focused voices that will rise above the rest. I think we caught a glimpse of this over the past week with the Kony story being wildly shared, then the "other side" being just as wildly broadcast.

So, as much as I still bristle at the phrase "citizen journalism," I've softened to the idea of any idiot being able to publish anything on the Web. I now think it'll take a while for us to learn how best to navigate this new world and all of its opinions, but I'll hang my hope on the "practice makes perfect" hook and reserve my cynicism.

If more than a fleeting awareness of the myriad complicated issues in Africa comes from the viral spread of #Kony2012 -- wonderful. But I hold on to enough cynicism to think that "fleeting" might be just the term that applies. I think at the very least Cultural Studies scholars will have a new term to discuss:



Wednesday, March 7, 2012

Here, by Popular Demand

Some of you requested screenshots of "Slick Willie," a game I remember from my college years during the second Clinton administration. Here you go, loyal readers: not just a screenshot, but a video, courtesy of Classic Mac Gameplays:

The original "Slick Willie." Clinton's bodiless head appears toward the top left, avoiding Ross Perot.

And here you can see the game in action:


Sunday, March 4, 2012

Puzzles and Ammunition

It's funny how each week brings a new book and a new perspective for me. You know that phenomenon when, say, your friend buys a new car and you suddenly begin noticing other folks driving that same type of car? Of course that model hasn't exploded in popularity, you've just been exposed to it and are now more aware. I think that's what's happening to me.

I came home from Knit Night (admittedly more socializing than knitting) last week to find the husband attached to our TV via headphones and other computer accessories. It was gaming night with "the boys," a relatively recent restart of a weekly ritual. "The boys," as only I lovingly call them, used to work together in the same IT department at a university back home. They've mostly gone down different career paths now, but they can still come together in the camaraderie of shooting each other over the Internet and requesting backup, no matter where they live.




As I sat down that night and began reading "How to Do Things with Videogames," by Ian Bogost, I couldn't help but laugh at the timing.

Bogost's book, which breaks down the discussion of games into aptly named chapters such as "Reverence," "Pranks," "Texture," and "Habituation," gave me a new way to look at games, but it didn't shine a spotlight on a previously unknown world. After all, I've lived with someone who programs (and who is sometimes connected by wires to various machines) for close to 12 years now. I've seen how videogames bring friends together as a group, and how they can suck an otherwise social personality into a storyline of single-player-ness. I suppose I am the Jane Goodall of gaming, always observing but never being a true "gamer." I'm what Bogost would call a casual gamer, one who doesn't want to read directions but still wants to be fairly successful from the start of the game. Tetris, that's my kind of game. I can shut it off and not return for years. It's addictive and forgettable, and I don't have to spend time developing a character or learning a storyline.

As Bogost concludes his book, he ponders a world where the term "gamer" isn't used anymore; a world in which it won't be a special enough hobby or activity to warrant its own term. I think this is just like any other medium. Take books, for example. Some read trashy romances as an escape while others reach for nonfiction to learn and gain historical perspective. I wouldn't classify readers of the "Twilight" series in the same camp as those who pick up the latest from David McCullough, but both are simply called readers. (Full disclosure: both of these examples are on my bookshelf at home. I won't bother defending myself on the "Twilight" matter; you wouldn't believe me anyway.)

I think we're closer to that day than Bogost's book might make it seem. True, when you say "gamer" the vision of the kid with a backward baseball cap and unfortunately bleached or spiked hair comes to mind, but when you press a little further, each of us has been lost to a game of Solitaire or Nintendo's first Super Mario Bros. As we can look back on books we've loved with nostalgia, so too can we look back upon videogames. From my generation on, videogames are going to be a new medium by which we can mark periods of our lives (childhood of the '80s: NES' Mario Bros. and Duck Hunt. College in the '90s: "Slick Willie," which featured Bill Clinton's head flying around eating cheeseburgers while trying to avoid various Capitol Hill personalities. I'm not kidding.)

I don't think this is an especially surprising or controversial idea. I think it simply is what it is. Videogames allow us to escape, become engaged, solve puzzles, and even interact with our friends while doing it. Just as we decide how we will enjoy books, or TV programs, or music, we'll decide what kinds of games we like to play. As technology progresses, we'll have more items to add to this list; videogames are just the most recent. As I look to the future, I see myself continuing to solve puzzle games, while Larry -- to use that archaic term, the "gamer" -- will likely be talking to "the boys" about weapons and arrests well into his silver-haired years.












Sunday, February 26, 2012

Back on the Psychiatrist's Couch

Nights like tonight make me look back at my life as a newspaper person and let out a long, slow sigh of relief. The Daytona 500 was supposed to take place today, along with the Oscars and an NBA All-Star game in Orlando. Last post I dazzled you with Twitter screenshots. Well, dear readers, today I will use my Facebook friends' posts to illustrate my points.
I used to work with Bruce and commenter Dave in Daytona. We all left the paper at different times for greener pastures, but have since reconnected on Facebook. At least some of us have sympathy for those left behind.


The social media scholar danah boyd points out in "Can social network sites enable political action?" that most people use social media to connect with people they already know. That's what happened tonight. I didn't have the race on, nor will I tune into the Oscars most likely, but I'm connected through Twitter or Facebook to friends who will be tuned into these things. How did I find out the race was postponed? Facebook (see?).

George is a good friend who still works in the Sports department in Daytona. It's nights like tonight that I look to the wonders of Social Media to reconnect me to my old stomping ground.

Our readings for this week ran the gamut of Social Media issues, but I've loosely placed each in one of two camps:
  1. Who uses it, how often, and why? 
  2. Can it be used for change or to change minds?
What I wish more of the articles had addressed were the implications of what they'd studied. Granted, most were a few years old (a few were 2010-ish), but after reading them, I was often left with the feeling of, "Well, yeah. Of course." I wondered why most of the studies thought that human nature might change when technology is put into the mix.

For example, "The Network in the Garden: Designing Social Media for Rural Life," by Eric Gilbert, Karrie Karahalios, and Christian Sandvig looked at rural vs. urban social media users. It turns out that rural users matched the researchers' expectations on most counts: they had fewer "friends" (as in Internet contacts, not actual in-person friends), those "friends" were geographically closer to the users, more users were women, and more profiles were kept private.

Another example is "Dynamic Debates: An Analysis of Group Polarization Over Time on Twitter," by Sarita Yardi and Danah Boyd (you'll see her name a lot in social media research -- sometimes traditionally capitalized, sometimes all lowercase). They studied whether Twitter users of a particular slant or viewpoint would engage mostly with users of similar views or would seek out alternate viewpoints. Not surprisingly to me, they found that the old saying holds true: "birds of a feather flock together."

As I sit here tonight thinking about social media and its users, I just have to look at my own profile pages to see the results of these studies. And I can't help but think these articles studied the wrong things. Social science research has been going on for decades now, and the human condition is fairly well studied. Why is it so surprising that we'd "follow" or "friend" people who share similar interests and beliefs? Or that rural users would be more cautious or private?

I don't think we should study social media as this new and strange thing, putting its users under microscopes or back on psychiatrists' couches; technology is an extension of humanity -- it's not going to change how we behave. Rather, I think it will capture our behavior like a snapshot and keep it for posterity. Those are the implications that should get some attention.

The readings for this week essentially summed up that people use social media for various reasons, including narcissism and news dissemination. Who we "friend" and "follow" tells a lot about us. Tonight, social media tells me that I didn't need to watch the Oscars (or the 500, had it not been postponed); that's what I have friends and followers for. And they're telling me Christopher Plummer is having a good night:


 






Sunday, February 12, 2012

Reconciling skepticism and optimism in a post-paper life

Dan Gillmor would be proud. Or maybe he'd just roll his eyes and say that I used common sense, which should be expected and not applauded. Whichever the case (I'll find out Monday when he comes to our class to chat), when I first heard that Whitney Houston had died, it was on Twitter -- and I doubted it. I first saw it in a tweet from a former newspaper coworker -- someone who's certainly no dummy, much more social- and digital-media adept than I, and a person from whom I'd definitely trust a report.







I still went to three other news sources and checked the originating AP tweet myself before adding my own two cents to the celebrity death 'verse.







I suppose I was Mediactive, a term Gillmor coins in a book-blog-website hybrid release that aims to push the public to be more involved and educated about the news it digests and creates. I read a tweet this morning claiming that Twitter broadcast the news of Houston's death 27 minutes before other media. Is that true? I didn't research it much, but from what I know of Twitter and its propensity to spread both real and fake news at lightning-fast speed, I at least give it credence.

This morning brought more news from another former newspaper coworker via Twitter:






As soon as I saw it, I knew what Michael meant and that he, too, was skeptical of this news. It was news I hadn't seen yet, but I quickly checked other news sites. No, Keanu Reeves is not dead, but he was for a while to a number of folks on Twitter. And when I checked the #KeanuReeves hashtag itself, it was already rife with the word "hoax." Whew.

This just showed me that in my little circle of former newspaper folks, we were skeptical of some news and reported other news quickly through a nearly instant medium. And when I think of my little circle of what I call journo-friends or N-J peeps (from our time at the Daytona Beach News-Journal), I realize most of us are officially out of the business these days (but that's a whole other story about the newspaper industry...). Reading "Mediactive" brings up all sorts of issues with that idea. Are we ever really "out" of the business? It doesn't look like it.

While I read this book, I could hear the jeers, sighs, and fist-pounding my newsroom colleagues would have directed at his ideas: "Online media reports can't be trusted and they certainly have no place in print!"; "Give away our content for free?!"; "Let the readers contribute!?" Crazy ideas. Ideas that I now have to consider as integral parts of digital media's future.


Admittedly, I'm still trained to bristle at the mention of "citizen journalism." But now I'm that citizen, and I must force myself to look at the media from this new vantage point: my couch, with all the other folks. I'm fairly certain that I'm slightly more intelligent than those "other folks," but isn't that the thought that got parts of the media into the quandary they're in now -- hubris? As a working-in-the-industry journalist, I had training that most people didn't, access that most people didn't, and power that most people didn't. I had the power to write a headline, to decide what story ran from a practically bottomless pool of possibilities, and, maybe more importantly, the power to decide what didn't run. Now my power lies in my ability to research and filter good news from bad. I'd like to think I can do that better than the average non-journalist Joe, but I bet I'm wrong.

Gillmor puts forth an optimistic view of the media landscape; he's confident that, together, we can bring out the best of our knowledge and journalistic instincts. I'm working on being an optimist. Ten years in a newsroom is a lot to work through.

Sunday, February 5, 2012

Lots of Information

We read "The Information," by James Gleick, for this week. The book's enormous size matches the sweep of its content; from drum beats to Wikipedia, it covers the rise and transformation of information through the ages.

Like most readers, I like a good story. I wish Gleick had sewn his ideas together a bit more and told a story instead of presenting us with history lesson after history lesson. Then again, I bet the latter was his intent. By writing and constructing it the way he did, he made his point: there's a lot to take in, a lot to understand, and it's a complicated journey from drum beat to byte.

An interesting question to ask after digesting this read is, "What are we going to do with all of this information?" I don't mean the book itself, though it is daunting and was even dubbed "aspirational" by a New York Times reviewer. I mean, with everything being in the cloud now, or at least on its way there, what's next? Sure, we can catalog all that is for future generations, but then what? Will they continue to do the same? Will Wikipedia never end, having every page on every gas station archived for eternity? I envision a time when there's no cloud because everything is the cloud -- TV, mail, all our personal records, banking transactions -- all that stuff. I suppose all that information will still be stored then, too, but in a way I can't even fathom.

But that's the point of "The Information" -- to show me that science and technology will answer my question for me. There will be a new theory, invention, or idea to come along and make me realize why all of this information is a good thing, and to show me what future generations can do with it.

I wish Gleick had talked about the economic impact of all this information (perhaps it will be in the sequel?). It costs a lot of money (for now) to store huge quantities of anything -- one reason cloud computing has become so popular. Granted, as technology progresses, so will its availability, but what does that mean for the moment? Or for countries that don't have the infrastructure to catalog their "right now" as tomorrow's history?

Maybe Wikipedia will catalog the important things for those who can't type them right now. When it comes to that anyone-can-edit online encyclopedia, I'm an inclusionist (though most of this post may make me sound like a deletionist). Let's write it all down and let technology catch up with us. Let's acknowledge we have too much information right now and see what the mathematicians, philosophers, and scientists of today and tomorrow can invent for the next big step.


Sunday, January 29, 2012

The power of being plugged in

The readings for this week, particularly Amir Hatem Ali's "The Power of Social Media in Developing Nations..." (Harvard Human Rights Journal); Zizi Papacharissi's "The Virtual Sphere..." (New Media & Society); and Everett M. Rogers' "The Digital Divide" (Convergence), raised an interesting debate both internally, in my own brain, and externally, as the husband (who will now be referred to as Larry throughout this blog) and I discussed points that I would blurt out while reading. What struck me was that the two of us were debating the idea that having access to technology (the Internet, computers, etc.) and its associated infrastructure could lift up an underdeveloped society.

I took two arguments from these readings: that infrastructure is critical to the success and prosperity of underdeveloped nations, and the contrary view, which Ali introduces by way of famous technology experts Bill Gates and Steve Jobs to offer the other side of the coin. The academics say, essentially, "the power of information will save the world," while the technologists -- the ones who could conceivably make money off this idea if they were to don their capitalist hats for the venture -- say, "No, this is not what these people need at the moment."

What I find interesting from all this is the fact that on both sides of the argument sit fairly privileged people determining what is best, or what should be a first step, for these other societies.

Maria Sourbati's "Media Literacy and Universal Access in Europe" (Information Society) comes closest to my belief that we can't just plug everyone in and say "go." In most of the other articles, the overarching idea was that lower socioeconomic status was connected to lower digital media skills and could lead, within certain societies, to a disenfranchisement of those lower socioeconomic classes. This is somewhat of a "given" for me, and it is not the same issue. Of course within an already plugged-in society we will have the plugged-in and the not-plugged-in. And of course we can't just drop off some technology and expect the problem to be solved; the skills tests in Alexander van Deursen and Jan van Dijk's "Internet skills and the digital divide" (New Media & Society) taught us that. In these cases, it is obvious to me that trying to level the playing field within that society by providing infrastructure and education is a completely different issue from deciding whether or not to plug in a completely unplugged society because of reason X.

To quote from Zizi Papacharissi's "The Virtual Sphere..." (New Media & Society):
"Those who would benefit the most from the democratizing potential of new technology do not have access to it."
The problem I have here is the assumption of benefit. Of course, from my perspective, political situations and living conditions are not acceptable in many areas of the world. But it is not up to me to decide what is best for anyone else; rather, it is for the residents themselves. Yes, this is a naive view; of course there are myriad situations that prevent people from deciding things or making changes for themselves, but let us not storm in waving our flags of "information for all." Stating that a certain nation or society needs more access to technology and information so it can stop "X or Y" is the wrong premise; let those who have access simply help disseminate infrastructure and training, and let that society determine how to use it.

Sourbati comes closest to saying this in her conclusion: "Provision of assistance by 'proxy' users, who can mediate access to media services for those individuals or groups who cannot cultivate media literacy capacities under their current life circumstances, is likely to be an important dimension of such an approach."