Inventing New Media

This week’s readings for my New Media Seminar concern how we understand time (according to McCloud’s Understanding Comics, which I am proud to say I read way back in 1993, when it came out in graphic novel form).

Those who know me well know that I have a very complicated calendaring system using Google Calendar across all of my screens—laptop, iPad, and iPhone. I also have a paper calendar, but pretty much the only time I look at it anymore is when I’m in the kitchen writing out a check for my babysitter. Anyway, the newest iOS has finally matched how I have always thought about calendar time in my mind. For as long as I can remember, I have visualized time, and myself in time, as existing on a limitless continuum of months—just as they are represented in Calendar on my iPhone 5s.

Of course, there are some differences: I visualize frames around each date (similar to how McCloud describes the frames around comic panels), and I have always visualized the months stretching out in an unbroken horizontal line, rather than vertically, as Calendar has them; but these differences are relatively unimportant. At any rate, I always have a sense of the months that have gone before as how much of the present year has been used up, and of the months to come as relevant chunks of time. I’ve also always worked on an academic calendar, so I think in chunks of time such as quarters, semesters, summers, and winter breaks. When I have the option, as on my laptop or iPad, or by turning my phone horizontally, I choose to view my particular place in time in “weekly” view.

I fear this is getting boring, so let me amuse you a bit more (or perhaps not) with a glance at a typical week in my calendar:

[Screenshot: a typical week in my calendar, iPad view]
Gosh, I look busy, don’t I? And (to my eternal chagrin) the iPad picture doesn’t show the fancy color-coding I use when accessing Google Calendar via a web browser. I use color-coded chunks of time so that I can tell at a glance what proportion of my week is dedicated to research (Book, BPS, CP Paper, meetings with various RAs, etc.), service (attending a thesis defense), and personal matters (school events, rallies attended, etc.).
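Concretely, the color-coding is doing a bit of arithmetic for me at a glance: sum the hours in each color, divide by the week’s scheduled total. Here is a minimal sketch of that tally in Python; the events and hours are invented for illustration, not pulled from my actual calendar:

```python
from collections import defaultdict

# Hypothetical week of events: (event, category, hours). Invented for
# illustration; not my real calendar data.
events = [
    ("Book writing", "research", 6.0),
    ("Meeting with RA", "research", 1.5),
    ("Thesis defense", "service", 2.0),
    ("School event", "personal", 1.0),
    ("Pick up Appa from groomer", "personal", 0.25),
]

# Tally hours per color-coded category.
hours_by_category = defaultdict(float)
for _event, category, hours in events:
    hours_by_category[category] += hours

# Report each category's share of the scheduled week.
total = sum(hours_by_category.values())
for category, hours in sorted(hours_by_category.items()):
    print(f"{category}: {hours:g} h ({hours / total:.0%} of scheduled time)")
```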

I’d be lying if I said I didn’t check my calendar at least once per hour. But it keeps me on track and helps me get things done, and since I set my to-dos in the “all day” portion (so they appear at the top), I’m also able to remember things like calling my grandma on her 94th birthday (Hi, Grandma!).

So what does it mean? Has this digital format changed my conception of time? Probably. I think of time in chunks even more than I used to—down to the 15-minute chunk I allotted on Wednesday for picking my dog Appa up from the groomer. I can’t say that it’s better or worse, but it works for me. And I can even fit in a pickling workshop.


Illich and the Deschooled Society

As a sociologist of education, I often assign students to read Ivan Illich’s “Deschooling Society,” and it inevitably starts the same discussion: how would a large and complex society such as the United States function without formal schooling? Of course, my university students—beneficiaries of and true believers in schooling structures, for the most part—may put it a little more bluntly: Is this guy crazy or what?

Illich makes some excellent points about the darker effects of schooling. Schooling is socialization; schooling routinizes education and teacher-child relationships; schooling often stifles children’s natural creativity; schooling becomes a stylized system that shapes children into students. When these conversations happen with my students, my job is to lead them away from a very understandable rejection of Illich’s hyperbole, and to guide them into thinking about what we can learn from Illich, and how we can apply it to the task of—if not demolishing the educational system—making schools more welcoming for children, particularly those whom society is more likely to deem “in need” of socialization. In different schools and different places, these groups may include racial/ethnic minorities, children from low-income families, immigrant children, or any other groups that have historically been denied full entry into the “American dream.”

So what have I learned from Illich? Is there any school that Illich would deem worthy of keeping? From his description of what learning should entail, it seems like Montessori might fit the bill, though this particular reading does not seem to admit that any formalized structure would do. As a mother, would I pull my children out of school and “deschool” them? Much has been made of the tiny minority of parents who choose to “unschool” their children, but I would not choose this for myself—I have a job! Responsibilities! Bills to pay! In most cases, unschooling/deschooling remains an option for the privileged, as it all but requires a stay-at-home parent. But even if we cannot jump into Illich’s deschooled utopia with both feet, there are still many opportunities to see Illich’s vision gaining a foothold in some areas of life.

To take one example that is both a “reference service to educational objects” and a “skills exchange,” I put forward the Oakland, CA library system’s Tool Lending Library. With a 5-star average across 28 reviews on Yelp, the Temescal Tool Lending Library (located in a rapidly gentrifying neighborhood of Oakland) is described as “everything you imagine it to be!” Says one rapturous user, “The staff is friendly and helpful – need them to hold something? Call! Not sure what tool for the job? Ask! The TLL is an amazing resource for when you don’t really want to hire a professional for lack of tools or buy tools yourself. They have high quality tools and are very helpful.” Raves another, “I’ve gotten everything from wrenches and paint scrapers to heat guns and sawzalls to a concrete breaker and chainsaw (chainsaw is a Friday-only thing–note).”

[Photo: the Temescal Tool Lending Library]
Chainsaws on Fridays only, people!

Of course, the new “sharing economy” boasts dozens of sites like Skillsbox as well: an online community where, with a few clicks, you can trade your talents for credits with which to “buy” other kinds of knowledge and skills. Recent swaps include basketball lessons, bricklaying, tutoring, and web design.
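The underlying mechanic is simple enough to sketch. Below is a toy skills-exchange ledger; the class, members, skills, and credit values are entirely hypothetical (this is not Skillsbox’s actual system):

```python
# Toy skills-exchange ledger: teaching earns credits; credits "buy" lessons.
# All members, skills, and credit values here are hypothetical.
class SkillExchange:
    def __init__(self):
        self.credits = {}  # member -> credit balance

    def join(self, member, starting_credits=1):
        self.credits[member] = starting_credits

    def swap(self, teacher, learner, skill, cost=1):
        """Transfer credits from learner to teacher for one lesson."""
        if self.credits.get(learner, 0) < cost:
            raise ValueError(f"{learner} lacks credits for {skill}")
        self.credits[learner] -= cost
        self.credits[teacher] += cost

exchange = SkillExchange()
exchange.join("Alice")
exchange.join("Ben")
exchange.swap(teacher="Alice", learner="Ben", skill="bricklaying")
exchange.swap(teacher="Ben", learner="Alice", skill="web design")
print(exchange.credits)  # balances end where they started: knowledge moved, not money
```

The Illichian point is that the ledger merely mediates the exchange; the currency is knowledge and time rather than dollars.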

I myself was an active swapper on the old Swaptree site, swapping videogames and CDs. We estimated we saved well over $200 swapping items we no longer wanted or needed. But Swaptree became Swap.com (to swap clothing and the like) and then became the NEW Swap.com, an “online consignment” site where I can “make sales” and “get payouts,” much like eBay with a little swapping on the side. Innovations that “democratize” services like taxis (Lyft, Uber) and room renting (Airbnb) often come under fire for capitalizing on the sharing economy. The gloomy conclusion is that the almighty dollar still creeps in where it’s least wanted, leaving those previously marginalized by the powers that be in little better position than where they started, as others have documented.

So what’s the upshot? Does it always have to be so gloomy? I argue that the answer is no. There is plenty we can take from Illich, even if we aren’t ready to give schools the old heave-ho. The explosion of new media and the World Wide Web leaves open endless possibilities, and though we in the higher education community are certainly suffering MOOC fatigue, my final, hopeful example, Khan Academy, promises to teach you a little of anything “For free. For everyone. Forever.” For someone like me who missed out on high school physics, that’s a pretty good deal.


Video: a basic primer on Newton’s First Law of Motion

https://www.khanacademy.org/embed_video?v=5-ZFOhHQS68

P.S. I had to choose either physics or AP Biology because of scheduling conflicts. Illich would laugh, or maybe cry.


The Whole Action: Videogames vs. Movies

In case this is shocking to anyone, I’ll put this up front: my spouse and I are gamers. Granted, our enthusiasm has waned over the years, especially since having children. We also tend to enjoy different types of videogames now than in the past. While I might once have delighted in solving complex and sometimes nonsensical puzzles, my patience for them is greatly reduced now that I have maybe 90 minutes of free time between the kids’ bedtime and my own.

[Screenshot: Silent Hill 2]
I have to toss this canned juice down a garbage chute to dislodge the Old Man Coin in Silent Hill 2? Kay…

So, I’m more likely to enjoy, say, The Legend of Zelda: The Wind Waker (a game with an immersive, cinematic world and multiple discrete tasks) than The Legend of Zelda: A Link to the Past (complex dungeons, difficult-to-beat bosses, and no saving half-completed dungeons—if you quit, you have to start the dungeon all over).

[Screenshot] A Link to the Past: Get the chickens!!!
[Screenshot] The Wind Waker: Immersive world

Which brings me to my next point: rather than playing any game at all, Eric and I are MOST likely to just watch a movie or TV show, and the reason is what Aristotle (and Laurel, in my reading for this week) defined as the magnitude of the “whole action,” or plot. If one is forced by circumstance to devote only 90 minutes per day to a game that might take more than 10 hours to complete (20 hours and up for the kind of games Eric used to favor, role-playing games or RPGs), and there are constant interruptions like parent-teacher night or gymnastics lessons, then the narrative thread of a videogame is easily lost. Many times I’ve started up and thought, Now what was I doing again? Where am I supposed to go next? What is happening here? This doesn’t happen to me with movies or most TV shows (though Boyhood, at nearly 3 hours, did start to make me ask those kinds of questions; The Wire, with its complex, multidimensional plot, is an exception to the TV rule).

So for me, the answer to this week’s query is that the magnitude of the whole action is most critical, as far as enjoyment of a human-computer (or human-media) interaction is concerned. That, and whether the controls are simple enough to let me eat popcorn whilst slaying monsters.


The Message is Stronger than the Medium

Reading Marshall McLuhan this week has caused me to reflect on the media forms that have had the biggest impact on our world. Though it’s hard to beat the printing press for sheer revolutionary impact, the first thing that popped into my mind was social media. Of course, the distractive properties of Twitter and Facebook are well known, and each probably eats up way too many hours in my day. But each has also revolutionized the way we know and interact with friends. It has been observed that today’s “digital natives” will not have the experience I did of “rediscovering” long-lost friends on Facebook—they will grow up never having lost contact with their childhood friends in the first place. Their entire lives, they will know what their friends are doing in real time (if they follow them on Instagram, Snapchat, Twitter, or whatever) from the moment they meet and exchange screen names.


The other difference I note in myself, based on about five years of daily Facebook use and one year of intermittent Twitter participation, is how these media have changed me. Each has decreased my tolerance for media that take more than a minute or two to engage with. If a friend links to a New York Times opinion piece, that’s probably the gold standard for getting me to actually read and comment on something: I’ll feel like I’ve learned something, and it takes little time to read. But if a friend tweets a link to a three-minute-long video? I’m not watching that thing; that’s way too long! I have things to do; do you know how long my feed is? I’ll confess this too: the reason my use of Twitter is so intermittent is that I get frustrated with my inability to keep up. I can’t seem to shake my completist mentality, formed in my early months with Facebook, when it was possible to “finish” catching up on my feed within a few minutes’ time.

Now I’m rambling, so I’ll quit, but you get the gist. Twitter and even Facebook have well-known abilities to increase engagement, make us feel connected, and reduce the social distance between us and our idols, whether they be star academics (in my case) or pop stars. Twitter has been credited with assisting real-time organization amid revolution, such as during the Arab Spring and the Ferguson protests. In the more mundane day-to-day, however, social media also take up increasing amounts of time, while leaving us feeling like we’ve got little to show for it. I can finish a book, a newspaper, or a TV show, but my Twitter feed is a perpetual hamster wheel of news creation and opportunities for engagement. The potential for liberation is there, but on any given normal day, it can feel like a trap.


Computer/Lib

This week I stumbled through Ted Nelson’s 1970s imaginary “Computer/Lib,” and was struck by the ongoing tension between ease of use and complexification. And as I pondered, I considered: is the Apple Watch, or iWatch, a contender in each category? As a watch, it’s needlessly complex. It’s also a needless machine (don’t we all have phones to attend to?), but possibly a needful machine (yet one more thing to interact with). No one needs one, but neither did we need iPads until the need was created.

[Photo: Apple Watch]

Isn’t it pretty, though?


Augmenting the Human

I just read some excerpts of Engelbart’s 1962 report, “Augmenting Human Intellect: A Conceptual Framework.” While the piece was interesting, I also found it kind of a hard slog. Maybe it’s because I was distracted by trying to read while hanging out with my friend’s new baby, but I think part of it is that it’s just difficult for me to imagine this framework outside of the context of the present world. Engelbart describes concepts that became today’s cut-and-paste, today’s computer mouse, today’s networks, and today’s hyperlinks, but I find it difficult to parse descriptions of these notions in a pure theoretical form, without imagining their current digital or physical form.

I’m not a true digital native; in immigration/migration parlance you might call me a member of the 1.5 generation. My mom bought a Commodore 64 when I was about 8 years old, joined a Commodore users group, and taught herself how to code. I played computer games from a young age, and in the seventh grade I wrote a report using a word processor on the Commodore (which by then ran an early version of Windows-like software that I navigated with a joystick). I never composed any piece of writing by hand after that, apart from a brief try at poetry in high school. From the age of 12, then, I’ve relied on the kinds of writing practices Engelbart dreamed of:

I found rather quickly that the job of extracting, rearranging, editing, and copying new statements into the cards which were to represent the current set of product statements in each grouping was rather tedious. This brought me to appreciate the value of some sort of copying device with which I could transfer specified strings of words from one card to another, thus composing new statements from fragments of existing ones. This type of device should not be too hard to develop and produce for a price that a professional man could justify paying, and it would certainly facilitate some valuable symbol-structuring processes.

Although I maintain an affection for the printed word—I dislike reading books on electronic devices, for example—it boggles my mind to think of writing a journal article, let alone a book or a dissertation, without the benefit of cut-and-paste. This ability to piece thoughts together and rearrange them on the fly is integral to my writing process. So is my intelligence augmented? I guess Engelbart would say yes.
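Just for fun, here is Engelbart’s “copying device” taken literally: a minimal sketch that transfers specified strings of words from one “card” to another, composing a new statement from fragments of existing ones. The cards and the helper function are my own invention:

```python
# "Cards" holding existing statements (contents invented for illustration).
card_a = "the months that have gone before show how much of the year is used up"
card_b = "time exists on a limitless continuum of months"

def transfer(card, start_word, end_word):
    """Copy the span of words from start_word through end_word."""
    words = card.split()
    i, j = words.index(start_word), words.index(end_word)
    return " ".join(words[i : j + 1])

# Cut two fragments and paste them into a new statement.
new_statement = (
    transfer(card_b, "time", "continuum") + " of " +
    transfer(card_a, "months", "before")
)
print(new_statement)
# -> time exists on a limitless continuum of months that have gone before
```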


Thinking Machines, Creative Machines

This week I read Turing’s “Computing Machinery and Intelligence,” featuring Turing’s arguments against resistance to the idea of thinking machines, as well as his predictions for what digital computers of the future would be able to do.

These days, many humans still raise the same objections Turing argued against. We find thinking machines threatening, as in the anguish in some quarters when Deep Blue beat Kasparov at chess in 1997 (though it has been argued that the winning play resulted from a bug in the software). After the supercomputer nicknamed Watson beat out two human contestants to win at Jeopardy!, Conan O’Brien jokingly “hired” Watson to replace Andy Richter as his sidekick and announcer, much to Richter’s horror.

Determining whether a machine is truly “thinking” also invites us to explore the contours of humanity. Objections Turing dismisses, such as that machines cannot “enjoy strawberries and cream,” “have a sense of humor,” or “make someone fall in love with it,” made me immediately think of the thousands of humans out there, with or without disabilities, who are unable to meet these requirements. Humans on the autism spectrum, for example, may demonstrate limited capacity to “use words properly,” and many of us—those of us writing grant proposals or academic research articles, for example—ponder whether we can actually “do something really new.” But does this mean that some humans are less than human, in the same way a machine might be? Or that some machines are more “fully” human than some humans?

Turing’s extended section on the “informality of behavior” argument in particular reminded me of an episode of This American Life that reported on the experiences of one David Finch, diagnosed with Asperger’s Syndrome as an adult, and his struggles to relate to his wife appropriately. To cope with the difficult-for-him task of exhibiting empathic behavior, he created an elaborate “Journal of Best Practices” (now published as a memoir) that constituted, essentially, a “definite set of rules of conduct by which he regulated his life,” making him, in the argument Turing entertains (and rejects), “no better than a machine” in the eyes of some. The blurb on Finch’s book describes his process this way:

His methods for improving his marriage involve excessive note-taking, performance reviews, and most of all, the Journal of Best Practices: a collection of hundreds of maxims and hard-won epiphanies, including “Don’t change the radio station when she’s singing along” and “Apologies do not count when you shout them.” Over the course of two years, David transforms himself from the world’s most trying husband to the husband who tries the hardest. He becomes the husband he’d always meant to be.

In becoming the “husband he’d always meant to be,” he transformed himself, essentially, into a learning machine. Training himself to be empathic required a ruthlessly logical and machine-like process, with the end result that Finch became more human. Finch explained the revelation of his diagnosis this way to Ira Glass on This American Life:

I mean, it was as if somebody finally handed me a user manual for myself. You know, here’s how you operate, and if you read this manual, everything that was difficult in life before is going to be a lot easier now, because it makes sense and you can learn how to control certain things.
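To make Turing’s “rules of conduct” argument concrete, a journal like Finch’s can be imagined as a lookup table of situations and prescribed responses. In this toy sketch, the two maxims come from the blurb quoted above, while the situation keys and the fallback rule are my own invention:

```python
# A toy "Journal of Best Practices" as a "definite set of rules of conduct."
# The two maxims are quoted from the book blurb; everything else is invented.
best_practices = {
    "she is singing along to the radio":
        "Don't change the radio station when she's singing along.",
    "you shouted an apology":
        "Apologies do not count when you shout them.",
}

def regulate(situation):
    """Consult the journal before acting; an unanticipated situation
    exposes the limits of any finite rule set (roughly Turing's point)."""
    return best_practices.get(situation, "No rule yet: observe, reflect, add a maxim.")

print(regulate("she is singing along to the radio"))
print(regulate("a situation the journal never anticipated"))
```

What interests me is the second call: a human improvises when the rule book runs out, and by the blurb’s account Finch’s journal grew exactly that way, by turning misfires into new maxims.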

So where does the line between human and machine truly lie? I think we still don’t know.
