Get Your Head Out Of Your Asana & Look Around: An Introduction

“Asana is a simple way to stay on top of your classes, assignments, and student club activities. Before you get started, it’s important to understand how to set up Asana.” – Asana.

The Digital Communitas project, Student Voices, is actively soliciting blog posts, vlogs, and mp3 audio from post-secondary students at Canada’s universities to share their experiences with digital media in the classroom. Sara Humphreys, our fearless leader, has challenged both my colleague Shannon Haslett and me to come up with a blog post that meets the guidelines of the call for posts. I have come up with the idea for a series of posts under the title “Get Your Head Out Of Your Asana.”

One of the tools we use to communicate with each other and control workflow for Student Voices is an app called Asana. Asana is project management software intended to replace the need for email by making work more “social.” In learning to use Asana, I visited its “Getting Started” page and immediately noticed the language used to sell the product, especially its big-headed and “new-agey” marketing rhetoric. Looking at Asana critically, I thought it would be fun to pick on it in a series of posts that ask to what extent digital technology assists our education and to what extent it distracts us from learning. How much attention is the twenty-first-century student paying to Wikipedia and YouTube in the classroom as opposed to well-trained instructors? Or engaging with projected PowerPoint presentations instead of with their peers in class discussion? Or looking down at their various digital devices instead of observing the world around them? While platforms such as Asana or Blackboard provide excellent tools for managing and synthesizing information, to what extent are they becoming a priority for institutional investment at the expense of learning resources, whether digital or physical? Is ease of access and coded interactivity truly more important than the information we engage with? Could Marshall McLuhan have been right? Has the medium, or the tool, really become the message?

Cool app or Kool-Aid?

Asana was founded by Facebook co-founder Dustin Moskovitz and ex-Facebook engineer Justin Rosenstein in 2008. Their goal is to revolutionize the way we communicate with each other by eliminating the need to communicate needlessly via email and other, presumably face-to-face, means (no more virtual water cooler chit-chat), so more actual work can be produced. It professes to replace perceived idleness with efficiency, yet it is a bit oxymoronic: replacing traditional digital communications with “social” communications so you can communicate less.

Ironically, Asana also refers to a yoga position whose name literally translates to the art or “mastery of sitting still.” There is no real sitting still embedded in the Asana software, despite its name, unless they are referring to the ideas that are stagnating. It claims “[y]ou’ll spend less time reading and writing emails and more time getting work done.” Less time reading and writing… hmm. In actuality, it replaces not idleness with efficiency but creativity with efficiency, for out of boredom and dialogue, ideas are born. In the humanities, the work being done is about people and their interaction with (fill in your specialization here).

So, is Asana a good tool for education? What Asana proposes is digital Fordism or Taylorism, where we work with parts while never being able to conceive of the whole. And isn’t trying to understand the big picture what a good education is really about?

In their own words, “Asana is the single source of information for your team. Add projects, tasks, and comments as you go, and you’ll instantly build a team archive that’s easily accessible whenever you need context or information.” Unfortunately, Asana is about Asana. Its mission is to keep our heads in our Asana and out of other digital and physical spaces. However, the truth behind the rhetoric is that Asana can only provide limited information and context, which is obvious to anyone who takes the time to log out and look around.
Call for Posts!

We are starting the next leg of our project, where we will give undergrads more of a voice on this site (and, hopefully, in how their classrooms and courses are run). Please share this poster with your humanities undergrads, share it with colleagues, or submit a post! We want to know how undergrads are using digital tech in the classroom and what they think of it. Are smartboards useful? What about smartphones? Tablets? Laptops? The deadline is August 15th, and this post can be listed as a publication on a resume!

Deadline is August 15th – send to studentvoices@trentu.ca

Blackboard Jungle -BW

What Using Google Docs in the Classroom Tells Me: We Need to Change Everything

I have been experimenting with Google Docs – I realize loads of teachers use this resource, and I am definitely not writing a how-to guide here. I am not even extolling the virtues of Google Docs. Yes, it allows for online collaboration, and it’s relatively easy to use for those who have a fear of “digital anything.” What I am writing about here is the relief that students clearly felt at being able to collaborate without a central figure telling them how to learn.

Students want to control their learning path – they need guidance, but not a gatekeeper. While Don Tapscott and I don’t always see eye to eye, I am on board with the concept of actual student-centred learning, which is very hard to achieve. The very structure of the university is not set up for student-centred learning. The current model of the classroom still privileges a factory, industrial model. The desks are aligned in exact rows; students sit as units of production, and the factory supervisor, or professor, supplies knowledge to the units. To prove my point further, it might make you more than a little queasy to know that students are regularly referred to as Basic Income Units (BIUs). Each student reported to the Ministry of Training, Colleges, and Universities (MTCU) in Ontario (and I am sure this is true at ministries of education nationwide) is worth a certain amount of money. Clearly, post-secondary education requires funding, but from the way the MTCU configures its relationship to students to the actual architecture of the classroom, students are to be processed rather than actively engaged with. Ken Robinson, in his talks for the Royal Society and TED, has been telling us about this problem for years, but what have we done about it? What I have discovered in using cloud computing tools (like Google Docs) in an industrial classroom is that these tools show us just how outmoded the university system really is. The good news is that these tools can transcend the industrial classroom.

I have been reading up about how others use Google Docs, and there are ideas bandied about like “stump the instructor” – in this particular assignment, students ask the teacher questions that they think might stump the teacher. This is an exercise in positioning the teacher as gatekeeper – the “holder-of-the-knowledge.” In other words, the digital tool is put into service of a certain type of learning. I mean, students could just write a question on a piece of paper and hand it to the teacher, so what’s the use of the digital tool, in this case? When Google Docs is used for actual collaboration, something incredible happens: students take control of the classroom and they love it. Unlike having students give a presentation where they mimic the professor (in essence), Google Docs allows them to take on an active role in constructing the class.

It can seem chaotic at first: as the students log in under their Google IDs (or, if they do not have a Google Plus account – and most don’t – they are given animal identities like “anonymous liger”), they all seem to roam the document randomly as a bunch of cursors…

All Together Now

but then something wonderful happens – they start to work together on one (digital) page that is a hybrid of the word processing applications they use AND the social networking sites they interact on. Guess what? They collaborated brilliantly and productively – so much so that I could not keep up. Next time, I will use “time outs,” where we stop and look at what everyone has done and then go back to the document. I should add that the Doc was up on the big screen, so everyone could follow along; the two students (out of thirty) who were unable to connect could still see the document and share with classmates. The screen actually created a hub for us as we collaborated.

Students were answering the questions I asked on the Doc enthusiastically and passionately – and they also answered each other’s questions on the Doc! The conversation on this digital page was electric, and when I asked students why they were so enthusiastic about this particular tool, they said that they felt at ease communicating in this format. They felt empowered by being able to add their thoughts without feeling pressure to answer me directly (and, therefore, be judged). Instead of being “the boss” they must answer to, I was just another roaming cursor adding to the document. Interesting – no? I think the impulse of some teachers will be to condemn “this generation,” who need to learn social skills and so forth. I have not seen a decline in social skills (there are as many rude middle-aged folks as young folks, I’ll wager). No, the point here is that the need to communicate collaboratively is as strong as ever, but the venue has changed, and post-secondary pedagogy is way behind. Digital natives communicate through online collaboration (e.g. social platforms), and teachers need to get on board or lose student interest and vitality. You can dig in your heels and wish for a bygone era, or join in and have a blast.

Here is a snippet of what we produced in real time (note – students used bolding, colour, and italics, but all of that was erased in the cutting and pasting of that document into this one):

Google Docs Transcript

Question One (I know, this is really a series of questions, but they are linked!)

In both Alice in Wonderland and The Wizard of Oz, the lead characters leave reality, enter fantasy, and, in the end, return to reality. Are Alice and Dorothy changed by their experience? Does the fantasy world challenge the norms of reality? Does the fantasy world support certain norms? Does the return from the fantasy world negate the social criticism found in the fantasy world? For that matter, what social criticism did you detect in either story?

So?  Any answers?

Alice and Dorothy return from the fantasy world with new perspectives of their own world/reality.  (SM, MB, GM)

In Dorothy’s world, the fantasy does challenge the norm of reality when she first sees the colourful and abundant flowers, which she wishes were in Kansas. (GM)  The colours and “aliveness” of Oz really influence Dorothy’s worldview in that she is so taken aback by the brightness

I don’t think Alice and Dorothy are changed by the experiences but they bring the changes with them when they return. (Ed)

That’s an interesting point. What did Alice “bring back” with her?

Answer to purple text: Alice brings back a different view for her sister and a brief lapse back into her childhood, (as read to us by Sara). (Ed)

Alice brings back the value of imagination and wonder which children were taught to grow out of in order to have a successful adulthood. Alice introduces the idea that fantasy cannot/should not be defined by age. This is a positive spin on the story, which is important – most often, it is seen as defining girls as“civilizers.”

The fantasy world definitely changes the norms of reality but only in the Western sense of norms. (Ed) That’s true

The criticism of reality dissipates at the end of the story, as Alice and Dorothy discard their adventures at the end of the story, but the reader may change his/her perception of reality based on the fantasy world. (Ed)

Good point-why would dorothy want to return to such a grey place (kansas) when she could remain in the colourful world of Oz?

Dorothy wants to return to Kansas in order to be with her family; her auntie and uncle. Maybe to her the beauty and fantasy of the Oz is worthless in the light of her loved ones. (GM) Interesting point – and Em certainly changes

In terms of social criticism-in The Wonderful Wizard of Oz, Dorothy remarks that her land is civilized-there are no witches or wizards there…perhaps alluding to the isolation that different people face? The witch then explains that while there are bad witches, there are also good witches maybe challenging people not to judge those who are strange to society? (BK) I think this is a really good point. There is a lot of diversity in Oz and Dorothy also frees a lot of oppressed peoples

Is there any significance in Alice growing and shrinking? Also, is there any significance with the rabbits white glove that he drops, Alice picks it up, and then puts it on? (BK)    Yes.

I believe that the size changes represent the power of adults in society. Adults hold the power in society and when Alice grows in size she is able to exert more power over the other characters.    YES!

the size changes can be seen as Alices perspective changing … in the end alice turns into a giant and she is able to speak up against the nonsensical trial. YES!
the size change can represent the changes the body goes through in adolescence?

Perhaps- as much as a female wants to grow(for lack of a better word) they are constantly brought back down? (BK) Any advancements that were made for women in society were shortly lived before they were brought back down to size?

Children were previously viewed as passive in their own intellectual and moral development.  Alice has to learn how to take an active role in her own development (whether she’s making herself smaller or bigger).  Perhaps this is the author’s way of endorsing the agency of children.

Since Alice always has problems when she grows bigger, maybe since children are smaller they are able to do and see things that adults cannot? (GM)

The growing and shrinking could represent her lack of experience, or in this case her feeling inferior to something she doesn’t understand, and she grows as she begins to understand and get some measure of control over a new experience.  For instance, she shrinks when she realizes she can’t get the key but stops when she calms herself down enough to think.  In regards to the Rabbits house she is completely thrown by the change of (attitudes? behaviours?) in this strange world but begins to grow when she believes she can exert a manner of control or understanding. (Ed)

(MH, MM)Their experiences do change them in the sense that the characters are no longer jaded, they have a newfound understanding of the world around them in the sense of understanding people and how not all people are who they appear. It does challenge the norms of reality in the sense that it makes a kind of makes a mockery of real culture, like when Alice is falling through the rabbit hole and consumerism follows her as she looks at the different items surrounding her. The fantasy world supports the norm of childhood curiosity and actually encourages it, by way of providing many things intended to encourage the child to question, study and judge the world around them (example being the curiosity of Dorothy in whilst in Oz of her surroundings and the strange people she meets, or Alice and the vivid and somewhat terrifying world that surrounds her).

I think Alice comes back more mature than before her journey because she’s experienced life on her own. But how do we know that? We only have her sister’s view of things…but an interesting idea. Because it’s HER story, does she have power?

Alice brings back to the world a willingness to fantasize that adapts to fit the world it is now a part of, which she passes on to others in the world, such as her sister.  It’s not the full, intense fantasy of Wonderland, as it does not seem to directly alter the physical world, but it is a mental fantasy that transforms the world through perception–a fantasy about Alice’s still somewhat mystical future and a romanticization of the “real world”. (Carly)

In Alice in Wonderland the criticisms are rampant.  At the end of her adventures there’s a massive ‘dis’ of Western courts.  The croquet game is a parody of how lucid rules and regulations are.  The tea party was discussed in class.  With the baby the duchess was caring for the critique was how much value we put on children, in this case a baby is represented as a pig. (Ed)

The Digital Classroom: Catering to Idealism

One more theorist-inspired post from me, once more applying ideas to technological learning environments. This time I will examine the issue from a perspective inspired by Jean Baudrillard. Baudrillard was a French theorist writing in the latter half of the 20th century and into the 21st. He covered a broad range of ideas, but some of his core themes were simulation, consumerism, media, technology, and signs/symbolism. His bibliography is quite lengthy, and since I can only include so much, this post will draw on ideas found in The System of Objects, Simulacra and Simulation, and The Gulf War Did Not Take Place.

Again, I must note in advance that this post will seem very unsupportive of technology in the classroom, but I want to stress that this is not my position; rather, the post should provoke thought about the origins of said technology and the political/economic/ideological motives behind equipping classrooms with it. By understanding these underlying themes, we can have a fuller understanding of the true meanings and ramifications of technology in the classroom. It may be a harsh reality to face, and some may rail against such claims, but technology in the classroom does in fact have many downsides, most of which are the fault of said ideological influences.

As one of the arguments for supporting digitized classrooms, technology is slated to combat boredom, cater to new ways of learning, and create new methods of teaching. However, I want to point out the most obvious fact: teaching with technology is still teaching. It is not replacing teaching or teachers, and thus will eventually become subject to the same miseries, or even worse distractions, as traditional methods. Digital interfaces in a classroom hold the attention of younger generations of students because they are a novelty. A techno-generation of students will view digitally enhanced classrooms much the same way rebellious children today view traditional methods. It is a circular process. The idea of learning, and our attitudes toward it, is what needs to change before the full effects of technological enhancements in the classroom can be felt. We are deluding ourselves into believing that technology will save the classroom, or that the fundamentals will somehow be different and more effective. Not that this is naïveté; on the contrary, that is the very goal of technology, which allows it to flourish and invade our lives. But we must be suspicious.

In the mad rush to equip schools with technology, the reduction of technology to a mere sign is evident. I do not mean to deprive digital interfaces of their ability to fundamentally change the classroom, but the symbolism of a digital classroom is stronger, and arguably more important. This movement is a modernization, an attempt to pull one of the oldest social environments into the 21st century. Digital interfaces serve as a sign of modernity, of a technologically savvy and triumphant society. Traditional methodologies are seen as neither rational nor 100 percent efficient; there is always a piece of technological equipment that can be created to achieve this, though always within limits imposed by the dominant social structure. In short, technologically advanced classrooms serve to further legitimize our technocratic society. It is more the idea of an advanced classroom that appeals to many than its practical application. The actual implementation is nothing more or less than a worthy goal of society, though successful integration would be a “token” of power.

Referring back to my first post about surveillance and data collection, we come to a concept Baudrillard detailed in Simulacra and Simulation. He believed that we bring the sacred objects in life into a scientific order in an effort to control them. He gave the example of museums effectively “killing” Egyptian mummified corpses; we write our own ethnographies. The world has become a museum; everything is an exhibit. Though this work was written in 1981, the idea has been amplified to an infinite degree by the internet today. Social media is a prime example: we build our own exhibits of ourselves and prostitute our images. Our data is collected, and advertisers have their own museums about us, the consumers, and cater back to us so we can continue to build on our own exhibits. This works in a similar way in the classroom, if the technology is set up for it. Statistics on how well we perform in certain areas lead to an exhibit of our academic profile and abilities. We have seen how advertisers have invaded digital academic tools such as Blackboard; theoretically, they could advertise to students the particular devices or programs that best suit their academic abilities or deficiencies. Additionally, schools or programs that require applications can build a deeper, rationalized, synthesized evaluation of a student, or even a host of students, through simple data collection and analysis. An exhibit of a student or classroom can be quite revealing. Nothing is private, and everything is oriented toward efficiency and economic benefit. Schools have a vested interest in having top-performing students, advertisers make money from the data, and other companies make money from people purchasing the programs. Although this idea of an exhibit of the student may seem like a logical step, do exhibits and statistics always demonstrate the potential or true abilities of a student? As Baudrillard suggests, there is more and more information, but less and less meaning.

This idea of the exhibit of the student leads into the idea that people believe technology adapts to them. Any little learning deficiency, interest, etc., is met with what seems like individualization, personalization, a focus on them. This is wrong: everything is coded, predicted, anticipated, and such “adaptations” are designed to dupe you into believing technology is working for you. Any “adaptation” is an orientation toward a paying consumer, or the creation of adaptive ideas for profit in patents. Problem with math? Try this new teaching program! I see you like science; try this new data collection program! Perhaps if a new development were made requiring non-existent technologies, it could be an adaptation, though it would not be an adaptation for the sake of the person; it would be for the sake of filling another gap for profit. The point is, technology is not working, or doing the work, for you. Take digital interfaces in the classroom as an example: they cannot “work” unless we utilize them. It is what WE do that matters. We manipulate the technology, and it does what it is told. Ultimately it is the programmers and corporations YOU are working for, the opposite of what we believe technology is doing. Every quirk has a niche and a price; there’s an app for that. Governments and companies alike are pushing for this equipment, as it benefits both of them. Given the recent revelations of widespread data monitoring by government agencies, it can be debated whether or not this is actually in the student’s best interest.

Relatedly, technology opposes progress just as much as it promotes advancement. Advancing too far, too fast will kill it; it is only logical that it organizes itself against too-fast progress. It is in its best interest to go slow and “adapt” to situations as they come up instead of preventing or pre-empting them. This is economic play. Baudrillard discussed at length in The Gulf War Did Not Take Place how speculators make money off things that never really happen, that do not catch on, take off, what have you. The same effect applies here. Economic powers are speculating on the worth of technologically enhanced classrooms, and speculating on the students. Why else would companies be interested in equipping schools with tablets, e-readers, computers, and so on? Advertising, media, promotion, money.

Baudrillard suggested that objects and technology grow like organisms: they evolve, they produce waste, they become obsolete. Classroom technology, of course, is no different. We see it every year in the new Apple products and the mad rush to purchase these expensive devices, which undergo minor changes that are often only aesthetic in nature, a planned obsolescence designed to suck money out of the masses. In the classroom, this is the greatest danger of technology. It is expensive, and cannot be replaced at the consistent rate of growth it is experiencing. There will always be a school that is out of date, and then the point of digitally enhanced classrooms is lost, as methods become obsolete as fast as the technology. Again, economics are at play. We must also deal with the waste; where does all this go? The amount of waste technology produces is astounding, and it is only growing.

There is another problem regarding the specificity of technology. Again, I noted in my first post that many instructors are incapable of fixing problems with the rudimentary technology in their classrooms. Baudrillard noted the increasing specialization in “gizmos” and technology in general. This is part of the plan: there are jobs out there for people who can fix these problems, so the people who really need to know (instructors) are not taught. Here is an example; it is not exactly about learning with digital interfaces, but it paints the general picture. In my final year of high school, all of the clocks broke and everything fell out of time. All the buzzers were 8 minutes behind, and not a single clock in the school was correct. It remained this way for quite some time because apparently there was only ONE PERSON in the entire municipality who was licensed and trained (aka entitled) to fix the clocks. Great. In terms of actual classes, why do students have to wait for IT help to get the lesson going? It would be much more efficient if the instructors knew how to work these things; then neither students nor instructors would lose 10-15 minutes of class every week.

It boils down to this: money. Baudrillard asserted that the cultural mosaic does not exist; rather, there is only one “culture,” and it is the capitalist centralization of value. Everything can be given a price, and now that education and instructors are increasingly under the economic scope, economics are extraordinarily important, especially given the enormous focus on economics today. This centralized tendency to reduce everything to monetary value is extraordinarily demeaning, and is certainly a backwards progression. Technological advances in the classroom are suffering from this, and as noted in this post and my last one, it is solely due to money, presumably greed.

There is a great technocracy at play here. It is evident in all of the above issues. It is also evident in the prevailing attitude about technology: if you do not have it, are not up to date, or are not proficient with it, you are socially ridiculed. Although it is not the social “castration” Baudrillard suggested befell those who lost their driver’s licence, the principle is still applicable. It is an expectation of people today, an expectation that everybody has a cell phone, social media accounts, an mp3 player, and so on. Technology is not to be halted; those who have not come under technology are backwards, morally and socially. This idea allows technology to flourish, and continuously creates pretexts for development, pointless or not. It is one reason why we consider the traditional classroom model out of date, and it legitimizes the push for digital enhancements.

The final idea I will explore about technologically advanced classrooms is that of a social paralysis. Baudrillard suggested technology does not actually create communication; I believe this to be a result of his historical time-frame (1968), and I will say technology DOES in fact create communication, but a non-standard type or form of communication. It cannot be denied that objects such as smartphones and other media platforms are killing true human interaction and relationships. It may seem contradictory: does social media not count as human interaction or relationships? Not in my mind. Social media deludes people into believing it does, but on some level we must be aware that we do not really have 900+ friends. It is merely an exhibit; social media does not give us the opportunity to learn how to engage our peers in the flesh, and digital interfaces in the classroom pose the same threat. I feel that speaking with colleagues and engaging in open dialogue in seminar settings allows the best ideas to be born and grow. Technology cannot, and should not, be allowed to isolate students from one another, or from the instructor. If indeed technology does not create or foster standard communication, this is what it should (at least in part) facilitate.

To wrap things up and keep it short, suffice it to say that we need to take a step back and look at digitally enhanced classrooms as something more than an educative tool. The System of Objects tells us that “there is something morally wrong about an object whose exact purpose is not entirely known.” Classroom technology is exactly such an object. Very few people know every line of code that goes into these programs, or the underlying motives behind this equipment. It is a consumption of technology, and Baudrillard left us with an excellent definition of consumption: “an activity consisting of the systematic manipulation of signs.” Technology is the sign, and there is so much at stake for all parties that everybody wants their piece of the pie, economically, politically, and so on. By considering some of the more sinister aspects of technology, and figuring out how to mitigate them as best we can while remaining as open as possible, we can make progress in this endeavour.

Technological Rationality and You

The development of a highly technological academic learning environment is at once exciting and intriguing. This exploratory post will use ideas from Herbert Marcuse’s work, drawing in particular on One-Dimensional Man. Although it was published in 1964 and written with particular economic and cultural frameworks in mind, many of the ideas presented are still relevant to today’s world; some are even increasing in relevance rather than declining, possibly indicating negative or backward trends in society as a whole. This post will examine the use of technology in the classroom, as well as the relationship between technology and typical assignments.

A main theme in One-Dimensional Man is the idea of one-dimensional thought. It is a self-imposed limitation on free thinking, in which people follow prescribed patterns of thought, are indoctrinated into specific ways of thinking, and place strong trust in operationalized definitions. We follow the definitions too closely, and this prevents any “far-reaching” change in thought. People are indoctrinated in a manner that consumes the entire individual, rather than merely imparting a set of teachings. Technology and mass production have enabled this method of total indoctrination.

Taking the idea of one-dimensional thought more broadly, we can look at issues raised by the increased accessibility of knowledge. Wikipedia, although vastly improved from its earlier states, is a case in point for the dangers of open-access material; it shows how a database can become the default knowledge base people turn to, and come to be regarded as a kind of dogma. Because it is so easy to access and to edit, I believe it is producing what Marcuse called a self-limitation in thought. Studies have linked internet use to changing brain patterns in how we decide where to find information, with the open internet, particularly Google, becoming the default starting point instead of asking others knowledgeable in the subject, doing hard-copy research, or consulting databases and search engines such as EBSCOhost. I believe this contributes to a self-limitation of thought: we are thinking less and less for ourselves, quickly turning to search engines to see what other people think and relying on that rather than gathering multiple opinions or data; our resourcefulness has decreased. The long-term impact of this self-limitation is quite negative, and in the learning environment it has a two-fold effect that must be recognized.

First is a de-skilling of learning capabilities. Many students today lack an understanding of how to use resources and how to really pull information from academic work. Students tend to search first and think later. There is also the old argument that the internet has created a text dialect, a horrid shorthand of English. Digital tools such as spell-check also allow students to be lazier, although this is debatable, since many students still somehow manage to hand in assignments riddled with errors. Second is the erosion of academic freedom, meaning students are not truly learning; they are being dictated to and specifically formed to take their place in the system. Although this effect is taken for granted and seems obvious, it is often not recognized as such because, as Marcuse points out, the system does a wonderful job of hiding this purpose within itself. The lack of resourcefulness and the tendency to take information at face value, without giving it deeper thought or reading supporting documents, creates a flawed "learning" environment. Although the problem is not as severe at the post-secondary level, anything with a "curriculum," or departments and instructors that are somewhat censored, does a disservice to academic growth.

As I noted in my previous post, there have been major calls to integrate technology into the modern classrooms of the twenty-first century. Classrooms have been slow to adopt computerized methods of teaching for a variety of reasons, including economic factors, methodological conflicts, and general access. However, there has recently been a push to equip the academic environment with technology; again, as I noted earlier, what is the point if the technology is misused, faulty, or simply ineffective? Marcuse noted the "self-validating hypotheses" being "repeated hypnotically" in society, and the push to integrate technology is exactly this. We constantly hear that digital interfaces will enhance the learning environment, and it is taken for granted that this is the case. However, the consequences are being overlooked in favour of accelerated mobilization. We have heard of adverse health effects attributed to wireless internet in some schools, yet they press on with it. The extremely rapid obsolescence rate of technology does not help either. What good is it to implement a series of technological marvels if they will be outdated in a relatively short span of time? This planned obsolescence is nothing more than greed, and it leads to the recycling and waste problems currently being experienced by the digital world.

One of the larger themes in One-Dimensional Man is the idea of a "technological rationality." Marcuse suggested that everything in society is being operationalized, rationalized, and brought into a scientific order. However, technological rationality works to protect dominant social structures at all times. It helps create and sustain a monoculture, a "rational" culture in which everything is accounted for and specifically defined, and free thinking is limited and discouraged. This is where our needs are defined and our drive towards those needs created, whether or not they are Marcuse's "true" or "false" needs. Everything is confined within the figurative boundaries of the monoculture. One-dimensional thought is created by this structure as well.

A major issue with technological rationality is that it treats everything equally: any idea or object is steamrolled and absorbed into the whole, totally disregarding any inequalities or differences. This is what prevails in the bid to equip schools with digital learning interfaces. It is often assumed that all kids these days are brought up with modern technology, or that it is "natural" to them. This is a sweeping generalization. There are often considerable gaps in access to technology, especially in rural areas. The assumption also takes for granted that parents allow children extensive use of computers or mobile technology at home, and, more importantly, that the economic capital for constantly up-to-date technology is available. This is not always the case. For example, although many of my peers are quite proficient with technology and love living connected to it, I was not brought up with it; I did not get internet access until 2005 (dial-up on Windows 98), and did not have a reasonably decent computer and connection until 2008, by which point I was finishing high school. Living in a suburban area with many working-class families, I know many people who were not brought up immersed in technology, dislike using it, and dislike seeing much of it used in classrooms. Many of my peers prefer auditory lecture styles, with slideshows serving only as a reference point.

How will assignments change? We have seen how technology has created a cycle in which people are bound to work even after their work hours are complete. The same thing could happen with higher-tech classrooms. For example, if students are equipped with tablets or laptops, will they have to do more work outside of class than ever? Marcuse emphasized the new need for continuous, mentally stupefying labour, and wasting time on a laptop or tablet is a perfect fit. On the flip side, what about students who are expected to use these devices but have no access to them? Way to go, technological rationality.

It is undeniable that there are issues with boredom in the classroom. Classic ways of teaching have been blamed, and it is argued that technology will make things "better." On its face, this seems like a good thing; a more interactive environment stimulates the student to pay more attention. However, there is an equally arguable point: if the student is simply not interested in the subject matter, the format of delivery will not matter. Statistics, whether taught by lecture, PowerPoint, Prezi, in a computer lab, or by workshop, is still statistics, and will likely still bore those students. It is an issue of self-discipline; the burden cannot always be laid on the teacher or the methods of teaching. This smacks of the entitled generation, wherein the kids and students of today never seem to be at fault.

If the issue with learning is boredom, how does a technological environment solve it? Having access to multiple efficient resources is one thing, but what do we do with them? People need to be taught how to use these tools, and so the lesson once again becomes linear. To solve boredom, one must entertain, and Marcuse suggested that "entertainment may be the most effective mode of learning," provided that we can distance ourselves from the entertainment. A digital classroom, by contrast, immerses us in the entertainment. This entertainment idea is dangerous, for it raises the chance of lessons not being taken seriously and demeans the educative environment. Just look at how classes that screen movies are regarded: easy, not taken seriously at all. Will digital classrooms be looked at in a similar way?

There are research projects currently being conducted that investigate the introduction of social media into the learning environment. How should we view this integration? To me it seems like a way of continuing the "happy consciousness": the student must be occupied at all times to make it seem as if the system is working just fine. It creates problems while simply hiding them. Is it really boredom, or is it an addiction, not necessarily to technology, but to being in contact with people? Perhaps we should be critiquing how traditional constructions of space and interaction have created an atmosphere of isolation, and looking at social media as a way of fulfilling the desire for interaction, since it transcends physical constructions of space. Such a desire is amplified by the fact that society isolates us from each other.

Here we look to Marcuse's idea of cultural diffusion. Education, historically, was a marker of class and wealth. Only a gradual process enabled the general population to be educated. Lower culture was readily absorbed into the process, and aspects of higher culture were brought down and blended with lower cultural meanings to create a harmonious, smooth operation. Negative aspects of society are also absorbed into this harmonious whole. The key to the success of this idea is looking to the past and convincing ourselves that things are better now than they were before.

As such, technology is only being integrated on the terms of the dominant social structure. As we have seen with Blackboard 9, corporate business has a strong interest in integrating technology with academia. Currently, learning with digital media can only be implemented if someone stands to profit by it, and learning with these digital tools produces a citizen who serves both societal and corporatist interests. We know that corporations pay scientists or scholars to represent a certain angle on a given topic, and ordinary people to support it. What kind of impact can this have on a database? How can the academic community regulate itself to maintain academic freedom? Third-party interests are what need addressing most. It is essential to construct as neutral an environment as possible for students to learn in, so they can think for themselves and choose their own path.

We are constantly looking for ways to do things differently from the past. I always come across self-congratulatory writing about how we have "come so far from our barbaric past," particularly when I was doing research on mental health and asylums. Needless to say, many of these references to a barbaric past concern changes that have happened within relatively short spans of time. It is a delusion of grandeur and egoism. Highly technical academic learning environments will be praised in much the same manner, but they will still suffer from lazy, distracted students, plagiarism, and boredom. They will still suffer from ineffective instructors and poor methodologies. Nevertheless, they will be considered much better than before, and a justification for these new methods will always be found and held onto.

Technological classrooms have the potential to dominate the student entirely. A continuous connection contributes to that absorption of the entire individual I mentioned earlier. Marcuse wrote that technology shows us how unfree we actually are. We cannot disconnect from it, and it blinds us to the true causes of our frustrations and oppression. Our data is collected and analyzed, and then we are sold. The growth of technology reflects not just advances in research and development, but also a growing understanding of how to cater to us by providing a piece of technology for every purpose imaginable. We are not just connected to technology in the class; we are connected to those who control the technology, the dominant social structure. We are surveilled, analyzed as statistics, and understood. This is why cultural diffusion is necessary: to hide the domination of the upper strata by infusing our own meanings into technology. In the recesses of our minds, we understand how fully entrapped we are, and escape is extremely difficult, if not impossible. We find some way to justify this, and thus, in the end, we are "better off than before." Technological classrooms are generated as a "false" need that people orient to as something rational, something worth desiring, even if it is not. This goes back to the whole idea of praising ourselves, satisfying our "happy consciousness," and believing that everything in the system is rational, working, and efficient.

As this has probably been an overwhelmingly cynical read, I must look for the positive. Through the cynicism of his work, Marcuse describes the "Great Refusal," in which he believed society could change for the better if people took an active part in resisting arbitrary change and thinking for themselves. The same applies to integrating digital media into the learning environment. Although failure is a possibility, technology in the classroom is not necessarily a doomed idea. If it resists the operationalized, "rational," standardized approach often taken by authority, there is a chance that digital interfaces can enhance the academic learning environment, particularly in the social sciences and the humanities. Strong students and teachers can make use of extended classroom dialogue through digital media, update information as it comes out, and develop stronger ideas by encouraging stronger peer-to-peer engagement. Highly interactive interfaces that are controlled and fully understood by both the instructor and the student are something worth striving for. Technology is only what people make of it, especially in academia.

People can claim that using technology has enhanced learning, and there is no doubt that it has had great effects so far. Nevertheless, the gears of progress seem to have stopped; it is stagnant. Tradition and the economics of today are holding technology back. Progress is stalled by greed, by those who wish to profit off new modes of learning. In traditionalist terms, technology has been making learning easier, and Marcuse suggested that we strongly "militate" against any fundamental changes that make life easier but take away traditional roles. To move forward, there needs to be a break from the old, constrained modes of thinking.

There needs to be a critical evaluation of these plans to integrate technology into the classroom. To substitute one form of classroom boredom for another is entirely pointless; technological teaching can be equally or more ineffective than bad teaching in the current system. Although I believe technology can be effective if used in very particular ways, it is not the fix that some people hope it is. The student of today is as much a factor in the failing classroom experience as inefficient teaching methods are. There needs to be a shift in the way people think, and a change in how education is viewed and valued, rather than simply patching the problem.

Credit for inspiration and ideas: Herbert Marcuse, One-Dimensional Man

Panopticism in the Classroom

by Allen Kempton 

Technology for Technology's Sake?

Calls have been made by both academics and students to integrate technology into the classroom. As a result, many classrooms have been hastily equipped with sometimes-faulty equipment that nobody can use, or even wants to learn how to use, such as SMARTBoards. Almost every room in my school is equipped with one of these boards, yet they are not used to their fullest potential, if used at all. Most professors and students treat them as projector screens. This rather defeats the purpose of having such technology in the classroom.

However, assuming technology can and will be fully integrated into the academic learning environment, some things need to be critiqued and explored. For example, how can an instructor facilitate a class that is technologically dependent? How is it accomplished? How can productive discipline with technology be created? Arguably, technology provides more distractions than one can possibly imagine in a classroom. Countless times I have seen students playing games, watching videos, or chatting with friends during class hours. How can we negotiate discipline among students without being too intrusive? This is a difficult subject to address, but it is well worth probing.

The Panopticon

Foucault’s model of the disciplinary society is the concept I will use for this investigation into the intersections between tech and teaching. The Panopticon, the architecture theorized by Jeremy Bentham, serves as a basis for the disciplinary society. In essence, the Panopticon is a structure that allows for maximum supervision, technically all-seeing, with minimal effort. The original idea was a central guard tower in a prison, from which the guards could see all the prisoners at any given time, but the prisoners could not see the guards. It was always unverifiable whether guards were actually watching, so the prisoners had to act as if someone was watching them at all times, creating discipline. Foucault expanded the idea to the level of general society, terming it “panopticism.” His description of how this is accomplished is deep and intricate, but for the sake of this discussion we can summarize by suggesting that society is deeply surveilled in ways we would not think possible at this moment. Our actions are monitored, our data is collected, signs of surveillance are everywhere, and somewhere all this data is analyzed and acted upon. Here, panopticism’s disciplinary effect is realized: a coercive power that pushes people to do things, to act in certain ways, and to orient to the very idea that we are constantly being scrutinized and judged. With this, we can head back to the integration of technology in the classroom. Let’s stick with the classroom for now, because the effect of panopticism on publicly available digital databases is too deep for this first exploratory post.

Technology in the academic environment is also monitored to some degree, a panoptic effect insofar as the monitoring is unverifiable by most users. Indeed, most students I know were shocked to learn that instructors can monitor students’ actions to some degree on Blackboard. Recently I learned that school e-mails are monitored as well. Some tech classes use classroom management software that allows the administrator or instructor to view what is happening on any given screen in the classroom, provided the computers are connected to the same system. In programs like Khan Academy, statistics are collected to see where focus is needed most. Beyond educative technologies, we are also observed by video surveillance cameras in classrooms. Most of the time this is done without the knowledge, much less the consent, of the students, although it is not as if we are given a choice. Increasing awareness of the pervasiveness of marketing in these programs is causing some concern among instructors (as seen with the advertisements in Blackboard 9; read our scathing critique of Blackboard here), and causing some people to automatically assume they are being observed while operating technology.

Although we are aware of some degree of observation, this does not mean we approve of it. Often we are forced into submission: get watched or don’t get the program! In a conspiracy-theorist sort of way, the possibilities for abuse of panopticism in school systems are terrifying. In America, there was a report of a student being penalized for an incident on a school-supplied computer while at home; if I recall correctly, it was “inappropriate use,” but the point is that the principal could view what the student was viewing at any given moment. Schools like UOIT (University of Ontario Institute of Technology) supply school computers; do we know all the programs on them? Not likely. If current surveillance technologies in classrooms are any indication, it is not unreasonable to predict that a more fully integrated academic environment will be even more closely observed. The very issue of observation must be addressed.

This point of view seems overwhelmingly negative, and for the most part it is. However, as Foucault points out, observation can be utilized to increase efficiency. Technology is the number-one distraction because people are convinced they can get away with it. An integrated system that allows peers to see what each other is doing could be an optimal solution. With such a system, people would be able to see who is slacking on the work at hand. In an academic space, people who consciously neglect their task should feel compelled to get back to work if all eyes in the classroom are on them. If all members of the classroom have the ability to observe, this protects against a centralized, unverified power. People tend to be very self-aware of their actions on a computer when they are being watched, so the possibility of increased focus on the task at hand is much greater. With increased focus, more productive discourse and analysis can be brought to any given assignment. Collective creativity can also produce better work and facilitate some interesting interaction.

A major factor in student distraction, I believe, is an unwillingness to engage in discourse with each other. A panoptic arrangement in which everybody communicates with everybody else can help solve boredom. Social media lets us get to know each other better; it acts, in a sense, as a disciplined partition through which we display who we are. If we have a built-in forum for students to learn about each other in the classroom, there is greater opportunity for people to get to know each other, which makes it easier to talk to one another. Society today is very good at isolating people, but it does not have to be this way. An integrated social media forum in the classroom keeps the data relatively private, if handled appropriately, and is far more effective than the little ice-breakers some professors use, as it lasts longer and enables broader communication. Discussion in which people are comfortable with each other tends to produce the best results, as people are not so afraid to state contradictory views. Technology, with its panoptic attributes, can help achieve this.

Implementation is the key. Panopticism in the academic environment can be either a negative or a positive tool, and must be recognized as such. It must not be used as a mechanism of power; it must be usable and understood by all participants. One must look at surveillance technologies in a classroom as something beyond simple deterrence, and understand the true effect of the power of mind over mind. Foucault suggested that panopticism can be remarkably effective as an educative device, as with work ethics. Let us not forget the uses of visibility beyond simple definitions of discipline.

Public Feeling and Iconic Images: Mass Cultural Exhibitionism?

Co-authored by Sara Humphreys


“Kent State Murder”

Type it into Google and a number of images, websites, and blogs will come up, along with the Neil Young song above. The historical moment is repeated, consistently, through images and through the circulation of these images on the internet and other mediums. Digital culture has canonized this moment of grief and loss. It has canonized the moment of turmoil and fear that shook America when the Kent State tragedy was publicized in newspapers and broadcast on the news, days after President Nixon announced that American troops would be invading Cambodia as a side mission to aid "success" in Vietnam. When students protested Nixon's decision to invade Cambodia, the National Guard was sent to restore order on the Kent State campus, and the result was that four students were shot and killed and many more were injured (Hariman and Lucaites, No Caption Needed: Iconic Photographs, Public Culture, and Liberal Democracy). The iconic image of a young woman wailing over the dead body of her friend, who lies flat on the sidewalk, has become one of the most circulated images of the 1960s and is symbolic of this historical period.

Iconic images are felt individually, but they often have a social function as well. I wondered how images circulated through digital media affect users differently, since digital media users are exposed to a plethora of images on the internet. Whether advertisements (which often plague the internet), video, film, memes, or GIFs, images are everywhere. I am using Kent State as an example. In education, especially in history and historical research, photos are primary evidence that can be used to better understand a particular time. Historical moments like Kent State resound through historical, literary, and sociological research. Kent State is the epitome of student subjugation, and the image is still circulated widely in universities. What is more, the image also serves as an excellent example of cultural exhibitionism, something the digital generation is very practiced at. Since it is so widely circulated, does it affect us anymore? Rather, should it affect us anymore? Sometimes the line between the real and the digital gets blurred.

The technology we use surrounds us like a fishbowl, enclosing us in a false sense of agency and knowledge. The internet is another world, one which, according to Geoffrey Nunberg in his article "Farewell to the Information Age," "promises to disrupt this process [between sender and receiver]." It is a mediation ground that sharply increases the proportion of writers to readers. Far more content is being produced than is actually read, since it has become so easy to produce content on blogs, YouTube, and elsewhere (518). Images like "Kent State" become memes and, through this hyper-reproductive process, I wonder if they lose their meaning. I weave Nunberg's article in here because he talks about how the internet affects how readers use information:

“On the Web, that is, you can never have the kind of experience that you can have with the informational genres of print, the experience of interpreting a text simply as a newspaper or encyclopedia article without attending to its author, [its] publisher, or the reliability of its recommender. We read Web documents [including images] not as information but as intelligence, which requires an explicit warrant of one form or another” (519-520).

So, we read the Kent State murder differently on the net than in another medium (such as the book I am reading now). Since information on the net comes from a dynamic set of sources (some good, some bad), readers have to sift through information to find good sources. The internet makes information casual, including something as iconic as Kent State. I use Nunberg's article in this post because he disputes the claim that the internet will overtake print as the key source of information. Although he acknowledges that we are in the digital age, he argues that the information age still strongly influences how we gather, share, and learn information through a variety of mediums. In fact, without the groundwork laid by that age, the digital age simply would not exist. It is clear that humans cannot exist without sharing information using all sorts of language. Digital technology simply presents a new means of sharing, one that interweaves a multitude of languages and mediums, showing the evolution of language technologies. Kent State becomes a narrative accessible through a variety of these technologies. We can access it through video, music, scholarly articles, and journalistic accounts. The single event becomes accessible through the medium of choice, and the reader can see how social this event actually was.

This brings me to a key point in this post. I noticed that in No Caption Needed, Hariman and Lucaites write about iconic photographs from the early-to-middle twentieth century. What makes a photo "iconic" seems to be linked to this age, when photos were circulated only through certain networks. The general public was at the whim of journalists and news corporations, who presented certain images that would attract readers to a story. In the introduction to their book, Hariman and Lucaites provide an extensive definition of what they mean by "iconic." I include part of this definition below:

Iconic photographs provide an accessible and centrally positioned set of images for exploring how political action (and inaction) can be constituted and controlled through visual media . . . . These images were obviously highly specific objects of memory and admiration, yet also somehow abstract representations whose value was far more symbolic than referential, and more a public art form than objects for connoisseurship (5-6).

Photojournalism played a major role in chronicling the twentieth century, the age of information. However, we are now in the digital age. Information is no longer as sacred, or as rare, as it used to be. We no longer share only through specific means (e.g., the phone or the photograph) or specific cultural practices (e.g., story-telling and conversation). We now share through all of these ways in a very abstract fashion. Information is everywhere, and the internet has made society a network (or networks) of information. We are on a constant, continuous search for it.

How does this affect what it means to be "iconic"? Is anything "iconic" anymore? 9/11 puts the concept of "iconic" into question, since it was a recent moment in history that transformed the world in countless ways. While reading No Caption Needed, I kept thinking of the "Falling Man" image (below). I have always had a difficult time looking at it. In fact, my immediate emotional response is nausea and vertigo. The image captures a moment of complete loss. An inescapable fall into nothingness. No Hollywood ending. The jumper could not escape the fire in the tower and was forced to make a terrible decision. The "Falling Man" invokes terror and pathos in those who view it, or perhaps macabre glee; in any case, affect is generated, and will continue to be, because this photo is digitally rendered, available for all to view at will. Perhaps authenticity comes into question when something is seen so many times that the viewer is desensitized to its effect.

Yet the "Falling Man" still seems to have an effect. The image also functions as a symbol of national loss; therefore, it functions for nationalistic purposes. This, according to Hariman and Lucaites, is crucial to the definition of "iconic": its function for political gain (18).

So, is it "iconic"? Somewhat. The digital age, especially the advances in digital technology that followed 9/11, has allowed for the mass circulation of images and video. The "Falling Man" has become an iconic image, perhaps the first and last one, of the digital age. Yet it also remains among the company of many digitally circulated images and videos that have been diluted of their meaning. They are viewed by so many that it eventually becomes difficult to tell where and when a photo was taken. Perhaps the image represents the subversion of political meaning. The photo and video of the "Falling Man" now transcend any political purpose. The image cannot function to support a particular party, nation, or ideology. It transcends its iconic potential, since it deals with complete loss. It deals with death. In this way it is similar to the Kent State image. Both situations show tragedy. In the Kent State photograph, this tragedy functioned to ignite opposition to the Nixon government, the war in Vietnam, and the invasion of Cambodia. The "Falling Man" was used to justify the invasions that followed 9/11; by 2012, however, the image resists lending support to any violence. It is a still frame of potential devastation. What happened to the "Falling Man" is not known. The "Falling Man" does not bestow a nationalistic identity, even though it brings together a nation in trauma. Due to its mass circulation, itself a symptom of how fascinating, yet ubiquitous, the image is, the "Falling Man" is an image of an uncertain fate, which America faced at the time and still faces. Perhaps an uncertain identity is the new identity of the digital age.

Don Tapscott on Post-Secondary Possibilities and the Net Gen

Co-authored by Sara Humphreys

When we began this site and the project, our group was uncertain where to really begin.  There were so many questions.  How do we define what we are doing?  What are we doing?  The research my colleagues and I have done has provided as many questions as answers.  Work by author, consultant, and technology expert Don Tapscott has influenced my contributions to this site, in particular his focus on the Net Generation (Net Gen) and digital (digitized?) education.  Therefore, the decision to ask Don to contribute to the site was as necessary as it was easy. Don very kindly provided insightful comments on how education is evolving to better instruct and teach the Net Gen and why this evolution must happen. Feel free to check out Don’s full answers in a Prezi located on the main page, or check out the highlights in this post (or do both!):

To start, here is an insightful and perhaps incendiary comment from Don:

“One of the biggest reasons student[s] abandon classrooms in secondary and post-secondary education is that they’re bored”

Whoa. I can hear feathers ruffling…or perhaps sabers rattling? It’s an easy equation: students + classroom = boredom.  Students yawning, drooling, chatting, and texting are all common images, and the student is often cast as the bad guy in this scenario.  However, Tapscott makes a crucial point in the above quotation.  Students are bored – not because they dislike the material, but because the classroom environment does not suit the way they learn.  The Net Gen grows up surrounded by a variety of technological mediums.  What is more, students are attracted to new technology that offers knowledge at a rapid rate and in exciting forms. Students sit with their tablets, laptops, and superphones watching video, listening to music, reading text, and performing their networked selves. All of this multimodality opens up a whole new world of learning for these students. How can a teacher lecturing at the front of the classroom with slides and maybe one or two videos measure up?

Tapscott’s main argument is that students grow up in a multi-networked environment, where background noise from technology is normal.  The Net Gen is cognitively different from previous generations.  They are used to distraction, but are not always distracted from the task at hand – they just process information differently.  Tapscott argues that it is the criteria for what counts as distraction that need to change, not so much the student.  The student is the product of an environment enriched by digital media, and once they enter a classroom, which often functions on the traditional “Industrial Model,” they are at a loss (for a short video by Sir Ken Robinson on the deficiencies of the Industrial Model of education, click here).

“[T]he evidence shows that giving students laptops, for example, can free the teacher to introduce a new way of learning that’s more natural for kids who have grown up digital at home”

Laptops in the class give students the freedom to explore and record (through typing and actual recording) the information provided.  Social media use is another thing: students will often flip from Facebook to Twitter and then look up what an instructor is discussing – all in a matter of seconds. Are students learning, or are they distracted? Don says students are learning, but in a new way: they are taking the information they receive in class and expanding on it, which brings into question the efficiency and efficacy of traditional teaching methods. What if students can learn to teach themselves?  The role of the teacher/instructor/professor could change, since they would no longer be the centre of all knowledge but a distributor of knowledge that can be explored further using technology.  Educators have to become guides to a network of information rather than gatekeepers to one way of knowing.  Don states that the educator is similar to TV (a legacy medium – how about that, folks?): both are forms of one-way communication.

“Youth today are abandoning one-way TV for the higher stimulus of interactive communication they find on the Internet. Sitting mutely in front of a TV set–or a professor –doesn’t appeal to or work for this generation. They learn best through non-sequential, interactive, asynchronous, multi-tasked and collaborative activities. Digital immersion at a formative stage of life has affected their brain development and consequently the way they think and learn.”

It is simple: we teach how we learn.  Therefore, the divide between student and educator will persist until the educator is capable of learning the way their students do. This divide is also apparent in the private sector:

“If companies don’t respond appropriately, Net Geners will start their own corporations”

Don further states that this generation is a “Global Generation,” which has five main qualities: “norms for freedom, customization, collaboration, integrity and innovation.” Further, the “Global Generation” (or Net Gen) is a generation that wants to know why, not just how.  The knowledge of why is the power they harness as they create the next world for both past and future generations.  The Net Gen is a generation that will bring profound change, but it will come in bytes.  Gradually, the world will look back on the world these digital natives have created and realize the enormity of the change that has occurred in several decades, not several centuries.

And digital technologies are the tools for this change. It’s clear that if academia wants to “train” youth for the future, then academia needs to respond appropriately, or Net Geners may bypass university entirely or, more likely, educate themselves.

Thank you to Don Tapscott and Kejina Robinson of the Tapscott Group.  Both provided us with truly useful information.

Twitter Me Impressed or Why We Are Not the “Dumbest Generation”

One of the most intriguing qualities I’ve discovered in the realm of the Digital Humanities is that this loose and baggy discipline seems to span countless other disciplines. Today in Toronto’s Public Reference Library, I sat in a cubby with a stack of no fewer than ten books, spanning New Media Studies, Literature, Philosophy, and even pithy, dentist-office-sensational-titled nonfiction studies like The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (Or, Don’t Trust Anyone Under 30), by Mark Bauerlein. But I’ll get to that piece of…”writing” later.

What coalesces across these multiple disciplines is the differentiation between the cultures of reading & critiquing (consuming), more situated in the 20th century, and of building & making (producing) in the 21st. It is not my intention to set up a dichotomy, but rather to acknowledge a difference in social participation that recurs throughout many texts and theorists speaking to the Digital Age.

In his introduction to a digital humanities reader, Debates in the Digital Humanities, Matthew K. Gold references a three-minute plenary speech by Stephen Ramsay at the Modern Language Association Convention in 2011 that would go on to spur controversy.

Ramsay stated that those who are truly involved in the Digital Humanities (DH) needed to build things and therefore had to know some sort of coding or programming language. I am not sure how in-depth he meant (how about rudimentary HTML, Steve?), but the point was that digital literacy in the humanities means more than knowing how to use MS Office. Ramsay’s speech (what is it with people named “Ramsay” and controversy?) brings up interesting questions about the directions DH research needs to take.

Gold goes on to provide an introduction with plenty of probing questions for DH scholarship. Has what was formerly New Media Studies become the Digital Humanities? Does the Digital Humanities need theory? Does the Digital Humanities have politics? Is it accessible to all members of the profession? Do social media platforms like Twitter trivialize the Digital Humanities professional discourse?

I’m going to pause and address this one, because I have seen it surface again and again: from quipping newspaper headlines quivering in their ad-revenue boots at the threat that ‘print is over’, to the thinly veiled, patronizing head-pat-ernalism of a clumsy radio commentator, to all varieties of dressed-up and dressed-down entry points to discussions of ‘this young generation’.

As a literature major, and someone not quite 30, it irked me to hear a CBC program discussing the nomination of the graphic novel Essex County for Canada Reads 2011, wherein a participant in the broadcast chucklingly reflected on the genre of the graphic novel as something of the ‘Tweet of literature.’ Turning two emergent genres and platforms for narrative and theorizing against each other in such a belittling, unintelligent fashion is an increasingly common grating of gears, meddling with the media presented to our younger generations.

But where exactly is all this coming from? In a body of work entitled Technoromanticism (a term coined by Stephen Barron), Richard Coyne explores the pragmatics of cyberspace by discussing the ways in which narratives derived from the Enlightenment and Romantic periods speak situationally; how “every attempt is being made to consign the Enlightenment to the realms of the other, to render it strange and unfamiliar” (180).

In Bauerlein’s sensationalizing study, he balks at a hypothetical ‘Susie’ and her proud parents, who beam at her ability to navigate a multiplicity of media concurrently but overlook her failure to say what the Soviet Union was. With a quick brush and silver hair, he examines ‘the minds of youth’ as negatively influenced by the fleet of digital media that clutter our bedrooms, with the books that remain serving merely as bookends. Our intellects (assuming we have any) and our physical spaces are set up for distraction. In his interview with reason.tv, he attempts to explain his distrust of the younger generation as something of a failure on our part to use ‘technology’ to return to ideas of mastery connected to former assemblies of knowledge. However, in his explanation, he instead demonstrates his own lack of understanding of the technological objects he (at times improperly) names. At 1:05 he begins to explain that these tools are not used to ‘progress’ ourselves, to go to museums, to learn, to engage. He explains Facebook as something of a Lyceum where we’d all rather smoke pot and compare hairstyles. But what about our education in school – are we directed to the Smithsonian Institution website? Or has he actually been to Buzzfeed? There’s smart stuff on there, Mark, mixed in with news about “catz.” In other words, maybe this is a failure of the classroom and not so much of the user?

Through my apparently feeble-minded Facebooking fingertips, I turn your attention to the Twitter account of Kim Kierkegaardashian, and reflect on what we’ve already introduced here on the work of Don Tapscott.

There is a permeating fear, not strictly located in older generations but common among publics discouraged by technology and digital media: a fissure, or “trauma” as Coyne posits, from this shift in the comfort of understanding time through a past and a future. A ‘networked self’, Papacharissi explains, is always negotiating sites of production and construction of the self, and the suggestion that Facebook is a ‘distraction’ is a myth.

Digression: Dr. Strangeblog – Or, How I Learned to Stop Worrying and Love Social Media

Have no fear, Dave is here

I have been a fan of Stanley Kubrick’s films since I was a teenager.  I loved how Kubrick seemed to be a mystic of sorts.  He relied on the audience’s susceptibility to be entertained while also being convinced that what they were seeing was reality.  Kubrick constructed a fourth wall using meta-film.  To a critical viewer, his films revealed the process of filmmaking and showed how films are central to our understanding of what it means to be human in modern times.  2001: A Space Odyssey has always been a personal favourite of mine because I love to question the film’s symbolism, how it pertained to Kubrick’s world, and how it pertains to mine.

I think Kubrick and I thought similarly.  Would humans become so reliant upon technology that they would forfeit trusting their own instincts?  The black monolith: it is terrifying, yet powerful.  Its symbolism transcends its physical form and prompts early humans to act on instinct.  Are we controlled by these instincts?  Or does something else – our technology – control them?

The film brings up many questions, some of which I would like to apply to my first digression on this blog (there will not be many of these, since they are quite, ahem, unprofessional ;D).  Yet it sheds light on an issue that is important: my initial fear of writing about technology.  To be honest, when I started out on my posts, I needed to catch up on the tech knowledge that comes naturally to many of my peers.  My innocence, or perhaps it was ignorance, did come in handy.  My lack of knowledge on the subject made me seek it out as much as I could this past summer.  Libraries, academic journals, scientific studies, and, most importantly, conversations with my peers provided me with some excellent background information.  The posts have gradually become more creative and innovative.  I enjoy writing them now, since I feel like I can finally swim through this knowledge; my grasp is getting stronger.

What did I have before?  Technophobia.  Perhaps.  Only a technophobe would take Kubrick, yet another tech skeptic, so seriously, right?  A motif has cropped up, and a very interesting one at that: how much the medium truly is the message.  Technology is the word; it is the language.  When we use it, we are learning how to use that language.  We can look foolish.  We can use it incorrectly.  However, we can also shape it.  The blog allows for sculpting.  It becomes an art when you do not feel like you are working but creating.  I love to write creatively, so my best pieces usually seem to come out of thin air.  They are seamless and, since I write poetry, I attribute their origin only to a muse, or an inspiration.  Ideas for the posts have started to come from these inspirations, these muses.  One has been my concern for marginalized groups, another the digital divide I have personally witnessed in class.  Yet another post was developing these ideas and proposing solutions.  There will be more posts on these themes; however, I believe the posts will now explore the very essence of the medium: innovation.

Fear of anything is provoked by a lack of knowledge of its essence [Aside: did that sound like Yoda?  Or perhaps Lao Tzu? Well read, I am ;)].

[Another Aside: Hal wasn't actually evil.  It was merely Dave's understanding of evil and, thus, his displacement of his irrational, human fears onto Hal.  Dave was fearful of the unknown, so he made an irrational judgement - a completely human one].

The innovation will come from recent interviews with some very interesting, wonderful academics and digital media experts.  The research that I have done so far is very human-focused.  The technology these people use becomes a means for them to express their ideas; to express their humanity.  To follow their lead, I would also like to delve into my own ideas for this technology.  Digital media has made me ponder how and why technology can help people communicate in ways they never have before.  I tend to look at horizons: I like the future day that is coming.  The horizon begins to light up a world that will be more communicative.  It will share more – maybe even care more.  Not in a mushy way, but in an imperative way.  With digital technology in education, students are able to progress much faster and more easily than previous generations could.  The actual structure of education is changing.  Curriculum is changing.

Structure is changing.  Through language.  The semiotics that structure our online world are affecting our real one in ways never expected.  With this knowledge, students can go into the world and build upon the logistics of communication.  A more communicative world is, hopefully, a less violent one.  With more communication there are fewer misunderstandings and a greater desire to connect on deeper levels.

7 billion people in this world.  Nearly 2 billion live in poverty.  Several billion more lack fair access to technology and the internet.  It seems we have a problem, Dave.

Really, we have a resource.  Infinite resources.  One medium: digital technology.  Before becoming afraid of this technology – and I am sure many of my peers still will be – we need to see its benefits as they apply to our greatest needs and desires.

Ciao for now, Sara