Tag: technology

Tags: business, innovation, psychology, science & technology, social systems

The logic of a $1000 iPhone

[Image: the author's now-retired iPhone (TimePhone.jpg)]

Today Apple is expected to release a new series of iPhone handsets, with the base price for one model at more than $1000. While many commentators are focusing on the price, the bigger issue is less about what these new handsets cost than about what value they'll hold.

The idea that a handset — once called a phone — that is the size of a piece of bread could cost upward of $1000 seems mind-boggling to anyone who grew up with a conventional telephone. The new handsets coming to market have more computing power built into them than was required for the entire Apollo space missions and dwarf even the most powerful personal computers from just a few years ago. And to think that this computing power all fits into your pocket or purse.

The iPhone pictured above was 'state of the art' when it was purchased a few years ago and has now been retired to make way for the latest (until today) version, not because the handset broke, but because it could no longer handle the demands placed on it by the software that powered it and the storage space required to house it all. This was never an issue when people used a conventional telephone, because it always worked and it did just one thing really well: allow people to talk to each other at a distance.

Changing form, transforming functions

The iPhone is as much a technology as it is a vector of change in social life, both a product of and a contributor to new ways of interacting. The iPhone (and its handset competitors) did not create the habits of text messaging, photo sharing, tagging, social chat, or augmented reality, but it wasn't just responding to humans' desire to communicate, either. Adam Alter's recent book Irresistible outlines how technology has contributed to behaviours that we would now call addictive. These include a persistent 'need' to look at one's phone while doing other things, constant social media checking, and an inability to be fully present in many social situations without touching one's handset.

Alter presents evidence from a variety of studies and clinical reports showing how tools like the iPhone, and the many apps that run on it, are engineered to encourage the kind of addictive behaviour we see permeating society. Everything from the design of the interface, to the type of information an app offers a user (and when it provides it), to the architecture of social tools that encourage reliance and draw people back to their phone, creates the conditions for a device that is no longer a mere tool, but one with the potential to play a central role in many aspects of life.

These roles may be considered good or bad for social welfare, but in labelling behaviours or outcomes this way we risk losing the bigger picture of what is happening amid our praise or condemnation. Dismissing something as 'bad' can mean we ignore social trends and the deeper reasons people do things. Labelling things as 'good' risks missing the harm that our tools and technology are doing and how that harm could be mitigated or prevented outright.

[Image: a crowd of people looking at their phones (PhoneLookingCrowd.jpg)]

Changing functions, transforming forms

Since the iPhone first launched, it's moved from being a phone with a built-in calendar and music player to something that can power a business, serve as a home theatre system, and function as a tour guide. As apps and software have evolved to accommodate mobile technology, the 'clunkiness' of doing many things on the go, like keeping accounts, taking high-quality photos, or managing data files, has been removed. Now laptops seem bulky, and even tablets, which have evolved in power and performance to mimic desktops, are feeling big.

The handset now serves as our tether to each other and helps create a connected world. Who wants to lug cables and peripherals to and from the office when you can do much of the work in your hand? It is now possible, if still awkward, to run a business without a computer. Financial tools like Freshbooks or Quickbooks allow entrepreneurs to do their books from anywhere, and tools like Shopify can transform a blog into a full-fledged e-commerce site.

Tools like Apple Pay have turned your phone into a wallet. Paying with your handset is now a viable option in an increasing number of places.

This wasn't practical before; now it is. With today's release from Apple, new tools like 3-D imaging, greatly improved augmented reality support, and enhanced image capture will all be added to the user's toolkit.

Combine all of this with the social functions of text, chat, and media sharing, and the handset has transformed from a device into a social connector, business driver, and entertainment hub. There is little that can be done digitally that can't be done on a handset.

Why does this matter?

It's easy to get wrapped up in all of this as technological hype, but to do so is to miss some important trends. We may have concerns about the addictive behaviours these tools engender, the changes in social decorum the phone instigates, and the fact that it becomes harder to escape the social world when the handset also serves as your navigation tool, emergency response system, and e-reader. But the demand to have everything in your pocket, rather than strapped to your back, sitting on your desk (and your kitchen table), or scattered across different tools and devices, comes from a desire for simplicity and convenience.

In the midst of the discussion about whether these tools are good or bad, we often forget to ask what they are useful for and not useful for. Socially, they are useful for maintaining connections, but they have proven less useful for building lasting human connections at depth. They are useful for providing us with near-time and real-time data, but not as useful at allowing us to focus on the present moment. These handsets free us from our desks, but also keep us 'tied' to our work.

At the same time, losing your handset has enormous social, economic, and (potentially) security consequences. It's no longer about missing your music or not being able to text someone: when most of one's communications, business, and social navigation functions are routed through a single device, the implications of losing that device become enormous.

Useful and not useful/good and bad

By asking how a technology is useful and not useful, we can escape the dichotomy of good and bad, which causes us to miss the bigger picture of the trends we see. Our technologies are principally useful for connecting people to each other (even if superficially), enabling quick action on simple tasks (e.g., shopping, making a reservation), finding simple information (e.g., Google search), and navigating unknown territory with known features (e.g., navigation systems). These uses rest on a desire for connection, a need for data and information, and the alleviation of fear.

Those underlying qualities are what make the iPhone and other devices worth paying attention to. What other means do we have to enhance connection, provide information, and help people feel secure? Asking these questions is one way we shape the future and provide either an alternative to technologies like the iPhone or a better amplification of what these tools offer. The choice is ours.

There may be other ways we can address these issues, but thus far we haven't found any that are as compelling. Until we do, $1000 for a piece of technology that does all this might be a bargain.

Seeing trends and developing a strategy to meet them is what foresight is all about. To learn more about how better data and strategy through foresight can help you, contact Cense.

Image credits: Author

 

Tags: design thinking, psychology, science & technology, social systems

The Disconnected Human


In this mini-series we look at the phenomenon of paradox and some of the prominent paradoxes in our social world. Today we look at human disconnection in the face of mass urbanization, globalization, social media, and information technology, and ask why so many feel so isolated in a world pushing ever-more interconnection.

It is possible to have a wristwatch that is connected to a phone which also connects to your online social network platforms, a heart rate monitor, earphones and maybe even the appliances in your home. We do not lack for connections, but we do often lack connectedness.

As it turns out, connectedness matters. Laboratory and clinical research on addiction (summary) suggests that it is as much a problem of social connection as of biochemistry: those who are socially isolated experience addiction at higher rates. Loneliness, in psychological terms, refers to the absence of connection and communication with others and the negative affect and cognition that result. Loneliness is not just a social issue; it gets under the skin (pdf), with research pointing to pathways for harmful biochemical changes among those who are lonely.

Still, how is it that we are in an age of increasing global urbanization (pdf), in which more people are living closer to each other and exposed to more people than ever before, and yet there is a parallel increase in loneliness? How can it be that we have more tools — ones that are mobile, instantaneous, and easy to use — that can share rich media with nearly everyone we know at nearly any time of day, and still find ourselves isolated?

This paradox is all by design — whether intended or not.

Disconnecting the social graph: Facebook

Facebook is a classic example of how to design isolation into a tool aimed at promoting connection. Once a college tool for meeting and connecting with friends, it now serves as a news source, community organizing tool, general communication platform, organizational home page, text messaging system, and photo album. Its ever-changing, sophisticated algorithm ensures that every login to Facebook is different, aside from the current format of the top post followed immediately by an ad, then more top posts, more advertising, and so on. When you need to find something specific, it's very hard, and that's by design. The more connections you have, the harder it is to find material and maintain those connections without sifting through material that, ironically, disconnects you from the purpose of your visit.

This all helps keep you on the site and coming back.

But this very aspect of having to come back frequently, to see different things each time, and to root through social and marketing ephemera to get to something that feels social is what isolates us. The intermittent reinforcement that comes when you log in and find just what you want the moment you open the screen is based on a crude but powerful set of psycho-biological principles that anchor behaviour to the pleasurable rush of dopamine that comes with a social media high. Add in everyday stress, the cortisol it releases, and the oxytocin rush that comes when we connect, and you've got the perfect ghost-in-the-machine scenario to keep you locked on to a tool that offers the hope of connection.

Now Facebook is aiming to integrate this experience, and your data along with it, into its other properties like Instagram and WhatsApp. This has the added benefit for Facebook of doing what other marketers already do: follow me everywhere I go on the Internet, reminding me of more things to buy, consume, and connect to, which will only add (paradoxically) to my sense of disconnection. Other social media platforms do this differently, but nearly all of them offer a variant of the same stimuli aimed at keeping you posting pictures, exchanging messages, and sharing content.

Stimuli addiction

My friend, colleague, and fellow designer Medina Eve wrote a powerful, deeply personal, provocative piece on living with ADD as an adult and the lost generation of souls who share her circumstances. Her brave, detailed story chronicles how she, like many young women in particular, has struggled with focus due to ADD, despite being an incredibly productive, intelligent, engaged person. Her story provides a first-person account of a social epidemic and a paradox: the ability to connect to so much leaves little ability to connect deeply to many of the things that matter, and incredible isolation results.

When the world offers too much to pay attention to (or filter through), we get too little in return.

ADD is, at its core, an addiction to stimuli. It is the bodymind getting overwhelmed by the amount of stimulation around us, which reduces our ability to filter, ignore, and reject the stimulation of various sorts coming at us. If you have any doubts about how much stimulation we are exposed to, practice a mindful meditation in which you aim simply to pay attention to what's around you and what's in your head. It's remarkable that everyone doesn't have it.

This is a problem I've certainly battled, and continue to battle with limited success, and I am certainly not alone. This addiction to the stimulation around us, particularly through socially connected media and the explicit and ambient technologies that facilitate it all, is not only making us less connected, it's also making us less human. And this, too, is by design.

Stimulation by Design

The image below illustrates how we design for stimulation. Imagine the holiday season at Covent Garden Market in central London. All around there is music, food, and bustling crowds doing holiday shopping; shopkeepers and buskers selling everything from entertainment to handbags to Lebanese street food; and air filled with the scents of perfume, cuisines from around the world, and cedar from the holiday wreaths. All of this is lit up and decorated as the crowds jam through the stalls, eateries, and cafes to take it all in. This is what Covent Garden wants, and it is why people come from all over the world. If there were no people, less 'stuff', and less activity, it wouldn't be attractive, which is why not all of London's markets look like this.

But thankfully for us all, we can’t take Covent Garden with us. We have the option to disengage from it in a way we don’t with social technology.


The Holiday Crowd at Covent Garden

What you will also see among this bustle are families walking together, friends gathering over a drink, and individuals roaming through the market, maybe even stopping to take a picture or two. For those who are enjoying this space, I suspect they are doing so because it’s special. While London is a very crowded, colourful city, it’s not this crowded or colourful all the time (although that is changing, too).

But what happens when the energy of the crowd and the space turn against us? Most of the time, human beings adapt. I am sure that if you were to bring someone here from even 100 years ago, they might break down at the experience of all this stimulation, because they aren't used to it. Many of us are. Or are we?

Social disconnection and its sequelae may be pointing to the paradox in our quest to create ever more stimulation and feedback opportunities: we are loosening our ability to connect to the very things at the heart of much of this stimulation, pleasure and the connection to our own humanity.

 

Giving up the Internet: A case study

Comedian Louis C.K. has a funny, poignant reflection on what we lose in this stimulated world during a guest spot on the Conan O’Brien show.

Kids, he argues, don't build empathy without real interactions and without building the ability to be yourself, with yourself; the kind of experiences you can only have away from technology. What a powerful thought.

Louis C.K. was so concerned about what technology was doing not only to his kids, but himself that he ‘quit the Internet’ altogether as you can see in the segment below.

What Louis C.K. did was design the conditions under which he used (or didn't use) technology. His aim was to create, improve, and remedy the experiences he had with his children, and he found a way to do it. Aside from some tech support from his daughters, he did this all alone. The reward was increased connection to his family; what we don't know is what it cost him to disconnect. Maybe that cost was worth it.

 

Invisible problems, invisible solutions?

The point here is that design is often best when it's invisible. That is what makes the stimulation economy so insidious: its reach is everywhere, yet it is often not noticed, which makes it a very successful design. The challenge, if we wish to channel the stimulation in our lives and increase the connectedness that this paradox of connecting tools undermines, is to design equally invisible solutions.

That is the focus of what is to come in this series along with a deeper exploration of connectedness and its shadow, loneliness.

Photo credits: Disconnect by Randy Heinitz used under Creative Commons License via Flickr. Thanks for sharing your work Randy.

Covent Garden at Christmas by the author

Tags: complexity, journalism, science & technology, social systems

Our Paradoxical Age

[Photo: crowds at Pokemon Go's debut, Toronto (pokemongodebut)]

If there was a word we could use to define the current times, paradox would certainly have to be a leading candidate. Can we learn to love this seemingly maddening force or are we doomed to accept this emergent complexity? This first in a series looks at some of the paradoxes of the day, what they might mean for our society and how we might live with them. 

paradox |ˈperəˌdäks|

noun

a statement or proposition that, despite sound (or apparently sound) reasoning from acceptable premises, leads to a conclusion that seems senseless, logically unacceptable, or self-contradictory: a potentially serious conflict between quantum mechanics and the general theory of relativity known as the information paradox.

• a seemingly absurd or self-contradictory statement or proposition that when investigated or explained may prove to be well founded or true: in a paradox, he has discovered that stepping back from his job has increased the rewards he gleans from it.

• a situation, person, or thing that combines contradictory features or qualities: the mingling of deciduous trees with elements of desert flora forms a fascinating ecological paradox.

If you think we're living in strange times, you're not alone. Technology and its influence on our social world have produced things, ideas, and encounters that only a few years ago would have seemed utterly preposterous, if not impossible. Self-driving cars, drone delivery, digital social networks, and video telephony, as remarkable as they are, were somewhat presaged by science fiction and, in a Jetsons-esque manner, seem plausible to those whose imaginations are sufficiently rich or whose attunement to popular culture is sufficiently robust.

What I am talking about are the less dramatic or technologically sophisticated, but still powerful, shifts that have come from new products and services, moving from the occasional 'once-in-a-blue-moon' occurrence to something common and regular. This is producing paradoxes in droves, presenting conundrums for social scientists, policy makers, and citizens alike.

Pokemon Go and the case of the walking dead-or-alive(?)

The above photo was taken July 11th in Toronto, Canada. I was walking through a square when I noticed nearly everyone there, dozens of people with more coming every minute, congregating to look at their phones. Unbeknownst to me at the time, five days earlier a game called Pokemon Go had been released in Japan and the United States (it would not be released in Canada for another six days, so these individuals were all using a version they'd obtained through some kind of digital work-around). These people were all chasing Pokemon characters who happened to "be" in that square.

The Pokemon Go craze had just ignited, and the phenomenon wouldn't hit the mainstream news for another day or two, so I was left wondering what was going on and thinking how sad it was that one of the nicest days of the summer was being spent by so many standing and looking at their phones. I took some pictures, made some inquiries, and was left amazed, confused, and slightly depressed all at the same time.

To call Pokemon Go a game seems misleading. So does calling it a community, a phenomenon, a technological marvel, a marketing coup, a social convener, a public health risk, a public health benefit or a waste of time. It’s something completely new and it brings with it a number of puzzling, odd and paradoxical qualities as noted in the piece below from The Pipe Dream Meme.

As one man reports: "I'm not the most physically fit person, obviously, but I have walked more since this game has come out than I have in my entire life." He goes on to talk about how he knows more about his city than he ever did before thanks to Pokemon Go, a video game that keeps a person focused on their screen rather than the actual city around them, even though what is on the screen is based on the physical city (and countryside and cemeteries and more) that serves as the backdrop for the hundreds of Pokemon characters living through a handset.

Whether you consider the thousands of people walking around your city staring at their handset the walking dead (as disconnected from the world) or the living (as engaging with the world, differently) is a matter of perspective.

Paradox, thy name is Pokemon.

Extensions of humanity to what?

Marshall McLuhan wrote (PDF) that the medium is the message and that the tools and technologies embedded in media initially extend our humanity, then culturally envelop it, making us an extension of them. A simple look at capitalism and the use of money as a means of negotiating our sense lives illustrates this, as McLuhan points out.

“Money has reorganized the sense life of peoples just because it is an extension of our sense lives. This change does not depend upon approval or disapproval of those living in the society.”

McLuhan cites the work of Carl Jung to support his thesis, drawing on a quote that illustrates how insidious such paradoxes can be for a society:

Every Roman was surrounded by slaves. The slave and his psychology flooded ancient Italy, and every Roman became inwardly, and of course unwittingly, a slave. Because living constantly in the atmosphere of slaves, he became infected through the unconscious with their psychology. No one can shield himself from such an influence (Contributions to Analytical Psychology, London, 1928).

Thus by enslaving others we ourselves become enslaved.

Perhaps no better example of this paradox exists than the way we've created tools to learn, exchange information, and automate activity — making our work much more efficient — only to find ourselves either overworked or out of a job entirely. We've created a capitalist consumption system that relishes efficiency in order to provide us with more of what we want and need to survive, thrive, and be happy, and yet we seem to put ourselves out of work; create stressful over-work for many of those who have jobs; destroy the planet (the only place we have to live); disconnect ourselves from society and from ourselves; and contribute to mental health disorders along the way.

Things ought to be amazing — and in many ways they are — but the horrors created along the way are as notable and significant, not only for our lives today but for the future of the planet. This is the paradox of plenty.

Creating stupidity through knowledge

The problems we’ve created from consumption would be manageable if it was simply an issue of lack of knowledge. Solving knowledge-based problems is pretty straightforward: you find the right information, package it appropriately to the right audience, and ensure you deliver that message at the right time and place. This is the basis behind the knowledge transfer model and second generation of knowledge-to-action theories. Ask any marketer and they’ll tell you that while there’s no one way to do this and it does take work and experimentation, the mechanics are pretty straightforward.

Yet knowledge (and the truth linked to it) is losing its power to sway people in the information society, which is based largely on the production, consumption, and use of knowledge. As we gain access to more knowledge about something, we are often less informed and more likely to discount the very thing we are using to make decisions. Paradoxical, isn't it?

The ascendancy of Donald Trump from real estate developer/reality TV host/beauty pageant promoter to Republican candidate for President of the United States is as good an example as you'll ever find. Irrespective of whatever policy positions you might hold, it's impossible to deny that his track record of outright lies is beyond the pale. Or maybe it's not impossible, and that's the problem.

Clay Johnson wrote about this phenomenon, drawing parallels between our obesogenic culture and our information consumption. He was inspired by an encounter with a protester in the early days of the debate over what would become 'Obamacare', who held a sign saying "Keep your government hands off my Medicare". He recalls the encounter in his book The Information Diet:

I spoke to this protestor about his sign. He seemed rather well educated — sure, he was angry, but he was not dumb, just concerned about the amount of money being spent by the current administration…This man did not suffer from a lack of information. Yet he had failed to consider the irony of holding a sign above his head asking government to keep its hands off a government-run program. To him, it made perfect sense.

 

So what’s to be done? Anything? That’s what I’ll explore in the next post.

Tags: complexity, journalism, science & technology, social media

The more we get together


As we forge ever-greater connections online to each other and to the world of ideas, the thinking was that we would be far better off: more tolerant, educated, and wise. Yet there is much evidence to suggest this isn't the case. What does it mean to come together, and how can we do it in a way that brings us closer rather than driving us further apart?

The more we get together, the happier we’ll be – lyric from popular song for children

Like many, I've grown up thinking this very thing and, for the most part, my experience has shown it to be true. On reflection, though, I realize that most of this experience relates to two things that could reveal a flaw in my thinking: 1) I'm thinking of face-to-face encounters more than any other type, and 2) most of these relationships were formed without the aid of (or before my use of) the Internet.

Face-to-face interactions of any real quality are limited in nature. We only have so many hours in a day and, unless our job is extremely social or we live in a highly communal household, most of us are unlikely to have much interaction beyond "hello" with more than a few dozen people per day. This was explored in greater detail by anthropologist Robin Dunbar, who determined that our social networks are usually capped at 100–250 individuals; Dunbar's number, the mean size of these networks, is commonly held to be 150.

Why does this matter? When we engage others online, the number of interactions and ideas we encounter can be far larger, and the way those relationships are managed is certainly different. We see comments on discussion boards, social media posts, videos and pictures shared online, and media messages of all types from myriad news sources (official, professional, and otherwise). Ethan Zuckerman, whom I've written about before, has written extensively about the paradox that, despite such incredible access to the world's diversity, we often find ourselves increasingly insular in our communication patterns, choosing like-minded opinions over alternative ones.

Looking ahead by looking back at Marshall McLuhan

Journalist Nicholas Carr, who's written extensively on the social context of technology, recently posted an interview with Marshall McLuhan from 1977 in which McLuhan speaks about where media was going and his idea of "the global village". Carr's piece, "The Global Village of Violence", was enlightening, to say the least. In it, Carr points to the violence we are committing in this global village and how it doesn't square with what many thought were the logical outcomes of our connecting — and he does so by pointing back to McLuhan's own thoughts.

McLuhan's work is often a complicated mess, partly because a large, diverse, and scattered academic culture has developed around it, so the original points he raised often get lost in what came afterwards. His cautions about hyper-connection through media are one of those things. McLuhan didn't consider the global village to be an inherently good thing; indeed, he spoke about how technology at first serves us and then partly controls us as it becomes a normalized part of everyday life — the extension becomes a part of us.

As is often the case with McLuhan, looking back on what he said, when he said it, and what it might mean for the present day is instructive. It helps us do just what his seminal work sought to help us do: understand media and society. Citing McLuhan, Nicholas Carr remarked that:

Instantaneous, universal communication is at least as likely to breed nationalism, xenophobia, and cultism as it is to breed harmony and fellow-feeling, McLuhan argues. As media dissolve individual identity, people rush to join "little groups" as a way to reestablish a sense of themselves, and they'll go to extremes to defend their group identity, sometimes twisting the medium to their ends.

Electronic media, physical realities

These 'little groups' are not always so little, and they certainly aren't weak. Donald Trump's ability to rally a small but not insignificant population in the United States to join him, despite his litany of abusive, sexist, inflammatory, racist, discriminatory, and outwardly false statements, has been constantly underestimated. Last week's horrible mass shooting in Orlando brought a confluence of groups into the spotlight, ranging from anti-Muslim, anti-gay, and gay rights groups to pro-gun advocates, along with Republican and Democratic supporters of different positions within this matter, each arguing with intensity and too often speaking past each other. Later in the week we saw British MP Jo Cox murdered by someone who saw her as a traitor to Britain, presumably on account of her position on the pending 'Brexit' vote (although we don't yet know the killer's motivation).

 

There are many reasons for these events, and only some of them will we ever truly know, but each points to an inability to live with, understand, and tolerate others' viewpoints, and to extreme reactions against them. The vitriol of public debate is being blamed for some of these reactions, galvanizing some to do horrible things. Could it be that our diversity, the abundance of interactions we have, and the opportunities to engage or disengage selectively are part of what drives these extremes?

If this hypothesis holds, what then? Should we start walling ourselves off? No. But nor should we expect to bring everyone together under one tent and have it go well without very deliberate, persistent, collective cultivation and management of relationships. Much as a gardener tends her garden, there's a need to keep certain things growing, certain things mixing, certain things out and others in, and these elements might differ depending on the season and the plants being tended. Just as there is no one garden style that fits everywhere, there is no one way to do 'culture', but there are key principles, and a commitment to ongoing attention and care, that feed healthy cultures (ones that include diversity).

As odd as this may sound, perhaps we need to consider doing the kind of civic development work that can yield healthy communities online as well as off. We certainly need better research to help us understand what it means to engage in different spaces, what types of diversity work well and under what conditions, and what those ‘simple rules’ might be for bringing us closer together so that, like the children’s song above, we can be happier rather than continuing down the path we’ve been on.

Complexity isn’t going away; it is only increasing. Unless we are actively involved in cultivating and nurturing the emergent properties that are positive and healthy, doing so by design, and viewing our overlapping cultures as complex adaptive systems (and creating the policies and programs that fit those systems), we put ourselves at greater risk of letting the things emerge that drive us apart rather than bring us together.

 

Photo credit: Connections by deargdoom57 used under Creative Commons License via Flickr. Thanks deargdoom57 for sharing your work!

 

behaviour changepsychologysocial mediasocial systems

My troubled relationship with social media

6847365223_79666079f5_o

Do you care about donuts? I did, once. I’m not so sure anymore.

I used to love donuts, was passionate about donuts and spent the better part of my early career looking at the power of social media to transform our understanding of and engagement with donuts. Just this week, I had a paper published, co-authored with colleagues, that looked at how Twitter is being used to engage audiences on donuts, er, vaping, and its potential public health implications. I’m still into donuts, but the question is whether donuts are still serving the purpose they once did. It’s left me asking….

Is it still time to make the donuts?

Twitter turned 10 this past month. When it was founded, the idea of communicating short 140-character chunks of content to the world by default (unlike Facebook, where you could restrict your posts to your ‘friends’ only by default) seemed absurd, particularly to me. Why would anyone want to use something that was the equivalent of a Facebook status update without anything else? (Keep in mind that link shorteners were not yet in wide use, and the embedded pictures and lists we have now were either not invented or highly cumbersome.)

However, social media is a ‘participation sport’, as I like to say, and by engaging with it I soon realized Twitter’s enormous potential. For the first time I could find people who had the same quirky collection of interests as I did (e.g., systems science, design, innovation, Star Wars, coffee, evaluation, soccer, politics, stationery and fine writing instruments; not necessarily in that order, but in that combination) and find answers to questions I didn’t think to ask from people I didn’t know existed.

It was a wonder and I learned more about the cutting edge of research there than I ever did using traditional databases, conferences or books much to the shock, horror and disbelief of my professional colleagues. I’ve often been considered an early adopter and this was no exception. I did research, consultation and training in this area and expanded my repertoire to Instagram, Pinterest, YouTube, LinkedIn and pretty much everything I could including some platforms that no longer exist.

I developed relationships with people I’d never (and still have never) met from around the world, whose camaraderie and collegiality I valued as much as or more than that of people I’d known for years in the flesh. Those were heady times.

But as with donuts, it’s possible to have too much of a good thing. And also as with donuts, where I once loved them and enjoyed them regularly, consuming them now doesn’t sit so well, and that’s maybe for the better.

I’m left questioning whether it’s still time to make the donuts.

The river I stand in

This river I step in is not the river I stand in – Heraclitus

As with donuts, the experience of social media, the context of its use, has changed. As I age, eat better, exercise more wisely and am more mindful of how I feel and what I do, donuts have lost their appeal. They probably taste the same, but the experience has changed, not because the donuts are different, but because my dietary and lifestyle context is.

The same is true for social media.

I have never been a techno advocate or pessimist; rather, I’ve been a pragmatist. Social media does things that traditional media does not. It helps individuals and organizations communicate and, depending on how it’s used, engage an audience interactively in ways that ‘old media’ like billboards, radio, TV and pamphlets do not. But we still have the old media; we just recognize that it’s good at particular things and not others.

But the river, the moving and transforming media landscape, is much faster, bigger and bolder than it was before. Take the birthday girl or boy, Twitter: it has grown into a ubiquitous tool for journalists, celebrities and scholars, yet saw a small decline in its overall use after a year of flatlined growth.

(Twitter monthly active users via TechCrunch)

As for Facebook, it’s faring OK. While it still has growth, I’ve struggled to find anyone who speaks in glowing terms about their experience with the service, particularly anyone who wishes to change their privacy settings or stem the flow of ads. Over at Instagram, my feed has seen the rise of ‘brands’ following me. No longer is it the names of real people (even if it’s a nickname); it’s usually some variant of ‘getmorefollowers’ or brands or something like that. All this as I see more ads and less life.

Information overload and filter failure

Speaking to an audience in 2008, author and media scholar Clay Shirky spoke to the problem of ‘information overload’, a term being applied to the exponential rise in people’s exposure to information thanks to the Internet and World Wide Web. His argument at the time was that the issue was less an overload of information than a failure of our filter systems to make sense of what was most useful.

But that was 2008. That was before the mobile Internet really took off. That was when Twitter was 2 and Facebook only a couple of years older. In the third quarter of 2008, Facebook had around 100 million users; now it has a population of more than 1.6B users. The river has grown bigger and more full. That might be nice if you’re into white water rafting or building large hydro-electric dams, but it might be less enjoyable if you’re into fly fishing. I can’t imagine A River Runs Through It with a water feature akin to Niagara Falls.

As journalist Douglas Rushkoff has pointed out in many different fora, the Internet is changing the way we think.  Indeed, ‘smarter technologies’ are changing the way we live.

This all brings up a dilemma: what to do? As one who has studied and advised organizations on how to develop and implement social media strategies, I would be a hypocrite to suggest we abandon them. Engaging with an audience is better than not doing so. Humanizing communications, something social media can do far better than speaking ‘at’ people, is better than not. Being timely and relevant is also better than not. Yet the degree to which social media can deliver on these fronts is masked by the volume of content out there and the manner in which people interact with it.

Walk through any major urban area, take public transit, or watch people in line for pretty much anything and you will find a substantial portion of humans looking at their devices. Even couples or friends at restaurants are left to concoct games to get people paying attention to each other, not their devices. We are living in the attention economy, and what is increasingly valuable is focus, not necessarily more information, and that requires filtration systems that are not overwhelmed by the volume of content.

Emotional pollution and the antisocial media

I recently wrote about how ‘the stream’ of social media has changed the way social activism and organizing is done. While social media was once an invaluable tool for organizing and communicating ideas, it’s become a far more muddled set of resources in recent years. To be sure, movements like Black Lives Matter and others that promote more democratic, active social engagement on issues of justice and human dignity are fuelled and supported by social media. This is a fantastic thing for certain issues, but the question remains: for how long?

Not so long ago, my Facebook feed was filled with the kind of flotsam, jetsam and substance of everyday life. It was about pictures of children or vacations, an update on someone’s new job or their health, or perhaps a witty observation on human life, but the substance of the content was the poster, the person. Now, it is increasingly about other people and ‘things’. It’s about injustices to others and the prejudices that come with that; it’s about politics (regardless of how informed people are); it’s about solidarity with some groups (at the willful ignorance of others) and about rallying people to some cause or another.

While none of these is problematic in itself, and in some measure they are actually quite healthy, they are almost all I see. On Twitter, people are sharing other things, but rarely their own thoughts. On Facebook, it’s about sharing what others have written and the poster’s emotional reaction to it.

Increasingly, it’s about social validation. Believe my idea. “Like” this post if you’re really my friend. Share if you’re with me and not with them. And so on.

What I am left with, increasingly, is a lost sense of who the ‘me’ and the ‘them’ are in my social media stream. What it feels like is that I am wading into a stream of emotional pollution rather than human interaction. And when my filters are full, this gets harder to do. I’m not sure I want to be less sensitized to the world, but I also don’t want my interactions with others to be solely about reacting to their rage at the world or some referendum on their worldview. It seems that social media is becoming anti-social media.

In complex systems we might see this as a series of weak, but growing stronger, signals of something else. Whether that’s collective outrage at the injustices of the world, the need for greater support, or the growing evidence that social media use can be correlated with a sense of loneliness, I’m not sure.

But something is going on and I’m now beginning to wonder about all those donuts we’ve created.

Photo credit: Chris Lott Social Media Explained (with Donuts) used under Creative Commons License via Flickr

About the author: Cameron Norman is the Principal of Cense Research + Design and works to assist organizations and networks in creative learning through design, program evaluation, behavioural science and systems thinking.

 

complexityemergenceevaluationinnovation

Do you value (social) innovation?

Do You Value the Box or What's In It?

Do You Value the Box or What’s In It?

The term evaluation has at its root the term value, and to evaluate innovation means to assess the value it brings in its product or process of development. It’s remarkable how much discourse there is on the topic of innovation that is devoid of any discussion of evaluation, which raises the question: do we value innovation in the first place?

The question posed above is not a cheeky one. The question about whether or not we value innovation gets at the heart of our insatiable quest for all things innovative.

Historical trends

A look at Google N-gram data for book citations provides a historical picture of how commonly a particular word shows up in books published since 1880. Running the terms innovation, social innovation and evaluation through the N-gram software reveals some curious trends. The graphs below show that the term innovation spiked after the Second World War, with a closer look revealing a second major spike from the mid-1990s onward, likely due to the rise of the Internet.

In both cases, technology played a big role in shaping interest in innovation and its discussion. The Cold War of the 1950s and the Internet each presented new problems to find and a need for those problems to be addressed.

(N-gram charts for innovation, social innovation and evaluation)
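For those curious to reproduce this kind of trend-spotting, the raw Google Books Ngram exports are published as tab-separated rows of (ngram, year, match_count, volume_count), which can be normalized against each year’s total token count to get the relative frequencies the viewer plots. A minimal sketch (the counts below are illustrative toy data, not real N-gram figures):

```python
from collections import defaultdict

def relative_frequency(rows, totals):
    """Per-year relative frequency of each ngram.

    rows:   iterable of (ngram, year, match_count), the shape of the
            raw Google Books Ngram TSV exports.
    totals: dict mapping year -> total tokens counted that year.
    """
    freq = defaultdict(dict)
    for ngram, year, count in rows:
        freq[ngram][year] = count / totals[year]
    return freq

def peak_year(series):
    """Year in which an ngram's relative frequency peaks."""
    return max(series, key=series.get)

# Toy data mimicking the export's shape.
rows = [
    ("innovation", 1950, 120), ("innovation", 1995, 900),
    ("evaluation", 1950, 300), ("evaluation", 1982, 1400),
    ("evaluation", 1995, 800),
]
totals = {1950: 1_000_000, 1982: 2_000_000, 1995: 2_500_000}

freq = relative_frequency(rows, totals)
print(peak_year(freq["evaluation"]))  # 1982
```

Normalizing by yearly totals matters because the number of books published grows over time; raw counts alone would make nearly every term look like it is rising.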

Below that is social innovation, a newer concept (although not as new as many think), which showed a peak in citations in the 1960s and ’70s, corresponding with the U.S. civil rights movement, the expansion of social service fields like social work and community mental health, anti-nuclear organizing, and the environmental movement. This two-decade rise was followed by a sharp decline until the early 2000s, when things began to increase again.

Evaluation, however, saw the most sustained increase of the three terms over the 20th century, yet has been in decline since 1982. Most notable is the even sharper decline at the very moments when innovation and social innovation spiked.

Keeping in mind that this is not causal or even linked data, it is still worth asking: What’s going on? 

The value of evaluation

Let’s look at what the heart of evaluation is all about: value. The Oxford English Dictionary defines value as:

value |ˈvalyo͞o|

noun

1 the regard that something is held to deserve; the importance, worth, or usefulness of something: your support is of great value.

• the material or monetary worth of something: prints seldom rise in value | equipment is included up to a total value of $500.

• the worth of something compared to the price paid or asked for it: at $12.50 the book is a good value.

2 (values) a person’s principles or standards of behavior; one’s judgment of what is important in life: they internalize their parents’ rules and values.

verb (values, valuing, valued) [ with obj. ]

1 estimate the monetary worth of (something): his estate was valued at $45,000.

2 consider (someone or something) to be important or beneficial; have a high opinion of: she had come to value her privacy and independence.

Innovation is a buzzword. It is hard to find many organizations that do not see themselves as innovative or use the term to describe themselves in some part of their mission, vision or strategic planning documents. A search on bookseller Amazon.com finds more than 63,000 titles organized under “innovation”.

So it seems we like to talk about innovation a great deal, we just don’t like to talk about what it actually does for us (at least in the same measure). Perhaps, if we did this we might have to confront what designer Charles Eames said:

Innovate as a last resort. More horrors are done in the name of innovation than any other.

At the same time I would like to draw inspiration from another of Eames’ quotes:

Most people aren’t trained to want to face the process of re-understanding a subject they already know. One must obtain not just literacy, but deep involvement and re-understanding.

Valuing innovation

Innovation is easier to say than to do and, as Eames suggested, is a last resort when the conventional doesn’t work. For those working in social innovation the “conventional” might not even exist as it deals with the new, the unexpected, the emergent and the complex. It is perhaps not surprising that the book Getting to Maybe: How the World is Changed is co-authored by an evaluator: Michael Quinn Patton.

While Patton has been prolific in advancing the concept of developmental evaluation, the term hasn’t caught on in widespread practice. A look through the social innovation literature finds little mention of developmental evaluation, or even evaluation at all, lending support to the extrapolation made above. In my recent post on Zaid Hassan’s book on social laboratories, one of my critiques was that there was much discussion about how these social labs “work” with relatively little mention of the evidence to support and clarify that claim.

One hypothesis is that evaluation can be seen as a ‘buzzkill’ to the buzzword. It’s much easier, and certainly more fun, to claim you’re changing the world than to interrogate one’s activities and find that the change isn’t as big or profound as expected. Documenting change isn’t perceived to be as fun as making change, although I would argue that one is fuel for the other.

Another hypothesis is that there is much misunderstanding about what evaluation is, with (anecdotally) many social innovators thinking that it’s all about numbers and math and that it misses the essence of the human connections that social innovation is all about.

A third hypothesis is that there isn’t the evaluative thinking embedded in our discourse on change, innovation, and social movements that is aligned with the nature of systems and thus, people are stuck with models of evaluation that simply don’t fit the context of what they’re doing and therefore add little of the value that evaluation is meant to reveal.

If we value something, we need to articulate what that means if we want others to follow and value the same thing. That means going beyond lofty motherhood statements that feel good (community building, relationships, social impact, “making a difference”) and articulating what they really mean. In doing so, we are better positioned to do more of what works well, change what doesn’t, and create the culture of inquiry and curiosity that links our aspirations to our outcomes.

It means valuing what we say we value.

(As a small plug: want to learn more about this? The Evaluation for Social Innovation workshop takes this idea further and gives you ways to value, evaluate and communicate value. March 20, 2014 in Toronto).

 

 

businesscomplexitydesign thinkingevaluationinnovation

Developmental Design and The Innovator’s Mindset

Blackberry Swarmed By Ignorance

Blackberry Swarmed By Ignorance

Blackberry, once the ‘must have’ device, is no longer so and may soon no longer exist. Looking back on how the mighty device maker stumbled, the failure is attributed to what was done and not done, but I would argue it is more about what was unseen and not thought. Ignorance of past, present and future is what swarmed them, along with a lack of developmental design in their culture.

Today’s Globe and Mail features the above-pictured story about how and why Blackberry lost out to Apple’s iOS iPhone and Google’s Android-powered phones, due in large part to its focus on its stellar enterprise security system and its failure to consider what would happen when competitors fielded ‘good enough’ models. It’s a tale years in the telling, and what may be the beginning of the end of the once globally dominant Canadian tech leader.

Getting out

Those I’ve known who’ve worked for Blackberry describe a culture devoted to engineering excellence above all, one that emphasized technical superiority and attention to the technology over the users of that technology. Perhaps if more of those engineers had gotten out a bit more beyond their own circles, they might have noticed a few things:

  1. Facebook, Twitter and social media sites that all seemed fun at first were quickly becoming more than just pastimes, they were being used as communications tools for everything from family and friends to work;
  2. Cameras were being used to capture photos and videos, share them and edit them (like Instagram and now Vine) for purposes beyond social, but also to take photos of PowerPoint presentations at events, brainstorming whiteboards and prototypes;
  3. The rich media experience provided through other devices meant that the keyboards were less important — typing faster and easier was being weighed against screen dimensions for videos, photos and interactive content;
  4. Workers were passionate enough about these new tools that they would bear the cost of their own phone and carry two devices rather than rely on a Blackberry alone if they were required to have one.

I saw this phenomenon all over the place. Embedded in this pattern were some assumptions:

  1. Email was the most important form of productivity (this might also include learning);
  2. Email was fun;
  3. Email got people communicating.

Few people I know like email anymore. We tolerate it. Almost no one in the work world gets too few emails. Email is a useful and highly embedded form of communication; so much so as to be nearly a dominant design in our business communications.

What a little anthropological research on RIM’s part would have produced is some insights into how people communicate. Yes, email is the most pronounced electronic method of communication for business, but it doesn’t excite people like a video does or engage conversation like Twitter can or enable re-connection to close peers or family like LinkedIn and Facebook do. These are all platforms that were lesser served by the Blackberry model. What that means is that email is vulnerable to those things that attract people.

In complexity terms rich media is an attractor; it organizes patterns of activity around it that stimulate creativity in the system. This meant that a lot of positive energy was being directed into these new means of engagement over others and that when given the opportunity to choose and use a device that supported this engagement better people (and eventually the firms they worked for) began to opt for them over Blackberry.

Ongoing innovation

Developmental design is a process that incorporates the tenets of design thinking with developmental evaluation, strategic foresight, business model innovation and contemplative inquiry. It means constantly evaluating, assessing, designing and re-designing your product offerings as things change, and developing a constant attentive focus on where you are, where you came from, and the weak and strong signals that indicate shifts in a culture.

This is a new way of doing innovation development, evaluation and strategy, but it is the necessary ingredient in a space where there are high levels of complexity, rapid churn in the system, and high demand for action. Increasingly, this is no longer just the domain of high tech, but of banking, retail, healthcare, education and nearly every system operating in multi-jurisdictional environments. When we (the customers, patients, students…) were very much the same, we could treat our systems simply. Now the ‘we’ is different and the systems are complex.

Developmental design is the praxis of innovation.

What would Steve Jobs do?

It is interesting to note that today is the day the bio-pic on Steve Jobs is released into theatres. Jobs knew developmental design even if he never named it as such. He famously ‘got out’ in his own, unique way. He went for walking meetings rather than sat in boardrooms. He watched what people did and channeled his own passion for creating things into a company culture that was designed to create things to help people create things. To that end, he was among the most outstanding innovators of the last 50 years.

Yet Jobs and his team were good at paying attention to where things had gone (the computer), where they were (increasing bandwidth capability and demand with the Internet), and where they were going (decentralized production). Thus a number-crunching machine was turned into a suite for personal creativity (Mac), which spawned a music player (iPod) and online store (iTunes), which led to a multimedia communications handset (iPhone), which inspired a handheld tablet (iPad).

Apple is the most valued tech company in the world because of that vision, one that has been questioned in light of Jobs’ passing and the new leadership in place at the company.

Blackberry is not unique. The leaderboard in consumer mobile technology has changed from Motorola to Nokia to RIM (Blackberry) to Apple to Samsung (Android) in less than 15 years. That is enormous churn in a sector that touches over three quarters of the world’s population directly (more than toilets). While perhaps an extreme case, it is becoming a model to pay attention to for other industries on different scales.

Ask yourself: Are you Blackberry today or Apple yesterday?

If you apply developmental design to your work, you’ll have your answer.