Saturday, February 27, 2016

Where Do We Go from Here?

Over the course of the past few weeks, this blog has sought to discuss the various points at which communication and innovation collide. That collision point is exactly where I believe most Americans find themselves. We find ourselves daily trying to keep up with the latest technological devices, the newest social networks, and whatever trending topics our friends, family and neighbors are talking about. We crave information, and each of us, regardless of age, goes through various channels to obtain it. This information morphs into conversations that happen around dinner tables and Web platforms worldwide.

When Johannes Gutenberg invented the printing press in the mid-15th century, technology as the world knew it changed forever. Suddenly, information could be printed and distributed to the masses. No longer would learned knowledge be controlled by the few; now, it belonged to the people. Enlightenment thinking would soon follow, along with so many other movements and paradigm shifts throughout history enabled by the easy distribution of information. Today, the printing press has been rendered obsolete by technologies that can distribute entire catalogues of data to the masses in mere seconds. What paradigm shift is next? Where do we go from here?

It’s hard to imagine that, at some point in time, whether any of us are here to witness it or not, people are going to look back on the technologies of 2016 as ancient. People will find iPhones not in electronics stores, but in museums. Facebook will be like the phone book: a symbol of a time gone by.

“Those were simpler times,” they’ll say. And I think that statement will be uttered because the future will be less about the technologies themselves and more about control of those technologies. As we enter a point in time when artificial intelligence is no longer science fiction, and we begin exploring self-healing materials and devices that can understand human tendencies down to an individual level, there is also going to be the desire to create technology that has the capacity to control us. We already see this on a small scale with “phantom phone vibrations” that stem from a desire to remain connected to one’s devices. Imagine a time when we will have completely wired homes, cars, and businesses. The ability to use the technologies we interact with as a means of controlling or even manipulating other people will be too appealing for some to pass up. We have to have good people working to put these emerging technologies and communication systems to good purposes. We have to work on the side of good.

One of the reasons I began pursuing a master’s degree in strategic communication is that I see too many people in control of these avenues of communication who do not have the best interests of the public in mind. They work as freelancers, independent contractors, or companies that care only about revenue coming by way of viewership and readership. As a result, journalism, media, and social networking have done tremendous harm to many while doing just enough good to keep people addicted. I believe this is wrong. I believe there is a better way. And, I believe that with the commitment of good people who think strategically about the ways in which we communicate and interact with one another, we can do great things with technology that ultimately serve the greater good.

Technology in the Classroom: Think of the Children!

A few weeks ago we discussed some striking trends in media consumption. Specifically, we noted that Millennials aged 14 to 25 now turn to social media as their primary source of news at a rate that will soon eclipse television, perhaps by the time you read this blog. We’ve also seen that Americans are trending toward using mobile devices exclusively for online access, and that some homes no longer have desktop or laptop computers at all. Naturally, these trends are being driven by young people, who have time and again proven to be the innovators and opinion leaders of the technological revolution in American society.

In this country, young people have almost always been the drivers of technological innovation. Alexander Graham Bell was 29 years old when he and Thomas Watson made the first telephone call. Philo T. Farnsworth demonstrated his “electrical image” device, which we now know as television, when he was 21. And, of course, there’s Bill Gates and Steve Jobs, two of America’s most prolific college dropouts. Gates was 19 years old when he founded Microsoft, and Jobs was 21 when he and Steve Wozniak started Apple. The list of young innovators grows with each passing year. Our young people are the big thinkers, the minds that have the capacity to grab hold of the future and pull it into the present.

This unbridled ability to think without limits has also created problems and issues worth considering. Now that the world has been blessed with the inventions of computers, mobile devices and the Internet, we have become the keepers of this technological power. We work to keep these technologies safe yet ensure that they can be used to their fullest ability. We struggle with conversations over how much influence any one person or group should have over their use and contents. We regulate them just enough to create some sense that our most vulnerable citizens — our children — will not be harmed in their use.

In many ways, we are still in the “Wild West” era when it comes to Internet safety and digital citizenship. This unfortunately means that children are at an elevated risk of exposure to cyber threats and obscene content when interacting with technology and a free and open Internet. The fine line between Americans’ First Amendment rights and the rights of all children to be protected from harm continues to create heated and fluid conversations within private organizations, all levels of government, and individual homes.

Nowhere is this conversation happening more than in our schools. As a communicator for a public school system, I hear continual discussions on the issue of technology, our deployment of it, and how we should use it responsibly. The interesting thing about this conversation is that it affects everyone in a school. All of us are charged with the responsibility of being good stewards of the resources that have been given to us. This means that we must give the best knowledge and information we have to our children so that they can learn it, analyze it, and synthesize new information on their own. But we also have a responsibility to create a digital fence around our children so that they can explore and learn without fear of anyone compromising their experience or their safety.

The benefits of technology in the classroom are becoming clearer. As students are encouraged to use devices at school or even bring their own from home, teachers and administrators must simultaneously curate the best educational tools available through the Web. These technologies create opportunities to engage students who might not otherwise benefit from more “traditional” teaching methods that, frankly, were designed with greater concern for uniform approaches than for individual minds.

At the same time, however, there must be layers of vetting and research for the apps and websites we use in the learning process, and then additional layers in place to ensure that these apps align with responsible safety and security policies. The Consortium for School Networking (CoSN) has stated that “no one is exempt from the threat of cyber attacks,” and parents deserve “assurances that student data are protected.” Education advocacy groups like the National Education Association thus emphasize digital citizenship and the need for educators to build “responsible use” policies for students as well as teachers.

For any of this to actually benefit children, however, there must be a desire to bridge the “digital divide.” Technology is a means of information sharing, and it is ultimately up to the student to decide what to do with that information once it has been given. But it is unjust to limit any child’s ability to gain access to the information that comes by way of technology. There is a growing concern that an overreliance on technology at schools may breed a new social stigma among children: instead of “being teased for clothing choices, now, perhaps it is because the child cannot afford the next-gen iPad.”

I think that we will bridge that divide in time as technology costs fall, and we should encourage innovation projects that seek to close it. A great example is seen in the rural town of Piedmont, Alabama, where the school district in 2009 established a program that gave a laptop and free home internet access to every child in grades 4-12. Since the program’s implementation, the number of students taking college entrance tests has more than doubled. Their test scores are now above the state average. Enrollment in the school district has jumped 20 percent. Apple even recognized Piedmont’s high school as the only Apple Distinguished School in America.

This is the promise that comes from harnessing the power of technology in a meaningful way for our young people. When we remove the limits to access and simultaneously encourage responsible use of technology, the innovators of tomorrow are more prepared to create the “next big thing” that inspires us all.

Saturday, February 20, 2016

Branding and the Consumer Experience

I have had the good fortune over the course of my career to work at several organizations that underwent retooling of their brands. For most companies, developing a brand is a long, arduous process that involves multiple phases, a great deal of research, and even more time and effort spent getting people to buy into the new brand once it is revealed. I have seen branding campaigns managed in a very structured and thought-out way, and I have seen other campaigns thrown together that ultimately resulted in a mish-mash of the old and new brands.

My first experience with a rebranding project took place at my alma mater while I was an undergraduate college student. The university’s brand image had been in place since the 1980s, and the institution’s image among potential and current students had stagnated as a result. The university’s brand seemed to be an afterthought, and the students perceived it that way. So, after university trustees made the decision to grow enrollment, the university had to think differently about its brand and its public appearance.

As a communication student, I had the opportunity to sit in on a couple of focus group sessions held in our building by the rebranding agency. They surveyed students all over campus as well as alumni. The goal was to collect feedback and input to better understand the mission, vision, goals, and values of the university and its students. It involved a tremendous level of research, interviewing hundreds if not thousands of students, and then codifying that data in a meaningful way. But without that research, how could they have formed a cohesive understanding of the “DNA” of the university? Any rebranding effort would have failed miserably without this critical layer of research.

Once that portion of the branding agency’s efforts was complete, we students heard nothing more until the university’s new logo and positioning statement were publicly announced. I remember distinctly that there was intense backlash against the new logo when it came out. The new logo was basically the university’s initials set in a red Helvetica Neue typeface. (On a side note: This reveal of the new university logo began my obsession with typography, which continues to this day as a communication practitioner and graduate student.) There were a number of students and alumni who looked at the new logo and the new slogan (“Where You’re Going”) and asked: Is that all there is?

The reality for this university was that rebranding had little to do with creating an identity for the students who were already there and more to do with creating an identity for those who had not yet arrived. The new brand was for young people who had no prior attachments to it and who served as opinion leaders in their own high schools and communities. The old brand was already fixed in the minds of those who were on campus, naturally creating a bias against a redesigned one. But that bias didn’t really factor into the strategic planning of a university where almost all of those opposed to the new look would likely be gone from campus in four years or less.

Have you ever thought that some people, or perhaps even most people, struggle with accepting change? Just 3.4 percent of smartphone sales went to Apple’s iPhone following its debut in the summer of 2007. By the fourth quarter of 2015, devices running iOS and Apple’s competitor, Android, together held more than a 98 percent share of the world’s smartphone market. This slowness to adopt has much to do with the ideas put forth in Everett Rogers’ diffusion of innovations theory. Rogers suggested in 1962 that as a new idea is disseminated, society divides itself into five groups that adopt the idea at varying speeds. Innovators, while making up a tiny fraction of the population, are the first to accept the new idea, followed by a larger (but still minority) segment known as early adopters, who also tend to be opinion leaders. The early majority and late majority follow, concluding with the final holdouts known as laggards. Getting every segment of adopters on board takes time, but it also takes a commitment to what I call the “long game” of strategic research, planning and communication. It is not enough for a company or organization to simply decide to create a brand image for itself. The brand must permeate everything the company does, particularly when it is in view of its publics.

A brand is much more than just a logo or a slogan, however. In the minds of consumers, a brand is also encapsulated in the experiences they have with a company or organization. On college campuses nationwide, “experience” has become a buzzword because we now understand that it is an integral part of a brand’s reputation and image in the mind of the consumer. A customer may forget what a logo looks like or what catchy slogan a company employed, but rarely will consumers forget the experience they had and their emotional responses to it. This is one of the reasons a relatively unknown online brand, Zappos, leads almost all online retailers in customer service. Its focus on total customer satisfaction has built a loyal and expansive customer base in which 3 out of every 4 orders are placed by repeat customers. The company’s very first core value is to “deliver ‘wow’ through service.” It accomplishes this through liberal policies like free shipping and free returns for up to a year on purchases. Customer service agents are trained to seek positive outcomes in all customer service situations, and they are given free rein to make any decision necessary to please a customer — a stark contrast to the endless list of companies that tie the hands of their CSRs and leave the customer to hear the dreaded words, “There’s nothing I can do.”

Perhaps we can all learn a lesson from the success of brands like Zappos and Apple. In the end, a brand is created not so much by a trendy logo or a smart slogan as by the collective experiences of a company’s publics. Good branding, then, happens when a company builds a great culture first, one that flows from its employees into its customer experiences, followed by logos and positioning statements that capture the spirit of that culture.