CSUN Professor Examines Transparency, Trust and Protection on the Web
Every time you make a Google search, post on Facebook, play Candy Crush, send a tweet or take a photo on Snapchat, you are leaving a piece of yourself — information about who you are, your habits, your tastes, your family, your friends — behind, whether you want to or not.
It is that unintentional “surrender” of information that has California State University, Northridge marketing professor Kristen L. Walker concerned.
In an article in the most recent edition of the American Marketing Association’s Journal of Public Policy & Marketing, “Surrendering Information Through the Looking Glass: Transparency, Trust, and Protection,” Walker pointed out that consumers can do little more than have faith that online providers will respect their privacy in an age when technology is constantly evolving, information is a commodity, and marketers and others are hungry for any tidbit that can help them build as detailed a profile as possible of those consumers.
“Very few people, when they sign up for social media or buy an app, actually read the policy agreements,” Walker said. “They just click ‘agree,’ and trust that the vendor will protect them and their data. But when you sign up, you are opening a door. To borrow from Lewis Carroll, it’s like going through the looking glass, and who knows what’s on the other side.”
Walker acknowledges that the rapid development of information technology has changed the world. The internet is not only a medium for communication; the online world has become an arena for transactions, analysis and an endless variety of multifaceted interactions.
“Information is a product and a byproduct of many of these innovative exchanges; a product that is gathered, stored, packaged and sold,” she said. “Firms purportedly use ‘big data’ to personalize services and products as a means of improving customer satisfaction and increasing customer lifetime value, sometimes without much forethought. Society’s increasing reliance on technology creates a data-rich environment that is inextricably intertwined with marketing and public policy issues of transparency, trust and consumer protection.
“To put it simply, technology is evolving so fast and it’s becoming so much a part of our lives that we — including many of the creators of the new technology — are making assumptions about our privacy, when we really should be taking a moment to think about the consequences,” Walker said.
She said she doesn’t have the answers to the privacy questions her paper raises, but she hopes the paper will get people, particularly marketers and public policy makers, to think about the ethical and security issues surrounding society’s increasing reliance on technology.
“It’s not really about privacy,” she said. “It’s about what information we share, and what information we surrender because we really don’t have a choice.
“Free exchanges of information come with costs, resulting in socially transmitted data (technology’s STDs),” she continued. “The phenomenon of surrendering to technology challenges consumers’ ability to focus on details and actively protect themselves online. Consumers are not sharing information online, but rather surrendering information — providing information to an infinite number of parties without clear understanding and with few conditions, or protections. This is an ethical problem for individuals and society.”
To illustrate her point, Walker developed the Sharing-Surrender Information Matrix (SSIM). It addresses the roles of “mutual benefits, mutual commitments, trust, and social and information linkages that are necessary to understand in the increasing information and digital age,” she said.
There are four “quadrants” in Walker’s SSIM: conditional sharing of information, unconditional sharing, conditional surrendering of information and unconditional surrendering.
An example of conditional sharing is when a consumer updates her status on a social media website and places restrictions on access to that information via privacy settings on the site that limit who can see her posts. The consumer has the ability to verify the privacy settings and view any and all parties who access the post.
For conditional surrendering of information, Walker cites the example of a high school student who, disregarding warnings about sharing certain information, sends a suggestive photo of herself with her boyfriend. She sends it via a social media application on which the image “disappears” after a few seconds. Although the app informs her when an image has been copied, her boyfriend uses another app to save a screenshot of the image. Despite her faith in her app and her boyfriend, the image is later posted on a website. She was not able to view any of the parties who had the potential to receive her information from the initial app, or whether they shared, stored, bought or sold her information.
Unconditional surrender occurs, for instance, when a person visiting an office building shows his or her driver’s license to enter. The visitor is not informed and has no idea that the building’s security team runs the license number through complex software that provides them with a brief history of personal information, including social media posts. Unaware of the search, the visitor is not able to view any of the parties who may have access to this information.
Walker pointed out that Article 2 of the Code of Conduct of the United States Fighting Force clearly states that military personnel “will never surrender of [their] own free will.” If in command, they will “never surrender the members” of their command “while they still have the means to resist.” And if they become prisoners of war, military personnel will “keep faith with [their] fellow prisoners … will give no information or take part in any action which might be harmful to [their] comrades.”
“As a society, we clearly position surrendering as an undesirable scenario,” Walker said. “Why then are we allowing our citizens to surrender so much information?”