#CampusTech – Privacy and Security in the Age of Algorithmic Spies

Session Description: Security experts often claim that people are the most unreliable part of their systems and that privacy is dead. Jennifer Golbeck, an expert in cybersecurity and human-centered technology design, asserts that the problem is not people behaving insecurely, but security systems that are designed with no concern for their users. She’ll discuss the risks organizations face when humans are not the center of their security plans, present simple changes that can make systems more secure and easier to use, and describe how we can educate people about simple steps they can take to regain control over their digital lives.

Speaker: Jennifer Golbeck, Director of Social Intelligence Lab, University of Maryland


Better usability leads to better security!

  • 70% of security breaches result from weak passwords – but can we talk about the reason for this? All these rules, while cognitive science tells us that about seven items is the limit of what we reliably remember. How many passwords do you have?
  • Should we use a password manager? Why do password managers exist at all? Because we created a system that requires you to use one.
  • How can we make passwords easy to use? If we built microwaves the way we build our password management processes, we would never use them.
  • If you make the system easy to use, it will be more secure.
  • Password changes every 6 months? Research shows that forced changes make passwords less secure.

Evidence shows that requiring password changes creates less security!
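The talk doesn't prescribe an implementation, but the usable-security point above can be sketched as a diceware-style passphrase generator: a few random common words are both easier to remember than rule-laden strings and high in entropy. The word list and function name here are illustrative, not from the talk:

```python
import secrets

# A tiny illustrative word list; a real deployment would use a large
# list such as the EFF diceware list (~7,776 words).
WORDS = ["correct", "horse", "battery", "staple", "orbit", "maple",
         "copper", "violet", "summit", "lantern", "gravel", "petal"]

def passphrase(n_words=4, sep="-"):
    """Generate a random, memorable passphrase.

    Four words drawn from a real 7,776-word list give ~51.7 bits of
    entropy, yet are far easier to remember than strings built from
    arbitrary complexity rules.
    """
    return sep.join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())
```

Using `secrets` rather than `random` matters here: it draws from the OS's cryptographically secure source, which is the appropriate choice for anything security-related.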

  • Enter user experience design – can we make security easier for the user?
  • It’s important to understand who your users are. Security is not a user’s task; it gets in the way of the task they are trying to do.
  • We need to get away from the “because it’s easy, it’s not secure” mentality.
  • If security is easy, people will actually use it.

People are Social

  • Once trust is earned, that can be a security flaw.
  • It’s in our nature to trust and to be nice.
  • It’s hard to say no.
  • We want to help.
  • People can do insecure things because of our nature, especially when we feel guilty.
  • If we understand that people are likely to do that, then let’s look at how we can prevent it.
  • E.g. How easy is it to create a guest login for visitors to our campuses? If it’s easy, then people will use it.
  • Once you give your password out, they have permanent access to your device – enter thumbprint and other biometric access tools.
  • Goal is to design technology around what we know about people and our systems.

Fear as a Motivator

  • Fear can motivate people – for good and for bad.
  • Take This Lollipop – a 2011 interactive horror short film and Facebook app, written and directed by Jason Zada, which uses Facebook Connect to bring viewers themselves into the film through pictures and messages from their own Facebook profiles. This fear-based app can motivate you to care about security and privacy.
  • People are revealing so much information – more than they know.

New and Coming for Security and Privacy

  • Big data – take this data and find out what people have shared. Facebook likes are public. On the one hand it’s a narrow slice, on the other it’s public. 
  • Algorithms can predict intelligence: e.g., liking pages for science, thunderstorms, The Colbert Report, and curly fries correlates with high intelligence. Likes are social, so they can be modeled and patterns found by computer algorithms. People are friends with people like them – sociologists call this “homophily”.
  • Target predicts with big data – purchasing trends: “How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did”.
  • Algorithms give us correlation and we can find deeply unique information about people.
  • Predicting future behavior – e.g., predicting from data whether a recovering alcoholic will stay sober.
  • Predictions are statistical, not logical, meaning you can’t look at a single data point and decide whether that data should be public or not.
  • We can change a lot of systems, but we can’t change people. We are social creatures, and we don’t want to get rid of the social aspects of our lives. Instead of forcing people to interrupt their work to do the security thing, we should be putting people at the heart of our systems, asking how we build systems around the people and what they do. This requires a shift in what we have developed, away from the adversarial relationship that currently exists between “regular” users and security people.
  • On security, we need to embrace the humans instead of burdening them!
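The prediction-from-likes idea above is purely correlational, and that can be shown with a tiny sketch: score each page by how often its fans have a trait in the training data, then average a new user's scores. Everything here – the pages, labels, and function names – is illustrative, not the method used in the research the talk cites:

```python
from collections import Counter

# Toy training data: the pages each (anonymous) user likes, and a
# binary trait we want to predict. Pages echo the talk's example that
# seemingly trivial likes correlate with traits like intelligence.
train = [
    ({"science", "thunderstorms", "curly fries"}, 1),
    ({"science", "colbert report"}, 1),
    ({"curly fries", "colbert report"}, 1),
    ({"monster trucks", "energy drinks"}, 0),
    ({"energy drinks", "thunderstorms"}, 0),
]

def like_scores(train):
    """For each page, estimate P(trait | user likes page) from counts."""
    pos, total = Counter(), Counter()
    for likes, label in train:
        for page in likes:
            total[page] += 1
            pos[page] += label
    return {page: pos[page] / total[page] for page in total}

def predict(likes, scores, threshold=0.5):
    """Average the per-page scores of a user's likes.

    No single like decides the outcome – the prediction is a purely
    statistical aggregate, which is why hiding one data point doesn't
    protect you.
    """
    known = [scores[p] for p in likes if p in scores]
    return sum(known) / len(known) > threshold if known else None

scores = like_scores(train)
print(predict({"science", "curly fries"}, scores))
```

The design choice worth noticing is that the model never needs the sensitive trait to be shared directly; it recovers it from the public, "narrow slice" of likes – exactly the privacy risk the talk describes.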
