Security fatigue: problems in password paradise

________


A new survey from the Pew Research Center paints a bleak picture of how Internet users feel about their online security. The report starts with bad news about passwords, the highest-profile tool in the security toolkit: “69% of online adults say they do not worry about how secure their online passwords are.”

How does not worrying look in real life?

Consider the findings from Keeper, a vendor of password management software. It recently tallied its annual list of the world’s favorite passwords. The top 10 list opposite, taken from an analysis of 10 million sample passwords, illustrates pretty well what end-users mean by not worrying. These passwords are so terrible that the estimated crack time for the “safest” choice on the list (#6) is about 9/1000 of a second – for the others, the effective crack time is zero seconds. This preference for easy – and insecure – passwords goes hand in hand with a set of attitudes to online security that’s not easy to fathom.
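The arithmetic behind such crack-time estimates is simple: divide the number of candidate passwords by the attacker's guess rate. Here is a back-of-envelope sketch; the guess rate below is my own illustrative assumption for an offline attack, not a figure from the Keeper analysis.

```python
# Hypothetical back-of-envelope estimate of offline brute-force crack time.
# The assumed guess rate (10 billion guesses/sec, roughly GPU-class hardware)
# is an illustration, not a number from the Pew or Keeper reports.
GUESSES_PER_SECOND = 10_000_000_000

def crack_time_seconds(charset_size: int, length: int,
                       rate: float = GUESSES_PER_SECOND) -> float:
    """Worst-case time to exhaust every password of this charset and length."""
    return charset_size ** length / rate

# A 6-digit numeric password: only 10**6 candidates.
print(crack_time_seconds(10, 6))  # 0.0001 seconds

# Length and character variety help enormously: 12 characters drawn from
# upper- and lowercase letters plus digits (62 symbols).
print(crack_time_seconds(62, 12) / (3600 * 24 * 365))  # on the order of 10,000 years
```

This is only the brute-force worst case; dictionary attacks find entries like those on the top 10 list essentially instantly, which is why the effective crack time for most of them is zero.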

To begin with, Pew notes a tension between lack of trust in institutions and reluctance to take personal action on security:

“[While] they express skepticism about whether the businesses and institutions they interact with can adequately protect their personal information, a substantial share of the public admits that they do not always incorporate cybersecurity best practices into their own digital lives.”

Internet users are right to feel skeptical. Site operators as varied as Target, Ashley Madison and Yahoo! have shown they’re not only lousy at network security, but irresponsible in disclosure and damage control. In December, Yahoo! admitted that hackers had breached its systems and stolen information from one billion accounts – and had done so three years before management got around to discussing the attack publicly.

A second and more counter-intuitive finding concerns what people do in response to suffering from an actual online attack:

“Americans who have personally experienced a major data breach are generally no more likely than average to take additional means to secure their passwords (such as using password management software).”

What explains such quick dismissal of self-interest?

Passwords are part of daily life, yet I think most people find them not just difficult but, well, weird. The better they are, the worse they are, since what makes a password hard to crack also makes it hard to handle. Unlike, say, car locks and safe deposit boxes, passwords work invisibly on assets that are also invisible. Even as we type them, they dissolve into rows of inscrutable little dots. Plus they’re often stored on remote servers, i.e. in the “cloud” – the perfect metaphor for a tool you can’t see or understand.

Perhaps this abstract quality is what prompts people to manage their passwords in another kind of remote cloud: their brains. About two-thirds of onliners (65%) say memorizing their passwords is their most used strategy, while 86% use memorization as at least one approach. The distant second? Writing passwords on a piece of paper, the most used method for only 18% of respondents.

Software developers look at this behavior and think they can put us out of our misery by selling us password management software – 1Password, Dashlane, Keeper, etc – the tools security experts recommend most highly.

The bad news, however, is that almost nobody uses them. A mere 12% of onliners say they use these applications at least sometimes, while those who say they use a password manager most often amount to a tiny minority of 3%. Nor is this avoidance confined to any one group: Pew notes that password software “is used relatively rarely across a wide range of demographic groups.”

There’s a useful lesson here.

People at the selling end of the consumer tech business see code as the solution to everything. If you have trouble remembering your passwords and that makes you unsafe and you’re generally miserable about it all, then you’re gonna love our software. What’s wrong with this logic is not how good the software is or how cheap or how user-friendly. The problem is that it’s software.

This mental fatigue extends far past security. It’s only part of the fallout from how mainstream consumers are taught to behave in the digital world – to expect everything they touch to be effortless, easy and user-friendly, even when it clearly isn’t. Vendors know their customers won’t take lessons, respond to scares or read the manual, so they just pretend there’s nothing to learn in the first place.

Same deal with hardware. As a tech at the Apple Genius Bar once explained to me, customers come in with broken, manhandled $1500 machines they’ve never maintained or even cleaned, and leave with their repaired machines ready for more abuse. Imagine treating a $1500 Weber gas barbecue that way.

The only way mainstream consumers are ever going to make peace with their devices – and their passwords – is by getting to know them better. Mystification is a terrible motivator, as I can attest after a decade teaching 20-somethings how their digital world works.

Getting this particular demographic to put down their phones, their ingrained habits and their fear of exploring technology (yep, you heard that right) is hard work for all. Like most people, students have been persuaded there must be an app for that – one that will allow them to learn how a data packet crosses the Internet without any effort on their part. Or while texting. Well, there isn’t and there won’t be.

I see a wholesale change in our approach to understanding digital technology as one of the most important educational missions of the next decade. I’ll be writing more about this educational challenge in the coming weeks and months.

(The Pew survey on cybersecurity is available here.)

D.E.

An uncertain future for higher ed (Pew/Elon 2016)

Last month I wrote about the Pew/Elon experts survey on the future of the Internet. I included comments on the ubiquitous use of algorithms and the costs that entails. That was one of five questions on the 2016 survey. I answered two others: one on the future of education (#2) and the other on the effects of ever-increasing connectedness (#5).

My views on the future of higher education – especially in the liberal arts – have grown more pessimistic over the last year and a half. They’ve been shaped by the research and interviews I’ve done while working on a book proposal aimed at the uses and misuses of technology in the classroom. The working title, Turned off Tech, reflects the long-ago inciting incident: confiscating student phones and all other digital devices, the better to make the classroom a place to learn again.

Students adjust nicely to the idea that paying attention is a good way to find out how digital technologies work – as opposed to staring into a screen and expecting some miracle of osmosis. These days they’re much more concerned about what happens after they leave class and graduate. Many tell me that their 4-year degree was a painful necessity that will bring nothing by itself.

Why algorithms are bad for you (Pew/Elon 2016)

Statue of al-Khwārizmī, the 9th-century mathematician whose name gave us “algorithm”

~~~

I’ve written a lot about the Pew Research Center. Pew does a great deal of invaluable survey research on the behaviors and attitudes we develop online (okay, “we” means Americans here). In a departure from the science of probability surveys, Pew teamed up with researchers at Elon University back in 2004 to launch their Imagining the Internet project.

About every two years, the team prepares a set of questions that’s sent to a list of stakeholders and experts around the world. The questions reflect current hot-button items – but ask the participants to imagine how online trends will look a decade from now. The topics have ranged from broad social concerns like privacy and hyperconnectivity, to more technology-oriented questions like cloud computing and Big Data.

The 7th version of the survey was fielded this summer; it’s my 4th shot at predicting what life will be like in 2025. (For a look at what the survey tackled in 2014, see my posts starting with one on security, liberty and privacy.)

Broadband as a basic service: be careful what you wish for (4)

[This post continues from the previous one, comparing the FCC and CRTC approaches to the principle of universality, and finding the CRTC’s approach to broadband puts this principle at risk.]

~~~

For my money, the key lesson we can take from Chairman Wheeler’s FCC lies in its willingness to admit when it has a big problem on its hands. The FCC spends little time reflecting on its successes, compared to worrying about how it will correct market failures and right social injustices. In that spirit, Wheeler’s recent statement on the new Lifeline proceeding gets straight to the main issue: “…nearly 30% of Americans still don’t have broadband at home, and low-income consumers disproportionately lack access.”

Compare that blunt admission to the CRTC’s habit of seeing the world through rose-colored glasses. The rosy glow is not confined to decisions; it’s also been a feature of the CRTC’s research documents. Take last year’s Communications Monitoring Report on telecommunications (pdf uploaded here). Turning to the section on the Internet market sector and broadband availability (p.171), the reader is hard-pressed to see that anything is amiss in this parallel universe.

Broadband as a basic service: be careful what you wish for (3)

I’m taking a further shot in this post at the question of the decade: should Ottawa guarantee Internet access to all Canadians?

This question is now drawing a great deal of attention. In April, the CRTC launched a new proceeding to review “basic telecommunications services.” As I wrote previously:

“The most important single question to be addressed in this proceeding is whether the time has come to start treating a broadband connection to the Internet as an essential service to be provided to all our citizens, just as we have done for decades in the provision of basic telephone service.”

As luck would have it, that is exactly the issue the FCC voted to examine on June 18: “FCC Takes Steps to Modernize and Reform Lifeline for Broadband.”

Nevertheless, the two agencies see what is at stake in very different terms. These differences are evident in a comparison of the relevant public notices and agency research documents. My reading indicates our American friends are way ahead of us in the assumptions they’ve made about the public interest, as well as in the tools at their disposal to make a success of this epic broadband venture.

Rebooting basic telecom services: hope for policy reform?

~~~

The recent Rebooting conference in Ottawa was a terrific experience. Lots of people with lots of good ideas and the opportunity to debate them at length.

Oversimplifying a little, I would divide the conference participants into two general groups. The first and larger of the two was reform-minded, with many calling for serious changes, especially to the CRTC. The second group, while smaller, was just as eloquent in defending what I’d call the status quo. By that I mean maintaining or expanding subsidies for program production; a bigger role for the CBC; and measures explicitly designed to protect broadcasters with a view, among other things, to protecting jobs in the broadcast sector. This perspective tended to cast the socio-cultural objectives of the Broadcasting Act in a favorable light.

My six minutes of fame featured a half dozen reasons why there’s an urgent need to reboot the Broadcasting Act, and in particular to redraw the policy goals in section 3 from the ground up.

Why we need reform

1 – The 1991 Act is older than the Web. One simple argument for reform is chronological. The 1991 Act predates the Web by six months: the first publicly available Web page was posted on the Internet in August 1991. Worse still, most of section 3 is based on what became law in 1968 – 47 years ago! The main difference is that the current version is over three times longer and now refers to “programs” and “programming” 31 times.

Broadband speeding up, broadcast TV slowing down?

~~~

This morning brought news that the CRTC has launched a national broadband measurement initiative using the SamKnows platform (“The global leaders in broadband measurement”). The announcement comes hard on the heels of Michael Geist’s Tuesday post entitled Missing the Target: Why Does Canada Still Lack a Coherent Broadband Goal? Ironically, after his well-taken lament, the Commission suddenly seems ready to answer Michael’s question – though not in the way some of us might like.

“The CRTC is recruiting up to 6,200 Canadians to help measure the Internet services provided by the participating ISPs. Volunteers will receive a device, called a “Whitebox”, that they will connect to their modem or router. The Whitebox will periodically measure broadband performance, testing a number of parameters associated with the broadband Internet connection, including download and upload speeds.”
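The core arithmetic behind the numbers a Whitebox reports is straightforward: time a transfer of known size and convert it to megabits per second. Here is a toy sketch of that conversion; a real platform like SamKnows controls for far more (cross-traffic, server location, TCP warm-up), and the sample figures below are my own illustrations, not numbers from the CRTC project.

```python
# Toy sketch of the arithmetic behind a download-speed measurement:
# bytes transferred and elapsed time in, megabits per second out.
# The sample figures are illustrative only.

def throughput_mbps(num_bytes: int, seconds: float) -> float:
    """Convert a timed transfer into megabits per second."""
    return num_bytes * 8 / seconds / 1_000_000

# A hypothetical 25 MB file downloaded in 4 seconds:
print(throughput_mbps(25_000_000, 4.0))  # 50.0 Mbps
```

The same formula applied to an upload gives the uplink figure; comparing the two is how a measurement platform detects the asymmetric connections typical of residential broadband.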

On this Commission page, the visitor is offered some details, including how to sign up. In a discussion with some other folks today, there was agreement that the Commission will have to work hard to attract mainstreamers who have no technical background. To do so, the project team will need to take a more didactic approach, and give up self-congratulatory marketing lingo like a “world-class communication system.”

Now playing at the CRTC: your precarious future on the Internet (2)

~~~

In this post, I follow up on my comments about the first day of the CRTC’s hearing to review its framework for wholesale services in the telecom industry. Since the most significant sector to be affected is Canada’s residential broadband service, I’m summarizing evidence recently compiled by the Open Technology Institute (OTI), comparing broadband in 24 cities in Europe, East Asia and the US, along with Toronto. This evidence is consistent with findings from other international studies. It shows Toronto lags far behind the broadband leaders in available speeds; in the penetration of fiberoptic platforms; in symmetric connectivity (uplink bandwidth matching downlink bandwidth); and, most seriously from a social policy perspective, in the high prices Torontonians are forced to pay. I take this evidence as a strong argument in favor of maintaining and extending the regulatory regime that ensures open access to networks for smaller, competitive ISPs – including not just legacy platforms like DSL, but also emerging fiber platforms. Unless the CRTC includes these next-generation platforms, Canada will fall even further behind in its long slide into slow and expensive broadband connectivity.

~~~

“We are now ready to take our place as the most technologically advanced nation on the planet.” –Stephen Harper, Digital Canada 150, April 2014

Last month the Open Technology Institute released the third in a series of annual studies of broadband speeds and prices in 24 cities in the US, East Asia and Europe, plus Toronto (originally 22 cities). I wrote about OTI’s first report back in November 2012 (CRTC’s 2nd pro-consumer decree: 4 reasons not to celebrate); and I had comments a year later about the second report (Broadband data for Toronto: more bad news and getting worse).