A pig in a poke no more: students take on the ISPs (4)

(2500 words)

Back in the fall semester of 2015, I wrote a series of posts on an experiment I’d been conducting with my students. It was a novel kind of written assignment — what we were calling a “field report,” as opposed to the usual essay or research paper they were used to doing in most of their courses. 

The point of the exercise was to produce a clear, accurate profile of the technical and financial arrangements each student had made with their ISP. As we discovered, it’s frustratingly difficult to get even basic information, like the cost of a data-cap overage or the uplink speed you’re paying for.

The next challenge was that everyone had to take their current information and contact two other ISPs to try to get a better deal. The common finding was that comparison shopping for broadband is an apples-and-oranges nightmare. Despite these hardships, everyone was happy to get a break from trips to the library, the cheeseburger essay format and the inescapable “thesis” statement.

We’ve been doing this assignment in my courses ever since. I’m pleased to say it’s still a popular and efficient way to learn both technical concepts like latency and a few things about the cruel realities of the broadband marketplace. For a little historical perspective, I’ve taken the excerpt below from my post of November 1, 2015, which covers the apples-and-oranges problem in ISP pricing (the 2nd and 3rd posts in the series are from November 14, 2015, and December 19, 2015).

While the main point of the assignment is to identify “comparable” replacement or upgrade plans from two other ISPs, most students had difficulty finding the matching information they needed to make comparisons. The sources of these difficulties fall into three groups:

  • finding matching speed tiers;
  • accounting for the size and cost of caps;
  • wading through the pricing morass.

First, ISPs offer a jumble of ever-changing speed tiers, which adds confusion to price comparisons while serving no purpose other than upselling. Some students on discontinued plans were told there was no way of finding out what the advertised speeds of those plans had been. And even when students did find a competing plan with the same speeds, most found it impossible to match caps as well, especially once they took into account both the size of the cap and the cost of overages.
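To give a sense of what the students were up against, here is a minimal sketch of what a sane comparison would require: reducing each plan to an estimated monthly cost at a given usage level, so that different speeds, caps and overage charges can be lined up on a single axis. Every plan name, price, cap and overage rate in the sketch is invented for illustration; none of it describes a real offer.

    # A rough sketch of the apples-and-oranges problem: reduce each plan to an
    # estimated monthly cost at a given usage level, so plans with different
    # caps and overage charges can be compared on one axis. All plan names,
    # prices, caps and overage rates are made up for illustration.

    from dataclasses import dataclass

    @dataclass
    class Plan:
        name: str
        down_mbps: int          # advertised download speed
        monthly_price: float    # advertised monthly price (post-promo)
        cap_gb: float           # data cap; float("inf") for unlimited
        overage_per_gb: float   # charge per GB over the cap
        overage_ceiling: float  # monthly maximum on overage charges

        def effective_cost(self, usage_gb: float) -> float:
            """Estimated monthly cost at the given usage level."""
            over_gb = max(0.0, usage_gb - self.cap_gb)
            return self.monthly_price + min(over_gb * self.overage_per_gb,
                                            self.overage_ceiling)

    plans = [
        Plan("Incumbent A 50/10", 50, 79.99, 200.0, 2.00, 100.00),
        Plan("Incumbent B 60/10", 60, 74.99, 150.0, 3.50, 80.00),
        Plan("Indie C 50/10",     50, 64.99, float("inf"), 0.00, 0.00),
    ]

    for usage in (100, 250, 400):  # GB per month
        print(f"Estimated cost at {usage} GB/month:")
        for p in sorted(plans, key=lambda p: p.effective_cost(usage)):
            print(f"  {p.name:<20} ${p.effective_cost(usage):7.2f}")

Even this toy version leaves out promotional windows, bundling discounts, installation charges and modem rental, which is exactly the students’ complaint: the inputs you need for the calculation are the hardest things to pry out of an ISP.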

But these annoyances pale beside the absurd way the incumbents set prices. Advertised prices are never the real prices, and the incumbents make it plenty difficult to figure out what the real prices might be. Ironically, most students had fewer issues with add-ons like modem rental than with the incumbents’ relentless attempts to present any and all monthly prices as discounted. The two favorite practices are bundling with other services, and discounting the initial monthly fee for several months before jacking it up, sometimes by as much as double. As one student wrote:

“I’m on a student plan offered by Rogers so I’m paying $55 monthly. I was surprised to discover this discounted rate only lasted for 9 months (which I was not told when I activated this plan), so my bill will be increasing to $90.99 at the end of this month.”
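Taking the quoted numbers at face value (and ignoring taxes and one-time fees), a little back-of-the-envelope arithmetic shows what that “student discount” actually costs over a year:

    # Back-of-the-envelope arithmetic for the promo pricing quoted above:
    # $55/month for the first 9 months, then $90.99/month afterward.
    # Taxes and one-time fees are ignored.

    promo_price, promo_months = 55.00, 9
    regular_price = 90.99

    first_year = promo_price * promo_months + regular_price * (12 - promo_months)
    print(f"First-year total:        ${first_year:.2f}")                      # $767.97
    print(f"Average monthly, year 1: ${first_year / 12:.2f}")                 # ~$64.00
    print(f"Price jump after promo:  {regular_price / promo_price - 1:.0%}")  # ~65%

In other words, the headline $55 rate understates the first-year average by about nine dollars a month, and says nothing at all about year two.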

The sales gimmicks shifted tone for students who spoke with a customer service representative (CSR) about switching. One thing about the CSR mandate: their employers badly want them to upsell existing customers and snag new prospects on the first call. That has two undesirable effects. One, when there’s no reward for answering questions helpfully, questions don’t get answered helpfully. And two, once CSRs get a newbie on the phone, standard procedure is to invoke the day-of deal: a low, low price framed as a once-in-a-lifetime offer that expires as soon as you hang up. It wasn’t the pressure tactics that raised student hackles so much as the extra confusion they pile on when you’re trying to find the best value for money.

And for good measure, here’s a chart summarizing how the 39 participating students rated the three ISPs they had to negotiate with (these ratings were part of the conclusions of each report). As you can see, Bell was by far the worst rated of all (see post #3 for some context, and some remarks on Bell’s use of poutine as a marketing tool on the York campus).

~~~

How the CRTC makes life even more difficult for broadband consumers

Telecom firms, especially incumbent ISPs, are not famous for transparent pricing, gracious customer relations or truth in advertising. That’s why regulators were invented. We live in a highly concentrated, vertically integrated telecom marketplace, and more than most developed countries we need a regulator that protects mainstream consumers from the predations of broadband providers. This social need is all the greater given the critical importance of access to the global public Internet, and the information asymmetry that puts consumers at such a huge disadvantage in their dealings with ISPs.

Too bad this is news to the CRTC.

I’ve been writing for years about two of the CRTC’s most worrisome shortcomings: its failure to take consumer protection seriously, and its bizarre refusal to conduct actual empirical research, unlike its major counterparts such as the FCC and Ofcom (more on the latter in a moment). The two shortcomings are closely related. The regulator needs to do good research on the market, use it to develop insights and advice for those who need them most, then communicate that information in topical, accessible publications.

In sizing up the CRTC’s role in consumer protection, the obvious starting point is its annual publication, the Communications Monitoring Report (CMR). I say obvious because there are no other contenders. The Commission publishes thousands of decisions and regulatory policies every year, but those don’t count: they aren’t written for consumers.

The main report, covering the whole waterfront (TV, telephony, Internet, wireless and so on), weighs in at 270 pages (pdf here), although that’s over 100 pages shorter than the 2017 edition. The Commission bravely tries to position this opus as being of general interest. Section 2 (starting p.41) is titled “An Overview for Canadians,” and it devotes its opening sentence to this pious generality (p.42):

“The CRTC continues to strengthen its efforts to place Canadians at the centre of the communication system, whether as consumers of communications products and services; creators and distributors of content; or citizens who need access to information, products, and services to fully engage in a democratic society.” 

It would be nice if the average Canadian citizen were so concerned about fully engaging in a democratic society that they’d set aside a month to read the CMR and hire a consultant to interpret it for them. But wait. In its firm commitment to marketing communications, the Commission also produced a bite-sized version covering only the broadband material, about one-tenth the length (33 pages). You’d want to pin your hopes on the junior edition (pdf here), except there’s another problem.

Just like their predecessors, neither the short nor the long version of the 2018 CMR can decide whether it’s chalk or cheese. That identity crisis is especially glaring in the broadband-only report. It offers helpful explanations of some of the mysteries of digital life, like which streaming applications use how much data (Broadband applications, starting p.27). But then the reader gets admonished in comments that make you wonder who the authors are trying to help, such as:

“When reading the chart below, it is important to note that customers can avoid triggering bitrate reductions by not exceeding the usage limits specified in their [mobile] plans” (p.31). 

Or maybe the problem is that the usage limits Canadians have to put up with are far too low (helping the Big Three enjoy the highest mobile ARPU, or average revenue per user, in the developed world). Then there’s the editorial problem: no one who has worked their way through this much data needs to be told what happens when you exceed your data limit, or how to avoid it.

This struggle to identify the CMR’s audience keeps popping up. Take the opening passage, which, instead of even a vague gesture toward consumer interests, brags about how well the industry is doing (p.3):

“In 2017, the retail fixed Internet sector (hereafter, Internet sector) was the fastest-growing sector of all telecommunications sectors. Revenues grew by 7.7%…”

Isn’t that great: the five incumbent ISPs are making even more revenue from their cozy oligopoly. Or take the first two footnotes:

1. Examples of incumbent TSPs include Bell, SaskTel and TELUS. They also include small incumbent TSPs such as Sogetel and Execulink.
2. Examples of cable-based carriers include Rogers, Shaw, and Videotron.

Is that what motivated, curious telecom consumers, or citizens for that matter, are wondering about? On the other hand, if this tome is actually intended for the industry, why is it full of dumbed-down explanations that no telecom manager, lawyer or consultant could possibly need?

One good reason for this ambivalent approach to consumer interests is that the Commission never does real consumer research. In particular, it has abdicated responsibility for conducting probability (i.e. “scientific”) survey research. What it does instead is “consultations.” Over the years, the Commission has asked Canadians to chime in with their views on various policy and technical matters, then presented the self-selected results as though they meant something. This approach makes for warm and fuzzy public relations but lousy evidence-based policymaking, as you can sense from the call to arms on its website: Have your say!
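To see why that matters, here is a toy simulation (all numbers invented) of the difference between a self-selected consultation and a probability sample. Assume 70% of subscribers are genuinely satisfied, but dissatisfied subscribers are four times more likely to respond to an open call for comments:

    # Toy simulation of self-selection bias; every number here is invented.
    # True satisfaction is 70%, but dissatisfied subscribers are four times
    # more likely to answer an open "have your say" consultation.

    import random

    random.seed(42)
    N = 100_000
    population = [random.random() < 0.70 for _ in range(N)]  # True = satisfied

    # Open consultation: likelihood of responding depends on satisfaction.
    respond_prob = {True: 0.02, False: 0.08}
    consultation = [s for s in population if random.random() < respond_prob[s]]

    # Probability sample: every subscriber is equally likely to be selected.
    survey = random.sample(population, 2_000)

    print(f"True satisfaction rate:      {sum(population) / N:.1%}")
    print(f"Self-selected consultation:  {sum(consultation) / len(consultation):.1%}")
    print(f"Random sample of 2,000:      {sum(survey) / len(survey):.1%}")

The consultation comes back looking far angrier than the population actually is (somewhere in the neighbourhood of 35 to 40 percent satisfied), while the modest random sample lands within a couple of points of the truth. That, in a nutshell, is the difference between public relations and evidence.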

The second problem I referred to above is another sin of omission. The Commission never criticizes any of its licensees in print (aside from announcements buried in arcane publications like tariff pages). In fact, it can barely bring itself to suggest that anything might be amiss in our telecom industry — let alone call out individual firms for letting us down. Why worry when revenues are up!

This combination of faux research and see-no-evil nonsense continues to have a toxic effect on consumers, researchers and students acting as researchers, to name a few constituencies. I started looking in detail at the Commission’s publication problems back in 2013, in a series of posts titled “CRTC’s annual report is here: the good, the bad, the weird” (see here, here and here). I had six recommendations at the time, leading with Stop wasting money on online consultations. (Actually, my interest in these issues goes back to a post I wrote in August 2010 about the Commission’s badly mismanaged attempt to get Canadians’ “thoughts” on essential telecommunications services and the obligation to serve.)

Apparently they weren’t impressed by my constructive criticisms.

So, you might be thinking, what would a regulator acting in the interests of consumers actually look like? How about Ofcom?

The UK telecom regulator Ofcom also publishes an annual report that surveys the state of the market. Ofcom has just published its 2019 report — a highly readable document of 37 pages, clearly intended for consumers and titled Choosing the best broadband, mobile and landline provider: Comparing Service Quality (pdf here). The introduction sets the scene (p.1):

“As part of our work to ensure fairness for customers, we want to help people make more informed decisions about which provider is best for them…. By shining a light on the performance of the UK’s main mobile, broadband and home phone providers, this report allows people to look beyond the price of a service” (my emphasis).

Ofcom doesn’t use industry data to talk down to consumers. It does use information from third parties, including the providers themselves. But its report is based mainly on a comprehensive survey of consumers, and thus offers a representative picture of opinion among the British public. It also draws on an analysis of a huge set of social media posts about telecom providers, and there’s an interactive version of the report on Ofcom’s website.

This report doesn’t deal in aggregates: it names names and apportions blame for lousy service, as you can see from the graphic above (p.31). Ofcom isn’t merely telling consumers that some percentage of them are satisfied with their broadband service, or pointing them to a place to complain. It’s pointing the regulatory finger at misbehaving licensees, so consumers can go ISP shopping, and switching, armed with real information. And, as Ofcom adds, so the licensees themselves have an incentive to improve.

Broadband measurement: two perspectives

The CRTC is once again recruiting participants for its Broadband Measurement Project, having just launched phase 2 earlier this month. I have several issues with the way the Commission has conducted this research. First, three years after the windup of phase 1, it has still not made all the test data public. Second, the prior pool of participants totalled only 5,000, and as they were self-selected, the results aren’t necessarily representative (the results were reported by the measurement platform vendor SamKnows in this document). Third, and of greater importance to me, the project includes the incumbent ISPs only. As I’m a TekSavvy customer (one of 300,000 now across Canada), I can’t participate.
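For readers curious about what this kind of measurement involves, here is a bare-bones sketch, and only a sketch: it times TCP connection setup to a test host as a crude proxy for latency. Real platforms like SamKnows use dedicated hardware, curated test servers and far more careful methodology; the host name below is just a placeholder.

    # A crude latency probe: time TCP connection setup to a test host and
    # report the median over several attempts. This is only a stand-in for
    # what platforms like SamKnows automate with dedicated hardware and
    # curated test servers; "example.com" is merely a placeholder host.

    import socket
    import statistics
    import time

    def tcp_connect_ms(host: str, port: int = 443, attempts: int = 10) -> float:
        """Median TCP connect time to host:port, in milliseconds."""
        samples = []
        for _ in range(attempts):
            start = time.perf_counter()
            with socket.create_connection((host, port), timeout=5):
                pass  # connection established; close immediately
            samples.append((time.perf_counter() - start) * 1000)
            time.sleep(0.2)  # be polite to the test host
        return statistics.median(samples)

    if __name__ == "__main__":
        print(f"Median connect time: {tcp_connect_ms('example.com'):.1f} ms")

Even a trivial probe like this raises the questions my students run into immediately: which server, at what time of day, over Wi-Fi or wired, on whose network path? Which is why who gets measured, and how, matters so much.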

There’s good news. Just in time for the upcoming ISP reports in our summer course (4520 syllabus over the page here), my friends Reza Rajabiun and Fenwick McKelvey have published a new paper about broadband measurement titled “Complementary realities: Public domain Internet measurements in the development of Canada’s universal access policies” (The Information Society, 25 Mar 2019: pdf here).

The goal of their paper is to evaluate “public domain Internet performance measurements available for assessing the state of connectivity and developing universal access service quality standards in Canada.” As the authors argue, and as my students discover for themselves, evaluating broadband performance is a fraught exercise. Yet it is rapidly becoming a vital determinant of how we experience the public Internet in every aspect of our daily lives. I’m grateful to Reza and Fen for providing us with tools to enrich our ISP assessment project. (Their paper includes comments about the CRTC’s SamKnows project on pages 3, 4 and 12.)

Epilog

As I noted earlier in this post, one of the aims of our ISP project is to equip students with the kinds of consumer insights we might have expected from a more socially committed regulator. I make no bones about the practical side of this exercise. In fact, one of the promised course outcomes is “You can expect to get tough on your ISP.” I’ve seen plenty of evidence of how a deeper knowledge of the nuts and bolts of the ISP business can lead to that very behavior, along with a more confident attitude about how to face down the many challenges of digital life.

And I’ve seen the changes carry over well past the end of a course. Last Saturday evening, at 8:37 pm to be exact, I got an excited text message from a student about a big change in his family’s communications profile:

“Finally convinced my mom to switch to TekSavvy today. Saving her $80/month from those crooks at Rogers! We’re officially a cable-cutter family. Say bye bye to cable television!”

The really exciting thing about this surprise Saturday evening message was that it came from a student whom I hadn’t talked to in ages, and who picked up his pointers in this same summer course… two years ago. His message says a lot about what happens when determined graduates get out there and get to work on closing the provider-subscriber information gap.

D.E.