Last month I wrote about the Pew/Elon experts survey on the future of the Internet. I included comments on the ubiquitous use of algorithms and the costs that entails. That was one of five questions on the 2016 survey. I answered two others: one on the future of education (#2) and the other on the effects of ever-increasing connectedness (#5).
My views on the future of higher education – especially in the liberal arts – have grown more pessimistic over the last year and a half. They’ve been shaped by the research and interviews I’ve done while working on a book proposal about the uses and misuses of technology in the classroom. The working title, Turned off Tech, reflects the long-ago inciting incident: confiscating student phones and all other digital devices, the better to make the classroom a place to learn again.
Students adjust nicely to the idea that paying attention is a good way to find out how digital technologies work – as opposed to staring into a screen and expecting some miracle of osmosis. These days they’re much more concerned about what happens after they leave class and graduate. Many tell me that their 4-year degree was a painful necessity that, on its own, will bring them nothing. Continue reading →
[Image: Statue of al-Khwārizmī, the 9th-century mathematician whose name gave us “algorithm”]
I’ve written a lot about the Pew Research Center. Pew does a great deal of invaluable survey research on the behaviors and attitudes we develop online (okay, “we” means Americans here). In a departure from the science of probability surveys, Pew teamed up with researchers at Elon University back in 2004 to launch their Imagining the Internet project.
About every two years, the team prepares a set of questions that’s sent to a list of stakeholders and experts around the world. The questions reflect current hot-button items – but ask the participants to imagine how online trends will look a decade from now. The topics have ranged from broad social concerns like privacy and hyperconnectivity to more technology-oriented questions like cloud computing and Big Data.
The 7th version of the survey was fielded this summer; it’s my 4th shot at predicting what life will be like in 2025. (For a look at what the survey tackled in 2014, see my posts starting with one on security, liberty and privacy.) Continue reading →
[This post continues from the previous one, comparing the FCC and CRTC approaches to the principle of universality, and finding the CRTC’s approach to broadband puts this principle at risk.]
For my money, the key lesson we can take from Chairman Wheeler’s FCC lies in its willingness to admit when it has a big problem on its hands. The FCC spends little time reflecting on its successes compared to the time it spends worrying about how to correct market failures and right social injustices. In that spirit, Wheeler’s recent statement on the new Lifeline proceeding gets straight to the main issue: “…nearly 30% of Americans still don’t have broadband at home, and low-income consumers disproportionately lack access.”
Compare that blunt admission to the CRTC’s habit of seeing the world through rose-colored glasses. The rosy glow is not confined to decisions; it’s also been a feature of the CRTC’s research documents. Take last year’s Communications Monitoring Report on telecommunications (pdf uploaded here). In the section on the Internet market sector and broadband availability (p. 171), the reader is hard-pressed to see that anything is amiss in this parallel universe. Continue reading →
In this post, I take another shot at the question of the decade: should Ottawa guarantee Internet access to all Canadians?
This question is now drawing a great deal of attention. In April, the CRTC launched a new proceeding to review “basic telecommunications services.” As I wrote previously:
“The most important single question to be addressed in this proceeding is whether the time has come to start treating a broadband connection to the Internet as an essential service to be provided to all our citizens, just as we have done for decades in the provision of basic telephone service.”
Nevertheless, the two agencies (the CRTC and the FCC) see what is at stake in very different terms. These differences are evident in a comparison of the relevant public notices and agency research documents. My reading indicates our American friends are way ahead of us in the assumptions they’ve made about the public interest, as well as in the tools at their disposal to make a success of this epic broadband venture. Continue reading →
The recent Rebooting conference in Ottawa was a terrific experience. Lots of people with lots of good ideas and the opportunity to debate them at length.
Oversimplifying a little, I would divide the conference participants into two general groups. The first and larger of the two was reform-minded, with many calling for serious changes, especially to the CRTC. The second group, while smaller, was just as eloquent in defending what I’d call the status quo. By that I mean maintaining or expanding subsidies for program production; a bigger role for the CBC; and measures explicitly designed to protect broadcasters, with a view to, among other things, protecting jobs in the broadcast sector. This perspective tended to cast the socio-cultural objectives of the Broadcasting Act in a favorable light.
My six minutes of fame featured a half dozen reasons why there’s an urgent need to reboot the Broadcasting Act, and in particular to redraw the policy goals in section 3 from the ground up.
Why we need reform
1 – The 1991 Act is older than the Web. One simple argument for reform is chronological. The 1991 Act predates the Web by six months: the first publicly available Web page was posted on the Internet in August 1991. Worse still, most of section 3 is based on what became law in 1968 – 47 years ago! The main difference from the 1968 text is that the current version is over three times longer and now refers to “programs” and “programming” 31 times. Continue reading →
“The CRTC is recruiting up to 6,200 Canadians to help measure the Internet services provided by the participating ISPs. Volunteers will receive a device, called a “Whitebox”, that they will connect to their modem or router. The Whitebox will periodically measure broadband performance, testing a number of parameters associated with the broadband Internet connection, including download and upload speeds.”
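To give a rough sense of what such a device automates, here is a minimal sketch in Python of a Whitebox-style measurement loop. It is my own illustration, not the project’s actual client: TEST_FILE_URL is a hypothetical endpoint, and a real probe would run a much wider battery of tests against dedicated measurement servers.

```python
# A Whitebox-style measurement loop -- an illustrative sketch only.
# TEST_FILE_URL is hypothetical; a real probe uses dedicated test servers.
import time
import urllib.request

TEST_FILE_URL = "http://example.com/testfile.bin"  # hypothetical test file
INTERVAL_SECONDS = 3600                            # measure once an hour

def measure_download_mbps(url: str) -> float:
    """Fetch the test file and return observed throughput in Mbit/s."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        total_bytes = len(response.read())
    elapsed = time.monotonic() - start
    return (total_bytes * 8) / (elapsed * 1_000_000)

for _ in range(24):  # one day's worth of hourly samples
    print(f"download: {measure_download_mbps(TEST_FILE_URL):.1f} Mbps")
    time.sleep(INTERVAL_SECONDS)
```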
On this Commission page, the visitor is offered some details, including how to sign up. In a discussion with some other folks today, we agreed that the Commission is going to have to work hard to attract mainstream users who have no technical background. To do so, the project team will need to take a more didactic approach and give up self-congratulatory marketing lingo like “world-class communication system.” Continue reading →
In this post, I follow up on my comments about the first day of the CRTC’s hearing to review its framework for wholesale services in the telecom industry. Since the most significant sector to be affected is Canada’s residential broadband service, I’m summarizing evidence compiled recently by the Open Technology Institute (OTI), which compares broadband in 24 cities in Europe, East Asia and the US, along with Toronto. This evidence is consistent with findings from other international studies.

It shows Toronto lags far behind the broadband leaders in available speeds; in the penetration of fiberoptic platforms; in symmetric connectivity (where uplink bandwidth matches downlink bandwidth); and, most seriously from a social policy perspective, in the high prices Torontonians are forced to pay. I take this evidence as a strong argument in favor of maintaining and extending the regulatory regime that ensures open access to networks for smaller, competitive ISPs – covering not just legacy platforms like DSL, but also emerging fiber platforms. Unless the CRTC includes these next-generation platforms, Canada will fall even further behind in its long slide into slow and expensive broadband connectivity.
“We are now ready to take our place as the most technologically advanced nation on the planet.” –Stephen Harper, Digital Canada 150, April 2014
Fresh evidence from Akamai about Canada’s lousy broadband speeds
Time now for some empirical evidence, featuring Akamai’s recently published State of the Internet report for Q2 of 2014.
Akamai’s Intelligent Platform is a cloud computing technology that operates in some 90 countries around the world. Because of the scale and sophistication of its operations, it collects and analyzes huge amounts of real-time (not advertised) data about broadband speeds and related variables, based on roughly two trillion requests for Web content every day. Akamai includes in its analysis every country from which it receives content requests originating from more than 25,000 unique IP addresses. Currently that’s 139 countries.
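To make that inclusion rule concrete, here is a minimal sketch of how one might apply it to raw logs – my illustration under stated assumptions, not Akamai’s code. The log_entries argument stands in for a hypothetical stream of (country, IP address) pairs drawn from content-request logs.

```python
# Akamai-style country inclusion rule -- an illustrative sketch only.
from collections import defaultdict

def countries_included(log_entries, threshold=25_000):
    """Return the countries seen from more than `threshold` unique IPs."""
    unique_ips = defaultdict(set)  # country -> set of distinct IP addresses
    for country, ip_address in log_entries:
        unique_ips[country].add(ip_address)
    return {c for c, ips in unique_ips.items() if len(ips) > threshold}

# Toy usage: the threshold is lowered to 1 so the tiny sample qualifies.
sample = [("CA", "203.0.113.7"), ("CA", "203.0.113.8"), ("US", "198.51.100.1")]
print(countries_included(sample, threshold=1))  # -> {'CA'}
```

Continue reading →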