The Internet of Us

Michael P. Lynch

So, even if knowledge is more “democratized” now—its production and distribution are more inclusive and available—that means little in conditions of increasing epistemic inequality. If you are too poor and oppressed to access anything online, the digital wonders of the world mean nothing to you. The value of epistemic equality is the value of open and fair access to epistemic resources. But “access” here means more than just the ability to go to school or look things up on the Internet. It also means something more abstract but just as important: having the status of a full participant in the economy of knowledge.

To be a full participant in a monetary economy, you need to be more than just a laborer. Slaves are laborers, but their labor is not shared or exchanged by them; it is stolen from them. To be a true economic participant, you need to be someone who has the resources and willingness to participate in buying and selling. But more than that: you have to be recognized as such by others. Otherwise, you end up just trading with yourself. Likewise with the economy of knowledge. To participate in that economy, you need to be more than just a receptive knower and reasonable believer. You need to be seen or understood as such. Otherwise your epistemic labor will be ignored or exploited. You won’t be counted as a reasonable believer, as someone who can be trusted; you’ll suffer what the philosopher Miranda Fricker labels “epistemic injustice.”[10]

The history of racism in this country and many others is replete with examples of people being excluded from not only the monetary economy but the epistemic economy. In 1854, for example, the California Supreme Court infamously ruled that it was perfectly legal that “no Black or mulatto person, or Indian, shall be allowed to give evidence in favor of, or against a white man.” In writing the opinion, Chief Justice Charles J. Murray pointed to what he thought was a slippery slope:

The same rule which would admit them to testify, would admit them to all the equal rights of citizenship, and we might soon see them at the polls, in the jury box, upon the bench, and in our legislative halls. This is not a speculation . . . but an actual and present danger.[11]

Murray, for all his terrifying racism, sees the very point at issue. To recognize a class of people as possible testifiers in a court of law is a slippery slope—because it grants them the status of a reasonable believer. It treats them as credible participants in the economy, and as such, as persons who have autonomy over their thoughts and actions. That’s one point that Fricker’s work has brought to the fore in recent discussion: epistemic injustice of this sort has crippling effects. Once you are no longer recognized as a possible credible source of information—even about yourself—then the dominating class will excuse itself for ignoring your basic rights.

Epistemic injustice of this sort has been much discussed by writers in postmodern critical theory. But the general drift there has been to abandon the category of “reasonable” or “justified” belief—to see these as inherently dominating categories. What is interesting and important in Fricker’s work is that she doesn’t see it this way. For her, abandoning standards of reasonableness would be giving up on the goal of epistemic equality. In short, what we need is not to abandon reasonableness but instead, in philosopher Lewis Gordon’s words, to “shift the geography of reason.”[12] And that is the question we would be wise to ask with regard to our digital life as well: how is it contributing to, or inhibiting, that shift? A central cause for worry is the increasing fragmentation of reasons themselves. In the context of our present discussion, we might worry that this fragmentation doesn’t just have bad political effects. It has bad epistemic effects. It promotes epistemic inequality and a loss of intellectual autonomy. And that in turn can affect people’s ability to filter out bullshit—simply because their filtering is so one-sided.

Web 2.0 and the Internet of Things can be forces for democratic values. But we must not let our enthusiasm blind us to the existence of epistemic inequality, and the fact that its causes—racism, income inequality—pollute the infosphere just as much as they pollute the minds that make it up.

Walmarting the University

Standard procedure for university exams these days involves prohibiting the use of smartphones. As I was reminding my students of this recently, one of them joked that the university had better come up with policies on wearable tech like smart watches ASAP. We all laughed, but nervously, because he was right. And as another student noted, whatever that policy turns out to be, it will be outmoded by the time it is enacted—not just because universities are slow to adapt to change but because technology is moving so fast. While Google’s initial experiment with Glass may not have been successful, the idea isn’t going away; and one day that too may seem quaint, should something like neuromedia emerge.

That raises a question: if the Internet is available to you at the blink of an eye—and available in a way that seems like memory—then what are we testing for when giving exams? What, in general, is the point of higher education in the age of big data?

These questions come at a time when the idea of the university itself is often said to be in crisis—especially in the United States. In one sense, the American university system continues to flourish. American institutions of higher learning dominate world rankings, making up more than half of the top 100 and a large majority of the top ten. Go to any top research conference in the world and you'll find many of the keynote speakers and top researchers there are from American universities. American institutions continue to lead in the production of scientific research in the best journals, and produce the most Nobel laureates. And students from across the world continue to come to the United States to study. In economic terms, university education continues to be one of America's leading industries.

But at the same time, there is the increasing worry that we are in something of an education bubble, and that the model is no longer sustainable. The cost per university student of an education has risen at almost five times the rate of inflation since 1983. Thus it is not surprising that the amount of debt per student has so dramatically increased; two-thirds or more of students now take out loans.[13] Private institutions routinely charge around $60,000 a year, and an “affordable” public institution, like my own, can cost more than $25,000. The explanations for these depressing facts vary, although it is clear that part of the matter is that state funding, on which both public and, to a lesser extent, private institutions have long depended, has dramatically decreased in the last three decades.[14] Taxpayers, for good or for ill, no longer clearly favor paying for the epistemic equality brought about by public institutions—and public education, at all levels, is obviously a primary victim of this change in mentality. But whatever the explanation, it is hard to avoid the conclusion that something needs to change.

Starting around 2012, many pundits, and more than a few academic administrators, started forecasting that information technology was going to lead this change. In particular, the advent of MOOCs (Massive Open Online Courses) was thought to signal a shift to a different model of education. MOOCs are free (or mostly free) online courses, composed generally of video lectures, various forms of computer-enabled discussion forums and computerized grading. In the wake of several high-profile successes attracting thousands of students, startups and nonprofits promoting and hosting MOOCs, such as Coursera and edX, sprang up almost overnight. Universities began creating their own MOOCs. The anticipation, and the hype, ran high, with the president of edX, Anant Agarwal, declaring that MOOCs would reinvent education and “transform and democratize education on a global scale.”[15]

MOOCs do indeed have much to offer. Many of the courses allow people who would otherwise never have the chance to take a course from a world-renowned expert on a subject to do so, and for free. In many cases, students can even receive college credit if they finish the course successfully. Already millions of people around the globe have taken advantage of this opportunity. As a result, it is hard not to see the MOOC as crashing the gates of the university and helping to promote epistemic equality. It is also simply edifying, as a friend of mine (a superstar teacher who designed and created a MOOC while at the University of Virginia) said to me: few things are more inspiring than finding yourself talking philosophy to 80,000 people worldwide, from all income levels and backgrounds. Who can argue against free philosophy?

Not me. Yet only two years later, it is becoming clearer that, for all their many virtues, MOOCs are not exactly the revolutionary product they have been hyped to be. To see why, let’s go back to Rifkin. According to Rifkin, the old model of higher education maintained that “the teacher was akin to the factory foreman, handing out standardized assignments that required set answers in a given time frame.”[16] The old model was “authoritarian” and “top-down.” It emphasized lectures, was hierarchical in its power structure and privileged memorization over discussion. The new model emerging in the Collaborative Commons is more lateral, egalitarian and interdisciplinary.

Rifkin is certainly right that the old, old, old model of education has many of the features he describes. But the Mad Men era has been gone for some time now, and the shift to more discussion-oriented, problem-solving models of education began as far back as Dewey. And this was the result not of a technological shift but of a pedagogical one. This helps explain why many educators have been skeptical about using MOOCs as a replacement for, as opposed to an addition to, brick-and-mortar classroom teaching. Most MOOCs, after all, are just paradigm examples of the old model in action. They consist of lectures. Their methods of assessment are standardized. They privilege memorization over discussion. While those are not essential features of MOOCs, of course, the technology is only as innovative as we want it to be, and, right now, it seems as if we don’t want it to be that innovative. The fact that MOOCs are more like big lectures is why faculties at Amherst and Duke have rebelled against involving their institutions in MOOCs. Their point was not that there is something inherently wrong with making education free online—far from it. Their point was that the present models of MOOCs are simply extensions of what is already happening at universities worldwide: large classroom lecture-style courses. Pedagogically, many (although not all) MOOCs are not innovative; they are old school.

The other reason educators have been wary of MOOCs is that some see them as hastening what we might call the Walmarting of the university. As I noted above, a hallmark of the global economy is cheaper goods, produced and sold by poorly compensated workers, made possible by amazing models of distribution. This trend has been dominating education as well. According to a leading study of the American professoriate, in 1969 over three-quarters of faculty at American colleges and universities were in reasonably well-paid and stable tenure-track positions.[17] By 2009, that number had almost flipped, with only about one-third of faculty now being tenure-track. In short, most students are now taught by temporary workers who are largely not unionized and paid well below the minimum wage. The worry that many have had about MOOCs is that they will only exacerbate this process, should universities (as some initially proposed to do) replace their own course offerings with MOOCs purchased for their students from other entities.

Whether that will come to pass is hard to say. MOOCs are in their infancy, and their path is hard to predict. But it’s doubtful that MOOCs are the biggest changes that technology will bring to education. Instead, those changes will likely come more directly via the Internet of Things. As I stated at the outset of this section, the big questions concern something more obvious: what do you make of education when people have all the “facts” at hand? If you had neuromedia, you’d be able to access tons of information about history, philosophy, mathematics, art, etc. You’d have dates and names at your disposal, just as you do now on your phone. You’d Google-know all sorts of stuff—that is, you’d have potential receptive knowledge, as I’ve put it. And the more Google-knowledge students have, the greater the “room” in their minds, you might think, for more important stuff.

This isn’t only a modern problem. The use of technology to outsource mental activities is hardly new. At one time, calculators were verboten in math classrooms; not anymore. Similarly, students today routinely access the Internet during instruction, and often do so in an interactive way designed and monitored by the instructor. (I’ve done this in my own courses.) Let’s also remember that libraries have long provided huge riches of knowledge for those who want them. Thus the question, “Why go to college if you have neuromedia?” is not much different from the question (one I took seriously myself as a know-it-all youth), “Why go to college when you have a library?”

You already know the answer to that one. In the ideal world, if not always in reality, we go to college to find pilots who can guide us across the vast seas of knowledge. We need them to tell us what is already charted and what is still left to chart. Such guides shouldn’t merely make us more receptive knowers; they should aim to make us more reflective, reasonable ones and, what’s more, they should help us to understand.
