Need to know
The dangers posed by Big Data are real. So is the defense inherent in liberal arts study
Here are a few numbers to think about. They may alarm you. They probably should alarm you: Thirty-nine percent of Americans believe in evolution. Thirty-six percent of Americans believe in global warming. Fifty-five percent of Americans believe in angels. Sixteen percent of Americans believe they have actually seen an angel. I could use these numbers as a launching point for a jeremiad about our collective failure as educators, and as citizens, to get people thinking scientifically. But that would be too easy.
Instead, I will point out questions that many of you are probably already asking yourselves: Are all those numbers from the same poll? Were all those polls taken in the same year? Who conducted those surveys? Were they part of peer-reviewed studies, or were they wacky Internet polls? I purposely didn’t give the sources. I didn’t give the contexts. I didn’t note that, in fact, every one of those numbers comes from a different poll, with a different sample size and different methodology. In sum, those numbers are not comparable.
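One dimension of that incomparability is precision: none of those figures came with a stated margin of error. A back-of-envelope calculation, using the standard formula for a simple random sample and purely hypothetical sample sizes, shows how differently precise the same headline number can be:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Two hypothetical polls, both reporting "39 percent of Americans":
small = margin_of_error(0.39, 500)    # a modest phone poll
large = margin_of_error(0.39, 2000)   # a larger survey
print(f"n=500:  ±{small * 100:.1f} points")   # ±4.3 points
print(f"n=2000: ±{large * 100:.1f} points")   # ±2.1 points
```

Two "39 percent" findings can thus describe quite different ranges of plausible values, and that is before accounting for question wording, sampling frame, or who commissioned the poll.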
But if I were on a television or radio talk show, I could run through them unchallenged. I could cite them in a peer-reviewed article in my own field (American studies), and no one would question them, I guarantee you. In this country, we only pretend to be crazy about empiricism.
In The Education of Henry Adams (first printed in 1907), which all American studies students read at some point, the historian Adams relates his experience of attending the 1893 Chicago World's Fair. Adams describes entering an exhibition about electricity and confronting the dynamo. The moment he sees this big, buzzing, whirring machine, which is destined to light up the world, he recognizes it's over for him. His mind is trained in the classics. He knows the Old Testament. He knows Greek and Latin. He went to Harvard. But he is totally unprepared for the 20th century, which will be the century of the engineer and the scientist.
It’s hard to think about technology today without being haunted by Henry Adams’s ghost. When I visited the Googleplex—the Google corporation’s headquarters in Mountain View, California—for the first time, in 2006, I expected a Henry Adams moment. I expected to be awed by this powerful, brilliant, dynamic institution. And it didn’t happen. Nor did it happen over subsequent days and later visits. What you see at the Googleplex is a certain eccentricity—people in shorts skateboarding around, occasionally riding unicycles. There’s been much written about this scene, but frankly, as a workplace, it’s not that different from my own (a university, where people wear shorts and occasionally ride unicycles). Indeed, what struck me about Google was how familiar it seemed.
And isn’t this what we’re experiencing with technology today in America? The dynamo that will flood the world with data is somehow opaque to us, its action invisible.
Over a short period of time, we’ve gone from an almost spatial relationship with digital technology to a magical one. It used to be that when you decided to check your e-mail you would go to the room in your house where the computer sat, and you would dial up a distant server on a modem, and you would hear the warbles and bleeps and eventually connect. And then you would download all the waiting e-mails, upload your responses to the server, and out they would go. You did this at most three times a day. The whole process was expressed spatially: I’m going online. I’m logging on. I’m downloading. I’m uploading. I’m entering a chat room, even. This is how we would talk about cyberspace, and it’s how we still talk about it, even though it represented our experience for only a very short time.
Now these same processes are deeply embedded in our lives. They’re instantaneous. Many devices no longer require the mediation of a keyboard. They have been designed to be, as Marshall McLuhan would say, “extension[s] of ourselves” in a way that is just barely metaphorical. Our skin activates the signal, or our voice does. If it ever was true that there was a separate place called the Internet, or a place called cyberspace, that’s no longer the case, and it probably never will be again.
The science fiction writer Arthur C. Clarke famously said, "Any sufficiently advanced technology is indistinguishable from magic." And indeed, when the iPad debuted, its lead designer, Jony Ive, compared operating it to performing magic. But this notion of magic is not healthy. It causes us to lose sight of much that is important. It's our job as cultural critics, as scientists, as technology analysts, and as citizens, to demystify technologies. Once you understand or attempt to understand how a technology works, you develop a genuine respect for the people involved in it. And maybe you get to unpack the errors they made along the way, the failed experiments, the ideas that didn't quite make it into the product (and again respect grows). That's what we lose in these sleek, sealed devices, and in this aura of magic. We lose sight of human collaboration.
If you understand the materiality of the technology, you’re in a better position to raise questions about the products and their companies. You will respect, too, the people who assembled the machines—the ones living in China and working under horrible conditions. We should fully understand the commercial as well as the scientific and technological ramifications of the devices we now take for granted.
The economist and sociologist Thorstein Veblen wrote a book in 1921 that hardly anyone read at the time, and few people read today, The Engineers and the Price System. I don't recommend it, but in its pages he poses an interesting idea. In 1918, during World War I, Veblen served in President Wilson's Food Administration, quitting before the war ended. He had gone to Washington very nearly a Deweyan democrat, with great hopes for the power of democratic politics. By 1920, he was deeply disillusioned.
A notoriously grumpy man, Veblen decided that the idea of a democratic polis being fit to make important decisions about the direction of society (and about such critical questions as resource distribution) was naïve. World War I had shown the pernicious power of demagogues and taught that political beings can’t necessarily be trusted to take a disinterested view, or even a long one. Veblen cast a look around society to figure out who could be trusted, and he came upon the engineer. What we need, he said, are councils of engineers. (The unfortunate phrase he used was a “soviet of engineers.”)
It is a fascinating idea that, as the world grows increasingly complex, we should outsource important decisions to technical experts. How do we learn to trust these engineers? How do we make sure they aren't captured by interests other than the public good? These are political questions beyond Veblen's consideration.
Yet we already outsource a great deal of judgment to the soviet of engineers. Take, for example, the people who work at Google. It’s a fact that there can be no such thing as a neutral algorithm, and, as it happens, Google’s algorithms end up favoring the outrageous statement. The currency of Google is the link, and in our culture the outrageous statement (on any subject) draws attention out of proportion to its truth, attention that is manifest in a high volume of links. The result is that if you do a search on Google for, say, “autism and vaccination,” you will get a lot of misinformation that has caught the attention of people who search in this area. Google doesn’t have a way of filtering for truth. It can only filter for what its developers would call relevance—which is to say, popularity.
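Google's actual ranking machinery is of course vastly more elaborate than this, but the underlying dynamic can be sketched in a few lines. In this toy model (all page names and link data are invented for illustration), relevance is simply inbound-link volume, so whatever attracts the most links, accurate or not, rises to the top:

```python
from collections import Counter

# Hypothetical link graph: (linking_page, linked_page) pairs.
# The outrageous claim attracts the most links, so a purely
# link-counting ranker surfaces it first, regardless of accuracy.
links = [
    ("blog-a", "outrageous-claim"),
    ("blog-b", "outrageous-claim"),
    ("forum-c", "outrageous-claim"),
    ("news-d", "peer-reviewed-study"),
    ("blog-e", "cautious-summary"),
]

inbound = Counter(target for _, target in links)
ranking = [page for page, _ in inbound.most_common()]
print(ranking)  # ['outrageous-claim', 'peer-reviewed-study', 'cautious-summary']
```

Nothing in this procedure consults the truth of any page; the only signal is attention, which is the point.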
The notion that complex empiricism can be above the fray is one that we seem to worship. Indeed, of all that Veblen wrote—about the natural divide between industry and business, and about the leisure class and “conspicuous consumption” (a phrase he coined)—it is his vision of the engineer that seems to have been realized most certainly. You find it in education when standardized test scores serve as the baseline for measuring achievement. You see it in baseball with sabermetrics, the idea that it is possible to derive predictions for a season from distillations of the complex interactions among 18 athletes (plus umpires) engaged in unscripted play. You see it when a television celebrity warns parents not to vaccinate their children because she’s done research on autism at the “University of Google.”
We are in the era of Big Data, where the possibility exists for making sense of vast sets of numbers and for drawing connections that were once unattainable. In my classroom, I tell the English majors, media studies majors, and sociology majors that if they get out of college without taking courses in statistics and computer science they will not be among the people who pull the levers in society. They will be the ones who get played.
We find ourselves in the midst of an intellectual revolution, a scientific revolution of sorts. We will see things we didn’t see before, and we will also face new questions about methods and ethics, about modes of presentation, about law, policy, politics, and commercialization. We in the liberal arts have an opportunity for—in fact a duty of—real collaboration, communication, public outreach, and engaged criticism. We can demystify science. We can define a context for technology by examining its systems historically, politically, economically, and culturally, to glean how its power works in society.
Siva Vaidhyanathan is the Robertson Professor of Media Studies and Law and chair of the Media Studies Department at the University of Virginia. His most recent book is The Googlization of Everything—and Why We Should Worry (2011). This essay is drawn and adapted from a talk he gave at Boston College on October 29 at a symposium sponsored by the University’s Institute for Liberal Arts on the topic “Science in the Liberal Arts University.”