It started with a humorous paean to Saran Wrap and ended on the question of whether super-intelligent computers will destroy humanity—which is to say that this year’s David and Lyn Silfen University Forum covered a fair amount of intellectual ground.
“From Idea to Innovation: The Impactful University”—in which Penn President Amy Gutmann peppered biographer Walter Isaacson with questions arising from his 2014 book The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution—was the centerpiece. But the day-long celebration of innovation at Penn also included talks by several leading faculty researchers, and the rechristening of the University’s 23-acre South Bank site, acquired in 2010, as Pennovation Works, a hub for new business ventures and the future home of the Penn Center for Innovation. (For more information, see this issue’s “From College Hall.”)
Saran Wrap came up by way of an animated video clip in which Mel Brooks and Carl Reiner’s “2,000 Year Old Man” proclaimed it the “greatest thing mankind ever devised,” while allowing that “man’s discovery of space” was “good, too.” More seriously, it was offered as a prime example of the unexpected paths innovation takes. Discovered by accident in an effort to synthesize a new dry-cleaning solvent, and first marketed by Dow as a seawater- and chemical-resistant coating for World War II fighter planes being shipped overseas, Saran Wrap took quite a circuitous route before finding its true calling: preserving the world’s school lunches and leftovers.
Gutmann pointed out the moral: “We can bring the best people together with the best resources, as Dow did in the 1930s. We can confidently expect that innovations will occur. But we can’t predict what the outcome will be.”
This is why it’s so important for universities to act as “stimulators [and] incubators of great research and discoveries,” she added. “We have the minds, we have the resources and the vision to expand innovation at Penn even further.” The Pennovation Works facility will “translate new Penn ideas and research into products, into ventures, into services that will change the world,” Gutmann said. “Innovation will be the key.”
At Penn the idea that “the academy should be connected to enterprise” goes all the way back to Benjamin Franklin, noted Isaacson, whose acclaimed biography of the University’s founder came out in 2003. But later leaders haven’t always seen things that way.
Take ENIAC, for example. Isaacson devotes a big chunk early in his latest book to the Electronic Numerical Integrator And Computer and why it deserves to be considered the first “electronic general-purpose programmable computer.”
Developed during World War II with government funding at Penn’s Moore School by John Mauchly Hon’60 (“one of the ultimate collaborative scientists,” in Isaacson’s view) and J. Presper Eckert EE’41 GEE’43 Hon’64 (“somebody who really knew how to engineer”), along with “six great women PhDs in math” who programmed the machine and a cadre of mechanics “with grease under their fingernails” who built it, ENIAC was publicly unveiled with much fanfare on Valentine’s Day 1946. While things were touch-and-go up until the last hours, it performed flawlessly in its first public test.
At that point, Penn looked to be in on the ground floor of the Information Age. But it didn’t work out that way. Instead, Mauchly and Eckert ended up leaving the University to set up their own computer company, which they later sold. Disputes over intellectual-property ownership and patent rights played a role, but a big factor was that Penn just didn’t know what to do with ENIAC, Isaacson said.
“Penn made a mistake, I think, in thinking of computers as just a practical tool” that didn’t gel with the academy’s theoretical bent, he explained. The same assumptions were being made at places like Harvard and the Institute for Advanced Study in Princeton, where other early computer projects were under way, he added. “It’s a problem back then. They all think this is too practical and commercial.”
Another “bad moment” for Penn: ENIAC’s debut was a spectacle—covered on the front page of The New York Times—capped by a formal dinner on campus. “All your generals come from Washington. And these two women [programmers], Frances Bilas and Jean Jennings, stay up all night because there’s still one last glitch in the program. And they conquer it at about five in the morning. And the program works beautifully with all the lights blinking, huge applause,” Isaacson recounted. “And then everybody goes to Houston Hall for the candlelit black-tie dinner, and the women aren’t invited. And they take the bus back to their apartments on Valentine’s Day, a very cold February night.”
That women were hired to program ENIAC in the first place was largely because it was seen as a low-status role. “The boys with their toys thought making the hardware was the important stuff, so they assigned the women just to do the reprogramming of the cables and the software,” Isaacson explained. “Turns out they were wrong. The hardware becomes almost commoditized. It’s the software that becomes important.”
Gutmann and Isaacson—who both have daughters trained in the sciences—lamented the continuing under-representation of women in STEM fields. Ironically, “more women went into math in the 1930s than a generation later,” Isaacson noted. Gutmann cited a more recent gloomy statistic: “In 1985, 37 percent of undergraduate degrees in computer science were earned by women. Fast-forward to 2010, and that number has been cut in half, to 18 percent.”
On the other hand, Isaacson quoted his daughter, Betsy, to the effect that “‘girls who code get jobs.’ So you can thrive in the economy.” Tech companies like Google are now actively recruiting women, he added. “So if you put out more engineers, they will get jobs.”
But prejudices about women’s aptitude for science—such as those expressed by a certain former Harvard University president, Isaacson suggested—continue to have a negative impact, as does a dearth of well-known role models. “And that’s particularly bad because there are role models, but they’ve been written out of history,” said Isaacson, “which is why I really was so happy to discover and then celebrate them in this book.”
Isaacson starts the book with Ada Lovelace, the daughter of the Romantic poet Lord Byron, whose 1843 notes on Charles Babbage’s Analytical Engine contained the first published description of a general-purpose computer and what is considered the first computer program. But Walter wasn’t the first in his family to discover her; he learned her remarkable story from Betsy, who had written about Lovelace for her college-application essay.
It was Lovelace’s great insight that computers could do much more than, well, compute—that they would be “general-purpose”—but there was one thing she was certain they would never do, Isaacson said, and that was think for themselves. She thus staked out one side of the continuing debate between the “augmentation” and “artificial intelligence” strands of computer history.
To date, Lovelace’s side has had the winning argument. From the earliest concept of the personal computer, culminating “to some extent with Steve Jobs, who makes all of our devices much more personal,” he said, “that strand has so far surpassed expectations,” with computers becoming more personal and more integrated in our lives.
By contrast, the vision of artificial intelligence, “where machines will think without us,” seems to be perpetually on the horizon. “In 1950, I can’t tell you the number of times [it was claimed that] in 20 years we’ll have machines that will be able to replace humans. Every decade it’s 20 years away. It’s a mirage,” Isaacson said.
“Someday [it] may happen. But aiming for it in the Pennovation Center will probably be spinning your wheels. What you should do is aim for that augmentation of human and computer creativity.”
While advancing information technology has been the principal innovation-narrative for the last 50 years, Isaacson added, that is about to change. “In a garage, you can sort of build Apple or Microsoft. The harder innovation is what’s coming next, which is in the biomedical sciences, the wetware labs,” he said. “That is why Penn is so well placed compared to Stanford or [Silicon] Valley. You need to have great medical centers, great medical research, and you need to break down the silos between chemistry, biology, and information technology.”
Unfortunately, given the current political climate, those advances will likely have to happen without the robust partnership among government, universities, and corporations that fueled progress in what Isaacson called the “golden age of basic research in America,” during the 1950s and 1960s, which produced breakthroughs like the space program, the Internet (then called ARPANET), the transistor, and the microchip.
The impact of declining research funding—exacerbated by additional cuts imposed in recent years under the sequester agreement—was thrown into high relief by reports of Ebola-vaccine studies halted or slowed for lack of funding, Gutmann and Isaacson agreed. And while there’s no telling whether a vaccine would otherwise have been discovered by now, said Isaacson, “if you’re not doing the basic research in the genetic engineering of viruses, then you’re not going to have a vaccine five, six, seven years later, or 50 years later.”
When Gutmann opened the floor to questions, the final one returned to artificial intelligence and its darker implications: Isaacson was asked for his reaction to remarks made at MIT by tech entrepreneur and visionary Elon Musk W’99 [“The Next, Next Thing,” Nov|Dec 2008] in which he suggested that artificial intelligence could be “our biggest existential threat” and cautioned against “summoning the demon.”
“I think he’s wrong—but I think he’s smarter than me, so he may be right,” Isaacson replied.
Throughout history, he added, humans have generally been able to keep our technological advances and moral and humane sensibilities in balance. “That doesn’t mean that will always be the case. We may invent machines that do really bad things. But what I will tell you is that everything that’s going to be done in [Pennovation Works], and everything that’s going to be done with artificial intelligence—it ain’t up to the technology. It’s up to us. We control the technology.
“So to the extent that Elon Musk is fearful, he ain’t fearful of technology, he’s fearful of human beings. And that’s what you’ve got to keep your eye on—in the computer-science department and everywhere else.”