GRASP Lab alumni and KMel Robotics founders Daniel Mellinger and Alex Kushleyev take one of their quadrotors outdoors for a spin.

An invasion of unmanned aerial vehicles—drones to you—is on its way, but these flying robots are here to help, not enslave the human race. (At least, that’s what they say at Penn’s pioneering GRASP Lab, where some of the most sophisticated ones are being created.)

By David Wolman
Photograph by Addison Geary



They are everywhere. In the news headlines, anyway. 

“Here Come the Drones,” proclaims Scientific American. “Pentagon to soon deploy pint-sized but lethal Switchblade drones,” reports the Los Angeles Times. “Here’s Looking at You,” intones The New Yorker.

Not alarmed yet? “Congress Should Ban Armed Drones Before Cops in Texas Deploy One” (TheAtlantic.com). “Drone Flying Over Washington, D.C., Neighborhood Goes Missing” (Slate). 

It remains to be seen whether a diabolical swarm of robots will ever block out the Sun, but a mountain of articles predicting as much may beat them to it.

Inside the Penn lab that is home to some of the most cutting-edge drone innovation research in the world, however, a perennial concern isn’t so much robot-enabled Armageddon—it’s battery power. A well-worn quip among researchers in the General Robotics, Automation, Sensing & Perception (GRASP) Laboratory goes something like this: if a robot army threatens global domination and subjugation of the human race, just close your doors and wait 20 minutes. All the batteries will die and everything will be peachy again.

“The limitations of this technology are just huge,” says Daniel Mellinger GEng’10 Gr’12, a recent doctoral graduate in mechanical engineering and applied mechanics and a member of the GRASP Lab. Mellinger and other engineers aren’t blasé about the military’s use of drones, or the civil-liberties implications of flying machines armed with cameras—or worse. But when those concerns sound more like the script of The Terminator, Mellinger can do little but laugh. 

“It’s really just a joke to us, this fear that these things will take over the world,” he says.

But make no mistake: the limitations of his circuit-board babies belie some startling capabilities. One specialty in the GRASP Lab is a machine called a quadrotor. With a body no bigger than your hand, these robots have four rotors extending out and up from each corner. Choreographed adjustment of these helicopter-like blades is what enables the machine to move through three-dimensional space—and likewise what enables a group of them to move in concert with one another. And as three million and counting YouTube viewers can attest, that can mean literally in concert, as when a squadron of these things descended on a collection of drums, maracas, an adapted guitar, and a keyboard to play the “James Bond Theme.”
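For the technically curious, the choreography boils down to surprisingly simple arithmetic. Here is a minimal sketch, in Python, of the kind of “mixer” rule commonly used in quadrotor control; the function, sign conventions, and numbers are illustrative assumptions, not the GRASP Lab’s actual code. Speed up all four rotors and the craft climbs; speed up one side and it tips; speed up one diagonal pair, which spins opposite to the other, and it twists.

```python
# Illustrative quadrotor "mixer": one thrust command plus three steering
# commands become four individual rotor speeds. Sign conventions follow a
# common textbook arrangement and are assumptions, not GRASP's software.

def mix(thrust, roll, pitch, yaw):
    """Return rotor commands: front-left, front-right, rear-left, rear-right."""
    return [
        thrust + roll + pitch - yaw,  # front-left  (spins clockwise)
        thrust - roll + pitch + yaw,  # front-right (spins counter-clockwise)
        thrust + roll - pitch + yaw,  # rear-left   (spins counter-clockwise)
        thrust - roll - pitch - yaw,  # rear-right  (spins clockwise)
    ]

# Pure hover: all four rotors run at the same speed.
print(mix(thrust=0.5, roll=0.0, pitch=0.0, yaw=0.0))
# Tip to one side: the left rotors speed up while the right rotors slow down.
print(mix(thrust=0.5, roll=0.1, pitch=0.0, yaw=0.0))
```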

Drones have proven to be an essential—and highly controversial—military tool, particularly when it comes to targeting individuals or groups in faraway places (read: Pakistan, Yemen, and God knows where else). But they are also being used by filmmakers to capture the perfectly angled shot, wildlife managers tracking elusive species, and farmers measuring precisely where and how much fertilizer and water to deliver to their crops. 

Meanwhile, as the cost of off-the-shelf components continues to drop, everyday tinkerers, roboticists, and high-school students are falling in love with drones. Why go to the park to throw a ball with your kid when you can fly a custom-built quadrotor?

Yet at the same time that this industry is booming, drone technology ignites all kinds of fears, particularly of the police-state variety and, yes, even nightmares inspired by science fiction. The use of drones in the war on terror, not to mention the horror of civilian casualties resulting from drone strikes, doesn’t exactly improve the public image of this technology. In the public consciousness, drone and evildoer are close cousins.

Vijay Kumar, deputy dean of education and UPS Foundation Professor in the School of Engineering and Applied Sciences, would like to undo that impression. 

While walking to work a few months ago, Kumar, a svelte man who wears bright shirts and sleek rectangular glasses, was surprised to find a group of about a dozen students standing outside Levine Hall, where the GRASP Lab is housed, holding placards. 

“I think the protesters were confusing UAVs”—unmanned aerial vehicles—“with military drones in use overseas,” Kumar says. “What we do in the lab isn’t remotely related to that.”

The demonstration was calm and modest, so much so that the protesters either didn’t want to direct their frustration at individuals, or they simply didn’t know that Kumar is Penn’s high priest of drones. He walked right past them unnoticed.

That would hardly have been the case at a gathering of drone-technology enthusiasts.

“He stands on the heights of Mount Olympus, providing the model we could only aspire to someday,” says Chris Anderson, the editor-in-chief of Wired magazine and co-founder and chairman of 3D Robotics, a drone manufacturing company. (Disclosure: I write for Wired.) Kumar gave a talk about drone technology last spring at the TED conference, a high-profile semi-annual gathering of leaders in technology, entertainment, and design (founded by Richard Saul Wurman Ar’48 GAr’59 [“The Commissioner of Curiosity,” December 1997] and now owned by the Sapling Foundation). The video of Kumar’s TED presentation has gotten more than 1.8 million views so far, and people who have seen it will occasionally greet him out of the blue.

The TED conference presentation by Vijay Kumar, Penn’s “high priest of drones,” has garnered nearly 2 million YouTube views.

Anderson says the advanced mathematics that Kumar and his students are deploying in the lab to enable their drones to navigate the world around us is nothing short of jaw-dropping. “You’re listening to a monkey describing the Mona Lisa!” he says when asked to talk about the technological underpinnings of these machines. But one takeaway, he says, is that Kumar and other Penn engineers have shown “that it’s possible to create cheap, small, and sufficiently smart drones, and that you could deploy multiple ones simultaneously and coordinate them.” 

Protégés of Kumar’s are aggressively recruited by aeronautical engineering firms across the country, while graduates like Mellinger and fellow GRASP alumnus Alex Kushleyev EE’07 GEE’07 have already founded their own robotics firm, KMel Robotics.

Online videos showcasing the achievements of GRASP Lab innovators have garnered something of a cult following among technophiles and critics alike. One YouTube clip, titled “A Swarm of Nano Quadrotors,” features 16 robots flying in formation and entering a mock building window. In other clips, robots fly in figure-8 patterns, ferry blocks across the room to carry out rudimentary construction projects, and hover like hummingbirds before darting through a hula-hoop tossed into the air by a researcher. That maneuver happens so fast that you have to watch it in slow motion—which you will. Oh, and they “can sense the end of battery life, fly to a charging station, and summon others to replace them while they recharge,” says Kumar. 

There goes the idea that battery life is an obstacle to the robot army’s plan to take over the world.


It’s no wonder, really, that people are dazzled by these videos, Kumar’s TED talk, and other recent “performances” in which drones are incorporated into a music-and-lights show on stage. As cool gizmos go, these machines are hard to beat. Your smartphone may open a universe of possible apps to run, but don’t expect the iPhone 6 to fly in formation. For the critics, however, a future full of flying robots doesn’t sound so sunny. All of which raises the question: Where is the line between fear-mongering and legitimate concern?

Before diving further into the drone wars, let’s clear up some terminology. To the experts, UAV (again, unmanned aerial vehicle) is the preferred descriptor, although the term covers a wide range of machines, from those controlled remotely by a pilot wielding a joystick to robots controlled by onboard computers governed by algorithms. Algorithms are mathematical rules: essentially flight plans composed by humans but executed by motors and rotors, which respond to commands computed in fractions of a second from cameras, onboard GPS, and sensors that collect input about surrounding landmarks and obstacles, including fellow drones.
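To picture what such a rule looks like in practice, consider this toy version of the sense-compute-act loop, written in Python. The sensor and motor interfaces are hypothetical stand-ins, and a real flight controller is vastly more elaborate; this is only a sketch of the principle.

```python
import time

# Toy sense-compute-act loop for holding altitude. The `sensors` and
# `motors` objects are hypothetical stand-ins for real hardware drivers.

TARGET_ALTITUDE_M = 2.0  # the human's command: hover at two meters
LOOP_HZ = 100            # decisions made in fractions of a second
GAIN = 0.8               # how aggressively to correct errors
HOVER_THRUST = 0.5       # thrust (fraction of max) that roughly cancels gravity

def altitude_hold(sensors, motors):
    while True:
        altitude = sensors.read_altitude()    # input from onboard sensors
        error = TARGET_ALTITUDE_M - altitude  # how far off target are we?
        thrust = HOVER_THRUST + GAIN * error  # a simple proportional rule
        motors.set_all(thrust)                # command all four rotors at once
        time.sleep(1.0 / LOOP_HZ)             # then sense and decide again
```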

Drone is actually a term that bugs a lot of the people who work on these machines. For people in the Air Force, Kumar explains, the word incorrectly suggests that the machines are taking independent action, when in fact a person is always telling them what to do. Kumar dislikes it for the opposite reason.

“For me, drone is annoying because it’s associated with dumb creatures,” he says—like drones in a beehive, for instance. “We’re trying to build smart machines! If you have a drone, you could hit the wrong target. But if you have a smart UAV, you don’t miss. We like the term flying robots because it’s all about the algorithms behind them.” 

Unfortunately, language has a habit of following its own trajectory, and for the time being, drone seems to be the public’s go-to word to describe all kinds of flying machines.

Remote-controlled aircraft aren’t new. In many ways, a commercial airliner is an unmanned aerial vehicle, to the extent that onboard computers do much of the flying, and, although the notion may be unpalatable to paying customers and politicians, could soon do the taking off and landing, too. But the unmanned aircraft carrying out reconnaissance and military strikes overseas are an order of magnitude more sophisticated than autopilot programs on commercial jets, to say nothing of the quaint model airplanes your uncle used to fly. Some look like tiny helicopters or planes, while others are made to look like flying insects and weigh less than an ounce. 

If you can’t resist personifying these machines, then you might say that they “think” for themselves, although they are impossibly dumb—like an eggplant is impossibly dumb. Onboard sensors and chips help flying robots respond to changing inputs. But they are quite vulnerable to forces that would hardly trouble a two-year-old, such as wind. If the machine’s thrusters don’t perfectly compensate for a sudden gust from the west, the drone will get tossed into a tree, wall, or power line. (Indeed, one of the more amusing—and anxiety-relieving—videos Mellinger has posted to YouTube is a collection of “quadrotor fails,” a sort of anti-highlight reel of midair collisions and spontaneous nosedives to the floor of their safety-net-draped testing space.)

One of the major challenges Kumar and his team are currently grappling with is weather. In the lab, scientists can minimize “noise,” incoming information that might get in the way of the machine’s attempt to accurately execute commands. In the real world? Not so much. And that includes not just wind but rain, dust, crows, power lines, unusually tall maple trees, and spotty GPS signals.

“In the lab, we have indoor GPS, which is better in that it’s more precise and you get much better update rates,” Kumar says. That is key because a machine flying on its own needs a rapid and steady stream of inputs, just as you do when weaving through downtown traffic, though in your case the massive amount of data-processing happens seamlessly in your mind.

“But with regular GPS outside,” he adds, “you only update a couple of times a second.” Even the computers in your car send information back and forth between sensors and controls a hundred times a second. “We want the equivalent to be done on the aircraft,” he says.
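The gap between a two-updates-per-second GPS fix and a hundred-decisions-per-second control loop has to be bridged somehow. One standard trick, sketched below in Python with illustrative numbers and a hypothetical read_gps stand-in, is to predict position between fixes from the last estimated velocity; real systems fuse many sensors with far more sophisticated filters.

```python
LOOP_HZ = 100  # the rate at which control decisions are needed
GPS_HZ = 2     # the rate at which outdoor GPS fixes actually arrive

def read_gps():
    return 0.0  # hypothetical stand-in for a real GPS driver

position = 0.0  # last estimated position along one axis, in meters
velocity = 0.0  # last estimated velocity, in meters per second

for tick in range(LOOP_HZ * 10):         # ten seconds of flight
    if tick % (LOOP_HZ // GPS_HZ) == 0:  # a fresh GPS fix has arrived
        new_position = read_gps()
        velocity = (new_position - position) * GPS_HZ  # speed since last fix
        position = new_position
    else:                                # between fixes: dead-reckon
        position += velocity / LOOP_HZ
    # `position` now feeds the controller at the full 100 Hz rate
```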


Another project currently in the works at GRASP is what Kumar calls a rapid-response team for emergency scenarios. Imagine that the police get a call from a building where terrorists have taken hostages. Within seconds you want to have robots surrounding it and gathering information so that the police are more informed about what to expect when they arrive. In one sense, it’s odd that this kind of tool strikes so many people as wildly futuristic. After all, the police already use robots for bomb disposal, and surveillance cameras are practically everywhere nowadays.

Amid the continuous stream of news about military drones, positive applications get short shrift. Yet the benefits drones could deliver may turn out to be profound.

Take forest fires, for instance. Yes, we can use huge planes to dump water and fire retardant, a strategy that is of questionable utility. But what if the use of aircraft to fight fires were more calculated? Firefighters in France have shown how using drones can help smokejumpers predict where fires are moving and therefore more strategically target their countermeasures. There are other promising applications in areas like monitoring weather conditions, atmospheric and environmental research, and keeping an eye on Coast Guard ships and fishing boats at sea.

Last winter, with funding from the National Science Foundation, Kumar took a couple of students to Sendai, Japan. The plan was to see if they could use drones to assist with data collection at the site of the Fukushima nuclear disaster. Scientists who design and build robots—not just flying machines but also crawling roach-bots, spider-bots, snake-bots, and more—have a peculiar relationship with disasters. Like the rest of us, they don’t really want to see bad things happen. Yet they do happen, and few settings more persuasively highlight the benefits of robot technology than the scene of a catastrophe.

Small robots can enter into tiny, dark spaces that people or rescue dogs might not be able to access, like buildings that have collapsed in an earthquake. When hazardous materials or otherwise dangerous spaces are involved, using machines means you can keep people out of harm’s way. This was the idea in Fukushima. 

Traveling with Kumar were doctoral student Shaojie (Frank) Shen GEng’11 Gr’16, master’s degree student Kartik Mohta GEng’11, and research assistant professor Nathan Michael Gr’08. One application they hoped to look into was flying UAVs in coordination with robots on the ground. GRASP scientists have developed systems in which robots fly into a space such as a damaged building, where, rapidly collecting and compiling data, they can map the surroundings. The results emerge on the researchers’ computer screens, as if materializing out of thin air. You can imagine a group of rescuers, nuclear technicians, or Navy SEALs staring at those screens, assessing whatever damage or new hazards the site presents, and plotting their next moves.

Unfortunately for Kumar and his team, access to the Fukushima site wasn’t granted, due to a combination of red tape and—although Kumar is too diplomatic to put it in these terms—rigidity on the part of Tokyo Electric Power Company, the utility that runs the plant and is dealing with the disaster aftermath. Nevertheless, the researchers were able to conduct useful exercises in a building in Sendai that had collapsed in the earthquake.  

Think back for a moment to that eerie notion of robots that think for themselves. Well, now we’re talking about drones flying into smoke-filled apartments or nuclear-reactor buildings to gather data about temperature, radiation levels, the location of an unconscious child—and transmitting it all to rescuers en route to the scene. Maybe machines making their own decisions aren’t so bad after all.

But there it is again: the autonomous entity trap. 

“It’s a mistake to think they’re taking action independently,” says Kumar. “A person is always telling them what to do.” 

Drones react based on code that follows and implements rules, much as the cruise-control setting on your car is a rule; the computer reacts to changing conditions in order to carry out a command from a person (you): maintain my speed. For better and for worse—and yes, it’s both—drones are controlled by humans, in all our imaginative yet also occasionally violent glory.
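That cruise-control rule fits in a few lines of Python. The sketch below is illustrative only, not any manufacturer’s code: the person supplies the goal, and the computer simply keeps acting to satisfy it.

```python
# A cruise-control rule in miniature: the person commands "maintain my
# speed"; the computer nudges the throttle to obey. Illustrative only.

def cruise_control(current_speed, target_speed, throttle):
    if current_speed < target_speed:
        return min(throttle + 0.01, 1.0)  # a bit more throttle
    if current_speed > target_speed:
        return max(throttle - 0.01, 0.0)  # a bit less throttle
    return throttle                       # already on target
```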

The fact is, says Mellinger, any technology can be used for good or bad. This is the ultimate platitude when it comes to new and at times worrisome innovations, but it also happens to be accurate. Again, look at computers. Or microbiology. Or nuclear fission. Or the printing press. 

“It takes a generation or two to domesticate these new technologies,” says Anderson. “People freaked out about the computer,” he adds, recalling how, in the 1970s, computers were seen as Big Brother’s latest weapon in the quest for total control, and prompted widespread fears about so-called intelligent machines. Today, most people can’t imagine leaving the house without one of those intelligent machines in their pocket. Anderson believes the trajectory will be the same for drones: “Really cool civilian non-scary applications will emerge and reclaim the meaning of the word drone.”

Maybe so, but that will mean addressing some major civil-liberties concerns in the process.

To legal scholars, the issue of government or private-sector eyes in the sky—or in your yard, or even in your kitchen—isn’t new. Ever since the earliest aerial surveillance technology emerged in the 1960s, there has been tension between legitimate search and privacy. And even for some drone applications that the courts determine to be legal, there is also the matter of what Google executive chairman Eric Schmidt, of all people, once called “the creepy line.”

Anita Allen is the Henry R. Silverman Professor of Law and Philosophy at Penn and an expert on privacy and data-protection law [“Reviving Privacy,” Sept|Oct 2012]. Under the law today, “there is a great deal of protection [of privacy] in the home,” she says. But drones raise a host of new questions because of their enhanced capabilities, the open question of which entities and government agencies might be able to use them, and on whom and where they can be used (answer: possibly everyone, probably everywhere). Public safety and personal privacy are often at odds, and drones add a new dimension to that already puzzling balancing act.

“We need to think about drones in the same breath as we think [about] license-plate recognition, facial recognition, or gait recognition—all these ways government can collect information about us,” Allen says. But what is particularly worrisome about drones, she adds, is that “we don’t [do that] yet.” 

The fact that drones, unlike facial-recognition technology, have already been implicated in targeted killing only makes this concern more acute. 

“Clearly the idea of a new technology that can provide detailed knowledge of our comings and goings, or that might even lead to our assassination—that’s scary,” Allen says. “Even the word: drone. Doesn’t it make you think of Darth Vader or something?”

Scientists themselves have a knack for provoking this kind of anxiety, with language like, They can sense the end of battery life, fly to a charging station, and summon others to replace them while they recharge. At other times, the phrasing may be more playful, but the takeaway is still somewhat ominous. “Have you seen the quadrotor-like things jumping out of the box in the Harry Potter movies—for the Quidditch game?” Kumar asks. “It’s like that, but it’s a rapid-response system,” he says about his emergency-response project.

Here again, language betrays the engineers. “Look at the semantic trail from Kumar’s TED talk,” says Anderson. “He says swarming drone and right away, people have this image of locusts. And then you go to intelligence? Where did that come from? There’s nothing that’s self-aware about these things.” 

The truth of the matter, says Anderson, is that going from a swarm to an intelligent swarm is to make the leap from fact to fiction. And anyway, the more important thing Kumar and his GRASP colleagues are doing is paving the way toward inexpensive UAVs with flexible applications. 

“We have an obligation to demonstrate uses and apps that don’t scare people,” Anderson says.  “It’ll take time and we’re in the early days … Our job in our community, and my company, is to put cheap civilian drones in the hands of regular people who’ll find apps in their own lives, whether it’s the best windsurfing camera robot ever, or wildlife monitoring, or what have you.”

Mellinger, whose online videos have been watched by millions of viewers and who has read his fair share of criticism about his research, echoes that sentiment. 

“We’re also not actively pursuing what people are afraid of,” he says. Indeed, right now his company, KMel Robotics, is mostly doing music videos and light shows, like the one at this year’s Saatchi & Saatchi New Directors Showcase in Cannes, where 16 quadrotors equipped with spotlights “danced” in time to electronic music. But even as the scope of those applications expands, Mellinger, like Kumar, is convinced that he is on the right side of innovation history.

“The way I approach it,” he says, “is that the work has so much potential to do good and amazing things that that overrides negative commentary and fears about what we might do.”

Let’s hope he’s right.


David Wolman is a contributing editor at Wired and has written for many other publications. His latest book is The End of Money (Da Capo Press, 2012).


