As the National Science Foundation’s new director of Computer and Information Science and Engineering, Dr. Ruzena Bajcsy–a Penn computer-science professor noted for her work on robotic perception–must persuade Congress to create greater support for basic research in information technology.

By Sonia Ellis | Illustration by Tony Klassen


Dr. Ruzena Bajcsy was probably one of the few television viewers whose mind was not on Monica Lewinsky during President Clinton’s State of the Union Address last January. For her, the dramatic high point of the speech was contained in the following sentence: “I propose a 28 percent increase in long-term computing research.” Bajcsy’s interest derived from the fact that she was then a little more than a month into a new job that will have her guiding federally sponsored computing research into the year 2000. In December 1998, she was named to a two-year appointment as director of Computer and Information Science and Engineering (CISE) at the National Science Foundation.
    Psychologically, the leap from Philadelphia to Washington was wrenching for Bajcsy, a member of Penn’s Department of Computer and Information Science since 1972 and chair for nearly five years; when she was first informed of the appointment in September, her reaction was mixed. “I was flattered at the selection because it’s very prestigious. So I was excited,” she says. On the other hand, “I was worried because it’s a different type of work. All my life I did research. I worked in the lab and made measurements and built algorithms. And this new job is really marketing research–a very different thing.” And then there was the matter of becoming well versed–quickly–in all the broader issues of computer science. “I am a specialist in robotics and machine perception,” she says, “and now I have to learn about the problems in other areas in which I’m not an expert, like software, reliability, security, scalability, the network.”
    Bajcsy is the sixth CISE director, and the first woman to hold the position–the latest in a string of firsts for her. When she received her doctorate in electrical engineering from Slovak Technical University in Czechoslovakia in 1967, she became the first woman in Slovakia to earn a Ph.D. That same year she moved on to Stanford University, studying artificial intelligence (AI) under John McCarthy, a pioneer in the field (in fact, he coined the term). When the Soviets invaded Czechoslovakia, Bajcsy chose not to return and stayed on at Stanford to earn a second doctorate. Along the way, she wrote one of the first computer programs that enabled machines to recognize textured patterns.
    In 1979, seven years after coming to Penn, Bajcsy drew on her experiences at Stanford when she established the General Robotic Active Sensory Perception (GRASP) laboratory, which has since achieved international stature in the robotics community. “I wanted to create an experimental environment in which people can cooperate with different disciplines, and students can learn from each other,” she remembers. “I was very much motivated by my time in the AI lab at Stanford. I thought that was an exciting environment, with so many different kinds of people around. I wanted to copy that.”

What Robots Can Do

   Much of Bajcsy’s own research at GRASP has focused on connecting machine perception with action. To understand what that means for robots, consider the problem of computer vision: How do robots look at things? Sixteen years ago, Bajcsy had the idea that a robot’s visual perception, its knowledge of the environment, would be improved if it could actively adapt to the scene around it. Bajcsy offers this analogy with human vision: “We do not just see; we look. Our pupils adjust to the level of illumination; our eyes bring the world into sharp focus; we move our heads or change our position to get a better view of something; and sometimes we even put on spectacles.” Percepts–those impressions of objects that we collect with our senses–don’t just fall onto our eyes like rain falls onto the ground, Bajcsy points out. The same should hold true for a sensor on a robot.

    How would that work in practice? Think of a camera mounted on a robot. The robot moves its head or its body–there’s the action–to help the camera collect information–that’s the perception. In turn the camera’s visual information is fed to a computer that controls the robot’s motion: a tidy feedback loop. The overriding idea is that actively changing a sensor’s parameters (like the camera’s position or focus) helps the robot adapt to an uncertain environment.
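    To make the loop concrete, here is a minimal sketch of one active-perception behavior: nudging a camera’s focus and keeping only the changes that sharpen the image. The camera interface and the sharpness measure are hypothetical stand-ins for illustration, not the GRASP lab’s code.

```python
# A minimal sketch of the perception-action loop: act (change focus),
# perceive (score the new image), and let perception steer the next
# action. The camera object and sharpness measure are hypothetical.

def sharpness(image):
    """Crude focus score: gradient energy, the sum of squared
    differences between horizontally neighboring pixels."""
    return sum((a - b) ** 2
               for row in image
               for a, b in zip(row, row[1:]))

def active_focus(camera, steps=20, delta=0.1):
    """Hill-climb the camera's focus parameter, keeping each change
    only if it makes the captured image sharper."""
    best = sharpness(camera.capture())
    for _ in range(steps):
        camera.focus += delta
        score = sharpness(camera.capture())
        if score > best:
            best = score           # the action helped; keep it
        else:
            camera.focus -= delta  # undo, then try the other direction
            delta = -delta
    return camera.focus
```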
    That link between perception and action was a new way of looking at computer vision in the early 1980s. In 1983 Bajcsy worked her theory into a new research paradigm that she called “active perception.” The concept has become Bajcsy’s signature in the robotics community and just two years ago was a factor in her election to the National Academy of Engineering.
    In recent years, Bajcsy and her co-workers in the GRASP lab have taken the concept of computer vision a step further with another question: How can you use computer vision to gather information about the objects in a three-dimensional world? The answer to this question is making reality out of “tele-immersion,” a new technology that seems the fabric of fantasy. Using tele-immersion, you could have a meeting with people scattered across the country or across the globe, and each of you would feel as if you were in the same physical space. You’d be in a shared, simulated environment, but the person you “see” across the room would look real, not like an animated image. Real enough to make out the textures and vagaries of hair and skin and clothes. Real enough to touch.
    The principle that will make this technology work is called “stereo-reconstruction.” In each of the remote locations, a set of video cameras–at least two–takes pictures of the people and the objects in the room, capturing movements and measuring distances from the camera. The next step is putting together a three-dimensional reconstruction of the scene, incorporating changes as quickly as they happen. For that the GRASP tele-immersion group is refining a special algorithm that recovers all of the image information from the cameras. Then all the reconstructions can be projected into a virtual world. Or the 3-D reconstructions can be integrated into a real environment for a mind-bending taste of mixed reality. (Visit the lab’s Web site at www.cis.upenn.edu/~grasp/research.htm for a sample.)
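    The geometric core of such a reconstruction can be written in a few lines. With two parallel cameras a known distance apart, a point that appears shifted by some number of pixels between the two images (the disparity) sits at a depth that follows directly from similar triangles. This is textbook stereo triangulation, offered here as a sketch of the principle rather than the GRASP group’s actual algorithm.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Textbook stereo triangulation: two parallel cameras, a known
    baseline apart, see the same point shifted by `disparity_px`
    pixels; its depth is focal * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: an 800-pixel focal length, cameras 0.3 m
# apart, and a 24-pixel disparity put the point 10 m away.
print(depth_from_disparity(800, 0.3, 24))  # 10.0
```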
    Dr. Kostas Daniilidis, Bajcsy’s colleague in this project, says engineers could use tele-immersion in the collaborative design of mechanical parts. He sees other applications as well–for example, in the entertainment industry: A dancer in Chicago and a dancer in New York could train together in a virtual ballroom. But those are in the future. For now, tele-immersion is seen as a test. The GRASP lab is part of the National Tele-Immersion Initiative, an academic consortium with plans to use tele-immersion as the toughest available technical challenge for Internet2, the university-led research and education network meant to develop pre-commercial technologies and enhance the federal government’s Next Generation Internet. One reason tele-immersion will be such a good test of Internet2 is that it demands far more bandwidth (a far higher data-transfer rate) than current Internet technology can support.
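    A back-of-the-envelope calculation shows why. The numbers below are illustrative assumptions, not the initiative’s actual specifications, but even a minimal two-camera site streaming uncompressed video would swamp a typical late-1990s connection.

```python
# Rough bandwidth estimate for one tele-immersion site, assuming two
# uncompressed 640x480 color video streams at 30 frames per second.
# All figures are illustrative assumptions.
width, height = 640, 480    # pixels per frame
bytes_per_pixel = 3         # 24-bit color
fps = 30                    # frames per second
cameras = 2                 # the minimum stereo pair

bits_per_second = width * height * bytes_per_pixel * 8 * fps * cameras
print(f"{bits_per_second / 1e6:.0f} Mbit/s")  # ~442 Mbit/s
```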
    The reconstruction issue comes up again in another project of Bajcsy’s, this one on cooperative systems, which looks at how robots interact with each other and with people. Consider, say, a team of robots marching through an unknown environment. “The question is how much autonomy you want to give a system when the different members are supposed to cooperate,” Bajcsy says. “It’s like in the Boy Scouts: Meet me at place X. How much do you need to communicate to get there? And what if one of your members gets stuck? What do you do then?”
    Bajcsy has developed some mathematical models of this cooperation and autonomy and implemented them in the robots. But smoothing the robots’ paths to point X calls for a very accurate reconstruction of the environment by their sensors. And the familiar duet of action and perception comes into play here too. It turns out that different “perception strategies” will apply in different dynamic situations; in other words, the robot has to figure out the best way to avoid running into a wall.
    Using motion parallax–the apparent change in an object’s position when you view it from two different spots–works pretty well in open spaces. For a robot making a turn, stereo–the use of two cameras that work like a pair of eyes–is a better way to keep from tripping over obstacles. Our own biological systems work in much the same way. The challenge with a robot, says Bajcsy, is designing an automatic switch that, depending on the changing environment, will select the best perception strategy to help the robot reach its goal. “Once we have good perception,” she adds, “the control of the robots is easy.”
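    One way to picture that automatic switch is as a small decision rule mapping the robot’s current situation to a perception strategy. The threshold and the robot interface below are invented for illustration; the lab’s actual selection logic is, of course, richer.

```python
# Toy sketch of an automatic perception-strategy switch. The
# 1-meter threshold and the robot interface are hypothetical.

def choose_strategy(clearance_m, turning):
    """Pick how the robot should estimate depth right now."""
    if turning or clearance_m < 1.0:
        return "stereo"            # paired cameras: reliable up close
    return "motion_parallax"       # apparent shift while moving:
                                   # works well in open spaces

def perceive(robot):
    """Run whichever strategy fits the robot's situation."""
    if choose_strategy(robot.clearance(), robot.is_turning()) == "stereo":
        return robot.depth_from_stereo()
    return robot.depth_from_parallax()
```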
    Here’s one example of what a cooperative system has actually done in the GRASP lab: two mobile robots work together to carry a large object–a box or a pipe, say–and move across a room. As they move, first one robot leads the way; then they change their configuration and move side by side; then one robot takes the lead again. And all the while they hold the box together and traverse obstacles in their path.
    Understanding active cooperation like this is important, says Bajcsy, because it will help us understand what makes intelligent behavior. When computer scientists talk about making intelligent machines, they use the term “artificial intelligence.” Bajcsy views artificial intelligence as a discipline that tries to understand what intelligence really is and includes “a whole gamut of activities,” like perception, action, reasoning and learning. AI does not mean copying humans, she believes: “Machines can do some activities better than humans. They can multiply faster, do mathematical operations, remember more. But humans are more flexible, and that’s the puzzle. How do we understand this flexibility?”
   Bajcsy’s mentor at Stanford, John McCarthy, agrees that you can’t point to one thing that will answer the question of whether or not a machine is intelligent. Though we define intelligence by relating it to human capabilities, “You have to ask which aspects of intelligence have been understood well enough to put into computer programs,” he says. “AI researchers are free to use methods that are not observed in people or that involve much more computing than people can do.”
    Most of us would probably agree that a machine can be called intelligent at some level. But there’s another element of human intelligence called consciousness. Can machines have that? That’s a murky area, says Bajcsy, and one that the philosopher Daniel Dennett, author of Consciousness Explained, among other books, would call a frontier of science.
    AI was built on the premise that we can model and implement whatever is rational behavior, she says. Now researchers are making progress toward understanding the emotional aspect. And that’s where consciousness comes in: “We all know that your emotional state can influence your rational behavior. But how do you put it into mathematical or formal terms? That’s really the question. When you cry or smile, we can measure your heartbeat, your perspiration, your temperature. All the lie detectors, for example, are based on this. So with emotions we have all these indirect observables. But are they the right observables? That is really the issue–that we cannot crawl inside you and measure what is going on.”
    McCarthy views consciousness from a different angle. Conscious machines, he says, will need the power of self-observation: “Computer programs will have to be able to look at themselves and evaluate their own intentions. They’ll have to decide whether their line of thinking is successful or unsuccessful, the way people do.”

What Congress Should Do

    However you view it, the science of robotics seems to demand an understanding of human behavior, and that’s a resource that Bajcsy will also need on Capitol Hill. John Hennessy, dean of engineering at Stanford University and chair of the search committee that selected Bajcsy, knew that the job would draw on all her “people” skills. “We were looking for someone who would work well with the other assistant directors at NSF, in engineering as well as science. And I think that’s a thing Ruzena can do well,” Hennessy says. “We were looking, also, for someone with her stature and recognition among the computer-science and engineering-research community, so that people would have faith in her doing a good job.”
    Bajcsy handles a budget of $330 million this year, which she hopes to increase substantially over her two-year term. For that, she seems to be in the right place at the right time. NSF will be the lead agency in a Clinton Administration initiative called Information Technology for the 21st Century (IT2). In February, NSF requested a record $4 billion for fiscal year 2000, with plans to increase Bajcsy’s CISE budget by $146 million.
    The impetus for this new surge of commitment to IT comes largely from an August 1998 report from the President’s Information Technology Advisory Committee (PITAC), which was set up two years ago to help the administration identify which technologies would keep the United States a frontrunner in IT. Its interim report to the president warned that the government’s research agenda was, in essence, myopic and dangerously under-funded. PITAC concluded that federal investments in IT research and development must increase, and that the focus must shift away from short-term projects toward long-term fundamental research. The difference, says Hennessy, is like “thinking about how we’re going to run the Internet when there are a billion people on it, not thinking about what the next service will be in the next few months.” That’s an apt analogy; it was federal investment in basic research during the Eisenhower and Kennedy administrations that eventually brought the Internet to life. Lately, according to PITAC, only about 5 percent of the government’s IT budget has been spent on projects that extend beyond five years.
    The committee recommended an additional $1 billion over the next five years to fuel long-term IT research. That sort of investment is not likely to come from industry, which is directing its support toward the development of commercial products with short-term profits. Yet information technology has emerged as a potent player in the nation’s economy; NSF cites estimates (from IT analysts) that the IT industry has spurred about a third of all U.S. economic growth over the past 10 years.
    Bajcsy says her first challenge as the new CISE director will be “convincing the Congress to give us the money that the White House is proposing. We have to go to the Hill and convince them that it’s a good idea to spend money this way. And I’ll be very busy with that until October.” If all goes well, “the next challenge is how to energize the scientific community to do the best work they can. In other words, how to spend the money once we get it–if we get it–so that there is indeed some significant progress made,” she says.
    In deciding how to distribute the funds available, Bajcsy can turn to the PITAC for some guidelines. The committee has named four recommended research priorities, which Bajcsy translates smoothly–she’s had practice–for the less computer-literate:
    Software: “Software is what runs on every computer and what makes the computer do things. We need to improve our understanding of software, especially large-scale systems, and look at issues like software design, production and reliability.”
    Scalable information infrastructures: “Scalability comes up again and again in computer systems, with respect to the speed of the machine, the size of the data and the programs, and the different sizes of machines and how they can work together.”
    High-end computing: “This means research in advanced supercomputing. It’s really for these big problems like simulations of A-bombs, global changes in the environment and weather forecasting.”
    Socio-economic and workforce issues: “This category is looking at the effect of computer technology on people. How will that affect their daily lives?”
    As the person holding the purse strings, Bajcsy knows that she’ll be accountable for the direction CISE takes in choosing projects for funding. “I’ll be charting where this field should go. Money can influence these things, so I feel very responsible,” she says.
    With her own history of looking at research problems in a new way, Bajcsy thinks that the NSF peer-review system, in which established researchers in an applicant’s field review and grade the research proposal, tends to be “a little conservative. Fundamentally it’s a good process, because it gives you checks and balances. But there are some bad effects. If you have a very new idea, then people get skeptical–an attitude, unfortunately, of ‘it wasn’t invented here.’ But I’m trying to influence that now, especially with the new IT initiative. I will try to push more of the risky and imaginative projects.”
    If Congress okays the budget for the IT initiative, the additional funding will mean that CISE can award more grants. Right now roughly 20 percent to 30 percent of research proposals are approved, and Bajcsy thinks that’s too low. She’d also like to see more of that money channeled toward supporting young researchers.
    Some 20 years ago, the same concern inspired her to start the GRASP lab. “Part of my motivation in creating this lab was to make an environment where young people can really flourish. I take a tremendous pleasure out of that,” she says. “I like to see these young people have a place where they can really use their imaginations and test them against reality.”
    Use your imagination and then deal with reality. That could be a good recipe for Bajcsy’s next two years in Washington, too.


Sonia Ellis EAS’86 is a freelance writer on science and technology. She last wrote for the Gazette in June 1998 on the El Niño weather phenomenon.
