Great success for two AI researchers from the Cyber Valley ecosystem: Michael Niemeyer, PhD student at the Max Planck Institute for Intelligent Systems, and Prof. Dr. Andreas Geiger from the University of Tübingen were honored with the ‘Best Paper Award’ at this year’s Conference on Computer Vision and Pattern Recognition (CVPR) for their paper ‘GIRAFFE: Representing Scenes as Compositional Generative Neural Feature Fields’.
The paper by the two Tübingen scientists is basic research in the field of computer vision. In it, Niemeyer and Geiger describe a method they developed that, for the first time, enables computers to independently identify distinct three-dimensional objects in two-dimensional images (e.g., photographs).
“Our goal is for computers to learn for themselves how images – we’re talking about scenes here – are constructed,” says Niemeyer, lead author of the paper and a scholar at the International Max Planck Research School for Intelligent Systems (IMPRS-IS), the Cyber Valley graduate school. “So, the computer should recognize which three-dimensional objects are present in a photo, that is, how a scene is structured.” While most current approaches rely on two-dimensional datasets and do not capture the depicted scenes in three dimensions, Niemeyer and Geiger’s new model, GIRAFFE, recognizes the depicted objects in three dimensions, which makes it possible to render the same scene from many different perspectives.
In concrete terms, this means that a simple photo showing a car parked in front of a house, for example, can serve as a basis for viewing the car not only from the front, but also from other angles in front of the house. In addition, the position of the car in the image can be shifted as desired, thereby obscuring or revealing different parts of the house in each case. “This is what we mean by ‘compositional’ – our computer model captures the structure of the scene being depicted. It recognizes the car in the foreground as one object and the house in the background as another,” Niemeyer says.
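The compositional idea described above can be illustrated with a toy sketch. This is not the authors’ actual method (GIRAFFE learns generative neural feature fields and renders them); it is only a minimal, hypothetical illustration of the principle that a scene is a combination of independently posed 3D objects, so one object (the car) can be moved without affecting another (the house):

```python
import numpy as np

# Toy illustration of a compositional 3D scene (NOT the GIRAFFE model):
# each "object" is a function mapping a 3D point to a density value,
# and the scene combines independently placed objects.

def sphere(center, radius):
    """Density field of a sphere: 1.0 inside, 0.0 outside."""
    def field(p):
        return float(np.linalg.norm(p - center) <= radius)
    return field

def box(center, half_size):
    """Density field of an axis-aligned box: 1.0 inside, 0.0 outside."""
    def field(p):
        return float(np.all(np.abs(p - center) <= half_size))
    return field

def compose(*objects):
    """Scene density = max over per-object densities (union of objects)."""
    def scene(p):
        return max(obj(p) for obj in objects)
    return scene

# A "car" (sphere) in front of a "house" (box); coordinates are made up.
car = sphere(center=np.array([0.0, 0.0, 1.0]), radius=0.5)
house = box(center=np.array([0.0, 0.0, 3.0]),
            half_size=np.array([2.0, 2.0, 1.0]))
scene = compose(car, house)

print(scene(np.array([0.0, 0.0, 1.2])))        # point inside the car -> 1.0

# Because each object is its own field, shifting the car leaves the
# house untouched -- the "compositional" property in miniature:
moved_car = sphere(center=np.array([1.5, 0.0, 1.0]), radius=0.5)
moved_scene = compose(moved_car, house)
print(moved_scene(np.array([0.0, 0.0, 1.2])))  # car has moved away -> 0.0
```

In the actual paper, the hand-written shapes above are replaced by learned neural feature fields, and a rendering step turns the composed scene into an image, which is what allows the viewpoint and object poses to be changed after training.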
GIRAFFE also offers more control over image generation. For example, the shape of a depicted object can be changed at will. In the future, this could enable much more realistic virtual environments, among other things. “Our paper is foundational work,” Niemeyer emphasizes. “Other researchers could build on it to construct safer AI applications that make better decisions thanks to a better understanding of their environment.” One conceivable field of application would be autonomous driving, where the computer systems of cars could in the future better detect which objects are present in their environment – and thus draw more precise conclusions about the properties and behaviors of those objects. In addition, Niemeyer and Geiger’s model enables improved artificial generation of data because it requires only two-dimensional photos as a starting point. Accordingly, AI systems could in the future be trained in artificially generated environments.
CVPR is one of the most prestigious international research conferences and is considered the most important conference in the field of computer vision. Only about a quarter of the approximately 7,000 papers submitted are accepted to the conference. With a total of 27 accepted papers and four best paper nominations, Cyber Valley researchers are particularly well represented among the world leaders in basic AI research at CVPR this year.
“My research group has already published several papers on learning appropriate 3D representations with neural networks, which have received widespread attention in the computer vision community,” Geiger says. “The award-winning paper builds on that work.” Still, Geiger and Niemeyer did not expect this particular award at CVPR 2021. Niemeyer: “The fact that our paper was accepted to the conference was an achievement in itself. We are really very surprised that our paper has now been recognized as the best among all the submissions. We are all the more pleased about this outstanding recognition!”
More information