Image Analogies

Aaron Hertzmann New York University / Microsoft Research
Charles E. Jacobs Microsoft Research
Nuria Oliver Microsoft Research
Brian Curless University of Washington
David H. Salesin University of Washington / Microsoft Research

Abstract

This paper describes a new framework for processing images by example, called "image analogies." The framework involves two stages: a design phase, in which a pair of images, with one image purported to be a "filtered" version of the other, is presented as "training data"; and an application phase, in which the learned filter is applied to some new target image in order to create an "analogous" filtered result. Image analogies are based on a simple multi-scale autoregression, inspired primarily by recent results in texture synthesis. By choosing different types of source image pairs as input, the framework supports a wide variety of "image filter" effects, including traditional image filters, such as blurring or embossing; super-resolution, in which a higher-resolution image is inferred from a low-resolution source; improved texture synthesis, in which some textures are synthesized with better coherence than previous approaches; texture transfer, in which images are "texturized" with some arbitrary source texture; artistic filters, in which various drawing and painting styles are synthesized based on scanned real-world examples; and texture-by-numbers, in which realistic scenes, composed of a variety of textures, are created using a simple painting interface.
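
The sketch below is a minimal, single-scale illustration of the matching loop the abstract describes: for each pixel of the new target image B, it finds the source pixel of A whose neighborhood (together with the already-synthesized part of the filtered images) matches best, and copies the corresponding filtered pixel from A'. It is not the paper's full algorithm; the multi-scale pyramid, approximate nearest-neighbor search, and coherence term are omitted, grayscale float arrays are assumed, and the function name analogy_single_scale is purely illustrative.

# Minimal single-scale sketch of the image-analogies matching loop
# (brute-force nearest neighbor; grayscale images assumed).
import numpy as np

def analogy_single_scale(A, Ap, B, r=2):
    """Synthesize Bp so that A : Ap :: B : Bp, one pixel at a time.

    A, Ap, B : 2-D float arrays (Ap is the 'filtered' version of A).
    r        : neighborhood radius (full window for A/B, causal for Ap/Bp).
    """
    Bp = np.zeros_like(B)

    def feature(img, img_p, y, x):
        # Full (2r+1) x (2r+1) window from the unfiltered image, plus the
        # causal (already-synthesized) half-window from the filtered one.
        full = img[y - r:y + r + 1, x - r:x + r + 1].ravel()
        causal = np.concatenate([
            img_p[y - r:y, x - r:x + r + 1].ravel(),   # rows above
            img_p[y, x - r:x].ravel(),                 # pixels to the left
        ])
        return np.concatenate([full, causal])

    # Precompute features for every valid source pixel p in (A, Ap).
    ys, xs = np.mgrid[r:A.shape[0] - r, r:A.shape[1] - r]
    coords = list(zip(ys.ravel(), xs.ravel()))
    src = np.array([feature(A, Ap, y, x) for (y, x) in coords])

    # Scanline synthesis: copy the filtered pixel of the best-matching p.
    for y in range(r, B.shape[0] - r):
        for x in range(r, B.shape[1] - r):
            f = feature(B, Bp, y, x)
            p = int(np.argmin(((src - f) ** 2).sum(axis=1)))
            Bp[y, x] = Ap[coords[p]]
    return Bp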

SIGGRAPH 2001 paper:
PDF, Full resolution (35 MB)
PDF, 300 dpi (8 MB)
PDF, 72 dpi (1.25 MB)

Project Page: Image analogies (many more results!)

Project Page: Non-photorealistic rendering


Copyright © 2001 Aaron Hertzmann, Charles E. Jacobs, Nuria Oliver, Brian Curless, David H. Salesin