Publication


Featured research published by Kristin J. Dana.


ACM Transactions on Graphics | 1999

Reflectance and texture of real-world surfaces

Kristin J. Dana; Bram van Ginneken; Shree K. Nayar; Jan J. Koenderink

In this work, we investigate the visual appearance of real-world surfaces and the dependence of appearance on the geometry of imaging conditions. We discuss a new texture representation called the BTF (bidirectional texture function) which captures the variation in texture with illumination and viewing direction. We present a BTF database with image textures from over 60 different samples, each observed with over 200 different combinations of viewing and illumination directions. We describe the methods involved in collecting the database as well as the importance and uniqueness of this database for computer graphics. A quantity related to the BTF is the familiar BRDF (bidirectional reflectance distribution function). The measurement methods involved in the BTF database are conducive to simultaneous measurement of the BRDF. Accordingly, we also present a BRDF database with reflectance measurements for over 60 different samples, each observed with over 200 different combinations of viewing and illumination directions. Both of these unique databases are publicly available and have important implications for computer graphics.
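Illustrative note: a BTF can be viewed as a collection of texture images indexed by viewing and illumination direction, so a "query" amounts to retrieving the measurement closest in angle to a requested configuration. The Python sketch below is only a toy illustration of that idea; the BTFSample class, its query method and the random stand-in data are hypothetical and are not the actual CUReT database tooling.

    import numpy as np

    class BTFSample:
        """Hypothetical in-memory BTF: texture images indexed by unit viewing
        and illumination direction vectors."""
        def __init__(self, measurements):
            self.view = np.array([m[0] for m in measurements])   # (N, 3) viewing directions
            self.light = np.array([m[1] for m in measurements])  # (N, 3) illumination directions
            self.images = [m[2] for m in measurements]           # N texture images

        def query(self, view_dir, light_dir):
            """Return the measured texture image closest in angle to the request."""
            v = np.asarray(view_dir, dtype=float)
            v /= np.linalg.norm(v)
            l = np.asarray(light_dir, dtype=float)
            l /= np.linalg.norm(l)
            # Score each measurement by the sum of angular similarities (dot products).
            score = self.view @ v + self.light @ l
            return self.images[int(np.argmax(score))]

    # Toy usage: random unit directions and random 8x8 "textures" stand in for
    # the ~200 real viewing/illumination combinations measured per sample.
    rng = np.random.default_rng(0)
    meas = []
    for _ in range(200):
        v = rng.normal(size=3); v /= np.linalg.norm(v)
        l = rng.normal(size=3); l /= np.linalg.norm(l)
        meas.append((v, l, rng.random((8, 8))))
    btf = BTFSample(meas)
    img = btf.query([0.0, 0.0, 1.0], [0.5, 0.0, 0.5])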


Computer Vision and Pattern Recognition | 2001

Compact representation of bidirectional texture functions

Oana G. Cula; Kristin J. Dana

A bidirectional texture function (BTF) describes image texture as it varies with viewing and illumination direction. Many real-world surfaces such as skin, fur and gravel exhibit fine-scale geometric surface detail. Accordingly, variations in appearance with viewing and illumination direction may be quite complex due to local foreshortening, masking and shadowing. Representations of surface texture that support robust recognition must account for these effects. We construct a representation which captures the underlying statistical distribution of features in the image texture as well as the variations in this distribution with viewing and illumination direction. The representation combines clustering to learn characteristic image features and principal component analysis to reduce the space of feature histograms. This representation is based on a core image set as determined by a quantitative evaluation of the importance of individual images in the overall representation. The result is a compact representation and a recognition method where a single novel image of unknown viewing and illumination direction can be classified efficiently. The CUReT (Columbia-Utrecht reflectance and texture) database is used as a test set for evaluation of these methods.
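Illustrative note: the pipeline described above (cluster image features to obtain characteristic textons, represent each image by its texton histogram, then compress the histograms with principal component analysis) can be approximated with the following Python sketch using scikit-learn. It is a simplified stand-in, not the paper's implementation: raw image patches replace the filter-bank features, and random images replace real BTF measurements.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    def patches(image, size=5):
        """Dense square patches as flat feature vectors (stand-in for filter responses)."""
        h, w = image.shape
        return np.array([image[i:i + size, j:j + size].ravel()
                         for i in range(0, h - size, size)
                         for j in range(0, w - size, size)])

    rng = np.random.default_rng(1)
    # Toy "texture images" standing in for one sample seen under 30 viewing/illumination directions.
    images = [rng.random((64, 64)) for _ in range(30)]

    # 1. Cluster features pooled from all images to learn characteristic textons.
    all_feats = np.vstack([patches(im) for im in images])
    kmeans = KMeans(n_clusters=20, n_init=5, random_state=0).fit(all_feats)

    # 2. Represent each image by its normalized texton histogram.
    hists = np.array([np.bincount(kmeans.predict(patches(im)), minlength=20)
                      for im in images], dtype=float)
    hists /= hists.sum(axis=1, keepdims=True)

    # 3. Reduce the space of feature histograms with PCA for a compact representation.
    pca = PCA(n_components=5).fit(hists)
    compact = pca.transform(hists)   # one low-dimensional vector per image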


International Journal of Computer Vision | 2004

3D Texture Recognition Using Bidirectional Feature Histograms

Oana G. Cula; Kristin J. Dana

Textured surfaces are an inherent constituent of natural surroundings; therefore, efficient real-world applications of computer vision algorithms require precise surface descriptors. Often textured surfaces present not only variations of color or reflectance, but also local height variations. This type of surface is referred to as a 3D texture. As the lighting and viewing conditions are varied, effects such as shadowing, foreshortening and occlusion give rise to significant changes in texture appearance. Accounting for the variation of texture appearance due to changes in imaging parameters is a key issue in developing accurate 3D texture models. The bidirectional texture function (BTF) is observed image texture as a function of viewing and illumination directions. In this work, we construct a BTF-based surface model which captures the variation of the underlying statistical distribution of local structural image features as the viewing and illumination conditions are changed. This 3D texture representation is called the bidirectional feature histogram (BFH). Based on the BFH, we design a 3D texture recognition method which employs the BFH as the surface model and classifies surfaces based on a single novel texture image of unknown imaging parameters. We also develop a computational method for quantitatively evaluating the relative significance of texture images within the BTF. The performance of our methods is evaluated by employing over 6200 texture images corresponding to 40 real-world surface samples from the CUReT (Columbia-Utrecht reflectance and texture) database. Our experiments produce excellent classification results, which validate the strong descriptive properties of the BFH as a 3D texture representation.
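Illustrative note: given a bidirectional feature histogram for each known surface (its set of texton histograms over viewing/illumination conditions), classifying a single novel image reduces to comparing the novel image's histogram against each stored set. The sketch below is a hedged approximation of that recognition step: the chi-square distance, the nearest-histogram rule and the random data are illustrative choices, not necessarily those of the paper.

    import numpy as np

    def chi_square(h1, h2, eps=1e-10):
        """Chi-square distance, a common choice for comparing texture histograms."""
        return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

    def classify(novel_hist, bfh_models):
        """bfh_models maps each surface name to the histograms observed for it under
        different viewing/illumination directions (its bidirectional feature histogram).
        The novel image is assigned to the surface containing the closest histogram."""
        best, best_d = None, np.inf
        for surface, hists in bfh_models.items():
            d = min(chi_square(novel_hist, h) for h in hists)
            if d < best_d:
                best, best_d = surface, d
        return best

    # Toy usage with random histograms standing in for real BFH surface models.
    rng = np.random.default_rng(2)
    models = {name: [rng.dirichlet(np.ones(20)) for _ in range(50)]
              for name in ("felt", "plaster", "gravel")}
    label = classify(rng.dirichlet(np.ones(20)), models)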


International Journal of Computer Vision | 1999

Bidirectional Reflection Distribution Function of Thoroughly Pitted Surfaces

Jan J. Koenderink; Andrea J. van Doorn; Kristin J. Dana; Shree K. Nayar

We derive the BRDF (Bidirectional Reflection Distribution Function) at the mega scale of opaque surfaces that are rough on the macro and micro scales. The roughness at the micro scale is modeled as a uniform, isotropically scattering, Lambertian surface. At the macro scale the roughness is modeled by way of a distribution of spherical concavities. These pits influence the BRDF via vignetting, cast shadow, interreflection and interposition, causing it to differ markedly from Lambertian. Pitted surfaces show strong backward scattering (the so-called "opposition effect"). When we assume that the macro scale can be resolved, the radiance histogram and the spatial structure of the textons of the textured surface (at the mega scale) can be calculated. This is the main advantage of the model over previous ones: one can do exact (numerical) calculations for a surface geometry that is physically realizable.
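Illustrative note: the baseline from which the pitted-surface BRDF departs is the plain Lambertian case, whose BRDF is the constant albedo/pi, so that reflected radiance depends only on the cosine of the incidence angle. The short Python sketch below evaluates only that baseline; the paper's additional terms for vignetting, cast shadow, interreflection and interposition are not reproduced here.

    import numpy as np

    def lambertian_brdf(albedo):
        """Lambertian BRDF is a direction-independent constant: albedo / pi."""
        return albedo / np.pi

    def reflected_radiance(albedo, irradiance, theta_i):
        """Radiance from a flat Lambertian facet, with irradiance measured
        perpendicular to the incident beam: L = (albedo/pi) * E * cos(theta_i)."""
        return lambertian_brdf(albedo) * irradiance * np.cos(theta_i)

    # A flat Lambertian surface shows no backscattering peak; the strong
    # "opposition effect" of pitted surfaces arises entirely from the macro-scale
    # pit geometry (shadowing, vignetting, interreflection) added on top of this.
    for deg in (0, 30, 60):
        L = reflected_radiance(albedo=0.5, irradiance=1.0, theta_i=np.radians(deg))
        print(f"incidence {deg:2d} deg -> radiance {L:.4f}")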


International Journal of Computer Vision | 2005

Skin Texture Modeling

Oana G. Cula; Kristin J. Dana; Frank P. Murphy; Babar K. Rao

Quantitative characterization of skin appearance is an important but difficult task. The skin surface is a detailed landscape, with complex geometry and local optical properties. In addition, skin features depend on many variables such as body location (e.g. forehead, cheek), subject parameters (age, gender) and imaging parameters (lighting, camera). As with many real world surfaces, skin appearance is strongly affected by the direction from which it is viewed and illuminated. Computational modeling of skin texture has potential uses in many applications including realistic rendering for computer graphics, robust face models for computer vision, computer-assisted diagnosis for dermatology, topical drug efficacy testing for the pharmaceutical industry and quantitative comparison for consumer products. In this work we present models and measurements of skin texture with an emphasis on faces. We develop two models for use in skin texture recognition. Both models are image-based representations of skin appearance that are suitably descriptive without the need for prohibitively complex physics-based skin models. Our models take into account the varied appearance of the skin with changes in illumination and viewing direction. We also present a new face texture database comprised of more than 2400 images corresponding to 20 human faces, 4 locations on each face (forehead, cheek, chin and nose) and 32 combinations of imaging angles. The complete database is made publicly available for further research.


ACM/IEEE International Conference on Mobile Computing and Networking | 2010

Challenge: mobile optical networks through visual MIMO

Ashwin Ashok; Marco Gruteser; Narayan B. Mandayam; Jayant Silva; Michael Varga; Kristin J. Dana

Mobile optical communications has so far largely been limited to short ranges of about ten meters, since the highly directional nature of optical transmissions would require costly mechanical steering mechanisms. Advances in CCD and CMOS imaging technology, along with the advent of visible and infrared (IR) light sources such as light-emitting diode (LED) arrays, present an exciting and challenging concept which we call visual MIMO (multiple-input multiple-output), where optical transmissions by multiple transmitter elements are received by an array of photodiode elements (e.g., pixels in a camera). Visual MIMO opens a new vista of research challenges at the PHY, MAC and network layers, and this paper brings together the networking, communications and computer vision fields to discuss its feasibility as well as the underlying opportunities and challenges. Example applications range from household and factory robotics to tactical and vehicular networks, as well as pervasive computing, where RF communications can be interference-limited and prone to eavesdropping and security lapses, while the less observable nature of highly directional optical transmissions can be beneficial. The impact of the characteristics of such technologies on the medium access and network layers has so far received little consideration. Example characteristics are a strong reliance on computer vision algorithms for tracking, a form of interference cancellation that allows successfully receiving packets from multiple transmitters simultaneously, and the absence of fast fading but a high susceptibility to outages due to line-of-sight interruptions. These characteristics lead to significant challenges and opportunities for mobile networking research.
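Illustrative note: the core visual MIMO idea, parallel optical channels from an array of light emitters to the pixels of a camera, can be shown with a toy simulation. Everything below is hypothetical and greatly simplified relative to the paper: one camera pixel per LED, simple on-off keying, additive noise, and no tracking, perspective or interference effects.

    import numpy as np

    rng = np.random.default_rng(3)

    # Transmitter: a 4x4 LED array sends 16 bits per camera frame using on-off keying.
    bits = rng.integers(0, 2, size=(4, 4))

    # Channel + receiver: each LED maps to one camera pixel; the camera observes
    # LED brightness plus ambient light and sensor noise.
    on_level, ambient, noise_std = 1.0, 0.2, 0.1
    pixels = bits * on_level + ambient + rng.normal(0.0, noise_std, size=bits.shape)

    # Decoding: threshold halfway between the expected "off" and "on" levels.
    threshold = ambient + on_level / 2
    decoded = (pixels > threshold).astype(int)

    print("bit errors:", int(np.sum(decoded != bits)), "out of", bits.size)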


International Conference on Computer Vision | 2001

BRDF/BTF measurement device

Kristin J. Dana

Capturing surface appearance is important for a large number of applications. Appearance of real world surfaces is difficult to model as it varies with the direction of illumination as well as the direction from which it is viewed. Consequently, measurements of the BRDF (bidirectional reflectance distribution function) have been important. In addition, many applications require measuring how the entire surface reflects light, i.e. spatially varying BRDF measurements are important as well. For compactness we refer to a spatially varying BRDF as a BTF (bidirectional texture function). Measurements of BRDF and/or BTF typically require significant resources in time and equipment. In this work, a device for BRDF/BTF measurement is presented that is compact, economical and convenient. The device uses the approach of curved mirrors to remove the need for hemispherical positioning of the camera and illumination source. Instead, simple planar translations of optical components are used to vary the illumination direction and to scan the surface. Furthermore, the measurement process is fast because the device enables simultaneous measurements of multiple viewing directions.


IEEE Transactions on Biomedical Engineering | 2004

Bidirectional imaging and modeling of skin texture

Oana G. Cula; Kristin J. Dana; Frank P. Murphy; Babar K. Rao

In this paper, we present a method of skin imaging called bidirectional imaging that captures significantly more properties of appearance than standard imaging. The observed structure of the skin's surface is greatly dependent on the angle of incident illumination and the angle of observation. Specific protocols to achieve bidirectional imaging are presented and used to create the Rutgers Skin Texture Database (clinical component). This image database is the first of its kind in the dermatology community. Skin images of several disorders under multiple controlled illumination and viewing directions are provided publicly for research and educational use. Using this skin texture database, we employ computational surface modeling to perform automated skin texture classification. The classification experiments demonstrate the usefulness of the modeling and measurement methods.


Computer Vision and Pattern Recognition | 1998

Histogram model for 3D textures

Kristin J. Dana; Shree K. Nayar

Image texture can arise not only from surface albedo variations (2D texture) but also from surface height variations (3D texture). Since the appearance of 3D texture depends on the illumination and viewing direction in a complicated manner, such image texture can be called a bidirectional texture function. A fundamental representation of image texture is the histogram of pixel intensities. Since the histogram of 3D texture also depends on the illumination and viewing directions in a complex fashion, we refer to it as a bidirectional histogram. In this work, we present a concise analytical model for the bidirectional histogram of Lambertian, isotropic, randomly rough surfaces, which are common in real-world scenes. We demonstrate the accuracy of the histogram model by fitting to several samples from the Columbia-Utrecht texture database. The parameters obtained from the model fits are roughness measures which can be used in texture recognition schemes. In addition, the model has potential application in estimating illumination direction in scenes where surfaces of known tilt and roughness are visible. We demonstrate the usefulness of our model by employing it in a novel 3D texture synthesis procedure.
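Illustrative note: the bidirectional histogram is simply the ordinary histogram of pixel intensities, tracked as a function of viewing and illumination direction. The sketch below computes such histograms for two synthetic images standing in for the same 3D texture under frontal and oblique illumination; the synthetic data and the L1 comparison are illustrative only, whereas the paper fits a parametric model whose parameters serve as roughness measures.

    import numpy as np

    def intensity_histogram(image, bins=32):
        """Normalized histogram of pixel intensities for one texture image."""
        hist, _ = np.histogram(image, bins=bins, range=(0.0, 1.0))
        return hist / hist.sum()

    # Synthetic stand-ins for one 3D texture imaged under two conditions:
    # oblique illumination produces more shadowing, hence darker, more spread-out intensities.
    rng = np.random.default_rng(4)
    frontal = np.clip(rng.normal(0.60, 0.15, size=(128, 128)), 0.0, 1.0)
    oblique = np.clip(rng.normal(0.35, 0.25, size=(128, 128)), 0.0, 1.0)

    h_frontal = intensity_histogram(frontal)
    h_oblique = intensity_histogram(oblique)

    # For a 3D texture the histogram changes systematically with imaging geometry.
    print("histogram L1 difference:", np.abs(h_frontal - h_oblique).sum())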


International Conference on Computer Vision | 1999

Correlation model for 3D texture

Kristin J. Dana; Shree K. Nayar

While an exact definition of texture is somewhat elusive, texture can be qualitatively described as a distribution of color, albedo or local normal on a surface. In the literature, the word texture is often used to describe a color or albedo variation on a smooth surface. We refer to such texture as 2D texture. In real world scenes, texture is often due to surface height variations and can be termed 3D texture. Because of local foreshortening and masking, oblique views of 3D texture are not simple transformations of the frontal view. Consequently, texture representations such as the correlation function or power spectrum are also affected by local foreshortening and masking. This work presents a correlation model for a particular class of 3D textures. The model characterizes the spatial relationship among neighboring pixels in an image of 3D texture and the change of this spatial relationship with viewing direction.
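Illustrative note: the correlation function referred to above can be computed for a single image via the Wiener-Khinchin relation (the autocorrelation is the inverse Fourier transform of the power spectrum). The sketch below computes it for a frontal view and a crudely foreshortened view; the data and the column-subsampling stand-in for foreshortening are hypothetical, and the paper's actual model of how the correlation changes with viewing direction is not reproduced.

    import numpy as np

    def autocorrelation(image):
        """2-D autocorrelation of a zero-mean image, via the FFT
        (Wiener-Khinchin: autocorrelation = inverse FFT of the power spectrum)."""
        x = image - image.mean()
        spectrum = np.fft.fft2(x)
        acorr = np.fft.ifft2(np.abs(spectrum) ** 2).real
        return np.fft.fftshift(acorr) / acorr.max()

    rng = np.random.default_rng(5)
    frontal_view = rng.random((128, 128))
    # A smooth (2D) texture viewed obliquely is roughly a foreshortened copy of the
    # frontal view, so its correlation is just a stretched version; for 3D texture,
    # masking and local foreshortening break this simple relationship.
    oblique_view = frontal_view[:, ::2]   # crude stand-in for horizontal foreshortening

    c_frontal = autocorrelation(frontal_view)
    c_oblique = autocorrelation(oblique_view)
    print(c_frontal.shape, c_oblique.shape)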
