|Author||Grace, A. E. ♦ Pycock, D.|
|Subject Domain (in DDC)||Computer science, information & general works ♦ Data processing & computer science|
|Subject Keyword||Multiresolution Active Contour Model ♦ Surface Texture ♦ Edge Strength ♦ Stereoscopic Road Surface View ♦ Conventional Technique ♦ Smoothness Functional ♦ Traditional Approach ♦ External Constraint ♦ Sparse Range Data ♦ Stereoscopic Image ♦ Place Initial Edge Point ♦ Textured Stereo Image ♦ Internal Constraint ♦ Maximum Likelihood Estimate ♦ Open Active Contour Model ♦ Low Resolution Version ♦ Inherent Texture ♦ 3-d Road Defect ♦ Light Stripe ♦ Intensity Image ♦ Textured Surface|
|Description||In British Machine Vision Conference
This paper presents a method for generating sparse range data from textured surfaces onto which structured light is projected. The work is motivated by the need to measure 3-D road defects rapidly and reliably. Traditional approaches to computing range from stereoscopic images have relied on either smooth or finely textured surfaces when using structured light. Conventional techniques that take advantage of the inherent texture in the images are not applicable, because corresponding stereoscopic road surface views are dissimilar due to the geometry of the cameras and the surface texture. The method described places initial edge points in a low-resolution version of the intensity image. These points are used to initialise open active contour models, or snakes, which are propagated via a pyramid to a higher resolution. At this higher resolution, internal and external constraints are applied to the snake: the internal constraint is a smoothness functional, and the external constraint is based on a maximum likelihood estimate of the edge strength across each light stripe. Computation is spatially localised at each stage, so the algorithm could easily be parallelised. (An illustrative sketch of such a snake update appears after the metadata fields below.)|
|Educational Role||Student ♦ Teacher|
|Age Range||Above 22 years|
|Education Level||UG and PG ♦ Career/Technical Study|
|Learning Resource Type||Article|
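The abstract describes a classic open-snake scheme: an internal smoothness term, an external term driven by edge strength, and coarse-to-fine propagation through an image pyramid. Below is a minimal NumPy sketch of that general scheme, not the authors' implementation: the paper's maximum-likelihood edge estimate is stood in for by a simple gradient-magnitude edge map, and the function names and parameter values (`evolve_snake`, `alpha`, `beta`, `gamma`) are illustrative assumptions.

```python
import numpy as np

def internal_matrix(n, alpha, beta):
    # Open-snake internal-energy matrix built from first- and second-
    # difference operators (natural boundary conditions at the free ends).
    d1 = np.diff(np.eye(n), axis=0)        # (n-1, n) first differences
    d2 = np.diff(np.eye(n), n=2, axis=0)   # (n-2, n) second differences
    return alpha * d1.T @ d1 + beta * d2.T @ d2

def evolve_snake(points, edge_strength, alpha=0.1, beta=0.05,
                 gamma=1.0, iterations=200):
    # Semi-implicit snake update in the style of Kass et al.: the internal
    # term is handled implicitly (matrix factored once), the external term
    # explicitly. `edge_strength` stands in for the paper's maximum
    # likelihood edge estimate across each light stripe (an assumption).
    n = len(points)
    step = np.linalg.inv(np.eye(n) + gamma * internal_matrix(n, alpha, beta))
    gy, gx = np.gradient(edge_strength)    # external force = grad(edge strength)
    pts = points.astype(float)
    for _ in range(iterations):
        r = np.clip(pts[:, 0].round().astype(int), 0, edge_strength.shape[0] - 1)
        c = np.clip(pts[:, 1].round().astype(int), 0, edge_strength.shape[1] - 1)
        force = np.stack([gy[r, c], gx[r, c]], axis=1)
        pts = step @ (pts + gamma * force)
    return pts

# Coarse-to-fine use, echoing the pyramid propagation in the abstract:
# fit at half resolution, scale the points up, then refine at full size.
img = np.zeros((128, 128))
img[:, 60:68] = 1.0                         # synthetic vertical light stripe
edge = np.abs(np.gradient(img, axis=1))     # crude edge-strength stand-in
pts = np.stack([np.linspace(4.0, 59.0, 30), np.full(30, 29.0)], axis=1)
pts = evolve_snake(pts, edge[::2, ::2])     # low-resolution fit
pts = evolve_snake(pts * 2, edge)           # propagate to higher resolution
print(pts.round(1)[:3])                     # points settle on the stripe edge
```

The semi-implicit step is the usual design choice for this kind of iteration: the internal (smoothness) matrix is inverted once and reused, so each iteration costs only a sampling of the external force and a matrix multiply, keeping the computation spatially localised in the sense the abstract notes.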