M. A. Lebedev; D. G. Stepaniants; D. V. Komarov; O. V. Vygolov; Yu. V. Vizilter; S. Yu. Zheltov (2014)
Publisher: Copernicus Publications
Journal: The International Archives of the Photogrammetry
Languages: English
Types: Article
Subjects: TA1-2040, T, TA1501-1820, Applied optics. Photonics, Engineering (General). Civil engineering (General), Technology

Classified by OpenAIRE into

arxiv: Computer Science::Computer Vision and Pattern Recognition
ACM Ref: Computing Methodologies → Image Processing and Computer Vision
The paper addresses a promising visualization concept: combining sensor and synthetic images to enhance a pilot's situation awareness during aircraft landing. A real-time algorithm is proposed for fusing a sensor image, acquired by an onboard camera, with a synthetic 3D image of the external view generated by an onboard computer. The pixel correspondence between the sensor and synthetic images is obtained through exterior orientation of a "virtual" camera, using runway points as a geospatial reference. The runway points are detected by the Projective Hough Transform, the idea of which is to project the edge map onto a horizontal plane in object space (the runway plane) and then accumulate intensity projections of edge pixels along different directions of the intensity gradient. Experiments on simulated images show that, on a base glide path, the algorithm provides image fusion with pixel accuracy even in the presence of significant navigation errors.
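The two core steps the abstract describes — back-projecting edge pixels onto the runway plane and accumulating direction votes in Hough fashion — can be sketched as follows. This is a minimal illustration assuming a standard pinhole camera model (x = K(RX + t)) and a ground plane at z = 0; the function names, parameters, and the simple pairwise-direction accumulator are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def backproject_to_ground(pixels, K, R, t):
    """Back-project image pixels onto the runway plane z = 0.

    Assumes a pinhole model x = K (R X + t): K is the 3x3 intrinsic
    matrix, R/t map world coordinates into the camera frame.
    """
    K_inv = np.linalg.inv(K)
    origin = -R.T @ t                      # camera centre in world frame
    pts = []
    for u, v in pixels:
        ray_cam = K_inv @ np.array([u, v, 1.0])   # viewing ray, camera frame
        ray_world = R.T @ ray_cam                 # rotate ray into world frame
        s = -origin[2] / ray_world[2]             # intersect with plane z = 0
        pts.append(origin + s * ray_world)
    return np.array(pts)

def direction_votes(ground_pts, n_bins=180):
    """Hough-style 1-D accumulator over pairwise point directions.

    Dominant edge orientations on the ground plane (e.g. runway
    borders) show up as peaks in the histogram. Angles are folded
    into [0, pi) so opposite directions vote for the same bin.
    """
    votes = np.zeros(n_bins)
    pts = ground_pts[:, :2]
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            d = pts[j] - pts[i]
            theta = np.arctan2(d[1], d[0]) % np.pi
            votes[int(theta / np.pi * n_bins) % n_bins] += 1
    return votes
```

For example, a camera looking straight down from height h back-projects its principal point to the ground point directly beneath it, and collinear ground points produce a single sharp peak in the direction histogram.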