Automatic evaluation of the internal parameters of the onboard camera of a spacecraft from video data of dockings with the ISS


Abstract

The KURS radio engineering system, which measures motion parameters during rendezvous and docking, has several drawbacks: its measurement accuracy can degrade when the radio wave undergoes multiple reflections; hardware must be installed on both docking vehicles (the active and the passive side); and it is expensive both in energy consumption and in cost. An analysis of existing visual systems shows that they successfully solve visual odometry problems on UAVs, robots, and similar platforms. To use such a system, however, the internal parameters of the camera must be known (calibration). Classical calibration with a checkerboard pattern is difficult to perform in outer space. This paper therefore proposes methods for estimating the focal length of the camera from an available video sequence recording the rendezvous of spacecraft. The proposed approaches are based on maximum likelihood estimation (MLE) and maximum a posteriori (MAP) estimation of a functional that depends on the Euler angles and the focal length. A comparison of the two methods shows the advantage of MAP over MLE and the practical applicability of the approach.
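The relationship between the two estimators described in the abstract can be illustrated with a minimal sketch. The observation model, noise parameters, and prior below are illustrative assumptions, not the authors' actual functional: a point on the optical axis is assumed to project to u = f·tan(θ) after the camera rotates by a known angle θ, and the Euler angles are taken as known so that only the focal length f is estimated. MAP then differs from MLE only by a Gaussian prior term on f.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simulated data under the assumed pinhole-rotation model u = f * tan(theta).
rng = np.random.default_rng(42)
f_true = 1200.0  # ground-truth focal length, pixels (hypothetical value)
thetas = np.deg2rad(np.linspace(1.0, 10.0, 20))  # known rotation angles
pixels = f_true * np.tan(thetas) + rng.normal(0.0, 2.0, thetas.size)

def nll(f):
    """Negative log-likelihood under i.i.d. Gaussian pixel noise (MLE objective)."""
    return 0.5 * np.sum((pixels - f * np.tan(thetas)) ** 2)

def nlp(f, f_prior=1000.0, sigma_f=300.0):
    """Negative log-posterior: the NLL plus a Gaussian prior on f (MAP objective)."""
    return nll(f) + 0.5 * ((f - f_prior) / sigma_f) ** 2

f_mle = minimize_scalar(nll, bounds=(100.0, 5000.0), method="bounded").x
f_map = minimize_scalar(nlp, bounds=(100.0, 5000.0), method="bounded").x
print(f"MLE: {f_mle:.1f} px, MAP: {f_map:.1f} px")
```

With informative data the two estimates nearly coincide; the prior matters when the video provides weak constraints on f, which is where the abstract reports MAP outperforming MLE.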

About the authors

V. A. Zinov

Institute for Information Transmission Problems RAS; Moscow Institute of Physics and Technology (National Research University)

Email: konovalenko@iitp.ru
Russia, 127051, Moscow, Bolshoy Karetnyy Pereulok, 19; Russia, 141700, Dolgoprudny, Institutskiy Pereulok, 9

I. A. Konovalenko

Moscow Institute of Physics and Technology (National Research University)

Author for correspondence.
Email: konovalenko@iitp.ru
Russia, 141700, Dolgoprudny, Institutskiy Pereulok, 9



Copyright (c) 2023 Russian Academy of Sciences
