Neuromorphic decoding of sample image representations by the boundary-consistent interpolation method
- Authors: Kershner V.A.
- Affiliations: Kotel’nikov Institute of Radio Engineering and Electronics, Russian Academy of Sciences
- Issue: Vol. 69, No. 12 (2024)
- Pages: 1183-1190
- Section: Theory and Methods of Signal Processing
- URL: https://rjonco.com/0033-8494/article/view/682389
- DOI: https://doi.org/10.31857/S0033849424120064
- EDN: https://elibrary.ru/HNBTUV
- ID: 682389
Abstract
The paper discusses methods for encoding and decoding large amounts of data with a neuromorphic model based on known neural mechanisms of visual perception. Known mechanisms of the visual system, such as the aggregation of counts by receptive fields and center-surround (lateral) inhibition, are examined. A decoding model is developed that implements the function of simple cells of the primary visual cortex, which are responsible for the spatial perception of stimulus contrasts. The proposed decoding model makes it possible to restore local object boundaries in an image while improving visual quality compared with reconstruction by classical bilinear interpolation.
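The abstract contrasts boundary-preserving decoding with classical bilinear interpolation. The paper's simple-cell-based decoding model is not reproduced here; the sketch below is only a minimal NumPy illustration, using a generic edge-directed heuristic, of the difference at stake: bilinear interpolation averages intensities across a local boundary, whereas a boundary-consistent rule interpolates along the locally smoother direction. The function names and the toy image are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bilinear_upscale_2x(img):
    """Classical 2x bilinear upscaling (the baseline named in the abstract)."""
    h, w = img.shape
    out = np.zeros((2 * h - 1, 2 * w - 1), dtype=float)
    out[::2, ::2] = img
    out[1::2, ::2] = 0.5 * (img[:-1, :] + img[1:, :])       # midpoints between rows
    out[::2, 1::2] = 0.5 * (img[:, :-1] + img[:, 1:])       # midpoints between columns
    out[1::2, 1::2] = 0.25 * (img[:-1, :-1] + img[:-1, 1:]
                              + img[1:, :-1] + img[1:, 1:])  # block centers
    return out

def edge_directed_upscale_2x(img):
    """Generic edge-directed 2x upscaling (not the paper's decoder):
    each block center is averaged along the diagonal with the smaller
    intensity jump, so values are not mixed across a likely boundary."""
    out = bilinear_upscale_2x(img)
    a, b = img[:-1, :-1], img[:-1, 1:]
    c, d = img[1:, :-1], img[1:, 1:]
    along_main = 0.5 * (a + d)   # interpolate along the "\" diagonal
    along_anti = 0.5 * (b + c)   # interpolate along the "/" diagonal
    out[1::2, 1::2] = np.where(np.abs(a - d) <= np.abs(b - c),
                               along_main, along_anti)
    return out

if __name__ == "__main__":
    # Toy low-resolution image: a sharp diagonal edge between two flat regions.
    y, x = np.mgrid[0:8, 0:8]
    lowres = (x > y).astype(float)
    print("bilinear row 7:     ", np.round(bilinear_upscale_2x(lowres)[7], 2))
    print("edge-directed row 7:", np.round(edge_directed_upscale_2x(lowres)[7], 2))
```

On this toy image the bilinear result smears the step edge over intermediate values, while the edge-directed rule keeps it sharp; this is the kind of local-boundary preservation the abstract attributes to the proposed decoding model.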
About the authors
V. Kershner
Kotel’nikov Institute of Radio Engineering and Electronics, Russian Academy of Sciences
Corresponding author.
Email: vladkershner@mail.ru
Russia, Mokhovaya Str., 11, Build. 7, Moscow, 125009