International Transaction Journal of Engineering, Management, & Applied Sciences & Technologies


http://TuEngr.com



ISSN 2228-9860
eISSN 1906-9642
CODEN: ITJEA8


FEATURE PEER-REVIEWED ARTICLE

Vol.14(1)(2023)

  • Towards Autonomous Micropipette Positioning in Eye Surgery by Employing Deep Learning Algorithm in Micro-Cannulation

    Mukesh Madanan, Nurul Akhmal Mohd Zulkefli (Department of Computer Science, Dhofar University, Salalah, Oman).

    Discipline: Artificial Intelligence, Healthcare, Applied Information Technology


    doi: 10.14456/ITJEMAST.2023.1

    Keywords: Artificial Intelligence; Machine Learning; Deep Learning; Robotic Surgery; Eye Surgery; Micro-cannulation; Enhanced Gaussian Filtering; Bee Colony Optimization; CNN; Image Processing

    Abstract
    Eye surgery, and retinal microsurgery in particular, demands both sensory and motor skills that push human physiological limits: maintaining a steady hand, sensing very small forces, and achieving high positioning accuracy. Although these constraints motivate the use of robots across surgical specialties, many challenges must still be overcome before fully developed systems are available, and robotic assistance in ophthalmic surgery faces the same obstacles. This work addresses the autonomous positioning of a micropipette mounted on a surgical robot for eye surgery. First, multiple microscopic images of the micropipette and its shadow are collected. These images are filtered with the Enhanced Gaussian Filtering (EGF) method, and each filtered image is segmented by Bee Colony Optimization (BCO) into three regions: the micropipette, the eye ground, and the micropipette's shadow. A Modified Convolutional Neural Network (MCNN), trained on the microscopic images with their ground truth, then performs automatic feature extraction and estimates the micropipette and shadow regions, together with the pipette tip, from a given microscopic image. These estimates drive the robot's autonomous position control. In simulation, the micropipette was positioned with a 99.56% success rate at a mean distance of 1.37 mm from the eye ground.
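The processing chain described in the abstract (filter the image, segment it into micropipette / eye ground / shadow, then locate the tip for position control) can be sketched in a few lines. The paper's Enhanced Gaussian Filtering, BCO segmentation, and MCNN are not publicly specified, so this minimal NumPy sketch substitutes plain Gaussian smoothing, fixed intensity thresholds in place of BCO, and a lowest-pixel tip heuristic; all thresholds and kernel parameters are illustrative assumptions, not the authors' method.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """2-D Gaussian kernel, normalized to sum to 1."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def smooth(img, size=5, sigma=1.0):
    """Plain Gaussian smoothing by direct 2-D convolution with zero
    padding (a stand-in for the paper's Enhanced Gaussian Filtering)."""
    k = gaussian_kernel(size, sigma)
    pad = size // 2
    p = np.pad(img, pad)
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (p[i:i + size, j:j + size] * k).sum()
    return out

def segment(img, t_pipette=0.7, t_shadow=0.3):
    """Three-way intensity segmentation (a stand-in for BCO):
    2 = micropipette (bright), 1 = shadow (dark), 0 = eye ground."""
    labels = np.zeros(img.shape, dtype=int)
    labels[img >= t_pipette] = 2
    labels[img <= t_shadow] = 1
    return labels

def tip_position(labels):
    """Tip heuristic: the lowest (largest-row) pixel labelled
    micropipette, assuming the pipette enters from the top."""
    ys, xs = np.nonzero(labels == 2)
    i = np.argmax(ys)
    return int(ys[i]), int(xs[i])
```

On a synthetic frame with a bright pipette column and a dark shadow column against a mid-grey eye ground, `tip_position(segment(smooth(img)))` returns the pixel coordinates that a position controller would servo toward; in the paper this role is played by the MCNN's region and tip estimates.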

    Paper ID: 14A1A

    Cite this article:

    Madanan, M., & Zulkefli, N. A. (2023). Towards Autonomous Micropipette Positioning in Eye Surgery by Employing Deep Learning Algorithm in Micro-Cannulation. International Transaction Journal of Engineering, Management, & Applied Sciences & Technologies, 14(1), 14A1A, 1-20. http://TUENGR.COM/V14/14A1A.pdf DOI: 10.14456/ITJEMAST.2023.1


