Multimode fiber-based greyscale image projector enabled by neural networks with high generalization ability

Jian Wang, Guangchao Zhong, Daixuan Wu, Sitong Huang, Zhi Chao Luo, Yuecheng Shen*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

19 Scopus citations

Abstract

Multimode fibers (MMFs) are emerging as promising transmission media for delivering images. However, the strong mode coupling inherent in MMFs makes it difficult to project two-dimensional images directly through them. By synergetically training two subnetworks, named Actor-net and Model-net, a previous study [Nature Machine Intelligence 2, 403 (2020)] alleviated this issue and demonstrated image projection through MMFs with high fidelity. In this work, we go a step further by extending the generalization ability to greyscale images. The modified projector network contains three subnetworks, namely forward-net, backward-net, and holography-net, accounting for forward propagation, backward propagation, and the phase-retrieval process, respectively. As a proof of concept, we experimentally trained the projector network using randomly generated phase maps and the corresponding speckle images output from a 1-meter-long MMF. Once trained, the network successfully projected binary images from MNIST and EMNIST and greyscale images from Fashion-MNIST, achieving averaged Pearson's correlation coefficients of 0.91, 0.92, and 0.87, respectively. Since none of these projected images had been seen by the projector network before, a strong generalization ability in projecting greyscale images is confirmed.
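The fidelity metric quoted above, Pearson's correlation coefficient between a target image and its projected counterpart, can be computed directly from the flattened pixel values. The sketch below is illustrative only and is not the authors' code; the noise level and image size are arbitrary stand-ins.

```python
import math
import random

def pearson_cc(a, b):
    """Pearson's correlation coefficient between two equal-length pixel lists."""
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

# Illustrative use: a flattened 28x28 "target" image and a noisy "projected" copy.
random.seed(0)
target = [random.random() for _ in range(28 * 28)]
projected = [p + random.gauss(0.0, 0.1) for p in target]
print(f"{pearson_cc(target, projected):.2f}")
```

A coefficient near 1 indicates a near-perfect match; the 0.87–0.92 values reported in the abstract correspond to visually faithful reconstructions of previously unseen images.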

Original language: English
Pages (from-to): 4839-4850
Number of pages: 12
Journal: Optics Express
Volume: 31
Issue number: 3
DOIs
State: Published - 30 Jan 2023
Externally published: Yes
