Optics Letters | 2021

Optical random phase dropout in a diffractive deep neural network.


Abstract


Unitary learning is a backpropagation (BP) method for updating the unitary weights of fully connected deep complex-valued neural networks, matching the unitary prior of an active-modulation diffractive deep neural network. However, because the unitary weights of each layer form a square matrix, their learning amounts to small-sample training, which yields a nearly useless network with fairly poor generalization capability. To alleviate this severe over-fitting problem, optical random phase dropout is formulated and designed in this Letter. The equivalence between the unitary forward pass and the diffractive network leads to a synthetic mask that seamlessly compounds a computational modulation with a random sampling comb, i.e., the dropout. The zero positions of the dropout comb, drawn from a Bernoulli distribution, are filled with random phases that slightly deflect part of the transmitted optical rays at each output end, generating statistical inference networks. The enhanced generalization benefits from the massively parallel full connections over different optical links involved in the training. The random phase comb enters unitary BP in conjugate form, which indicates the significance of optical BP.
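
The Letter itself does not include code, but the mechanism described above (a Bernoulli sampling comb whose dropped positions are filled with random phases instead of zeros and then compounded with the computational phase modulation of a layer) can be sketched numerically. The layer size, drop probability, and function names below are illustrative assumptions rather than details from the paper.

import numpy as np

def random_phase_dropout_mask(shape, drop_prob, rng):
    # Hypothetical sketch: build the dropout comb for one diffractive layer.
    # Positions kept by the Bernoulli comb pass unchanged (factor 1);
    # dropped positions receive a random phase exp(i*phi) instead of zero,
    # so the corresponding rays are slightly deflected rather than blocked.
    keep = rng.random(shape) > drop_prob                      # Bernoulli sampling comb
    random_phase = np.exp(1j * 2 * np.pi * rng.random(shape))
    return np.where(keep, 1.0 + 0j, random_phase)

# Illustrative layer: compound the learned phase modulation with the dropout comb.
rng = np.random.default_rng(0)
n = 64                                                        # assumed layer size
learned_phase = np.exp(1j * 2 * np.pi * rng.random((n, n)))   # computational modulation
synthetic_mask = learned_phase * random_phase_dropout_mask((n, n), drop_prob=0.2, rng=rng)

field_in = np.ones((n, n), dtype=complex)                     # toy incident field
field_out = synthetic_mask * field_in                         # element-wise modulation before diffraction

# Per the abstract, the random phase comb appears in unitary BP in conjugate form,
# e.g. the backward pass would use np.conj(synthetic_mask).
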

Volume 46, Issue 20
Pages 5260-5263
DOI 10.1364/OL.428761
Language English
Journal Optics Letters
