%0 Journal Article
%T Nonlinear Simulation of Diffraction Neural Networks Integrated with Optoelectronic Synapses
%A Lu, Jiaying
%A Chen, Xi
%J Modeling and Simulation
%P 848-857
%@ 2324-870X
%D 2025
%I Hans Publishing
%R 10.12677/mos.2025.144335
%X Diffractive deep neural networks (D2NNs) have shown significant advantages in tasks such as lensless imaging and image classification due to their low energy consumption, high speed, and excellent robustness to interference. However, the lack of a nonlinear activation layer limits the generalization and fitting capabilities of D2NNs in complex tasks. To address this issue, this paper proposes a nonlinear diffractive deep neural network based on graphene nanowall (GNW) optoelectronic synapses. By introducing GNW optoelectronic synapses into the output layer of the D2NN, their sensitive optoelectronic response is exploited to generate a nonlinear, intensity-dependent photocurrent, thereby realizing a novel activation function, super-softmax. Numerical simulation results demonstrate that the super-softmax activation function outperforms the traditional softmax activation function regardless of whether the output layer is normalized. Applied to a single-layer nonlinear D2NN (64 × 64 neurons) on the MNIST handwritten digit recognition task, the classification accuracy reaches up to 95%. This study provides important theoretical support for realizing closed-loop online learning in diffractive deep neural networks.
%K Diffractive Deep Neural Network
%K Graphene Nanowalls Optoelectronic Synapses
%K Nonlinear Activation Function
%U http://www.hanspub.org/journal/PaperInformation.aspx?PaperID=112825
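Note: the abstract names the super-softmax activation but does not give its functional form or the GNW photoresponse curve. The Python sketch below is purely illustrative of the idea described (a softmax applied to a nonlinear, intensity-dependent photocurrent at the output plane); the function gnw_photocurrent, its parameters i_sat and gamma, and the saturating power-law form are hypothetical stand-ins, not the authors' method.

```python
import numpy as np

def softmax(z):
    """Standard softmax over detector-region readouts."""
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def gnw_photocurrent(intensity, i_sat=1.0, gamma=2.0):
    """Hypothetical nonlinear photocurrent vs. light intensity.

    Stand-in for the GNW optoelectronic synapse response; the paper's
    measured response would be substituted here. A saturating power
    law is assumed purely for illustration.
    """
    return i_sat * intensity**gamma / (1.0 + intensity**gamma)

def super_softmax(intensities, **kw):
    """Illustrative 'super-softmax': softmax of the nonlinear
    photocurrents read out at the D2NN output plane (assumed form)."""
    return softmax(gnw_photocurrent(np.asarray(intensities, dtype=float), **kw))

# Example: ten detector regions, one per MNIST digit class.
intensities = np.random.rand(10)
print(softmax(intensities))
print(super_softmax(intensities))
```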