Linear Anti-interference Algorithm for Digital Signal Transmission in Fiber Optic Communication Networks Based on Link Analysis

Jing Wu
Cheng Jin
Ziwu Wang


To achieve accurate transmission of relay protection signals in fiber optic communication networks, channel-balancing configuration of the network and adaptive forwarding control of the relay protection signals are required. The authors propose an accurate transmission method for relay protection signals in fiber optic communication networks based on time-varying multipath fading suppression and adaptive beamforming. The study analyzes the sources of interference in long-distance wireless signal transmission, introduces anti-interference techniques such as space-time two-dimensional joint adaptive processing (STAP), presents the anti-interference algorithms together with the associated gain analysis, and simulates the signal processing gain in MATLAB. The simulation results show that, for a given symbol length, the processing gain asymptotically approaches a theoretical limit as the signal bandwidth increases, rather than growing without bound. The reason is that the channel is affected by noise: multiplying the channel estimate by the conjugate of the signal produces a noise quadratic term. As the bandwidth grows, the influence of noise on the channel estimate within the coherent region decreases, the signal-to-noise ratio loss caused by the noise quadratic term shrinks, and the processing gain therefore increases. As the signal bandwidth increases without limit, the receiver's input signal-to-noise power ratio decreases without bound, constrained by the size of the coherent region; the noise in the channel estimate grows, and the noise quadratic term becomes the dominant contributor to the output noise power. When the symbol length exceeds the coherence time, a smaller maximum Doppler shift and a larger coherent detection region yield a greater processing gain.
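The mechanism the abstract describes, multiplying a noisy channel estimate by the conjugate of the received signal so that a noise-times-noise quadratic term limits the achievable processing gain, can be illustrated with a minimal Monte Carlo sketch. This is written in Python rather than the paper's MATLAB, and every name, parameter, and modeling choice below (BPSK symbols, Rayleigh taps, equal estimation and reception noise) is an illustrative assumption, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def output_snr_db(n_symbols: int, input_snr_db: float) -> float:
    """Monte Carlo estimate of the receiver output SNR when a noisy
    channel estimate is conjugate-multiplied with the received signal.
    The product of the estimate's noise and the reception noise forms
    the 'noise quadratic' term that caps the processing gain."""
    snr = 10.0 ** (input_snr_db / 10.0)
    sigma = np.sqrt(1.0 / (2.0 * snr))          # per-dimension noise std
    s = rng.choice([-1.0, 1.0], n_symbols)      # unit-power BPSK symbols
    # Rayleigh-fading channel taps with E|h|^2 = 1
    h = (rng.standard_normal(n_symbols)
         + 1j * rng.standard_normal(n_symbols)) / np.sqrt(2.0)
    n_rx = sigma * (rng.standard_normal(n_symbols)
                    + 1j * rng.standard_normal(n_symbols))
    n_est = sigma * (rng.standard_normal(n_symbols)
                     + 1j * rng.standard_normal(n_symbols))
    r = h * s + n_rx                            # received signal
    h_hat = h + n_est                           # noisy channel estimate
    y = np.conj(h_hat) * r                      # conjugate multiplication
    # Useful part is |h|^2 s; the cross terms plus conj(n_est)*n_rx
    # (the noise quadratic term) make up the distortion.
    signal = np.abs(h) ** 2 * s
    distortion = y - signal
    return 10.0 * np.log10(np.mean(np.abs(signal) ** 2)
                           / np.mean(np.abs(distortion) ** 2))
```

Under these assumptions the useful power is E|h|^4 = 2 and the distortion power is 2*sigma_t^2 + sigma_t^4 (with sigma_t^2 the total complex noise variance), so the quadratic term sigma_t^4 is negligible at high input SNR but dominant at low input SNR, reproducing the saturation behavior the abstract describes.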


Special Issue - Deep Learning-Based Advanced Research Trends in Scalable Computing