I ran a simulation with the current LTE-A-compliant files to compare the FER performance of turbo coding and LDPC in a SISO AWGN environment.
The results show that LDPC achieved better FER performance at CQI indexes 1, 2, 3, 14, and 15, while turbo coding performed better at the remaining CQI indexes.
I expected LDPC to perform better at all CQI indexes, so I am confused by this result.
For the comparison, I used the Linear-Log-MAP algorithm as the turbo decoder and the PWL-Min-Sum algorithm as the LDPC decoder.
Based on these results, could you tell me whether LDPC is actually inferior to turbo coding, or whether there is anything else I should consider for LDPC besides the channel code and the decoder?
Did you adjust the decoding iterations? For turbo, 8 iterations are adequate; however, for LDPC, given the implemented decoder (serial scheduling), you should use at least 16 decoding iterations.
In general, at moderate code rates and block lengths, 5G LDPC and LTE turbo codes have similar performance.
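To illustrate why the iteration count matters, here is a minimal sketch of min-sum LDPC decoding with a configurable `max_iter`. It is not the simulator's PWL-Min-Sum serial-scheduling implementation: it uses a simple flooding schedule, a scaling factor of 0.75, and a small hypothetical (7,4) parity-check matrix chosen only for illustration. Too few iterations simply means the decoder returns before the hard decision converges to a valid codeword.

```python
def min_sum_decode(H, llr, max_iter=16, scale=0.75):
    """Scaled min-sum decoding with a flooding schedule.

    H        -- parity-check matrix as a list of 0/1 rows
    llr      -- channel LLRs (positive favors bit 0)
    max_iter -- decoding iterations; LDPC typically needs more
                than the ~8 used for turbo decoding
    Returns (hard_decision, converged).
    """
    m, n = len(H), len(llr)
    checks = [[j for j in range(n) if H[i][j]] for i in range(m)]
    # Initialize variable-to-check messages with the channel LLRs.
    vc = {(i, j): llr[j] for i in range(m) for j in checks[i]}
    hard = [1 if x < 0 else 0 for x in llr]
    for _ in range(max_iter):
        # Check-node update: sign product and scaled minimum magnitude
        # over the *other* incoming messages.
        cv = {}
        for i in range(m):
            for j in checks[i]:
                others = [vc[(i, k)] for k in checks[i] if k != j]
                sign = -1 if sum(x < 0 for x in others) % 2 else 1
                cv[(i, j)] = scale * sign * min(abs(x) for x in others)
        # Posterior LLRs and hard decision.
        post = list(llr)
        for (i, j), msg in cv.items():
            post[j] += msg
        hard = [1 if p < 0 else 0 for p in post]
        # Early stop once all parity checks are satisfied.
        if all(sum(hard[j] for j in checks[i]) % 2 == 0 for i in range(m)):
            return hard, True
        # Variable-node update: posterior minus the own incoming message.
        for (i, j), msg in cv.items():
            vc[(i, j)] = post[j] - msg
    return hard, False


# Toy example: (7,4) code, all-zero codeword, one unreliable bit.
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]
llr = [-1.0, 4.0, 4.0, 4.0, 4.0, 4.0, 4.0]  # bit 0 received in error
decoded, ok = min_sum_decode(H, llr, max_iter=16)
```

With denser, longer 5G-style codes and a serial schedule the convergence behavior differs, but the principle is the same: the iteration budget is part of the decoder configuration and must be set per code family, not shared between turbo and LDPC.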
The forum ‘Vienna 5G Link Level Simulator’ is closed to new topics and replies.