Hi,
A student of mine and I found a possibly quite severe bug during routine test evaluations of the PSC toolbox for ns-3.
The evaluation in question is the provided wns3-2017-pssch scenario, which was introduced to evaluate the throughput of a simple sidelink communication under varying configurations.
One of the main parameters analyzed was the sub-band size, which was swept from 2 to 50 RBs in steps of two. The expected behavior would be an increasing throughput with larger sub-band sizes (plotted in Figure 5 of the accompanying publication).
However, the data-rate calculations showed anomalies:
As one can see from the output, the data rate changes in a saw-tooth pattern with respect to the sub-band size, which is clearly odd and differs from the expected behavior.
A rather extensive search identified the tbSize as the main inconsistency: it was reported correctly in the MAC-layer stats, but not in the PHY-layer stats.
After some more digging, we found that the problem stemmed from an incorrect data type cast in the deserialize function of the LTE SL tag.
More specifically, lines 104/105 of the lte-sl-tag.cc file.
The tbSize is cast there to a uint8_t, which wraps around at 255, explaining the drops in data rate as well as the saw-tooth-like behavior.
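To illustrate the effect, here is a minimal, self-contained C++ sketch of the truncation; the values and variable names are hypothetical and only demonstrate the wraparound, not the actual code in lte-sl-tag.cc:

```cpp
#include <cstdint>
#include <iostream>

int main ()
{
  // Hypothetical transport block sizes (bytes) as reported by the MAC layer.
  const uint32_t tbSizes[] = { 120, 250, 256, 300, 500, 1000, 1500 };

  for (uint32_t tbSize : tbSizes)
    {
      // Buggy behavior: the value is squeezed through a uint8_t during
      // tag de-serialization, so anything above 255 wraps around modulo 256.
      uint8_t truncated = static_cast<uint8_t> (tbSize);

      std::cout << "MAC tbSize = " << tbSize
                << " -> PHY stats see " << static_cast<uint32_t> (truncated)
                << std::endl;
    }
  return 0;
}
```

Every time the true TB size crosses a multiple of 256, the value recovered on the PHY side drops back toward zero, which matches the saw-tooth observed over increasing sub-band sizes. The fix is accordingly to carry the field in a type wide enough for the largest LTE transport block (e.g. uint16_t or uint32_t), keeping the tag's Serialize, Deserialize and GetSerializedSize consistent.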
Changing the casts to the correct data type resulted in the following output of the wns3-2017-pssch script:
The new results match the data from the publication and are more in line with the expected behavior of LTE sidelink communication.
Could someone from the official team comment on this possibly quite severe bug in the LTE PSC sidelink toolbox?
This bug was found in the most current version of this toolbox.
Thanks, and many thanks for your work.
Leonard Fisser