SIMO Cooperative Spectrum Sensing Scenario: A Dataset for Testing DNN-Based Models

Citation Author(s):
Salvatore Serrano (University of Messina)
Omar Serghini (Cadi Ayyad University)
Submitted by:
Salvatore Serrano
Last updated:
Tue, 06/13/2023 - 11:40
DOI:
10.21227/g55p-3h60

Abstract 

This dataset focuses on cooperative spectrum sensing in a cognitive radio network, where multiple secondary users collaborate to detect the presence of a primary user. We introduce multiple cooperative spectrum sensing schemes based on a tree deep neural network architecture that incorporates a one-dimensional convolutional neural network and a long short-term memory network. The primary objective of these schemes is to learn the activity pattern of the primary user. The dataset includes instructions for its use, covering how to execute the code in Google Colab, import the IPython notebooks, generate training and test data, train the models, and test the models to obtain ROC curves and probability of detection (Pd) versus SNR curves. The dataset also includes pre-trained models, which can be loaded and evaluated using the provided Python notebook.
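As a rough illustration of such a scheme, the sketch below builds one Conv1D + LSTM branch per secondary user in Keras and merges the branches into a single detector. Layer sizes, kernel widths, and the merge strategy are illustrative assumptions, not the exact architecture used by the authors.

```python
# Illustrative sketch only: one Conv1D + LSTM branch per secondary user,
# merged into a "tree" of branches.  Hyperparameters are assumptions.
from tensorflow import keras
from tensorflow.keras import layers

SAMPLES_PER_WINDOW = 64   # sample length used for the proposed schemes
NUM_SUS = 4               # secondary users u, x, y, z

def su_branch():
    """One Conv1D + LSTM branch processing the samples of a single SU."""
    inp = layers.Input(shape=(SAMPLES_PER_WINDOW, 1))
    x = layers.Conv1D(16, kernel_size=5, activation="relu")(inp)
    x = layers.MaxPooling1D(2)(x)
    x = layers.LSTM(32)(x)
    return inp, x

inputs, features = zip(*(su_branch() for _ in range(NUM_SUS)))
merged = layers.concatenate(list(features))
out = layers.Dense(1, activation="sigmoid")(merged)  # PU present / absent

model = keras.Model(inputs=list(inputs), outputs=out)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```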

Instructions: 
  1. Execute the code in Google Colab, accessible at https://colab.research.google.com/
  2. Import the IPython notebooks into the appropriate environment.
  3. Generate the training data for the proposed deep neural network (DNN) schemes by executing the "DataTrain.ipynb" IPython notebook, and for the benchmark model by executing the "DataTrain_paper.ipynb" IPython notebook.
  4. Create the test data for the proposed DNN schemes by using the "DataTest.ipynb" IPython notebook, and for the benchmark model by executing the "DataTest_paper.ipynb" IPython notebook.
  5. Train the models by executing the code in the "Training.ipynb" IPython notebook.
  6. Test the models, obtain the ROC curves, and calculate Pd versus SNR by executing the code in the "Testing.ipynb" IPython notebook.
  7. To evaluate the pre-trained models stored in the "trained_models" folder, simply load the ".h5" files by running the "Testing.ipynb" IPython notebook with the available test data (see the sketch after this list).
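For orientation, a pre-trained model could be loaded and scored on one SNR-specific test array roughly as follows. The model file name, the companion label array, and the chosen false-alarm operating point are assumptions for illustration; "Testing.ipynb" remains the authoritative evaluation procedure.

```python
# Hypothetical evaluation sketch: load a pre-trained model from
# "trained_models" and score one SNR-specific test array.
import numpy as np
from tensorflow import keras
from sklearn.metrics import roc_curve

model = keras.models.load_model("trained_models/proposed_model.h5")  # assumed file name

X_test = np.load("Xc10_testX.npy")       # test samples at SNR = -10 dB
y_test = np.load("labels10_testX.npy")   # assumed companion label array (0 = PU absent, 1 = PU present)

scores = model.predict(X_test).ravel()
fpr, tpr, thresholds = roc_curve(y_test, scores)

# Probability of detection at a fixed false-alarm probability (e.g. Pfa = 0.1)
pfa_target = 0.1
idx = min(np.searchsorted(fpr, pfa_target), len(tpr) - 1)
print(f"Pd at Pfa={pfa_target}: {tpr[idx]:.3f}")
```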

We divide the data into two categories for different purposes. The first category is used to train and test the proposed DNN schemes. The second category consists of data for the benchmark model.
The reason for this separation is that these datasets require different data preprocessing methods and have distinct input shapes.

 

TRAIN

For the proposed deep neural network schemes, we have generated training and validation data for a scenario involving four secondary users (SUs), denoted by the letters u, x, y, and z. The filename syntax is "Xc[u,x,y,z]_train64x.npy" and "Xc[u,x,y,z]_test64x.npy".
Each input sequence in this dataset has a length of 64 samples.

For the benchmark model, we have generated data with a sample length of 15, for three SUs (x, y, and z) instead of four. The filename syntax is "Xc[x,y,z]_train15x.npy" and "Xc[x,y,z]_test15x.npy".
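The per-SU training arrays can be loaded directly with NumPy, as in the sketch below. The file names follow the syntax stated above; the array shapes are not restated here, so the printout serves only as a sanity check.

```python
# Sketch: load the per-SU training arrays for both data categories.
import numpy as np

# Proposed schemes: four SUs, 64-sample windows
proposed_train = {su: np.load(f"Xc{su}_train64x.npy") for su in ("u", "x", "y", "z")}

# Benchmark model: three SUs, 15-sample windows
benchmark_train = {su: np.load(f"Xc{su}_train15x.npy") for su in ("x", "y", "z")}

for su, arr in proposed_train.items():
    print(f"proposed  SU {su}: shape {arr.shape}")
for su, arr in benchmark_train.items():
    print(f"benchmark SU {su}: shape {arr.shape}")
```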

TEST

The stored arrays for the proposed deep neural network schemes are labeled with X, while the stored arrays for the benchmark model are labeled with P.
The filename syntax is "Xc[SNR]_test[X,P].npy", where the SNR digits represent the negative signal-to-noise ratio (SNR) value. For instance, an array named "Xc10_testX.npy" indicates an SNR value of -10 dB for the proposed deep neural network schemes, while "Xc14_testP.npy" indicates an SNR value of -14 dB for the benchmark model.
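Under this naming convention, the test-file names for a sweep of SNR values can be constructed as in the sketch below; the SNR range shown is an illustrative assumption.

```python
# Sketch: map negative SNR values to test-file names under the syntax
# "Xc[SNR]_test[X,P].npy".  The SNR range below is assumed for illustration.
def test_filename(snr_db: int, model: str = "X") -> str:
    """model: 'X' for the proposed DNN schemes, 'P' for the benchmark."""
    assert snr_db < 0 and model in ("X", "P")
    return f"Xc{abs(snr_db)}_test{model}.npy"

for snr in range(-20, -8, 2):  # e.g. -20 dB up to -10 dB
    print(snr, "->", test_filename(snr, "X"), "/", test_filename(snr, "P"))
```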
