
Author: Ben Said A.
Author: Mohamed A.
Author: Elfouly T.
Author: Harras K.
Author: Wang Z.J.
Available date: 2022-04-21T08:58:28Z
Publication Date: 2017
Publication Name: IEEE Wireless Communications and Networking Conference, WCNC
Resource: Scopus
Identifier: http://dx.doi.org/10.1109/WCNC.2017.7925709
URI: http://hdl.handle.net/10576/30113
Abstract: In this paper, we present a joint compression and classification approach for EEG and EMG signals using deep learning. Specifically, we build our system on the deep autoencoder architecture, which is designed not only to extract discriminant features in the multimodal data representation but also to reconstruct the data from the latent representation using encoder-decoder layers. Since the autoencoder can be seen as a compression approach, we extend it to handle multimodal data at the encoder layer, reconstructed and retrieved at the decoder layer. We show through experimental results that exploiting both multimodal data inter-correlation and intra-correlation 1) significantly reduces signal distortion, particularly at high compression levels, and 2) achieves better accuracy in classifying EEG and EMG signals recorded and labeled according to the sentiments of the volunteer. © 2017 IEEE.
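The architecture described in the abstract, a shared autoencoder whose encoder fuses the EEG and EMG modalities into one latent code used both for reconstruction (compression) and for classification, can be sketched as follows. This is a minimal illustration with assumed, untrained weights and made-up dimensions (the paper's actual layer sizes, training procedure, and dataset are not given here):

```python
import numpy as np

# Illustrative dimensions (assumptions, not from the paper):
# flattened EEG and EMG windows fused at the encoder input.
rng = np.random.default_rng(0)
d_eeg, d_emg, d_latent, n_classes = 64, 32, 16, 3
d_in = d_eeg + d_emg  # multimodal input size

# Random (untrained) weights; training would jointly minimize
# reconstruction error and classification loss on the latent code.
W_enc = rng.normal(0.0, 0.1, (d_in, d_latent))
W_dec = rng.normal(0.0, 0.1, (d_latent, d_in))
W_cls = rng.normal(0.0, 0.1, (d_latent, n_classes))

def forward(eeg, emg):
    x = np.concatenate([eeg, emg])       # fuse modalities at the encoder
    z = np.tanh(x @ W_enc)               # latent (compressed) representation
    x_hat = z @ W_dec                    # decoder reconstructs both signals
    logits = z @ W_cls                   # classifier head on the latent code
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax class scores
    return z, x_hat, probs

z, x_hat, probs = forward(rng.normal(size=d_eeg), rng.normal(size=d_emg))
print(z.shape, x_hat.shape, probs.shape)
```

With these assumed sizes the latent code is 96/16 = 6x smaller than the fused input, which is the sense in which the autoencoder acts as a compressor while the same code feeds the classifier.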
Language: en
Publisher: Institute of Electrical and Electronics Engineers Inc.
Subject: Classification (of information)
Compaction
Data compression
Decoding
Deep learning
Learning systems
mHealth
Signal encoding
Wireless telecommunication systems
Auto encoders
Compression approach
Encoder-decoder
High compressions
Joint compression and classification
Learning approach
Multi-modal data
Show through
Biomedical signal processing
Title: Multimodal deep learning approach for Joint EEG-EMG Data compression and classification
Type: Conference Paper
Access Type: Abstract Only

