Latent Space Transformers for Generalizing Deep Networks

Hamed Farkhari, Joseanne Viana, Nidhi, Luis Miguel Campos, Pedro Sebastiao, Albena Mihovska, Purnima Lala Mehta, Luis Bernardo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Sharing information between deep networks is not a simple task today. In the traditional approach, researchers adapt a pretrained deep network to a new purpose, or build a new network, by retraining its final layers while keeping the remaining layers fixed. In this paper, we propose a novel concept for interoperability between deep networks. Generalizing the usability of such networks will facilitate the creation of new hybrid models, promoting innovation and disruptive use cases for deep networks in fifth-generation (5G) wireless networks and increasing the accessibility, usability, and affordability of these products. The main idea is to use a standard latent space transformation to share information between such networks. First, the creators of each deep network split it into two parts. They then provide access to a standard latent space. Since every deep network must follow this procedure, we suggest a standard for it. With the latent space in place, two deep networks can be combined through a latent transformer block, the only block that needs to be trained when connecting different pretrained deep networks. The combination yields a new network with a unique ability. This paper contributes a concept for the generalization of deep networks using latent transformers, optimizing the utilization of the edge and the cloud in 5G telecommunications, controlling load balancing, saving bandwidth, and decreasing the latency caused by cumbersome computations. We also review current standardization efforts associated with deep networks and Artificial Intelligence in general. Lastly, we present some 5G use cases supporting the proposed concept.
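The mechanism described above can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' implementation: `encoder_a` and `decoder_b` stand in for the two frozen halves of pretrained networks A and B, the target latents are synthetic, and the latent transformer is reduced to a linear map fitted by least squares; all names and dimensions are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Two frozen halves of pretrained networks (random stand-ins) ---
W_enc = rng.normal(size=(16, 8))           # network A: input -> latent space A
def encoder_a(x):
    return np.tanh(x @ W_enc)

W_dec = rng.normal(size=(6, 4))            # network B: latent space B -> output
def decoder_b(z):
    return z @ W_dec

# --- Latent transformer block: the ONLY part that is trained ---
# Fit a linear map from latent space A to latent space B by least squares
# on paired latents; A and B themselves stay frozen throughout.
x_train = rng.normal(size=(100, 16))
z_a = encoder_a(x_train)                   # latents produced by frozen network A
z_b_target = rng.normal(size=(100, 6))     # toy latents that network B expects
W_t, *_ = np.linalg.lstsq(z_a, z_b_target, rcond=None)

def hybrid(x):
    """New combined network: frozen A -> trained transformer -> frozen B."""
    return decoder_b(encoder_a(x) @ W_t)

y = hybrid(rng.normal(size=(5, 16)))
print(y.shape)  # (5, 4)
```

The point of the sketch is the division of labor: only `W_t` is fitted, so connecting two published networks costs a small training job over their latent interfaces rather than retraining either network.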
Original language: English
Title of host publication: 2021 IEEE Conference on Standards for Communications and Networking, CSCN 2021
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Number of pages: 6
ISBN (Electronic): 9781665423496
Publication status: Published - 2021
Event: 2021 IEEE Conference on Standards for Communications and Networking, CSCN 2021 - Virtual, Online, Greece
Duration: 15 Dec 2021 to 17 Dec 2021




  • Deep learning
  • latent space
  • sharing information
  • standardization


