
Towards Green AI: Assessing the Robustness of Conformer and Transformer Models under Compression

Conference: Paper with proceedings at an international conference

Today, transformer and conformer models are commonly used in end-to-end speech recognition. Conformer models generally outperform transformers, but both suffer from large model sizes and high computational cost, making their use environmentally unfriendly.
In this paper, we propose compressing these models using quantization and pruning, evaluating the resulting reductions in size and computing time while monitoring recognition performance. Our experiments on the LibriSpeech dataset confirm that, without compression, conformer models achieve lower error rates than transformers. After compression, however, transformer models maintain more stable performance than conformers. We therefore conclude that transformer models are more robust to compression and better suited to resource-constrained use cases.
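
The abstract names quantization and pruning as the two compression techniques but does not describe the paper's pipeline. The following is a minimal sketch, not the authors' actual setup, showing how both techniques can be applied with PyTorch's built-in utilities; the toy model, layer sizes, and 30% pruning amount are illustrative assumptions.

# Minimal sketch of the two compression techniques named in the abstract,
# using PyTorch utilities. The model below is a hypothetical stand-in for
# a speech-recognition encoder, not the paper's conformer/transformer.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(
    nn.Linear(80, 256),   # e.g. 80 mel filterbank features per frame
    nn.ReLU(),
    nn.Linear(256, 29),   # e.g. character-level output vocabulary
)

# Magnitude pruning: zero out the 30% smallest weights in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # bake the pruning mask into the weights

# Post-training dynamic quantization: store Linear weights as int8,
# shrinking the model and reducing CPU inference cost.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

dummy = torch.randn(1, 80)
print(quantized(dummy).shape)  # torch.Size([1, 29])

Comparing the error rate of the original and compressed models on held-out data, as the paper does on LibriSpeech, is what reveals how robust each architecture is to this kind of compression.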