Fast Variational Block-Sparse Bayesian Learning

Abstract
We propose a variational Bayesian (VB) implementation of block-sparse Bayesian learning (BSBL) to compute proxy probability density functions (PDFs) that approximate the posterior PDFs of the weights and associated hyperparameters in a block-sparse linear model, resulting in an iterative algorithm coined variational BSBL (VA-BSBL). The priors of the hyperparameters are selected to belong to the family of generalized inverse Gaussian distributions. This family contains as special cases commonly used hyperpriors such as the Gamma and inverse Gamma distributions, as well as Jeffreys' improper distribution.
Inspired by previous work on classical sparse Bayesian learning (SBL), we investigate the update stage in which the proxy PDFs of a single block of weights and of its associated hyperparameter are successively updated while the proxy PDFs of the other parameters are kept fixed. This stage defines a nonlinear first-order recurrence relation for the mean of the proxy PDF of the hyperparameter. By iterating this relation ad infinitum, we obtain a criterion that determines whether the resulting sequence of hyperparameter means converges or diverges. Incorporating this criterion into the VA-BSBL algorithm yields a fast implementation, coined fast-BSBL (F-BSBL), which achieves a two-order-of-magnitude runtime improvement.
We further identify the range of parameters of the generalized inverse Gaussian distribution that results in an inherent pruning procedure switching off "weak" components in the model, which is necessary to obtain sparse results. Lastly, we show that the expectation-maximization (EM)-based and VB-based implementations of BSBL are identical methods, thus extending a well-known result from classical SBL to BSBL. Consequently, F-BSBL coincides with BSBL using coordinate ascent to maximize the marginal likelihood. These results provide a unified framework for interpreting existing BSBL methods.
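To illustrate the convergence/divergence criterion described in the abstract (a generic sketch, not the paper's actual update rule), the snippet below iterates a hypothetical nonlinear first-order recurrence gamma_{t+1} = f(gamma_t) for a nonnegative hyperparameter mean and classifies the generated sequence. The maps `f` passed in are illustrative stand-ins for the true hyperparameter-mean update.

```python
def iterate_recurrence(f, gamma0, max_iter=1000, tol=1e-10, div_threshold=1e12):
    """Iterate gamma_{t+1} = f(gamma_t) and classify the sequence.

    Returns ('converged', fixed_point), ('diverged', last_value), or
    ('undecided', last_value) if neither occurs within max_iter steps.
    Assumes a nonnegative hyperparameter mean, so divergence is detected
    as the iterate exceeding div_threshold.
    """
    gamma = gamma0
    for _ in range(max_iter):
        nxt = f(gamma)
        if nxt > div_threshold:
            return "diverged", nxt
        if abs(nxt - gamma) < tol:
            return "converged", nxt
        gamma = nxt
    return "undecided", gamma

# Hypothetical contracting map: the sequence settles to a fixed point (here 2).
status, value = iterate_recurrence(lambda g: 0.5 * g + 1.0, gamma0=10.0)

# Hypothetical expanding map: the sequence grows without bound.
status2, _ = iterate_recurrence(lambda g: 2.0 * g + 1.0, gamma0=1.0)
```

Deciding convergence versus divergence in closed form, rather than by iterating as above, is what allows the fast implementation to skip the inner iterations entirely.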
| Original language | English |
|---|---|
| Pages (from-to) | 4856-4872 |
| Number of pages | 17 |
| Journal | IEEE Transactions on Signal Processing |
| Volume | 73 |
| Early online date | 29 Sept 2025 |
| DOIs | |
| Publication status | Published - 2025 |
Keywords
- multiple measurement vectors (MMV)
- Sparse Bayesian learning (SBL)
- sparse signal recovery
- variational Bayesian inference
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering
Fields of Expertise
- Information, Communication & Computing
Projects
2 Finished

- SEAMAL Front - Securely Applied Machine Learning
  Schreiber, H. (Project manager on research unit), Bischof, H. (Project manager on research unit), Witrisal, K. (Project manager on research unit), Freiberger, G. (Attendee / Assistant) & Schreiber, H. (Consortium manager resp. coordinator of internal research units)
  1/10/20 → 30/09/23
  Project: Research project
- CD-Laboratory for Location-aware Electronic Systems
  Wielandner, L. (Attendee / Assistant), Fuchs, A. (Attendee / Assistant), Venus, A. (Attendee / Assistant), Wilding, T. (Attendee / Assistant), Witrisal, K. (Consortium manager resp. coordinator with external organisations) & Grebien, S. J. (Attendee / Assistant)
  1/01/18 → 31/12/25
  Project: Research project
Research output
1 Doctoral Thesis

- Detection and Estimation of Dispersive Target Signals
  Möderl, J., 5 Sept 2024, 181 p.
  Research output: Thesis › Doctoral Thesis
Open Access