Alex Labach
Verified email at mail.utoronto.ca
Title · Cited by · Year
Survey of dropout methods for deep neural networks
A Labach, H Salehinejad, S Valaee
arXiv preprint arXiv:1904.13310, 2019
Cited by 216 · 2019
Survey of dropout methods for deep neural networks. arXiv 2019
A Labach, H Salehinejad, S Valaee
arXiv preprint arXiv:1904.13310, 2019
Cited by 17 · 2019
A framework for neural network pruning using Gibbs distributions
A Labach, S Valaee
GLOBECOM 2020-2020 IEEE Global Communications Conference, 1-6, 2020
Cited by 7 · 2020
Survey of dropout methods for deep neural networks (2019)
A Labach, H Salehinejad, S Valaee
arXiv preprint arXiv:1904.13310, 2019
Cited by 6 · 2019
Duett: Dual event time transformer for electronic health records
A Labach, A Pokhrel, XS Huang, S Zuberi, SE Yi, M Volkovs, T Poutanen, ...
Machine Learning for Healthcare Conference, 403-422, 2023
Cited by 3 · 2023
MultiResFormer: Transformer with Adaptive Multi-Resolution Modeling for General Time Series Forecasting
L Du, J Xin, A Labach, S Zuberi, M Volkovs, RG Krishnan
arXiv preprint arXiv:2311.18780, 2023
2023
Effective Self-Supervised Transformers For Sparse Time Series Data
A Labach, A Pokhrel, SE Yi, S Zuberi, M Volkovs, RG Krishnan
2022
Regularizing Neural Networks by Stochastically Training Layer Ensembles
A Labach, S Valaee
2020 IEEE 30th International Workshop on Machine Learning for Signal …, 2020
2020
Neural network sparsification using Gibbs measures
A Labach, S Valaee
Edge Intelligence Workshop 711 (23), 10, 2020
2020
Gibbs Pruning: A Framework for Structured and Unstructured Neural Network Pruning
AJ Labach
University of Toronto (Canada), 2020
2020