Llion Jones
Verified email at google.com
Title · Cited by · Year
Attention is all you need
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Advances in neural information processing systems 30, 2017
53565 · 2017
Natural questions: a benchmark for question answering research
T Kwiatkowski, J Palomaki, O Redfield, M Collins, A Parikh, C Alberti, ...
Transactions of the Association for Computational Linguistics 7, 453-466, 2019
994 · 2019
Tensor2tensor for neural machine translation
A Vaswani, S Bengio, E Brevdo, F Chollet, AN Gomez, S Gouws, L Jones, ...
arXiv preprint arXiv:1803.07416, 2018
505 · 2018
The best of both worlds: Combining recent advances in neural machine translation
MX Chen, O Firat, A Bapna, M Johnson, W Macherey, G Foster, L Jones, ...
arXiv preprint arXiv:1804.09849, 2018
398 · 2018
Attention is all you need. arXiv 2017
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
383 · 2017
ProtTrans: towards cracking the language of life's code through self-supervised deep learning and high performance computing
A Elnaggar, M Heinzinger, C Dallago, G Rehawi, Y Wang, L Jones, ...
IEEE transactions on pattern analysis and machine intelligence, 2021
317 · 2021
One model to learn them all
L Kaiser, AN Gomez, N Shazeer, A Vaswani, N Parmar, L Jones, ...
arXiv preprint arXiv:1706.05137, 2017
309 · 2017
Advances in neural information processing systems
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
Red Hook: Curran Associates, Inc, 5998-6008, 2017
291 · 2017
Character-level language modeling with deeper self-attention
R Al-Rfou, D Choe, N Constant, M Guo, L Jones
Proceedings of the AAAI conference on artificial intelligence 33 (01), 3159-3166, 2019
279 · 2019
Attention is all you need. 2017
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
210 · 2017
Lingvo: a modular and scalable framework for sequence-to-sequence modeling
J Shen, P Nguyen, Y Wu, Z Chen, MX Chen, Y Jia, A Kannan, T Sainath, ...
arXiv preprint arXiv:1902.08295, 2019
148 · 2019
Wikireading: A novel large-scale language understanding task over wikipedia
D Hewlett, A Lacoste, L Jones, I Polosukhin, A Fandrianto, J Han, ...
arXiv preprint arXiv:1608.03542, 2016
146 · 2016
Attention is all you need. CoRR abs/1706.03762 (2017)
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
136 · 2017
Attention is all you need. CoRR
A Vaswani, N Shazeer, N Parmar, J Uszkoreit, L Jones, AN Gomez, ...
arXiv preprint arXiv:1706.03762, 2017
35 · 2017
Accurate supervised and semi-supervised machine reading for long documents
D Hewlett, L Jones, A Lacoste, I Gür
Proceedings of the 2017 Conference on Empirical Methods in Natural Language …, 2017
21 · 2017
CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing
A Elnaggar, W Ding, L Jones, T Gibbs, T Feher, C Angerer, S Severini, ...
arXiv preprint arXiv:2104.02443, 2021
20 · 2021
DF-Conformer: Integrated architecture of Conv-TasNet and Conformer using linear complexity self-attention for speech enhancement
Y Koizumi, S Karita, S Wisdom, H Erdogan, JR Hershey, L Jones, ...
2021 IEEE Workshop on Applications of Signal Processing to Audio and …, 2021
15 · 2021
Byte-level machine reading across morphologically varied languages
T Kenter, L Jones, D Hewlett
Proceedings of the AAAI conference on artificial intelligence 32 (1), 2018
15 · 2018
Attention-based sequence transduction neural networks
NM Shazeer, AN Gomez, LM Kaiser, JD Uszkoreit, LO Jones, NJ Parmar, ...
US Patent 10,452,978, 2019
10 · 2019
Attention-based sequence transduction neural networks
NM Shazeer, AN Gomez, LM Kaiser, JD Uszkoreit, LO Jones, NJ Parmar, ...
US Patent 10,719,764, 2020
2 · 2020
Articles 1–20