Package: attention
Title: Self-Attention Algorithm
Version: 0.4.0
Authors@R: 
    person("Bastiaan", "Quast", , "bquast@gmail.com", role = c("aut", "cre"),
           comment = c(ORCID = "0000-0002-2951-3577"))
Description: Self-Attention algorithm helper functions and demonstration
    vignettes of increasing depth on how to construct the Self-Attention
    algorithm. Based on Vaswani et al. (2017), Dan Jurafsky and James H.
    Martin (2022, ISBN:978-0131873216) "Speech and Language Processing
    (3rd ed.)", and Alex Graves (2020) "Attention and Memory in Deep
    Learning".
License: GPL (>= 3)
Encoding: UTF-8
RoxygenNote: 7.2.3
Suggests: 
    covr,
    knitr,
    rmarkdown,
    testthat (>= 3.0.0)
VignetteBuilder: knitr
Config/testthat/edition: 3
NeedsCompilation: no
Packaged: 2023-11-10 00:29:14 UTC; bquast
Author: Bastiaan Quast [aut, cre]
Maintainer: Bastiaan Quast <bquast@gmail.com>
Repository: CRAN
Date/Publication: 2023-11-10 03:10:02 UTC