Automatic differentiation library written in pure Vim script.
vim-autograd provides a foundation for automatic differentiation using a Define-by-Run style algorithm, as in Chainer or PyTorch. Since it is written entirely in pure Vim script, it has no dependencies.
This library makes it possible to build plugins that perform numerical computation on multidimensional arrays, or deep learning with the gradient descent method.
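As a rough sketch of what gradient descent looks like on top of this API, the snippet below minimizes y = x^2 by repeatedly backpropagating and stepping against the gradient. It reuses the tensor(), p(), backward(), and grad.data interface from the usage example further down; the cleargrad() call and the in-place update of x.data are assumptions about the API, not confirmed calls.

" Gradient-descent sketch: minimize y = x^2 starting from x = 3.
" Assumes x.cleargrad() resets the accumulated gradient and that
" x.data is a list that can be updated in place; the real API may differ.
function! s:descend() abort
  let x = autograd#tensor(3.0)
  let lr = 0.1
  for i in range(50)
    " Forward pass using the documented p() power method: y = x^2
    let y = x.p(2)

    call x.cleargrad()   " assumed helper: clear gradient before backward
    call y.backward()

    " Step against the gradient: x <- x - lr * dy/dx
    let x.data[0] = x.data[0] - lr * x.grad.data[0]
  endfor
  echo x.data            " converges toward [0.0]
endfunction
call s:descend()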
If you are using vim-plug, you can install it as follows.
Plug 'pit-ray/vim-autograd'
If you want to use the more efficient Vim9 script implementation, install the experimental vim9 branch.
Plug 'pit-ray/vim-autograd', {'branch': 'vim9'}
A computational graph is constructed by applying the provided differentiable functions to a Tensor object, and the gradient is calculated by backpropagating from the output.
function! s:f(x) abort
  " y = x^5 - 2x^3
  let y = autograd#sub(a:x.p(5), a:x.p(3).m(2))
  return y
endfunction

function! s:example() abort
  let x = autograd#tensor(2.0)
  let y = s:f(x)
  call y.backward()
  echo x.grad.data
endfunction

call s:example()
Output
[56.0]
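This matches the analytic derivative: for y = x^5 - 2x^3, dy/dx = 5x^4 - 6x^2, which at x = 2 evaluates to 5*16 - 6*4 = 80 - 24 = 56.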
The corresponding computational graph is generated automatically.
The following examples are available:
- Basic differentiation and computational graph visualization
- Higher-order differentiation using double-backprop (a sketch follows this list)
- Classification using deep learning
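To illustrate the double-backprop idea, the sketch below computes the second derivative of the earlier function y = x^5 - 2x^3. It assumes backward() accepts a create_graph-style flag so that the first backward pass itself builds a differentiable graph, as in Chainer and DeZero; the actual vim-autograd signature may differ, and cleargrad() is likewise an assumed helper.

" Double-backprop sketch: d2y/dx2 of y = x^5 - 2x^3 at x = 2.
" Assumes y.backward(1) records the backward pass as a new graph
" (create_graph behavior) and that x.cleargrad() exists.
function! s:second_derivative() abort
  let x = autograd#tensor(2.0)
  let y = autograd#sub(x.p(5), x.p(3).m(2))

  call y.backward(1)   " assumed create_graph-style flag
  let gx = x.grad      " gx = 5x^4 - 6x^2, itself differentiable

  call x.cleargrad()   " assumed: reset before the second pass
  call gx.backward()
  echo x.grad.data     " expects [136.0], since 20x^3 - 12x = 136 at x = 2
endfunction
call s:second_derivative()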
Related projects and references:
- oreilly-japan/deep-learning-from-scratch-3
- chainer/chainer
- pytorch/pytorch
- numpy/numpy
- mattn/vim-brain
This library is provided under the MIT License.
- pit-ray