author    Vsevolod Stakhov <vsevolod@rspamd.com>
          Tue, 20 Jan 2026 14:20:51 +0000 (14:20 +0000)
committer Vsevolod Stakhov <vsevolod@rspamd.com>
          Tue, 20 Jan 2026 14:20:51 +0000 (14:20 +0000)
commit    24fcf1f82beb35638e302aac080a1db3e43b14cf
tree      9bc94b9640b77a505caf1e80498aac38eca25a40
parent    f4cfde49ec22fa4ce4bc9724b9391700e5ec80aa
[Feature] Add GELU activation and expose dropout in KANN bindings

- Implement GELU (Gaussian Error Linear Unit) activation function
  using erf: GELU(x) = 0.5 * x * (1 + erf(x / sqrt(2)))
- Add proper forward and backward passes for GELU (sketched after this list)
- Register GELU as operation #37 in kad_op_list
- Expose dropout layer to Lua (function existed but wasn't registered)
- Add Lua bindings for rspamd_kann.transform.gelu (usage sketch below)
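
For reference, a minimal per-element sketch of the op's math in C
(illustrative only: the real operation in kautodiff.c works on node
buffers and chains the incoming gradient, and the kad_gelu_* names
here are made up for the sketch):

    #include <math.h>

    /* Forward: GELU(x) = 0.5 * x * (1 + erf(x / sqrt(2))) */
    static inline float kad_gelu_fwd(float x)
    {
        return 0.5f * x * (1.0f + erff(x * 0.70710678f)); /* x / sqrt(2) */
    }

    /* Backward: dGELU/dx = Phi(x) + x * phi(x), where Phi(x) is the
     * standard normal CDF (the 0.5 * (1 + erf(...)) term above) and
     * phi(x) its density exp(-x^2/2) / sqrt(2*pi). */
    static inline float kad_gelu_grad(float x)
    {
        float cdf = 0.5f * (1.0f + erff(x * 0.70710678f));
        float pdf = 0.39894228f * expf(-0.5f * x * x); /* 1/sqrt(2*pi) */
        return cdf + x * pdf;
    }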

GELU often performs better than ReLU in transformer-like architectures
and on high-dimensional embedding inputs: it is smooth and scales inputs
by magnitude rather than hard-thresholding them at zero.
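
A hypothetical end-to-end use from Lua, following the existing
rspamd_kann layer/transform conventions (the input/dense/cost calls
mirror current rspamd code; the dropout rate argument is an assumption):

    local kann = require "rspamd_kann"

    -- tiny MLP: input -> dense -> GELU -> dropout -> cost
    local t = kann.layer.input(128)
    t = kann.layer.dense(t, 64)
    t = kann.transform.gelu(t)     -- new binding from this commit
    t = kann.layer.dropout(t, 0.3) -- now registered; 0.3 = assumed rate arg
    t = kann.layer.cost(t, 1, kann.cost.ceb_neg)
    local net = kann.new.kann(t)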
contrib/kann/kautodiff.c
contrib/kann/kautodiff.h
src/lua/lua_kann.c