[Feature] Add GELU activation and expose dropout in KANN bindings
- Implement GELU (Gaussian Error Linear Unit) activation function
using erf: GELU(x) = 0.5 * x * (1 + erf(x / sqrt(2)))
- Add proper forward and backward passes for GELU (the value and its
derivative are sketched at the end of this message)
- Register GELU as operation #37 in kad_op_list
- Expose dropout layer to Lua (function existed but wasn't registered)
- Add Lua bindings for rspamd_kann.transform.gelu
GELU is often better than ReLU for transformer-like architectures
and high-dimensional embedding inputs.
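
For reference, a minimal standalone C sketch of the GELU value and its
derivative as used by a forward/backward pass. This is illustrative only
and not the actual kad_op_* code from this commit; the function names are
made up for the example. It uses the exact erf-based form from the formula
above rather than the common tanh approximation.

    /* gelu_sketch.c -- illustrative, not the code in this commit.
     * Forward:  GELU(x)  = 0.5 * x * (1 + erf(x / sqrt(2)))
     * Backward: GELU'(x) = Phi(x) + x * phi(x), where Phi is the standard
     *           normal CDF (the 0.5 * (1 + erf(...)) term) and phi its
     *           density; a backward pass accumulates this times the
     *           incoming gradient.
     */
    #include <math.h>
    #include <stdio.h>

    static float gelu(float x)
    {
        return 0.5f * x * (1.0f + erff(x * 0.70710678f)); /* 1/sqrt(2) */
    }

    static float gelu_grad(float x)
    {
        float cdf = 0.5f * (1.0f + erff(x * 0.70710678f));
        float pdf = expf(-0.5f * x * x) * 0.39894228f;    /* 1/sqrt(2*pi) */
        return cdf + x * pdf;
    }

    int main(void)
    {
        const float xs[] = { -2.0f, -0.5f, 0.0f, 0.5f, 2.0f };
        for (int i = 0; i < 5; ++i)
            printf("x=% .1f  gelu=% .6f  gelu'=% .6f\n",
                   xs[i], gelu(xs[i]), gelu_grad(xs[i]));
        return 0;
    }

Build with "cc gelu_sketch.c -lm" to print a few sample values.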