Add recurrent SNN docs, tests, and benchmark scaffold#3140

Open
Ikaikaalika wants to merge 1 commit into ml-explore:main from Ikaikaalika:snn-contrib-bootstrap

Conversation

@Ikaikaalika
Summary

  • Add a recurrent-temporal usage section to the NN docs, including a chunked streaming pattern that carries hidden state across chunks
  • Add recurrent regression tests covering dtype propagation, gradient parity, and numerical finiteness on long sequences
  • Add a benchmark scaffold comparing unrolled tanh RNN and LIF hard-reset workloads across configurable shapes and dtypes
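To make the two ideas above concrete, here is a framework-agnostic sketch in plain NumPy (not the actual MLX API; all names are illustrative) of an LIF hard-reset recurrent step and of the chunked streaming pattern, verifying that streaming in chunks with carried state matches a full unrolled pass:

```python
# Illustrative sketch only: an LIF cell with hard reset, plus chunked
# streaming where (membrane, spike) state is carried across chunks.
# Names and shapes are hypothetical, not taken from this PR's code.
import numpy as np

def lif_step(x_t, state, w_in, w_rec, decay=0.9, threshold=1.0):
    """One LIF step. Hard reset: fired neurons' membranes drop to zero."""
    v, s = state
    v = decay * v + x_t @ w_in + s @ w_rec   # leaky integration + recurrence
    s = (v >= threshold).astype(x_t.dtype)   # spike where threshold crossed
    v = v * (1.0 - s)                        # hard reset of fired neurons
    return s, (v, s)

def run(x, state, w_in, w_rec):
    """Unroll lif_step over the leading (time) axis of x."""
    outs = []
    for t in range(x.shape[0]):
        out, state = lif_step(x[t], state, w_in, w_rec)
        outs.append(out)
    return np.stack(outs), state

rng = np.random.default_rng(0)
T, B, D, H = 8, 2, 4, 6                      # time, batch, input, hidden
x = rng.standard_normal((T, B, D)).astype(np.float32)
w_in = rng.standard_normal((D, H)).astype(np.float32)
w_rec = (0.1 * rng.standard_normal((H, H))).astype(np.float32)
init = (np.zeros((B, H), np.float32), np.zeros((B, H), np.float32))

full, _ = run(x, init, w_in, w_rec)          # one unrolled pass

# Chunked streaming: process the sequence in pieces, carrying state.
state, chunks = init, []
for start in range(0, T, 3):
    out, state = run(x[start:start + 3], state, w_in, w_rec)
    chunks.append(out)
streamed = np.concatenate(chunks)

assert np.array_equal(full, streamed)        # streaming matches full unroll
```

Because the per-step arithmetic is identical and state is threaded through, the chunked pass is bitwise-equal to the full unroll; the same invariant is what a streaming-pattern doc section relies on.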

Validation

  • python -m pytest python/tests/test_nn.py -k "recurrent_dtype_propagation or recurrent_gradient_parity or recurrent_long_sequence_stability" -q
  • python benchmarks/python/recurrent_snn_bench.py --batch-sizes 1 --sequence-lengths 8 --hidden-sizes 16 --input-size 8 --dtypes float32 --warmup 1 --iters 2

Notes

  • Benchmark sample output artifacts under benchmarks/python/results/ were intentionally left uncommitted
