RNN Facade
Network Configuration
Input Size:
Hidden Sizes (comma sep):
Output Size:
Learning Rate:
Cell Type: Simple RNN | LSTM | GRU
Activation: Sigmoid | Tanh | ReLU
Output Activation: Sigmoid | Tanh | Softmax | Linear
Loss Function: MSE | Cross-Entropy
Gradient Clip:
BPTT Steps (0=full):
Create Network
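The form above maps onto a single configuration object. A minimal TypeScript sketch of that mapping, assuming a hypothetical `NetworkConfig` shape; every name below is illustrative, not the page's actual API:

```typescript
// Hypothetical shape of the "Create Network" form; every name is illustrative.
interface NetworkConfig {
  inputSize: number;
  hiddenSizes: number[];        // "Hidden Sizes (comma sep)" parsed into numbers
  outputSize: number;
  learningRate: number;
  cellType: "rnn" | "lstm" | "gru";
  activation: "sigmoid" | "tanh" | "relu";
  outputActivation: "sigmoid" | "tanh" | "softmax" | "linear";
  loss: "mse" | "cross-entropy";
  gradientClip: number;         // 0 disables clipping
  bpttSteps: number;            // 0 = full backpropagation through time
}

// Parse the comma-separated hidden sizes exactly as the form field expects.
function parseHiddenSizes(field: string): number[] {
  return field.split(",").map(s => Number(s.trim())).filter(n => n > 0);
}

const config: NetworkConfig = {
  inputSize: 4,
  hiddenSizes: parseHiddenSizes("16, 8"),
  outputSize: 3,
  learningRate: 0.01,
  cellType: "lstm",
  activation: "tanh",
  outputActivation: "softmax",
  loss: "cross-entropy",
  gradientClip: 5,
  bpttSteps: 0,
};
```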
Regularization
Dropout Rate (0-1):
Save/Load Network
Save Network (.json)
Load Network (.json)
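Saving and loading presumably round-trips the network through JSON. A sketch of that round trip in the browser, with assumed `toJSON`/`fromJSON` hooks (guesses, not the tool's documented surface):

```typescript
// Assumed serialization hooks; the real facade may name these differently.
interface SerializableRNN {
  toJSON(): object;
}
declare function fromJSON(data: object): SerializableRNN;
declare const rnn: SerializableRNN;

// Save: serialize and trigger a .json download.
function saveNetwork(filename = "network.json"): void {
  const blob = new Blob([JSON.stringify(rnn.toJSON(), null, 2)], {
    type: "application/json",
  });
  const link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = filename;
  link.click();
  URL.revokeObjectURL(link.href);
}

// Load: read a user-selected file and rebuild the network.
async function loadNetwork(file: File): Promise<SerializableRNN> {
  return fromJSON(JSON.parse(await file.text()));
}
```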
Load Training Data
Load CSV File
Or paste CSV data (each row: input columns followed by target columns):
Load Pasted CSV
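The paste box's layout (input columns followed by target columns in each row) parses straightforwardly. A self-contained sketch; `parsePastedCSV` is a hypothetical helper:

```typescript
// Split each CSV row into input columns followed by target columns,
// matching the layout the paste box describes.
interface Sample { input: number[]; target: number[]; }

function parsePastedCSV(text: string, inputSize: number, outputSize: number): Sample[] {
  return text
    .split(/\r?\n/)
    .map(line => line.trim())
    .filter(line => line.length > 0)
    .map(line => {
      const values = line.split(",").map(Number);
      if (values.length !== inputSize + outputSize || values.some(Number.isNaN)) {
        throw new Error(`Bad row "${line}": expected ${inputSize + outputSize} numeric columns`);
      }
      return {
        input: values.slice(0, inputSize),
        target: values.slice(inputSize),
      };
    });
}

// Example: 4 input columns, 2 target columns per row.
const samples = parsePastedCSV("0.1,0.2,0.3,0.4,1,0\n0.5,0.6,0.7,0.8,0,1", 4, 2);
```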
Generate Test Data
Samples per class:
Train Network
Epochs:
Batch Size:
Validation Split:
Log Every N Epochs:
Train RNN
Train with Progress
Stop
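The training form fields suggest an options object like the one below. All names are assumptions; the `AbortSignal` shows one common way a Stop button can cancel a long-running loop:

```typescript
// Assumed training surface; field names mirror the form, not a documented API.
interface TrainOptions {
  epochs: number;
  batchSize: number;
  validationSplit: number; // fraction of samples held out, e.g. 0.2
  logEvery: number;        // "Log Every N Epochs"
  onProgress?: (epoch: number, loss: number, valLoss: number) => void;
  signal?: AbortSignal;    // lets the Stop button cancel a long run
}

declare const samples: { input: number[]; target: number[] }[];
declare const rnn: {
  train(data: typeof samples, opts: TrainOptions): Promise<void>;
};

const controller = new AbortController(); // wire Stop to controller.abort()

async function runTraining(): Promise<void> {
  await rnn.train(samples, {
    epochs: 200,
    batchSize: 16,
    validationSplit: 0.2,
    logEvery: 10,
    onProgress: (epoch, loss, valLoss) =>
      console.log(`epoch ${epoch}: loss=${loss.toFixed(4)} val=${valLoss.toFixed(4)}`),
    signal: controller.signal,
  });
}
```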
🔧 RNN Facade API Explorer
Use the facade to inspect and modify the RNN internals:
Timestep:
Layer Index:
Neuron Index:
Weight Index:
Gate Type: Forget (f) | Input (i) | Output (o) | Cell Candidate (c̃) for LSTM; Update (z) | Reset (r) | Hidden Candidate (h̃) for GRU
Time-Step & Sequence Access
Get Hidden State
Get Output
Get Input Vector
Get Pre-Activation
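A sketch of how these per-timestep accessors might look, assuming zero-based timestep and layer indices; the signatures are guesses modeled on the button labels:

```typescript
// Hypothetical per-timestep accessors matching the buttons above.
interface TimeStepAccess {
  getHiddenState(t: number, layer: number): number[];
  getOutput(t: number): number[];
  getInputVector(t: number): number[];
  getPreActivation(t: number, layer: number): number[];
}
declare const rnn: TimeStepAccess;

const t = 2, layer = 0;
console.log("h[t]:", rnn.getHiddenState(t, layer));
console.log("y[t]:", rnn.getOutput(t));
console.log("x[t]:", rnn.getInputVector(t));
console.log("pre-activation:", rnn.getPreActivation(t, layer));
```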
Cell State & Gate Access (LSTM/GRU)
Get Cell State
Get Gate Value
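Gate access presumably takes the gate code from the dropdown above. A hypothetical sketch, e.g. checking how strongly the forget gate retains memory:

```typescript
// LSTM gates: forget (f), input (i), output (o), cell candidate (c);
// GRU gates: update (z), reset (r), hidden candidate (h).
type GateType = "f" | "i" | "o" | "c" | "z" | "r" | "h";

interface GateAccess {
  getCellState(t: number, layer: number): number[];             // LSTM only
  getGateValue(t: number, layer: number, gate: GateType): number[];
}
declare const rnn: GateAccess;

// Inspect how strongly the forget gate is retaining memory at timestep 3.
const forget = rnn.getGateValue(3, 0, "f");
const meanForget = forget.reduce((a, b) => a + b, 0) / forget.length;
console.log(`mean forget activation: ${meanForget.toFixed(3)}`);
```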
Gradients & Optimizer
Get Weight Gradient
Get Bias Gradient
Get Cell Gradient
Get Optimizer State
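A hypothetical sketch of the gradient and optimizer accessors, keyed by the layer/neuron/weight indices entered above:

```typescript
// Hypothetical gradient/optimizer accessors; all signatures are guesses.
interface GradientAccess {
  getWeightGradient(layer: number, neuron: number, weight: number): number;
  getBiasGradient(layer: number, neuron: number): number;
  getCellGradient(t: number, layer: number): number[];
  getOptimizerState(layer: number): Record<string, unknown>;
}
declare const rnn: GradientAccess;

console.log("dW:", rnn.getWeightGradient(0, 1, 2));
console.log("db:", rnn.getBiasGradient(0, 1));
console.log("dc[t]:", rnn.getCellGradient(2, 0));
console.log("optimizer:", rnn.getOptimizerState(0));
```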
Sequence APIs
Get Sequence Outputs
Get Sequence Hidden
Get Sequence Cells
Get Sequence Gates
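The sequence APIs presumably return one entry per timestep. An assumed sketch, e.g. tracing a single hidden unit across the whole sequence:

```typescript
// Whole-sequence views: one entry per timestep (assumed layout).
interface SequenceAccess {
  getSequenceOutputs(): number[][];             // [t][outputIndex]
  getSequenceHidden(layer: number): number[][]; // [t][neuronIndex]
  getSequenceCells(layer: number): number[][];  // LSTM only
  getSequenceGates(layer: number, gate: string): number[][];
}
declare const rnn: SequenceAccess;

// E.g. trace one hidden unit across the whole sequence.
const unit0 = rnn.getSequenceHidden(0).map(h => h[0]);
console.log("h_0 over time:", unit0);
```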
Dropout & Regularization
Dropout Rate:
Set Dropout
Get Dropout Mask
Get LayerNorm Stats
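A sketch of the regularization hooks, assuming the dropout mask is a 0/1 vector per unit and that layer-norm statistics are exposed per layer; all names are guesses:

```typescript
// Hypothetical regularization hooks; layer-norm stats assume the facade
// exposes per-layer mean/variance when layer normalization is enabled.
interface RegularizationAccess {
  setDropout(rate: number): void;                     // 0..1
  getDropoutMask(t: number, layer: number): number[]; // 0/1 per unit
  getLayerNormStats(layer: number): { mean: number; variance: number };
}
declare const rnn: RegularizationAccess;

rnn.setDropout(0.3);
const mask = rnn.getDropoutMask(0, 0);
console.log(`kept ${mask.filter(m => m === 1).length}/${mask.length} units`);
```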
State Manipulation
Value:
Reset Hidden
Reset Cell
Reset All
Inject Hidden
Set Hidden State
Set Output
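State manipulation is the most invasive part of the facade. A hypothetical sketch, assuming "inject" adds to the current hidden state while "set" overwrites a single component:

```typescript
// Hypothetical state-manipulation surface; "inject" adds to the current
// hidden state, "set" overwrites one component (assumed semantics).
interface StateManipulation {
  resetHidden(layer?: number): void;
  resetCell(layer?: number): void;
  resetAll(): void;
  injectHidden(layer: number, values: number[]): void;
  setHiddenState(t: number, layer: number, neuron: number, value: number): void;
  setOutput(t: number, index: number, value: number): void;
}
declare const rnn: StateManipulation;

rnn.resetAll();                   // clear all recurrent state
rnn.setHiddenState(0, 0, 3, 0.5); // pin one unit, then watch it propagate
```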
Time-Series Diagnostics
Hidden State Histogram
Activation Over Time
Gate Saturation
Gradient Scales
Detect Vanishing
Detect Exploding
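One standard way to implement the vanishing/exploding detectors is to compare gradient norms across timesteps: during backpropagation through time, the gradient reaching early timesteps is a product of many Jacobians, so it shrinks (vanishes) or blows up (explodes) relative to late timesteps. A sketch with illustrative thresholds; `getCellGradient` is the assumed accessor from above:

```typescript
// Compare gradient norms across timesteps; thresholds are illustrative.
declare const rnn: { getCellGradient(t: number, layer: number): number[] };

function gradientNorm(v: number[]): number {
  return Math.sqrt(v.reduce((s, x) => s + x * x, 0));
}

function diagnoseGradients(steps: number, layer = 0): "vanishing" | "exploding" | "ok" {
  const norms = Array.from({ length: steps }, (_, t) =>
    gradientNorm(rnn.getCellGradient(t, layer))
  );
  const first = Math.max(norms[0], 1e-12);               // earliest timestep
  const last = Math.max(norms[norms.length - 1], 1e-12); // latest timestep
  if (first / last < 1e-4) return "vanishing"; // early gradients have died out
  if (first / last > 1e4) return "exploding";  // early gradients blew up
  return "ok";
}
```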
Network Info
Get Layer Count
Get Hidden Size
Get Cell Type
Get Sequence Length
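These read-only topology queries might look like the following; again, the signatures are assumptions:

```typescript
// Read-only topology queries mirroring the buttons above (assumed).
interface NetworkInfo {
  getLayerCount(): number;
  getHiddenSize(layer: number): number;
  getCellType(): "rnn" | "lstm" | "gru";
  getSequenceLength(): number;
}
declare const rnn: NetworkInfo;

console.log(`${rnn.getCellType()} with ${rnn.getLayerCount()} layers, ` +
            `seq len ${rnn.getSequenceLength()}`);
```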
Output
Facade output will appear here...
Predict
Input Sequence (rows separated by ; or newline, values by comma):
0.1,0.2,0.3,0.4;0.2,0.3,0.4,0.5;0.3,0.4,0.5,0.6
Predict Sequence
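The input format is stated above: timesteps separated by ";" or newlines, values within a timestep separated by commas. A self-contained parser plus an assumed `predictSequence` call:

```typescript
// Parse the prediction box's format: timesteps split on ";" or newlines,
// values within a timestep split on commas.
function parseSequence(text: string): number[][] {
  return text
    .split(/[;\n]/)
    .map(row => row.trim())
    .filter(row => row.length > 0)
    .map(row => row.split(",").map(s => Number(s.trim())));
}

declare const rnn: { predictSequence(seq: number[][]): number[][] };

const seq = parseSequence("0.1,0.2,0.3,0.4;0.2,0.3,0.4,0.5;0.3,0.4,0.5,0.6");
console.log(rnn.predictSequence(seq)); // one output vector per timestep (assumed)
```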