Complete LSTM Neural Network Library
A complete, production-ready LSTM (Long Short-Term Memory) neural network library for Rust. Features multiple optimizers, peephole connections, gradient clipping, and a clean API for training and inference.
Add rust-lstm to your Cargo.toml:
[dependencies]
rust-lstm = "0.1.0"
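Equivalently, on a recent toolchain you can add it from the command line:

cargo add rust-lstm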
SGD, Adam, and RMSprop optimizers with configurable learning rates and momentum (construction of all three is sketched just after this list).
Support for peephole LSTM variants, which let the gates read the cell state directly for finer control over timing.
Built-in gradient clipping to prevent exploding gradients during training.
Clean, intuitive API designed for both beginners and advanced users.
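For reference, here is how each optimizer variant is constructed. Adam and RMSprop follow the field names used in the examples below; SGD's fields are an assumption based on the feature list above (configurable learning rate and momentum), so check the crate docs for the exact names:

use rust_lstm::Optimizer;

fn main() {
    // SGD's field names are assumed from the feature list above.
    let _sgd = Optimizer::SGD {
        learning_rate: 0.01,
        momentum: 0.9, // assumed field name
    };
    // Adam and RMSprop match the examples later in this README.
    let _adam = Optimizer::Adam {
        learning_rate: 0.001,
        beta1: 0.9,
        beta2: 0.999,
        epsilon: 1e-8,
    };
    let _rmsprop = Optimizer::RMSprop {
        learning_rate: 0.01,
        decay_rate: 0.9,
    };
}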
Here's a simple example to get you started with rust-lstm:
use rust_lstm::{LSTM, Optimizer};

fn main() {
    // Create an LSTM with 10 input features and 20 hidden units
    let mut lstm = LSTM::new(10, 20);

    // Prepare your training data
    let inputs = vec![
        vec![0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0],
        vec![0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.1],
    ];
    let targets = vec![
        vec![1.0, 0.0],
        vec![0.0, 1.0],
    ];

    // Train the LSTM. Adam is a struct variant, so all of its
    // fields are supplied here (same field set as the later examples).
    let optimizer = Optimizer::Adam {
        learning_rate: 0.001,
        beta1: 0.9,
        beta2: 0.999,
        epsilon: 1e-8,
    };
    for epoch in 0..100 {
        let loss = lstm.train(&inputs, &targets, &optimizer);
        if epoch % 10 == 0 {
            println!("Epoch {}: Loss = {:.4}", epoch, loss);
        }
    }

    // Make predictions
    let predictions = lstm.forward(&inputs[0]);
    println!("Predictions: {:?}", predictions);
}
Using peephole connections and a custom training configuration:
use rust_lstm::{LSTM, LSTMConfig, Optimizer};

fn main() {
    // Configure an LSTM with peephole connections
    let config = LSTMConfig {
        input_size: 50,
        hidden_size: 100,
        output_size: 10,
        use_peepholes: true,
        gradient_clip: Some(5.0),  // clip gradients at a threshold of 5.0
        dropout_rate: Some(0.2),   // apply 20% dropout during training
    };
    let mut lstm = LSTM::with_config(config);

    // Use the Adam optimizer with custom parameters
    let optimizer = Optimizer::Adam {
        learning_rate: 0.001,
        beta1: 0.9,
        beta2: 0.999,
        epsilon: 1e-8,
    };

    // Training loop with batch processing.
    // `training_data`, `validation_data`, and `evaluate` are assumed to be
    // defined elsewhere; a minimal sketch follows this example.
    for epoch in 0..1000 {
        let mut total_loss = 0.0;
        for batch in training_data.batches(32) {
            let loss = lstm.train_batch(&batch.inputs, &batch.targets, &optimizer);
            total_loss += loss;
        }
        let avg_loss = total_loss / training_data.num_batches(32) as f64;
        if epoch % 50 == 0 {
            println!("Epoch {}: Average Loss = {:.6}", epoch, avg_loss);
            // Evaluate on the validation set
            let val_accuracy = evaluate(&lstm, &validation_data);
            println!("Validation Accuracy: {:.2}%", val_accuracy * 100.0);
        }
    }
}
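The loop above leans on training_data, validation_data, and evaluate, which the snippet does not define. Here is one minimal, hypothetical way to back them; TrainingData, Batch, evaluate, and argmax are illustrative names, not rust-lstm APIs, and validation data is held in the same container type:

use rust_lstm::LSTM;

// Hypothetical container for paired sequences; not part of rust-lstm.
struct TrainingData {
    inputs: Vec<Vec<f64>>,
    targets: Vec<Vec<f64>>,
}

struct Batch {
    inputs: Vec<Vec<f64>>,
    targets: Vec<Vec<f64>>,
}

impl TrainingData {
    // Number of batches, rounding the last partial batch up
    fn num_batches(&self, batch_size: usize) -> usize {
        (self.inputs.len() + batch_size - 1) / batch_size
    }

    // Yield batches of up to `batch_size` input/target pairs
    fn batches(&self, batch_size: usize) -> impl Iterator<Item = Batch> + '_ {
        self.inputs
            .chunks(batch_size)
            .zip(self.targets.chunks(batch_size))
            .map(|(i, t)| Batch {
                inputs: i.to_vec(),
                targets: t.to_vec(),
            })
    }
}

// Hypothetical accuracy check: fraction of examples whose predicted
// argmax matches the target argmax.
fn evaluate(lstm: &LSTM, data: &TrainingData) -> f64 {
    let mut correct = 0;
    for (input, target) in data.inputs.iter().zip(&data.targets) {
        let output = lstm.forward(input);
        if argmax(&output) == argmax(target) {
            correct += 1;
        }
    }
    correct as f64 / data.inputs.len() as f64
}

fn argmax(v: &[f64]) -> usize {
    v.iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .map(|(i, _)| i)
        .unwrap_or(0)
}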
Example: Predicting the next value in a sequence:
use rust_lstm::{LSTM, Optimizer};

fn predict_next_values(sequence: &[f64], n_predictions: usize) -> Vec<f64> {
    // Create an LSTM for single-feature sequence prediction
    let mut lstm = LSTM::new(1, 32);

    // Prepare training data with a sliding window
    let window_size = 10;
    let mut inputs = Vec::new();
    let mut targets = Vec::new();
    for i in 0..sequence.len() - window_size {
        let input: Vec<f64> = sequence[i..i + window_size].to_vec();
        let target = sequence[i + window_size];
        inputs.push(input);
        targets.push(vec![target]);
    }

    // Train the model
    let optimizer = Optimizer::RMSprop {
        learning_rate: 0.01,
        decay_rate: 0.9,
    };
    for _epoch in 0..500 {
        lstm.train(&inputs, &targets, &optimizer);
    }

    // Generate predictions autoregressively
    let mut predictions = Vec::new();
    let mut current_sequence = sequence[sequence.len() - window_size..].to_vec();
    for _ in 0..n_predictions {
        let prediction = lstm.forward(&current_sequence)[0];
        predictions.push(prediction);
        // Slide the window forward to include the new prediction
        current_sequence.remove(0);
        current_sequence.push(prediction);
    }
    predictions
}
fn main() {
    // Fibonacci-style series; it must be longer than the 10-step
    // training window above, or the windowing loop underflows
    let time_series = vec![
        1.0, 2.0, 3.0, 5.0, 8.0, 13.0, 21.0, 34.0, 55.0, 89.0, 144.0, 233.0,
    ];
    let future = predict_next_values(&time_series, 5);
    println!("Predicted next 5 values: {:?}", future);
}
Predict stock prices, weather patterns, or any sequential data.
Build text generation, sentiment analysis, or language translation systems.
Create AI-powered music composition and melody generation systems.
Identify unusual patterns in sensor data, network traffic, or system logs (a sketch follows below).
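As a sketch of the anomaly-detection use case, here is a hypothetical helper built only on the LSTM::new, train, and forward calls shown earlier; detect_anomalies and its threshold are illustrative, not part of rust-lstm:

use rust_lstm::{LSTM, Optimizer};

// Flag indices whose observed value deviates from the model's
// one-step prediction by more than `threshold`. Hypothetical helper.
fn detect_anomalies(sequence: &[f64], window_size: usize, threshold: f64) -> Vec<usize> {
    assert!(sequence.len() > window_size);
    let mut lstm = LSTM::new(1, 32);

    // Train a one-step-ahead predictor on the sequence itself
    let mut inputs = Vec::new();
    let mut targets = Vec::new();
    for i in 0..sequence.len() - window_size {
        inputs.push(sequence[i..i + window_size].to_vec());
        targets.push(vec![sequence[i + window_size]]);
    }
    let optimizer = Optimizer::RMSprop {
        learning_rate: 0.01,
        decay_rate: 0.9,
    };
    for _ in 0..200 {
        lstm.train(&inputs, &targets, &optimizer);
    }

    // Points the model cannot predict well are candidate anomalies
    let mut anomalies = Vec::new();
    for (i, window) in inputs.iter().enumerate() {
        let predicted = lstm.forward(window)[0];
        let actual = sequence[i + window_size];
        if (predicted - actual).abs() > threshold {
            anomalies.push(i + window_size);
        }
    }
    anomalies
}

With the threshold tuned to the data's noise level, the returned indices are the positions where the observed value broke from the learned pattern.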