This is an audio engine, inspired by the Web Audio API.
More examples can be found in the /examples directory.
Create an engine
```rust
use rawdio::{connect_nodes, create_engine, Level, Node, Oscillator, Timestamp};

let sample_rate = 44_100;
let (mut context, mut process) = create_engine(sample_rate);
```
Create an oscillator
```rust
let frequency = 440.0;
let output_channel_count = 2;

let mut oscillator = Oscillator::sine(context.as_ref(), frequency, output_channel_count);
```
Set the gain on the oscillator
```rust
let level = Level::from_db(-3.0);

oscillator
    .gain()
    .set_value_at_time(level.as_gain(), Timestamp::zero());
```
Connect to output
```rust
connect_nodes!(oscillator => "output");
```
Start the context
```rust
context.start();
```
Run the process to get samples. How you drive this depends on whether you want to run the engine in real time (e.g. using CPAL) or offline (e.g. rendering to a file). The engine doesn't make any assumptions, and will simply wait to be asked to process.

All audio buffers are assumed to be non-interleaved, so if the audio that comes from your soundcard is interleaved, it will need to be de-interleaved first (a sketch of one way to do this follows the example below).
```rust
let input_buffer = /* create an input buffer */;
let mut output_buffer = /* create an audio buffer */;

process.process(&mut output_buffer);
```
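As a rough illustration of what de-interleaving involves, here is a minimal, self-contained sketch. It uses plain `Vec<f32>` buffers rather than rawdio's own buffer types, and the `deinterleave` helper is purely illustrative:

```rust
/// Split an interleaved buffer (L R L R ...) into one Vec per channel.
/// This is a standalone sketch and does not use rawdio's buffer types.
fn deinterleave(interleaved: &[f32], channel_count: usize) -> Vec<Vec<f32>> {
    let frame_count = interleaved.len() / channel_count;
    let mut channels = vec![Vec::with_capacity(frame_count); channel_count];

    for frame in interleaved.chunks_exact(channel_count) {
        for (channel, &sample) in channels.iter_mut().zip(frame) {
            channel.push(sample);
        }
    }

    channels
}

fn main() {
    // Two interleaved stereo frames: [L0, R0, L1, R1]
    let interleaved = [0.1_f32, -0.1, 0.2, -0.2];
    let channels = deinterleave(&interleaved, 2);

    println!("left:  {:?}", channels[0]); // [0.1, 0.2]
    println!("right: {:?}", channels[1]); // [-0.1, -0.2]
}
```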
Run an example:

```sh
cd examples && cargo run --bin [example_name] [example_args]
```

Run the tests:

```sh
cargo test
```

Run the benchmarks:

```sh
cargo bench
```
The engine won't make any assumptions about how it is going to be run. It can be run in real time, for example using CPAL, or offline, for example processing audio from files using hound. There are examples of both of these in the /examples directory. A rough sketch of the offline direction follows.
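The sketch below shows the shape of the offline direction: interleaving pre-rendered, non-interleaved output and writing it to a WAV file with hound. The planar `Vec<Vec<f32>>` is only a stand-in for whatever buffers you render with the engine, and `output.wav` is an arbitrary file name; this is not rawdio's actual buffer API.

```rust
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    // Stand-in for rendered engine output: one Vec per channel
    // (planar / non-interleaved). Here it is just a second of silence.
    let rendered: Vec<Vec<f32>> = vec![vec![0.0; 44_100], vec![0.0; 44_100]];

    let spec = hound::WavSpec {
        channels: rendered.len() as u16,
        sample_rate: 44_100,
        bits_per_sample: 32,
        sample_format: hound::SampleFormat::Float,
    };

    let mut writer = hound::WavWriter::create("output.wav", spec)?;

    // WAV files store interleaved samples, so interleave the planar
    // channels frame by frame while writing.
    let frame_count = rendered[0].len();
    for frame in 0..frame_count {
        for channel in &rendered {
            writer.write_sample(channel[frame])?;
        }
    }

    writer.finalize()?;
    Ok(())
}
```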
Bear in mind that audio is expected to be de-interleaved. Most soundcards and audio files will be interleaved, so it will need to be converted first.