
Audiofile


The Audiofile object is a data slot for loading sample files - not to be confused with the File object which represents an actual file on your disk.

Creating an Audiofile object

It is one of the 5 complex data types and can be created with Engine.createAndRegisterAudioFile() or fetched from an existing module with AudioSampleProcessor.getAudioFile() - both ways are used in the examples below.

You can also attach a broadcaster to a data slot to get notified about changes using Broadcaster.attachToComplexData().
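
Below is a minimal sketch of how such an attachment could look. It is an assumption-heavy example: the module name "Audio Loop Player1" (borrowed from the setDisplayCallback example further down), the event string "AudioFile.Content" and the listener argument list are not confirmed by this page, so check the Broadcaster documentation for the exact format.

// Create a broadcaster with three (assumed) arguments
const var bc = Engine.createBroadcaster({
	"id": "AudioFileWatcher",
	"args": ["processor", "index", "value"]
});

// Attach it to the content event of the first audio file slot of the
// given module (use "AudioFile.DisplayIndex" for playback position changes)
bc.attachToComplexData("AudioFile.Content", "Audio Loop Player1", 0, "");

// React to the event
bc.addListener("", "print a message when the content changes", function(processor, index, value)
{
	Console.print(processor + " has changed its audio file");
});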

All the examples in this documentation require an audio file slot that is loaded with an actual audio file to operate on. The easiest way to get there is to load the example assets from the snippet browser (once you have downloaded them):

// Load the example assets
FileSystem.loadExampleAssets();

// Grab whatever asset is first
const var firstAsset = Engine.loadAudioFilesIntoPool()[0];
Console.print(firstAsset); // {PROJECT_FOLDER}breakbeat_44k.wav

// Create an audio file slot
const var audioFile = Engine.createAndRegisterAudioFile(0);

// load the first asset
audioFile.loadFile(firstAsset);

// paste in all other code examples now...


Class methods

getContent

Returns the current audio data as array of channels.

AudioFile.getContent()


The getContent method retrieves the current audio data from the audio file and returns it as an array of channels. Each channel is represented as a Buffer object, which is a special float array type in HISE. This method is useful for accessing and manipulating the raw audio data for processing or analysis.

// Assume 'audioFile' is a valid AudioFile object

// Retrieve the current audio content
var audioContent = audioFile.getContent();

// Print the number of channels in the audio file
Console.print("Number of channels: " + audioContent.length);

// Iterate over each channel
for (var i = 0; i < audioContent.length; i++)
{
    var channel = audioContent[i];
    
    // Print the first 10 samples of each channel for inspection
    Console.print("Channel " + (i + 1) + " first 10 samples:");
    for (var j = 0; j < 10; j++)
    {
        Console.print(channel[j]);
    }
}

// Example of processing: Normalize the first channel
var firstChannel = audioContent[0];
var numSamples = firstChannel.length;
var maxSample = 0.0;

// Find the maximum sample value
for (var i = 0; i < numSamples; i++)
{
    if (Math.abs(firstChannel[i]) > maxSample)
    {
        maxSample = Math.abs(firstChannel[i]);
    }
}

// Normalize the first channel
if (maxSample > 0)
{
    for (var i = 0; i < numSamples; i++)
    {
        firstChannel[i] /= maxSample;
    }
    Console.print("First channel has been normalized.");
}
else
{
    Console.print("No need to normalize, max sample is zero.");
}


getCurrentlyDisplayedIndex

Returns the current sample position (from 0 to numSamples).

AudioFile.getCurrentlyDisplayedIndex()


The getCurrentlyDisplayedIndex method retrieves the current sample position within the audio file. This position is represented as an integer ranging from 0 to the total number of samples in the file (numSamples). It is useful for tracking the playback or editing position within the audio file.

Note that for a live update of the playback position, it might be more efficient to register a broadcaster to the complex data event using the AudioFile.DisplayIndex mode.

Example Usage:

// Assume 'audioFile' is a valid AudioFile object

// Retrieve the current sample position
var currentPosition = audioFile.getCurrentlyDisplayedIndex();

// Retrieve the total number of samples in the audio file
var totalSamples = audioFile.getNumSamples();

// Calculate the normalized position (between 0.0 and 1.0)
var normalizedPosition = currentPosition / totalSamples;

const var slider = Content.getComponent("Knob1");

// Set the slider value based on the normalized position
slider.setValue(normalizedPosition);

// Print the current sample position and normalized slider value to the console
Console.print("Current sample position: " + currentPosition);
Console.print("Normalized slider value: " + normalizedPosition);


getCurrentlyLoadedFile

Returns the reference string for the currently loaded file.

AudioFile.getCurrentlyLoadedFile()


The getCurrentlyLoadedFile method returns the reference string for the currently loaded audio file. This reference string indicates the file path or name of the audio file that is currently in use. If the file is located in the AudioFiles folder of your project, the path up to the AudioFiles folder will be represented by the {PROJECT_FOLDER} placeholder. Otherwise, this will return the system-specific absolute path of the audio file that was loaded into the audio file slot.

This is particularly useful for ensuring that file references remain consistent regardless of the actual location of the project on your file system.

If you want to get the actual file object from the return value of this method, it's recommended to use the FileSystem.fromReferenceString() method which takes in either an absolute path or a reference string and hence is the perfect match for the return value.

// Retrieve the reference string for the currently loaded file
const var refString = audioFile.getCurrentlyLoadedFile();

const var actualFile = FileSystem.fromReferenceString(refString, FileSystem.AudioFiles);


getNumSamples

Returns the number of samples.

AudioFile.getNumSamples()


The getNumSamples method returns the total number of samples in the audio file. This is an integer value representing the length of the audio file in terms of the number of discrete audio samples it contains. It is useful for understanding the size of the audio data and for performing operations that depend on the total sample count.

// Assume 'audioFile' is a valid AudioFile object

// Retrieve the total number of samples in the audio file
const var numSamples = audioFile.getNumSamples();

// Retrieve the sample rate of the audio file
const var sampleRate = audioFile.getSampleRate();

// Calculate the duration of the audio file in seconds
const var durationInSeconds = numSamples / sampleRate;

// Print the total number of samples and the duration to the console
Console.print("Total number of samples: " + numSamples);
Console.print("Sample rate: " + sampleRate + " Hz");
Console.print("Duration of the audio file: " + durationInSeconds + " seconds");

// Further usage: Check if the audio file is longer than a specific duration
const var thresholdDuration = 60; // 60 seconds
if (durationInSeconds > thresholdDuration)
{
    Console.print("The audio file is longer than " + thresholdDuration + " seconds.");
}
else
{
    Console.print("The audio file is not longer than " + thresholdDuration + " seconds.");
}


getSampleRate

Returns the sample rate of the audio file.

AudioFile.getSampleRate()


The getSampleRate method returns the sample rate of the audio file. The sample rate is the number of samples per second and is typically measured in Hertz (Hz). This value is crucial for accurately interpreting the timing and pitch of the audio data.

Difference between AudioFile.getSampleRate and Engine.getSampleRate: the former returns the sample rate of the loaded audio file (the rate it was stored with), while the latter returns the sample rate the audio engine is currently running at. If the two differ, the file has to be played back at a different ratio to retain its original speed - the example below calculates that ratio.

// Assume 'audioFile' is a valid AudioFile object

// Retrieve the sample rate of the audio file
var fileSampleRate = audioFile.getSampleRate();
Console.print("Audio file sample rate: " + fileSampleRate + " Hz");

// Retrieve the sample rate of the audio engine
var engineSampleRate = Engine.getSampleRate();
Console.print("Audio engine sample rate: " + engineSampleRate + " Hz");

// This would be the playback speed that is required to play back
// the file in the original speed (most things in HISE that play 
// back samples will do this for you).
var playbackRatio = fileSampleRate / engineSampleRate;


linkTo

Links this audio file to the other audio file slot.

AudioFile.linkTo(var other)
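
This page doesn't include an example for linkTo(), but a minimal sketch could look like this. It assumes the asset-loading code from the top of the page has been pasted in, and that linking makes the second slot mirror the content of the first one - treat it as an illustration rather than a confirmed behaviour description.

// Create a second audio file slot
const var audioFile2 = Engine.createAndRegisterAudioFile(1);

// Link the second slot to the first one
audioFile2.linkTo(audioFile);

// Load the first asset into the first slot
audioFile.loadFile(firstAsset);

// If linking works as assumed above, the second slot
// now reports the same content as the first one
Console.print(audioFile2.getNumSamples());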



loadFile

Loads an audio file from the given reference.

AudioFile.loadFile(String filePath)
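
The reference can be a {PROJECT_FOLDER} reference string (as returned by getCurrentlyLoadedFile()) or, judging by that method's description above, an absolute path. Passing an empty string clears the slot - a trick used by several examples on this page to reset the state before recompiling. A short sketch using the example asset from the top of the page:

// Load a file using a project folder reference string
audioFile.loadFile("{PROJECT_FOLDER}breakbeat_44k.wav");

// Passing an empty string clears the slot again
audioFile.loadFile("");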



setContentCallback

Sets a callback that is being executed when a new file is loaded (or the sample range changed).

AudioFile.setContentCallback(var contentFunction)


The setContentCallback method sets a callback function that is executed whenever a new file is loaded or the sample range changes within the audio file. This callback can be used to perform custom actions or updates in response to these events. The method takes one parameter, contentFunction, which is the function to be called. The function should not take any parameters, but its this object will point to the audio file that has changed.

// Assume 'audioFile' is a valid AudioFile object
// Clear the file slot (so that the content callback fires
// if you recompile after loading it the first time)
audioFile.loadFile("");

// Set the callback
audioFile.setContentCallback(function()
{
	// print out some properties (the this object of the
	// function will point to the audio file)...
	Console.print(this.getCurrentlyLoadedFile()); // {PROJECT_FOLDER}breakbeat_44k.wav
	Console.print(this.getSampleRate()); // 44100.0
	Console.print(this.getNumSamples()); // 830117
});

// load the first asset
audioFile.loadFile(firstAsset);


setDisplayCallback

Sets a callback that is being executed when the playback position changes.

AudioFile.setDisplayCallback(var displayFunction)


The setDisplayCallback method sets a callback function that is executed whenever the playback position changes within the audio file. This can be useful for updating UI elements or performing other actions in response to changes in the playback position. The method takes one parameter, displayFunction, which is the function to be called. The function will have its this object assigned to the audio file and needs to have a single parameter that provides the playback position as a sample index.

Example Usage:

// Grab a reference to a looper that is loaded with a sound 
// (we need it to actually play the sample for this example)
const var AudioLoopPlayer1 = Synth.getAudioSampleProcessor("Audio Loop Player1");

// Grab a reference to the loop audio file slot
const var audioFile = AudioLoopPlayer1.getAudioFile(0);

// register a function to be called whenever the playback position changes
audioFile.setDisplayCallback(function(displayValue)
{
	// Print the normalised position using the sample length
	Console.print(displayValue / this.getNumSamples());
});


setRange

Sets a new sample range.

AudioFile.setRange(int min, int max)


The setRange method sets a new sample range for the audio file. This range is defined by the min and max sample positions, effectively selecting a subset of the audio data. This method provides a scripting option for changing the sample range, which can also be done interactively by dragging the sample area in the Audio Waveform.

Example Usage:

// Paste in the code to load the asset from above...

// clear the audio file (so that it loads the full file again
// when you recompile)
audioFile.loadFile("");

// load the first asset (again)
audioFile.loadFile(firstAsset);

// Define the new sample range
var minSample = 1000;
var maxSample = 5000;

// Grab the full sample data
var fullSample = audioFile.getContent();

Console.print(fullSample[0].length); // 830117

// Set the new sample range using the setRange method
audioFile.setRange(minSample, maxSample);

// Now the getContent() method will only return the slice
// from 1000 to 5000
var slice = audioFile.getContent();

Console.print(slice[0].length); // 4000


update

Sends an update message to all registered listeners.

AudioFile.update()


There are a number of occasions where you want to manually cause a listener callback to be fired again.

This function will send the content change message to all listeners (so if you have assigned a callback using AudioFile.setContentCallback(), it will be called again even if the content hasn't changed).

audioFile.loadFile("");

var counter = 0;

// Set the callback
audioFile.setContentCallback(function()
{
	Console.print(++counter);
});

// load the first asset
audioFile.loadFile(firstAsset);

// The content callback is asynchronous
// so the counter is still 0
Console.print(counter); // 0

// Send the change message to all listeners again
audioFile.update();

// still zero
Console.print(counter); // 0

// The full console output will show that
// the content callback has been executed twice:
// Interface: 0
// Interface: 0
// Interface: 1
// Interface: 2