
Music-Transformer-Generation

About

Transformer-based symbolic music generation, built on Music Transformer and using the REMI MIDI encoding.

Uses MidiTok for the encoding; the model implementation is adapted from MusicTransformer-Pytorch (a tokenization sketch is shown below).

Dataset used: Lakh MIDI Dataset.

Framework: PyTorch.

This is my first project in transformer-based music generation. I received lots of help from the research above, and especially from the code in MusicTransformer-Pytorch. The project is also inspired by PopMAG, which may be my next music generation attempt!
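As a quick illustration of the REMI encoding step, here is a minimal MidiTok sketch. The exact calls differ between MidiTok versions and the file path is only a placeholder, so treat it as illustrative rather than as this repository's code:

```python
from miditok import REMI

# Build a REMI tokenizer with MidiTok's default settings.
tokenizer = REMI()

# Encode a MIDI file into REMI token sequences ("example.mid" is a placeholder).
tokens = tokenizer("example.mid")
print(tokens)
```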

Generated Examples

See the examples directory for MIDI files of varying lengths.

Requirements

Anaconda, PyTorch >= 1.2.0, Python >= 3.6.

Install dependencies with: conda env create --file environment.yaml

How to use:

1. Get the dataset

Download and unzip LMD-full from Lakh MIDI Dataset. Then:
./preprocess.py <midi_files_directory> <processed_dataset_directory>
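The real logic lives in preprocess.py; the sketch below only shows the general idea of walking the MIDI directory, tokenizing each file into REMI token IDs with MidiTok, and saving them to disk. The paths, the JSON output format, and the error handling are assumptions, not the actual script:

```python
import json
from pathlib import Path

from miditok import REMI

tokenizer = REMI()                  # default REMI settings

midi_dir = Path("lmd_full")         # placeholder for <midi_files_directory>
out_dir = Path("processed")         # placeholder for <processed_dataset_directory>
out_dir.mkdir(parents=True, exist_ok=True)

for midi_path in midi_dir.rglob("*.mid"):
    try:
        seqs = tokenizer(midi_path)     # REMI token sequence(s) for this file
    except Exception:
        continue                        # skip files MidiTok cannot parse
    if not isinstance(seqs, list):
        seqs = [seqs]
    ids = [seq.ids for seq in seqs]     # integer token IDs per track
    with open(out_dir / f"{midi_path.stem}.json", "w") as f:
        json.dump(ids, f)
```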

2. Train the model

./train.py <processed_dataset_directory> <checkpoints_directory>
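train.py contains the full training loop; the fragment below is only a generic sketch of the autoregressive objective a Music-Transformer-style model is trained with: predict every token from the ones before it, using shifted inputs/targets and cross-entropy. The model and batch here are placeholders rather than this repository's classes:

```python
import torch.nn.functional as F

def training_step(model, batch, optimizer):
    # batch: LongTensor of token IDs, shape (batch_size, seq_len)
    inputs = batch[:, :-1]    # tokens 0 .. n-2 are fed to the model
    targets = batch[:, 1:]    # tokens 1 .. n-1 are what it must predict
    logits = model(inputs)    # (batch_size, seq_len - 1, vocab_size)
    loss = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        targets.reshape(-1),
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```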

3. Generate

./generate.py <processed_dataset_directory> <checkpoints_directory> --l <max_sequence_length>
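generate.py performs the actual decoding; this is just a schematic of temperature-based autoregressive sampling up to the requested maximum length, with placeholder names for the model and the initial prompt:

```python
import torch

@torch.no_grad()
def sample(model, prompt_ids, max_length, temperature=1.0):
    # prompt_ids: LongTensor of shape (1, prompt_len) holding REMI token IDs
    generated = prompt_ids
    while generated.size(1) < max_length:
        logits = model(generated)                       # (1, cur_len, vocab_size)
        next_logits = logits[:, -1, :] / temperature    # distribution over the next token
        probs = torch.softmax(next_logits, dim=-1)
        next_id = torch.multinomial(probs, num_samples=1)
        generated = torch.cat([generated, next_id], dim=1)
    return generated
```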

Parameters

To change these, just edit utils/constants.py

batch_size = 16
validation_split = .9
shuffle_dataset = True
random_seed = 42
n_layers = 6
num_heads = 8
d_model = 512
dim_feedforward = 512
dropout = 0.1
max_sequence = 2048
rpr = True
ADAM_BETA_1 = 0.9
ADAM_BETA_2 = 0.98
ADAM_EPSILON = 10e-9
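Most of these map directly onto standard PyTorch transformer hyperparameters; rpr presumably toggles the relative positional attention ("relative position representation") introduced in Music Transformer, which the MusicTransformer-Pytorch model implements and vanilla PyTorch layers do not. As a rough, illustrative sketch of the stack these constants describe (not the repository's actual model class):

```python
import torch.nn as nn

# Illustrative only: a plain PyTorch encoder stack built from the constants above.
# The real model adds relative positional attention when rpr = True, which
# nn.TransformerEncoderLayer does not provide.
layer = nn.TransformerEncoderLayer(
    d_model=512,            # d_model
    nhead=8,                # num_heads
    dim_feedforward=512,    # dim_feedforward
    dropout=0.1,            # dropout
)
encoder = nn.TransformerEncoder(layer, num_layers=6)   # n_layers
```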
