WaveRiding consists of a feature extractor, a wave-table audio-visual synthesiser and a Wekinator machine learning model.

The input code is a feature extractor called WaveRider: it extracts the gyroscope data (pitch, roll, yaw) from the iPhone and sends it to Wekinator for machine learning training. The output code is Wavetable-VisualSynth, a wave-table audio-visual synthesiser developed in Max/MSP using real-time DSP and Jitter (Max's visuals library). Machine learning is used as a sound exploration method: a regression-based Wekinator model maps the three gyroscope values (pitch, roll, yaw) onto 13 parameters of the synthesiser. The synth is based on the th.polar.wave~ and th.wave.table objects by Timo Hoogland. Both the input and output devices were developed in Max/MSP, implementing OSC, wavetable synthesis and Jitter for the visuals.

Ensure that the following dependencies are installed:
- Max/MSP (developed and tested on version 8.1)
- PC or Mac (developed and tested on Windows 10 and OSX 10.14.5)
- iOS handset (developed and tested on iPhone 12, iOS 14.8)
- The th.polar.wave~ and th.wave.table objects by Timo Hoogland

Instructions for how to run and use the project:

1. Open the GyrOSC app on your handset. From the bottom menu, select the second icon from the left. Turn the Gyroscope on and switch off everything else. You should now see in yellow that only the Gyroscope has been selected.
2. Return to the IP address page by selecting the first icon from the left (bottom menu). Type your IP address and make sure the port is set to 9999.
3. Open WekinatorProject_waverider.wekproj. You should now see the gyroscope values within the WaveRider_input app in Max/MSP.
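To make the OSC plumbing between the gyroscope source and Wekinator concrete, here is a minimal Python sketch that hand-packs an OSC message and sends three gyroscope-style floats over UDP. It assumes Wekinator's default listening setup (port 6448, address `/wek/inputs`); the pitch/roll/yaw values are made up for illustration, and this is only a stand-in for what the WaveRider patch does inside Max/MSP.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *floats: float) -> bytes:
    # Build a minimal OSC message: padded address string,
    # padded type-tag string (",fff" for three floats),
    # then each argument as a big-endian 32-bit float.
    msg = osc_pad(address.encode())
    msg += osc_pad(("," + "f" * len(floats)).encode())
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

if __name__ == "__main__":
    # Illustrative gyroscope readings (pitch, roll, yaw)
    pitch, roll, yaw = 0.12, -0.45, 1.57
    packet = osc_message("/wek/inputs", pitch, roll, yaw)

    # Wekinator listens for input features on UDP port 6448 by default
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, ("127.0.0.1", 6448))
    sock.close()
```

Wekinator's predicted values come back the same way (by default on `/wek/outputs`), which is how the 13 synthesiser parameters reach the Wavetable-VisualSynth patch.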