Tyler McIntosh
Curriculum Vitae
Education
PhD in Artificial Intelligence and Music
Queen Mary University of London, UK
2022 - Present
- Supervisor: Prof. Simon Dixon
- Research Topic: Expressive Performance Synthesis for Music Generation Systems
MSc in Media and Arts Technology
Queen Mary University of London, UK
2021 - 2022
- Grade: Distinction
- Supervisor: Dr. Mathieu Barthet
- Dissertation: Affective Conditional Modifiers in Adaptive Video Game Music
BA in Sound Design
University of Greenwich, London, UK
2018 - 2021
- Grade: First Class with Honours
- Supervisor: Dr. Angela McArthur
- Dissertation: Exploring the Relationship Between Music and Emotions with Machine Learning
Level 3 Diploma in Music
Kendal College, Kendal, UK
2015 - 2017
- Triple Grade: Distinction, Distinction, Merit
Professional Experience
Demonstrator
Queen Mary University of London, London, UK
October 2022 - Present
- Delivering course content
- Marking assessments
- Conducting lab demonstrations
Website Developer
Coyote Arts, Cumbria, UK
November 2020 - Present
- Full-stack web development
Website Developer
Solfest Music Festival, Cumbria, UK
February 2017 - August 2019
- Front-end web development
Publications
Conference Proceedings
- McIntosh, T., Woscholsk, O., and Barthet, M. (2023). “Affective Conditional Modifiers in Adaptive Video Game Music.” In AudioMostly 2023 (AM ’23), August 30 – September 1, 2023, Edinburgh, Scotland. ACM, New York, NY, USA.
- McIntosh, T., Weinel, J., and Cunningham, S. (2022). “Lundheim: Exploring Affective Audio Techniques in an Action-Adventure Video Game.” In AudioMostly 2022 (AM ’22), September 6–9, 2022, St. Pölten, Austria. ACM, New York, NY, USA.
- McIntosh, T. (2021). “Exploring the Relationship Between Music and Emotions with Machine Learning.” In EVA London 2021 (Electronic Visualisation and the Arts), July 5–9, 2021, London, United Kingdom. ScienceOpen, Berlin.
Portfolio
Neural Scores
Undergraduate dissertation project
This project used electroencephalography (EEG) and machine learning to explore the relationship between music and emotions. By recording and analyzing EEG data from listeners, the project produced a series of emotion graphs that visually depict the complete emotional journey a song evoked in each participant as they listened.