Dr Mathieu Barthet
Senior Lecturer in Digital Media
Email: m.barthet@qmul.ac.uk
Telephone: +44 20 7882 7518
Room Number: Engineering, Eng 111

Research
Research interests: music information retrieval, human-computer interaction, music perception, augmented/virtual reality, internet of musical things.

Key Publications
Cui W, Sarmento P, Barthet M (2024). MoodLoopGP: Generating Emotion-Conditioned Loop Tablature Music with Multi-granular Features. DOI: 10.1007/978-3-031-56992-0_7
Yang S, Barthet M, Reed C et al. (2023). Do You Hear What I Hear? DOI: 10.1109/mc.2023.3315470
Graf M, Barthet M (2023). Combining Vision and EMG-Based Hand Tracking for Extended Reality Musical Instruments. International Symposium on Computer Music Multidisciplinary Research. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/91305
Martelloni A, McPherson AP, Barthet M (2023). Real-time Percussive Technique Recognition and Embedding Learning for the Acoustic Guitar. 24th International Society for Music Information Retrieval Conference. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/89568
Deacon T, Barthet M (2023). Invoke: A Collaborative Virtual Reality Tool for Spatial Audio Production Using Voice-Based Trajectory Sketching. Proceedings of the 18th International Audio Mostly Conference. DOI: 10.1145/3616195.3616211
Frachi Y, Chanel G, Barthet M (2023). Affective gaming using adaptive speed controlled by biofeedback. International Conference on Multimodal Interaction. DOI: 10.1145/3610661.3616124. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/94955
Graf M, Barthet M (2023). Reducing Sensing Errors in a Mixed Reality Musical Instrument. VRST 2023. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/91304
Williams A, Lattner S, Barthet M (2023). Sound-and-Image-informed Music Artwork Generation Using Text-to-Image Models. Music Recommender Systems Workshop at the 17th ACM Conference on Recommender Systems. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/91539
McIntosh T, Woscholski O, Barthet M (2023). Affective Conditional Modifiers in Adaptive Video Game Music. Audio Mostly 2023. DOI: 10.1145/3616195.3616222. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/90808
Loth J, Mamou-Mani A, Barthet M (2023). Playing Style Affects Steel-String Acoustic Guitar Timbre. 3rd International Conference on Timbre. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/89318
Ceriani M, Viola F, Rudan S et al. (2023). Semantic integration of audio content providers through the Audio Commons Ontology. DOI: 10.1016/j.websem.2023.100787
Montague LE, Barthet M (2023). Collaboration on the Tracks: Ethnographically-Informed Design for Computer-Assisted Music Collaboration between Producers and Performers. Creativity and Cognition. DOI: 10.1145/3591196.3596619
Sarmento P, Kumar A, Chen Y-H et al. (2023). GTR-CTRL: Instrument and Genre Conditioning for Guitar-Focused Music Generation with Transformers. DOI: 10.1007/978-3-031-29956-8_17
Adkins S, Sarmento P, Barthet M (2023). LooperGP: A Loopable Sequence Model for Live Coding Performance Using GuitarPro Tablature. DOI: 10.1007/978-3-031-29956-8_1
Deacon T, Healey P, Barthet M (2023). “It’s cleaner, definitely”: Collaborative Process in Audio Production. DOI: 10.1007/s10606-022-09448-1. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/85226
Deacon T, Barthet M (2023). Spatial Design Considerations for Interactive Audio in Virtual Reality. DOI: 10.1007/978-3-031-04021-4_6
Sarmento P, Holmqvist O, Barthet M (2022). Ubiquitous Music in Smart City: Musification of Air Pollution and User Context. Ubiquitous Music Symposium 2022. DOI: 10.5281/zenodo.6842186. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/80062
Graf M, Barthet M (2022). Mixed Reality Musical Interface: Exploring Ergonomics and Adaptive Hand Pose Recognition for Gestural Control. NIME 2022. DOI: 10.21428/92fbeb44.56ba9b93. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/80213
Frachi Y, Takahashi T, Wang F et al. (2022). Design of Emotion-Driven Game Interaction Using Biosignals. DOI: 10.1007/978-3-031-05637-6_10
Wei C, Kronland-Martinet T, Frachi Y et al. (2022). Influence of Music on Perceived Emotions in Film. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/88842
Sarmento P, Kumar A, Carr CJ et al. (2021). DadaGP: A Dataset of Tokenized GuitarPro Songs for Sequence Models. International Society for Music Information Retrieval Conference. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/75792
Löbbers S, Barthet M, Fazekas G (2021). Sketching sounds: an exploratory study on sound-shape associations. International Computer Music Conference. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/73470
Yang S, Reed CN, Chew E et al. (2021). Examining Emotion Perception Agreement in Live Music Performance. DOI: 10.1109/TAFFC.2021.3093787. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/76949
Graf M, Opara HC, Barthet M (2021). An Audio-Driven System for Real-Time Music Visualisation. Audio Engineering Society Convention 150. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/72837
Martelloni A, McPherson A, Barthet M (2021). Guitar augmentation for Percussive Fingerstyle: Combining self-reflexive practice and user-centred design. NIME 2021. DOI: 10.21428/92fbeb44.2f6db6e6. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/88835
Irvine L, Barthet M (2021). Immersive Spatial Sound Design Using Creative Commons Audio.
Dannemann T, Barthet M (2021). SonicDraw: a web-based tool for sketching sounds and drawings.
Sarmento P, Holmqvist O, Barthet M (2020). Musical Smart City: Perspectives on Ubiquitous Sonification. Ubiquitous Music Workshop. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/65398
Martelloni A, McPherson A, Barthet M (2020). Percussive Fingerstyle Guitar through the Lens of NIME: an Interview Study. New Interfaces for Musical Expression (NIME). QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/66940
Yang S, Barthet M, Chew E (2019). Identifying Listener-informed Features for Modeling Time-varying Emotion Perception. International Symposium on Computer Music Multidisciplinary Research. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/83464
Turchet L, Barthet M (2019). Haptification of performer's control gestures in live electronic music performance. Proceedings of the 14th International Audio Mostly Conference: A Journey in Sound. DOI: 10.1145/3356590.3356629
Bruford F, Barthet M, McDonald S et al. (2019). Modelling Musical Similarity for Drum Patterns. Proceedings of the 14th International Audio Mostly Conference: A Journey in Sound. DOI: 10.1145/3356590.3356611
Stolfi AS, Milo A, Barthet M (2019). Playsound.space: Improvising in the browser with semantic sound objects. DOI: 10.1080/09298215.2019.1649433. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/61778
Bromham G, Moffat D, Barthet M et al. (2019). The Impact of Audio Effects Processing on the Perception of Brightness and Warmth. Audio Mostly. DOI: 10.1145/3356590.3356618. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/59958
Turchet L, Barthet M (2019). An ubiquitous smart guitar system for collaborative musical practice. DOI: 10.1080/09298215.2019.1637439. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/61779
Turchet L, Barthet M. Smart Musical Instruments. DOI: 10.4324/9781315106359-11
Xambó A, Font F, Fazekas G et al. Leveraging Online Audio Commons Content for Media Production. DOI: 10.4324/9781315106335-10
Deacon T, Bryan-Kinns N, Healey PGT et al. (2019). Shaping Sounds. Proceedings of the 2019 on Creativity and Cognition. DOI: 10.1145/3325480.3325493
Weaver J, Barthet M, Chew E (2019). Filling the space: The impact of convolution reverberation time on note duration and velocity in duet performance.
Bruford F, McDonald ST, Barthet M et al. (2019). Groove explorer: An intelligent visual interface for drum loop library navigation.
Turchet L, Barthet M (2019). Smart Musical Instruments: Key Concepts and Do-It-Yourself Tutorial. DOI: 10.4324/9781315106359-11
Turchet L, Barthet M (2018). Co-design of Musical Haptic Wearables for Electronic Music Performer's Communication. DOI: 10.1109/THMS.2018.2885408. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/54063
Turchet L, Barthet M (2018). Jamming with a Smart Mandolin and Freesound-based Accompaniment. 2018 23rd Conference of Open Innovations Association (FRUCT). DOI: 10.23919/fruct.2018.8588110
Turchet L, Viola F, Fazekas G et al. (2018). Towards a Semantic Architecture for the Internet of Musical Things. 2018 23rd Conference of Open Innovations Association (FRUCT). DOI: 10.23919/fruct.2018.8587917. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/68123
Stolfi A, Sokolovskis J, Gorodscy F et al. (2018). Audio semantics: Online chat communication in open band participatory music performances. DOI: 10.17743/jaes.2018.0048. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/54060
Viola F, Stolfi A, Milo A et al. (2018). Playsound.space. Proceedings of the 1st International Workshop on Semantic Applications for Audio and Music. DOI: 10.1145/3243907.3243908
Turchet L, Fischione C, Essl G et al. (2018). Internet of Musical Things: Vision and Challenges. DOI: 10.1109/ACCESS.2018.2872625. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/54062
Pauwels J, Xambo Sedo A, Roma G et al. (2018). Exploring Real-time Visualisations to Support Chord Learning with a Large Music Collection. 4th Web Audio Conference (WAC). QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/60833
Xambó A, Pauwels J, Roma G et al. (2018). Jam with Jamendo. Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion. DOI: 10.1145/3243274.3243291. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/98977
Turchet L, McPherson A, Barthet M (2018). Real-Time Hit Classification in a Smart Cajón. DOI: 10.3389/fict.2018.00016. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/54064
Goddard C, Barthet M, Wiggins G (2018). Assessing Musical Similarity for Computational Music Creativity. DOI: 10.17743/jaes.2018.0012. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/38044
Turchet L, McPherson A, Barthet M (2018). Co-design of a smart Cajón. DOI: 10.17743/jaes.2018.0007. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/39005
Skach S, Xambó A, Turchet L et al. (2018). Embodied Interactions with E-Textiles and the Internet of Sounds for Performing Arts. ACM Conference on Tangible, Embedded, and Embodied Interactions. DOI: 10.1145/3173225.3173272. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/40883
Weaver J, Barthet M, Chew E (2018). Analysis of piano duo tempo changes in varying convolution reverberation conditions.
Turchet L, Barthet M (2018). Demo of interactions between a performer playing a smart mandolin and audience members using musical haptic wearables.
Xambo A, Roma G, Lerch A et al. (2018). Live Repurposing of Sounds: MIR Explorations with Personal and Crowd-sourced Databases. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/98985
de Souza Stolfi A, Turchet L, Ceriani M et al. (2018). Playsound.space: Inclusive free music improvisations using audio commons.
Weisling A, Xambó A, Olowe I et al. (2018). Surveying the compositional and performance practices of audiovisual practitioners. QMRO: https://qmro.qmul.ac.uk/xmlui/handle/123456789/98984
Bromham G, Moffat D, Barthet M et al. (2018). The impact of compressor ballistics on the perceived style of music.
Deacon T, Stockman T, Barthet M (2017). User experience in an interactive music virtual reality system: An exploratory study. DOI: 10.1007/978-3-319-67738-5_12. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/31978
Pigrem J, Barthet M (2017). Datascaping. Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences. DOI: 10.1145/3123514.3123537
Goddard C, Barthet M, Wiggins G (2017). Designing Computationally Creative Musical Performance Systems. Audio Mostly 2017. DOI: 10.1145/3123514.3123541. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/27804
Olowe I, Barthet M, Grierson M (2017). FEATUR.UX.AV. Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences. DOI: 10.1145/3123514.3123561
Herremans D, Yang S, Chuan C-H et al. (2017). IMMA-Emo. Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences. DOI: 10.1145/3123514.3123545
Subramaniam A, Barthet M (2017). Mood Visualiser. Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences. DOI: 10.1145/3123514.3123517
Stolfi A, Barthet M, Goródscy F et al. (2017). Open Band. Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences. DOI: 10.1145/3123514.3123526
Olowe I, Grierson M, Barthet M (2017). User Requirements for Live Sound Visualization System Using Multitrack Audio. Proceedings of the 12th International Audio Mostly Conference on Augmented and Participatory Sound and Music Experiences. DOI: 10.1145/3123514.3123527
Fazekas G, Barthet M, Stockman T (2017). Welcome to Audio Mostly 2017!
Wu Y, Zhang L, Bryan-Kinns N et al. (2017). Open Symphony: Creative Participation for Audiences of Live Music Performances. DOI: 10.1109/MMUL.2017.19. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/22542
Turchet L, Fischione C, Barthet M (2017). Towards the Internet of Musical Things. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/31979
Zhang L, Wu Y, Barthet M (2016). A Web Application for Audience Participation in Live Music Performance: The Open Symphony Use Case. International Conference on New Interfaces for Musical Expression. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/12500
Hayes K, Barthet M, Wu Y et al. (2016). A Participatory Live Music Performance with the Open Symphony System. ACM Conference on Human Factors in Computing Systems (CHI): Interactivity. DOI: 10.1145/2851581.2889471. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/12501
Barthet M, Fazekas G, Thalmann F et al. (2016). Crossroads: Interactive Music Systems Transforming Performance, Production and Listening. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/12502
Barthet M, Fazekas G, Allik A et al. (2016). From interactive to adaptive mood-based music listening experiences in social or personal context. DOI: 10.17743/jaes.2016.0042. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/22527
Saari P, Fazekas G, Eerola T et al. (2016). Genre-Adaptive Semantic Computing and Audio-Based Modelling for Music Mood Annotation. DOI: 10.1109/TAFFC.2015.2462841. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/31904
Olowe I, Moro G, Barthet M (2016). ResidUUm: User mapping and performance strategies for multilayered live audiovisual generation. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/12551
Allik A, Fazekas G, Barthet M et al. (2016). myMoodplay: An interactive mood-based music discovery app.
Barthet M, Fazekas G, Allik A et al. (2015). Moodplay: an interactive mood-based musical experience. DOI: 10.1145/2814895.2814922. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/12173
Fazekas G, Barthet M, Sandler MB (2014). Novel Methods in Facilitating Audience and Performer Interaction Using the Mood Conductor Framework. DOI: 10.1007/978-3-319-12976-1_8. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/36478
Kachkaev A, Wolff D, Barthet M et al. (2014). Visualising Chord Progressions in Music Collections: A Big Data Approach. 9th Conference on Interdisciplinary Musicology (CIM).
Weyde T, Cottrell S, Dykes J et al. (2014). Big Data for Musicology. 1st International Digital Libraries for Musicology workshop. DOI: 10.1145/2660168.2660187. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/9417
Kolozali S, Fazekas G, Barthet M et al. (2014). A framework for automatic ontology generation based on semantic audio analysis.
Barthet MHE, Plumbley MD, Kachkaev A et al. (2014). Big Chord Data Extraction and Mining. 9th Conference on Interdisciplinary Musicology (CIM).
Weyde T, Cottrell S, Benetos E et al. (2014). Digital Music Lab - A Framework for Analysing Big Music Data. European Conference on Data Analysis (ECDA). DOI: 10.1109/EUSIPCO.2016.7760422
Lou T, Barthet M, Fazekas G et al. (2014). Evaluation and Improvement of the Mood Conductor Interactive System.
Fazekas G, Barthet M, Sandler M (2014). Novel Methods in Facilitating Audience and Performer Interaction using the Mood Conductor Framework. DOI: 10.1007/978-3-319-12976-1_8. QMRO: https://uat2-qmro.qmul.ac.uk/xmlui/handle/123456789/31906
Baume C, Fazekas G, Barthet M et al. (2014). Selection of audio features for music emotion recognition using production music.
Barthet M, Benetos E, Cottrell S et al. (2014). The DML Research Project: Digital Music Lab - Analysing Big Music Data. Workshop on "Big Data: Challenges and Applications", Imperial College, London.
Aramaki M, Barthet M, Kronland-Martinet R et al. (2013). Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics): Preface.
Kolozali S, Barthet M, Fazekas G et al. (2013). Automatic Ontology Generation for Musical Instruments Based on Audio Analysis. DOI: 10.1109/TASL.2013.2263801
Barthet M, Kronland-Martinet R, Ystad S (2013). Consistency of timbre patterns in expressive music performance.
Barthet M, Marston D, Baume C et al. (2013). Design and evaluation of semantic mood models for music recommendation.
Saari P, Barthet M, Fazekas G et al. (2013). Semantic models of musical mood: Comparison between crowd-sourced and curated editorial tags. DOI: 10.1109/ICMEW.2013.6618436
Fazekas G, Barthet M, Sandler M (2013). The BBC Desktop Jukebox music recommendation system: A large-scale trial with professional users. DOI: 10.1109/ICMEW.2013.6618235
Saari P, Eerola T, Fazekas G et al. (2013). The Role of Audio and Tags in Music Mood Prediction: a Study Using Semantic Layer Projection.
Barthet M, Dixon S (2011). Ethnographic observations of musicologists at the British Library: Implications for music information retrieval.
Stowell D, Barthet M, Dixon S et al. (2011). Musicology for the masses: Situating new audio technologies for musicology and music education. Digital Engagement 2011.
Barthet M, Hargreaves S, Sandler M (2011). Speech/Music Discrimination in Audio Podcast Using Structural Segmentation and Timbre Recognition. DOI: 10.1007/978-3-642-23126-1_10
Barthet M, Guillemain P, Kronland-Martinet R et al. (2008). Exploration of timbre variations in music performance. DOI: 10.1121/1.2934986
Barthet M, Depalle P, Kronland-Martinet R et al. (2007). The effect of timbre in clarinet interpretation.
Barthet M, Guillemain P, Kronland-Martinet R et al. (2005). On the relative influence of even and odd harmonics in clarinet timbre.

Teaching

Digital Media and Social Networks (Postgraduate)
Content description: How does the way we feel and express emotional behaviour affect our interaction with technology? What if we could use a "head nod" to "like" things on Facebook? Can we create assistive technology to help people with social disorders (e.g., autism)? Affective and Behavioural Computing is a multidisciplinary field of research and practice concerned with these questions, and with understanding, recognizing and utilizing human emotions and communicative behaviour in the design of computational systems. The following list clarifies the content and provides a representative set of topics:
- Overview: affective and behavioural computing
- Theories in psychology, cognitive science and neuroscience: affect, emotion and social signal processing
- Computational models
- Emotion, affect and social signals in Human-Computer Interaction (HCI)
- Sensing: vision, audio, bio signals, text; data acquisition and annotation, databases and tools
- Processing: extracting meaningful information and features
- Recognition: applying machine learning techniques
- Programming refresher: hands-on lecture on programming for affective and behavioural computing using relevant libraries
- Evaluation: automatic analysers, and emotionally and socially intelligent systems
- Affect and social signal expression and generation (virtual characters, robots, etc.)
- Affect and social signals for Mobile HCI
- Applications (entertainment technology/gaming/arts; clinical and biomedical studies, e.g., autism, depression, pain; implicit (multimedia) tagging; affective wearables; etc.)
- Ethical issues

Digital Media and Social Networks (Undergraduate)
- Introduction to Online Social Networks (OSNs)
- Characteristics of OSNs
- Basic Graph Theory
- Small World Phenomenon (an illustrative sketch follows the module descriptions below)
- Information propagation on OSNs
- Influence and Content Recommendation
- Sentiment Analysis in Social Media
- Privacy and ethics

Sound Recording and Production Techniques (Postgraduate/Undergraduate)
The module develops students' skills and understanding of contemporary audio production techniques. It gives students a solid grounding in the theoretical aspects of audio production, from the functionality of audio interfaces to the signal processing within audio effects, and provides practical experience with the audio equipment to which the theory applies. Students learn the implications of audio digitisation, through which they gain an understanding of the various means by which digital media is disseminated in the modern age.
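To give a flavour of the digitisation concepts covered in Sound Recording and Production Techniques, the short Python sketch below samples and quantises a sine tone; the sample rate, bit depth and frequency are illustrative values chosen for this example, not module materials.

```python
# Minimal sketch (illustrative only): the two core steps of audio digitisation,
# sampling and quantisation, applied to a sine tone. All parameter values here
# are assumed examples.
import numpy as np

fs = 44100        # sample rate in Hz (CD quality)
bit_depth = 16    # bits per sample
f0 = 440.0        # tone frequency in Hz
duration = 0.05   # seconds

# Sampling: evaluate the continuous signal at discrete instants t = n / fs.
t = np.arange(int(fs * duration)) / fs
x = 0.9 * np.sin(2 * np.pi * f0 * t)

# Quantisation: map each sample onto one of the available discrete levels.
levels = 2 ** (bit_depth - 1) - 1
x_quantised = np.round(x * levels) / levels

# Quantisation error and a rough signal-to-quantisation-noise ratio estimate.
error = x - x_quantised
snr_db = 10 * np.log10(np.mean(x ** 2) / np.mean(error ** 2))
print(f"Approximate SNR at {bit_depth}-bit: {snr_db:.1f} dB")
```

Lowering bit_depth (e.g., to 8) and re-running shows the drop in signal-to-noise ratio that motivates higher-resolution audio formats.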
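Similarly, the small-world phenomenon listed in the Digital Media and Social Networks syllabus can be illustrated in a few lines of Python; the use of networkx and the parameter values below are assumptions made for this sketch, not part of the module.

```python
# Minimal sketch (illustrative only): a Watts-Strogatz small-world graph, which
# keeps the high clustering of a ring lattice while a small amount of random
# rewiring produces short average path lengths.
import networkx as nx

n, k, p = 1000, 10, 0.1   # nodes, neighbours per node, rewiring probability (assumed values)

# connected_watts_strogatz_graph retries until the rewired graph is connected,
# so the average shortest path length below is well defined.
g = nx.connected_watts_strogatz_graph(n, k, p, tries=100, seed=42)

print("Average shortest path length:", round(nx.average_shortest_path_length(g), 2))
print("Average clustering coefficient:", round(nx.average_clustering(g), 2))
```

Comparing p = 0 (a regular lattice) with small positive values of p shows the characteristic small-world effect: the average path length drops sharply while clustering remains high.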