===== Papers =====

=== KES2024 ===
  * J. Ignatowicz, K. Kutt, and G. J. Nalepa, "**Evaluation and Comparison of Emotionally Evocative Image Augmentation Methods**,
  * DOI: [[https://
  * [[https://
  * ++Abstract | Experiments in affective computing are based on stimulus datasets that, in the process of standardization,
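
A rough, hedged illustration related to the KES2024 entry above: the augmentation methods actually evaluated in the paper are not listed in this summary, so the pipeline below only shows what a standard image augmentation setup looks like, using torchvision transforms with illustrative parameters.

<code python>
# Minimal sketch of a standard image augmentation pipeline. The transforms
# and parameters are illustrative assumptions, not the emotionally evocative
# augmentation methods evaluated in the KES2024 paper.
from PIL import Image
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),                      # mirror the scene
    transforms.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2),
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),         # mild crop
])

image = Image.open("stimulus.jpg")               # hypothetical stimulus image
variants = [augment(image) for _ in range(5)]    # five augmented variants
</code>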

=== IWINAC2024a ===
  * K. Kutt and G. J. Nalepa, "**Emotion Prediction in Real-Life Scenarios: On the Way to the BIRAFFE3 Dataset**,
  * DOI: [[https://
  * [[https://
  * ++Abstract | Despite over 20 years of research in affective computing, emotion prediction models that would be useful in real-life out-of-the-lab scenarios such as health care or intelligent assistants have still not been developed. The identification of the fundamental problems behind this concern led to the initiation of the BIRAFFE series of experiments,

=== IWINAC2024b ===
  * K. Kutt, M. Kutt, B. Kawa, and G. J. Nalepa, "**Human-in-the-Loop for Personality Dynamics: Proposal of a New Research Approach**,
  * DOI: [[https://
  * [[https://
  * ++Abstract | In recent years, one can observe an increasing interest in dynamic models in the personality psychology research. Opposed to the traditional paradigm—in which personality is recognized as a set of several permanent dispositions called traits—dynamic approaches treat it as a complex system based on feedback loops between individual and the environment. The growing attention to dynamic models entails the need for appropriate modelling tools. In this conceptual paper we address this demand by proposing a new approach called personality-in-the-loop,

=== DSAA2023 ===
  * K. Kutt, Ł. Ściga, and G. J. Nalepa, "
  * DOI: [[https://
  * ++Abstract | Current review papers in the area of Affective Computing and Affective Gaming point to a number of issues with using their methods in out-of-the-lab scenarios, making them virtually impossible to be deployed. On the contrary, we present a game that serves as a proof-of-concept designed to demonstrate that—being aware of all the limitations and addressing them accordingly—it is possible to create a product that works in-the-wild. A key contribution is the development of a dynamic game adaptation algorithm based on the real-time analysis of emotions from facial expressions. The obtained results are promising, indicating the success in delivering a good game experience.++
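
The adaptation algorithm from the DSAA2023 paper is not reproduced here; the sketch below only illustrates the general idea of a game loop that adjusts difficulty from a real-time valence/arousal estimate derived from facial expressions. All names and thresholds are hypothetical.

<code python>
# Illustrative emotion-driven difficulty adaptation (not the algorithm from
# the DSAA2023 paper). A facial-expression analyser is assumed to return
# (valence, arousal) values in [-1, 1].

def adapt_difficulty(difficulty: float, valence: float, arousal: float) -> float:
    """Nudge difficulty towards moderate arousal and non-negative valence."""
    if arousal > 0.6 and valence < 0.0:        # player appears stressed or frustrated
        difficulty -= 0.1
    elif arousal < 0.2:                        # player appears bored
        difficulty += 0.1
    return min(1.0, max(0.1, difficulty))      # clamp to a sane range

# Hypothetical game loop:
#   difficulty = 0.5
#   while game.running():
#       valence, arousal = estimate_emotion(camera.frame())
#       difficulty = adapt_difficulty(difficulty, valence, arousal)
#       game.set_difficulty(difficulty)
</code>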

=== InfFusion2023 ===
  * J. M. Górriz //et al.//, "
  * DOI: [[https://
  * [[https://
  * ++Abstract | Deep Learning (DL), a groundbreaking branch of Machine Learning (ML), has emerged as a driving force in both theoretical and applied Artificial Intelligence (AI). DL algorithms, rooted in complex and non-linear artificial neural systems, excel at extracting high-level features from data. DL has demonstrated human-level performance in real-world tasks, including clinical diagnostics,

=== SciData2022 ===
  * K. Kutt, D. Drążyk, L. Żuchowska, M. Szelążek, S. Bobek, and G. J. Nalepa, "
  * DOI: [[https://
  * [[https://
  * ++Abstract | Generic emotion prediction models based on physiological data developed in the field of affective computing apparently are not robust enough. To improve their effectiveness,
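
The SciData2022 entry describes the BIRAFFE2 dataset; its exact file layout is documented together with the data and is not repeated here. The snippet below is only a generic sketch of aligning a physiological signal with stimulus events in pandas, with hypothetical file and column names.

<code python>
# Generic sketch of aligning physiological signals with stimulus events.
# File names and column names are hypothetical -- consult the BIRAFFE2
# documentation published with the dataset for the actual layout.
import pandas as pd

signals = pd.read_csv("subject_001_biosignals.csv")   # e.g. TIMESTAMP, ECG, EDA
events = pd.read_csv("subject_001_procedure.csv")     # e.g. TIMESTAMP, STIMULUS_ID

signals["TIMESTAMP"] = pd.to_datetime(signals["TIMESTAMP"], unit="s")
events["TIMESTAMP"] = pd.to_datetime(events["TIMESTAMP"], unit="s")

# Attach to each signal sample the most recent stimulus event.
merged = pd.merge_asof(
    signals.sort_values("TIMESTAMP"),
    events.sort_values("TIMESTAMP"),
    on="TIMESTAMP",
    direction="backward",
)
print(merged.groupby("STIMULUS_ID")["EDA"].mean())     # mean EDA per stimulus
</code>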

=== AfCAI2022 ===
  * K. Kutt, P. Sobczyk, and G. J. Nalepa, "
  * {{ :
  * ++Abstract | Facial expressions convey the vast majority of the emotional information contained in social utterances. From the point of view of affective intelligent systems, it is therefore important to develop appropriate emotion recognition models based on facial images. As a result of the high interest of the research and industrial community in this problem, many ready-to-use tools are being developed, which can be used via suitable web APIs. In this paper, two of the most popular APIs were tested: Microsoft Face API and Kairos Emotion Analysis API. The evaluation was performed on images representing 8 emotions—anger,
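
The request and response formats of Microsoft Face API and Kairos are not reproduced in this summary; the following is only a hedged sketch of how an accuracy evaluation of an emotion-recognition web API on labelled images could be wired up, against a hypothetical REST endpoint and response field.

<code python>
# Hedged sketch of evaluating an emotion-recognition web API on labelled
# images. Endpoint, header and response field are hypothetical; the actual
# Microsoft Face API / Kairos request formats differ.
import requests

API_URL = "https://example.com/emotion"     # hypothetical endpoint
API_KEY = "..."                             # placeholder credential

def predict_emotion(image_path: str) -> str:
    with open(image_path, "rb") as f:
        resp = requests.post(API_URL,
                             headers={"Authorization": f"Bearer {API_KEY}"},
                             files={"image": f})
    resp.raise_for_status()
    return resp.json()["dominant_emotion"]  # hypothetical response field

labelled = {"angry_01.jpg": "anger", "happy_01.jpg": "happiness"}   # toy ground truth
hits = sum(predict_emotion(path) == label for path, label in labelled.items())
print(f"accuracy: {hits / len(labelled):.2f}")
</code>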

=== MRC2021b ===
  * L. Żuchowska, K. Kutt, and G. J. Nalepa, "
  * {{http://
  * ++Abstract | The paper presents the design of a game that will serve as a research environment in the BIRAFFE series experiment planned for autumn 2021, which uses affective and personality computing methods to develop methods for interacting with intelligent assistants. A key aspect is grounding the game design on the taxonomy of player types designed by Bartle. This will allow for an investigation of hypotheses concerning the characteristics of particular types of players or their stability in response to emotionally-charged stimuli occurring during the game.++

=== MRC2021a ===
  * K. Kutt, L. Żuchowska, S. Bobek, and G. J. Nalepa, "
  * {{http://
  * ++Abstract | The paper provides insights into two main threads of analysis of the BIRAFFE2 dataset concerning the associations between personality and physiological signals and concerning the game logs' generation and processing. Alongside the presentation of results, we propose the generation of event-marked maps as an important step in the exploratory analysis of game data. The paper concludes with a set of guidelines for using games as a context-rich experimental environment.++

=== Sensors2021 ===
  * K. Kutt, D. Drążyk, S. Bobek, and G. J. Nalepa, "
  * DOI: [[https://
  * [[https://
  * ++Abstract | In this article, we propose using personality assessment as a way to adapt affective intelligent systems. This psychologically-grounded mechanism will divide users into groups that differ in their reactions to affective stimuli for which the behaviour of the system can be adjusted. In order to verify the hypotheses, we conducted an experiment on 206 people, which consisted of two proof-of-concept demonstrations:
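
As a rough illustration of the grouping idea mentioned in the Sensors2021 abstract (not the procedure used in the article), the sketch below splits users into two groups by a median split on a single personality trait score; trait names and scores are made up.

<code python>
# Illustrative median-split grouping of users by a personality trait score
# (e.g. neuroticism from a Big Five questionnaire). Not the procedure from
# the Sensors2021 article; names and numbers are invented.
from statistics import median

scores = {"u01": 2.1, "u02": 3.8, "u03": 4.4, "u04": 1.9, "u05": 3.0}
cut = median(scores.values())

groups = {
    user: ("high_neuroticism" if score >= cut else "low_neuroticism")
    for user, score in scores.items()
}
print(groups)   # an affective system could use a different reaction model per group
</code>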

=== ICAISC2020 ===
  * S. Bobek, M. M. Tragarz, M. Szelążek, and G. J. Nalepa, "
  * DOI: [[https://
  * [[https://
  * ++Abstract | Development of models for emotion detection is often based on the use of machine learning. However, it poses practical challenges, due to the limited understanding of modeling of emotions, as well as the problems regarding measurements of bodily signals. In this paper we report on our recent work on improving such models, by the use of explainable AI methods. We are using the BIRAFFE data set we created previously during our own experiment in affective computing.++
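
The specific explainability methods used in ICAISC2020 are described in the paper itself; as a generic, hedged illustration of inspecting an emotion classifier, the snippet below computes permutation importance with scikit-learn on synthetic data.

<code python>
# Generic illustration of model inspection with permutation importance
# (not necessarily the XAI technique used in the ICAISC2020 paper).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                   # synthetic features: HR, EDA, HRV
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic "high arousal" label

model = RandomForestClassifier(random_state=0).fit(X, y)
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)

for name, importance in zip(["HR", "EDA", "HRV"], result.importances_mean):
    print(f"{name}: {importance:.3f}")          # which signal the model relies on
</code>
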
=== HAIIW2020 ===
  * K. Kutt, D. Drążyk, M. Szelążek, S. Bobek, and G. J. Nalepa, "**The BIRAFFE2 Experiment – Study in Bio-Reactions and Faces for Emotion-based Personalization for AI Systems**."
  * [[https://
  * ++Abstract | The paper describes the BIRAFFE2 data set, which is a result of an affective computing experiment conducted between 2019 and 2020 that aimed to develop computer models for classification and recognition of emotion. Such work is important to develop new methods of natural Human-AI interaction. As we believe that models of emotion should be personalized by design, we present a unified paradigm allowing to capture emotional responses of different persons, taking individual personality differences into account. We combine classical psychological paradigms of emotional response collection with the newer approach, based on the observation of the computer game player. By capturing one's psycho-physiological reactions (ECG, EDA signal recording), mimic expressions (facial emotion recognition),
=== MRC2020 ===
  * L. Żuchowska, K. Kutt, K. Geleta, S. Bobek, and G. J. Nalepa, "
  * {{http://ceur-ws.org/Vol-2787/paper7.pdf|Full text available online}}
  * ++Abstract | We propose an experimental framework for Affective Computing based on video games. We developed a set of specially designed mini-games, based on carefully selected game mechanics, to evoke emotions of participants of a larger experiment. We believe that games provide a controllable yet overall ecological environment for studying emotions. We discuss how we used our mini-games as an important counterpart of classical visual and auditory stimuli. Furthermore,
=== AfCAI2019 ===
  * K. Kutt, D. Drążyk, P. Jemioło, S. Bobek, B. Giżycka, V. Rodriguez-Fernandez,
  * {{http://
  * ++Abstract | In this paper we introduce the BIRAFFE data set which is the result of the experiment in affective computing we conducted in early 2019. The experiment is part of the work aimed at the development of computer models for emotion classification and recognition. We strongly believe that such models should be personalized by design as emotional responses of different persons are subject to individual differences due to their personality. In the experiment we assumed data fusion from both visual and audio stimuli both taken from standard public data bases (IADS and IAPS respectively). Moreover, we combined two paradigms. In the first one, subjects were exposed to stimuli, and later their bodily reactions (ECG, GSR, and face expression) were recorded. In the second one the subjects played basic computer games, with the same reactions constantly recorded. We decided to make the data set publicly available to the research community using the Zenodo platform. As such, the data set contributes to the development and replication of experiments in AfC.++
=== SEMANTiCS2019 ===
  * B. Giżycka, K. Kutt, and G. J. Nalepa, "
  * {{http://
  * ++Abstract | Tools for automatization of knowledge on game mechanics and their interrelationships are still lacking. Game design patterns, as proposed by Björk and Holopainen, seem promising in this area, as they can be represented formally as an ontology. This paper presents our proposal of such a representation,
=== Sensors2019 ===
  * DOI: 10.3390/
  * [[https://
  * ++Abstract | In this paper, we consider the use of wearable sensors for providing affect-based adaptation in Ambient Intelligence (AmI) systems. We begin with discussion of selected issues regarding the applications of affective computing techniques. We describe our experiments for affect change detection with a range of wearable devices, such as wristbands and the BITalino platform, and discuss an original software solution, which we developed for this purpose. Furthermore,
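
The affect change detection procedure itself is described in the Sensors2019 paper; below is only a minimal sketch of one simple way to flag abrupt changes in a GSR/EDA stream, using a sliding-window z-score on synthetic data (window size and threshold are arbitrary assumptions).

<code python>
# Minimal sliding-window change detector for a GSR/EDA stream. Illustrative
# only -- not the method from the Sensors2019 paper; window and threshold
# values are arbitrary.
import numpy as np

def detect_changes(gsr: np.ndarray, window: int = 50, z_thresh: float = 3.0):
    """Return sample indices where GSR deviates strongly from the recent baseline."""
    changes = []
    for i in range(window, len(gsr)):
        baseline = gsr[i - window:i]
        mu, sigma = baseline.mean(), baseline.std() + 1e-8
        if abs(gsr[i] - mu) / sigma > z_thresh:
            changes.append(i)
    return changes

signal = np.concatenate([np.random.normal(5.0, 0.05, 300),
                         np.random.normal(6.5, 0.05, 100)])   # synthetic GSR jump
print(detect_changes(signal)[:5])
</code>
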
=== ICAISC2019b ===
  * M. Z. Łępicki and S. Bobek, "
  * [[https://
  * ++Abstract | Affective computing gained a lot of attention from researchers and business over the last decade. However, most of the attempts for building systems that try to predict, or provoke affective state of users were done for specific and narrow domains. This complicates reusing such systems in other, even similar domains. In this paper we present a solution that aims at solving this problem by providing a general framework architecture for building affective-aware systems. It supports designing and development of affective-aware solutions in a holistic and domain independent way.++
=== ICAISC2019a ===
  * DOI: 10.1007/
  * {{ :
  * ++Abstract | The use of emotions in the process of creating video games is still a challenge for the developers from the fields of Human-Computer Interaction and Affective Computing. In our work, we aim at demonstrating architectures of two operating game prototypes, implemented with the use of affective design patterns. We ground our account in biological signals, i.e. heart rate, galvanic skin response and muscle electrical activity. Using these modalities and the game context, we reason about emotional states of the player. For this purpose, we focus on defining rules with linguistic terms. What is more, we address the need for explainability of biological mechanics and individual differences in terms of reactions to different stimuli. We provide a benchmark, in the form of a survey, to verify our approach.++
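
The rule base itself is given in the ICAISC2019a paper; the snippet below is only a toy example of what a rule with linguistic terms over heart rate and GSR could look like (thresholds and labels are assumptions).

<code python>
# Toy example of a rule with linguistic terms over biosignals. Thresholds
# and labels are assumptions, not the rules defined in the ICAISC2019a paper.

def linguistic_hr(hr_bpm: float) -> str:
    return "high" if hr_bpm > 100 else "medium" if hr_bpm > 70 else "low"

def linguistic_gsr(gsr_delta: float) -> str:
    return "rising" if gsr_delta > 0.05 else "stable"

def infer_arousal(hr_bpm: float, gsr_delta: float) -> str:
    # Rule: IF hr IS high AND gsr IS rising THEN arousal IS high
    if linguistic_hr(hr_bpm) == "high" and linguistic_gsr(gsr_delta) == "rising":
        return "high"
    return "moderate_or_low"

print(infer_arousal(hr_bpm=112, gsr_delta=0.2))   # -> "high"
</code>
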
=== CoSECiVi2018 ===
  * G. J. Nalepa and B. Giżycka, "**How a mobile platform for emotion identification supports designing affective games**"
  * Presented at the [[https://
  * {{https://
  * ++Abstract | Affective
=== GEM2018 ===
  * DOI: 10.1109/
  * {{ :
  * ++Abstract | A relatively new field of research on affective gaming suggests applying affective computing solutions to develop games that can interact with the player on the emotional level. To bring together selected models of affect and affect-driven frameworks developed to date, we propose an approach based on affective design patterns. We build on the assumption that player's emotional reactions to in-game events can be evoked by patterns used early in the design phase. We provide description of experiments conducted to test our hypothesis so far, along with some tentative observations,
=== HAI2018 ===
  * B. Giżycka, G. J. Nalepa, and P. Jemioło, "
  * Presented at the [[https://
  * [[https://
  * ++Abstract | As technologies become more and more pervasive, there is a need for considering the affective dimension of interaction with computer systems to make them more human-like. Current demands for this matter include accurate emotion recognition,
=== CCSC2018 ===
  * B. Giżycka, "
  * Presented at the 10th Cracow Cognitive Science Conference
  * {{http://
  * ++Abstract | As modern technologies become more apparent and persistent, human-computer interaction becomes an important research topic. With the birth of affective computing, which aims at developing systems capable of detecting and processing emotionally significant data from the environment,
=== HSI2018 ===
  * DOI: 10.1109/
  * {{ :
  * ++Abstract | In the paper we describe a new software solution for mobile devices that allows for data acquisition from wristbands. The application reads physiological data from wristbands and supports multiple recent hardware. In our work we focus on the Heart Rate (HR) and Galvanic Skin Response (GSR) readings. This data is used in the affective computing experiments for human emotion recognition.++
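
The HSI2018 solution itself is a mobile application talking to real wristbands; as a language-neutral illustration of the kind of record such an acquisition layer passes around, here is a small Python sketch with a simulated reader (field names and units are assumptions).

<code python>
# Illustrative data structure for physiological samples streamed from a
# wristband; the reader is simulated, not the HSI2018 mobile application.
import random
import time
from dataclasses import dataclass

@dataclass
class BioSample:
    timestamp: float   # seconds since the epoch
    hr: float          # heart rate, beats per minute
    gsr: float         # galvanic skin response, microsiemens

def simulated_wristband(n: int = 3):
    """Yield a few fake samples, standing in for a Bluetooth wristband stream."""
    for _ in range(n):
        yield BioSample(time.time(), random.gauss(75, 5), random.gauss(5.0, 0.3))
        time.sleep(0.1)

for sample in simulated_wristband():
    print(sample)
</code>
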
=== FGCS2018b ===
  * DOI: 10.1016/
  * [[https://
  * ++Abstract | We discuss affective serious games that combine learning, gaming and emotions. We describe a novel framework for the creation and evaluation of serious affective games. Our approach is based on merging pertinent design patterns in order to recognize educational claims, educational assessment, best game design practices, as well as models and solutions of affective computing. Björk's and Holopainen's game design patterns have been enhanced by Evidence Centered Design components and affective components. A serious game has been designed and created to demonstrate how to outline a complex game system in a communicative way, and show methods to trace how theoretically-driven design decisions influence learning outcomes and impacts. We emphasize the importance of patterns in game design. Design patterns are an advantageous and convenient way of outlining complex game systems. Design patterns also provide favorable language of communication between multidisciplinary teams working on serious games.++
=== ICAISC2018 ===
  * Presented at [[http://
  * {{ :
  * ++Abstract | The paper outlines a mobile sensor platform aimed at processing physiological data from wearable sensors. We discuss the requirements related to the use of low-cost portable devices in this scenario. Experimental analysis of four such devices, namely Microsoft Band 2, Empatica E4, eHealth Sensor Platform and BITalino (r)evolution is provided. Critical comparison of quality of HR and GSR signals leads to the conclusion that future works should focus on the BITalino, possibly combined with the MS Band 2 in some cases. This work is a foundation for possible applications in affective computing and telemedicine.++
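
The comparison criteria used in ICAISC2018 are detailed in the paper; as a hedged illustration of one very simple quality indicator (the fraction of missing samples per device recording), here is a short numpy sketch on invented numbers.

<code python>
# Illustrative signal-quality check: fraction of missing (NaN) heart-rate
# samples per device recording. Numbers are invented and do not reflect the
# evaluation performed in the ICAISC2018 paper.
import numpy as np

recordings = {
    "BITalino":  np.array([72.0, 73.5, np.nan, 74.1, 73.9]),
    "MS Band 2": np.array([71.0, np.nan, np.nan, 75.2, 74.0]),
}

for device, hr in recordings.items():
    missing = np.isnan(hr).mean()
    print(f"{device}: {missing:.0%} samples missing")
</code>
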
=== AfCAI2018 ===
  * Presented at [[https://
  * {{http://
  * ++Abstract | In this overview paper we focus on our recent progress in the work on the mobile platform for AfC. We provide the main assumptions about the platform, as well as describe affective data acquisition and interpretation. We discuss our most recent experiments and provide an outlook of our future works.++
=== FGCS2018a ===
  * Published in [[https://
  * [[https://
  * ++Abstract | In our work, we focus on detection of affective states, their proper identification and interpretation with use of wearable and mobile devices. We propose a data acquisition layer based on wearable devices able to gather physiological data, and we integrate it with mobile context-aware framework. Furthermore,
=== FedCSIS2017 ===
  * DOI: 10.15439/
  * [[https://
  * ++Abstract | The emotional state of the user is a new dimension in human-computer interaction,
=== AfCAI2016b ===
  * G. J. Nalepa, J. K. Argasiński,
  * Presented at [[pub:
  * {{http://
  * ++Abstract | In this paper we discuss selected important challenges in designing experiments that lead to data and information collection on affective states of participants. We aim at acquiring data that would be the basis to formulate and evaluate computer methods for detection, identification and interpretation of such affective states, and ultimately human emotions.++
=== AfCAI2016a ===
  * G. J. Nalepa, K. Kutt, S. Bobek, and M. Z. Łępicki, "
  * Presented at [[pub:
  * {{http://
  * ++Abstract | We are aiming at developing a technology to detect, identify and interpret human emotional states. We believe that it can be provided based on the integration of context-aware systems and affective computing paradigms. We are planning to identify and characterize affective context data, and provide knowledge-based models to identify and interpret affects based on this data. A working name for this technology is simply AfCAI: Affective Computing with Context Awareness for Ambient Intelligence.++

===== Projects =====

  * **Personality,
===== Tools and Datasets =====

==== Prototypes of Affective Games ====
  * [[pub:
  * [[pub:
  * [[pub:
  * [[pub:prototypes#

==== Datasets ====