The Creative Technologies Research Theme integrates the creative arts and technology, combining them in areas such as multimedia, gaming, content creation and production.
Creative Technologies researchers in Trinity study technologies that are themselves creative, including computer animation, computer graphics, and signal processing, while also researching how these technologies can be applied in wider areas such as Education and Training, Health, Active Ageing, and Art.
Trinity has an international reputation for research, education and knowledge transfer activities in the technologies that underpin the creative and entertainment industries, such as film, video games, visualisation and design, digital arts, and networks and telecommunications. This research increasingly involves collaboration between engineers, scientists and artists, a strategy shared by leading research centres around the world, along with direct engagement with the creative and enabling industries.
Trinity College Dublin:
The University of Southern California Institute for Creative Technologies applies a winning Hollywood formula to benefit service members, students and society at large.
An academic research institute, ICT brings film and game industry artists together with computer and social scientists to study and develop immersive media for military training, health therapies, science education and more.
The University of Southern California Institute for Creative Technologies' (ICT) pioneering efforts within DARPA's Detection and Computational Analysis of Psychological Signals (DCAPS) project encompass advances in the artificial intelligence fields of machine learning, natural language processing and computer vision. These technologies identify indicators of psychological distress such as depression, anxiety and PTSD, and are being integrated into ICT's virtual human application to provide healthcare support.
This effort seeks to enable a new generation of clinical decision support tools and interactive virtual agent-based healthcare dissemination/delivery systems that are able to recognize and identify psychological distress from multimodal signals. These tools aim to give military personnel and their families better awareness of and access to care while reducing the stigma of seeking help. For example, the system's early identification of a patient's high or low distress state could generate the appropriate information to help a clinician diagnose a potential stress disorder. User-state sensing can also be used to create long-term patient profiles that help assess change over time.
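The long-term patient profiles described above can be sketched in a few lines: a profile is simply a history of per-session distress scores, and "change over time" becomes a comparison between a recent window and an earlier baseline. The function name, window size and threshold below are illustrative assumptions, not details of the DCAPS system.

```python
from statistics import mean

def flag_change(session_scores: list[float], window: int = 3, delta: float = 0.2) -> bool:
    """Flag a profile when recent average distress has risen markedly.

    `session_scores` is a chronological list of per-session distress
    scores in [0, 1]. The window size and `delta` threshold are
    hypothetical; a deployed system would tune them clinically.
    """
    if len(session_scores) < 2 * window:
        return False  # not enough history to compare baseline vs. recent
    baseline = mean(session_scores[:window])    # earliest sessions
    recent = mean(session_scores[-window:])     # most recent sessions
    return recent - baseline > delta
```

A rising trajectory such as `[0.2, 0.2, 0.3, 0.5, 0.6, 0.7]` would be flagged, while a short or flat history would not; the point is that the profile supports assessment of change, not a diagnosis.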
ICT is expanding its expertise in automatic human behavior analysis to identify indicators of psychological distress. Two technological systems are central to the effort. Multisense automatically tracks and analyzes, in real time, facial expressions, body posture, acoustic features, linguistic patterns and higher-level behavior descriptors (e.g. attention and fidgeting). From these signals and behaviors, Multisense infers indicators of psychological distress that directly inform SimSensei, the virtual human. SimSensei is a virtual human platform that senses the real-time audio-visual signals captured by Multisense. It is specifically designed for healthcare support and builds on more than ten years of virtual human research and development at ICT. The platform enables an engaging face-to-face interaction in which the virtual human automatically reacts, through its own speech and gestures, to the perceived user state and intent. DCAPS is not aimed at providing an exact diagnosis, but at providing a general metric of psychological health.
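The Multisense-to-SimSensei pipeline described above can be sketched as two stages: a perception layer summarizes multimodal cues into a single distress indicator, and the virtual human selects a conversational reaction from that indicator. All field names, weights and thresholds below are invented for illustration; they are not ICT's actual interfaces or models, which would be learned from labeled interview data.

```python
from dataclasses import dataclass

@dataclass
class BehaviorFeatures:
    """Hypothetical per-interval summary from a Multisense-style perception layer."""
    gaze_aversion: float   # fraction of time looking away, 0..1
    fidgeting: float       # normalized movement energy, 0..1
    speech_rate: float     # words per second
    negative_terms: float  # fraction of negative-sentiment words, 0..1

def distress_indicator(f: BehaviorFeatures) -> float:
    """Combine multimodal cues into one 0..1 distress indicator.

    The weights are made up for illustration; a real system would
    learn them rather than hand-tune them.
    """
    slow_speech = max(0.0, 1.0 - f.speech_rate / 2.5)  # slower speech -> higher cue
    score = (0.3 * f.gaze_aversion
             + 0.2 * f.fidgeting
             + 0.2 * slow_speech
             + 0.3 * f.negative_terms)
    return min(1.0, max(0.0, score))

def virtual_human_response(score: float) -> str:
    """Pick a SimSensei-style conversational move from the perceived state.

    The output is an engagement cue, not a diagnosis, mirroring the
    DCAPS goal of a general metric rather than an exact clinical label.
    """
    if score >= 0.6:
        return "I'm sorry to hear that. Can you tell me more about how you've been feeling?"
    if score >= 0.3:
        return "That sounds difficult. How long has this been going on?"
    return "That's good to hear. What's been going well for you lately?"
```

For example, averted gaze, heavy fidgeting, slow speech and negative language yield a high indicator and an empathetic follow-up question, while a calm, engaged speaker yields a low indicator and a neutral prompt. Keeping perception (`distress_indicator`) separate from dialogue policy (`virtual_human_response`) mirrors the Multisense/SimSensei split in the text.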