Paper: Technology Demonstration: Augmented Design Ideation
Proceedings of the 2017 ACM SIGCHI Conference on Creativity and Cognition | C&C 2017, June 27–30, 2017, Singapore


This technology demonstration showcases augmented design ideation tools developed to explore hybrid design environments [1], using novel prototypes that draw on ambient informatics and ubiquitous computing approaches. The tools aim to enhance the creative scope of multi-participant design ideation by providing contextually relevant visual design prompts that support ideation processes and conceptualisation [1]. The demonstration explores the feasibility and effects of tools that can be built now from ad hoc arrangements, or meshes, of the following:

Application Programming Interfaces (APIs) from popular web platforms, which offer various types of culturally and contextually relevant content (historical, contemporary, or live) in the form of images, text, and sound.

Internet search technologies, including the "mashing" together of services such as speech-to-text (IBM's Watson) to provide live queries for culturally and contextually relevant content and, importantly, live feedback through word clouds and image-text relationships fed into the ideation environment.

Multimodal data, gathered through gesture mapping (using stereoscopic 3D cameras such as the LEAP Motion [2]) and video analytics. This data is used to create ideation metadata for comparative analysis of ideation "events", and to provide live informatics in the ideation environment so as to explore performative aspects [3] of such environments and of design ideation.
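As a rough sketch of the speech-to-text feedback loop described above, the following Python fragment illustrates one way the pipeline could work: live utterances are transcribed, keywords are extracted and weighted into a word cloud, and the heaviest keywords become live queries for contextually relevant content. The transcription step is stubbed here (the demonstration itself used IBM's Watson, whose API is not reproduced), and all function names and the stop-word list are illustrative assumptions, not the authors' implementation.

```python
from collections import Counter
import re

# Illustrative stop-word list; a real deployment would use a fuller one.
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "we", "it", "is"}

def transcribe(audio_chunk: bytes) -> str:
    """Placeholder for a live speech-to-text call (e.g. IBM Watson).
    Here we simply decode the bytes so the sketch runs offline."""
    return audio_chunk.decode("utf-8")

def keyword_weights(transcript: str) -> Counter:
    """Turn a running transcript into word-cloud weights: lower-case,
    keep only word characters, drop stop words, count frequencies."""
    words = re.findall(r"[a-z']+", transcript.lower())
    return Counter(w for w in words if w not in STOP_WORDS)

def top_prompts(weights: Counter, n: int = 3) -> list:
    """The n heaviest keywords would drive live queries to web-platform
    APIs for culturally/contextually relevant images, text, and sound."""
    return [word for word, _ in weights.most_common(n)]

# Simulated utterance from an ideation session.
chunk = b"we could fold the chair flat and the chair legs fold too"
weights = keyword_weights(transcribe(chunk))
print(top_prompts(weights, 2))  # the two most frequent design keywords
```

In the demonstration environment, the resulting word-cloud weights and the fetched media would be projected back into the shared ideation space as live visual prompts.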
