Fluid Transformers and Creative Analogies: Exploring Large Language Models’ Capacity for Augmenting Cross-Domain Analogical Creativity

dc.contributor.author: Ding, Zijian
dc.contributor.author: Srinivasan, Arvind
dc.contributor.author: MacNeil, Stephen
dc.contributor.author: Chan, Joel
dc.date.accessioned: 2023-09-14T19:48:22Z
dc.date.available: 2023-09-14T19:48:22Z
dc.date.issued: 2023-06-19
dc.description.abstract: Cross-domain analogical reasoning is a core creative ability that can be challenging for humans. Recent work has shown some proofs-of-concept of large language models’ (LLMs) ability to generate cross-domain analogies. However, the reliability and potential usefulness of this capacity for augmenting human creative work has received little systematic exploration. In this paper, we systematically explore LLMs’ capacity to augment cross-domain analogical reasoning. Across three studies, we found: 1) LLM-generated cross-domain analogies were frequently judged as helpful in the context of a problem reformulation task (median helpfulness rating of 4 out of 5) and frequently (∼80% of cases) led to observable changes in problem formulations, and 2) there was an upper bound of ∼25% of outputs being rated as potentially harmful, with a majority due to potentially upsetting content rather than biased or toxic content. These results demonstrate both the potential utility and the risks of LLMs for augmenting cross-domain analogical creativity.
dc.description.uri: https://doi.org/10.1145/3591196.3593516
dc.identifier: https://doi.org/10.13016/dspace/nef3-euxg
dc.identifier.citation: Zijian Ding, Arvind Srinivasan, Stephen MacNeil, and Joel Chan. 2023. Fluid Transformers and Creative Analogies: Exploring Large Language Models’ Capacity for Augmenting Cross-Domain Analogical Creativity. In Creativity and Cognition (C&C ’23), June 19–21, 2023, Virtual Event, USA. ACM, New York, NY, USA, 17 pages.
dc.identifier.uri: http://hdl.handle.net/1903/30503
dc.language.iso: en_US
dc.publisher: Association for Computing Machinery (ACM)
dc.relation.isAvailableAt: College of Information Studies
dc.relation.isAvailableAt: Information Studies
dc.relation.isAvailableAt: Digital Repository at the University of Maryland
dc.relation.isAvailableAt: University of Maryland (College Park, MD)
dc.subject: large language models
dc.subject: analogy
dc.subject: creativity support tools
dc.title: Fluid Transformers and Creative Analogies: Exploring Large Language Models’ Capacity for Augmenting Cross-Domain Analogical Creativity
dc.type: Article
local.equitableAccessSubmission: No

Files

Original bundle
Name: Ding et al.pdf
Size: 3.53 MB
Format: Adobe Portable Document Format