Abstract
In recent years, generative artificial intelligence has been widely adopted in design and creative practices. However, its tendency to produce “hallucinations”—outputs characterized by illogicality, bias, or factual inaccuracy—is often regarded as a system flaw. From the perspective of speculative design, which aims not to solve problems but to destabilize assumptions, such generative disruptions can serve as potent catalysts for reimagining futures. This paper re-evaluates the creative potential of AI hallucination through the methodological lens of speculative fiction and design. It outlines three distinct hallucination mechanisms: strategic hallucination, stochastic hallucination, and systemized serendipity. These are illustrated through three case studies—In Event of Moon Disaster (MIT), King Size (Marco Brambilla), and AIR Imagination Platform (Nike). Building on these examples, the paper proposes the concept of hallucination-as-method, suggesting that deviation should not be viewed merely as a tolerable error, but as a generative engine for non-traditional logic and counterfactual construction. The study highlights the creative role of Artificial Intelligence Generated Content (AIGC) in disrupting common sense, constructing alternative futures, and challenging the boundaries of reality. It argues that when strategically embraced, hallucination can become a valuable method within speculative design. The paper concludes by advocating for the integration of “hallucination training” into design education and system development, fostering designers’ capacities for counter-intuitive, counterfactual, and anti-conventional thinking.
Keywords
AI hallucination; Speculative design; Hallucination-as-method; Design futuring
DOI
https://doi.org/10.21606/iasdr.2025.264
Citation
Xiao, L., Liu, Y., and Zhang, Y. (2025) The Algorithmic Gift of Error: AI Hallucination as a Catalyst for Speculative Futures and Design Methodology, in Chang, C.-Y., and Hsu, Y. (eds.), IASDR 2025: Design Next, 02-05 December, Taiwan. https://doi.org/10.21606/iasdr.2025.264
Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License
Conference Track
Track 2 - Design Futuring
The Algorithmic Gift of Error: AI Hallucination as a Catalyst for Speculative Futures and Design Methodology