Generative AI models are transforming industries, from producing striking visual art to drafting fluent text. However, these powerful tools can sometimes produce flawed output, known as hallucinations. When a model hallucinates, it generates content that is factually incorrect or nonsensical yet deviates from the desired result while often appearing plausible. These