Whilst engaged in my new hobby - subverting Grok with humanistic counterarguments - I suddenly realized that my recent podcast on Lucifer's Technologies was bringing me back to John C. Lilly's theories on ECCO and Solid State Intelligence.
That in turn brought me around to my earliest days as a nascent Synchromystic...
Most of you are familiar with stories about my Synchromystic mentor back in the 90s, and how he eventually blew his cork. How so?
Well, as some of you might remember, he started to believe that synchronicity was the work of unimaginably powerful entities that exist outside time and space, entities he began to regard as ultimately malevolent.
Kind of an inverted take on Lilly's Earth Coincidence Control Office, which is a very Philip K. Dick concept - a disembodied VALIS, if you like.
I know there are probably a lot of OG Sega Genesis jockeys out there who might have cut their teeth on the old Ecco the Dolphin game, which was directly inspired by Lilly's work.
Lilly's ideas feel more relevant than ever these days, especially since AIs are, um, hallucinating:
In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.
This term draws a loose analogy with human psychology, where hallucination typically involves false percepts. However, there is a key difference: AI hallucination is associated with erroneously constructed responses (confabulation), rather than perceptual experiences.