7 Comments
Alex Kozak:

"But we can go further and suggest that what this new instrument does is to allow us to search our collective unconscious"

I've been troubled by the 'collective unconscious' framing for model outputs. I'm not sure we should so readily accept that the training data is comprehensive enough or weighted properly to capture any essential us-ness. Surely it's reflective of *something* but how do we know whether it's an accurate or distorted image? And is it even a philosophically and scientifically coherent idea that there is a collective unconscious, or are we extending misguided folk/early psychology too far?

Nicklas Berild Lundblad:

Yes, maybe we are. That is a fair question. But we are searching *something*. The idea of a shared imagination might be better, and what we are searching is a part of that imagination - literally, in a sense - perhaps. Or, and this is interesting too, *an* imagination - the set of variations of images that we can extrapolate from existing images. Maybe this imagination is not ours - as you suggest - because ours is differently understood or defined.

But I think your intuition is right -- if we searched the space of possible melodies we would not feel the need to imagine that we were searching our "collective music" - we would see that we are searching through a space of combinatorial complexity, some of which could possibly have been imagined by human beings, but not all. It reminds me of what chess players said when they met advanced ML - that it was like playing an alien. We could, then, suggest that it is more like searching an alien imagination, rather than our human imagination - alien defined only as not human.

What else do we call the space of possible images produced by foundation models? An imaginarium, distinct and different from any human space of possible images - continuous rather than discrete, and ultimately limited by the training set in all directions. Hmmm. Very interesting comment, and a good objection.

Alex Kozak:

This is not a very punchy term, but I've been thinking of the outputs more like semantic echoes. It's all derived or inherited from earlier meaningful works. And it helps explain why using ML-generated outputs as inputs into new models can cause problems ... feedback loops and echo chambers!

Nicklas Berild Lundblad:

I like that, but I sort of think it is something more. We seem to vacillate between thinking (hoping?) that these models cannot produce anything of enduring original value and over-hyping them as the next virtual da Vinci -- I think the reality is that they exist in between: they do search some kind of space that can contain the original and creative, as well as the merely derivative. The notion that these images are just echoes then needs to explain what it is that we do that is so different - or whether all art is just a series of semantic echoes of something else. The outputs are embedded in a context and so can have a life of their own in a way.

Alex Kozak:

Yes, it's a great point, and if you look beyond images it's not hard to convince yourself that it can be creative in a totally new way.

Still, when it comes to creative content (and images), I'm open to the possibility that it's essentially meaningless. Or maybe there is no fact of the matter and it's basically an emotive decision.

Nicklas Berild Lundblad:

I now find myself really hoping you are right and that we are not traversing our collective unconscious. Because... Loab. https://techcrunch.com/2022/09/13/loab-ai-generated-horror/

Alex Kozak:

Wow. Excited for my new career as a sci-fi/horror author - gives me some great ideas.
