Monument Manifold
Valley Garden
🤝
Video games with art direction and mechanics based on the works of M.C. Escher
I don’t know how to do meme formats
All fancy schmancy generative AI models know how to do is parrot what they’ve been exposed to.
A parrot can shout words that kind of make sense given context but a parrot doesn’t really understand the gravity of what it’s saying. All the parrot knows is that when it says something in response to certain phrases it usually gets rewarded with attention/food.
What a parrot says is sometimes kinda sorta correct, and sometimes it fits the conversation of the humans around it eerily well, but the parrot doesn’t always read the room. It might curse around a child, for instance, if it usually curses around its adult owners without facing any punishment. Since the parrot doesn’t understand that societal norms keep us from cursing around young people, it can handle the situation of being around a child completely wrong.
Similarly, AI lacks understanding of what it’s saying/creating. All it knows is that when it arranges pixels or words in a certain way after being given some input, it usually gets rewarded/gets to survive, and so it keeps producing sequences of words/pixels that follow a prompt correctly enough to imitate people convincingly (or that poorly performing version of itself gets replaced with another version that is more convincing).
I argue that a key aspect of consciousness is understanding the gravity and context of what you are saying: having a reason for saying or doing what you’re doing beyond “I get rewarded when I say/do this.” Yes, AI can parrot an explanation of its thought process (ELI5 prompting, etc.), but it’s just mimicking how people explain their thought processes. It’s surface-level remixing of human expression without understanding the deeper context of what it’s doing.
I do have some untested ideas as to why its understanding is only surface level, but this is pure hypothesis on my part. In essence, I believe humans are really good at extrapolating across scales of knowledge: we can understand some topics in great depth while understanding others only at a surface level, and anywhere in between those extremes. I hypothesize we are good at that because our brains have a fractal structure that lets us hold different levels of understanding, looking at some things at a very microscopic level while still considering the bigger picture and fitting that microscopic knowledge into our larger, zoomed-out understanding.
I know that neural networks aren’t fractal (self-similar across various scales) and can’t be, by design of how they learn and how data is passed through them. I hypothesize that this makes them understand only the scale at which they were trained. For the LLMs/GANs of today, that usually means a high-level overview of a lot of different fields without really knowing the finer-grained intricacies all that well (see how LLMs make up believable-sounding but completely fabricated quotes in long writing, or how GANs mess up hands and text once you zoom in a little bit).
There is definitely more research I want to do into understanding AI, and more generally into how networks that approximate fractals relate to intelligence and other things like quantum physics, sociology, astrophysics, psychology, neuroscience, how math breaks sometimes, etc.
That fractal stuff aside, this mental model of generative AI as a glorified parrot has helped me understand how AI can seem correct at first glance/zoomed out yet completely fumble the details. My hope is that this can help others understand AI’s limits better and therefore avoid putting so much trust in it that it gets the opportunity to mess up serious stuff.
Think of the parrot cursing around children without understanding what it’s doing or why it’s wrong to say those words around that particular audience.
In conclusion, I want us to awkwardly and endearingly laugh at the AIs that mimic the squawks of humans rather than take what they say as gospel or as truth.
Working on 3D fractals again (made in Incendia). I would like to make this into a necklace over at Shapeways, but I still have to figure out how to prepare models for 3D printing. If anyone has any tips, let me know.
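A minimal sketch of the kind of pre-print check I have in mind, assuming Python with the trimesh library (the file names are just placeholders, not my actual export). Services like Shapeways generally want a watertight, single-shell mesh with consistent normals and a sane physical size:

```python
import trimesh

# "fractal.obj" stands in for whatever Incendia exports; force="mesh"
# collapses a multi-object scene into a single mesh if needed.
mesh = trimesh.load("fractal.obj", force="mesh")

mesh.fix_normals()                         # make face windings consistent
trimesh.repair.fill_holes(mesh)            # try to close small gaps in the surface

print("watertight:", mesh.is_watertight)   # should be True before sending to print
print("volume:", mesh.volume)              # sanity-check the physical size/scale

mesh.export("fractal_print.stl")           # STL is widely accepted by print services
```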
Ball by Tina Backlund Newton on Sketchfab
Beautiful, where can I get my hands on one of these? :)
If you take each of the four sides of a square and modify the angle of the dragon fractal as you go up, you get this shape. I’m still fighting the 3D models to get a nice smooth version.
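For anyone curious what that looks like in the flat case, here’s a minimal sketch, assuming Python: the standard dragon-curve L-system, with the usual 90° turn exposed as a parameter, since varying that angle is the part I’m playing with (the 3D sweep up the square isn’t shown, just the 2D curve):

```python
import math

def dragon_curve(iterations=12, turn_deg=90.0, step=1.0):
    """Points of a generalized dragon curve; turn_deg=90 gives the classic dragon."""
    # Standard dragon-curve L-system: axiom "FX", X -> "X+YF+", Y -> "-FX-Y"
    s = "FX"
    rules = {"X": "X+YF+", "Y": "-FX-Y"}
    for _ in range(iterations):
        s = "".join(rules.get(c, c) for c in s)

    # Turtle-walk the string; '+' and '-' turn by turn_deg instead of a fixed 90 degrees
    x = y = heading = 0.0
    points = [(x, y)]
    for c in s:
        if c == "F":
            x += step * math.cos(math.radians(heading))
            y += step * math.sin(math.radians(heading))
            points.append((x, y))
        elif c == "+":
            heading += turn_deg
        elif c == "-":
            heading -= turn_deg
    return points

# One copy of the curve per side of the square; changing turn_deg from
# layer to layer as you go up is what twists the shape.
for angle in (90.0, 85.0, 80.0, 75.0):
    print(angle, len(dragon_curve(iterations=8, turn_deg=angle)))
```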
- Excuse me. For the fractal geometry? - At the end of the corridor.
Close-up detail of “You’ve been God.”
(via FRACTAL TEXTURE graphic t-shirt by KatisDesign)
(via ABSTRACT FRACTAL PATTERN graphic t-shirt by KatisDesign)
(via FRACTAL STAR graphic t-shirt by KatisDesign)
(via SYMMETRICAL FRACTAL graphic t-shirt by KatisDesign)
(via FRACTAL FLOWER canvas print by KatisDesign)
https://zora.co/collections/0xAD13f56d7436e7dF10B9c271DBB849caDC39fc75/1
here’s what i had to say about this nft.
whoa ok, i guess some imps that were laughing at someone's face told them in their neurons that i should make a fuckin orange thing as art. and this, my friend, is fuckin orange as can be. my dad (actually the superego in my crystalbrain) says its brownish, not orange, but i disagree with him, he's totally trying to assert his dominance in the field of color interpretation and i'm going to beat him right in the face with an orange if he does not relent and offer his apologies that this is fuckin orange as fuck. i don't know, is there like a sunset that could be as cool as this? i don't know, i think i'm going to offer an orange in exchange for my soul in the caves of some lost gods with like fucking rotten oranges on some altar cuz some guy left them there and forgot about them when he asked the orange god if he'd do stuff for him. i know this isn't cute, i know i have sixty five fucking neurons left after all of the philosophers ate them but if you give me a chance i will exchange some orange with you as a token of my eternal gratitude, i am totally going to win this, this is not a joke, you will have my friendship and an orange (not a brown).
ok, so that’s what i said about it. i have to remind u all (yes, “u” not “you”) that 61 cygni is the brightest motherfucking star in the sky, and that its also called deneb. its also a BINARY STAR which means that its dual as in if the stars had guns they could fuckin duel with each other because there’s TWO of them. however, this motherfucking nft is 1/1, so only one dude with ethereum can own it. are you still with me? reading this much stupidity requires some serious pre-interwebs attention span, so i am telling u that u must buy this NFT if u like oranges, people saying things are what they are when they’re not, the star deneb, or duality in general.
if u buy this nft, i offer oranges as a token of my friendship.
other NFTs and also free experimental music offered on this motherfucking page:
https://undefinedlabelnoise.com/
ok, crystalbrain is an insane god idiot brain that makes nft art. are you with me? crystalbrain likes the FUTURE and BRAINS and FRACTALS and making BEAUTY from out of their spacious, vacuous crystal brain.
you may buy this NFT here from zora.co
if you want to see a page that lists crystalbrain's NFTs with insane godlike rambling commentary by the brain theirself, view it here:
#nft #nftart
One of my favorites (and there are a few pictures)
I actually finished the Apollo CD design portion of my own personal project that I mentioned back when I posted the Artemis half of the project.
Of course, I’ve made Apollonian gaskets central to the design of this particular set... as well as the sun, because Apollo is a sun god.