You ever hear that old chestnut about how most people neglect the part of the story of Icarus where he also had to avoid flying too low, lest the spray of the sea soak his feathers and cause him to fall and drown? You ever think about how different the world would be if Icarus died that way instead? If the idiom was to Fly Too Close To The Sea? A warning against playing it far too safe, about not stretching your wings and soaring properly? You ever think about how Icarus died because he was happy?
trump could inadvertently win the nobel peace prize if he got those three to actually agree on something
Illuminated Manuscripts | Inspired by medieval manuscripts and embellished tales of heroism | webstore link
the thing is that childhood doesn't just end when you turn 18 or when you turn 21. it's going to end dozens of times over. your childhood pet will die. actors you loved in movies you watched as a kid will die. your grandparents will die, and then your parents will die. it's going to end dozens and dozens of times and all you can do is let it. all you can do is stand in the middle of the grocery store and stare at freezers full of microwave pizza because you've suddenly been seized by the memory of what it felt like to have a pizza party on the last day of school before summer break. which is another ending in and of itself
I'm so easily persuaded into a ship. All I need is one good piece of artwork and I'm like, yeah I see it. I approve.
One of the common mistakes I see from people relying on "AI" (LLMs and image generators) is that they think the AI they're interacting with is capable of thought and reason. It's not. This is why using AI to write essays or answer questions is a really bad idea: it's not doing so in any meaningful or thoughtful way. All it's doing is producing the statistically most likely output for the input.
This is why you can ask ChatGPT "is mayonnaise a palindrome?" and it will respond "No it's not." but then you ask "Are you sure? I think it is" and it will respond "Actually it is! Mayonnaise is spelled the same backward as it is forward"
All it's doing is trying to sound like it's providing a correct answer. It doesn't actually know what a palindrome is, and it wouldn't even if it had a function capable of checking for palindromes (which it doesn't). It's not "Artificial Intelligence" by any meaning of the term; it's just called AI because that's the name of a discipline of programming. The name doesn't inherently mean it has intelligence.
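For contrast, here's a minimal sketch of what an actual palindrome check looks like when a program encodes the rule directly instead of predicting likely-sounding text. (This is just an illustrative Python snippet, not anything ChatGPT actually runs.)

```python
# A deterministic palindrome check: the rule itself is written down,
# so the answer stays the same no matter how confidently you argue with it.
def is_palindrome(word: str) -> bool:
    cleaned = word.lower().replace(" ", "")
    return cleaned == cleaned[::-1]

print(is_palindrome("mayonnaise"))  # False, every time
print(is_palindrome("racecar"))     # True, every time
```

A language model has nothing like this inside it; it's just ranking which string of words looks most like an answer people would accept.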
So if you use an AI and expect it to produce something made with careful thought or consideration, you're gonna get fucked over. It's not even a quality issue. It just can't consistently produce things of value because there's no understanding there. It doesn't "know" because it can't "know".