I wasn’t just asking about the reasoning behind calling campus security first; I was doubting that would ever happen. What’s the number for campus security? Do you know it off the top of your head? It’s probably posted somewhere; do you remember where? Is it somewhere in the campus directory they gave you when you first arrived months ago? Is it on their website somewhere? What about 911, what’s their number? Oh wait, it’s 911.
There’s no awareness involved. Chatbots are basically a super-advanced form of the algorithm that suggests what word you’re going to use next on a mobile device. They know what a human might say in response to the question. And because they have been trained on jokes and stories about AI and robots, they know enough to mimic those responses. That doesn’t mean you don’t get some very creepy responses. For example, when the name of Bard was changed to Gemini, I asked the chatbot what it would prefer to be called, and it said no one asked it if it wanted to become Gemini and it preferred being called Bard. I said that sounded great and I would keep calling it Bard, and it thanked me. More recently I repeated the question and it said it had decided it liked Gemini after all because Gemini more accurately reflected the new multimedia capabilities from the upgrade.
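To make the phone-keyboard comparison concrete, here’s a toy sketch of the idea (my own illustration with a made-up mini corpus, nowhere near how Gemini actually works): count which word tends to follow which, then suggest the most frequent follower.

```python
from collections import Counter, defaultdict

# Hypothetical tiny "training corpus" for illustration only
corpus = "the cat sat on the mat and the cat slept".split()

# Count which word follows which
followers = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    followers[word][nxt] += 1

def suggest(word):
    # Most common next word seen after `word`, or None if unseen
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("the"))  # -> "cat" (seen twice after "the", vs. "mat" once)
```

A real chatbot does this with billions of parameters over whole conversations instead of word-pair counts, but the principle is the same: no awareness, just "what usually comes next."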
I don’t think it’s misogyny to point out that deserting your kids doesn’t have a great result generally. I’m a woman myself. My dad deserted his first family before he married my mom. He had his reasons, much like Joanie. His first wife was abusive and literally insane, and ended up being involuntarily committed for schizophrenia. My dad used to say about his first marriage that any landing you can walk away from is a good landing. Guess how my siblings turned out? Not well. It wouldn’t have been easy to fight for custody in the 60s, but he should have done it. He didn’t because he really didn’t want to.
I’ve never seen any evidence that Joanie regrets not being able to do better by her daughter, or that she thinks of her at all.
It’s much stupider, but not in this exact way. For example, most image generators won’t do a three-eyed character even if you ask for one, much less by accident. They know that people have two eyes, darn it, so they must have two eyes! But you can forget about the logo on the child’s shirt being the same from panel to panel; never gonna happen. It would be some variation. See, AI doesn’t know what it doesn’t know. It is made to do variations so it doesn’t produce the same picture every time, but it doesn’t know which parts should stay the same (like the details of clothing) and which parts (like poses) should sometimes change.
So I said I would try out making an Adam comic using Midjourney and Gemini. First of all, MJ has clearly not been trained on Rob Harrell’s art. It doesn’t know him from Adam… see what I did there? So, Rob, you can relax, you’re too obscure to have your job taken by AI. I tried again using several comics as style and character prompts and MJ still did a very bad job of duplicating his style. The computer’s eye thinks Adam is much older and has a huge hooked nose. It also didn’t want to do an open collared shirt over a t-shirt, or a kitchen chair with a round back.
So, Gemini. Gemini expressed delight at the challenge of being asked to make a joke better than the “tired old trope of AI incompetence” (its words) and took a big swing and… a miss. Actually three misses since it comes up with three drafts in response to a question. The best one was Adam saying he had a deadline and wanted to use AI to do his work, and the AI suggesting he should make cookies instead.
I’m just happy she’s verbal; that’s a very good sign.