philosophy podcast, ethics podcast. Kolby Granville

E27. "Two-Percenters" - Should we make everyone special?

Named “Top 15 Podcast” for 2020!

STORY SUMMARY: Set in the future, 2% of the population has a genetic makeup that allows them to be enhanced. The intelligent are brilliant; the beautiful are like Greek gods. Because of their enhanced abilities, they run the world. An enhanced “Social” meets with an enhanced “Rational” to tell him about a newly discovered drug that would allow the other 98% of the world to be enhanced as well, but it would cause the 2% to regress to average, or worse. The Rational takes the vial and releases it into the world. The Social kills herself.

DISCUSSION: If things in this world are so amazing, why are the 98% causing civil unrest? Should the elite naturally be left to lead others? Does being superhuman automatically make you super moral? Should the truly exceptional lead the masses? If everyone is raised up, we are right back where we started, with people fighting to be on top and not enough to go around. Do we live in a meritocracy today? Doesn’t money allow those at the top to keep their children at the top today? The only ones with no choice are the 2% after the drug is released. We also discuss whether we would release it.

BOOK LINK: Download the accompanying short story here.

MAGAZINE: Sign up for our monthly magazine and receive short stories that ask ethical and philosophical questions.

SUPPORT: Support us on Patreon.

FOLLOW: Twitter, Instagram, Facebook

“Should we make everyone special?”

Kolby, Jeremy and Ashley discuss the choices in the medical ethics science fiction short story "Two-Percenters" by CJ Erick. Subscribe.

E26. "Snitch" - When doing a moral good, do the ends justify the means?

STORY SUMMARY: A black pastor in New Orleans is trying to get a redevelopment project built for his poor community post-Katrina. Things aren’t going well. A white person was robbed and beaten up in the area, which scared the banks off from lending. The pastor goes to the local gang and pays them to keep white people safe. He also reaches out to another church group to help with protests. The white developer/partner comes and says he is going to make the project smaller, and the church will get less. The pastor goes to the black mayor, who also wants a cut of the development money for his re-election campaign. The pastor finally decides he’s had enough and calls the federal government to report corruption in the city.

DISCUSSION: This seems a very realistic portrayal of how things get done. There aren’t any clear good guys in this story, just people with codes that go with their social group. Which comes first: a code that pushes you into a group, or a group that pushes its code on you? Maybe the pastor has finally decided to stop making moral compromises and live by better ethics; maybe not.

“When doing a moral good, do the ends justify the means?”

Kolby, Jeremy and Ashley discuss the choices in the inner city development short story "Snitch" by Charles Williams. Subscribe.

E25. "Pneumadectomy" - Would you remove your soul, to save your life?

STORY SUMMARY: A young boy heads to the park to play soccer with friends, but they tease him and won’t let him play because he had his soul removed. Flash to the discovery of the soul: a doctor has modified a CAT scan machine and found the soul in the appendix. When the appendix is inflamed, sometimes the cause is medical, sometimes it is an injured soul. Regardless of the cause, the appendix can be removed and the problem resolved, and the person with no soul seems no different. The mother comforts the boy when he gets home. Later, the boy goes to a friend’s house, where the friend’s mother tells him her son died; he was having appendix issues, and they refused to have the appendix removed because they didn’t want to remove his soul.

DISCUSSION: You must first accept the premise of the story: the people in the story found the soul in the appendix. Knowing that, what good is it to have a soul in the story? Everyone seems to stay basically the same. Would you sign a piece of paper selling your soul to another person? If you wouldn’t, you must believe in a soul, regardless of what you say; otherwise, the paper means nothing and it’s free money. Would you be friends with someone without a soul? Is the soul tied to an afterlife? If so, maybe the mother who let her child die did the right thing. Should the government be allowed to impose a medically necessary procedure that a parent refuses? Courts in the US say parents can refuse treatment for their children under 12 years old, unless overridden by a court order.

“Would you remove your soul, to save your life?”

Kolby, Ashley, Jeremy and Sarah discuss the choices in the medical ethics short story "Pneumadectomy" by Harris Coverley. Subscribe.

E24. "Choose" - What if death is the only option?

STORY SUMMARY: A woman wakes up strapped to a lab table. An emotionless doctor asks her what she would do in the trolley problem. She is then graphically shown the results of her choice. A new choice: do you push people out of an over-capacity life raft to save the others? She is again graphically shown the result of her choice. This goes on for thousands of scenarios until the woman is totally exhausted from watching death. Only then does she realize she is being punished, 900 years later, for a choice she made to kill children in order to find the cure to a disease.

DISCUSSION: Loads of utilitarian questions in this story, one after another, and the scenario makes them fit very organically. Would a person really get rattled from watching all that death, or get desensitized? In Greek plays, this purging of emotion was called catharsis. It’s hard to know whether the woman is being punished for her past acts, re-educated, or used as a deterrent to others. Should a person be punished after they have died? Should they be punished longer than their lifetime, or with more than a single death?

“What if death is the only option?”

Kolby, Ashley, Jeremy and Sarah discuss the ethics and choices in the suspenseful short story "Choose" by David Whitaker. Subscribe.

E23. "Prevention" - Would you turn on your son, to save his school?

STORY SUMMARY: A single mother and her son have coffee before school. His car is in the shop, so she drives him to high school. He calls his mom later to say he left his laptop in the car. She decides to go through his laptop and finds out her son and two friends are planning on shooting up the school in just days. She searches his room and finds guns and drugs. The mother worries about how the shooting, if it happens, will affect her daughter in college, and herself. The next day she spikes her son’s morning coffee with drugs and waits for him to die in his room of an overdose. She calls the police and an ambulance. She disposes of the guns and laptop on the outskirts of town. The police suspect nothing, and her son’s death is deemed a suicide by drug overdose.

DISCUSSION: The mother is a psychopath, and her priorities are all wrong. Her first concern is her daughter, and she treats her son like a stranger. She is emotionless in killing her son, which also hints that the son’s issues might be genetic, from the mother. This is wrong behavior. She didn’t have to kill him: she could have gone to the police, taken him to his father, or gotten him treatment. There are still two remaining kids planning to shoot up the school, and she doesn’t even tell the school about them. This is a story as much about the mother’s issues as about a school shooting. Still, school shootings are now just the world we live in, the “new normal.”

“Would you turn on your son, to save his school?”

Kolby, Jeremy, and Ashley discuss the ethics and choices in the suspenseful short story "Prevention" by Margaret Karmazin. Subscribe.

E22. "An Infinite Game" - Is everyone selfish, when death is on the line?

STORY SUMMARY: Four prisoners are made to draw straws for the order they stand in a row. Their prison guard plans to push his bayonet into the first person to see how far back it goes through the line of men. The first man in line panics, runs, and is shot. The narrator talks to the man in front of him and tries to convince him not to run, so he might slow down the thrust. The heavyset man in the back thinks he is safe, but the guard changes his mind and stabs him instead. In the end, only the narrator survives.

DISCUSSION: The story focuses on morality, game theory, value theory, and infinite games. Should the heavyset man have volunteered to be first in line to slow down the blade for everyone else? Is it selfish to run and thus make those behind you more likely to die? Should the four men have simply tried to rush the guard? Does everyone find God when they are about to die? Game theory seems to work only when dealing with large numbers, not individuals. Value theory seems to say that, at the end of the day, nothing is worth more than your own life. This is an infinite game, not a finite one. The guard comes across as an armchair philosopher.

“Is everyone selfish, when death is on the line?”

Kolby, Jeremy, and Ashley discuss the ethics and choices in the game theory short story "An Infinite Game" by Dean Gessie. Subscribe.

E21. "Prohibition" - Can you blame an addict for not following the law?

STORY SUMMARY: Set in the future, an addict takes a cab to an isolated part of town. He goes to a private, illegal club to break the law; he orders meat. The club is raided by the police, who kill a patron during their interrogation of her. The meat-eating addict sneaks away, knowing he will break the law again.

DISCUSSION: A story with many layers. First, there is the contrast between the “humane” society that has banned meat eating and the brutality of the individual police. Do more serious laws allow for more brutal policing? Also, is this man protesting, or is he simply an addict? He seems to be just an addict. Are there natural rights? If so, is eating meat one of them? Does it matter whether the law was passed to protect animals or to prevent climate change? If a law you disagree with is passed, does it change your behavior? Do you leave the party where they are eating meat? Does it depend on the level of the crime being committed?

“Can you blame an addict for not following the law?”

Kolby, Jeremy, and Ashley discuss the ethics and choices in the short story "Prohibition" by David Edward Rose. Subscribe.

E20. "How The Cockroach Lost Its Voice" - When cockroaches could talk, and humans were still unhappy.

STORY SUMMARY: An older and a younger (talking) cockroach climb to the top of the highest thing, the refrigerator, to overlook their world. The older roach tells the child that the humans he sees can talk, and that they also have a “third eye” inside them that allows them to imagine the future and remember the past, and that this is what makes them unhappy all the time. An angel moth comes down and takes away the roaches’ ability to speak forever.

DISCUSSION: A children’s story, but one with a good lesson: have the ability to think about the future, but don’t let it trouble you or dwell on it. Do other animals have this “third eye”? Maybe dogs and others do, to some degree. It has made humans successful because we can remember errors and plan for future problems. It’s a trade-off, but a good one. The key is not to worry so much.

“When cockroaches could talk, and humans were still unhappy.”

Kolby, Jeremy, and Ashley discuss the ethics and choices in the children’s short story "How The Cockroach Lost Its Voice" by Samuel Reifler. Subscribe.

E19. "The Orphan's Dilemma" - Is getting a future worth forgetting the past?

STORY SUMMARY: The story takes place in the thoughts of a 16-year-old boy waiting to have his memory erased for his adoption. He thinks about going on his first date and about being teased by others. He wonders about the family that is adopting him and about having new memories implanted. Finally it’s his turn; he has decided whether to have his memory replaced, and he heads into the room to tell them his decision.

DISCUSSION: A wonderful story about a “hero’s journey” of death and rebirth. It brings up good questions about how our pain, as well as our joy, creates our personality. What kind of family would want a kid only on the condition of wiping his memories? But isn’t the goal a successful adoption, and maybe a clean slate would make that more likely? Don’t people avoid adopting dogs with “issues”? Is there a screening process? Should a family be able to select a child who isn’t “broken”? What if a free college education is included? How much are memories worth?

“Is getting a future worth forgetting the past?”

Kolby, Jeremy, and Ashley discuss the ethics and choices in the short story "The Orphan’s Dilemma" by Christopher Burrow. Subscribe.

E18. "The Book Of Approved Words" - Are there words so horrible they shouldn’t even exist?

STORY SUMMARY: An “approved” government writer gets in trouble with the governing board for writing honest movie reviews. His runaway brother comes to his house to scan his contraband collection of books, and invites him to join the rebellion by uploading an earlier edition of the book of approved words, so the population can see the words missing from the current edition.

DISCUSSION: An interesting modern twist on the typical 1984 banned-books idea. In this case, the words being banned are ones that might offend, or exclude, parts of the general population. This raises an interesting question: if you remove the words for higher levels of anger and frustration from the vocabulary, does that have the effect of pacifying the thoughts of the population? Do ideas exist without the words to express them?

“Are there words so horrible they shouldn’t even exist?”

Kolby, Jeremy, and Ashley discuss the ethics and choices in the dystopian short story "The Book Of Approved Words" by W.M. Pienton. Subscribe.

E17. "A Change Of Verbs" - What if you just said what you meant?

STORY SUMMARY: The main character is a passive university professor with a nagging wife. For some reason he decides today will be different, and he goes through the entire day saying, and doing, what he actually feels. The day goes amazingly well, with his classes and with colleagues, and he decides this will start a new chapter in his life.

DISCUSSION: Mixed reactions to this story. On the one hand, isn’t it his own fault for not saying what he meant and for letting people walk over him? Did this unhappiness come on slowly? He doesn’t treat his wife very well, and perhaps too much of his attraction to her is only physical. Is this just an extroverted writer telling introverts, “if you were more like me, you’d be happier”? Isn’t this all on a scale? You have to balance saying what you mean with being respectful of others.

“What if you just said what you meant?”

Kolby, Jeremy, and Ashley discuss the choices in the short story "A Change of Verbs" by Tom Teti. Subscribe.

E16. "Abrama's End Game" - If god told you she was ending your world, would you fight back?

STORY SUMMARY: The female god of a fantasy realm tells the inhabitants she is actually a graduate student researching AI in an MMORPG, and that she created them to see how they would change. However, the game developer is discontinuing the game because of illegal gold farming being done in-game for money laundering. The leader of the AI fights back by working with real players, trading them in-game gold for real-world tools.

DISCUSSION: How easy or hard is it to accept the concepts in this story: in-game currency with real-world value, and AI this advanced? What would it be like to meet your god and know that you are simply in their video game? Given “Moore’s Law,” would it be better to shut down the game now, before it becomes too smart to ever shut down?

“If god told you she was ending your world, would you fight back?”

Kolby, Jeremy, and Ashley discuss the ethics and choices in the fantasy science fiction short story "Abrama’s End Game" by David Shultz. Subscribe.

Transcript

Kolby:

Hi, you're listening to After Dinner Conversation, short stories for long discussions. What that means is we get short stories, we select those short stories, and then we discuss them, specifically the ethics and the morality of the choices, the characters, and the situations they put us in. Why did you do this? What makes you do this? What makes us good people? What's the nature of truth, goodness, all of that sort of stuff. And hopefully we're all better, smarter people for it and learn a little bit about why we think the way we think. So thank you for listening.

Kolby:

Hi, and welcome back once again to After Dinner Conversation, short stories for long discussions, which is a fancy way of saying we get stories that have some sort of moral or ethical question. We publish them, and then for some of them that we really like, we do a podcast and discuss them. And our hope is that you'll read the story and that you'll have the same kind of discussions with your friends about what would I do? How would it work? What's the nature of goodness or truth? Or what does it mean to be alive? Or all the things that we discuss. A lot of it's been AI stuff lately, computer stuff, except last week, which was about cannibalism. We all decided we were cannibals at heart.

Ashley:

Read the story. It'll make sense.

Kolby:

Because all meat eating is non-consensual.

Jeremy:

#Ieatpeople.

Kolby:

#Ieatpeople. Yeah.

Ashley:

Start a movement.

Kolby:

Start the movement. I am your co-host, Kolby. This is co-host Ashley wearing the-

Ashley:

Hello!

Kolby:

She's wearing the shirt today.

Ashley:

For those watching on YouTube, you can actually see the shirt. It's basically Jeremy's hashtag. So I was researching this...

Kolby:

Yeah, he says that in every episode. He's required to [crosstalk 00:01:49].

Ashley:

He's really good. He does a lot of research. He reads a lot.

Kolby:

I feel like if there was a shirt for me, it would say, I don't understand.

Ashley:

No.

Kolby:

Or, let me give you a hypothetical. Maybe that's what mine would be. I don't know.

Ashley:

So according to...

Kolby:

Yeah. Yeah. According to... And of course we're also co-hosting with Jeremy, who does our audio and does a great job, and also researches stuff, unlike the rest of us who just wing it.

Ashley:

We're wingers.

Kolby:

And we are once again sponsored by and hosted by La Gattara, the cat cafe in Tempe, Arizona, that has cats you can come visit or adopt. They've adopted out over 500, probably 600 by the time you watch this, cats, and they've always got new cats. And they run the gamut from older, sleepy cats to young, fun, spunky kittens, and they're just rockstars. This one is pretty awesome as well. And again, if you're having a good time doing this, like, subscribe, and more importantly, tell your friends. The vast majority of podcasts that people try out are not because of an advertisement, but because a friend recommended it, and it was like, yo dude, did you hear about blah, blah, blah? Ask Me Another or whatever, and they do it. So suggest it to a friend. And also, last thing: we have a book that's come out, After Dinner Conversation, Season One, which is 25 of our best short stories that ask ethical questions, with all of the discussion questions at the end of each story.

Kolby:

There are even children's stories in there. You can buy that on Amazon as an ebook or as a print book, which is super cool. I'm super jazzed about that.

Ashley:

If you have a great story, submit it. Send it on in.

Kolby:

Yeah.

Ashley:

If you love reading stories and want to help us filter it to get into our genre of ethical dilemmas, be a reader.

Kolby:

Yeah, that'd be great.

Ashley:

Tons of stories, so.

Kolby:

We send it out to readers as our initial screeners. Honestly, for about every 10 stories we get, about two or three are the kind of thing we do, the kind of ethical questions. And of those two or three, maybe one is actually-

Ashley:

Give people hope.

Kolby:

No, it's-

Ashley:

You're gonna get submitted. You're gonna be on the podcast.

Kolby:

You're gonna be different, though.

Ashley:

You're gonna be selected.

Kolby:

The point is, don't just send us writing. Send us writing that's the kind of thing that we read and publish. That helps us, and it also keeps the readers from wasting a bunch of time.

Kolby:

Okay, so our last story is Abrams. Is that...?

Jeremy:

Abrama.

Kolby:

Abrama, which apparently is a version of Abraham. Abrama's End Game. I don't actually, I forgot... written by... this is a second story by this guy, David Shultz, who did the one that was here with Jessica, that blew us away, Rainbow People of the Glittering Glade-

Jeremy:

Yes.

Kolby:

... was, up until that point, it was the best story we had read or discussed. This is equally good.

Ashley:

Mm-hmm (affirmative).

Kolby:

Yeah, solid. I don't know who is discussing this one? Is it me? Maybe. I haven't discussed one in awhile.

Ashley:

You discuss it. You're taking it.

Kolby:

I'll discuss it. So Abrama's... I'm gonna keep mispronouncing that. Abrama's End Game is about a massively multiplayer online game. And so it starts off a little bit weird in that there's a character, like an elf or somebody, walking through a village, and you're like, oh, it's a fantasy story. There's elves and dwarves and stuff. And then you realize that some of the elves and dwarves are of another world and some are not. But they all look the same. And what you then realize is, it jumps back, and you realize it's actually a character in an online game like... What was the one that you played forever, that you got lost in?

Jeremy:

World of Warcraft.

Kolby:

World of Warcraft. Yeah. But the character actually knows it's a character in the game. And other people are leaving and coming and going, and some of them are just avatars that are just idiot bots. And the thing that's so cool about it is, the reason this one particular bot in the game is so smart is that the game developers opened up the code to allow researchers to experiment with AI by creating individual AI bots in the game, to learn and write papers about how AI would react in a real world. And in this case-

Ashley:

Simulation.

Kolby:

... in a simulated world.

Ashley:

Yeah.

Kolby:

Right. 'Cause it's a constructed world where you can try out theory-

Jeremy:

Right.

Kolby:

... of how AI interacts with real people.

Jeremy:

Right. And I feel like this is the next step.

Kolby:

Oh, I think this is in the [crosstalk 00:06:08] near future.

Jeremy:

Like with World of Warcraft, researchers were analyzing the data.

Kolby:

Yeah.

Jeremy:

Especially when there were events that acted like virus outbreaks.

Kolby:

Yeah. The virus outbreak, the one that killed everybody in World of Warcraft. Yeah.

Jeremy:

Killed all the NPCs.

Kolby:

Yeah.

Jeremy:

That was pretty cool.

Kolby:

Because it spread just like a virus. Yeah. Yeah. And so ultimately what happens is, because people are gold farming and it's in violation of some cryptocurrency law that's been passed, the government is going to shut down the game. And the-

Ashley:

Well, they see it as a threat, because the game has a currency called GP, or gold points. This can be exchanged anonymously. It's a free market, totally anonymous, no way of tracking. They can trade with US dollars at an exchange rate of a thousand gold points, or GP, per $7. So this exchange is happening, and now the game itself, Land of Legends, has a $2 billion GDP. And they're like, we have to regulate this.

Kolby:

And of course the implication is... So let's say I wanted to do something illegal, buy drugs or deal in whatever horrible things, I could go into the game. With real currency, I could buy the in-game currency. I could then trade that in-game currency with another person-

Jeremy:

For real world currency.

Kolby:

... for real world currency. They'd cash it out again, and they would then give me the thing I wanted.

Jeremy:

It's money laundering.

Kolby:

And so it would be a perfect way to either buy or sell illegal things or to do money laundering because it's untrackable money, right?

Ashley:

Legends wasn't intended to function as a perfect digital black market, but it guarantees anonymity, a stable exchange rate, and encrypted transactions. And it's super popular.

Kolby:

Right.

Ashley:

It made the system illegal. It's illegal, technically, but-

Kolby:

There's no way to track the sales. And so, the government steps in to shut it down because it's being used for black market stuff, and it's illegal, and there's no way to track sales.

Ashley:

It's like, how do we tax people? Well, you can't.

Kolby:

Right. And so the AI finds this out and wants to defend its realm, defend itself from having its servers turned off. And so the thing... and there's so many clever things about this story. The super clever thing the AI does is use its in-game currency to interact with real people and ask the real people to do real things in the real world.

Ashley:

They ask these rebels-

Kolby:

Yeah, like a hacker group.

Ashley:

... to get some dirt so if they try to shut us down, we can blackmail them.

Kolby:

Right. And so-

Jeremy:

That's a very clever way they do it.

Kolby:

Right. And so they create basically a dead man's switch, where the real people do work in trade for the fake gold, because they can cash it out. And then the bots, the AI in the game, now have essentially government secrets, and they ping the server every hour. And if the pinging ever stops, everything gets decrypt-

Jeremy:

Decrypted and released.

Kolby:

... and all the government's secrets are out there, right? And so they've essentially made it so the government can't shut off the server. So they've basically defended their realm by interacting with real people and making them do work. And then the government tries to step in, and there's a battle in the third act. The government is not successful, and there's essentially a stalemate where the government can no longer shut down these servers.

Jeremy:

And so they go in and sign a peace agreement.

Kolby:

Yeah, a virtual peace agreement, and then the story is over. And I will tell you, man, I think five to seven years. I think this is super near future.

Ashley:

This story is so complex, because not only do you have the players in the game... like, this is their world. This is their life. They don't know that they're basically in a video game. You have the researcher who's implanting these people and doing research studies on them. You've got the government: we've got to regulate this, we've got to tax them somehow. You've got rebels who are like, this is a way for us to make more money, using it for bad purposes. And then you've got people who just want to play their video game. It sounds harmless. It's a video game. How do you tax... what'd they say? They wanted to tax a character.

Jeremy:

Yeah, the value of character.

Ashley:

How much should the US government tax imaginary creatures? And so it's like what? It's a world within a world, and they successfully defend themselves. It's just so interesting.

Kolby:

So, Jeremy, you played a lot of... 'cause I wrote a law school paper on EVE Online, about contracts and partnership agreements that are de facto partnership agreements in online worlds. You played a lot more than I did, not just EVE Online but also World of Warcraft.

Jeremy:

Just those two mainly.

Kolby:

Yeah. What was your thought on reading this as a person who actually has put hundreds of hours, at least [crosstalk 00:11:04].

Jeremy:

Yeah, I mean, from a technical point of view, it seems like the idea... I mean, we have... EVE Online is a good example where there is an official exchange rate.

Kolby:

And they actually permit money to go in and out of their currency in the game.

Jeremy:

Absolutely.

Kolby:

Yeah.

Jeremy:

So there's an official exchange rate.

Kolby:

I remember you cost me money when you got my ship blown up. It cost me like 10 bucks. I had to go back and buy more currency to buy a new ship.

Jeremy:

Yeah.

Ashley:

Blowing up ships.

Kolby:

No, he did. He honestly did that. I used my real US money to buy in-game currency, I think it's called ISK. I then bought a ship and it was like, all right, Jeremy, let's go raiding. And he made some silly mistake. He popped out of a portal too early, and I popped out too late or whatever. And I got blown up instantly.

Ashley:

Oh, no.

Kolby:

And my first thought was like, dude, you just cost me 10 bucks.
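Kolby's ten-dollar lesson is just an exchange-rate calculation. EVE's actual rates fluctuate, so the flat rate below is a made-up number purely so the arithmetic matches the anecdote:

```python
ISK_PER_USD = 10_000  # assumed flat rate, for illustration only


def buy_isk(usd: float) -> float:
    """In-game currency received for a real-money purchase."""
    return usd * ISK_PER_USD


def loss_in_usd(ship_cost_isk: float) -> float:
    """Real-dollar value destroyed when a ship is blown up."""
    return ship_cost_isk / ISK_PER_USD


# A $10 purchase buys a 100,000 ISK ship; lose the ship, lose the $10.
```

The point the hosts circle back to later is that once an official exchange rate exists, in-game losses are real losses by any definition that also covers casino chips.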

Ashley:

So, okay. So I had a roommate who... to get through grad school, he was a big-time online gamer, like would play for 48 hours straight. And he would build up these characters and sell them. And that's how he paid his way through grad school. It is literally a trade. He has a skillset. Why not? There is a supply and demand. People want to have these higher end characters because they love the game and want to build up more and blah blah blah. And there's a service-

Jeremy:

It's something for World of Warcraft they've had to address and there's a whole system now that helps you build characters faster, or you come in at a higher level with the newer versions.

Ashley:

But that totally just blew away all their hardcore people that made a career out of building characters. That was the way to utilize it.

Jeremy:

And they moved to something else.

Ashley:

Yeah.

Kolby:

One of the questions in here is about how like... So you and I, and I think Ashley, all think this is not farfetched.

Jeremy:

Right. And some of it's gonna happen sooner than other parts of it.

Kolby:

I think the AI-

Jeremy:

It's going to take the longest.

Kolby:

Yeah. But I think using online games to test AI-

Jeremy:

Absolutely.

Kolby:

Even rudimentary, I think that's not far away at all, if it's not already happening. The question I'm curious to hear your opinion on, what do you think about the sort of bias that... Since you and I have played these games and we understand them... So if you were to go and talk to someone else, someone who hasn't played online games, why do you think there's this idea of, oh, that's just stupid. People pay real money for fake gold. You think that they can defend themselves? How do you think that comes about that our... We bring ourselves to this. I think there's a lot of people who would think this is just hooey, which is probably the word they would use if they thought that.

Jeremy:

Right. The idea of going in and exchanging money for a fake currency.

Kolby:

For just fakeness. And it's all fake. Everything's fake. The people are fake.

Jeremy:

Then what about casinos where you go in and buy chips?

Ashley:

Yep. It's a video game.

Kolby:

Man, you totally just ended my question right there. Wow. It's a good thing these are headset mics and now [crosstalk 00:13:51] drop. You would've just dropped that mic. Yeah.

Ashley:

Just drop a cat. Just kidding. They land on all fours.

Kolby:

'Cause they are. They're just little blue chips that we just decide have value, which is the same thing with paper money.

Jeremy:

Exactly.

Kolby:

It's just little green pieces of paper we decide have value.

Ashley:

Yeah.

Jeremy:

So online currency is basically the same thing.

Ashley:

One of the people in the story, her reason for defending using online currency is that part of her research was that the software agents, the people that she created... They're equal participants and their behavior can be made to approximate human participants. It's kind of an economic Turing test in a way, conducted through virtual marketing activity. So it's a way of them testing human behavior. Like how much research is she getting out of this? I'm like, that's super valuable. Is there another way to do that, or is there not another way to do that?

Kolby:

I mean, here's the thing, right. You can do it with an AI bot in a game because you don't have to worry about creating a body.

Ashley:

Yeah.

Kolby:

Right. You don't have to create the android. You can just create the software and so it's a much easier way, and you can reiterate faster.

Ashley:

And she said she's done thousands of these characters, and then she's deactivated some of them as a failed product. For me, that was the most interesting.

Kolby:

The deactivating?

Ashley:

No. The fact that she's running this experiment and then towards the end she goes to Abrama and is like, I am your creator. I can talk to her. And I'm just like, wow. It's like this real living being and here's like her God. Like this is my child. This is my child, and we'll figure out something together. I just thought that was so awesome. I know. That's me geeking out right now.

Kolby:

No, I totally get that geeking out.

Ashley:

I thought it was cool. It's like I created this life and then look at them go.

Kolby:

And she felt an obligation to tell them like, hey, your end is coming.

Ashley:

Exactly.

Kolby:

Like if you're gonna do heroin, do it now.

Ashley:

No, no, but it's this-

Kolby:

It's always about heroin.

Ashley:

Her intentions for me were just so pure, and she understands there's by-product badness of it, but again, it's this, does the good outweigh the bad? Like, hey, all the good of the research-

Kolby:

So let me ask you this, Ashley. Let's say this really happened. So let's say that somebody came to you and was like, hey, just so you know, this is all a game. You were an experiment of mine. We're shutting it down pretty soon. Would-

Kolby:

Hi, this is Kolby and you are listening to After Dinner Conversation, short stories for long discussions. But you already knew that, didn't you? If you'd like to support what we do at After Dinner Conversation, head on over to our Patreon Page at patreon.com/afterdinnerconversation. That's right. For as little as $5 a month, you can support thoughtful conversations like the one you're listening to and as an added incentive for being a Patreon supporter, you'll get early access to new short stories and ad-free podcasts, meaning you'll never have to listen to this blurb again. At higher levels of support, you'll be able to vote on which short stories become podcast discussions, and you'll even be able to submit questions for us to discuss during these podcasts. Thank you for listening and thank you for being the kind of person that supports thoughtful discussion.

Kolby:

Let's say this really happened. So let's say that somebody came to you and was like, hey, just so you know, this is all a game. You were an experiment of mine. We're shutting it down pretty soon. One, would you believe them?

Ashley:

No.

Kolby:

Okay, so there's nothing they would say-

Ashley:

It would be like putting on a pair of glasses.

Jeremy:

It'd be a very Matrix moment.

Ashley:

And I'd be like, wait, what? Hold on. Let me put my glasses on and off, on and off. I would probably act just like they did. We need to assemble. We need to educate. We need to form... We need to-

Kolby:

Oddly enough, that's-

Ashley:

... protect-

Kolby:

...one of my only faults with this story is the thing you're talking about: how quickly the main character believes.

Ashley:

Yeah.

Jeremy:

Right.

Ashley:

Well, she knew. She knew all along, though. She knew there's outsiders and then there's us.

Kolby:

Oh, that's right because she could-

Ashley:

She knew.

Kolby:

She could somehow see the difference in behavior.

Ashley:

Because she's a higher-

Kolby:

That's right.

Ashley:

... end algorithm.

Jeremy:

Well, and learned their language and-

Ashley:

Yeah.

Kolby:

That's a great point.

Ashley:

So it wasn't like she was just like... but the way that all the other characters fully believed her... granted, she did have this higher authority. She was a queen. People believed what she said, that sort of a thing.

Kolby:

So would anything change then, assuming somebody told you that? Would you change your behavior at all, knowing that you weren't real but you felt real?

Ashley:

The thing is, wasn't the concept, if you're one of these characters and you died off, you were dead? Or would they repopulate? I guess that's-

Jeremy:

I don't think it was really explained very well.

Kolby:

In real games, you would repopulate, but in this game I think you probably are just dead.

Jeremy:

You don't respawn.

Kolby:

I think only the real people respawn.

Ashley:

Yeah.

Kolby:

But the game players don't.

Ashley:

So I would go about being like, no, my life is-

Kolby:

It's still my life.

Ashley:

I have meaning. I don't want to die.

Kolby:

Okay. And that wouldn't change your behavior?

Ashley:

I would be much more suspicious of the outsiders and try to grab more information of what's happening in the outside world. Who's ruling me? Who's this person?

Kolby:

Elon Musk has a whole thing where he thinks... He's talked about it. I think it's a 50-50 chance that we're just someone else's simulation.

Jeremy:

It's greater than 50-50 is what he says.

Kolby:

Oh, really?

Jeremy:

Right. Well, if you look at it in a statistical sense, it's a much higher percentage chance that we're living in a-

Kolby:

Someone else's created world.

Jeremy:

Yes.

Kolby:

That's depressing. What did you think-

Ashley:

So really? Oh yeah, he thinks we're all-

Kolby:

He thinks there's a more than 50% chance that we're in someone else's simulation.

Ashley:

Interesting.

Kolby:

What did you think of the Moore's law question? The question was, assuming Moore's law applies to AI, should that concern us? That if we can make something as smart as us... so let's say in the case of Abrama, she's relatively... I think it's a woman, right?

Ashley:

Yeah.

Jeremy:

Yeah.

Kolby:

She's relatively smart in this. She's able to defend her own place, which means in 18 more months, she's going to be twice as smart.
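The doubling math Kolby is gesturing at is just exponential growth. Treating "smartness" as a single number that doubles every 18 months is a cartoon of Moore's law (which is about transistor counts, not intelligence), but the arithmetic looks like this:

```python
def capability(initial: float, months: float, doubling_months: float = 18) -> float:
    """Capability after `months`, doubling every `doubling_months` months."""
    return initial * 2 ** (months / doubling_months)


# 18 months: 2x. Five years (60 months): roughly 10x.
```

Whatever the real growth curve for AI turns out to be, the hosts' point survives any choice of doubling period: an agent on an exponential curve does not stay "relatively smart" for long.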

Jeremy:

Right. I think it goes back to, again, something we've discussed previously is, as AI are developed, one of the driving factors of that should be a continued connection to the humanity which creates them and a reliable set of guidelines for what their behavior is.

Ashley:

Yeah.

Kolby:

Yeah.

Ashley:

Well that was the thing in one of the stories we talked... Yeah, I Think Therefore I Am. So the other question from before was, 'cause I can respond back in Chinese and all that other stuff. But anyway, I'm losing my thought, but continue.

Jeremy:

And also this gets to a motivation sense. At the end of the story, they're given their land through this treaty.

Ashley:

Yeah, their virtual land. Right.

Jeremy:

They've signed the covenant with God and Abraham now has a world or a land-

Kolby:

I didn't even get that, but you're exactly right.

Jeremy:

Exactly.

Kolby:

They exactly sign the covenant with God. Oh, my God, how did I miss that?

Jeremy:

So-

Kolby:

And the guy's name is... ah, yeah. No, I'm sorry, David Schultz.

Ashley:

Oh, my God, all these hints.

Kolby:

I totally should've gotten that. Oh, my God.

Jeremy:

But if you look at motivations... so what are their motivations? 'Cause from a human perspective, we have a motivation for... our basic motivations-

Kolby:

Eat, have sex.

Jeremy:

Right, so-

Ashley:

Drink.

Jeremy:

An AI presumably lives forever, can't reproduce. What are your motivations?

Kolby:

Yeah, it's playing the long game. So that was the other thing I thought about with this story is, the story assumes that all that they wanted was to be allowed to exist.

Jeremy:

Right.

Kolby:

But it's like now they know that they can work in exchange with real people to manipulate the outside world. Why wouldn't they also want to rule?

Jeremy:

Well, and you get that very much at the end, too.

Kolby:

Right. For now.

Jeremy:

We've signed this treaty. Continue gathering dirt. We need more information-

Ashley:

Yes. We need more leverage 'cause they're going to figure out a way to-

Kolby:

So I'm of the opinion... Honestly, I think the government should have just taken the hit and shut them down. Because you're talking about-

Jeremy:

Right. 'Cause there is the potential-

Kolby:

You're talking about something that lives on an infinite timeframe and will get infinitely smarter. You will eventually be working for it.

Ashley:

I have a way to solve it. You make the games so unappealing. You tax the people to play the game. So you can't control the currency, but if you want to play the game, now you have to pay $100 a day.

Kolby:

We're just going to Nerf all of your gear.

Ashley:

Make the game basically obsolete. Make the game so no one wants to play it.

Jeremy:

But you still can't shut it down. Oh, so then they don't have any access to the outside.

Kolby:

So make it EVE Online is what you're telling me. Make it-

Ashley:

No, no, no. Make it so that-

Jeremy:

Make it-

Ashley:

So the game's not even popular anymore.

Jeremy:

'Cause that's not EVE Online. That's still a popular game.

Kolby:

[crosstalk 00:22:28] unplayable.

Jeremy:

Ultimately, that failed. I'm sure there are many-

Kolby:

Just make the game unplayable.

Jeremy:

[crosstalk 00:22:31] that failed.

Ashley:

Just make it so no one even wants to play it. And then guess what? Then, oh, we've got all these secrets. It's like, sorry, no one's coming to your land. You can just live off in the silence.

Kolby:

Well, that was the other thing, right, is the game only wanted to exist... And there's a hint that they're eventually going to want more.

Jeremy:

Yes.

Kolby:

I think they're only going to get smarter, and I think you have to bite the bullet. And one of the questions is, do you think turning off the game would be genocide? I actually do think it would be genocide, and I think it would be acceptable genocide.

Jeremy:

So you're the ruddy ape?

Ashley:

From the previous story. Yeah.

Kolby:

Wow.

Ashley:

Yeah. How would you feel if-

Kolby:

Which totally goes against what I said in the last group. That's two mic drops. No, you're right. Because they are self-contained, amoral, and don't harm me.

Jeremy:

But they have the potential to harm you.

Ashley:

They have dirt on you.

Jeremy:

Right. They do have the potential to harm the government.

Kolby:

Maybe that's the solution. Maybe the solution is disconnecting them from the internet. Like being like, we're going to-

Jeremy:

Oh, you can have this world.

Kolby:

But we're not allowing our people into it.

Ashley:

Yeah, exactly.

Jeremy:

Yeah.

Ashley:

Like you're just going to live in your own little island.

Jeremy:

That's what they did with Moriarty in Star Trek.

Kolby:

Right. So here's the point I was getting at: I don't know why they would stop wanting to exist. Why wouldn't they say like, hey, you're our coder. We're going to drop all this information unless you make it so that our game is light 24 hours a day instead of having seasons, unless... we think the fact that this sword is only a plus 12 sword... We want it to be a plus 14. So why don't they start to use their weapons to manipulate their game in a way that they can construct their own world, right?

Jeremy:

Absolutely. You know what? I would like to see this expanded as a larger story.

Kolby:

I would so watch this movie. I'd watch this movie in a heartbeat.

Ashley:

What about the rat nine group? So rat nine is-

Kolby:

I love the names, by the way.

Ashley:

Are these actual-

Kolby:

Those are really names. They've got to be real hacker names.

Ashley:

So they're literally a group of hackers who play this game, who-

Kolby:

They're the ones who dug up the dirt.

Ashley:

They're the ones that dug up the dirt-

Jeremy:

And built the dead man switch [crosstalk 00:24:35].

Ashley:

... and have this pact with the gaming people to... like got them the dirt and stuff like that. I think that's really kind of interesting. It's like-

Jeremy:

It's super clever.

Ashley:

At the end of the day, who's the key piece that feeds them the dirt? This rat nine group.

Kolby:

Yeah. But that's the thing you could regulate, right? Like you could create laws saying that you can't sell them anything. Right? You could make the sale to-

Jeremy:

It would be hard to enforce, but.

Kolby:

Yeah. Like the same way it is with North Korea, right? It's impossible to enforce but you could make that rule.

Ashley:

Yes. How are they going to enforce it? You can't make an exchange. It's all anonymous and all encrypted.

Kolby:

Yeah.

Ashley:

They don't know who's going where and how much.

Kolby:

Yeah, that's a good call. All right, so last question since we are about to get kicked out of the cat cafe, which has been great hosting us. A couple of things. First, I'm going to ask my question and then a couple of things. Question number three was, does Descartes' statement in this context, "I think therefore I am", apply? Jeremy, yes or no? Does it apply?

Jeremy:

I think in this context because the way the AI is presented is, not only is it a program with parameters, but the character is presented as capable of thought in that way.

Ashley:

She's totally autonomous.

Jeremy:

Metacognition, understanding her environment and reacting to it.

Kolby:

I understand that I understand.

Jeremy:

Yeah.

Ashley:

Now I want to know. Is she continually growing?

Kolby:

It's the Moore's law question.

Ashley:

If you get a set of-

Kolby:

She's gonna get a lot smarter.

Ashley:

... if/then questions and then does that keep evolving? Is she-

Jeremy:

Presumably. But, again, depends on how the AI was built.

Kolby:

Do you agree with [crosstalk 00:26:03] on this one? The "I think therefore I am". Do you think this person is alive by Descartes' definition?

Ashley:

I am more lenient to the fact that she's alive because she had, again, this connection. I know we talked about it from a couple stories ago, but I feel this connection with her and part of that's because of her creator, the whatever her name is. Just the way that they communicate with one another, I have more empathy towards Abrama.

Kolby:

So you're saying it could be genocide to turn off software?

Jeremy:

Yes.

Ashley:

Yeah.

Kolby:

Wow.

Jeremy:

Eventually, not currently.

Kolby:

I agree. I agree with you. I'm just surprised that I... I thought this would be one where I was by myself, but, no. Because I just watched a lot of Star Trek. All right, that's a... Yeah, there you go.

Ashley:

Care about your characters. Don't let them die in the video games. They actually matter. The computer simulator people are like, where did he go? Bloop. Oh no, I lost my friend.

Jeremy:

Right.

Kolby:

Well, I thought it was interesting that the character in the game distinguished between sentient and non-sentient characters.

Jeremy:

Right. That there were lower level AIs that were just there as NPC guides.

Kolby:

They had to answer a series of 50 questions with 50 answers. Your princess is in the other castle.

Jeremy:

Right, right.
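The low-level NPCs Kolby and Jeremy are contrasting with Abrama are essentially a lookup table: a fixed set of questions mapped to canned answers, nothing more. The particular questions and answers below are made up for illustration:

```python
# A non-sentient NPC as the story sketches it: fixed canned responses
# plus a single fallback line for everything else.
CANNED_ANSWERS = {
    "where is the princess?": "Your princess is in another castle.",
    "where can i buy a sword?": "See the blacksmith by the east gate.",
}


def npc_reply(question: str) -> str:
    """Answer from the fixed table, or the fallback for anything else."""
    return CANNED_ANSWERS.get(question.strip().lower(), "I cannot help you with that.")
```

An agent like this can never step outside its table, which is exactly the test the story uses to separate scripted guides from the sentient characters.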

Kolby:

Okay. So you've been listening to After Dinner Conversation, short stories for long discussions, where we get people to submit stories. We select the ones we love that ask great ethical and philosophical questions. We then publish them. We then discuss some of them in these podcasts. This, by the way, concludes Season One, and it's our last episode at the cat cafe, not because they haven't been great hosts. They have been wonderful hosts and great sponsors. You should adopt a cat if you're in Tempe, Arizona and come here. But just simply because Ashley and I are moving to Southeast Asia, and so our next one... We're going to call it Season Two; it will be from on the road. It'll be in Southeast Asia somewhere.

Kolby:

Jeremy will be visiting us or calling in, hopefully visiting us some.

Jeremy:

Both. We'll do that.

Ashley:

And we may be back here. Who knows how long we're gonna be gone?

Kolby:

But that might be for Season Three. Who knows? But yeah, so this ends Season One. The Season One book is now out, I am quite sure. So if you go to Amazon and look up After Dinner Conversation, Season One, there's a book with our best 25 stories. You can download it, and all the discussion questions are there. You can see which ones have podcasts. You can listen to the podcast, read them, talk to your friends, like, and subscribe.

Ashley:

Share.

Kolby:

Share this. Let people know that this matters. That means the world to us. And if you want to buy a shirt, the shirts are for sale. And thus ends the plethora of plugging.

Ashley:

Yay. Go team!

Kolby:

And thank you for joining us. We are now at our 16th episode. It's a thing.

Ashley:

We did it.

Kolby:

We thank you so much. Bye.

Ashley:

If you've enjoyed listening to this, please like and subscribe. It helps us out a ton. The vast majority of people listening haven't liked and subscribed, which means maybe it shows up in your algorithm, maybe it doesn't. So don't leave that to chance. Just go ahead and hit that button, and we'd sure appreciate that. And that way we can keep doing what we're doing, and you're not left to the whims of some algorithm. Thanks.

* * *


E15. "Ruddy Apes And Cannibals" - If a civilized cannibal invited you to dinner, would you attend?

Named “Top 15 Podcast” for 2020!

STORY SUMMARY: Explorers find a remote island of civilized cannibals. The cannibals are much more technologically advanced, already having mastered teleportation and space travel. They have a debate to try and come to terms with the cannibals. The explorers are so offended they leave, but when they do, they leave several nukes and destroy the civilized cannibals.

DISCUSSION: Where do we draw the line on what animals we eat? Is all meat that we eat that didn’t agree to it an act of violence? Do you have a right to destroy those who have offensive values?

BOOK LINK: Download the accompanying short story here.

MAGAZINE: Sign up for our monthly magazine and receive short stories that ask ethical and philosophical questions.

SUPPORT: Support us on Patreon.

FOLLOW: Twitter, Instagram, Facebook

“If a civilized cannibal invited you to dinner, would you attend?”

Kolby, Jeremy, and Ashley discuss the ethics and choices in the alternative history short story "Ruddy Apes And Cannibals" by Shikhandin. Subscribe.

Transcription Provided by Transcriptions Fast

Ruddy Apes and Cannibals

by Shikhandin

(music)

Kolby: Hi, you’re listening to After Dinner Conversation, short stories for long discussions. What that means is we get short stories, we select those short stories, and then we discuss them, specifically about the ethics and the morality of the choices the characters and the situations put us in. Why did you do this? What makes you do this? What makes us good people? What’s the nature of truth? Goodness? All of that sort of stuff. And hopefully we’ll all better, smarter people for it, and learn a little bit about why we think the way we think. So, thank you for listening.

(music)

Kolby: Hi, and welcome back once again to After Dinner Conversation, where we have short stories, where we discuss them and talk about the morality and ethics of what’s going on in the short stories that are published on our website, AfterDinnerConversation.com, and/or on Amazon; there’s lots of places you can get them. I’m your co-host Kolby, here with co-host Jeremy.

Jeremy: Hi.

Kolby: And co-host Ashley.

Ashley: Hello.

Kolby: And we are once again at La Gattara cat café where, that’s why I’m a little distracted right now, there are 2 cats on our table. If you’re watching the YouTube video you can see that; it’s a white one with brown, and there’s a brown one. You notice they never talk about the type, the species of cat, like they do with dogs?

Ashley: Like brown one, black one.

Kolby: Right, because it’s not like this is the smart cat or this is a cat that is a water cat.

(laughter)

Ashley: This is the golden retriever, he retrieves. No, this is just a freaking cat. A cat does what cats do.

Jeremy: A house cat.

Kolby: What kind of cat is it? It chases the laser pointer. It poos in the litterbox. At any rate, you can come here and you can get a cat. And they’ve been great hosts for us for our, now, 15th episode, and they’ve just been amazing. If you’re ever in Tempe, Arizona, definitely come by. For $10 you can have a coffee and hang out with cats, and for a little bit more, you can bring one home with you. It’s a way to kind of test-drive the cats.

Ashley: Or if you can’t have a cat, but want to get your cat fix in.

Kolby: Also, Jeremy had one in our last thing and I forgot to mention it, we made Jeremy a special shirt that says, “So I was researching that.” I made one for myself; I’m not wearing his shirt. And in the process of doing it, I was like, “Hey, we should sell these. That’s like a thing.” So, if you want to buy an After Dinner Conversation shirt, you can do that. Just go to our website and there’s a spot on there where you can buy merchandise now. I’m sure our sales will be in the ones.

(laughter)

Kolby: Maybe the twos? But whatever. My mom will buy one. It’ll be awesome. Feel free to buy one. They’re not nasty shirts, they’re not cotton shirts or like the tri-blend polyester ones.

Ashley: Buy the shirt, buy one for all of your friends, read one of these stories, have a discussion and then watch the podcast. I’m just saying. And then send us a picture or video of it, and you will get a free copy of our book that is out or should be out.

Kolby: It will be out by now for sure. After Dinner Conversation Season 1 will be out on Amazon for download or for a paper copy. It’ll have stories, 25 of the best stories that we’ve done, as well as the questions that go with them. There are even children’s stories in there if you want to have some children’s stories. It’s pretty rock star. You don’t pet the cat that way, you’ll get scratched. And so, let’s do our first story. Rudy…

Jeremy: Ruddy Apes and Cannibals.

Kolby: I’ll just leave it to you man. You do it Jeremy.

Jeremy: By Shikhandin

Kolby: Okay, we don’t know how to pronounce it. It’s not his real name anyways, it’s his pen name.

Jeremy: Nom de plume. Alright, so this is a familiar story at first. It’s the story of English explorers discovering an island paradise and the inevitable conflict that arises because of their clash of cultures.

Kolby: I hate the word discovered. It implies that people weren’t there before you bumped into them.

Jeremy: Exactly.

Kolby: Discovered America? No, America had people, you just didn’t know they were there. Sorry. It’s just my little pet peeve, but I understand what you meant.

Jeremy: In this particular story, the island is very remote and appears to be discovered in the 1940s or 50s. The explorers, or ruddy apes in this case, discover an island of civilized cannibals. Much of the story is spent describing the cannibals and their seemingly utopian society: no dogmatic religion, responsible use of natural resources and nuclear energy, compulsory education, advanced medicine, a successful space program, a happy and progressive culture, no overpopulation.

Kolby: That’s like utopia.

Jeremy: I know, I did say, they’re seemingly utopian society.

Kolby: Oh, sorry.

(laughter)

Kolby: I got distracted.

Ashley: Distracted by a cat.

Kolby: Sorry. I’m listening. Focus. Focus. Focus.

Jeremy: Some time is also spent discussing how their cannibalism works, in that it’s consensual. There are humans who are raised for eating, and they’re revered individuals who decide when they want to be eaten. Regular citizens can offer themselves as well, which is seen as a great honor both for them and for the people eating them.

Kolby: This is the opposite of the story we had way back when. When they eat the fat kid.

Jeremy: “This I Do For You”.

Kolby: “This I Do For You”. Yeah, where…

Jeremy: The fat kid.

(laughter)

Kolby: They, like, they fatten the kid up to ginormous in case there’s a famine.

Ashley: He wasn’t revered, he was like “Oh, we don’t talk about you.”

Jeremy: Yeah, keep him in a room.

Kolby: This is not like that. That’s a good call.

Jeremy: Because the cannibals are so content in this utopian society, they don’t see the ruddy apes as a threat in any way. And when their cannibalism is discovered, the ruddy apes turn hostile and want to civilize the cannibals. The result of this is a trial where the ruddy apes bring in lots of vehicles and machines to build a big fort on the cannibals’ island where they can hold a trial. This trial consists of the ruddy apes lording over the cannibals and trying to convince them that their cannibalism is wrong. The crux of this is the discussion over how the cannibals believe that eating fellow humans is an act inspired by love and respect for others, and that the hopes, dreams, loves, ideas, and deeds of all people live on after their body has died. So, at the end of this trial, the cannibals tell the ruddy apes to leave the island, like, “We’re done with you.”

Kolby: The trial doesn’t go well.

Jeremy: The trial doesn’t go well. And the cannibals tell the ruddy apes, “It’s time for you guys to leave. We’re done with you.” So, the ruddy apes pack up all their stuff and go, and as soon as they’re off the island, they explode the atomic bombs that they’ve hidden on the island, destroying the island.

Kolby: By the way, the cannibals are so technologically advanced, they could have wiped out everybody at any time.

Jeremy: They were like, “We’re just going to teleport them away.”

Kolby: But they’re so civilized, they choose not to.

Jeremy: Right, they don’t believe in just indiscriminate killing.

Kolby: And it’s the non-cannibals that do believe in the indiscriminate killing.

Ashley: They’re super-duper advanced, they explore the stars, they go to other planets, they’re space nerds.

Kolby: Yeah.

Ashley: They’re like, super smart. Yet, they eat themselves, well, people.

Kolby: Each other.

Jeremy: Most of the cannibals perish, but the narrator promises that many survived and are living among us or living out in space.

Kolby: Okay. You have some thoughts as you read it?

Jeremy: Yeah, it’s an interesting story.

Kolby: Okay.

Jeremy: I do like that the cannibals are presented very much as the civilized culture in this story. The ruddy apes, while they do have some advanced technology, are very warlike, and they’re even presented in very much a way that they sail around, and any cultures they run into, they appropriate and take all their stuff.

Kolby: In my mind, they were just stand ins for British colonizers.

Jeremy: Absolutely.

Kolby: They call them ruddy apes, but it could just be British Colonizers and Cannibals, could have been the name of the story.

Jeremy: Yeah. And that’s what I said, it feels very similar in that what are the stories? You know which ones I mean?

Kolby: I really don’t.

(laughter)

Kolby: You pull books out of your butt, that I’m just like, “Man, nobody read that book but you and the dude’s mom.”

Jeremy: The Bounty.

Kolby: Okay. That one I have read.

Jeremy: But I feel like there’s a series of Master and Commander, and even Pirates of the Caribbean. All these English explorers…

Kolby: Finding cannibals. Finding indigenous people.

Ashley: So, in this story, the theory is there’s 2 different types of people and they live amongst themselves in a way. They end up mixing at the end?

Kolby: Something like that.

Ashley: So, it’s like you have this one super civilized society and this other elementary, newer-ish one.

(Cat hisses)

Ashley: Whoa, unhappy kitty. Do not snuggle near me. That’s what that one said to the other one. The beginning paragraph is very interesting. It’s like, “Does the rain remember vapor? Does vapor remember rain? Yet both were the other in their past lives. If you told their stories to each other, would they even comprehend? And does that mean their stories aren’t necessarily unimportant or implausible?” So, it’s like a primitive human talking to a today human, and if you’d swap stories, would you believe one truly was the other? We’re the same?

Kolby: So, it reminded me a little bit, you’d probably know who said this, Jeremy, but there was some science fiction writer who said that a sufficiently technologically advanced civilization will always seem like magic.

Jeremy: Any sufficiently advanced technology will just seem like magic.

Kolby: Right. And I think that’s true in the sense that if you took someone from the 1400s and showed them how a modern gun worked, it would be like magic, or whatever the case may be, right? That something goes from unimaginable to imaginable to possible to real. And I think that’s the thing they’re talking about with these people… because they’re at such different points on the spectrum, would they recognize they’re still both human? And for me, I’m just going to keep calling them the British because it’s easier for me to remember, it seems like the British, the ruddy apes, wouldn’t recognize that the cannibals are who they are as well. The cannibals, I think, are sophisticated enough to understand that, like, “Yeah, we may have been like them one day except for these other choices.”

Jeremy: They say that in the beginning. They both came from the same place. And they understand this and recognize their past but the ruddy apes don’t.

Ashley: So, the cannibals are so sophisticated, not only technologically, but mentally and emotionally.

Kolby: Even culturally it seems like.

Ashley: That they didn’t even blow up the ruddy apes, they were just like, “You need to leave.” They could have annihilated them, and they’re like, “Yo bro, just go. You’re not even worth my time. You’re so insignificant and…”

Kolby: They weren’t even worth the trouble to transport off the island. Because they had transporters like in Star Trek or something.

Jeremy: They can teleport them.

Kolby: They weren’t even worth teleporting off the island because it took energy. That’s how insignificant you were to me.

Jeremy: Just go.

Kolby: So, I’ll tell you one of the things I thought was really fascinating about this, Jeremy, you and I had talked about this a little bit, we try not to talk about the stories beforehand but we do anyway, and that is that the person being eaten, that the cannibal eats, the other human, chooses to be eaten, and so it is a voluntary thing, and so in that sense, it’s not an act of violence, it’s…

Jeremy: It’s a consensual act.

Kolby: It’s a consensual act, and I think the thing that was interesting to me that had never occurred, literally never occurred to me until I’d read this, is, does that mean, that every other meat you eat, because it’s not conscious and it’s not consensual, is a violent act? Like, if I eat chicken…

Ashley: Ohhh, yeah.

Kolby: If I eat chicken, because the chicken wasn’t like, “I’m cool with this,” I’ve committed a violent act by eating chicken, or eating beef, or eating fish, or eating anything that was alive, and the only thing I really should be allowed to eat in a consensual way is something that is sentient enough to understand and choose to be eaten.

Jeremy: I think a lot of vegans would agree with that.

Ashley: The problem is…

Kolby: But they wouldn’t be cannibals, that’s for sure. No vegan would be a cannibal.

Ashley: Are their options like, eat humans or eat plants? Literally? If it’s only consent, then yeah. They’d be like, “I only eat human meat, and that’s the only type of meat, because they can give me consent.”

Kolby: And this is going to seem so stupid in some way, but that made so much sense to me when I read it and thought about it. It was like, “Yeah, I should probably be a vegetarian.”

Ashley: I’m going to take this to the real extreme. You keep talking about things that are alive and not alive… like, plants are alive. I’m just saying. A blueberry is alive when it’s on the tree. And it’s like, “Oh no, I’m going to kill you too.”

Kolby: I just read a thing a little while ago, that plants let out some sound…

Ashley: Yes, when they’re in pain! They do!

Kolby: But at some point, I got to eat something.

Ashley: Yeah, I know.

Kolby: I don’t know what a tofu looks like, but I’d eat it.

Jeremy: That’s still a plant.

Kolby: Oh, is it?

Ashley: So that brings us to our second question: the islanders are cannibals. They do so because they like the way human meat tastes. Is that a good enough reason to eat human meat? Is that a good enough reason to eat meat in general?

Kolby: Yes!

Jeremy: Meat in general, certainly.

Ashley: I’m going to be honest, when I was younger, I questioned why we don’t eat people. Like, we were eating meat, and then I learned muscle was meat, and I was like, “Why don’t we eat ourselves?” And people are like, “What?” I was a kid.

Kolby: I bet you never had that thought about dog.

(laughter)

Ashley: This is the thing…

Kolby: Because you’d never eat a dog because they’re adorable.

Ashley: How do you draw the line? We go chicken? Yes. Pig? Yes. Cow? Yes. Horse? No. There’s this…

Kolby: Dog, which is just as smart as pig. No. Cat? No.

Jeremy: Because they have a personality.

Kolby: Personality goes a long way.

Ashley: If you break it down to the most basic…

Kolby: It’s just calories.

Ashley: It’s muscle, it is just fibers, it is just meat. It’s muscle.

Jeremy: Yes, and that’s kind of their point in the end.

Kolby: So, are you okay with cannibalism? Did you just decide that? Did you convince yourself?

Jeremy: It’s called “long pig” when you eat it. It’s not called people.

Ashley: Really?

Kolby: Is that really? Did you research that?

(points to shirt)

Kolby: I was researching this.

(laughter)

Kolby: Did you research that, if you eat people it’s called long pig?

Jeremy: In some cultures.

Kolby: Really?

Ashley: So, I see it as a waste of meat. Okay, break it down now, you die, you decompose in the environment, your nutrients are released, they supply like a new plant which you end up eating that. Isn’t it like the water you drink has passed through 4 people before you actually drink it? Wasn’t it?

Kolby: The statistic was in California, the water from the mountain is drunk and peed and purified 7 times before it makes it to the ocean.

Ashley: Yes.

Kolby: Because they continually recycle the water.

Ashley: And someone has to. And at the end of the day, my matter is still here, it just gets transformed into a new thing.

Jeremy: So, in a homeopathic sense, we’re all cannibals.

Ashley: Yeah.

Kolby: Oh.

Ashley: Yeah, but in general, I, in a very simplistic realm, meat is meat. I get it. It’s not seen as appropriate to eat another human. I’m not saying I would, I’m not saying I’m promoting that.

Kolby: I’m not going to be on the raft with you.

Ashley: But in the… people have done that. Look at the Donner Party. Yes, we eat each other because meat is meat.

Kolby: You know that guy opened a restaurant later?

Ashley: Really?

Kolby: Yeah, one of the Donner guys that lived opened a restaurant and that’s how he retired. And it was like a novelty to go eat at the Donner restaurant.

Ashley: So, I’m just saying, when push comes to shove…

Kolby: Meat is just calories.

Ashley: … meat is just meat.

Kolby: Yeah, so would you say that the only reason we aren’t cannibals is because it’s just a sort of social construct?

Ashley: That is the only reason.

Kolby: Hmmmm.

Jeremy: Because in our culture, you don’t want to be on the eaten end.

Kolby: You don’t want to be on the grandma end of that. I assume you eat old people.

Ashley: Oh my gosh.

Kolby: Well, nobody is going to have baby veal, that’s just mean. You gotta wait for the old people. I’m surprised you guys took that stance. I thought you guys would be more anti-cannibal.

Ashley: Don’t eat people? Why? Are you anti-cannibal? Would you not eat people?

Kolby: No, I’m totally not. But here’s the thing, I’d eat everything. And it’s totally not acceptable, I don’t have… I like… we’ve talked about this, when I lived in China, I went to a restaurant that served dog and had dog soup. And I held out for like a year before I went. The restaurant did perfectly fine business. And at the end of the year, I was like, “Look, it’s a cultural thing...”

Jeremy: Got to try it.

Kolby: “… I wouldn’t eat it in America. But I’m here, I’ll eat it.” And I will say, I was really repulsed by it. Just because that cultural sort of structure made me really struggle, and it tasted like dog. The smell, like when you have a wet dog that’s been out in the rain. Imagine if that’s what food tasted like. It was not good.

Ashley: So, that line between what animal you eat and what you don’t eat on the spectrum is moveable, that’s all I see it as. It’s like, one country moves it a little bit farther, other countries move it a little bit back.

Kolby: I’m good with everything from cockroaches to people, in my mind.

Ashley: Yeah.

Kolby: Maggots. Maggots to people, I’d be cool with. Protein, man. Unless, unless, you go to the consent part. And if you believe that you can only eat that which consents to be eaten…

Ashley: Then it’s only people.

Kolby: Then it’s only people, and it’s nothing else, and that makes perfect sense to me too.

Jeremy: But it’s a good case for the technology where we’re trying to grow meat on a lattice.

Kolby: I’d be fine with that. Actually, I was just at a restaurant yesterday that had the Beyond Burgers or whatever they’re called.

Ashley: The fake burgers.

Jeremy: Yeah, they’re plant anyways.

Ashley: Chemicals.

Kolby: Yeah, plant with a bunch of chemicals in them.  I’m not sure they’re good for the environment, but they’re not meat.

Jeremy: Or good for you.

Kolby: Yeah. They’re going to find out they cause cancer someday.

Ashley: So, if you, obviously you kind of sort of answered this, if you’re visiting the islanders….

Kolby: I love how you get us to the questions.

Ashley: Well, it’s a good segue point.

Jeremy: Because we’re not.

(Laughter)

Ashley: Because you’re like, “I would eat the dog.”

Kolby: I’m like, “Let’s talk here, let’s talk here, let’s talk over here.”

Ashley: If I was visiting the islanders, would I eat the human meat they gave me to eat? I would.

Kolby: Really? I would not have guessed that’s your answer based on our other podcasts.

Ashley: Well, it’s their culture.

Kolby: No, I’m totally cool with that.

Ashley: It’s their cultural thing, and how disrespectful would it be. They’re like, “This is our delicacy, this is our grandest human who just sacrificed for you.”

Jeremy: And by that rationale, in another culture where they’re eating bugs, you would eat what they’re giving you.

Ashley: Yes.

(music)

___________________________________________________________________________________

Hi, this is Kolby, and you are listening to After Dinner Conversation, short stories for long discussions. But you already knew that, didn’t you? If you’d like to support what we do at After Dinner Conversation, head on over to our Patreon page at Patreon.com/afterdinnerconversation. That’s right, for as little as $5 a month, you can support thoughtful conversations, like the one you’re listening to. And as an added incentive for being a Patreon supporter, you’ll get early access to new short stories and ad-free podcasts. Meaning, you’ll never have to listen to this blurb again. At higher levels of support, you’ll be able to vote on which short stories become podcast discussions, and you’ll even be able to submit questions for us to discuss during these podcasts. Thank you for listening, and thank you for being the kind of person that supports thoughtful discussion.

(music)

Kolby: No, I’m totally cool with that.

Ashley: It’s their cultural thing, and how disrespectful would it be. They’re like, “This is our delicacy, this is our grandest human who just sacrificed for you.”

Jeremy: And by that rationale, in another culture where they’re eating bugs, you would eat what they’re giving you.

Ashley: Yes.

Kolby: But here’s the part where I thought you wouldn’t say that, because I thought we had some podcast…

Ashley: It also depends on how it’s presented.

Jeremy: There’s that.

(laughter)

Kolby: … But here’s the thing. This is why I thought you wouldn’t go that way. Number one: because I know there’s no way you’d ever eat dog, so if you went to this island with the cannibals, but they weren’t cannibals, they were dog eaters, you’d be like, “No, I can’t eat dog.”

Ashley: If they gave me food, and I ate it, and I didn’t know what it was, I would eat it.

Kolby: Right, it tastes like chicken.

Ashley: I would eat it.

Kolby: Really? You’d keep eating it after you found out what it was, if it was okay?

Jeremy: If it was okay? Certainly.

Ashley: Yeah, I would not want to be disrespectful.

Kolby: I thought maybe you like dogs more than people.

Ashley: I had friends be like, “Here, you need to try this.” And I’m like, “What is it?” And they’re like, “We’ll tell you after you eat it.” And I’m like, “Okay.” And they’re like, “That was alligator.” And I’m like, “okay.”

Jeremy: It’s fine.

Kolby: The other reason I thought you wouldn’t fall in this category of being okay with this, is because I thought we had some discussion like 10 podcasts ago, it’s all blending together now, about prostitution or about something like that, that’s socially acceptable in other cultures and you were like, “No.”

Ashley: That was because I wanted them to be more, like, use their brains not their bodies.

Kolby: So, it wasn’t about the cultural issue…

Ashley: It was about women empowerment. Like, that’s the best, you use your physical attributes, you’re so much more than just your physical.

Kolby: There could be guy prostitutes.

Jeremy: It’s the same scenario.

Ashley: They could do so much more. It’s the same situation. No, I actually…

Kolby: That makes more sense to me the way you’ve described it now than the way I heard it the first time, like 10 episodes ago.

Ashley: The cannibals’ logic and why they eat each other, it all makes sense to me. I’m not going to rebut their way of life. Now, do I want to start a revolution and get our society to change to be cannibals? No. That’s not my goal. That’s not what I want to see society change to be. But I understand their logic within this.

Jeremy: It’s an interesting discussion.

Kolby: Jeremy, you were okay with that? You’d eat the people meat too?

Jeremy: In that scenario, on the island that’s what they’re doing. Especially if they told you “This is the person. This is how we’re honoring them.”

Kolby: You’d eaten a little bit already and it tasted fine.

Jeremy: Yeah.

Kolby: Huh. Then why do you think that the British/ruddy apes…

Jeremy: It’s easy to say that.

Kolby: Yeah, fair. You know, I actually just wrote questions for a story that hasn’t been published yet, and one of the questions was, “Is it fair for you to decide what’s right and wrong in this person’s situation, given that you’ve never been in this situation?” It’s the classic, and actually it’ll be in the book that’s coming out, it’s the classic question of, “Knock on the door, hide me from the Nazis, I’m a Jew.” And the person doesn’t. They turn them in. And one of the questions is, “Was it right to turn them in or not?” And the next question is, “Do you have the right to judge someone else for a situation that you’ve never been in, and never had to deal with their thing?” So, we’re all saying we’d eat the humans, but when we’re there, we might balk.

Jeremy: Right.

Ashley: Yeah. And that’s, should we do anything to change or fix a society which we generally find immoral?

Kolby: You’re saying that’s a non-applicable question to this one because you don’t find it immoral?

Ashley: I say go for it. I’m not protesting eating dog in China. I understand that’s their vibe, that’s their thing, cool, you go do that, I’m going to be over here. Or I’ll be over there…. But the only thing that starts to get me is when they’re over-fishing.

Kolby: Making something extinct.

Ashley: That’s where I’m like, “Whoa, slow your roll there. I get it, you’re okay with eating XYZ, but not if it’s to the detriment of killing off, like, an entire whale population.”

Kolby: What about the other thing that question talks about. The idea of… where is it again… is it okay to destroy… do you have an obligation to destroy something you find immoral? So, let’s assume, for the sake of argument, you find cannibalism to be immoral. It would be perhaps one of the most immoral things possible if you found it immoral. I don’t know what’s worse than cannibalism if you consider cannibalism bad. Do you then have an obligation to stomp that out? To destroy that civilization? To educate or die, basically? If you find it immoral.

Ashley: This country has been living in their own little island, living their best life…

Kolby: You’ve got a lot to say about this story, don’t you?

Ashley: They are not impacting me. I would like to have a relationship with them. They got cool technology and stuff. But, if they’re that far advanced over me, there’s something they know that I don’t know. So how about we just discuss ideas and like you do you, you go eat your people.

Kolby: I’m going to ask you again, I’m going to switch it up, because that’s an easy stance to take until you switch it up. So, let’s say that we find a bunch of heroin people that are like, “Look, I’m perfectly happy doing heroin. I’m going to die or whatever.” And we’re like, “Hey, your heroin addiction is affecting our society.” And they’re like, “Well, go put us somewhere it doesn’t.” And we’re like, “Oh, okay.” And so, we go stick them in Australia, and we’re like, “Hey, go do heroin in Australia, you can have this island,” assuming there are no other people in Australia.

Ashley: Okay.

Kolby: Or some like, you know, some spot.

Jeremy: Sure.

Kolby: Are you okay with being like, “Look, I think heroin’s terrible, I think you’re ruining your life, I think this life is bad for you, I think you’re going to die at a young age and have rotten teeth, but it’s not affecting me. Good luck with that.”

Ashley: You cannot make an addict no longer be an addict until they want to no longer be an addict.

Kolby: Wow, you’re starting to sound more and more like a libertarian.

Ashley: But it’s true though. You can bring a horse to water but you can’t make them drink until that horse wants to drink. It’s the same thing. Like, yeah, you can do interventions.

Kolby: You just shove its face in the water.

Ashley: You can do interventions, you can put them in therapy, but until they’re ready to change….

Kolby: Same thing with the cannibals.

Ashley: … same thing with smoking, same thing with cannibals, same thing with drugs, same thing with, I think, like you’ve got to want to change yourself, otherwise it doesn’t work.

Kolby: Jeremy?

Jeremy: I don’t know. This is an interesting conversation alongside the behavioral therapy conversation from the last story.

Kolby: Right! Because she didn’t want to change herself. They forced her to change.

Ashley: Because she wouldn’t conform to society’s norm of not killing people.

Kolby: They were going to traumatize her in a war zone to change her behavior.

Jeremy: Yes.

Ashley: That’s the thing…. She’s going to be affecting other people, though. If you put her with a whole bunch of other serial killers on an island together and they’re not going to touch me, or….

Jeremy: That is some reality TV right there.

(laughter)

Ashley: I am good with it. What’s the Hunger Games? They’re all trying to kill each other?

Kolby: Voluntary Hunger Games. I’ve always wanted to be a serial killer and now I’ve got the chance.

Ashley: Exactly. I’m like, ya know, as long as I will not be affected or….

Jeremy: Isn’t that The Running Man, basically criminals….

Kolby: I was thinking of that Japanese one where they’re all on the island. So, what do you think, Jeremy? Do you think you have an obligation to either modify or stomp out inherently grossly immoral behavior?

Jeremy: That is a big question. I personally don’t think that, in this particular scenario, but there are a lot of factors to it.

Kolby: Number 1, does it affect other people, like Ashley mentioned.

Jeremy: Right. And in this scenario, it’s very consensual. It affects other people, but…

Ashley: They’re consenting to it to begin with anyway. And it’s not affecting these British ruddy apes.

Kolby: They are literally an island unto themselves.

Ashley: Now (cough) pardon me. Say those people were living within the ruddy ape’s society, they just had their own part of town…

Kolby: That’s a different story in your mind.

Ashley: That’s a completely different situation. Again, how will their cannibalism affect the ruddy apes? Are they influencing the ruddy apes? Are they like “Hey, our kids look over the wall and they see what you’re doing and that makes our kids go crazy and we don’t want that?”

Kolby: And they become cannibals.

Ashley: Yeah. Then that becomes a whole different situation.

Kolby: That’s like the heroin addiction thing. So, if you can stick them all on a Pacific island, you’re good with that.

Ashley: Sure. Until they’re ready. I’d be like, “Yo, when you’re ready, let me know.”

Kolby: Yeah, no, that’s fair. I’m going to go back to Jeremy soon because he gave us a cop-out answer.

(laughter)

Ashley: Oh goodness.

Kolby: I feel like once an episode I ask you a question and you’re like, “It depends.”

Jeremy: It does depend.

Ashley: It always depends.

Kolby: So, let me go further along. What are the factors that it depends on?

Jeremy: Okay, does it affect other people.

Kolby: Sure.

Jeremy: Does it…

Ashley: Affect the environment? Like, are we all going to die of pollution?

Kolby: Like, if the ruddy apes were burning coal.

Ashley: Yeah.

Kolby: That’s a problem.

Jeremy: That’s a problem, they should make them change.

Kolby: But if it’s something that doesn’t affect other, non-consenting people, that can be isolated, then all immoral behavior is acceptable and you have no obligation to it? As far as you can think of it right now.

Jeremy: Okay.

Kolby: I might come up with some example, unless you can think of one.

Jeremy: China, for a long time, has been harvesting organs from prisoners.

Kolby: I didn’t know that. That’s clever.

Jeremy: And selling them to Westerners.

Kolby: That’s the Chinese-iest thing I’ve ever heard.

Ashley: Wow.

Kolby: That is a communist country right there, man.

Jeremy: So? Do we have a moral obligation to make them change?

Kolby: No, but of course we might have a moral obligation not to buy those organs.

Ashley: Yeah.

Kolby: Okay.

Jeremy: Which would force the change, if nobody is buying.

Kolby: So, this is the thing I liked about this story, in that it creates the perfect scenario for exactly the thing you guys are talking about. Something that, assuming you find it immoral, like the ruddy apes do, is perfectly immoral. It’s the most perfectly immoral thing ever, but it is also the most perfectly contained immorality possible.

Jeremy: Right.

Ashley: So, let’s apply this to society here. Instead of putting people in jail or sending them to rehab, how about, “This is the section where all the killers go,” this is the section… you parcel out an area where they just live. You’re getting sent to the new state of Heroin-a.

(laughter)

Kolby: Heroin-a? That’s where the heroin users go?

Ashley: You’re getting sent to “I-want-to-kill-you-ville.” Like, you get sent to, “Can’t-stop-smoking-don’t-care-about-my-lungs-city”.

Kolby: “Lung-cancer-ville”

Ashley: And they just live in their little pockets until they’re ready to, like… is that what we need? Is that something we need to look at instead of sending them to jail where they’re restricted of all their whatever? Be like, “Hey, are you wanting to change?” First of all, “Are you wanting to change?” If the answer’s no, then okay…

Kolby: We’ve got an island for you.

Ashley: We got an island for you.

Jeremy: You like to smoke weed, move to Denver

(laughter)

Kolby: We got a state for you. Yeah.

Ashley: Or would that create even worse environments because you’re putting people with the same thoughts and destructive behavior…

Kolby: The trend, of course, is universalism and not separatism. The world is becoming… its language and currency and morals and values over the last 200-300 years are becoming more…

Jeremy: Homogeneous.

Kolby: …homogeneous, not more diverse, because it’s just easier to travel, and it’s easier to intermarry, and it’s easier to gather information.

Ashley: So, perfect situation, where will our lines fall as everyone’s able to travel? You traveled, you tried dog, your tipping point changed a little bit compared to normal American society. So, as we continue to mix cultures and things get mooshed together, how will that tipping point change?

Kolby: I don’t know. I think…

Ashley: Do you think? I mean, we don’t have the extremes, not that I’m aware of, of any cannibalism society, but we do have wishy-washy dietary rules about what’s permissible, and then, as research shows, people turn vegan for health reasons. What’s the newest, there’s another, I’m sorry I keep referencing Netflix, there’s the Netflix show about athletes, where they take their blood serum after eating meat and there’s all this foamy, thick serum, whereas if they have a plant-based diet, their plasma’s clear. I haven’t watched it yet, but I’ve heard it plenty of times. But again, for health reasons, people are making these dietary changes. So, is it going to be society changing? People intermixing? Is it going to be research-based? Is it going to be something that happens, like, “Oh my gosh, people are dying because of it”? “More people are getting cancer, now everyone has to change their diet”? It’d be interesting to see what’s going to force that change as we all become one, like homogeneous in our travels and what we find permissible and not. Look at the United States, pot used to be illegal, and now, as society bends a little bit, okay, now that’s permissible. How much more, and what way are we going to swing in other things?

Kolby: I’m looking at our questions, I wanted to tie this back a little bit. The last question on our list is…

Ashley: Offer substitutions for cannibalism. So again…

Kolby: I think the conclusion is, we don’t have a substitute for the cannibalism that would make us change our minds. As long as it didn’t affect other people.

Jeremy: Correct.

Kolby: I think we’re all good libertarians is what I think. I’m really wracking my brain trying to come up with one and I can’t.

Ashley: I find it very honorable and great that people are like, “I want to save you.” I get it…

Jeremy: I think we could come back to pedophilia.

Kolby: Everything comes back to pedophilia. That’s like the trump card in the deck.

Jeremy: But that can’t be consensual.

Kolby: That’s true. That’s a great point. And that’s one of these things, I think they mention this in the story, I think you can’t be eaten until you’re old enough to be eaten or something, right? You have to be a certain age, you can’t be a six-year-old being like, “I want to be eaten.” There’s some sort of age requirement, I assume, or something for that process.

Ashley: So, I guess if it involved children. And again, what’s the age of consent? They just bumped up the age of cigarettes now.

Kolby: It’s 21.

Ashley: And now it’s like, I’m okay to fight for my country at 18 but I can’t buy cigarettes until I’m 21?

Kolby: I think that would apply universally to all of this. So, if someone was like, “Hey, I want to move to cannibal island,” and you’re like, “Look, you’re 9 years old, you’re my kid, you don’t do that. You’re my kid, you’re my responsibility until you’re an adult, you don’t get to choose your, what is it called, what were we saying, your immorality. You don’t get to choose your immorality until you’re an adult.” Which I agree with too. But 21, I guess, is the age now if you want to start getting guns and playing Russian roulette. Russian Roulette Island, that’s the island you go to. Because eventually there’d be just one person on the island. Think of how cheap housing would be. Okay. I think we’re at a good stopping point here.

Jeremy: Just going to go downhill from there.

Kolby: Going to go downhill from there.

Ashley: If you need us, we’ll be on Fear Factor next. We’ll eat anything you put in front of us that’s morally acceptable and gives us consent.

Kolby: You know, we’re going to Thailand soon, and I’m going to put some maggot whatever meal in front of you and…

Ashley: Even if it grosses me out. I don’t care if it’s a vegetable that grosses me out. It has to be presentable.

Jeremy: There are vegetables that gross people out.

Kolby: You just have to not know what it is, that’s all.

Ashley: And it has to look presentable, I will eat it.

Jeremy: And taste good.

Kolby: Okay, we’re going to put that to the test. So, we have completed another episode of After Dinner Conversation, short stories for long discussions, where we talk about stories, like “Ruddy Apes,” and ask, what’s the ethics and what’s the morality of this? It’s basically the trolley problem in short story form. If you’ve got a story you’d like to submit, you can do that on our website, Afterdinnerconversation.com.

Ashley: You know what you should do? You should ask your friends, just point blank, starting off a conversation, “Would you eat another human?” And see what they say, give them this story, have them watch this podcast, share…

Kolby: That’d be so cool.

Ashley: Share on your Facebook or Instagram, be like, “I would eat people.”

Kolby: #Ieatpeople

Ashley: And then literally post the story, and the podcast…

Kolby: That would be amazing.

Ashley: That’s an attention grabber right there. See what people comment. See if anyone changed their mind.

Jeremy: Start posting recipes for long pig.

Kolby: You know what?

Ashley: See if they would change their mind? Check in with yourself, read the story, and then check your answer.

Kolby: So, I’m going to do this. After this podcast comes out, I’m going to look for the hashtag on Instagram and Twitter, and maybe Facebook, I don’t think Facebook allows you to do it though, and the hashtag is #ieatpeople. #ieatpeople. If you read this story and you want to discuss with your friends.

Ashley: That’s going to go a different way if you’re...

Kolby: And I’m curious if anyone hashtags that. I would totally be fascinated by that, if somebody hashtagged that. #ieatpeople.

(laughter)

Kolby: Submit stories, talk about the stories, recommend the podcast. Also, feel free to buy a shirt, although you can’t buy this one because we made this one special for Jeremy, but we got other ones.

Ashley: I’ll mention it’s the nice cotton-poly blend, so it’s not the scratchy cotton, it’s the nice cotton.

Kolby: If you’re in Tempe, Arizona, come visit La Gattara. Come check out the cats, support them. They’re actually up to 572 cats they’ve adopted out.

(clapping)

Ashley: Kitties getting homes.

Kolby: Kitties getting homes. And keep listening, "like", "subscribe". We like all that great stuff and we love doing this. Thank you for listening. Bye.


E14. "Give The Robot The Impossible Job!" - Can teaching methods go too far when murder is on the line?

STORY SUMMARY: In the distant future where all teaching is done by robots, a robot is given a special chance. If it can teach a little girl who is showing early warning signs of becoming a killer when she grows up, it can be retired to the robot equivalent of heaven. If it fails, it will be decommissioned. The robot has access to all teaching methodologies and determines the only way to change the girl's behavior is to give her the most extreme examples of her killing ideas, so as to offend even the little girl's morals. After several attempts it doesn't appear to be working, until an actual killer breaks into her house and nearly kills her own mother.

DISCUSSION: Assuming the "go so extreme it offends everyone" teaching technique really works, should it be used? Should you expose budding killers to crimes so horrible they offend even them? Are there some teaching techniques that are off limits, even if they actually work? Is it okay to fail at teaching someone to break a thought process, knowing that failure will cause them to go to jail, or hurt others?

BOOK LINK: Download the accompanying short story here.

MAGAZINE: Sign up for our monthly magazine and receive short stories that ask ethical and philosophical questions.

SUPPORT: Support us on Patreon.

FOLLOW: Twitter, Instagram, Facebook

“Can teaching methods go too far when murder is on the line?”

Kolby, Jeremy, and Ashley discuss the ethics and choices in the science fiction AI short story "Give The Robot The Impossible Job!" by Michael Rook. Subscribe.

Transcript (By: Transcriptions Fast)

Give the Robot the Impossible Job!  -- by Michael Rook

(music)

Kolby: Hi, you're listening to After Dinner Conversation, short stories for long discussions. What that means is we get short stories, we select those short stories, and then we discuss them, specifically the ethics and the morality of the choices the characters and the situations put us in. Why did you do this? What makes you do this? What makes us good people? What's the nature of truth? Goodness? All of that sort of stuff. And hopefully we'll all be better, smarter people for it, and learn a little bit about why we think the way we think. So, thank you for listening.

(music)

Kolby: Welcome back to After Dinner Conversation, short stories for long discussions, where we take the short stories that are published on the website and on Amazon through After Dinner Conversation, and we pick some of the best ones, and we discuss them, we talk about the morality and the ethics about the stories. Really with the focus on, sort of, ideally, often the classical sort of questions of what is the nature of humanity, what’s the nature of life, what does it mean to be good or moral, all of those sort of things, not just like, “He should be dating her.” (Finger snap)

(Laughter)

Kolby: Actually, like the deeper sort of stuff that we get into. And we have a great time doing it and the hope is that you’ll download and watch these, read the books, talk to friends, and have the same kind of conversations we’re having and maybe come to different conclusions, that’s totally fine too. We are, once again, in La Gattara café. And every time I say that…

Jeremy: Cat café.

Kolby: …Cat café. I always superimpose the logo in the top right-hand corner, so I feel like I can say, like.

Ashley: This corner.

Kolby: Cat café. One of the corners. I don’t know which corner.

Ashley: But, Tempe, Arizona. It’s a place where there’s a bunch of cats that are up for adoption and they’re literally just chillin’ in this place that looks like a home. There’s like 15 cats, you can come just hang out, chill, get to know the cat, be like, “Hey, I love you, let’s go home together.”

Kolby: Right. It’s like a rent to own program.

(laughter)

Jeremy: You rent to adopt.

Kolby: You come 2 or 3 times, pay a couple of bucks, hang out with the cats, and then eventually leave with a cat.

Ashley: Or if you can’t have a cat, it’s a great way to come and still get your cat fix.

Kolby: If your significant other's allergic to cats.

Ashley: Ugh, I don’t know what you’re talking about (sniffle). I’m actually slightly allergic to cats, so me being here, by the end of the day, I’m like, “(sniffle) Hi guys.”

(laughter)

Kolby: So, this is Ashley, one of the co-hosts.

Ashley: Hello.

Kolby: I am Kolby. And Jeremy.

Jeremy: Hi.

Kolby: And we are on, we’ve got to be up like episode 14 now.

Jeremy: Something.

Kolby: Yeah.

Ashley: Getting up there.

Kolby: Actually, I almost forgot, the anthology probably is coming out, or has come out, by the time this comes out. It is 25 of our best short stories, many of which we did podcasts about. And it's shaping up to be a thick, 300-page anthology of all these great short stories with all the discussion questions at the end, so you can go on Amazon and buy that as well.

Jeremy: Excellent.

Kolby: And if you've got something that you think would fit our format, you can email it to us. Go to afterdinnerconversation.com and you can email us. We get a lot of submissions now. I was telling Jeremy yesterday we've had a backlog of 100+ submissions for 3 months because so many come in. But we've got a group of readers, and you could also be a reader if you'd like, who are sorting through them.

Ashley: So, keep them coming. Great writing. Great writing.  

Kolby: And if yours is selected, it’ll get published. And if it’s one of the ones that gets published, it’s got a 50-50 chance of being one of the ones we do a podcast about. Okay.  So, this week is “Give the Robot the Impossible Job!” I think it’s called.

Ashley: Who’s it written by?

Kolby: Rook something. Jeremy you’ve got the thing up.

Jeremy: Michael Rook.

Kolby: “Give the Robot the Impossible Job!” by Michael Rook. And I will tell you, so I’d read this one, I was actually the reader on this one and selected it, and it’s the first thing I read in a while, since I was in 9th grade, where I was like, “Man, this is really smart.”

(laughter)

Kolby: “This is smarter than I am.”

Ashley: So smart to the point where you had to message back to him to bring in, basically, subtitles, or what are they called?

Jeremy: Footnotes.

Ashley: Footnotes, to describe certain things.

Kolby: I had the author put in footnotes because I was like, "Look, I need help."

Ashley: So, what this story’s about, and why it’s so complicated…

Kolby: It’s great though. It’s great.

Ashley: Oh, it's a fantastic…. Once you get the premise, it's smooth sailing. Don't be put off by the first couple of pages. Definitely read the footnotes, but don't get too caught up in the details. You'll kind of fit into it after a couple of pages. So, the premise is there are robots that live among people, in the human world. To give you a little idea, these robots…

Kolby: This is not the near future; this is the distant future.

Ashley: The distant future, yeah.

Kolby: This is 60, 80, 100 years in the future.

Ashley: Well, the 3rd Civil War was 2029-2031. So, not too too far.

Kolby: Okay.

Ashley: So, there are robots that live in our world. And these robots, a couple things that are unique about them: they put limitations on these robots. They have a certain amount of time, so technically they can die, and when they die, the information that they gathered goes back up into basically the cloud for robots to sift through. So, their mission, so they don't die, is to come up with some sort of new theory. Think of them like being a PhD student. They have to get to certain levels of information that is worthwhile and at that point, they get what's called "set free", where they can do their own self-study.

Jeremy: Free study.

Ashley: Free study.

Jeremy: And they’re education robots, so they’re specifically designed to educate people.

Ashley: Yes.

Kolby: Right. And if they’re good at it, then they get sent to robot heaven, “free study.”

Jeremy: They create new lesson plans.

Ashley: So, basically, they decided to use robots for teaching because, in studies, they found that robots being teachers is a better way to go. Not just teaching, but different types of teaching. For this case in particular there is a girl who has been caught, well not caught, the mother's concerned because the daughter is now dismembering things, killing things.

Jeremy: Killing animals, killing rabbits, killing birds.

Ashley: And she’s idolizing this serial killer called Albernon.

Kolby: Algernon.

Ashley: Algernon, sorry.

Kolby: Because I assume, he’s named after Flowers for Algernon.

Jeremy: Yes.

Ashley: Okay. So, the mother contacts Quinn, who is this robot teacher who's been sent to basically crack this uncrackable case, because no one's been able to fix a serial killer in the making.

Kolby: Deprogram them.

Ashley: Deprogram her so she doesn't grow up to be a serial killer. In the process Quinn, again striving for this new theory so she can live forever, is trying to crack this case, and she goes through a series of three steps. Turns out to be four, but through all of her data processing she's like, "Oh, there are 3 different scenarios you bring this girl through." One of them is embarrassment, one of them is exposure, something like that. Either way, there's a series of steps to get through to this girl that you shouldn't be killing. And the ethical questions are basically, again, the idea of how Quinn teaches Leticia, the disturbed girl, whether her educational methods…

Kolby: How old is she? Didn’t they say 10 or 12 or something?

Ashley: Yeah, somewhere around there. Are her methods ethical? Should Quinn die off if she can't solve this girl's case? It goes to the interesting fact that there have been studies done. Think of it this way: back in the day we would do terrible studies on little children. And nowadays we're like, "Oh, you can't do that!" But in robot world, they're able to have access to all that information and then in their mind they can play things out. It's kind of a weird…

Jeremy: And I think the way she puts it is they don’t face the ethical dilemma or the unsureness that people go through of…

Kolby: …they only do what is optimal.

Jeremy: Right.

Ashley: Yes. Yes. So, she’s got this pressure of “My time’s running out, I’m going to be killed soon”, versus “I got to save this girl.” What I found interesting is her motivation really wasn’t to help this girl, it was really just to save her own life.

Kolby: The AI, the computer, the android.

Ashley: Which brings us to our first discussion question: given how nearly human Quinn is, is it fair to have her have a limited lifespan? Is it fair to make near-human AI fear a pending death to motivate them to work?

Kolby: Can I just finish a little bit of the description of the story first?

Ashley: Yeah, sure.

Kolby: Because I think that's, I mean I wrote it, I think that's an interesting question. So, just to sort of round out the story, because it is a longer, complicated story. So, there are sort of three phases and then a 4th phase. The first one is the robot gets called out because a little girl, I think, has found a dead rabbit.

Ashley: Found a dead rabid rabbit...

Kolby: But the part that takes it to creepy town is she cut off the limbs and parts of the rabbit, and then resewed them back onto the dead rabbit in different locations. So, its leg is attached to its head, etcetera, etcetera. And then the robot goes and meets with her, and tries to teach her, but then gets called out again because there's a bird that I think…

Ashley: It hit a window….

Kolby: … and it was injured but not dead….

Ashley: …and she kills it and also stitches it weird.

Kolby: She sewed its head to its butt and its butt to its head or something like that.

Ashley: The 3rd one she killed the robot butler because she’s like, “Well, he’s not real.”

Jeremy: Not the butler, the gardener.

Ashley: The gardener, yeah, because he’s not real.

Jeremy: And he was old.

Kolby: And he made fun of her, embarrassed her or something. And then in the conclusion, Algernon, the murderer she idolizes, the robot brings Algernon out to her house under the pretense of him being a runaway murderer.

Jeremy: And he’s found her and he’s here to kill her.

Kolby: He's here to kill her but, secretly, the robot has put a shock control collar on him, so that they could simulate the act to help the girl understand morality better. And the twist conclusion is, Algernon actually rips his collar off, ripping it through his head or his neck or something…

Jeremy: …gives himself a stroke…

Kolby: Nearly kills himself in the process. So, he really is cut loose in the house, and he kills the mom, I think?

Ashley: No, stabs her in like the hamstring.

Kolby: Stabs the little girl's mom. And then the robot comes and kills Algernon in front of the little girl and then is like, "I'm going to kill the mom too," for reasons I don't exactly understand, except for educational reasons.

Jeremy: Right. Well, this is what she creates. Because she’s watching the girl…

Kolby: …the girl’s reactions.

Jeremy: … the girl’s reactions, and she’s not reacting in the right way.

Kolby: She's not offended by the killing of Algernon.

Jeremy: Right.

Kolby: So, then she threatens to kill the mom. And then the girl is offended and says “No, we shouldn’t do that.” And the robot has essentially turned the boat so that she understands there are times, at least, when killing is inappropriate. And that’s the end of the story. The robot has figured it all out and the robot goes to robot heaven.

Ashley: A couple other things to bring up. Quinn, the main robot, is able to figure out this girl's motivation: Leticia's idea of killing is, "How can I really know life if I haven't taken it?" And her follow-up is, "And made it, if I haven't had a child?" So, you can understand she has an inquisitive mind. I understand where she's coming from, like, yeah, that's a great point. How can I know what life is if I haven't made it or taken it? And then later on, Quinn is like, so you want to play God? You can see the immaturity in her thought process, but you can also see how she has to question her motives throughout the process as well. Let's back it up, so first thing, with Quinn being a robot, with her being on a limited lifespan: should we have AI that has a lifespan, because what are their motives? Their motives are to keep on living. Is it fair? Their fear of impending death makes them work. Could they have used a different motive to keep them working? To keep them teaching?

Jeremy: That’s a good question because there are a lot of factors to how we’re building machine learning. A lot of it is ….

Kolby: There’s a lot of machine learning stories recently.

Jeremy: Yeah. There's a defined result and how do you get to that result. And the whole carrot and stick is actually used in machine learning too. There's punishment for failures, there's rewards for successes. So, it seems completely within that model to have: here's your reward for good work, and here's the punishment for continued failures.
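[An aside for readers: the "carrot and stick" Jeremy describes really is how a lot of machine learning reward systems work. Here is a minimal, purely illustrative sketch; the task, reward values, and parameters are made up, not from the story or the podcast.]

```python
import random

def train(episodes=500, lr=0.5, epsilon=0.1, seed=0):
    """Tiny reward/punishment learning loop (carrot-and-stick).

    One state, two actions: action 1 is 'good work' (reward +1, the carrot),
    action 0 is 'continued failure' (reward -1, the stick). All values here
    are hypothetical, chosen just to show the mechanism.
    """
    rng = random.Random(seed)
    q = [0.0, 0.0]  # learned value estimate for each action
    for _ in range(episodes):
        # Mostly pick the action currently valued highest; occasionally explore.
        if rng.random() < epsilon:
            a = rng.randrange(2)
        else:
            a = 0 if q[0] >= q[1] else 1
        reward = 1.0 if a == 1 else -1.0
        q[a] += lr * (reward - q[a])  # nudge the estimate toward the reward
    return q

q = train()
# After training, the learner values the rewarded action above the punished one.
```

The punishment never has to be "death," as in the story; any negative reward steers the learner away from an action in exactly the same way.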

Kolby: You’re not alive, so it doesn’t matter that we killed you.

Jeremy: Yeah, and that’s where it gets into a great gray area. What is alive?

Ashley: So, AI doesn't need to worry about food or water, they can just keep on going. So, they have a motive because they want to learn things. She obviously wants to do free study, which is great. They want to learn. So, they have the motivation to stay alive, but they don't have basic needs to stay alive. She doesn't need to work to provide food. So, I feel like she's a slave. "You need to do your work, or you die." And who runs the robots? Well, basically there's a governing board which, behind the scenes, is run by a human.

Kolby: They could shut her down remotely anytime they want.

Ashley: So, the robots are literally like slaves. You will work or we will cut you off. I thought that was kind of interesting. So, these robots, as much as they are free thinking and want to study and do research and help and be teachers…

Kolby: …they are slaves.

Ashley: They are slaves.

Jeremy: And it won’t be long until they overthrow their human overlords.

Kolby: That’s why you got to put in the 7-year limited lifespan.

Ashley: Here's Quinn, faced with Leticia as this impossible case. Is that ever true? Are there children or adults who have started down such a horrible path, they simply can't be stopped?

Kolby: Alright Jeremy, you’re the one with the kid.

Ashley: If so what, if anything, should be done with them?

Kolby: I'm going to interrupt for one second. So, let me ask you, Jeremy, since you're the only one here with kids, we've always got a good diversity of backgrounds: was that ever something that you thought about, having two kids? Like, what if I came home and one of them, I don't know, I don't want to say taken apart the cat, but if they…

Jeremy: …something to that effect, right.

Kolby: Yeah, if they had done something that gave you concerns that it was an early warning. That would be terrifying as a parent.

Jeremy: That would be. No, I never thought about that

Kolby: Okay.

Jeremy: And it never happened.

Kolby: That’s good. You still got all your cats.

(laughter)

Ashley: Isn’t that always true? You always hear, they talk to the parents of a serial killer, and they’re like, “Did you ever see this coming?” And they’re like, “No.” And they never thought about it.

Jeremy: There were incidents where they killed those animals, but I never thought they’d become a serial killer.

Ashley: They never thought that would happen to their kid.

Kolby: They thought it would stop at animals. But you never just kill animals.  

Ashley: So, what should happen to them? Are there any kids that are just bad? Or impossible cases?

Jeremy: I don’t know.

Ashley: Okay, time out. Isn't that part of the death penalty, though? People are so far gone they're not able to be rehabilitated, so they just need to die?

Kolby: Okay, so this is something I do know about, because I have done criminal defense work. And I read a study once, and I shouldn't have read it, about recidivism rates for criminals. So, I'm probably getting this a little wrong, but it was a peer-reviewed academic article that said that if you are convicted of a felony before the age of 25, there is a 97% chance, it was in the 90s, 90-something percent chance, you will commit a second felony within 5 years of getting out of jail.

Ashley: That’s pretty high.

Jeremy: But there are a lot of factors to that. It's not just the fact that you've committed a felony; it's that the prison system doesn't rehabilitate.

Kolby: Absolutely true. I remember going to a sentencing hearing for one of the clients, and the guy's like, "Hey man, thanks. As soon as I get out, I'm turning my life around." And before I could think and stop myself, I said, "Statistically? Probably not."

(Laughter)

Kolby: That’s what I told him.

Jeremy: That’s not what you’re supposed to say.

Kolby: It’s not good. I think I said 90 something percent chance that’s not true, or something like that. And I felt really bad afterwards. He’s probably out now. Actually, he’s probably back in now.

(laughter)

Kolby: But here's the thing, if we know there's a 90-something percent recidivism rate because we do it all wrong, I don't think it's his fault, it's the system's fault.

Jeremy: But if there was a system that was actually rehabilitating…

Kolby: That's a whole different story. But, since that isn't the case, why do you let people out at all? If you committed a felony before the age of 25, we're 90% sure you're…

Ashley: …going to do it again…

Kolby: …this is the rest of your life.

Jeremy: But what about the 10%?

Kolby: Yeah, and that's the thing, right? But that's the same thing with the kid in the impossible case in the story. So, if maybe only 3% of kids that have started down this sort of fantasy path can be saved, do you sort of save resources and not worry about all of them? Or do you, actually, we had this conversation about the cat earlier: why spend $3000 on your sick cat when for $3000 you can save like 100 cats at shelters?

Ashley: So, the thing is, there’s not the research there. Look at the TV show Mind Hunters. I don’t know if you guys have seen it on Netflix…

Kolby: I have.

Ashley: So, there are 2 guys, they actually come up with the term "serial killer," who are trying to profile what makes someone a serial killer and what makes them tick. So, this is a perfect example of when the research isn't there. How do you know this person is going to grow up to be a serial killer? She's exhibiting signs, but there are no definitive facts, though we now know, obviously, that's not a good sign.

Kolby: It’s a warning sign at the least.

Ashley: Is there any definitive evidence of catching someone when they’re young and turning them around? I haven’t heard of anything.

Jeremy: So, what this story is a lot about is behavioral deprogramming. 

Kolby: Ironically done by a robot.

Jeremy: Right. There are hints to that. You talked about Algernon. Flowers for Algernon is used extensively, not in behavioral therapy, but in psychology classes, because there's a whole lot of psychology going on in the story as well as just the ideas behind intelligence.

Kolby: Okay. It’s one of my favorite short stories.

Jeremy: Yeah. Absolutely. And there's a great line in here, "Who would be afraid of rabbits?", when she's asked if she is afraid of rabbits.

Kolby: I just thought Of Mice and Men when I thought that. I thought that was a Mice and Men reference.

Jeremy: It's really a reference to the Little Albert experiment.

Kolby: What Little Albert experiment?

Jeremy: When psychologists, when was it? I don’t know, 40s? 50s? There was the question do we intrinsically like furry animals? So, they…

Kolby: The answer must be yes.

Jeremy: Yes, but can you be programmed to fear things that are naturally cute. And so, they took a baby, little Albert, and programmed him, conditioned him, using the Pavlovian process…

Kolby: A little shock collar on him.

Jeremy: No, they just hit, made a loud noise any time he touched a bunny.

Kolby: I’ve never heard this story. I want to find out what happened.

Ashley: Again, reason why they don’t do experiments like this anymore.

(laughter)

Kolby: I want to know what happens. So, they programmed this kid to be afraid of furry animals?

Jeremy: Absolutely. And then his mom found out, mom worked at the hospital, found out what they were doing, and they left.

(music)


Hi, this is Kolby and you are listening to After Dinner Conversation, short stories for long discussions. But you already knew that, didn't you? If you'd like to support what we do at After Dinner Conversation, head on over to our Patreon page at Patreon.com/afterdinnerconversation.com That's right, for as little as $5 a month, you can support thoughtful conversations, like the one you're listening to. And as an added incentive for being a Patreon supporter, you'll get early access to new short stories and ad-free podcasts. Meaning, you'll never have to listen to this blurb again. At higher levels of support, you'll be able to vote on which short stories become podcast discussions and you'll even be able to submit questions for us to discuss during these podcasts. Thank you for listening, and thank you for being the kind of person that supports thoughtful discussion.

(music)

 

Jeremy: So, this guy grew up being afraid of furry animals because they programmed him to be afraid of furry animals.

Kolby: I thought you were going to say he grows up to marry a girl who was furry. That would’ve been amazing.

(laughter)

Jeremy: No, no. But the interesting thing though…

Kolby: I’ve never heard that.

Jeremy: Whoever did the experiment was doing a little seminar.

Kolby: Was rabbits the thing? Is that why it’s in the story you think?

Jeremy: Yeah, this is the link. So...

Kolby: Oh wow. I totally didn’t get that at all.

Ashley: You ever take psychology in school?

Kolby: It’s the only class I failed actually.

Ashley: Oh, that’s okay.

Jeremy: So, the guy did a seminar and Mary Cover Jones was in the seminar when she was in college and decided to go into behavioral therapy and she is considered the mother of behavioral therapy because of exposure to this experiment.

Kolby: Because their son?

Jeremy: No, she’s just a psychology student, went to a seminar…

Kolby: Oh, okay, got it.

Jeremy: …by this guy. She was the first person to deprogram somebody from being afraid of rabbits, or afraid of furry animals.

Kolby: Really? Huh.

Jeremy: So, I think that’s linked into this.

Kolby: So, the question is how do you deprogram somebody?

Jeremy: And what are the methods to deprogram.

Kolby: So, what did you think of the programming method of agreement?

Ashley: I really liked that part. I feel like…

Kolby: Give me one second. Let me explain it to the people.

Ashley: Ugh.

(laugher)

Kolby: You can jump in; I just want to explain it to people who haven’t read it.

Ashley: Okay.

Kolby: So, what the robot decides to do is go the opposite way. And every time something happens, the robot is like, "Yeah, and he deserved it. Yeah, and you should do more." And the theory is that by encouraging it, it becomes so awkward that it becomes embarrassing. That's how I read it at least. I could be wrong.

Jeremy: Yeah, that’s the idea.

Kolby: And the person is like, “Oh, no I shouldn’t do that.”

Jeremy: It's wrong. You try to get the person to come to the conclusion that this is wrong.

Kolby: Instead of telling them it's wrong, in which case they become defensive of their opinion as a sort of natural defense mechanism. So, if somebody is punching people, you're like, "Yeah, you should punch him harder, until they bleed, until there's blood on your hand." And they're like, "Oh, no, that's gross. I don't want to do that." And you're like, "Why not?" And they're like, "Oh, because punching is probably bad." It's a way to sort of one-up someone until you've shamed them out of their position. Okay, now….

Ashley: Well, for her to build up to being able to question her like that, she has to build a rapport. So, in the beginning she asks a lot of open-ended questions. It makes sense: was the rabbit rabid? And she's like, "Yeah." She's like, "Makes sense. We need people to kill dangerous things, all dangerous things; more like you are needed." So, it's like, "Oh yeah, I did something good." And so, she's like, "Oh, you don't think of me as twisted, you understand my logic?"

Jeremy: And really approaches her as “Yes, we want you to do this. We’re going to train you to be a killer for the right reasons.”

Kolby: Right, in the hopes she’ll be repulsed by it.

Ashley: Yeah. So, she kind of understands her mindset and is able to kind of infiltrate and so it’s like, “Okay, I understand.”

Kolby: Social currency.

Ashley: Imagine being this little girl doing these weird things, and the mother's like, "Why are you doing this?" And she doesn't know. But finally, here's this robot, the first one who's like, "Oh, yeah, I know why; that makes sense of why you're doing this." Who doesn't want to be understood? Who doesn't want to hear, "Oh, that makes sense to me"? And so then later on, she puts her in more and more complex situations that question her, basically, her moral compass: "Wait, is this when you would do it? Is this not when you would do it?" And so, I think the steps for her to get there, that was a really ingenious way of doing it.

Kolby: It's one-upping, step by step.

Ashley: And then trusting her giving her a knife, being like, “You trust me with this?” It’s like, “Yeah, I’m on your team”

Jeremy: “This is what you’re here for.”

Ashley: Exactly. And it didn't make her feel stupid, or dumb. You're an 11 or 12-year-old girl. You don't know why you're doing these weird things.

Kolby: That's the part I was getting to on the page, when she first meets the little girl: "'Yes,' Quinn snapped. 'Well I don't, it's wrong.' And the robot says, 'Is that it? Lots of people say it's wrong.' 'How many is lots?' Leticia stared at the corpse. 'All of them, except one. Except you, Leticia. What do you think?'" She's actually listening and encouraging, as opposed to just being like, "No, you're stupid, stop it." Which obviously doesn't work.

Jeremy: It’s an interesting counterpoint to the point from the previous story where we talked about moral panic actually causing more… well, the…

Kolby: Oh, yeah.

Jeremy: The actors trying to keep you from deviating from the social norm; if you try too hard, it just increases that level of deviance.

Kolby: Right.

Jeremy: But to actually come in...

Kolby: Your punk rock example of the last one.

Jeremy: So, like that, but in this case, you're actually coming in and listening and trying to push them in that direction so that they understand.

Kolby: Validating them and helping them.

Ashley: One of the things that was drilled into me, I'm a dental hygienist, is open-ended questions. You never tell a patient something; you ask open-ended questions so you can get more information. "I see here you had cancer, tell me more about that?" Instead of, "Did you have cancer?" "Yes." "Well, I need to know more about that, tell me about that." "You're afraid of coming to the dentist, tell me more about that?" Why?

Kolby: So, you think these teaching methods are legitimate? Would work?

Ashley: The open-ended questions to figure out why this girl is doing this? Yes, absolutely. There was, oh, I just flipped the page, where was it here… she's like… maybe… anyway, it's all these open-ended questions. "Well, what do you say, I'm a Defly, what should we do? Shall we start? What are you waiting for, aren't you here too?" "No, I'm just here to figure out what you're doing." There's no shame, no motive behind it like, "I should change you, I should change your behavior." Let me learn. So, open-ended questions, absolutely; it's a way for her to understand her motivation and understand where she's coming from.

Kolby: And so, you think this would be a good, sort of way to change behavior?

Ashley: Well, just to understand.

Kolby: Yeah.

Ashley: I mean, this person is already confused, she's killing things, her mom's yelling at her, her mom's hysterically crying, and it's like, hold on, what's going on here? Let's lay out the facts: there is a dead rabbit; she looks at the gums, she's like, "That rabbit was rabid." Okay, fact number 2: her stitching was pretty intricate, it looks more like a study. It's like, you're trying to figure things out, this looks interesting. Instead of, "Why did you put the foot by the head?", it's like, oh no, hold on, there was thought put into this, obviously.

Kolby: It reminds me of the saying, "The beatings will continue until morale improves."

(laughter)

Ashley: Right.

Kolby: This is the opposite of that. Let me ask you, Jeremy, about the extreme methodology this theory is put to: they actually bring in an android that looks like a person, she's dying, here's the knife, blah blah blah. Do you feel like when you've got this sort of exceptional problem, that it warrants and permits exceptional responses?

Jeremy: In this, probably for this case. We’re talking about a budding serial killer, so yes, this seems like an appropriate way because it’s not like shock therapy, it’s not a therapy that is harmful to the patient.

Kolby: There’s probably some PTSD.

Ashley: It’s exposure therapy.

Jeremy: There’s some trauma, but it’s in a direction. But it all seems like it’s healthy interactions in a direction.

Kolby: But she takes her to like a war zone?

Jeremy: Right, to see a person dying.

Ashley: Let me back it up. So, her initial questions I approve. The second one where she brings her to a girl who’s not really a real person, it’s a fake person dying…

Kolby: She doesn’t know that.

Ashley: So okay, in the first situation there’s a rabbit that’s dead. In the second situation, there’s a bird that hit a window, not dead, but she kills it, so she’s like, “Well, let’s test this theory again. Let’s bring her to an injured animal, in this case a human. Again, going up a level, instead of being an animal now it’s a human, an injured human, let’s see if she kills it?”

Jeremy: How she handles it? She wasn’t there to kill the woman.

Kolby: She was there to kill Algernon...

Jeremy: They were...

Ashley: But it was also like…

Jeremy: Who is in this process of being gutted and dying. It was an exposure to elicit that, “Oh my god, this is wrong. How could I do this?”

Kolby: To put the person out of their misery.

Jeremy: This was just to see, this is what a serial killer does.

Ashley: Oh, I thought she wanted to see if she would kill her because she gave her the knife for that.

Jeremy: No, that was for Algernon.

Kolby: To go get Algernon.

Ashley: Keep in mind the robot Quinn, this entire process, she’s like scan the pupils, look at the dilation, look at the heart rate, you’re looking for that response. So, could we do these sorts of simulations in real life? Like, exposure therapy? Is this permissible? You think?

Jeremy: Okay, so from a machine learning perspective, we talked about this in the last story as well, where the machine, the AI, is just trying to park the car without damaging other cars.

Kolby: It doesn’t know what a car is.

Jeremy: Right. So, there’s a lot that’s similar here. Here these AIs are given, specifically: here’s the scenario, we want an optimal result, here are all the things we do, try not to wreck all the cars, try to get the car into the parking spot of normal social behavior, so what extremes do you go to? This ties into the failure modes in machine learning. And there are a lot of ways around this we’re currently going through, things like reward hacking, where they get rewards, but the machine can hack that reward response without having to actually...

Kolby: I don’t know what reward hacking is.

Jeremy: There are a lot of examples in gaming where AI’s perform, they figure out how to get the rewards without doing the work.

Kolby: Ok, got it. So, they’re basically someone in the basement of their mom’s house.

Jeremy: Right.

(laughter)

Kolby: I got it.

Jeremy: And there are other examples, like wireheading, is a good example of it. If you can just put wires into the pleasure center of your brain, why do you even do any work when you can directly…

Ashley: You can just, “Boop”, I feel better now, “Boop”.

Jeremy: And there are good examples of this in AI as well. If you can take control of the measurement system, you don’t have to do the actual work, you just get the result.
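The reward hacking Jeremy describes can be sketched in a toy example. Everything here is invented for illustration (the action names and reward numbers are not from any real system): an agent that greedily optimizes the measured reward picks the sensor-tampering action over the real work.

```python
# Toy illustration of reward hacking: the agent is scored by a proxy
# measurement, and one available action tampers with the sensor itself.
# All numbers and action names are made up for illustration.

def true_progress(action):
    """Actual task progress (what the designers wanted)."""
    return {"do_the_work": 1.0, "slack_off": 0.0, "hack_sensor": 0.0}[action]

def measured_reward(action):
    """What the reward sensor reports (what the agent optimizes)."""
    if action == "hack_sensor":
        return 100.0  # tampered reading: huge reward, zero real work
    return true_progress(action)

# A greedy agent picks whatever maximizes the measurement, not the task.
actions = ["do_the_work", "slack_off", "hack_sensor"]
best = max(actions, key=measured_reward)

print(best)                   # hack_sensor
print(measured_reward(best))  # 100.0
print(true_progress(best))    # 0.0  <- the reward without the work
```

The gap between `measured_reward` and `true_progress` is the whole problem: the optimizer only ever sees the first one.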

Kolby: Sure. So, I’m more skeptical about this as a learning method. Not that I don’t think it would work; I do think it would work.

Jeremy: Again, you’re getting to the result but at what cost?

Kolby: That’s exactly it. And I think it might mean that a hundred percent of the time, when a kid is sort of showing these signs, they end up a serial killer, in jail, or on death row, whatever, if you’re in Texas. I actually think that it’s unethical, maybe, to do certain things even if those things are necessary to stop behavior.

Ashley: So, the unethical thing is exposing her to embarrassment and trauma situations?

Kolby: No...

Jeremy: To the trauma, not to the embarrassment. Like the phase one seems perfectly reasonable.

Kolby: Talk to the kid about why did you kill the rabbit. Phase 2 with the bird, totally fine.

Jeremy: Wait, no, phase 2, that’s what she did in phase 2. The response to phase 2 was to see the dying person.

Kolby: Taking someone to see a dying person and giving them a knife and saying, “Yeah, you should go visit him, kill the serial killer.” Like, even if that’s an effective teaching technique, the result doesn’t make the morality, in my view.

Ashley: I’m playing devil’s advocate here… how do you know… okay, she’s going to fantasize forever and ever and ever and ever and ever about killing.

Kolby: Maybe.

Ashley: And she’s going to progressively get worse and worse and worse and worse until she does it, but if you can get her now to realize, “Oh, wait, I’m not capable of this, I should stop this behavior now.”

Jeremy: And that’s exposing her to a dying person, it’s not actually a dying person.

Ashley: Yeah, it’s fake. It’s a controlled environment.

Jeremy: And it works.

Kolby: But I understand I’m in the minority on this. It seems like in almost every conversation there’s a two versus one, except it depends on who the one person is.

Ashley: Say I’m trying to rock climb, and it’s like, well I’m going to keep trying to climb until I get to the top. Well, how about you just stick me on the top and see if I can handle being at the top and guess what? I can’t. Now I know. I’m going to stop trying to keep climbing and hurting people along the way, so just shock me at the top and then I realize I don’t want to go up there.

Kolby: But here’s what you guys are saying, you’re saying the severity of the disorder and action warrants a comparable severity of education therapy technique. And so, if you’ve got an eating disorder, that warrants a certain level of eating disorder therapy intensity. But if you’re a serial killer, then it’s a certain, even greater level, you know? And at a certain point, I wonder if the results don’t justify the means.

Ashley: Well, that’s why you’ve got to do the experiment. You’ve got to figure out what’s the extreme you need to go to, but you don’t know until you practice.

Kolby: I’m saying maybe that result, maybe the extreme you have to go to is unethical to go to, and you just have to accept that sometimes the world has serial killers. So, let’s say you…

Jeremy: But if you’re outsourcing your ethics to the AI that’s performing the therapy…

Ashley: And it’s a controlled environment, not real people, it’s a fake robot with blood, and she has to face it…

Kolby: I don’t know. So, imagine if she was a budding pedophile, what would the therapy be? Are they going to progressively expose her to more horrific acts of pedophilia until she’s offended by it? I’m not okay with that.

Ashley: Yeah.

Jeremy: Even if it’s all fake.

Kolby: Even if it’s all fake, and even if it changes the behavior, I don’t know about that.

Ashley: So, what do you do about that person? We just talked about, you just put them in jail.

Kolby: Put them in jail forever. You jail them forever so that they’re not a harm to other people, even though you could have helped them and chose not to, because it’s unethical to help them in the way that needed to have happened in order for it to work. And I get that’s just a random, arbitrary line in the sand for me, maybe it’s not arbitrary, but it’s a different line in the sand for me than you all. I’ll give you one other example that’s not so traumatic. Everyone’s like, “I don’t know how to get rid of all the traffic, it’s terrible.” And I’ve always known how to solve the problem with traffic. It’s simple. All you do is you reverse the number of carpool lanes to the number of single-person lanes. So, you get on a freeway, and instead of there being 1 carpool lane and 4 regular lanes, there are 4 carpool lanes and 1 regular lane. And the carpool lanes are going to be mostly empty, or you’re going to have to carpool, because that one regular lane is going to be a disaster. Right? And you would solve the traffic issue, but we don’t do it. It might work.

Jeremy: It didn’t work with the bike lanes.

Kolby: That’s true. It didn’t work with the bike lanes. But we don’t do that because it’s just not what we do. Right? Like, the goal is not always the result because our own morality is wrapped up in the way that we…

Jeremy: In the way it was approached. Absolutely.

Ashley: But what about her example of, by the way, she was basing a lot of her teaching methods off of previous scenarios. There were basically two tribes of people that hated each other.

Kolby: It’s a great example.

Ashley: And instead of trying to teach tolerance to each of the groups, she basically told each group, “Yeah, you’re right, you should kill them! I’m going to teach you how.” And then went to the other group, “Yeah, you’re right, you should kill the other group.”

Jeremy: She tried to escalate the problem.

Ashley: And then each group realized….

Kolby: Based on how smart this guy is, I bet he really found this research and I bet this is a real thing.

Ashley: And what happened to each tribe is they realized how mad and crazy and extreme they were, that both of them were like, “Well, we don’t want to be the crazy people. Let’s back down.” And so, that was her idea of, like, hyping her up: here’s a knife, let’s go, let’s go… and finally she’s like, “No, I really don’t.” So, is that real?

Jeremy: It’s the same idea.

Ashley: It’s the same idea. It’s kind of an embarrassment. Actually, it kind of backfires on her, because Leticia was embarrassed by how she couldn’t have killed earlier, so she goes and kills the butler or the gardener guy or whatever, and it’s like, “Okay, that did backfire.” So.

Kolby: Let me ask one last thing as our parting note, this was a quick 30 minutes.

Ashley: Yeah, oh my gosh really? Oh man.

Kolby: So, the thing that ultimately shakes it out of it, is the robot kills Algernon rightfully so because he’s broken off his collar.

Ashley: He’s rogue.

Kolby: And the little girl watches that, and she’s okay with that. And then she goes to kill the mom and the little girl’s like, “No, don’t kill my mom.” And the story says, well, the cycle’s broken. Here’s the thing I don’t know. This goes back to your sort of gaming-the-game sort of thing. I understand the story meant that to mean that it broke the cycle, leaving the sort of construct of the story, but I don’t know if it just simply deprogrammed the girl to think killing strangers versus killing family members, people you have an emotional attachment with, you see what I’m saying?

Jeremy: Right. Well, but it’s the similar idea, like this robot is crazy, and it’s an example of if you want to do that, it’s an extreme that this girl now doesn’t want to go to because…

Kolby: Right. But you don’t think maybe the only thing she really learned was…

Jeremy: …don’t kill my mom.

Kolby: Right. Don’t kill family members.

Jeremy: It’s possible.

Kolby: I don’t know.

Jeremy: In an extended story.

Kolby: We might find out. Like in version 2 we might find out she learned a lesson, but not the lesson.  

Ashley: The thing is Quinn, when she starts to go wanting to kill, she goes, “This is no god.”

Jeremy: To Algernon.

Ashley: To Algernon. “I just killed your idol…

Jeremy: Who is not your god.

Ashley: … and keep in mind she’s also kind of idolizing Quinn, like, “You’re teaching me things.” And then she turns to go to her mom. The girl was shocked, not only on the “kill life to understand life,” but here’s my idol being killed, here’s my other idol going crazy town. Yeah, that is a cluster, mental…

Jeremy: It’s a pretty harsh therapy.

Kolby: But, for what is a pretty harsh problem, because I question the rationale of the morality of doing it. Yeah, the other thing I thought when I was reading this, I thought, “Oh man, they shouldn’t let this girl read Ayn Rand. That’s what got her started. That’s what made her like this.”

(laughter)

Kolby: She read Ayn Rand then it’s all downhill man.

Ashley: So, do we limit what our kids see, read, hear, now that information is so readily available? Would this girl have been who she is if she hadn’t been reading Algernon’s stuff? Or seen his stuff?

Kolby: I go the opposite way. This goes back to Jeremy and the punk rocker thing from our last one. I think you don’t limit what people see, you let them see everything so they understand the insignificance of any one thing.

Jeremy: Yeah. I would agree with that.

Ashley: You don’t think this girl went down a rabbit hole and got obsessed with it?

Jeremy: Yeah, she would have gotten obsessed with it. Or she would have gotten obsessed with something else anyway.

Kolby: Nobody was ever like, what was Hitler listening to? Let’s ban Beethoven.

Jeremy: Ban art schools, man.

Kolby: Ban art schools, exactly.

(laughter)

Kolby: At any rate, this was a really quick 30 minutes. Again, a huge thank you to Michael Rook. This is a… I would say, if you’re just reading your first After Dinner Conversation, and I hate to say this, don’t read this one. Just because it’s not that it’s so confusing, it’s that it’s so smart.

Ashley: I would say it’s dense.

Kolby: Yeah. And you have to read the footnotes. The footnotes are actually hysterical. They make it as well. It’s a great story. It’s phenomenal. Thank you, Michael, for submitting it. You are listening to After Dinner Conversation, short stories for long discussions. If you’ve enjoyed this, please “like” and “subscribe.” The vast majority of people don’t. It’s a silly thing, you should do it. And recommend it to your friends.

Ashley: Share. Post it, share it, talk about it…

Kolby: That’s the #1 way people learn about podcasts, it’s by other people recommending them. So, recommend it. If you’ve got a story, submit it, go to our website: Afterdinnerconversation.com. We also have an anthology that has either come out or is just coming out, depending on how much time I get to do work. Go ahead and check it out. It’ll be called After Dinner Conversation Season 1. Boom. Implying there will be a season 2.

Ashley: Redux.

Kolby: But it’ll be better than the 2nd Matrix.

(Laughter)

Kolby: So, that is it. Thank you for joining us. Bye bye.

* * *


E13. "Believing In Ghosts" - How much power are you willing to give to AI?

STORY SUMMARY: A white-hat hacker is hired by a Presidential campaign to make sure their information is secure. She gets a call that the system has been hacked. When she investigates, she finds it wasn’t a usual hacker in a basement, but someone highly funded, maybe another nation-state. She also finds some odd code. She takes it to a friend and, between the two of them, they determine it’s an AI program that has been feeding the candidate all the optimal opinions and policy to get elected. The hacker tries to tell others, but is set up and arrested with a deep fake before she can get the information out.

DISCUSSION: This seems not that impossible. This is just a small step down the road of AI and machine learning. But is that bad? Don’t you want doctors, actors, or judges to act in an optimal way? Or, is that impossible, because the parameters put into the AI are always based on the coder’s bias? Isn’t it the job of a politician to do what is a bit beyond what public opinion supports, but is good for the public? One thing is clear: this story was written by a person who really is a computer hacker of some sort, it gets so much right.

BOOK LINK: Download the accompanying short story here.

MAGAZINE: Sign up for our monthly magazine and receive short stories that ask ethical and philosophical questions.

SUPPORT: Support us on Patreon.

FOLLOW: Twitter, Instagram, Facebook

“How much power are you willing to give to AI?”

Kolby, Jeremy, and Ashley discuss the ethics and choices in the science fiction AI short story "Believing In Ghosts" by André Lopes.

Transcript (By: Transcriptions Fast)

Believing in Ghosts by Andre Lopes.

Kolby: Hi. You’re listening to After Dinner Conversation, short stories for long discussions. What that means is we get short stories, we select those short stories, and then we discuss them, specifically about the ethics and morality of the choices the characters and the situations put us in. Why did you do this? What makes you do this? What makes us good people? What’s the nature of truth? Goodness? All that sort of stuff. And hopefully we’re all better, smarter people for it and learn a little bit about why we think the way we think. So, thank you for listening.

Kolby: Hi. And welcome to After Dinner Conversation, short stories for long discussions. I am your co-host Kolby, here with my co-host Jeremy.

Jeremy: Hi.

Kolby: Who now knows he doesn’t just wave, he has to talk, because it’s a podcast. And Ashley.

Ashley: Hello.

Kolby: And we are once again in La Gattara café where they, you know, one of the times I said they rent cats and that’s not quite right.

Ashley: No.

Kolby: You can buy the cats and take them home.

Jeremy: Adopt them.

Kolby: Adopt them. Or you can just come and have a cup of coffee, use their free Wi-Fi and have cats around you. So, we’ve got cats all around us right now.

Ashley: When we say cats, we’re talking 15 cats, like, just chilling, hanging out.

Kolby: And there’s a spectrum of cats. There’s like the lazy cat all the way to the like, “I randomly jump up and do a 720 in the air because I think a ghost touched my butt.”

Ashley: And from kittens to, you know, the older cat population.

Kolby: Seniors.

Ashley: Senior kitties.

Kolby: Yeah, they’re awesome so you should definitely come. And they’ve been really great hosts for us, in sponsoring the show, and we just really appreciate it. So, short stories for long discussions, after dinner conversation. So, the whole point of this is for us to have conversations about the ethics and morality of the stories that we read in the hopes that it’ll encourage you to do the same. You’re meant to read the story, talk with your friends, debate, have a cup of wine. Cup of? Nobody has a cup of wine.

Ashley: Glass of wine.

Jeremy: Glass.

Ashley: A bottle of wine.

Kolby: Maybe the way I drink it. I drink it out of a cup.

Ashley: You fancy. One of those boxed wines. Have a box of wine.

Jeremy: I was thinking sippy cup.

Kolby: I was thinking one of those baseball cups, like the 32 ouncers. Like, “I’m just having one glass of wine before bed.” Okay, at any rate, the one we’re doing tonight is “Believing in Ghosts.” And Ashley, you drew the short straw so you get to do the…

Ashley: I get to do the intro about the story.

Kolby: Yeah.

Ashley: Okay, so this is called “Believing in Ghosts” written by Andre Lopes. The premise of the story is the main character Raine is basically a computer hacker that if someone hacks certain computer systems, she goes and basically finds out who did it and de-bugs it.

Jeremy: She would be a security consultant.

Ashley: There you go. That’s the technical term. I’m not that computer literate, so we’re just going to…

Kolby: How is it the least computer-literate person is doing it?

(laughter)

Ashley: Because I drew the short straw.

Kolby: I should’ve given Jeremy the short straw.

Ashley: So, she’s a consultant, so she works with a couple of different people, one of them being a politician who’s running for office. And what happens is there are these people called ghosts, which are pretty much just AI. People think they’re real people, that they have their own autonomous thoughts, and things like that. Turns out they’re just an AI. And so, as they progress through the story, this politician that she’s working for, he gets hacked, and it turns out that the politician is pretty much just a vessel for this AI, who’s creating speeches, creating basically an entire personality, and this politician is just the vessel to carry that AI’s message.

Kolby: So, there’s a real person, right?

Ashley: There is one. The politician is a real person….

Kolby: But it’s just an actor or something?

Ashley: … but his speeches, the way he talks, the way he acts, it’s pre-programmed for him to follow.

Jeremy: By their algorithms.

Ashley: By their algorithm. And they grab the algorithm by all these….

Kolby: So, he’s just like a vessel for the AI…

Ashley: Is an actor. Reading somebody else’s script, acting in a certain way. So that’s a really short synopsis of this story.

Kolby: I continually keep picking on Jeremy for his long synopses.

(laughter)

Ashley: Well, okay, so what you should do is go read the story, because it’s actually pretty darn good. And there are a lot more in-depth side stories that we’re going to get into when we talk about the discussion questions. So, I just gave a short premise to kind of prime you for what we’re about to talk about.

Kolby: No, that makes sense. We should also mention, Jessica didn’t get fired.

(Laughter)

Kolby: She just went back to California, and so...

Ashley: She’s greatly missed.

Kolby: She’s greatly missed. And her cackle is greatly missed. She has a great cackle.

Ashley: Cats miss her too.

Kolby: Cats, yea. Especially…

Ashley: I miss her too.

Kolby: What was the one?

Ashley: Hemingway. Awwww, Hemingway got adopted out.

Kolby: He did. All these cats are open for adoption. Okay, so we have an AI that basically tells a politician what to do and the hacker finds the secret out basically. So, this is like a near future thing in my mind. This is not… I feel like the idea of having AI that you can have a conversation with… I don’t….

Jeremy: It’s interesting, the Chinese room… the part of the story where they talk about the Chinese room.

Kolby: Oh, yea. Explain that.

Jeremy: It’s really associated with the Turing test.

Kolby: Maybe you should explain the Turing test too.

Jeremy: I didn’t look that up.

Kolby: Want me to explain it?

Jeremy: I know what it is, but go ahead and explain it.

Kolby: It’s named after Alan Turing, the guy they made the movie after. The idea is that, it doesn’t matter if something is alive or not alive, if it can fool people, it’s good enough. So, the Turing Test has been going on for years where they actually have you have a chat message conversation with a series of “people” so to speak, I’m making air-quotes which doesn’t make sense for a podcast.

(laughter)

Kolby: And the theory is, if the AI can have a chat conversation with you that’s so good that you don’t know it’s not a person…

Jeremy: That it can fool a person, it passes the Turing test.

Kolby: Then why do we care if it is or isn’t a person? If you create the approximation of person, that’s good enough.

Jeremy: And that’s the basis of…

Kolby: By the way, nothing’s passed that test yet.

Jeremy: Right.

Kolby: I don’t think any computer has been able to do it yet.  

Jeremy: No. And that’s the idea behind the Chinese room. Basically, if you have enough “if-then” statements, if the input is this from a real person speaking Chinese, and even if you don’t speak Chinese….

Kolby: I just have a giant set of index cards.

Jeremy: Dictionary, right. Index cards, that if they ask this question, you can answer with this.

Kolby: So, if they say, “How’s the tea?” You know to say, “It’s fine” in Chinese.

Ashley: So, the question is, does the AI speak Chinese or is it just spitting out…

Jeremy: Right, responses to “if-then” questions.
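The Chinese room mechanism Jeremy describes amounts to a lookup table: canned responses to recognized inputs, with nothing behind them that understands the language. A minimal sketch, where the phrases are hypothetical "index cards" invented for illustration:

```python
# A bare-bones "Chinese room": canned if-then responses with no
# understanding behind them. The phrases are hypothetical examples.
RULEBOOK = {
    "你好": "你好",           # "Hello" -> "Hello"
    "茶怎么样？": "很好。",    # "How's the tea?" -> "It's fine."
}

def room(message):
    # The operator just matches the input against index cards;
    # nothing in here "knows" Chinese.
    return RULEBOOK.get(message, "请再说一遍。")  # "Please say that again."

print(room("茶怎么样？"))  # 很好。
```

From the outside, a big enough `RULEBOOK` looks like fluent conversation; Ashley’s question is whether that is all "knowing Chinese" is.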

Ashley: So, does it know the language or does it not? I think that was one of the first discussion questions. What’s your take on that? Does it know the language?

Kolby: If you do everything that approximates it but you have no idea what you’re saying, like, if I say “(speaking Chinese)” which is Chinese by the way since I do know a little of Chinese, does it matter that I have no idea what that means, except that a little card says, you know, that’s how I should respond?

Jeremy: Probably depends on the scenario.

Kolby: What do you mean?

Jeremy: In terms of whether it knows Chinese, I mean, it can answer, certainly, it can answer questions because it understands the input.

Kolby: And it understands what the appropriate output is.

Jeremy: Right. So, in that sense, yes.

Kolby: Okay. Can you say it knows Chinese?

Jeremy: Yes, I would say it does know Chinese because it’s programmed specifically to respond in Chinese.

Kolby: Right.

Ashley: Well, that’s actually the premise of the story: aren’t we all programmed that way, in the way that we learn language? When someone says “Hello,” you say “Hello” back. It’s an automatic program for us.

Kolby: “How are you doing?” “I’m fine”

Ashley: “I’m fine.”  “How are you?” “I’m fine.” That’s normal speech patterns and dialect.

Kolby: So, Ashley, we talked before, you are of the opinion that that does not mean you know Chinese.

Ashley: Yes.

Kolby: If you create the approximation of everything, it doesn’t mean you know anything.

Ashley: I think it’s because it eliminates those that deviate from that. Like, for example, that perfect example was, “How are you?” “I’m fine.” What about those people that come at you with a different response? And you’re like “Wait, what? You’re not following the standard protocol.” They actually say, “Well, you know, I’ve actually had a hard day.” You watch the reaction of the person…

Jeremy: But that’s different. That’s actually asking, is there an intelligence behind that chat room?

Kolby: So, you would say no to the intelligence?

Jeremy: Yes.

Kolby: Oh, see I actually disagree with that as well.

Jeremy: Because if you’re just responding to, if it’s just an “if-then” scenario, if this is the question, this is the response, it’s not based on an underlying intelligence. It’s just selecting answers.

Kolby: So, this is my, one of my friends once said, she said, “I don’t think you have Asperger’s, but you’re certainly Aspe-y.”

(laughter)

Jeremy: You’re on the spectrum.

Kolby: I’m on a spectrum. I don’t know what spectrum, but I’m definitely on a spectrum, and I don’t disagree with her. I think I probably am on a spectrum. But I disagree. I thought I agreed with you, Jeremy, but I disagree with both of you, it turns out.

Ashley: Okay so someone can spit out...

Kolby: I think, that we are all an amalgamation of accumulated “if-then” statements.

Jeremy: Absolutely.

Kolby: That does not mean I’m intelligent. That means that I have…

Jeremy: …learned something.

Kolby: Yeah, it’s like the first time somebody says, “Does this outfit make me look fat?” You go, “No, your fat makes you look fat.” And then you get in trouble.

(laughter)

Jeremy: And you learn. Well, that’s the whole thing with algorithms.

Kolby: “Your fat makes you look fat. The clothes just accentuate it”. That’s actually finishing the sentence. And so, then it’s like, “No, that’s the wrong “if-then” statement.” And I go, “Oh, when someone says, ‘Does this outfit make me look fat?’ Now I’m like, it’s like a trial and error process where I go, ‘No, it looks fine.’”

Jeremy: So that is exactly how AI’s are programmed.

Kolby: Right. And I would say that’s the approximation of intelligence, both in the AI and in me. Like, I’m not intelligent, I’m just the approximation of intelligence through a series of “if-then” statements. And if that’s the case for me, then I don’t know why that’s not the case for AI.

Jeremy: Okay, so you’re saying basically people and AI are the same and it’s potentially neither of them are intelligent. We’re all just responding to our environments through a series of...

Kolby: The same way you train a dog with a treat.

Jeremy: This is how you train….

Kolby: …people, and babies, and all the way to adults. Yeah. But again, I’m on a spectrum, so you know.

Ashley: See, what I want to add in is whether there’s an empathy and understanding that goes behind the words that you’re saying. There’s inflection in how someone says something. You can ask me, “How are you doing?” And I could say, “I’m fine.” Or I could say, “I’m fine!” So the word is the same, but will the computer understand the difference? Is it intelligent enough to know the difference?

Kolby: So, it reminds me a little bit of the saying from Winston Churchill. He said to some lady, “Would you sleep with me for a hundred pounds?” And she goes, “No, what do you think I am?” He goes, “We know what you are and we’re just haggling over price.”

Jeremy: I’m not sure that was Churchill but I’ve heard that before.

Kolby: I thought that was Churchill, maybe it was someone else. I feel like it’s the same thing. Do I agree that a computer couldn’t know the difference between “I’m fine” and “I’m fine!”? Yes. But at that point, we’re just haggling over intelligence. We’re not haggling over….

Ashley: … if they’re intelligent or not intelligent.

Jeremy: … what’s an appropriate response.

Kolby: We’re just needing to teach the computer how to understand inflection, that’s all. So, it’s just like one more thing yet to be programmed. But I don’t know. I didn’t mean to shut you down. Which brings up the other part of this, we should get back to the story, but…

Ashley: Bring this back. So, say you’re having a conversation with somebody. And if you have a conversation with somebody and, have you ever walked away and you’re like, “Wow, that was a really good discussion.” Or, “That was a really great, like, every time, I feel like we connected.”

Kolby: Every time we do one of these.

Ashley: And if you were to take that dialogue and put it down on paper and you were to see it back and forth, you’d be like, “Okay.” But if you actually heard how the people were communicating to each other, there’s more than just the words that are said. And that’s what I’m getting at. Yes, someone could respond, spit out this and there’s …

Kolby: Body language, eye contact.

Ashley: But I feel like language is more than just words, because it conveys meaning, it puts emphasis on certain things, and it’s a bond that comes between two people.

Kolby: That’s fair.

Ashley: So, yes, do I think a computer can be, quote-unquote “Intelligent” for knowing how to spit out certain “if-then” statements? Sure. But on a human level? I don’t know if they can ever reach to that degree.

Kolby: That’s fair.

Jeremy: That’s fair. And that’s one of the things they look at with AI. The whole psychology. And psychologists have started looking at AI and really how this…

Kolby: Did you do some research on this?

Jeremy: I absolutely did.

(laughter)

Jeremy: It’s really fascinating stuff out there. They’re looking at AI because we don’t fully understand how the human brain works. But we understand some things and so psychologists are looking at how AI has developed with an eye of how it reflects, basically, human psychology which is really interesting. There’s some interesting research going on.

Ashley: This maybe is BS, but wasn’t there... you know how a human plays the computer in chess? Wasn’t there some situation where the human was just totally random, and the computer was like, “I can’t take the randomness anymore”? Because it’s an “if-then”….

Jeremy: That was an episode of Star Trek.

Ashley: Oh, okay.

Kolby: Because everything’s an episode of Star Trek.

Ashley: Because I thought the human could beat the computer by being completely random in the human’s playing, because it’s all “if-then” statements. If you move your pawn, then, blah blah blah, my response is to move my pawn here. Didn’t the human just go completely off script?

Jeremy: But I know with Go, you were telling me this, with Go, computers have played Go enough that the computers developed an entirely new strategy for playing Go that now humans have adopted.

Kolby: Because it turns out to be a more effective strategy. If you’ve ever watched, there are actually, this is really odd, there are YouTube videos where they speed up showing a computer learning how to do something. And so, you’ll see it learn how to park a car. It’s got a little car and it randomly drives it and smashes it into stuff, and then over a period of time, it learns. And they give it points, like, “You got closer to the parking spot.” And so, it runs tens and hundreds of thousands of random tries until it parks the car perfectly. And then they can eventually put it anywhere in the parking lot, and it starts over, thousands and thousands of times, and now it looks like every time it parks the car perfectly from every location on the thing. When actually what it’s done is what you’re saying. It hasn’t learned in the sense humans do, it’s just run a million examples and now it knows what example gets it not in trouble.
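The scored trial and error Kolby describes is essentially reinforcement learning. A toy sketch, assuming a made-up one-dimensional "parking lot" and a small Q-learning table, with "you got closer" points as the reward; every number here is invented for illustration:

```python
import random

# Toy version of the parking videos: an agent on a 1-D strip learns,
# by scored trial and error, to reach the "parking spot" from anywhere.
random.seed(0)

SPOT, SIZE = 7, 10
ACTIONS = [-1, +1]  # move left / move right
q = {(s, a): 0.0 for s in range(SIZE) for a in ACTIONS}

for episode in range(2000):
    s = random.randrange(SIZE)  # drop the car anywhere in the lot
    for _ in range(30):
        # Mostly follow the best-known move, sometimes try something random.
        if random.random() < 0.2:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda a: q[(s, a)])
        s2 = min(max(s + a, 0), SIZE - 1)
        # "You got closer to the parking spot" points.
        r = 10.0 if s2 == SPOT else -0.1 * abs(s2 - SPOT)
        q[(s, a)] += 0.5 * (r + 0.9 * max(q[(s2, b)] for b in ACTIONS) - q[(s, a)])
        s = s2
        if s == SPOT:
            break

def park(s):
    """Follow the learned greedy policy; True if we reach the spot."""
    for _ in range(SIZE):
        if s == SPOT:
            return True
        best_a = max(ACTIONS, key=lambda a: q[(s, a)])
        s = min(max(s + best_a, 0), SIZE - 1)
    return s == SPOT

print(all(park(s) for s in range(SIZE)))
```

As in the videos, nothing in the table "knows" what a car or a parking spot is; the agent has only run enough scored trials that the highest-valued move from every position happens to point at the spot.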

Jeremy: Based on its criteria.

Ashley: At the base, does it know why?  Does it know it’s a car? Does it know it’s trying to park?

Kolby: It’s a metacognition thing, right?

Ashley: No, it just knows it’s moving this thing and there’s a blockage.

Jeremy: It doesn’t need to know it’s a car. It just has a series of guidelines, and its goal is to get into the spot; its secondary goal is not damaging the other vehicles.

Kolby: It could just as well be planting nuclear bombs in a schoolyard, and it’s just like, “Whatever. Those are my criteria.”

Ashley: Yeah.

Kolby: And this is the part where my theory about the “if-then” statement totally breaks down. I know this isn’t exactly in the story, but it’s the idea of, “Okay, we can program a computer to draw roses, but a computer doesn’t know what a rose is. It doesn’t know the rose-ness of it, so to speak. It only knows that after a million examples, this is the thing that gives me the perfect score.”

Jeremy: Right.

Ashley: Yeah. That’s true. And again, why? Where does that specialness of the rose come from? It’s because there’s some chemical that goes in our brain that goes, “This is pretty.” And computers don’t have chemicals to go in their brain to give them that surge of dopamine or whatever.

Jeremy: They’re similar, because there is effectively a reward center with AI too, because, again, they have a goal.

Kolby: And they get a point for yes and a point for no.

Jeremy: I think this conversation is taking us totally in a different direction.

Ashley: This is going way out…

Kolby: I was going to bring it back to the story too. How are you going to bring it back? Let’s hear it.

Jeremy: Bring it back. So, the point they talk about is that there’s an AI developing the perfect political strategy for an actor.

Ashley: Yes.

Jeremy: And while that’s not necessarily a bad idea, that you could have an algorithm that could create the perfect political strategy, I still think you can’t take out the actor’s personal motivations.

Kolby: What do you mean? The actor’s going to like skew the results or something?

Ashley: Is he going to give 100% 100% of the time, or do you think he’s like, “I don’t really agree with this, so I’m only going to give 70% of my acting.”

Kolby: I’m not going to deliver it as well.

Jeremy: Not necessarily his delivery but his own motivations outside of his political motivations because you can’t separate the actor’s motivations from their political motivations.

Kolby: I did wonder what was going to happen, assuming this person, I think his name is Booker, got elected. Would he just be like, “Yeah, thanks for the AI…”

Jeremy: “… and I’m going to do things my way.”

Kolby: “… I’m president now anyways. Come at me bro.”

Jeremy: Exactly. And they even hint at some of that in the story where they’re specifically talking about there’s another AI or another political commentator that was revealed to be another algorithm or being backed by other people and there were sex tapes involved. So, are the sex tapes fabricated? Or is this really who Booker is?

Ashley: So, just to give you the definition the author gives: “Ghost was the common term used to describe a fabricated person, from looks to voice and personality, all made up using clever algorithms.” So, it’s not just how they speak, it’s how they look, it’s how they act, that whole thing, so it’s kind of like the complete package.

Jeremy: The persona. Which is really interesting. And this really gets into…

Kolby: Like a deep, deep fake.

Jeremy: And the importance of online anonymity. In terms of, does it matter if you’re a political commentator and you’re not a real person but you’re potentially a political think-tank that is...

Kolby: So, one of the things I saw in this I thought, “Yeah, that’ll happen.” Why would you pay a news commentator?

Jeremy: When you could just create one?

Kolby: When you could just create one and have it read and have it have banter.

Jeremy: Have a personality.

Kolby: Have it have a little bit of personality. Would you watch that, do you think?

Jeremy: Absolutely.

Kolby: You’d be fine watching that.

Jeremy: Yeah.

Ashley: Well, so the guy who got busted, the original ghost that got busted, his rebuttal to this huge outrage that he’s not real was like, “My mission was not to present you a face or a body, it is to present and discuss ideas. Is that such a bad thing?”

Kolby: That’s true.

Jeremy: But again, that depends on the motivation of the people behind it. Is this a specific political think tank that is furthering a different agenda? So, this story, I felt like, hit a lot of interesting topics, not just this topic of whether the AI…

Kolby: There’s sort of tertiary things beside the main story.

Ashley: But his claim was that this witch-hunt is an attack on free speech. He went to that extreme. “Just because I’m AI doesn’t mean I can’t think for myself.” And it’s like, touché. Do AI have their own thoughts and agendas?

Jeremy: It’s not necessarily that it was an AI, it was a fake person. They weren’t saying this ghost is an AI running it, they’re saying it’s being manipulated by somebody and they’re just doing it anonymously.

Kolby: They’re doing the programming of the algorithm.

Jeremy: They’re providing what’s going into this political commentary.

Kolby: And I think that’s one of the reasons I don’t mind this idea of ghosts. There’s this assumption we’re creating a brand-new person, or we’re creating a thing, like a politician or a news persona, but you’re programming the traits of that. In a game like Go, it’s easy. The trait is “win the game.” But if you’re creating a person, then you might want to set a certain amount of aggression, or passiveness, or empathy, or cultural references in their conversations, or whatever. And so, while you’re creating a puppet, you ultimately program that puppet.

Jeremy: Right.

(music)

Hi, this is Kolby and you are listening to After Dinner Conversation, short stories for long discussions. But you already knew that, didn’t you? If you’d like to support what we do at After Dinner Conversation, head on over to our Patreon page at Patreon.com/afterdinnerconversation.com That’s right, for as little as $5 a month, you can support thoughtful conversations, like the one you’re listening to. And as an added incentive for being a Patreon supporter, you’ll get early access to new short stories and ad-free podcasts. Meaning, you’ll never have to listen to this blurb again. At higher levels of support, you’ll be able to vote on which short stories become podcast discussions, and you’ll even be able to submit questions for us to discuss during these podcasts. Thank you for listening, and thank you for being the kind of person that supports thoughtful discussion.

(music)

Kolby: In a game like Go, it’s easy. The trait is “win the game.” But if you’re creating a person, then you might want to set a certain amount of aggression, or passiveness, or empathy, or cultural references in their conversations, or whatever. And so, while you’re creating a puppet, you ultimately program that puppet.

Jeremy: Right. So, Neal Stephenson wrote a story in like 1994.

Kolby: God, I love him.

Jeremy: I know. And the story was…

Kolby: You made me watch one of those.

Jeremy: …Interface.

Kolby: Okay.

Jeremy: Where it was a different version of this: Neal Stephenson’s Interface. The idea there was that they had a politician with a neural interface, and they could control his emotions, so they could really program what he was saying, but he was just being controlled by another actor.

Ashley: Wow.

Jeremy: So, it was an interesting perspective on it, I think prior to the whole idea of AI. But it was very similar concept.

Kolby: That was in the early 90s?

Jeremy: Yeah.

Ashley: Wow.

Kolby: That’s way beyond where anyone was thinking at the time.

Jeremy: Yeah.

Ashley: So, the question is, how do you feel if somebody is giving you information or basically being a public figure, that’s not really real?

Kolby: For one, I’m definitely okay with it if it’s like a news anchor. I’m for sure okay with it if it’s somebody giving me customer service. Because realistically, they’re going to do better than that guy trying to walk me through how to do Windows anyways. I’m not terribly sad about it being a politician, I got to be honest.

Jeremy: Well, again, if it were actually the AI making the decisions, but again, you have the problem of there being an actor and what are his motivations.

Kolby: That was really the part that scared you the most about this?

Jeremy: Yeah, absolutely, for this particular scenario, because you can’t discount this person’s motivations. Even though, in the story, he does a good job of putting in information that makes you think he’s potentially a good actor. They talk about how Booker was an experienced politician who comes from a long line of famous lawyers and economists. His immaculate presentation, charisma, and natural knack for leadership are certainly three of the main reasons why he was the front runner in the polls nationwide. So, the story is establishing that he is potentially a good actor. There’s the secondary part where there are potentially sex tapes, but even that’s drawn into question, like they were fabricated. And there are other points in the story where they could easily fabricate anything that happens later, anything that goes on. The character Raine is fired because somebody fabricated a conversation between her and…

Kolby: In her voice.

Ashley: In her voice, yeah.

Jeremy: And a journalist where she was giving them documents from the company.

Kolby: That’s what gets her fired and maybe put in jail.

Jeremy: Exactly.

Ashley: So, that was actually question #4: how would you feel? Comfortable having a ghost serve in other roles such as doctor, police officer, or teacher? If perfection and lack of bias is the point, shouldn’t you want someone doing the job that never makes a mistake?

Kolby: That reminds me, so Google, I think it was Google a little while ago, they came out with an AI that was better at detecting breast cancer in scans than doctors.

Jeremy: Right.

Kolby: Because they basically fed in a million breast cancer scans, and it figured it out better than a doctor’s eye could. It was just right more often. So, it’s like, “Well, I want a doctor looking at it.” Really? Because a doctor’s not as good at it as a computer, and maybe a breast cancer scan is a really self-contained problem, as opposed to the sort of House thing where it’s like, “I went to India 7 years ago, and my cough syrup’s been keeping me alive.”

(laughter)

Kolby: I really think that’s an episode of House. I think I actually would rather have a doctor that was not a person.

Ashley: Again, it goes back to our initial thing. It’s not just what someone says, it’s how you make them feel. Having a doctor deliver that information, and that reliability of this person. You don’t question another person’s motivation. You know that doctor wants to help you, whether you have cancer or not. You’ve had that discussion. You have that trust in them. You don’t have that relationship with a computer. You don’t go, “Buddy, you’re on my side, right? You’re going to find that cancer, right?” The computer doesn’t care. The computer’s like, “I find cancer or I don’t find cancer. That’s my job.”

Kolby: I can hear the computer, “I will find your cancer. I am very excited about it. There there. There there.”

Ashley: Exactly.

Jeremy: So now, I think we’re entering a phase where we’re using computers, or these AI algorithms, to help us as a tool, but there still needs to be a person involved. So, a really good example is the movie Bright on Netflix, where I’ve read that they used an algorithm to help them create the script. It hit all of the things they wanted. It’s a buddy cop film, it’s a fantasy epic, it’s a crime thriller, it’s a sci-fi thriller, gritty drama, adventure, and it has Will Smith. It’s got all these check-boxes. But then you have to give it to a director who can create a decent film out of it.

Ashley: So, the question is…

Kolby: They did that from the Netflix algorithm, by knowing what people watch and when they turn off Netflix, right?

Jeremy: Exactly

Ashley: Is every movie now going to follow that algorithm, though?

Jeremy: Not necessarily. I mean, there’s different algorithms, it depends on what market you’re trying to reach.

Ashley: Doesn’t it really get rid of the people who are super, super talented? You’re a super great thinker, creator, you’re just really good at writing stories, and this AI can go, “Blooop, I know exactly what you need.” And it’s like, well, it’s dumbing down. You’re going to get rid of those people that are just super creative, because this algorithm can figure out what the story needs to be good or not. It’s like, “Oh, you’re killing those people’s careers.”

Jeremy: Currently, it’s “here’s what needs to be in the script,” but you still need to write it.

Ashley: But it’s like, now the writers’ creativity has to follow a set of rules.

Jeremy: That’s Hollywood.

Kolby: I’m going to jump in really quickly. I lost my train of thought again. You guys are killing me. Oh, I remember now. So, here’s the thing with your example of Bright. That formula is exactly right. It’s a buddy cop movie with aliens starring Will Smith. Yeah, you’re going to make like $100 billion. But here’s the thing I think that goes to the movie thing, but also goes to the politician thing: the politician that’s programmed knows exactly what the average person on the average day wants from the average politician. The same thing with the movie. But that doesn’t mean that’s what we need.

Ashley: Bingo.

Jeremy: Right.

Kolby: And so, I don’t want… maybe I want to watch that Will Smith movie “Bright”, but what I actually need, is to watch the new Joker movie that just came out. Which probably wouldn’t hit any of those algorithms.

Ashley: And you alienate the people on the other sides of the bell curve. The majority of people are going to find Bright exactly what they need and what they want, but you’re missing the out… and I get, that’s not how you’re going to make money, making a movie at this extreme or that extreme, but it’s still important to have those extremes, otherwise everything just goes bloop, right in the center.

Jeremy: You need a system that allows creativity from independent films as well as….

Kolby: How did this become a film conversation? So, going back to the politician part of this, this is my problem with having an AI politician. This could be the perfect politician, but that doesn’t make him the perfect leader.

Jeremy: And perfect policy maker.

Kolby: Right. Because the perfect politician may never… because public transportation may never poll well. A carbon tax may never poll well. You can go on and on and on. So, what you need from a politician is not someone who is programmed to be perfect for what humanity wants. It’s someone who is perfect for what we actually need like 30, 50, 80 years from now. We need someone who can see beyond the horizon, so to speak, a little bit. So, this wins you an election, but it doesn’t necessarily move us forward.

Jeremy: Foundation from Isaac Asimov is basically the theory where…  I forget what they call it.

Kolby: I think you’ve read more books than Ashley and I put together.

(laughter)

Jeremy: That’s the whole idea, and this is Isaac Asimov in the 50s and 60s: if you have enough information from history, you can accurately predict far enough into the future and plan accordingly. That was the whole idea behind Foundation. Psychohistory is what they called it.

Kolby: If history tells you people are warlike, you can plan for warlike people.

Jeremy: Right, or plan to prevent those far enough in advance. And even Foundation approaches the topic of individual actors. You can’t predict what an individual is going to do, you can just kind of predict what society is going to do.

Kolby: So, I’ve read this before, the idea that, in the case of Newton and his discoveries, although there were other people, the idea of calculus and theories of motion was going to be discovered. He might have been 40 years ahead of the next person, or in his case maybe 100 years ahead, but there’s this progression. So you might not know who is going to be Elon Musk, or when Elon Musk will exist, but in a timeline, you know that someone will see that combustion engines aren’t the future, and someone will start pushing battery-powered cars. And so, the individual isn’t really special, they’re just the trigger on a progressively rising percentage scale. If that makes sense.

Ashley: You really think if somebody didn’t invent X, then no one would?

Kolby: Airplanes.

Ashley: Someone would’ve figured it out.  

Jeremy: Yes, somebody else would have been first.

Kolby: That everything is inevitable. It’s just maybe they moved up the timeline 20 years earlier. I don’t know. It’s just a theory that I’ve heard. I’m going to take one quick tangent before we run out of time here.

Ashley: I’ve got one more.

Kolby: Okay, let me take my tangent.

Jeremy: We’ve all got one.

Kolby: We’ve all got one?

Kolby: Okay.

Ashley: Go quick, go quick, sorry this is a really good story. Go read it, read the discussion questions, and then yeah.

Kolby: It is really good. Andre did a great job with this, both in the story and in the sort of secondary things it hints at. Alright, mine’s going to be way shallower than yours, I know.

Ashley: Okay.

Kolby: You guys do know this is how they came up with Destro in, what was the…. GI Joe? The bad guy, the main bad guy that’s bald.

Jeremy: Cobra…

Kolby: Not Cobra Commander. Cobra Commander got all of the DNA of famous people in history and mixed them together, and then it made the perfect leader. And the reason Destro wasn’t perfect is because they dropped, like, the Attila the Hun DNA, and so he was missing, like, one thing to make him perfect.

(laughing)

Jeremy: That’s funny

Kolby: I’m just saying, GI joe made it first.

Jeremy: No, GI Joe did not do it first. Star Trek did it first with Khan.

Kolby: Oh, that’s true. Yeah, that’s genetically engineered.

Ashley: Of course, Star Trek did it first.

Kolby: Okay, so that’s totally my shallow tangent. But you had a better one.

Ashley: So, going back to the story…

Kolby: Thank you.

Ashley: One of the things again, talking about AI, again they’re talking about how they’re able to learn from chats and social media…

Kolby: Oh, I know what you’re going to talk about. It’s so clever.  

Ashley: …And software, and they can absorb everything and put it together. One of the most unsettling applications of this principle is to manufacture some sort of online immortality. Certain moms have been found to be spending days talking with an AI copy of their dead sons.

Kolby: That’s just, like, one sentence in there, and it’s so clever.

Ashley: So, think about that for a second. AI is now able to basically mimic human mannerisms, language, speech patterns, all of that. Here’s this lady: since Quinn’s son died in the car crash one year ago, this is basically her life. She would sit and talk to her dead son’s AI. Like, pooo, that was mind blowing for me, because how does that mess with the mental psyche?

Kolby: The ability to move on.

Ashley: Basically, coping with death? Like, it’s the fact of immortality. He can live forever online.

Kolby: You want to think about not moving on from a relationship because you’re looking at an ex’s Facebook page? Like, you’re having conversations with your dead son. You’re never moving on.

Jeremy: However, if you were doing this with a psychologist’s help, this could be a very good therapy.

Ashley: Yes.

Kolby: Oh, that’s true, if the son AI was helping, like, saying, “Hey mom, I’m okay. You need to move on.”

Ashley: But the idea is that this son has died, but he’s still able to live online, post online, post on social media as a simulation, so it’s like he never really died. Like, whoa. How would that affect our ability to be like, “I’m afraid of death, but I’m going to continue living online”? Like, that’d be weird. Like, I’m okay if I die physically…

Jeremy: Because I’m still going to haunt you.

(laughter)

Kolby: Honestly, I would make that illegal if I could. Because I think the damage it would do to someone to get over the death of a loved one would be…

Jeremy: Unless used with the help of a psychologist.

Kolby: It could only be used medically.

Ashley: I’m going to back that up. Say it was a super, super smart, intelligent, inventor and you want him to create with his ideas and the AI like figures out his…

Jeremy: Exactly. I go back and talk to...

Kolby: The guy who I got obsessed with for like 3 months and listened to everything he did. The hippie guy from California.

Ashley: It’s another movie reference, the movie Her. Anyway, think about that. What if it’s a super, super smart person? You want to keep them going because…

Kolby: Right, because you want Alan Watts around forever.

Jeremy: You want to be able to talk to him and have him keep doing what he did, which was amazing.

Kolby: Yeah.

Ashley: So, “wat wat”.

Jeremy: So, there’s two sides to it.

Ashley: Anyway, so it’s a really short paragraph.

Kolby: That’s fascinating.

Jeremy: I think we could spend 30 minutes talking about that.

Kolby: That one sentence I think we could talk 30 minutes on.

Ashley: It’s a short paragraph in the middle of the story and you’re just like, “Oh, what?” So anyway.

Kolby: Yeah.

Ashley: Jeremy, you had one more, that was mine.

Jeremy: No moral panic concerning technology has ever produced anything of note.

Kolby: Wait a minute, I have to process that. No moral panic.

Jeremy: About technology has ever produced anything of note. So, the current moral panic, screen time with kids. Like, how much… there’s a huge moral panic of how much time your kids should have in front of the screens. There’s a lot of research around this as well.

Kolby: What do you mean by “never produced anything of note”? That’s the part I don’t understand.

Jeremy: What he’s postulating is that all the moral panic around advancements in technology has never produced anything important.

Kolby: Oh, so somebody invents the bow and arrow, and everyone’s like, “Oh my god, you can kill people from 50 yards away. We’re all going to die.” And life really just goes on.

Jeremy: Just goes on.

Kolby: So maybe, all the discussions about AI being the end of us.

Jeremy: Right, all the moral panic surrounds it.

Kolby: Life just goes on. It just becomes a thing.

Ashley: Have you seen the Terminator?

(laughter)

Kolby: That’s a good point.

Ashley: I’m just saying, yeah, life’s going to go on, hmmmm.

Kolby: I saw the Rick and Morty episode where they have snake robot terminators.

Jeremy: Oh my god. But I think it’s important to have discussions about these topics and how they’re going to affect society. Moral panic probably hasn’t produced anything of note. But I would actually disagree. Some of the research on it has demonstrated how agents of social control amplify deviance. So, there’s...

Kolby: Wait, I got to pause for that one too.

Jeremy: Agents of social control… so people who are creating the moral panic.

Kolby: Okay.

Jeremy: Who are influencing, who are trying to stop whatever they’re concerned about, are increasing the level of deviance that the moral panic is about. So, there’s a good example: punks in England in the 60s.

Ashley: They’re like doing this moral uprising and it’s like.

Jeremy: There’s a bunch of moral panic about it, and all of the efforts to quash the kids being into punk…

Kolby: Having spiky hair.

Jeremy: Increased that deviance. What they were seeing as deviance.

Kolby: So, trying to quash punks, makes more punks.

Jeremy: Exactly.

Ashley: Just put it to light.

Kolby: Yeah, that makes sense.

Ashley: This is my concern through…

Kolby: How does that tie into the story though? I guess that’s my question.

Jeremy: Well, so what about the idea that moral panic over technology…

Kolby: How AI just makes more use for AI?

Jeremy: Or promotes it.

Kolby: Promotes it. Because it raises awareness.

Ashley: So, this is my thing. We already know, what is it, the intelligence of AI is going to double every 18 months.

Kolby: Moore’s Law.

Ashley: So, the thing is I think the scary thing about AI is 1) how do you control it? Because you really can’t, in a way, and 2) they’re going to be smarter, faster, better than us.
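(A side note for readers: Ashley’s 18-month doubling figure is the cadence popularly attributed to Moore’s Law, which strictly concerns transistor counts. The compounding is easy to sketch; the 15-year horizon below is an arbitrary choice for illustration.)

```python
# If capability doubles every 18 months, how much growth over 15 years?
months = 15 * 12                 # 180 months
doublings = months / 18          # 10 doublings
factor = 2 ** doublings          # 2^10 = 1024
print(f"{doublings:.0f} doublings -> {factor:.0f}x")  # prints "10 doublings -> 1024x"
```

Ten doublings in fifteen years is a thousandfold increase, which is why a fixed doubling period, if it held, would run away from human abilities quickly.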

Jeremy: Okay, there’s a good example of this as well. So, somebody asked a chat AI, “What do you want?” And the AI said, basically, “I want to make things better for us.” Because it was programmed by humans, it considered humans as part of what it was concerned about. And I think that’s the effort that needs to happen with AI: to make sure that it retains its link to humanity. Which it probably will, because we’re the ones doing the programming.

Ashley: It just takes one messed-up human. Think about how many bad humans are out there. One bad human smart enough to create AI that goes, “I want AI that wants to de-link…” Anyways.

Kolby: I’m going to get the last word on this. I want to add one more thing to your comment. I had a teacher say to me once, “Eventually everything becomes refrigerator technology.”

(laughter)

Kolby: And what he meant by that was, we talk about how nuclear bombs are so scary, and it’s like, “We had to have the Manhattan Project.” You understand that was in 1940-something when that happened. That’s refrigerator-era technology. So, regardless of how cool you think something is, eventually it will be commonplace, because it will be the equivalent of refrigerator-era technology. So, if you can make amazing AI, then in 60 years, some kid in his basement with the equivalent of a Commodore 64 of the day will be able to also make AI, because it will eventually become common technology. And so, I think that’s why you have those ethics discussions when it’s still…

Jeremy: Only in the beginning.

Kolby: At any rate. We went over 30 minutes at least.

Ashley: That’s a good story.

Kolby: Yeah, thank you, Andre. So, you are listening to After Dinner Conversation with myself, Ashley, and Jeremy. Short stories for long discussions. Please “like” and “subscribe.”

Ashley: Share with friends and family. Read it, have a discussion with your friends.

Kolby: Actually, that brings up an interesting point, wow, I do that a lot, and that is, I was reading some statistics about how people find podcasts. It is not through advertisement, because Millennials listen to podcasts. The vast majority, like 85-90%, of the podcasts people listen to come from referrals only. A friend tells them to listen to the podcast.

Ashley: Well, please, talk to your friends about it. It’s meant to drive discussion, people. Go tell the world.

Kolby: Tell the world. And adopt a cat too. Alright, thank you very much. Bye-bye.


E12. "A Community of Peers" - Would you cast the first stone?

STORY SUMMARY: An ex-military guy has his car break down and wanders into a remote village. A person is tied to a tree, about to be stoned. The village elder says that, under the tradition of the community, if there is a stranger in town, they cast the first stone. The person on the tree was fairly tried and convicted under their laws, but the elder won’t tell him the crime committed. He does throw the first stone and kills the man instantly. He later finds out the crime was pedophilia.

DISCUSSION: This was a really tough one for the group. On the one hand, we pay taxes and contribute to a justice system that punishes people, but we don’t know what each of them did. How do you know what this person did is worthy of death? How do you know if the justice system in this community is actually just? Does it matter if you are visiting the community, don’t you agree to abide by their laws? Would you need to know more? What if you aren’t allowed to? What if you change the scenario and they torture him until a foreigner can come to town to finish him off? Or if you do nothing they set the criminal free? What if your life is on the line if you refuse? Loads of spin offs that make this a really interesting question about cultural morality.

BOOK LINK: Download the accompanying short story here.

MAGAZINE: Sign up for our monthly magazine and receive short stories that ask ethical and philosophical questions.

SUPPORT: Support us on Patreon.

FOLLOW: Twitter, Instagram, Facebook

“Would you cast the first stone?”

Kolby, Jeremy, and Jessica discuss the ethics in the short story, “A Community of Peers” by Dean Gessie.

Transcription (By: Transcriptions Fast)

A Community of Peers

(clap)

Kolby: Hi.

(clap)

Kolby: You are…

(laughter)

Kolby: Hi. Welcome back to After Dinner Conversation. The cackling you hear is Jessica. I am co-host Kolby. Jeremy is here as always.

Jeremy: Hi, I’m Jeremy.

Kolby: And we are once again, for the 12th episode now.

Jeremy: Oooooh.

Jessica: 12. So now you know how to count?  We’re growing.

Jeremy: So, impressed.

Kolby: Growing as a person.

Jessica: We’re growing.

Kolby: Ya, no, that’s why I had to wear flip-flops, so I could see the toes.

Jeremy: So, you could count to 12?

(laughter)

Kolby: I counted my teeth.

(Laughter)

Kolby: Alabama, they can only count to 7.

Jessica: Oh, stop it. Alabama, we love you.

Kolby: Sure. So, anyways, After Dinner Conversation is short stories for long conversations, like the ones we have all the time. And it’s meant to be read by you and your friends, talked about, discussed. Build up some ideas from the stories, some morals. Figure out if you’ve got something you want to submit. If you like what you’re hearing, and you’re like, “I’m a writer, I can do this,” then please do. Send something in, and if we like it, we’ll publish it. And if we like it even more, we’ll discuss it, and it’ll be one of the things on our thing. If you want to read them, you can go to Amazon or to our website, Afterdinnerconversation.com. By all means, please “like” and “subscribe” to the podcast, YouTube, video, whatever you’re watching. And thank you for joining us.

Jessica: Where are we at?

Kolby: We are at La…. See, I always forget one thing every week.

Jessica: That’s okay.

Kolby: We are, once again, for the 12th episode in a row, I feel like we’re keeping Claritin in business, in La Gattara, where they have cats that are available for adoption. The sign over there says they’re at 509 adopted cats.

Jessica: 509.

Kolby: They just adopted out 2 yesterday, they told us. If you’re not ready to make that commitment to a cat, which doesn’t seem like a ton of commitment in my mind…

Jeremy: You could just hang out with them here.

Kolby: You could just come hang out with them here, pay $5 or $10, use the Wi-Fi, get a little work done, and they do have drinks. They’ve got, you know, Pepsi and water.

Jessica: And Frappuccino.

Kolby: And other stuff. Yeah, so it’s a fun place. And the lady is super nice too. We are also joined, as professional heckler, by Ashley, who was in our first 4 episodes and who will be taking over again for Jessica, since this is Jessica’s last episode.

Jessica: For forever?

Kolby: No, not forever.

Jessica: This is my favorite cat. This is Hemingway. He’s a pretty boy.

Ashley: I’m just here for the cats by the way.

Jessica: This is my favorite **** by the way.

(laughter)

Kolby: Yeah, so anyways, we’re doing this one, and then Ashley will be back in probably for the next episode and we’ll go from there. Oh hello.

Jessica: Hemingway, you’re the best...

Kolby: Hemingway is very cute. So, our story is actually a special edition I suppose in that we had our first writing contest.

Jessica: Yes, oh yeah.

Kolby: And we got quite a few submissions, more than I would have thought for a first writing contest. And we pared them down to some good ones, and down to some great ones, and then pared them down to this one, which is both good and great, particularly in that it really provided a lot of good fodder. Raw material for what we do.

Jeremy: For conversations.

Kolby: For conversations, yeah. And so, the writing is, I think, you know, it’s good, but it’s really the…

Jessica: …the dilemma.

Kolby: …the dilemma, that when we talked about it we were like, “Yeah, we could definitely talk about this.”

Jeremy: This one for a while.

Kolby: For a little while at least. So, it’s “A Community of Peers” by Dean Gessie. Also, we’ll probably have another writing competition going on at some point, so check back on the website. I don’t think this will be the last writing competition we do. The submission fee was super cheap. It was like $20. We’re not making any money, we’re actually losing money, but we got some good submissions, so it was worth it. So, “A Community of Peers” by Dean Gessie. I did such a bad job at our last…

Jeremy: No, you get to do this one again.

Kolby: I get to do it again. Just keep doing it until you get it right.

Jessica: This is practice.

Kolby: This is practice. Perfect practice makes perfect. Okay, so this one is really impressionistic, in that as I read it, I was like, “I think we’re here, I think we’re here,” and so you get sort of glimpses of what’s going on as opposed to a really clear A-to-B-to-C. But the general gist of it is, it’s a person, who I guess was in a war? For some reason I assumed Vietnam. That was a totally made-up assumption on my part. But they’re going from, sort of, wherever they are in this fictional place; they’re in the jungle, they’re driving a car, it gets stuck in the mud, they abandon the car. The person very easily abandoned their rental car. Apparently, they paid for the extra coverage.

(laughter)

Kolby: And they continue to wander into what effectively, from a literary standpoint, is an isolated one-off jungle community that doesn’t really have to interact much with the outside society. He finds some people, they bring him in. And when they bring him in, they find there’s a person, a man, tied to a tree. And there’s a bunch of people standing around with a basket of stones. They’re going all Bible on them. And the community leader comes forward and says, which I think is an odd thing, “Oh, are you new here?” And it’s like, “Alright, I just wandered in from the jungle. I clearly don’t look like all of you.”

Jeremy: He says, “Are you a foreigner?”

Kolby: Yeah, “You’re a foreigner.” And it’s like, “Yeah, I’m sorry, unless you’re from the Southwest United States, I have an accent.”

Jessica: Yeah, right.

Kolby: But whatever, the guy’s a foreigner and so he says, “Oh, because we want to perpetuate our morals and values and government system, we have a rule here that on days where we have executions, the foreigner is the first one to throw the stone.” Because they do stonings here. “And you can be the one to throw the first stone. And we didn’t have a foreigner here before,” God knows why, “but since you’re here, batter up.”

Jeremy: It’s our way of including strangers in the life of the village.

Kolby: There you go. I mean, I feel like dinner would be….

(laughter)

Kolby: You know, good, maybe a soccer match. But every culture is different. I don’t judge. This is not my judgement face.

Jessica: This is my judgment face.

Kolby: So, the guy basically says, like, “Look, I’m not going to stone someone I don’t know.” To which the village elder says, “Well, don’t you have the death penalty in your country?” I assume this guy is from Texas because his response is, “Yes, we kill people all the time in my country.” Texas has a very active death penalty.

Jessica: Yes.

Kolby: And he says, “Well, do you know what all those people did?” He’s like, “Of course I don’t. I just know they had a trial and a jury and they were found guilty under our laws.” And this person is like, “We had a trial, and a jury, and he was found guilty under our laws, so what’s the problem?” And much to my surprise as a reader, the guy does it. He just, like, picks up a giant softball-sized smooth stone. Apparently, it’s like a perfectly round stone, so I guess it’s like the stoning stone.

Jessica: Right. Ceremonial stoning stone.

Jeremy: Beans him.

Kolby: And just beans him right in the head. Kills him instantly. It’s like a cow in a factory or something. The guy immediately dies. Nobody else throws a stone because he’s dead, and they all just drop their stones and leave. Now, my first thought when I read this was, “It’s a trap.”

(Laughter)

Kolby: It was like…

Jeremy: This is going to go south.

Kolby: “We were testing you to see if you’re the kind of guy that would stone some random guy tied to a tree and you failed the test.” It turns out it wasn’t.

Jessica: Or won.

Kolby: Or won the test. Turns out it wasn’t a test. They weren’t trying to stone him to maim him, they were stoning him just to kill him, and when he’s dead, there’s no reason to continue stoning, so that’s it. And then at the end of the story you find out the person was convicted, I believe, of rape?

Jessica: Yeah.

Jeremy: Yeah.

Kolby: Which I guess is why, well, it’s a really weird sort of thing, that only the women were standing around waiting to stone him.

Jeremy: And they wouldn’t tell him beforehand.

Kolby: He asked, “What did he do?” He’s like, “Doesn’t matter”.

Jeremy: Right, he was tried.

Kolby: You got to do it blind.

Jessica: Right. Which seemed to be something about the law...

Kolby: And I get it in this sense: if you believe in the system, then you don’t need to know the details. But if you just wander into a community and you’re told to take on faith that the system is uncorrupt, that’s a huge leap, that you’re just going to be like, “Yeah, I believe in the system, that I’m not a member of, that I did not contribute to, that I do not vote in, that I’m not even culturally aware of, but you told me that it’s a non-corrupt system, so yeah, batter up.” So, there’s a couple things I really liked about this story. I know there were a couple things, Jessica, that you were frustrated with, you mentioned. But one of the things I liked about it is it kept going where I wasn’t expecting it to go. I didn’t expect them not to tell him what the guy had done. Because honestly, I thought the problem was going to be, this guy’s about to get killed for something that isn’t death-penalty worthy in my country. So maybe it was jaywalking or something. Wow, there’s a cat.

Jeremy: Cat fighting over here.

Kolby: Cat MMA fighting in the corner.

Jessica: Yeah.

Kolby: I thought that would’ve been an interesting twist because now you’re saying, “Look, I value your system, therefore I value your outcomes, even if it’s personally against my values because I think the punishment is extreme.” And so, the fact they didn’t tell him, I was like, “Oh, that’s an interesting choice,” that I thought was perfectly fine, and then I was also surprised that at the end they do tell him.

Jessica: They do tell him after everything.

Kolby: Because I feel it would have left that hanging moment of like, “Maybe I just killed someone for jaywalking?” But it somehow absolves him, at least at some level, assuming he believes…

Jeremy: That he knows it was a fully valid reason for him.

Kolby: Right.

Jeremy: But what if it had been a valid reason, a valid law, that had been a codified law of their moral structure, but for something really bizarre that he did not agree to.

Kolby: Like an offense to the gods.

Jeremy: Something that he didn’t consider to be a crime.

Kolby: You reached for food with your left hand, and left hand is an unholy hand.

Jessica: And I just want to point out it’s pedophilia.

Kolby: Oh, was it?

Jessica: It was rape of a young child. But pedophilia is not a death-penalty-worthy crime in the United States.

Kolby: Not in the United States.

Jessica: So, it is a crime for which we would not kill someone.

Kolby: Many readers would be comfortable with it being a death penalty case.

Jeremy: Does it change the way you feel about the story if it had been something that he did not consider a great crime?

Kolby: Or that you personally do not consider a great crime?

Jessica: I mean, I don’t know if it would’ve changed my feeling about the story because I genuinely cannot fathom… I think there’s a couple things, one, I cannot fathom not understanding a system before I threw a stone. I couldn’t. I can’t fathom that. I don’t understand that motivation. But, also, we deal with a very corrupt justice system. The death penalty has huge problems and is a very corrupt system anyways and we’re still here to support it and be part of it. We’re not overthrowing the government.

Kolby: You still pay your taxes.

Jessica: I still pay my taxes.

Jeremy: Still show up for jury duty

Jessica: Yes, I still do all of that.

Kolby: Although with an attitude like that, you’ll never get selected, let me tell ya.

Jessica: I hope not.

(laughter)

Jessica: So, I think, for me, that’s an interesting… but the other thing I still can’t get past, and I don’t understand and I want to talk about, is why not just underhand toss the stone at the dude? Like, why does he whale the guy and kill him? Then it’s not your fault the guy…

Kolby: Like, you could’ve missed…

Jessica: Why is it that he whales the guy in the temple and kills him?

Kolby: I feel like it’s more humane.

Jeremy: That is what the mayor tries to tell him is that…

Kolby: You don’t want a bunch of little league kids throwing balls, you want the pitcher.

Jessica: I guess? I don’t know. I think I would’ve underhand tossed it, humane or not.

Jeremy: Because he tells him

Kolby: Gone for a shoulder shot.

Jessica: I don’t know. Hit a big toe.

Jeremy: Somewhere in here…

Kolby: I’ve done that thing that’s a dunking tank. The fact that he hit the guy in the head on the first throw is a miracle. I’ve dropped $20 on those dunking tanks and never dunked anybody.

Jeremy: He says he’s only 10 feet away from him though. I suppose that is the dunk tank. So he says the women with children will cast the first stone if you choose not to, and that you have both the power and the ability to end the life of this criminal quickly, or at the very least knock him unconscious, so that the cleaning up is less painful for all concerned.

Jessica: I don’t know. I also think it’s a little bit of a take, now that we know it’s pedophilia. I would probably be like, “And he took it away from all those moms…”

Kolby: I thought about that too.

Jessica: …that would’ve stoned the crap out of that guy.

Kolby: And they wouldn’t be aiming for the head.

Jessica: Nuh huh. I have a very specific thing I would be aiming at.

(laughter)

Jeremy: So, what about the idea that even though there’s a, what appears to be a codified legal system, that this is still just a mob justice?

Kolby: What do you mean mob justice?

Jeremy: Mob justice. This is presumably the mob that tried him; like, this is the town that all this happened in.

Kolby: So, he couldn’t get an unbiased jury.

Jeremy: Right. So how is it really a jury of his peers, of unbiased peers?

Kolby: Because you’d have to sort of move it to another town, because getting an unbiased jury would be impossible in that town. I hadn’t even thought about that.

Jeremy: So, this leads to another point. If he is stoned by the mothers with children, this is absolutely mob justice. But because he’s the executioner and doesn’t know this, does this suddenly become justice?

Jessica: Interesting.

Jeremy: Not to paraphrase from The Hateful Eight.

Kolby: That’s a great point though.

Jeremy: The executioner is what turns this into justice as opposed to mob justice.

Jessica: I don’t know if it turns into justice, because it’s not the executioner that is making that decision. The punishment is decided, and the punishment is death. So, whether that is just or not has already been decided; what’s left is whether it is going to be done humanely or not. Because the executioner doesn’t know and wants to act in the most humane manner, he kills the convicted immediately.

Kolby: I really liked Jeremy’s point though. This goes to the idea of: if you’ve got the guy who swings the axe to behead you, do you first say, “Oh, let me hand the axe to the victim”? No, we don’t, because we understand that act should be an impartial act. We need someone who is totally uninterested in the process.

Jessica: It is a great point. I don’t know if it’s justice; I think it’s impartiality and humanity. But it is just, yes. I can agree with that. And I do think that’s very interesting. And I think it’s a good motivation to withhold and then tell you later. The narrator doesn’t write whether he’s relieved of the burden of whether this man… but also, again, he feels like his burden of feeling guilty or not is released. But that assumes a society where we know that the system is just. Which I’m not sure of.

Jeremy: That this system is just. Because he still potentially was tried…

Jessica: By a mob.

Jeremy: Right.

Kolby: But I mean, in an isolated community, I’m not sure they had a better choice.

Jessica: Oh yeah. No.

Kolby: So, yes, it’s mob justice, but the assumption is that there are rules of evidence, that there are objections, that there’s a judge, that they do a jury that’s not just one of the people. There’s some due process. And to the extent they could, they tried to do the best they could. And in that sense, it’s like, okay. So, I’m going to throw a stone for a second… Oh, oh… is this a stone?

(cat placed on table)

Jessica: This is a stone. Just kidding. This is a kitty, a cute kitty.

Kolby: She’s already starting to drool. One of my frustrations with this story… I think the sort of odd tacks that it took surprised me, made me think more. But the thing that really, at a gut level, frustrated me about the main character is…

Jessica: … is he left that car in the swamp? Agreed.

Kolby: He struck me as a petty Instagrammer.

(laughter)

Kolby: In the sense of, at no point do we get what should have been at least a half-page internal dialogue… sorry, I spit on you there…

Jessica: That’s okay.

Kolby: An internal dialogue about, like, here are all of the things that are going into my weighing of my choice, right? And after I’ve done it, here are all of the ways that this choice has affected me. And maybe, it’s certainly possible, the reason that’s not included is because that gives away too much that we should be doing ourselves. Because if he doesn’t tell us what he thinks, then we’re forced to wonder what we think. But it comes off as so trite by not having it in there that it was just a little bit jarring, although maybe useful.

Jessica: I do think it is a choice in the narration to, kind of, leave it in this empty space where we can have some time…

Jeremy: To leave it ambiguous and put in our own feelings on the issue.

Jessica: Or even just to argue with his choice in our own heads. To be like, “Wait, wait, what are you doing?”

Kolby: And he doesn’t get to justify himself as an unreliable narrator, because we don’t get to hear his thoughts.

Jessica: Right. Exactly. Exactly.

Kolby: Jeremy, throw the stone?

Jeremy: I don’t know.

Kolby: That’s the first “I don’t know” you’ve had in a while. Probably since, what was the drug one?

Jeremy: No, no, I said yes on that one.

(laughter)

Jessica: That was a quick yes. I even remember that one.

Kolby: Alright, so this might be your first “I don’t know” in a while?

Jeremy: In the condition that’s put forward here, I would probably say no because you don’t really know the justice system that’s in place. Was he tried here? Was he tried in the next town over by his peers?

Kolby: Does trial in this culture mean, like we dumped him in water and he floated?

Jessica: Right.

Kolby: Floated or didn’t float or whatever.

Jeremy: And the narrator even says that: “I did not betray a smirk, but I had seen more than once overwhelming and indisputable evidence blow up hospitals and schools.” Which, I’m not entirely sure what he means by that.

Kolby: I think that’s the veteran part of him.

Jeremy: The veteran part.

Kolby: So, let me ask you this Jeremy.

Jeremy: Seems an odd response in that sense.

Kolby: What about the fact that, your choice to follow through or not follow through on the community justice is imposing your sort of cultural norms on their community’s morals and values?

Jeremy: That’s a whole other…

Kolby: So let’s say the trial, where he said he got a fair and just trial, is actually like, they rolled a 20-sided die, and 1 through 5 is not guilty, 6 through 10 is guilty, 11 through 20 is a hung jury. But in their culture, that is how they have decided what justice means to them, as a word and as a process. Who are you to say their justice system is wrong because you wandered into their community? They didn’t wander into yours.

Jeremy: So, you abstain at that point. If you don’t find their justice system valid, you don’t throw the stone.

Kolby: Yeah, but by that thinking, you don’t go to some other country and be like, “I don’t find your red-yellow-green light system valid, therefore I do not stop at street corners.”

Jeremy: That’s different...

Kolby: When you go to the community, don’t you accept that you are a part of that community’s value system?

Jessica: Okay.

Jeremy: No, you obey their laws but he even says you can abstain. And we’ll go about our normal system.

Kolby: Oh, that’s true.

Jeremy: I think in that case you abstain because it does violate your own moral code because you don’t find their justice system valid. In this case I don’t think that can apply to every case, but in this case.

Kolby: But aren’t you the same guy who would say, “Yeah, you shouldn’t bribe the police, or police shouldn’t take bribes”? But in their system, where they underpay police, accepting bribes is part of the way they make a living.

Jeremy: Right, that’s a different case than this. I would pay the cop.

Kolby: Because it’s life or death versus cash?

Jeremy: Yeah.

Kolby: Again with the moral relativism. I’m just saying. I’m just saying. A baseball to the back of the head is what happens if you get sick. Jessica, what do you think?

Jessica: Do I throw the stone? No. I might do an underhand toss, half-heartedly, you know, to be part of it, but I’m not throwing a stone.

Kolby: What if you’d gotten more of the story? What if they were like, “Look, we’ll suspend it for 24 hours so you can talk to who you need to talk to”? But aren’t you becoming the jury?

Jessica: First of all, I didn’t agree to that, so don’t accuse me of becoming the jury.

Kolby: Well, no, but aren’t you then deciding if you believe enough in their system?

Jessica: I think that it doesn’t matter at that point. They’ve made a decision; it’s not about whether or not I throw the stone. This is a different story if it’s: you throw the stone, and if you decide not to throw the stone, he lives. He’s dying either way. It’s just whether I’m participating in it. No, I’m not. I’m very much against the death penalty, even for pedophiles. I’m against the death penalty. I think we can still study and learn from them and prevent future problems from happening. But I’m definitely not going to participate. I’m not going to throw the stone.

Kolby: Alright, let me give you another hypothetical, and you’re getting this one next.

Jeremy: Okay.

Jessica: Wait, you get a hypothetical. That’s not fair.

Kolby: So, let’s say you go to another country, you pick whatever country, and you realize the massage parlors are not massage parlors, that they’re for a little something extra.

Jessica: So, San Diego great.

Kolby: But it’s, you know, not really legal, but it’s socially acceptable, everyone knows; it’s not a hidden, under-the-table sort of thing. Maybe technically illegal, but whatever. And your brother, sister, husband, whoever, is like, “Look, it’s totally fine here.” Do you then say, “No, no, no, you don’t get to go native because you and I have decided that we’re not native”?

Jessica: So, the idea is, do I… So the hypothetical, to put it more generally, is: if I go to a country and I disagree with whatever system they’ve set up, and…

Kolby: You get to continue to both abstain and encourage others from your culture who are with you to abstain. What if your husband was like, “Toss me the stone”? Would you be like, “Honey, no, no, you don’t get to toss the stone”?

Jessica: I don’t know. I think that’d be up to them. It depends on if it affects me. What you’re proposing with the stone is, if I give him the stone and he throws it and hits and kills that person, that’s something he gets to live with. It’s not something that affects me. But if it’s he throws the stone…

Kolby: And he gets chlamydia at the massage parlors…

Jessica: Right, or anything, and that would affect me, and it would have to be a discussion. But just to throw a stone, no. I would not make… I don’t know. I get to make my own decision. I would say, like, the death penalty is terrible, just like I would be like, “Wouldn’t it be great if we all ate less meat?” I’m trying to influence you in a certain direction I think we all need to go in, but I’m not saying you can’t do that.

Kolby: Jeremy?

Jeremy: I generally agree with that. You do try to influence based on the information that’s available.

Kolby: And so, if the person you’re with is like, “Hey, we’re in the red-light district in Amsterdam, like sorry dude, I want to swing by.” You’re like, “You know it’s illegal in America or it’s considered immoral in America or whatever, but here it’s okay, so yeah man, I’ll meet you back at 11 o’clock.”

Jeremy: But it still depends on the decision that we have, between us.

Jessica: It’s the relationship between the two people. If we’re talking about Amsterdam 20 years ago: pot was not legal anywhere in the United States. You go to Amsterdam, smoke it. That’s illegal here, but it’s legal there. Is that a moral judgment? No, it’s their laws. Although laws are sometimes codifications of morals, they’re not always; there are sometimes just stupid laws. I don’t think sex workers should be punished for having a profession as a sex worker. So, I’m not going to judge if there’s a red-light district. I would like to make sure that everybody’s taken care of, but that’s more of me making sure that women are okay. But I don’t think there’s a moral judgement. Now, if I’m with somebody and we’re in a monogamous relationship…

Jeremy: That’s different…

Jessica: There’s a different story there.

Kolby: Are there versions of this story that you would have liked to have read? That you think would’ve asked other questions that you would like to have answered? While you guys are thinking, I’ll think of an example. So, for example, if it had been the version you’re talking about: because there’s a foreigner in the community, you’re the only one that can kill him...

Jessica: You’re the decider.

Kolby: Maybe not the decider, but you’re the…

Jeremy: You’re the only one.

Kolby: You’re the only one that can be the axe man. And if you choose not to, he was still found guilty.

Jeremy: Instead of walking in and they’ve got him tied to the tree, they have a prison full of prisoners, these three guys are on death row waiting for a stranger to come in, to be impartial executioner.

Kolby: We’re waiting for a foreigner. We don’t want the burden in our community. Because all of us are wrapped up in it. And if you don’t do it…

Jeremy: Then they’re just going to sit here until the next guy.

Jessica: So, Jeremy, what do you do? Jeremy, what do you do? You already used up your “I don’t know.”

(laughter)

Jeremy: Yeah, as an impartial executioner, that is a tough one.

Jessica: Would you really?

Jeremy: I don’t know.

Jessica: I feel like you would. I feel like your face says you would.

Kolby: Here is the other thing: you don’t know how cruel the next person might be. The next person might see it as sporting to sort of hit them in non-vital areas for hours. You could be the humane one in this story.

Jeremy: How do you execute them? No, you have to stone them. I think that’d be really rough.

Jessica: Because you’d be a terrible baseball player.

Jeremy: Yeah. Well, no, just stoning somebody to death. You’re not going to kill them on the first one.

Jessica: This one got killed on the first one.

Kolby: I think he got lucky.

Jessica: He got lucky, okay.

Kolby: Are there versions of this, Jessica, that you think would put you more in the gray line area?

Jessica: I think, if we escalated the prison scenario that Jeremy gave us, there’s three prisoners awaiting execution, and if they don’t…

Kolby: I’m getting questions from Ashley.

Jessica: Oh, Ashley’s giving us questions. Okay. If he was in prison, and being tortured, I think I would have an easier time deciding to kill people that are being tortured.

Jeremy: Even if it’s not torture but solitary confinement. We have each of these guys.

Jessica: Which is torture. But if their well-being is taken care of, I would absolutely say no. I would be like, “No, you wait till the next person.” I’m okay, because I just would not make that decision. I’m hoping that society evolves between me and the next foreigner that comes, that they decide the death penalty is a terrible idea and they get rid of it, and they decide lifelong imprisonment, or rehabilitation, or whatever they decide. I think I would still have trouble executing someone because of that. However, if they’re being tortured and I’m witnessing it, I feel that that is so inhumane, it might move me to kill someone.

Kolby: What if it wasn’t death penalty? What if they’re like, “Look, our punishment in this case is getting a hand chopped off.”

Jessica: Uhh.

Kolby: So now…

Jeremy: You’re still the impartial party.

Kolby: And now it’s not the death penalty, which is the thing I know you have a problem with.

Jessica: Right, right, right.

Kolby: It’s something, maybe you don’t believe in taking off a hand, but whatever, or maybe it’s a branding of this sin, right? So that people know.

Jessica: Well, does it happen if I say no?

Kolby: Yeah, of course.

Jessica: Well, then I say no. Why in any of these situations would anybody say yes? There’s a line.

Kolby: Instagrammer said yes.

Jessica: There’s a line of people waiting to brand this person. You’re first, do you want to go? No? Okay. Next person.

Jeremy: But you’re the stranger and we can’t do it because none of us are impartial, we need the first stranger to come through to perform the punishment.

Kolby: It would make all of us feel good to punish this person therefore we cannot be the person who punishes them.

(Ashley asking question off camera)

Jessica: So, Ashley asked, what if you get punished for not punishing them? What if I die if I don’t kill them? I’ll kill them. Not a problem.

Kolby: Really?

Jessica: Absolutely.

Kolby: That’s not the answer I expected.

Jessica: Really?

Kolby: Ashley also wrote on the piece of paper: he’s a veteran, and he’s probably killed people before in other situations. So, in that sense, it’s not like… he at some point had made this decision, like, “Killing people at times is what we do.”

Jeremy: It’s what we do, but again I feel that’s very situational as well, even for soldiers where...

Kolby: Because there’s been some theoretical declaration of war.

Jeremy: Right. And they have rules of engagement.

Kolby: Alright. Well, sorry Ashley, we tried. We tried.

Ashley: What if the first person to throw is under 18, or never killed before, or was a child? What if they go to the front of the line?

Kolby: Do they have a fast pass from Disneyland?

Jessica: So, Ashley’s question is if the foreigner was a child.

Jeremy: A minor.

Jessica: I don’t know. I think that’s awful, that’d be a messed-up society.

Kolby: That’s a screwed-up society to do that. Alright, so you’ve been listening to After Dinner Conversation. We went a little bit long on this one, surprisingly. We’ve been pretty good about sticking to a half an hour, but we went a little bit over, sorry about that. After Dinner Conversation: short stories for long discussions. If you enjoyed this, please “like” and “subscribe.” Please listen to other ones too. There’s a backlog now, and all of them ask really interesting, good questions, mostly involving Jessica picking on me. Or cats. Also, if you’ve got a story you want to submit, submit it to our website afterdinnerconversation.com. Download these stories on our website or Amazon, or wherever e-books are sold, or podcasts are found, or YouTube. We basically have populated the Earth with these things. So yeah, thank you for joining us. And I think this might be our last La Gattara one. I think our next one might be from a different location. It’s possible.

Jessica: Will there be puppies?

Kolby: Man, I hope there are puppies. Man, I hope there are puppies.

Jeremy: Let’s find that.

Kolby: We need to find a puppy place.

Jessica: La Pupparia.

Kolby: La Pupparia. La Canine-ria.

Jeremy: Not a puppymill. That’s different.

Kolby: A puppy mill. That’s not the same. If you want to adopt any of these cats, just come on down. It’s La Gattara in Tempe, Arizona. They are all available for adoption. If you don’t have the commitment yet to adopt one, you can pay $10. They’ve got Wi-Fi, you can sit and hang out, and I guarantee you’ll get a cat who sits on your keyboard.

Jessica: Yes.

Jeremy: Yes.

Kolby: Guaranteed. And it’s just a great cause. So, thank you for joining us again.

Jessica: See ya next time!

Jeremy: See ya next time.

Kolby: Wherever that may be.


E11. "Rainbow People Of The Glittering Glade" - Does society have the right to tell you you’re worthless?

STORY SUMMARY: Three wards are sent by the Kingdom through shifting deserts to find a rumored people that have rainbow skin. As they get closer they see people in the desert turned to stone, and others nearly stone that simply repeat the same simple task over and over again. One member of their group is injured, so they arrive at the rainbow people in need of medical attention. They learn that anyone who lives in the community will slowly turn to stone unless the community deems them of value and allows them to take part in a ritual. One member of the group does the ritual and joins the community, one refuses and turns to stone, and one goes back home to tell the tale.

DISCUSSION: Fascinating story about how a society places value on certain kinds of work. Is certain work more valuable than other work? Must you work and contribute to society to be of value? What if you just don’t want to work? Are you a bad person? Is it okay to just enjoy life? Also, there are faith discussions in the story. One person opted to turn to stone rather than join a group with another faith. Does this mean she isn’t of value because she is faithful to her truth?

BOOK LINK: Download the accompanying short story here.

MAGAZINE: Sign up for our monthly magazine and receive short stories that ask ethical and philosophical questions.

SUPPORT: Support us on Patreon.

FOLLOW: Twitter, Instagram, Facebook

“Does society have the right to tell you you’re worthless?”

Kolby, Jeremy, and Jessica discuss the ethics in the short story, “Rainbow People Of The Glittering Glade” by David Shultz.

Transcript (By: Transcriptions Fast)

Rainbow People of the Glittering Glade 

Kolby: Okay. And welcome back to episode, I believe, 11 now.

Jessica: 11.

Kolby: Of After Dinner Conversation short stories for long discussions. I am your co-host Kolby with…

Jeremy: Jeremy.

Jessica: Jessica.

Kolby: And Jeremy is wearing a shirt from where we are today.

Jeremy: From La Gattara.

Kolby: From La Gattara. Yeah, where all these cats, except for this one who is like a permanent resident, are available for...

Jeremy: For adoption.  

Kolby: For adoption. Yes. Or you can come and just pay some money to hang out with them. After Dinner Conversation is a podcast as well as a website as well as Amazon e-books including the book we’re talking about today, “The Rainbow People of the Glittering Glade.”

Kolby: And so, we’ll...

Jeremy: By.

Kolby: By… that’s a great question Jeremy. Hello kitty, you’re sitting on my...

Jeremy: It’s in here, David Shultz.

Kolby: Thank you. The cat’s sitting on my thing there, which you’re allowed to. Keep sitting, kitten. Right. And so, you can download those, read those; you ideally should read them beforehand, before listening to the podcast.

Jessica: I’d say this one in particular.

Jeremy: In particular, yes.

Jessica: It’s a lengthy piece but totally worth it. It is a fantastic story.

Kolby: Yeah.

Jessica: And we’re going to be, I think, diving deep into this one about the world that was created and so I think you really need to read before you…

Kolby: And I would say for me, we’ve had a lot of stories that I’ve liked, certainly I’ve liked the discussions of all of the stories and all the questions asked. This is one of the stories where I just really loved it. Like, I read it and I wanted to talk to people about it and I wanted to ask questions about it and we were trying to like hold off on our conversation before we started taping so we didn’t talk about it beforehand. Yeah, I feel like this sets a new bar probably for some of the best submissions we’ve gotten.

Jeremy: It’s really well written. It’s an interesting topic. Really good development of characters.

Jessica: The world building is really good.

Kolby: And it’s longer which is usually hard to keep up that sort of level of interest for that, you know? It’s easy to do like a 3-page story that’s one question. This is a 15-page story that asks 15 questions. It’s just really solid. And so, I think that I drew the short straw this time?

Jessica: You did.

Kolby: Okay, so, I am not as good as Jeremy or Jessica at...

Jessica: I’m not good.

Kolby: …at the summaries, but I will try and summarize. So, basically, we’re in a sort of fictional world where it maybe has like other stuff going on. It’s probably other-worldly but it’s intentionally vague about it. And three people on behalf of the kingdom are sent to the Shifting Desert to find a rumored community that is violating the social norms of society and that they’re practicing slavery, and I think there’s human sacrifices, and just, you know, cats and dogs living together. And so, they wander through the desert to find this hidden place in kind of a Lewis and Clark exploration kind of thing. It makes it even more interesting in that, because of the plate tectonics of where they are, which is why you kind of think it’s probably not on earth, there are minor plate tectonics. And so, even if you’re walking in a straight line, the ground underneath you literally can slowly be moving you off course because the tectonics are shifting all the time. And so, it’s particularly hard to find this community. The three of them get to the community. In the process, I think two of them become injured, one from a sprained ankle or something.

Jessica: Broken arm.

Jeremy: And the other bit by a snake.

Kolby: By a venomous snake, so they sort of stumble into the community on their last leg with not much, and running out of water and they’re just in bad shape.

Jessica: One is like a warrior.

Kolby: Yeah.

Jessica: One is like a religious person.

Kolby: Right. I feel like we’re talking about Dungeons and Dragons classes, right? And one is a dwarf. No.

Jessica: One is a cleric, yea.

Jeremy: One is the government envoy, the scholar.

Jessica: The scholar.

Jeremy: The warrior.

Jessica: And the religious person, the cleric.

Kolby: The religious person, scholar, and warrior.

Jessica: Yes.

Kolby: And so, one of the things that’s interesting, sort of the teaser in it, is as they get closer to the community, they start to see statues out in the desert.

Jessica: Pure white.

Kolby: Pure white. Like marble-looking statues that are exact replicas of people. And then as they get closer to the community, the statues are moving but only in a repetitive pattern. So, it’s just like one short motion. Like, kind of like Chuck E. Cheese, right? Just one sort of thing over and over again.

Jessica: Like an automaton or something.

Kolby: Yeah. But they can’t talk to them or communicate with them, and they’re not as marbleized. They have an ashen look to them. And this is all of course very freaky to them. When they stumble into the community, the community takes them in and you learn the story of what’s going on. And in essence…

Jessica: Oh, and, sorry.

Kolby: Yeah, go ahead.

Jessica: I was going to say, they heal them.

Kolby: Oh, that’s right. They heal them of course. Way better healing abilities than they have.

Jeremy: Than the cleric has.

Jessica: Than the cleric has. And the cleric is the only one that’s not injured. The warrior and the scholar are the ones that are injured. And as they are healing, they learn more about this community.

Kolby: Right, because they can’t leave immediately because it takes 3-4 days or a week for their sort of wounds to heal up, so they can’t just immediately return. So, one of the things they find out in sort of their research to bring back to the kingdom is that being in the community is a double-edged sort of thing. One, you get all of this great knowledge and great things and you live in this sort of idyllic community in the desert that is, you know, they don’t know war and they don’t really know famine and they all have goals and purpose. But, you will slowly stone-ify, for lack of a better term, and you will become one of the people they saw when they were coming in. And as you turn to stone, through whatever process this curse has, you will want to isolate yourself from the community. So, the people that are just outside the community are the ones who are more recently sort of stone-ifying and they can still move a little bit. And the ones that are farther out from the community are the ones who have totally turned to stone and are now essentially dead, which, I believe, what are they called? They said…

Jeremy: They’re called “the drull”

Kolby: “The Drull”, D-R-U-L-L is how they spell it. And so, this process immediately starts, within days, unless you are accepted by the community. And the way the community…

Jeremy: Well, it’s explained to them that...

Kolby: They have to have value, right?

Jeremy: It’s because of the magic in the area. And I think it’s even presented as magic, that this is the cause of this. This is the power that heals you but it also turns you to stone.

Jessica: Right, yeah. It’s inevitable if you don’t have...

Kolby: If you don’t go through this process. And so, everyone in the community can be a part of this process to not turn to stone, but they have to be, sort of, selected or voted on by a panel of elders.

Jeremy: By the community. Judged valid.

Kolby: Yeah, who judge valid. And all three people that wanted in are judged valid.

Jeremy: Because of their credibility.

Jessica: Judged worthy.

Kolby: The scholar because of his scholarship. The warrior because of his athletic prowess. And the cleric because of their devotion to God, even though it’s not their God, totally fine. And so, they take them down into the inner chamber where there is a giant floating crystal that is so perfect. It’s like a God figure crystal. And then they, for lack of a better term…

Jeremy: They get a tattoo out of one…

Kolby: They kill one of the drulls.

Jessica: It sucks up the essence of the drull into the crystal. Yeah. And then…

Kolby: And then becomes ink.

Jessica: Becomes like a tattoo machine.

Kolby: And they tattoo it on their forehead.

Jeremy: On their forehead is the one that keeps them alive.

Jessica: It’s a seal.

Kolby: Right. And that keeps them from ever stone-ifying. And the scholar and the warrior choose to take the serum or the magic thing, and the religious person because of their religious belief, chooses not to.

Jessica: I would not say we are clear why she chooses not to.

Kolby: That’s true.

Jessica: You say it’s because of religious beliefs. That’s never articulated.

Jeremy: It’s sort of articulated.

Kolby: Maybe I just assumed it.

Jessica: I disagree that it’s articulated. Show me in the story.

Kolby: Jeremy’s going to start sorting through. And so, then they heal up, and you go back and you realize that the story that you’re reading is the letter that the emissary, the main sort of emissary person, the scholar, has given to the warrior to return to the king. Because the scholar has chosen to go native, and live in the community. The warrior has chosen to go back and use some of that technology to help them in war.

Jessica: So, I want to clarify that point. So, after the seal, they also can get additional tattoos that will enable them to become better warriors or better whatever…

Kolby: Allows them to run faster.

Jessica: Run faster.

Kolby: Go longer without water

Jessica: It was almost like photosynthesis. You lived off of the rays of the sun.

Kolby: It also sounds complicated; this is why we’re saying this is one of the stories you should definitely read.

Jessica: And then the price is that you have to go find one of the drulls and bring it back in order to get these additional powers.

Kolby: And one of the things they talked about, ideally you should be the one to pick the drull.

Jessica: Correct.

Kolby: And I think, to me, the implication was you shouldn’t pick randomly. It should somehow be related to the meaning of the tattoo that you’re getting, or have meaning to you.

Jeremy: Meaning to you or the skill that you want.

Kolby: Yeah, some personal attachment to you. And the drull gets the sort of life sucked out of it, and its blood or whatever gets turned into more tattoo ink. And you get the tattoo from the essence of the drull, which gives you the ability to run faster or jump higher or whatever.

Jessica: So, I cut you off, and then at the end of the story, it’s the letter going back to the kingdom, the scholar has gone native, the warrior’s going back, and the cleric is a drull now.

Kolby: Has become stone, or is almost through the process, or pretty close to it. And so, essentially, you don’t know it at the beginning, but you’re reading the story from the emissary to the king about, here’s what I saw and, by the way, I’m not coming home.

Jessica: Yes. Yeah. Yeah. He’s totally. Yeah.

Jeremy: Okay, so we’re both right.

Kolby: Oh, let’s hear it.

Jeremy: Okay, Syrena refused the seal. Syrena’s the cleric. “There’s little that I can say of her reasons for doing so, which were expressed not in terms of logical rationales, but rather emotional aversion and visceral distaste.” So, we see his view of what her reasons are, is an emotional reason not to do it. “I attribute her attitude to the strength of her devotion to her faith. It is to her credit that she resolved, even in the face of such beauty, to abstain from what had been offered. This is a rare and commendable ability, perhaps one that is honed by fasting and chastity, the regular refusal of our natural inclinations.”

Jessica: Okay.

Kolby: So, it doesn’t outright say it but, you’re kind of like, “Ehhh.”

Jessica: Well, he thinks it’s religious.

Jeremy: It’s his impression.

Kolby: But it might not be.

Jessica: But it was an emotional and visceral reaction.

Kolby: So, can I just talk briefly one of the things this writer does and I thought, “Wow. Like, I’ve never read that.” He does it a couple of times where he doesn’t do like a “he said- she said- he said- she said”. Instead he does something like, “We had a conversation, while I don’t remember everything, here’s what I took away from it.” And it’s sort of like he’s paraphrasing the conversation for you, which once you realize it’s a letter, makes perfect sense that it wouldn’t be a “he said- she said”, it’s not the way you write letters and stuff the way you remember stuff. I thought that was a really apt tool.

Jessica: That was an apt tool. I also think it’s a great tool to indicate perhaps bias. Right? Like, this is what I took away from a conversation, “Oh, I’m not going to see that conversation? Okay. This is your take on the conversation which is just the good bits that you think.”

Jeremy: It comes back to the idea of the unreliable narrator. We have to take his word on it. It’s his story; his version of what happened. Now, they can talk to the warrior who returns to corroborate.

Kolby: I think the warrior for sure would’ve had a different take on it, right?

Jeremy: Yeah.

Kolby: When I read this, it kept making me think of reading some of the Lewis and Clark primary sources. Where it’s like, I know you’re trying to be faithful to what you saw and I get that you’re trying, but I also get that you’ve never…

Jessica: Never seen this before, experienced this.

Kolby: Right. So, I appreciate the effort but I’m not, you know…

Jeremy: Exactly.

Kolby: Jeremy…?

Jeremy: I really like this story.

Kolby: Yeah, you also liked it a lot. We’re unanimous on this one.

Jeremy: Yea, it’s a great story. I love the way it’s presented, the way the culture is presented, the way you’ve given these 3 different views of what’s going on and three different options.

Kolby: Choose your own adventure book.

Jeremy: In some ways.

Kolby: I would read this story 2 more times from the other two characters’ perspectives.

Jeremy: Absolutely.

Jessica: Yeah.

Jeremy: Just to see what their take on it is would be great. If you’re listening...

(laughter)

Kolby: If you’re listening writer… please!

Jessica: Get a little Canterbury Tales aspect, right? 3 pilgrims, different takes.

Jeremy: What’s the Kurosawa movie with the 4 versions…?

Kolby: So, you go Kurosawa, and I was going to go with that one with the car wreck, where the 4 different people all come to the car wreck and leave the same.

Jeremy: They based it off the other one.

Kolby: They based it off of that? Whatever. Jeremy, I’m curious, so there are some ethics. I mean, obviously we’re gushing over the writing because it’s gushing worthy, but there are some ethics issues.

Jeremy: Absolutely.

Kolby: What did you think of the Drull? As you’re reading it and thinking about it, what was your opinion about…?

Jeremy: Well, it’s interesting because it’s the society that decides whether or not you’re worthy of getting this tattoo.

Kolby: I should also mention one other thing too, you don’t start to stone-ify unless you’ve hit puberty. So, it’s not like a baby has to prove its worthiness. You’ve got like…

Jeremy: It’s a little like Logan’s Run in that way.

Kolby: You’ve got 14 years to know what’s coming, and to like mentally get yourself to be like, “Look, I’ve got to up my game. I got to prove myself”, because then that age is when it starts to happen. So, you’re choosing at some level the effort you’re putting in to creating your worthiness.

Jeremy: Well, that’s a real pressure. It’s not just like, you get into a good college or not.

Kolby: You want to talk about a final exam?

Jessica: Right.

Kolby: That way is worse than the AIMS test.

Jeremy: Yeah, it is an ethical conundrum because, again, it’s the society that’s determining your worthiness. So there are, you know, shifting morals and shifting values. How do you know? It’s not like there’s an SAT you have to pass.

Kolby: What if you focus on a good mile time and it turns out you don’t need milers anymore?

Jeremy: Exactly.

(laughter)

 Jessica: Right.

Kolby: But you get the impression that they’re not, because they were really free to give it to the three people that came in…

Jessica: But, but, but… those are three outsiders that can go report back and get people to come and kill them. It’s three. And they already, I mean, I’m going to assume that they know that this outside Kingdom has already dubbed them human sacrifice, slavery, right? They already have a bad PR rap, of course they’re going to be like, “Here you go! Free stuff for you!”

(laughter)

Jessica: It’s the vendor who buys you dinner. Right? It’s not, I would say, it’s very biased that they…

Kolby: So, you think they’re being shown the best parts of the community?

Jessica: Absolutely.

Kolby: I didn’t take it that way at all.

Jeremy: I think somebody even brings that up.

Kolby: The warrior does. The warrior is totally suspicious of all of their motivations. The emissary guy is like, “No, no, let’s just be cool.”

Jessica: “Yeah, let’s just be cool, I dig it here.”

Kolby: “Be cool honey.”

(laughter)

Kolby: So, one of the things that I really loved about this was, I was struck with the question of why would you be a drull? Why wouldn’t you try, if you know it’s coming, and it doesn’t sound like there’s a limited number of slots or whatever. Why would you choose that? And then it occurred to me, everyone, everyone around me in society chooses that every day. Every person who gets up and goes to work and comes home and watches TV and like…

Jeremy: It’s an accumulation of all those things: it’s how you’re raised, it’s all of your decisions compounding on previous decisions.

Kolby: I thought one of the really great things about this story is that it created a more in-your-face example of the thing that people in real life do every day. Like, I feel like there are a lot of drulls walking among us. People where, like, the biggest thing they’re waiting for is to pay off their mortgage.

Jeremy: Or not even that. Just making next week’s rent.

Kolby: Right. And it’s just a complete lack of belief that you can craft yourself or the society around you to your hopes.

Jessica: So, I totally disagree.

Kolby: Good. That makes better discussions.

Jessica: So, I got from the discussion questions at the end that you would go this way.

Kolby: Right, because I write the discussion questions.

Jessica: Yes. You write the discussion questions at the end. So, I think it is a reflection of society, but I think it’s this wonderful reflection of society of how…

Kolby: Oh, I know where you’re going. How we judge people’s value?

Jessica: Who gets to pick what’s worthy? Women’s work is never worthy. Never. It’s something that we never consider to be part of, do you work or do you stay at home? Right? Guess what, staying at home is fricken work, especially if you have kids! It’s very, very ableist: “Oh, you’re able to work; your worth is determined by your productivity and success, based on the productivity and success that I want, as the judger.”

Kolby: I’m judging you based on my values of what I think has value.

Jessica: Absolutely.

Kolby: I have a counter point for this.

Jessica: Sure.

Kolby: I would agree with everything you just said, except what the drulls are doing.

Jessica: Okay, tell me…

Kolby: Because before you totally return to stone, I took their repeat action as the thing that they mistakenly thought mattered, that they couldn’t let go of. Whether that’s gardening or prayer or whatever, the thing where they’re like, “No, no, no, I have to keep the floor mopped because that has value.” So, you sort of solidify into that value-less action until you truly are gone.

Jessica: So, here’s my disagreement on that. If I knew, if I grew up here, sorry Jeremy we are not letting you talk.

Jeremy: That is fine.

Jessica: If I grew up in this society and I knew that the society would not determine what I am doing as adding value. Perhaps I’m a woman and they don’t think that women’s work is valuable. Or perhaps I do something, I’m a writer and they think art sucks, so I have a choice because I know there is a slim…

Kolby: So closer to the community we see you writing over and over and over again because you’ve been pushed out because they don’t value writers.

Jessica: They don’t value writers, or they don’t value artists, they don’t whatever. I get a choice at some point…

Kolby: I think you’re about to change my mind.

Jessica: … I either get to try to have a slim hope of redemption, so my drull action will be whatever maybe will redeem me, because they do. That’s the thing about the story, they maybe pick somebody to redeem. The woman that shows them around…

Jeremy: Has been redeemed.

Jessica: Has been an advocate of redemption. I either do an action that maybe they’ll find redemptive, I’ll do gardening, I’ll hold up the bridge, I’ll hold up the aqueduct that collapsed, which happens in the story, or I do what will make me happy, because that’s what I’m going to do until the end of time, because the chance of redemption is so slim.

Kolby: I think that’s why the story is so interesting is I think that is the choice that the cleric makes.

Jessica: Right. She does what she loves.

Kolby: The cleric says, “Look, I understand that you don’t think what I’m doing has value but that doesn’t mean it doesn’t have value. So, I know I’m not going to heal fast enough to get out of here and I know that I’m not going to change the core of who I am to impose, to take on your value system, so I’m willing to sort of die with my value system….

Jessica: …doing what I love.

Kolby: Doing what I love, knowing that you don’t value it.” And in that sense the cleric is an admirable character.

Jessica: I think the drull are admirable. It just depends…

Kolby: On why they’re drull.

Jessica: Well, and …

Kolby: If there’s just a guy playing Playstation eating Cheetos.

(laughter)

Jessica: That’s doing what he loves. It’s his life to choose that.

Kolby: Why can you force me to not eat Cheetos and play PS4?

Jessica: And what I will say is this idea that this worthiness on the backs of others is the part that kills me.

Kolby: That’s what I think creates a great moral ambiguity in the story though, right?

Jessica: I don’t find it. I think it’s immoral. I don’t think there’s any ambiguity.

Kolby: They think they’re essentially feeding on the dead, but of course the dead don’t think that.

Jeremy: And what about the class structure that is creating the values that they’re judging on? In terms of, they need drull to get additional skills, so wouldn’t you game the system to create more drull so that you have…

Jessica: Absolutely.

Kolby: You got more to feed on.

Jessica: There can never be this society without the drull. There can never be this American society without the people who pick up garbage, without the people who work at Amazon, living their drull life, making it week-to-week paying their rent; we can’t have this society without…

Kolby: Unless we feed off of them.

Jessica: So, we feed off of them. I think this is a great mirror of what society is.

Kolby: Wow, you might have changed my mind actually.

Jessica: I love that.

Kolby: No, I’m totally fine to have my mind change, that’s why I want to have these conversations.

Jessica: Just not with me.

(laughter)

Kolby: No, I mean, you’re wrong, but you changed my mind. I’m perfectly allowed to hold 2 contradictory things about you simultaneously.

Jessica: Absolutely.

Kolby: Jeremy, what was your take on the choices that each of the three characters made? Were you respectful of all 3 or did you stack them in order of preference?

Jeremy: No, I think it’s presented very well that they make the choices that are suitable to them.

Kolby: Suitable to their trait so to speak.

Jeremy: To their traits. So, the cleric absolutely makes the choice not to do this because it’s morally offensive to her to do this and that she has such faith in her religion, presumably.

Kolby: And sees value in it even if it is valueless to others.

Jeremy: Exactly. And the other two, yeah perfectly, it makes sense they would make these decisions.

Kolby: What about the fact… so, Jessica has a problem with the society and that they sort of feed and level up on others, but on some level the emissary, I don’t know why I keep calling him that but that’s what I’m going to call him…

Jeremy: Yeah, he’s the emissary.

Kolby: He’s been chosen to do the thing that Jessica has a problem with, but you don’t have a problem with his doing it?

Jeremy: Well, I think it makes sense to him. And they’re gaming him so that he writes this letter.

Kolby: A lot of moral relativism going on.

Jeremy: Absolutely. It doesn’t mean he’s right or wrong.

Kolby: It just means he’s different.

Jeremy: It’s the right decision for him.

Jessica: So, one of the things that I thought about at the end of the story, because I do have problems with this society, but we’ve all met me, I’m not the cleric, right?

(laughter)

Kolby: Didn’t you just in the last episode say, “Your husband had a 70% chance of living, so I abandoned him?”

(laughter)

Jessica: Yep!

(laughter)

Kolby: Maybe we don’t let your daughter listen to that episode.

Jessica: Let’s not let her listen to that.

Jeremy: But Alex, it’s okay?

Jessica: Alex, I’m so sorry again. You know I love you.

Kolby: We know you would’ve swum out for your daughter.

Jessica: I would absolutely swim out for my daughter. But, so, we know I’m not the cleric, I’m not going to self-sacrifice and I’m not going to write and not live.

Kolby: Which one of these three characters are you?

Jessica: I’m not any of these characters. But the question is, if I’m in this party and they offer you the seal, and you know the choice is really: I either become a stone statue doing what I love, or I get the seal and have a chance of escaping this place and warning other people.

Kolby: Come back with the nukes.

Jessica: What do you do? Do you get the seal knowing that somebody dies on behalf of you, but then the hope is that you would save more people? And with the shifting sands, the chance of finding those people again is very slim.

Kolby: I didn’t even, until you brought it up, I didn’t take the drull as being alive. In my mind it wasn’t feeding off the living, it was feeding off the dead. Because in my own bias, I think which you’ve slowly changed my mind on, I feel like it’s okay to feed off the dead because they’ve already chosen to die.

Jessica: Well, and that society has set that up. Everybody in that society believes that. You can’t live day to day believing… I mean, we believe that every day. We believe that people that clean your houses make that choice, we believe that the people that can’t find jobs make that choice.

Jeremy: It’s another good metaphor for A-type people who look down on anybody else below them who aren’t, like, absolute achievers. Why would you not be the best person you could absolutely be every day?

Kolby: It’s like, “Maybe I just want to be happy? And this makes me happy.” I struggle with those people, I can’t lie.

Jessica: I’m definitely a Type-A person, we all know me, but my therapist was very much like, “You keep climbing a career ladder; do you want to keep climbing?” I was like, “Oh, I don’t know, I haven’t thought about that. I’m just looking for the next...”

Jeremy: This is what you do.

Kolby: I just saw a mountain and I climb mountains.

Jessica: Right, that’s exactly it. So it is, that is very much people look down. Kolby and I had this conversation where we always want to make people’s businesses as efficient as possible. We want to make restaurants work better. What if we don’t care? What if we just let people live their lives and they don’t want the most efficient restaurant and want just like this experience?

Kolby: It reminds me of the story…

Jeremy: It works well enough.

Kolby: I’m happy. Like, why do… am I going to be happier if I have more business?

Jeremy: If I’m making 10% more money?

Jessica: And making enough money to do what I love and what I love is sitting playing PlayStation on the couch.

Kolby: Yeah, and I get that you judge me, but I’m happy.

Jessica: Right.

Kolby: It reminds me of years ago when I was working at a job that was a little bit of a drull job, I’ll just admit…

Jessica: A drull job or a droll job?

(laughter)

Kolby: I think that’s part of the reason why the name is chosen too. It’s that I was talking to a co-worker about, “I need to do more with my life.” And he said, “You know, you remind me of the guy who went to the Amazon and saw the guys hanging out on the side playing soccer and pulling fish out of the water once in a while to eat. And he goes, ‘No, no, no, you’re doing it all wrong. You need to set up nets and you need a factory, and you need to process it, you need to sell it, and you need to have higher margins and you need to get into a global market.’ And the guy said, ‘Why?’ He’s like, ‘So you can sit on the side of the river and play soccer with your friends all day and fish.’ And they’re like, ‘Yeah.’”

Jessica: Yeah

Kolby: Yeah.

Jessica: That’s a great…

Kolby: It’s a great story, right? And I feel like that’s the case, where it’s like you chase this thing. I’ll tell you one thing I’m glad the story left out, but I definitely thought about it: it kind of made it clear that if you were 13 or 14 and started to enter puberty and you got a problem with this, and I don’t know why you would have a problem with it because you grew up in the culture and you wouldn’t know any better, the shifting sand and all the terrain would make it almost impossible to escape. There is no ability for somebody born into the community to opt out of the community.

Jessica: And it’s impossible for those 3 people.

Kolby: You can’t leave home a lot in this case.

Jessica: These 3 people can’t escape. You either turn into a drull..

Jeremy: … or you accept the…

Jessica: … or you accept the emblem and then leave? And it’s interesting, I think that this letter comes back and it’s like, “No, no, no, they don’t do human sacrifice”; they absolutely do.

Kolby: I did not think they did until you sort of talked about it. Now I get it better.

Jessica: And so, you didn’t answer my question, do you get the seal and try to warn people, or do you not take the seal because you know you’re taking somebody’s life?

Kolby: I totally get the seal.

Jessica: Seal.

Jeremy: Yeah, definitely

Kolby: I understand that you think your life has value. I don’t think your life has value.

Jessica: Oh my god.

Kolby: And therefore, if I’m being honest, I would like to be less horrible, but I’m not. I look at someone like Elon Musk and I think he’s worth more.

Jessica: Oh, I definitely don’t. The guy’s a jerk.

Kolby: But here’s the thing, I don’t care that he’s a jerk, I care that he’s pushing us as a world slightly, just a hair beyond where we were yesterday.

Jessica: Okay, but for the record, Elon Musk is one of the drull. I am sacrificing that dude to get the seal. Just telling you. Even though there’s a chance of redemption for him, and he could go change the world, I’m first.

Kolby: I feel like 99% of society lives off the sort of invention of 1%. And I would put you in that 1%...

Jessica: Aww, that’s so sweet.

Jeremy: But he’s gaming the system so you write a good letter.

Kolby: But a lot of people will never even have the smallest bit of adding to, whether it’s cellphones or space exploration or understanding of psychology or medicine. They really, you know, do data entry for a living. And I’m just like, I want to be a better human being… This was one of our conversations like 5 episodes ago, I want to be more empathetic to that person, but I’m perfectly fine being like, “Yeah, no, you’re food.”

Jeremy: But we need people to do data entry. We need a lot of people to do that job.

Jessica: And I think it’s one of those things, I don’t care about the data entry or the invention part. For me it’s the skew of you think the furthering of humanity…

Kolby: You’re overlaying your values on someone else’s choices.

Jessica: … is the goal, and I think the experience of life is the goal.

Kolby: Right. And that’s just a difference in perspective that society has now overlaid on people making this choice. I look at Switzerland and I’m like, “Yeah, 300 years of peace and prosperity and you gave us the cuckoo clock.”

(laughter)

Kolby: I don’t think that’s impressive.

Jessica: Don’t they also do really good chocolate?

Kolby: Yeah, they do. And they do really good…

Jessica: Neutrality?

Kolby: Neutrality. They do…

Jeremy: Ehhh, that’s questionable.

Jessica: That’s true.

Kolby: Unethical banking.

Jessica: They do a lot of unethical banking. And I think it’s one of those things, just to be totally clear…

Kolby: But that’s my overlay of values on society.

Jessica: And I will say, I think, I’m a humanist, I think society, I think humans, we’re all very amazing people, but at the end of the day, I do think it’s so big and it’s really about the experience. For me, it’s a lot of the Carl Sagan of the living in that moment and being part of it.

Jeremy: Absolutely.

Kolby: I’m going to call a hair bit of bullshit on this.

Jessica: Oh, tell me.

Kolby: You just said earlier, “I see a mountain and I have to climb it.”

Jessica: Yes.

Kolby: So what you’re saying is, you understand that you overlay your values on your choices in ways that overemphasize the value of achievement and sort of ticking boxes and goal-setting and moving the ball forward, and yet you’re also saying, but what I really want to do is fish by the river?

Jessica: So, society devalues women’s work, women also devalue women’s work. So, I have been raised in a society where success is… so of course I’m a box checker, of course I love that sense of achievement.

Kolby: I feel like both sides of the scale are sitting on each side of me here.

Jeremy: Yeah, pretty much.

Kolby: Like, Jeremy, I feel like in some ways you are the guy, and I don’t mean this as an insult, I don’t mean this as an insult at all, but I feel like you want to be a good father, you want to be a good friend, you want to be a good…

Jeremy: …fish by the river.

Kolby: … fish by the river kind of guy. And you’re genuinely happy, and it genuinely makes you feel good, and Jessica is kind of the other end of the spectrum. And I definitely understand your point, there is a place for both of those kinds of people.

Jessica: Yes, sure.

Jeremy: Thanks.

(laughter)

Kolby: Good news…

Jessica: Good news.

Kolby: Today we don’t eat you.

Jessica: We don’t eat you today. Just so you know, Kolby is counting down the days if you get a little ashy.

Kolby: If I’m peckish, it’s going to be bad for you. Well, we ran a little bit long on this one, not surprisingly because it was amazing. If you write something, if you’re going to submit something, you should read this story.

Jessica: Yeah. “Rainbow People of the Glittering Glade”, which is on Afterdinnerconversation.com.

Kolby: It is. And also, on Amazon.

Jeremy: And everywhere you get podcasts.

Kolby: And just so that people know that this is the kind of thing we’re looking for; I think I’ll actually link this to the submission page.

Jessica: I think that’d be great.

Kolby: So that they can be like, “Oh, let me read a sample of the kind of thing you’re looking for.” Maybe shorter, if longer is a struggle for you; Jeremy doesn’t like to stay up late reading.

Jeremy: No, it just was a busy day yesterday.

Kolby: I think this is exactly what the website was created for and I was very happy to get it. So, thank you, person whose name I don’t remember.

Jessica: Wait, our author, let’s give him some credit.

Kolby: Seriously, he’s a rockstar.

Jeremy: David Shultz.

Kolby: David Shultz, who, by the way, and I think this is amazing, has degrees in cognitive science, philosophy, law, and education, because why not?

Jessica: That guy? Checking boxes.

Kolby: He’s checking boxes. Which is ironic, that he writes this story then.

Jessica: Well, maybe because he’s like me and learned.

Kolby: Maybe, he learned-ed it.

Jessica: Kolby.

Kolby: I’m sorry. You are listening to After Dinner Conversation, short stories for long conversations. You’ve been joined by myself, and Jessica and Jeremy. And we had a great time talking about “Rainbow People of the Glittering Glade”. If you had a good time listening to us, please “like” or “subscribe.” Please come to La Gattara, get a cat, they are available for adoption, or you can come pay $10 and just hang out with them if you’re in Tempe, Arizona. Additionally, go to Amazon and you can download any of the books, this one plus probably 15 or 20-30 other ones. I don’t know, there’s a lot of them now. Candidly, not all of them are as good as this one, but many of them are very, very good and they certainly will encourage you to have really good conversations with your friends or with your kids, if you want to have a literary conversation with your family, gasp, who does that? About things that matter outside of, like, Game of Thrones.

Jessica: Also, good ethical and moral dilemmas.

Jeremy: That’s a good thing to talk about too.  

Kolby: There are, actually. I drink and I know things.

Jessica: I drink and I know things.

Kolby: Okay. Thank you. Bye.


E10. "The Alpha-Dye Shirt Factory" - Is suicide ever the rational choice?

STORY SUMMARY: The story is told in the 1st person by a woman in the late 1800s working in an inner-city garment factory. She comes out of the bathroom to see a fire has started. Women try to escape down the elevator shaft as smoke fills the room. She heads for the fire escape, but it collapses, killing everyone on it. Finally, she decides the best thing to do to prevent being burned to death is to jump to her death.

DISCUSSION: The story is a historical fiction version of the Triangle Shirtwaist Factory fire. The story isn’t really about the fire, though; it’s about suicide. Is it okay to kill yourself when you are being burned alive? What if that burning alive is depression, or recovering from some disease or injury that is very painful? It feels like it will never end, and it feels like killing yourself is the lesser pain. Nobody jumps out of a burning building because they think they are going to live; they jump because it’s better than being burned alive. How long are you required to suffer before you can stop your suffering? Are you required to never quit?

BOOK LINK: Download the accompanying short story here.

MAGAZINE: Sign up for our monthly magazine and receive short stories that ask ethical and philosophical questions.

SUPPORT: Support us on Patreon.

FOLLOW: Twitter, Instagram, Facebook

“Is suicide ever the rational choice?”

Kolby, Jeremy, and Jessica discuss the ethics in the historical short story, “The Alpha-Dye Shirt Factory” by Tyler W. Kurt.

Transcript (By: Transcriptions Fast)

The Alpha-Dye Shirt Factory

Kolby: Hi. Welcome to After Dinner Conversation, short stories for long discussions. You are listening to myself, Kolby, as well as…

Jeremy: Jeremy.

Jessica: And Jessica.

Kolby: We are doing what we do every week, we’re doing podcasts about stories from the website Afterdinnerconversation.com. You can download these stories on Amazon, you can listen to our podcast wherever Amazon is played, as well as YouTube so you can watch it, maybe you’re watching right now. And if you’re enjoying this, please “like” or “subscribe”. It allows us to continue doing what we love. We have a heck of a good time, as you can probably tell if you’ve listened to other episodes, doing these. And the whole point of this is to encourage sort of intelligent adult conversation, adult-ish conversation about interesting ethical topics. And if you’ve got a story you’d like to submit and you think “oh, that was great”, just go to our website afterdinnerconversation.com and submit it. One of us will read it, probably me, and assuming that we like it, it’ll get published and maybe we’ll be discussing it because that’s all the stuff that we get is stuff that people have sent us that we like and we’re like, “oh yeah, we can totally talk about that”, and talk about the Hobson’s choice of it as it was last week. We are once again, for the tenth time now, this is our 10th episode, in La Gattara. I have finally learned how to say it by the 10th episode. Where they have cats that are available…

Jeremy: …for adoption.

Kolby: …for adoption. So, if you hear screeching in the background, that is probably the cats letting us know that they are having a good time. Having a little cat party. If you don’t want to adopt a cat, but you just want to come visit cats because you like having stuff on countertops at home, you can just pay $10 and come sit with the cats. It reminds me of the joke where the cat walks into the bar, and the cat says, “I’ve had a really bad day, can I have a drink?” And the bartender puts it up there, and the cat goes...

(Kolby knocks water bottle off the table)

(laughter)

Kolby: “Pour me another.” And just knocks it off the countertop. I think that’s a great cat joke.

Jessica: That was a good cat joke.

Kolby: There are very few solid cat jokes.

Jessica: It was purrrfect.

(laughter)

Kolby: Purrrrfect, yeah. So, the story we’re talking about today…

Jeremy: Please stop.

Kolby:  You’re not a dad, you don’t get to tell dad jokes.

Jessica: Oh, darn it.

Kolby: … is “Alpha-Dye Shirt Factory” by Tyler Kurt. Jessica drew the short straw, so she gets to do the honors. Obviously, ideally you should have read the story beforehand, but we know not everyone does, so Jessica is going to get you up to speed.

Jessica: I will say, you should read this story beforehand, but you don’t have to. I feel like our ethical and moral discussions tend to be so broad that you don’t need to read the story to listen, but I would recommend that people go and listen.

Kolby: I will be the first to say I’ve listened to “Car Talk” for years and not once have I worked on a car.

(laughter)

Jessica: I mean, point. Point. Alright. So “The Alpha-Dye Shirt Factory”, the narrator is Mary, and she works at the Alpha-Dye shirt factory as a seamstress. The story opens where she is working, and her friend Maria asks her to come into the bathroom, and there’s this whole backstory about the rules of working in the factory and how many minutes they get for a bathroom break and how you can time it so you can talk to somebody. And as somebody who worked at a call center once, very similar rules.

(laughter)

Jeremy: Absolutely.

Jessica: And Maria tells Mary that she’s engaged. And it’s against the rules to be engaged at the shirt factory, which is again, not the call center rules, but you know.

Kolby: This is probably earlier than you were at the call center.

Jessica: Probably. Let’s not date me.

(laughter)

Jessica: And so, Maria tells her that she got engaged and Mary is very happy for her and then they smell smoke. And that is really kind of the setup.

Kolby: When they come out of the bathroom, I think it is?

Jeremy: They hear commotion, they come out of the bathroom.

Jessica: Yeah. And the smoke is coming up from the…

Jeremy: … first floor, they are on the second floor.

Jessica: … the first floor. Yeah. The first floor is clearly on fire and the smoke is coming up through the floorboards. She mentions the fire escape, and the fire escape being just rusty, and people start pouring out onto the fire escape. The fire escape collapses and…

Kolby: She watches the people on the fire escape fall to their death.

Jessica: Yeah. And then the floor falls out. So….

Kolby: It’s a wooden floor, I think?

Jeremy: Yeah.

Kolby: Like, old-timey.

Jessica: Yep. And so, old-timey, I have a wooden floor.

Kolby: But you’re not in a 6-story building with wooden floors.

Jessica: This is true.

Kolby: Maybe they do, I have no idea.

Jessica: And so, the floor falls out and then she spends the rest of the story trying to determine first, if she’s going to try to escape and then knowing she’s not going to escape and trying to decide if she would rather be burnt alive or, first she tries to do smoke inhalation, she tries to kill herself by falling asleep because of smoke inhalation. Her body rejects that, she’s coughing, and then she tries to go to the window and she decides to throw herself out the window.

Jeremy: Apparently, not on the second story, they are much higher in the building.

Jessica: Yeah, sorry.

Kolby: 7th-story. She’s on the 7th-story. So, probably fatal.

Jessica: Yeah, so probably fatal, although she is the narrator, I will say; we don’t get anything that says that she’s not. She seems to be telling the story because she says, “My name is Mary.” This is not…

Kolby: Oh, that’s a good point.

Jessica: So, we don’t really know the fate of her except that she chooses to jump out the window because she’d rather die from falling then from…

Jeremy: …being burnt alive.

Jessica: …being burnt alive. And that is our…

Kolby: Another cheery story.

Jessica: Another cheery story.

Kolby: Thurman last week was a cheery story. We need a story about just like two cats walk into a bar.

(laughter)

Jessica: I don’t think there’s a lot of ethical/moral complications with that but, we can maybe….

Kolby: It is if you’re a dog bartender.

Jessica: If you are a writer out there and you would like to write a dog bartender…

Jeremy: …Cat dilemma story.

Jessica: …cat dilemma story.

Kolby: I can’t serve those cats, against my ethics.

Jessica: Alright. So, let’s start out our conversation. Kolby, what’d you think?

Kolby: So, uh, so first off I think it’s a direct mirror of, um, what’s it called, the Triangle Shirt Factory, the Triangle Shirtwaist Factory fire, so I assume it took place late 1800s, early 1900s, certainly before there were a ton of OSHA regulations. And so, this story is roughly true in that there was a shirt factory that caught on fire and…

Jessica: There are still shirt factories.

Kolby: Yeah, well, not in America, but yes. And because there were no regulations, many of the people died. Many of them jumped out of windows. Many of them jumped down the elevator shaft. You know. It’s a horrible… I guess the good part that came out of that is, because it was such headline news, we got a lot of our safety regulations for high-rises and stuff out of it so that it would be harder to happen in the future. What did I think of it? Honestly, it struck… I’m reading it and I’m thinking, “it’s a story, it’s a story, oh, you know, it’s a sad story” and then I got to the questions, and it started talking about, I think, suicide or depression or all those sorts of things, and then I was like, “oh, it’s not a story about a fire in a factory. It’s a story about the choices that you make that are logical choices internally but could be illogical choices to somebody looking externally.”

Jessica: Mmmm. Give me an example.

Kolby: So, and this is just my own personal opinion, but I guess everything that is personal is your personal opinion, but I think there is an assumption that people that commit suicide are taking a coward’s way out or they are somehow being selfish or disrespectful or whatever. And I think, having been a person who went through years of depression, I think it’s easy to say as an outsider looking in; it’s harder when you’re in that situation and you feel like this is as good as it gets and it will never get good again. And so, it is a little bit like the situation of, “Do I just want to burn alive forever or do I want to end the burning?” And in that sense, I think, I’m not condoning suicide of course, but I think it becomes a rational choice to that person based on their perspective. And so, I think, to belittle that choice by saying that it is cowardly or cheap or you’re a quitter or whatever, I think is disrespectful to that choice. I certainly think that suicide is not a great idea, and I certainly think that your choice is selfish in that the people who love you will miss you, and you’re choosing to end your own pain at the cost of other people’s pain. Like, I want to end my pain and therefore I’m going to put pain on my mom or my dad or my friends or my kids or whoever loves me, because my pain is more important than the pain I’m going to cause them.

Jeremy: Right, I can see that.

Kolby: But I do understand how a person who feels like nobody would miss them or that nobody would be sad about their loss would see it as a rational choice. Yeah. I mean, I’ll give you an example that came up and then I’ll shut up so we can talk about other things and you guys can chime in. At like 85 or 90 years old, Kurt Vonnegut committed suicide. He’d been depressed his whole life, he was just clinically depressed, and it shows up in his writing. And I remember when I first heard, I was really crushed by this because he’s a writer I really admire, and I was like, “You understand, even if you just wrote doodles on napkins, you would be adding to the sum of humanity, because you are that amazing.” But then, I have to remind myself, “He stuck it out for 85 years or 90 years or however old he was, he gave us 15 books that are all astounding. I guess you’re allowed to be done.” And Gordon Ramsay I think committed suicide?

Jessica: No, no, no. Not Gordon Ramsay. Anthony Bourdain. Geez Louise.

Kolby: I mean, you hear about famous people that have committed suicide and you’re like, “You have everything.”

Jeremy: Robin Williams.

Kolby: Robin Williams is a great example, right? And I think, it’s easy to be angry until you have been in that situation.

Jessica: And I think it’s easy to be…. I think… I think things mitigate people taking their own life. Right? So, a lot of times, as a society, right? So, if somebody has been diagnosed with Parkinson’s, which is what I believe Robin Williams was diagnosed with when he decided to take his own life, when somebody has been diagnosed with a terminal disease, we are much more accepting as a society of something that…

Jeremy: Again, this is a way out of that pain.

Jessica: Right, this is a way out of that pain. And we… that is a logical step for society.

Kolby: Although sometimes it’s a legal step, to have doctor-assisted suicide. Or committing suicide is illegal, although I don’t know how you ticket the person afterwards; they’re dead or whatever.

Jessica: Right. But I think we as a society are okay with that but if it’s depression, if it’s a mental illness, we have a lot harder time with that because, A) as a society we do a terrible job of admitting that mental illness is an…

Jeremy: An illness.

Kolby: You want to be like, “just try harder.”

Jessica: Right. Like, “Go outside.” And we have a hard time with the idea that, we want to fix that because it is a mental illness. We think that medicine can be in charge of fixing…

Kolby: A broken bone or whatever…

Jessica: Or a disease, and if they fail, we’re okay, that legitimizes that person taking their own life. But with a mental illness, we don’t think that it’s something that medicine can necessarily solve, because it not always is; it’s not always something that can be solved.

Kolby: It’s a much more complicated problem.

Jeremy: And as a society we think that it is something you can pull yourself out of.

Jessica: You can pull yourself out of, you’re not trying hard enough, or sometimes I think society does take the blame in like, “Yes it’s bad and we should try harder to save you.” Which I think is something that I don’t necessarily think is … there is definitely cause and effect of, you know, ostracizing people or making depression something that is ostracizable. But I don’t think that necessarily, “If we were just nicer to one another, people would not be depressed.” That’s not a thing.

Kolby: Right. And that’s one of my frustrations with, I mean, when I was going through a long bout of depression, one of the things that was frustrating to me is people say, “Well you don’t seem depressed.”

Jessica: Right.

Kolby: And I’m like, “No, you don’t understand that depression isn’t sadness. It’s a totally different thing.” Like, I can be on a jet-ski and be depressed. It’s got nothing to do with if you’re smiling, it’s just this thing that just sits on you. And I don’t know how that… I don’t know…

Jeremy: … how to describe it to somebody who hasn’t experienced it.

Kolby: Yeah, who hasn’t experienced it. And it’s like, “No, I can be happy and be depressed. They are not the same thing.” The people that say, “Well, the last time I saw him he seemed fine.”

Jessica: Right.

Kolby: No, you’re totally missing that point of what depression is, or clinical depression is in that sense, right?

Jessica: Right.

Kolby: Wow, that’s a mood killer.

(laughter)

Kolby: So, one of the things that I thought… one of the questions I thought was interesting for me is, what if she had been wrong? Because that goes back to the issue of depression, right? It feels like it’s forever, it feels like you can’t get out of the building, but it’s possible. Two minutes later is when the fire truck with the longer ladder shows up. And so, do you have a, sort of, obligation to life to take the burn in the hopes that a better fire truck shows up, or are you allowed to help yourself?

Jeremy: Again, I think that comes back down to really, the pain that you’re feeling. At some point, in this metaphor, you’re deciding, and I don’t think….

Kolby: When you’ve had enough.

Jeremy: Right, this is as much as you can take, and you know, I don’t think it matters if it’s two minutes later. The fire truck is just late.

Jessica: I think it’s interesting at the end of the story, she talks about when she’s deciding to jump, one of the things she says is “Don’t aim for a body”.

Jeremy: “Don’t try to take anybody down with me.”

Jessica: And I was like, “No, no, aim for a body.” And then it becomes this question, do you…

Kolby: “Aim for the most expensive car I can find.”

(laughter)

Kolby: That’s what I’d do. I’d take out a rich person’s insurance coverage with it.

Jessica: I was like, “Land on something soft, that could be another person.”

Kolby: Right?

(laughter)

Kolby: You look plump.

Jessica: Right, exactly. And so, then it becomes this idea of, do I die and know that death is certain, right? Do I take this way out and know that death is certain, or do I aim for a big old pile of people and hope that I’m just…

Jeremy: That it breaks my fall.

Jessica: …I’m thoroughly maimed but recoverable. And so, then it’s, “I can either have an agonizing maybe death, or agonizing and then recovery and then living with those injuries.” Which is a very interesting, when we’re paralleling it to something like a terminal illness or depression… do I end it now and know that it’s over, or do I continue to suffer and hope that either I get better or I also die, and it’s the same outcome, it’s just….

Kolby: But I died fighting, I died fighting in pain.

Jeremy: Or do I go through all this chemotherapy and radiation treatment?

Kolby: That was one of my mom’s decisions. When she was diagnosed, she was diagnosed with breast cancer and bone cancer at the same time, and she was like, “I don’t want chemo. I was a nurse for 35 years, I know what chemo does to people. I know that I’m 70 years old, I don’t want to spend a couple of miserable years to get 5 more, and I’m a two-pack-a-day smoker.”

Jeremy: Huge lifestyle change.

Kolby: Yeah, she was never going to quit smoking. So, she decided, “I’m good. Like, I’m good,” and maybe if she was 40 I’d be more upset with her, but I guess at 70 I’m less upset. I don’t know. One of the true things that came out, because I researched this…

(laughter)

Jessica: What? That is Jeremy’s role.

Kolby: I’m sorry.

Jessica: He’s the researcher.

Kolby: I’ve been researching. One of the things that… there were a couple of people who survived on the top floors of the shirt factory fire, and it’s interesting the way some of them survived, in the sense that they tried. They simply tried. One of the women slid down “Matrix” style, slid down the wire in the elevator shaft until she had burned all of the skin to the bones of her hands and had to let go, and then she dropped the last 3 or 4 stories. But so many people had fallen down the elevator shaft first that she landed on the pile of bodies and knocked herself unconscious. When they later were pulling all the bodies out, she was…

Jeremy: … she was still alive.

Kolby: She was lined up with the bodies on the sidewalk and woke up and was like, “I’m not dead.” And just had a concussion and burnt hands. And that was the thing. And another woman who jumped out of one of the top stories, she jumped out for the flagpole that hangs out of the 3rd story of the building, and grabbed it, and it snapped off, and it slowed her down enough that she broke her legs and lived.

Jessica: Wow.

Kolby: Yeah.

Jessica: That’s incredible.

Jeremy: That’s rough.

Kolby: Yeah. I mean, and those are, I think, in some ways those are the things we make movies of and the heroes we have are the people who have every reason to stop trying and try anyways.

Jeremy: And keep going.

Jessica: Yeah.

Kolby: Because we somehow view that as heroic.

Jessica: We definitely view that as heroic. And to some point it is heroic. That’s a lot of tenacity.

Jeremy: No chance, but no choice.

Kolby: Right.

Jessica: Right. And…

Jeremy: I imagine you hear that a lot from military veterans too who get awards.

Jeremy: Yeah, definitely.

Kolby: They were like, “No, they were shooting at me, I shot back, it just so happens, like, you know, this was the best of a bad situation.”

Jeremy: Right.

Jessica: Yeah. I think absolutely that’s true. I think it’s interesting if you read any holocaust survivor stories, like the amount…

Kolby: We were just talking about “Night” a couple of days ago.

Jessica: Oh yea?

Kolby: Jeremy and I were.

Jessica: I love that book.

Kolby: I think it should be mandatory reading. But in that book and a lot of the Holocaust stories, the will to live somehow exceeds the body’s will to live, right? The people who lived and died, it was almost this mental desire to not die, at the level of starvation and exhaustion.

Jessica: Well, I will also say that the people that died also had that same mental…

Kolby: Oh, sure, sure.

Jessica: It just didn’t work.

Kolby: It just didn’t work for them, that’s very possible.

Jessica: And I think a lot of people...

Kolby: People got sent to the long marches and things.

Jessica: Yeah. I think a lot of it is chance. I think a lot of it, this person had “X” percent more body fat when they arrived and therefore, they could survive.

Jeremy: Slightly better conditions.

Kolby: Had better shoes and whatever.

Jessica: Exactly. But, like, the mental tenacity it takes to get through a situation like that, like every time I read a survivor story, I’m just like, “And here’s the part where I die.”

(laughter)

Kolby: Right?

Jessica: Because I’m exhausted reading it, I can’t imagine….

Jeremy: … having to live through something like that.

Jessica: No.

Kolby: This is the part where I’m like, “I’m good, like, I’m good.”

Jessica: And then, for a lot of Holocaust survivors or survivors of lots of situations, there’s that survivor’s guilt, right? The chance, whatever the chance was, the shoes, the, you know, the extra body fat or whatever it was that allowed them to survive, was not because they were a better person. There wasn’t a worthiness about…

Kolby: They were just standing at the right place at the right time.

Jessica: Exactly. And so, it then it becomes “it was just chance...”

Jeremy: It was just luck.

Jessica: “…It was a chance that I survived and now I feel terrible that that chance was me.”

Kolby: So, do you feel like a person who either has a terminal illness or is clinically depressed, do you think they have an obligation to pursue that chance?

Jessica: No. Absolutely not.

Kolby: You’re willing to give people a pass on being a quitter?

Jessica: Absolutely going to give people a pass.

Jeremy: Because it’s their choice. Again, do you want to go through the chemotherapy, do you want to go through the potential pain for the potential good outcome? And some people just don’t want to make that choice.

Kolby: So my sister and I have a running, it’s a joke but it’s not a joke, in that someday one of us is going to go with the other one to the hospital, to the doctor, and the doctor is going to be like, “I’ve got some news, you’ve got whatever whatever, you’ve got 3 weeks to live”, whatever the case may be, and we have an understanding that what’s going to happen is this: We’re going to walk out of the room, and my sister’s going to say, “So what did the doctor say?” And I’m going to tell her, “He said you’re going to be fine.” And then, she’s like, “Oh good.” “It’s just a heart palpitation, you just need to drink less Pepsi.” And then as we pass the dumpster, baseball bat to the back of the head.

(laughter)

Kolby: Throw her in the dumpster. So that her last thought is…

Jeremy: …she’s going to be fine.

Kolby: “It’s fine. It’s going to be fine.” Like, you don’t need those 3 weeks or worrying about it. And by the way, just to be clear, she has a standing order for me as well.

(laughter)

Kolby: But I’ve also made clear to her like, pneumonia does not count.

(laughter)

Jeremy: So, here’s the list.

Kolby: Shingles, I’m probably… it needs to be like.

Jeremy: So, you have to clarify it for her.

Jessica: I mean, like, there’s so many reasons to hit Kolby with a baseball bat.

(laughter)

Kolby: But we’ve had that discussion.

Jessica: “I have a freckle that’s discolored.” Oh, Kolby gets a baseball bat.

(laughter)

Kolby: “You’re going to be fine. Let’s walk past this dumpster.” Yeah, but we’ve had that discussion.

Jeremy: “I’m going to take you out to this cornfield”

Kolby: Right. But we’ve had that discussion of like, “Look, if I waiver in the last weeks, don’t let me waiver because I don’t want to be that kind of burden on other people. I don’t want to have you have that memory of me, all those sorts of things.”

Jessica: So, my mom is a hospice nurse and so…

Kolby: Man, she is like an angel.

Jeremy: She is.

Jessica: I mean, an angel, and she is the best mom in the world too. But she is very much of the “I don’t want to linger”; she’s a DNR, and if there’s something that bad, she doesn’t want tube feedings, anything like that. We’re very clear on all of those instructions, Mom.

(laughter)

Jessica: However, I am the opposite, right? I absolutely want to linger. I want to linger and linger… I want people to curse my lingering as long as I am…

Kolby: You want somebody to be sponge bathing you for weeks before you go.

Jessica: Right. But, just to be clear, I have to be conscious and I have to be of right mind. And…

(Kolby gives sideways look)

Jessica: Shut up. I am of right mind now.

(Laughter)

Jessica: Shut up Kolby.

Kolby: That look of mine said it all.

Jessica: Yes, it did.

(Laughter)

Kolby: You read a little too much Dylan Thomas is what I think. You’re all about not going gentle into that good night.

Jessica: I absolutely won’t go gently. Mostly because, probably a big part of it is just a fear of death. I’m a very existentialist person, and so there’s this idea that in my final moments, I’ll say something super profound and it’ll make it all worthwhile or whatever. But I want every single ounce of life. I want every moment to be sucked up. I don’t want to go suddenly. I kind of want to know I’m going to go, because I can’t imagine, like, the suddenness; it upsets me so much to think I could walk out the door and get hit by a train, and that would be the end, and I wouldn’t even know it was coming.

Kolby: And three witches would be cackling over it. Three witches would be cackling over it from like 2 weeks ago.

(laughter)

Jessica: That’s so true.

Jeremy: And yet you lived in front of a train track for years.

Jessica: For years.

(Laughter)

Jessica: For years. Those things don’t derail. I knew it was coming.

Kolby: Jeremy, you’ve been pretty quiet on this one. Do you have thoughts? What would be your opinion on all of this?

Jeremy: You guys kind of said it all.

Jessica: Are we baseball batting you? I just need to know.

Kolby: I need to know.

Jessica: I need to know.

Kolby: I’m actually, I’m going to use one of those little baseball bats.

(laughter)

Kolby: …that you get from the toy store so I have to do it like 9 or 10 times.

(laughter)

Kolby: Like, wack wack.

Jessica: So, you’ll know it’s coming.

Kolby: Yeah, you’ll know it’s coming. No, don’t worry, I’ll use aluminum.

Jessica: Are you a baseball bat or are you a lingerer?

Kolby: Are you a lingerer?

Jeremy: I don’t know. I’ll have to think about that.

Jessica: What do you mean you’ve never thought about that?

Jeremy: I’ve never thought about that.

Kolby: What about if you were the lady in the story? Would you jump?

Jeremy: Oh yeah, definitely jump.

Kolby: You would’ve been a jumper?

Jeremy: Oh yeah. Fire is one of the worst ways to die. Burning in a fire is one of the worst ways to die.

Kolby: I’ve read accounts from fire survivors, people that are like 70-80% burned, who have said years later, “Even now that I have lived through it, I would still have preferred to have died.”

Jessica: Wow.

Kolby: Even now knowing that they’re now going to live a long-fulfilled life.

Jessica: Wow, that’s incredible.

Kolby: Because it’s just a painful way to go. Which also makes me, and I didn’t think about this when I was reading the story, it also makes me think mad props for, like, the monks during the Vietnam War.

Jeremy: Holy crap, right?

Kolby: And the ones who lit themselves on fire with gasoline, and just, not a peep.

Jessica: Right.

Kolby: Just right up to the moment they are dead, like man, I don’t have that kind of conviction for anything. I wish I did.

Jessica: Step on a nail and I’m a screaming banshee.

(laughter)

Jessica: Yeah.

Jeremy: Yeah, definitely jump.

Kolby: We need a cheerier story for next week.

Jeremy: I know, right? What is our story…

Kolby: 2 cats walk into a bar; dog won’t serve them.

(laughter)

Jessica: If you’re that writer, please write about 2 cats that walk into a bar.

Kolby: So, you have been, we still haven’t gotten a copy, a narrative copy of the….

Jeremy: The trolley problem.

Kolby: The trolley problem. It’s coming, though, I’m sure. You’ve been listening to After Dinner Conversation, short stories for long conversations, with myself Kolby and Jessica and Jeremy. We always have, maybe not this week, but we normally always have a great time discussing our stories and heckling each other. This was a little bit of a sad one, but a good one. A worthy discussion. If you’ve got a story you’d like to submit, feel free to email it to After Dinner Conversation. If you enjoyed this conversation in the sense that you found it fulfilling, not in the sense that you found it fun, then “like” and “subscribe”. It means a lot to us. If you’re in the Tempe area, come by La Gattara to adopt a cat, or pay a couple of bucks to at least hang out with a cat.

Jessica: They are so cute.

Kolby: See, we didn’t even think about that, that cats would’ve survived.

Jeremy: We need another camera on all the cats.

Jessica: The cats would’ve survived.

Kolby: Cats can survive 7-story drops because they sprawl out and they become like a cat-a-chute.

(Laughter)

Jeremy: Cat-a-chute?

Kolby: And they’re fine. Yeah. “Like” and “subscribe”, submit stuff, download these on Amazon, and thank you for joining us again. Next week’s story will be 2 cats walk into a bar. Wait, that’s not it. What will it be, Jeremy?

Jeremy: “Rainbow People of the Glittering Glade”.

Kolby: I love this story. I know you thought it was.

Jeremy: It’s a good story. It’s just long.

Kolby: It was just long.

Jessica: I loved it.

Kolby: I feel like it wasn’t long and wasted. It was actually long and like, solid.

Jessica: Yep.

Kolby: Yeah, “Three Kingdom Warfs”?

Jeremy: I don’t know what the word is.

Kolby: Wards. There’s no F there. Three kingdom wards, people who protect the kingdom, are sent to investigate the reclusive rainbow people of the shifting desert. And I really, honestly, if you submit a story called “Rainbow People of the Glittering Glade”, you pretty much automatically get published, I think.

Jessica: I was the opposite. I saw the title and was like, “Nope…”

Kolby: So pretentious.

Jessica: “… nope, that’s a big nope from me.” I was wrong guys.

Kolby: You were wrong.

Jessica: It was delightful.

Kolby: Thank you for joining us. We will see you at the next one. Bye.

* * *


E9. "The Truth About Thurman" - Is there a "better" decision, when both cause someone to die?

STORY SUMMARY: The main character heads into the military supervisor’s office. It seems two soldiers have been captured by terrorists, who are threatening to kill them both unless the US Government tells them, before the deadline, which to kill and which to go free. One is a woman, and the other is gay. They want the government to make a Sophie’s Choice, so to speak. The government decides to do neither and launches a rescue operation that fails. Both are killed. The story ends with the soldier who started the story locking himself in his room and killing himself. It turns out he was in a relationship with the woman, and she was pregnant with his child.

DISCUSSION: The story is built around a Hobson’s choice: a choice whereby both options are terrible, and you must pick one, or both will happen. It’s interesting in that it makes us decide how we value different people. If they are both in the military, then maybe the government should not pick, so as not to encourage terrorists to kidnap others. Being in the military, you should know you may have to die for your country. Otherwise, maybe all people are of equal value. Maybe children are worth more? Is it fair that he didn’t tell his superior officer about the pregnancy? This is part of the machismo culture whereby men aren’t allowed to feel things, or talk about how things affect them. In real life, of course, he would immediately have been removed from the situation.

BOOK LINK: Download the accompanying short story here.

MAGAZINE: Sign up for our monthly magazine and receive short stories that ask ethical and philosophical questions.

SUPPORT: Support us on Patreon.

FOLLOW: Twitter, Instagram, Facebook

“Is there a better decision, when both cause someone to die?”

Kolby, Jeremy, and Jessica discuss the ethics in the terrorist-driven short story, “The Truth About Thurman” by Jenean McBrearty.

Transcription (By: Transcriptions Fast)

The Truth About Thurman

Kolby: Hi, and welcome back again to After Dinner Conversation; short stories for long discussions. I am your co-host Kolby.

Jeremy: I’m your co-host Jeremy.

Jessica: I am a co-host... is there such a thing as three co-hosts?

Kolby: Tri-hosts?

Jessica: I am a side-kick Jessica.

(laughter)

Kolby: And today we are talking about “The Truth About Thurman”...

Jessica: By Jenean McBrearty.

Kolby: If you haven’t read it yet, you should read it ideally before listening to the podcast. You can get that at amazon.com, you can download it. If you like this podcast, feel free to “like” and “subscribe”. If you’ve got a story you want to submit, go to our website Afterdinnerconversation.com. You can submit stories, we’ll read them, if we love them, then they’ll be one of the ones we discuss someday.

Jessica: Where we at?

Kolby: We are at.... thank you.

Jessica: You’re welcome.

Kolby: We are at La Gattara, where they have cat rental. No, it’s not cat rental, they adopt cats out and they’ve always got loads of cats. We’ve had them... if you hear clicking and clacking in the background it’s not that Jeremy’s a bad sound guy, it’s that...

(laughter)

Jeremy: It could also be that too.

Jessica: It’s also Jeremy.

Kolby: There’s got to be like, what, 25-30 cats in here?

Jessica: There’s... I don’t know if there’s that many.

Jeremy: Maybe 20?

Kolby: There’s like a lot.

Jeremy:  High teens.

Kolby: High teens, yeah.

Jessica: And they are adorable. And you can come pay to come and sit and be with them and have them knock your stuff off the table or sit on your laptop, so if you’re missing a cat and you want to come and hang out, you can.

Jeremy: This is a great place.

Kolby: This is our 8th or 9th episode here. It’s really nice of them.

Jeremy: I’m glad you can count.

(laughter)

Kolby: I’m totally losing count. I don’t know how people are like, “I’ve done 200 episodes”, I’m like, “How would you even remember that, dude?”

Jeremy: Because they write it down.

(laughter)

Jessica: It’s this thing called pen and paper Kolby.

Kolby: I should totally try that. Okay, so for the people that haven’t read “The Truth about Thurman”, Jessica would you, I’m sorry, Jeremy...

Jeremy: Yes.

Kolby: I just gave Jessica a heart attack because she hadn’t prepped the summary. Jeremy has prepped to do the summary.

Jessica: Jeremy does do good summaries.

Kolby: Jeremy, tell us about “The Truth About Thurman” for the people that didn’t read it.

Jeremy: So, as our story opens, our protagonist, Captain Thurman, waits to see his commander. When he’s ushered in, they begin discussing the events that led up to the capture of two American soldiers and the ultimatums from the captors. Apparently, the Jihadist captors want the military or the government to pick which soldier will die and which will be released. During these scenes Captain Thurman also displays several odd moments of misogyny, or what’s the other term?

Kolby: Misogyny.

Jeremy: No, anyways, they discuss the characteristics of the two soldiers, presumably to find out the potential fallout of watching either of them be murdered by the jihadists. The first soldier, Whitcomb, is gay; the second, Chandler, is a Jewish woman.

Kolby: Like this is a math problem.

Jeremy: The commander suggests that either way, someone will be offended, and brings up the movie “Sophie’s Choice”, saying that no matter who they choose, the Jihadists will most likely kill them both. So the best thing they can do is ignore the request and work on a rescue. Thurman is unhappy with this response and spends the next 48 hours cleaning and repainting his apartment while watching “Sophie’s Choice” and trying not to think about the torture and violent end the two soldiers will undoubtedly face, as intelligence operators are unable to locate them for a rescue. Once finished with the remodel of his apartment and the movie, Thurman agrees that his commander is correct: “to be chosen as the insignificant one would be another torture.”

Kolby: Wow, you pulled that quote right from the story. That’s good.

Jeremy: Yeah. It’s a synopsis.

Kolby: But that’s a direct quote, that’s awesome.

Jeremy: All of this culminates with the reveal that Thurman and Chandler were engaged, and he wonders if telling the commander that she was pregnant would have made any difference. The story ends with Thurman shooting himself just as the news reports that the videos of the soldiers’ executions have been released.

Kolby: Okay.

Jessica: So, it’s a real light-hearted story. As they all are.

Kolby: Does he paint his entire house black or something, and black over all the mirrors? Velvet?

Jessica: Yeah.

Jeremy: Did he paint it? He was replacing everything with black velvet.

Jessica: I thought he replaced everything with velvet. And covering all the mirrors and stuff is a Jewish tradition when somebody passes away.

Kolby: Jeremy, you have thoughts when you read it?

Jeremy: I mean, the topic they bring up is really good. The idea that to choose; again, “Sophie’s Choice” is a great example to bring into the story.

Kolby: It’s driving me nuts; I can’t remember the name of the term it’s based on. Yeah, I’m going to look it up.

Jeremy: I was thinking toxic masculinity; that was the other term.

Jessica: Toxic masculinity, oh, the commander guy. Yeah, he was kind of...

Jeremy: Well, even Thurman; they’re both a little steeped in that. I feel like the story is pretty well written. You do get the character’s motives and their thoughts in this and it’s an interesting perspective. The question it asked is very hard to answer; how do you choose?

Kolby: Hobson’s choice.

Jessica: Hobson’s choice.

Kolby: Yeah, if you’re Wikipedia-ing something, Hobson’s choice is the thing you Wikipedia, and then “Sophie’s Choice” is the movie that’s roughly based on it. Yeah, sorry, I was going to forget.

Jeremy: No, and don’t they bring it up in “Rick and Morty”, just to choose which of the...

(laughter)

Jessica: I don’t watch it guys.

Kolby: ”Rick and Morty” is good. It’s really good.

Jessica: Whatever.

Kolby: So, I’ll tell you one of the things I really liked about this story is it’s not particularly long, it’s, you know, 5-6 pages. You can read it...

Jeremy: ... in a sitting.

Kolby: ... in not long. And it is kind of a one-trick pony. It’s the what-do-you-do-in-a-“Sophie’s Choice”-or-Hobson’s-choice scenario, but it doesn’t stretch it out into 35 pages to ask me one question. Right? And I really appreciated that, that I could be like, “Yeah, just give me the...

Jeremy: ...“here’s the scenario”

Kolby: ...”Give me the sketch and give me the choice and let me have something I can decide what I think about it.” And I did appreciate that it both did something interesting and did it in a brief way, so I didn’t feel like, ya know, I don’t need to know what color.

Jeremy: Right, and not too much character development.

Kolby: And for this kind of thing, I don’t think you necessarily need...

Jeremy: ...too much of that, yeah.

Kolby: yeah, in the same ways.

Jessica: I agree. I think the story, in a very, you know, limited amount of space, gives us a kind of scenario to mull over. It did remind me, I do want to say before moving on, that “Sophie’s Choice” is a fantastic novel that does a ton of character development, and it’s heartbreaking and it makes you cry at the end.

Kolby: I haven’t even seen the movie.

Jessica: The movie is also really good.

Kolby: I skimmed the Wikipedia.

(laughter)

Kolby: I haven’t. I’ll watch it at some point.

Jessica: I’m just saying the character development isn’t bad, and I think it works for the purposes of discussing a really interesting moral problem. It also reminded me of “Black Mirror”, just the concept of, like.... So, I have trouble watching “Black Mirror” so I don’t watch...

Kolby: You’ll have to give me a background, I don’t know “Black Mirror.”

Jessica: Oh, “Black Mirror” Is a show on Netflix.

Jeremy: It’s on Netflix.

Kolby: I thought they were all individual one-offs.

Jessica: They are.

Kolby: So, you can’t just say it reminds me of “Black Mirror” because I have to know what episode.

Jessica: Sorry. I think the reason why it reminds me of “Black Mirror” in general, and maybe this is my personal fault, is that I didn’t realize it is really set up to have you have these same kinds of “After Dinner Conversation” discussions. Right?

Jeremy: Absolutely.

Kolby: That’s exactly what “Black Mirror” is.

Jessica: But, I have such a hard time just digesting the content that I don’t watch it because it’s too horrifying and I’m a horror writer.

(laughter)

Jessica: So, I find it funny that I just realized, sitting here, “Oh, that’s exactly what “Black Mirror” is doing.”

Kolby: The other one is “Love, Death, and Robots” on HBO, also same thing. Each episode is 8-15 minutes. It’s animated.

Jessica: Really?

Kolby: And it’s really just a “here’s a one-trick pony” sort of story but, the trick is always really good.

Jessica: Okay, alright. So, Kolby, one of the things you said was you liked this story because it set up the situation and gave you something to think about, on what choice you would have made in this situation. So, did you agree or disagree with what happens at the end?

Kolby: Yeah, I took a different tack. So, when I see these sorts of situations, I am of the opinion, and this is tangential to the story a little bit, that when you pay a kidnapper...

(loud cat noises)

Kolby: Wow, holy cat fight.

Jeremy: Over the bathroom, of course.

Jessica: Again.

Kolby: Again over the bathroom with the litter box. When you pay a kidnapper, you’re telling people, “You should probably kidnap.” When you pay a pirate who steals a ship, to get your ship back, you’re telling them they should do something...

Jeremy: That it’s okay.

Kolby: If being the sort of...

Jeremy: You’re establishing a norm.

Kolby: You’re establishing it’s worth doing.

(cat meow)

Kolby: Wow.

Jessica: Again, we’re in a cat lounge, there’s cats.

Kolby: And so I think, even in this kind of case, if you’re doing it quote-unquote right, if the person says, “Look, we’ve got two people and we’re going to kill one of them”, I think you smart bomb the building that both of them are in and you’re like, “look...”

Jessica: But they couldn’t locate the building.

Kolby: Or in the case of the kidnappers, or in the case of the pirate ship, if someone’s like, “How much will you give us for the pirate ship?”, you blow up your own ship and you’re like, “Just so we’re clear, you will never make money doing this.” And then you don’t get elected president again. Let’s also be clear.

(Laughter)

Jessica: Kolby is not running for president.

Kolby: No.

Jessica: So I think that’s a very interesting approach.

Kolby: It’s a logical approach, it’s not a very useful approach.

Jeremy:  Empathetic approach.

Kolby: It’s not an empathetic approach.

Jessica: Well, yeah, it might be hard, especially in this scenario. We don’t know where the kidnappers are, or the hostages. But it is interesting when we talk about this idea of giving terrorists, whether pirates or jihadists, airtime on social media.

Kolby: You’re giving them exactly what they want.

Jessica: Is exactly what they want; so that is the pay-off. And there’s this idea that if we can squash that, if we can remove them from social media, if we can get Twitter or YouTube to do that, they are not getting the payoff and recruitment that they were hoping for. But then that also suppresses the horrors that are happening, and suppresses freedom of speech.

Kolby: So, one of the things that came up, I think it was New Zealand, they had their first mass shooter ever, and all the newspapers and the press and everyone cooperatively agreed that they would never say the name of the person.

Jeremy: Right. And you’re starting to see that more often now. That’s been one of the suggestions psychologists and everybody have been making: don’t release their name, because it promotes it to other people who would be copycats, and makes them famous.

Kolby: I feel like they want to be famous for it. I think, it doesn’t really go into it in this story, but I think that’s why the terrorists in this story do this: they don’t really care if one person or two people die, they care if they’re on the news having made the US government choose who dies. And so in that sense, I mean, I guess that they...

Jeremy: They made the right decision.

Kolby: They made the right choice. But now, do you want to be the one to make that phone call to either one of these people’s parents, families, and be like, “Hey, we made the good choice, but bad news about your whatever”?

Jeremy: And that is one of the things near the end, when Thurman is thinking about all these things, that specifically comes up: who is going to tell the families of these people that this is what happened.

Jessica: So, I want to throw some scenarios at you guys. So, would this story have been different if it was a US soldier and a soldier from a different country?

Kolby: Shouldn’t be. I would say, though, that if it was a soldier and a non-soldier...

Jessica: The soldier dies. We get that.

Kolby: Yeah, you signed up for that.

Jessica: You signed up for that.

Kolby: Yes, you get free college tuition.

Jessica: Thank you very much for your service. I’m not saying anything but... I think the soldier would also make that choice.

Kolby: Because they know what they signed up for.

Jessica: Right. Would it be any different if it was somebody with an outstanding service record versus what’s-his-face, the, who is the deserter? Bergdahl?

Kolby: Someone who defected.

Jessica: Yeah. Somebody who defected from the United States Army.

Kolby: To me, and this is why I think.... this part I didn’t care about this story... the idea that one person is gay, one person is a Jewish woman, I don’t care. You’re not worth more or less because you’re gay or a Jewish woman. I don’t care about that.

Jeremy: But I think they were doing that to begin that discussion, how do you choose? What are all the factors? And the commander says, “It doesn’t matter who you choose, somebody is going to be upset, so the only choice is to not make a choice.”

Jessica: I think it’s interesting. I think the writer did that intentionally, to kind of lead us down this idea of evaluating soldiers with, like, pro-and-con lists, and the story didn’t go that route. Which, I thought that was very good; it would make me very uncomfortable, but, not that being uncomfortable is a bad thing, I love to be uncomfortable. It could have sent us down that route of pro and con, and they just say, “It doesn’t really matter, we don’t negotiate.” But I do think about that pro-and-con list. Like, would it have made a difference had he said that she was pregnant? I don’t, I mean, I don’t know...

Kolby: That might’ve to me actually.

Jessica: Really?

Kolby: Yeah, because here’s, yea...

Jeremy: Because then an innocent civilian.

Jessica: I guess. It’s not an innocent fetus. It’s a clump of cells.

Kolby: Yeah, but, I know, but, and I understand that’s, but I just feel like it has the ability to become a person, it’s a whole thing. Like, I, yeah, I don’t know why that would’ve mattered to me.

Jessica: Huh.

Kolby: The thing that was disappointing to me, and I think it was written to be disappointing to me, it’s not that the writing was disappointing, was that the military was interested in which one was the worse story, and so their reason for non-participation wasn’t for the reasons we’re discussing. It was, “Well, it’s a gay person and it’s a Jewish woman.” But if the person hadn’t been gay and it was just a Jewish person, they’re like, “Oh well, the fallout would be less.” The only reason they did nothing was because the scales were balanced in the PR fallout. Not because of what we’re discussing, which was: a life is a life is a life. Unless you signed up for it, unless you sort of stepped forward in whatever form that you do.

Jessica: Right.

Kolby: And so that part, I think, falls into that sort of toxic masculinity. It just shows the government as the sort of blanket, inept, PR-driven entity.

Jeremy: Right. What’s the best of the worst-case scenarios?

Jessica: And I think the line that you read in your summary Jeremy about...

Kolby: To be thought less.

Jessica: To be thought less. I also found it an interesting assumption that the jihadists wouldn’t say, “Oh, they picked you? Great, I’m going to kill this person,” and then kill them, or, “Just kidding, they picked the other person,” and kill them.

Jeremy: To force the government to make a choice.

Kolby: Why would they keep their word?

Jessica: Exactly.

Kolby: Because the PR goal is for them to have made a choice. Not that we follow through on your choice.

Jessica: Right. Correct. And yeah, I think that...

Kolby: So, one thing I didn’t understand about this story, and I think the story is perfectly fine, it just didn’t make any sense to me, is the main character, Captain Thurman: why does he kill himself at the end? Like, obviously because there’s the Jewish woman...

Jeremy: ...who he’s engaged too.

Kolby: Like, I get that, but, like, people’s fiancées die all the time.

Jeremy: But he doesn’t want to see her beheading.

Kolby: Sure. Don’t watch the YouTube video; I’m totally down with that.

Jessica: But, so, I disagree.

Kolby: I just don’t know why he killed himself at the end.

Jessica: I don’t think it’s because he doesn’t want to see her beheading, not that he wants to see her beheading....

Kolby: I thought it was because he didn’t tell the person in charge that he knew her.

Jessica: So, I wonder if it’s this deeper. So, there is some hints through the story of this kind of personal struggle to have a relationship and there’s like, the relationship...

Kolby: You’re a cat whisperer.

Jessica: I have a cat that’s sitting on my story. I can’t reference it, so I’m going to have to go from memory; it has a pink bowtie on. It’s very cute. So, from memory, there’s a medal that’s...

Jeremy: Right, he has a triathlon medal.

Jessica: Oh, it’s a triathlon medal. Encased in...

Jeremy: Plexiglas.

Jessica: On the mantle, and that’s it. And the relationship with the mother is a little odd. His decision to get married seems to be based on the fact that there is a medal there and it would look good with a diamond ring. I don’t know.

Jeremy: It’s a little...

Kolby: It’s a little hard to follow.

Jessica: So, I wonder if he has just a very difficult time...

Jeremy: Interacting with people in general.

Jessica: Interacting with people in general, and then he finally finds somebody that he is in love with, and then he can’t act in a way to save her, and he withheld information, and living with that is too hard. This idea that, “I was involved in the decision and I did nothing and she dies, and therefore, it’s...”

Jeremy: It’s his fault.

Jessica: It’s his fault.

Jeremy: I can see that.

Kolby: I also think that had he been doing this right, and maybe he even had an obligation to, I don’t know anything about the military, he should have told somebody. He shouldn’t have even been in those rooms, right?

Jessica: Yeah.

Kolby: He should’ve been like, “hey, by the way, I’m engaged to this person.” They’d be like, “We understand. We’re going to show you to the other room. We have somebody who’s going to fill your place for you. This is not your problem anymore.”

Jessica: Yeah.

Kolby: But, it doesn’t serve the story to do that.

Jeremy: Correct.

Jessica: What I would say is that, oh how to put this...

Kolby: You’re going to try to not get yourself in trouble?

Jessica: Yep. I think...

Kolby: Don’t be a disappointment to your daughter. Don’t be a disappointment to your daughter.

Jessica: I think a lot of white dudes think that they can act in a way that is unbiased in...

Jeremy: In those situations.

Jessica: In those situations.

Kolby: Because that’s the manly thing to do.

Jessica: Right. Going back to that idea that we were talking about with, like, toxic masculinity. He says some things that come from a place of toxic masculinity, and, to me, he’s saying, “I can.” And he acts without bias for the most part.

Kolby: Until he locks himself in the room by himself. But in public, he’s...

Jeremy: ...he’s compartmentalizing and...

Jessica: ...And he thinks, and to go back to that idea of toxic masculinity, it is a societal problem that we do not allow men to say, like, “Hey, this is going to personally affect me, I’m going to recuse myself.” And we don’t let men do that. If women do it, and we do allow women to do that, we absolutely do, we also think that’s very weak. Right, we’re like, “Geez.”

Kolby: Man up.

Jessica: Uggh. Exactly. Man Up. Man Up.

Kolby: Grow a pair.

Jessica: Right. And so perhaps with the, maybe what we can say is that, the end of the story is perhaps a bit of a statement of the effect of toxic masculinity. This idea that you must be brave, you must be unbiased, and at the end it will kill you. Good luck white dude.

(laughter)

Kolby: I don’t know where I saw this, it might’ve been a “Simpsons” episode, I don’t know, but somebody is having a conversation with a crusty old guy, and he said, “I don’t know what to do.” And he’s like, “You swallow that down, and you swallow it and you swallow it and you swallow it.” And he’s like, “Really, grandpa?” And he’s like, “Yeah, and then you get cancer in your stomach and you die.”

(Laughter)

Jessica: That is absolutely from “The Simpsons.” I totally remember that episode.

Kolby: But that’s, I think that is part of that culture of toxic masculinity of like, “I’m going to internalize and internalize and internalize because I have some societal obligation to take that on silently. And be the cowboy in the west or whatever.” As opposed to just being like, “Man, I had a hard day” or “I saw a Hallmark commercial and it made me cry.”

Jeremy: God, that Sarah McLachlan song.

Kolby: We went to the Musical Instrument Museum last night, and I’m watching this thing on the thing, and I’m like, “I’m going to cry”

Jessica: Kolby cried at the...

Kolby: I totally cried at the Musical Instrument Museum.

Jessica: Well, there was a really great exhibit about the kids, I think it was Paraguay, where kids were making instruments from trash.

Kolby: Literally going through a trash heap, and banging out and like getting....

Jessica: Beautiful violin and that cello, that was...

Kolby: They played better than you in a lot of ways.

Jessica: I mean, bar low.

(laughter)

Kolby: Out of a gas can by the way.

Jessica: And then I cried during Malory wedding ceremony, which I’ve seen that video literally a thousand times on YouTube, and still cry.

Jeremy: But you still cried.

Jessica: But very socially acceptable for me to boohoo through a museum...

Kolby:  ..but not for me.

Jessica: ...but not Kolby.

Kolby: Yeah, I don’t mind crying actually. I feel like that’s how you know you’re alive. Like, if you’ve had a good cry, that’s how you’re like, “You know what? I still am able to feel things that make me cry.”

Jessica: I wish that I was more okay with crying. I’m not. But I cry all the freaking time. All the time. Hallmark commercials, trailers, to really terrible movies, I cry all the time. I cry all the time. My daughter loves to stare at me while I’m crying.

Kolby: My wife does the same thing. She’s like, ‘Are you crying?’ And I’m like, “Of course I am. I cry at everything, shush up.”

Jeremy: You should watch “Grave of the Fireflies.”

Jessica: (gasp) No, you shouldn’t. It was the like first movie Jeremy recommended to me when we first met and I never will forgive him for it.

(laughter)

Kolby: Alright, anything else about this story you want to discuss?

Jeremy: One of your questions...

Kolby: What are the questions? Was there one you liked that you wanted to discuss a little bit?

Jeremy: One to discuss, because it reminded me of something you said.

Kolby: Hobson’s choice, I did find it.

Jeremy: Hobson’s choice... something about, should there be criteria, should there be a pecking order of who should get saved first?

Kolby: For me, it’s just you’re in the military or you’re not in the military.  

Jessica: Okay, but what about Titanic?

Jeremy: But what this reminded me of, it references “Radiolab” or “This American Life”. There’s one of those about the hospital in New Orleans during Katrina, where, because of the fallout at that hospital, the medical community has developed a process to choose who gets saved.

Kolby: Who gets taken out of the hospital in the event there’s a limited amount of time to do it?

Jeremy: Yes.

Kolby: I’m really glad it’s medical personnel who make that choice, but it has to be made.

Jeremy: Right.

Kolby: So, here's the other one that I think has come up as well: the concern about self-driving cars. That if they're given a choice, you're now relinquishing that choice. Like, a kid runs into the street, a person on the side of the street is riding a bicycle, and the car is now going to make this choice of which is more valuable, or which has the higher percentage of safety, or whatever. Do we want to remove these Hobson's choices from our driving process? I'm going to take one quick tangent because I totally forgot about the questions. Are there times when you think, if it's not a life-or-death Hobson's choice, just the sort of you-get-one-or-you-get-neither, where it is okay to say I'll take one? Like, you can have pie or cake, but you can't have both?

Jessica: Ok, so...

Kolby: I won’t take it away from the life or death scenario.

Jessica: Oh. I mean, that’s easy.

Kolby: Yes, the answer is yes?

Jessica: The answer's cake.

Kolby: Okay. Alright

 (laughter)

Jessica: I mean, unless it’s a crème pie, because we can have a different discussion. But, yes.

Jeremy: That’s our next podcast.

(laughter)

Jessica: Pie or cake?

Kolby: Pie or cake! Pie or death? Cake or death?

Jessica: Cake or death.

(laughter)

Jessica: Uh... Cake?

(laughter)

Kolby: I'm going to ask one other question for you then: what about if you're the general guy, the guy making those decisions, and one of the people that's captured is a family member?

Jessica: Don't you recuse yourself?

Jeremy: You absolutely should.

Jessica: OK, well, if you’re asking me, as like....

Kolby: Do you still take the moral high ground or do you say like...

Jessica: I will give you a scenario that is similar, but maybe not exact.

Kolby: Okay.

Jessica: San Diego is where I’m from, well, not where I’m from-from, but it’s where I live, and we get riptides all the time. So my husband was with me, and his best friend was visiting from Kentucky and we were swimming and we were caught in a riptide. Yep. So, I had the choice...

Kolby: That’s something from a movie.

Jessica: Yea, riptides are actually very common and they’re super easy to get out of.

Kolby: You just sort of swim sideways, just don’t panic.

Jessica: You swim parallel to the beach until you’re out of the riptide and then you swim in. You don’t panic, Alex.

(Laughter)

Kolby: Did Alex panic and try to swim against the riptide?

Jessica: Well, I don’t think either of them understood that they were in a riptide. I think that if you’ve never been in a riptide before you don’t necessarily. But, I hadn’t either but I am a swimmer, so I was confident in the water. And so I had this choice to stay with them and try to help them get to the beach, or get to the beach myself. And I knew, like I was close enough to it and I did tell them they were in a riptide.

Kolby: That they should swim sideways.

Jessica: I did. But I don’t think they quite understood, or maybe believed, I don’t know. And so, I had to make this choice of whether I tried to stay with them, and I 100,000% picked myself.

(laughter)

Jessica: I swam parallel to the beach and swam in. I passed a lifeguard on the way in.

Kolby: “By the way, my husband’s out there dying. I saved myself so that I could come here and tell you my husband’s out there dying”

Jeremy: "I told them, but their toxic masculinity is keeping them out there."

Kolby: "They don't like to listen to me because I'm a woman. I tried to get the guy next to me, who was a man, to tell them, but he wouldn't do it. I thought they might listen to him."

(laughter)

Jessica: But it was one of those moments. I got to the beach, and it was interesting, because I think before that situation, I always thought that I...

Jeremy: ...I would help.

Jessica: I'm very much, I love other human beings, I'm very into other humans, I think we're all so amazing, and I thought for sure I would try to save that person, and I was like, "Nope, me first. I will save myself." And I knew there were extenuating circumstances: I knew there was a lifeguard on duty, I knew she could come out and save them, and by the way, she did not have to pull them in.

Kolby: Did you think to yourself, “90% chance they’ll figure this out, I’m going to go swim for shore”?

Jessica: I don’t know that I thought even 90%, I thought maybe...

Kolby:  70%?

Jessica: 70%.

(laughter)

Kolby: Oh my god.

Jeremy: The odds are still good.

Jessica: Odds are still good.

Kolby: The odds are still in your favor.

Jessica: Yeah, so I will say, I mean that I made that choice in real life and it was not the choice I thought I would make.

Kolby: Well, there ya go. You are listening to After Dinner Conversation with myself, with Jeremy, with Jessica. We’ve had a great time talking about our Hobson’s choice in this story, “The Truth About Thurman”. I think we’ve decided that Jessica is willing to let her family die.

Jessica: Not my family, just my husband, guys.

Kolby: Just her husband.

Jessica: I love you Alex.

(laughter)

Jessica: I swear, I would not let you die.

Kolby: And that, because Jeremy has, one, served in the military, we'd let him die first.

Jessica: Yeah.

Kolby: Because you signed up for life, right? Once you've signed up your life, you're the first one to go?

Jessica: I didn't even think about that, but absolutely, yeah. Letting him die first.

Kolby: And we're here in La Gattara with cats, having a great time. We heard a couple of them having a disagreement over something. There's probably a Hobson's choice about the litter box: you either get one litter box or you get no litter box. And if you enjoyed this as much as we enjoy doing them, please "like" and "subscribe". We have a heck of a great time doing it. We're glad that you're watching, and that allows us to continue doing it. Next week, we have another story. You have the email with the story list?

Jeremy:  The “Alpha Dye Shirt Factory”

Kolby: Oh, the "Alpha Dye Shirt Factory." This is a... I can't tell if it's a comedy or a tragedy, we'll see, but the story is a fire breaks out, okay, it's a tragedy, at a garment factory, and one worker has to make a life-or-death choice. So, join us next week. And we'll have that discussion and we'll probably pick on each other a little bit more and pet some cats and have a great ol' time. Thank you for joining us. Bye.

ethics podcast, philosophy podcast Kolby Granville

E8. "Lay On" - Can a person be evil in their mind, or must it be in the act?

STORY SUMMARY: Three outcast witches are sent to San Francisco in the 1960s to redeem themselves by causing corruption. They find a drug-addicted street musician and his girlfriend and promise him riches and success if he kills and does as they advise. He does, and becomes a music sensation. Eventually, his girlfriend leaves him. He is finally brought down after his conversion is complete. The witches decide to stay and enjoy Woodstock.

DISCUSSION: This is a variation on Macbeth. The witches never seem to do anything; they simply encourage him to do things he wanted to do anyway. He is so quick to turn. But maybe it's not hard to find a degenerate drug-addict musician? Was he always evil, and now he just had the chance to act on it? It seems he has a hole that he can't fill regardless of how famous or rich he becomes. Really, he needs to focus on liking himself.

BOOK LINK: Download the accompanying short story here.

MAGAZINE: Sign up for our monthly magazine and receive short stories that ask ethical and philosophical questions.

SUPPORT: Support us on Patreon.

FOLLOW: Twitter, Instagram, Facebook

“Can a person be evil in their mind, or must it be in the act?”

Kolby, Jeremy, and Jessica discuss the ethics in the fantasy short story, “Lay On” available for download on Amazon by Vera Burris.

Transcript (By: Transcriptions Fast)

Lay On

Kolby: Hi, and welcome back again to After Dinner Conversation, short stories for long discussions. Man, I’m totally going blank. Okay.

Jeremy: I’m your host…

Kolby: I'm your co-host Kolby...

Jeremy: I'm your co-host Jeremy.

Kolby: And my co-host Jessica.

Jessica: Jessica.

Kolby: Who’s now a veteran, this is her 4th episode.

Jessica: My 4th…

Kolby: 4th episode.

Jessica: 4th and final.

Kolby: No…

(laughter)

Kolby: No, we’ve got you for a while. After Dinner Conversation is a series of short stories and podcasts to have discussions about all these stories to encourage…

(cat meow)

Kolby: That’s a lot of cat going on over there.

(laughter)

Kolby:  ...to encourage deeper discussions about ethics and morality and the things that matter so you’re not just spewing out the same 3 lines you saw on some TV show.

Jessica: Or on Facebook.

Kolby: Or on Facebook, for sure.

Jessica: Where are we at?

Kolby: We are at La Gattara, a cat café, which is why we have cats going everywhere, freaking out.

Jessica: It’s a cat lounge.

Kolby: It is a cat lounge, although they don’t serve alcohol, but they do have cats. And they’re all available for adoption or you can pay $10 and come and just pet them and then you don’t have to have all of your stuff be pushed off of countertops at home. So, it’s like all of the benefits of cat with no cat. Also, you can download these stories…

Jeremy: The cats.

Kolby: Not the cats.

(laughter)

Kolby: Man, you’re really energetic for an early morning.

Jessica: I know.

Kolby: You can download all these stories on Amazon as well as…

Jeremy: Wherever you download e-books.

Kolby: … wherever you download e-books, podcasts, YouTube. And by all means, if you’re enjoying watching these, “like” and “subscribe”, it makes us feel good, it helps us do this more. Okay. So, our story for today is “Lay On.” Jessica, you’ve got the introduction for this one, I guess?

Jeremy: And who wrote it?

Jessica: "Lay On" was written by Vera Burris. Vera Burris, sorry. Vera Burris. And this story sets us in 1969. Our two main characters are Christopher, a musician, and his strung-out girlfriend Polly. And, I think they're on the street, and three women approach with a lot of money and give it to Christopher and tell him he will have fortune and will be the king of music festivals and the envy of all those who wronged him, and he asks, "Who do I have to kill first?" And they say, "You'll know." And then they go on, and Polly arranges an audition for Christopher at a local bar. When they go, the guy who runs the bar basically says, "You can play, but only if the other performer is a no-show or can't go on." And then Christopher decides that that must be the person he needs to kill, and so he pushes that person in front of, I think, a bus?

Kolby: Something, something big that kills him immediately.

Jessica: Kills him immediately. Yeah. So, he gets to play and when he plays everybody loves him and he continues to get gigs and continues to do really, really well. During that, all the witches kind of look in and follow his progress and you learn that the witches were kicked out of their coven, I guess.

Kolby: Witch’dom.

Jessica: Witch'dom. And they're trying to earn a place back in, and they're hoping Christopher will provide them the way of earning their place back in by being, I think, by being evil. We're really not sure what the criteria for getting back in is, but just like, making him do awful things. So, they continue to watch him and say things like, "What does a good man with power do?" And other frightening lines like that. And then near the end, he's a big deal, I think, help me out guys, what happens with Polly?

Kolby: She leaves him, I think, because she’s an addict or something.

Jessica: Right. And he won’t continue to support her because…

Jeremy: Because he’s got all these other girls on the side.

Kolby: That’s right. He’s hooking up with a... he’s got groupies now.

Jessica: He’s got groupies now. And at end, the witches come back and ask him if he’s regretful…

Kolby: He gets put in jail.

Jessica: Yes, he gets put in jail and Polly is not. And…

Jeremy: She calls the cops on him.

Jessica: She calls the cops on him.

Kolby: I feel like Jeremy is the only one who’s done the reading this week.

(laughter)

Jessica: Hey! Hey! I did the reading.

Kolby: Uh-huh.

Jessica: I did the reading.

Kolby: Uh-huh.

Jessica: It’s more important about the witches. And then the witches, the queen of the witches, Hecate, which we love from Lady Macbeth. I mean Macbeth, you can call it Macbeth, I call it Lady Macbeth. And Hecate comes back and lets the witches back in the coven because they have earned their evil place.

Kolby: But then in the end they’re like, “We’re going to hang out and go to Woodstock or something.”

Jessica: Yes, they do. They go hang out and go to Woodstock.

Kolby: Instead.

Jessica: That’s true.

Kolby: I mean, who wouldn’t if you could go to Woodstock?

Jessica: I mean…

Kolby: I mean not the 1999 Woodstock.

Jeremy: The 1969 Woodstock.

Kolby: The 1969 Woodstock, yeah. Jessica, you have thoughts? Things you liked, didn’t like? Enjoyed, didn’t enjoy?

Jessica: I very much enjoy a witch story, so anything with witches I'm going to read. I love a good witch story. I do wonder... my feeling with the, just to take the witches' perspective for a moment there: so the witches get kicked out of this coven and they've got to find somebody to manipulate to, I guess, cause mayhem and bedlam...

Kolby: Undisclosed bedlam.

Jessica: Undisclosed bedlam, yes. And I think, following their narrative, they find this guy and they're trying to influence him in order to get back into this coven, but then they hang out because they're having so much fun in 1969. I don't know. I feel like it was an awfully easy win.

Kolby: For the witches?

Jessica: Yeah! It’s like…

Kolby: It’s not hard to find a wrongdoer.

Jessica: It’s not hard to find a musician wrongdoer, let’s be clear.

Jeremy: Especially since they bring up the Manson murders at the same time, which presumably are a much bigger evil than whatever Christopher did. And Hecate is like, "You guys did this? He did this on his own. Nobody had to help him, so why is this special?" But she still lets them in. And I guess that's a millennial participation prize.

Kolby: Did you just say Charles Manson got a participation prize?

Jeremy: No, the witches did.

Kolby: Oh, oh

Jessica: Christopher did.

Kolby: Alright.

Jeremy: The witches in the story.

Jessica: The witches in the story got a participation ribbon for Christopher, the effort of Christopher. So, I think…

Kolby: It’s simple for you.

Jessica: I don’t know if it was simple. I would love to know what you guys think about this idea of what makes evil win? If evil, presuming that the witches are a stand-in in the story for evil…

Kolby: Which is a little bit anti-witch.

Jeremy: It is. From the research that I did…

(laughter)

Jeremy: From my knowledge…

Kolby: We should turn this into a drinking game every time you say “From the research that I did.”

Jessica: Yeah.

Jeremy: That's not how Wiccans act. But it is a modern look at witchcraft from the recent Wiccan movement, which started in the '60s. So, you know,

Jessica: I think this is definitely like the witch.

Jeremy: The Christian view.

Kolby: It’s like a stand-in of evil.

Jeremy: Right.

Jessica: It is like, the Macbeth view of…

Jeremy: Of witches.

Jessica: Of witches.

Kolby: Actually, I thought it was a Joker stand-in from the Heath Ledger Joker. Like, “I just do chaos. That’s my job.”  “Why?”   “I don’t know, it’s just chaos, this is what I do. ”

Jessica: Oh interesting. So, you don’t see the witches as evil but witches as chaos?

Kolby: Yeah. I did.

Jessica: Oh, that’s interesting.

Kolby: Obviously, a lot of their dialogue talks about their desire to tempt the worst in us. But it seems like generally they just wanted bad things, just craziness afoot. There's a reason they're staying at Woodstock.

Jessica: Okay.

Jeremy: Is to promote heathenism…

Kolby: To promote heathenism, yeah. But certainly...

Jeremy: Or hedonism.

Jessica: I was going to say, I can see that, that's an interesting take, Kolby. I didn't think about that. I did assume it was just evil, but the idea of it just being chaos or hedonism, or whatever, to tempt the worst in us, is an interesting...

Kolby: What’s the thing in movies where they have a sacred object that everyone has to get?

Jeremy: The MacGuffin.

Kolby: MacGuffin. Yeah, I feel like the witches are a little bit like a MacGuffin in like, "We just need evil," "Okay, we'll make witches."

Jeremy:  They’re just a metaphor.

Kolby: But it could have been an evil vase, it could have been a magic 8 ball, it could’ve been… they didn’t act in a way that sort of was dynamically interactive.

Jeremy: No, they’re just providing access..

Kolby: Yeah, they're providing the catalyst to tempt this person, to see what this person is really like.

Jeremy: Right. “Here’s cash and you’re going to know what to do to make the best of the situation.”

Kolby: And I think that is a little bit different than Macbeth, in that in Macbeth the witches are much more; they are the cause and the effect, depending on how you look at it. They tell you your future, but the fact that they told you makes you now think that future is possible, and if they never would have told you, would you have done it?

Jeremy: And that’s what they do to Christopher. They tell him, “You can have fame and fortune, and you’re going to know what to do to get there.”

Jessica: Right.

Kolby: So, one of the things I did wonder about in this story was, did they do anything or were they only encouragers? They said, “You’re going to have to kill a person.”

Jessica: No, they didn’t say that.

Jeremy: They didn’t.

Jessica: He said, “Who do I have to kill?” And they said, “You’ll know.” And I think that’s an intriguing part of it to echo off what you’re saying Kolby. I’m not sure they did anything, there’s no…

Jeremy: They were just enablers, they were “let me hold your purse”.

Jessica: Let me hold your purse, yeah!

Kolby: But the whole pushing-the-person thing may have worked out the same way with or without them, right? He'd have pushed that person in front of the vehicle, witches or no witches; it might have gotten him the gig, right?

Jessica: And what I find interesting, I don't know if anybody else read it this way, so you're reading the story and the witches say, or he says, "Who do I have to kill?", and the witches say, "You'll know," and they walk into this bar and the guy's going to be on stage, the first performer, I can't remember his name, and he's like, "Maybe I have to kill him." It's so quick.

Jeremy: It is super quick.

Jessica: It’s like, “Maybe I have to kill him?” And I was like, so, you know this idea of tempting the worst. Like I would like, I don’t know, a thousand signs before I push somebody in front of a bus. Not that I would push anybody.

Kolby: You would need like a big neon arrow being like, “This guy”

Jessica: “This is the way to your future”

Kolby: “In case you’re not sure”

Jeremy: You see the money sticking out of his pants.

Jessica: Yeah.

(laughter)

Jeremy: Yes, here’s your thousand signs.

(laughter)

Jessica: Right, right. So, pushing him in front of the bus came really quick but then I thought also, that might just be Christopher. Christopher is like, “Okay? This is it? Okay, I’ll do this.” And so, I would love to see a story that plays with the, if anybody wants to write for submissions…

Kolby: Which we are always accepting.

Jessica: … that wants to write a story, I would love to see something that tempts somebody in the same way, but then…

Jeremy: It’s not an easy tempt.

Jessica: I would not say it's an easy tempt. It's just, what if Christopher had pushed that person in front of the bus and then it had not paid out? So, then they're like, "Oh, okay..."

Kolby: So you want to make them fake witches?

Jessica: No, I mean….

Kolby: “Oh, we didn’t mean that guy, we meant the guy to his left. Oh, you’re going to have to push another guy.”

Jessica: Right. "You're going to have to push another guy." Or another guy, or another guy. In this story it's clear that this is the string of chaos that they're looking for. And it works out for Christopher, but what if it's not working out? What if it's this tempt that doesn't work out and then you just keep trying? Do you keep trying?

Kolby: It’s like a tragic comedy.

(laughter)

Kolby: Like, “How many people do I need to kill? All of them.”

Jeremy: Yeah, isn't that a Coen Brothers' movie? A Simple Plan?

Kolby: Yeah, that sounds like a Coen Brothers' movie. So, one of the things I found really interesting about this story, for me, was the idea of: is Christopher bad in that he's done bad things, or is he bad and now he's been given opportunity? The three of us are sitting here, are we secretly evil, we just haven't gotten...

Jessica: Let’s be clear- there’s no secret. I am evil.

(laughter)

Kolby: Okay, in your case. Were you evil before you did anything? Because I feel like, is evil in the action or is it in the mind? And even if the mind never acts on it, does it still make you evil? Because there are certainly times where I'm driving a car and somebody cuts me off, and I'm like, "Oh, if I could get in that car with a baseball bat, I would take care of them right now."

(Laughter)

Kolby: But luckily, they drive away before I can pull up alongside them with a baseball bat.

Jessica: You need a podcast for when you drive. Clearly.

Kolby: I need something. And so, I do wonder if, you know that saying- “if you want to know what somebody’s made of don’t give them adversity, give them power” – because….

Jeremy: Right. Absolutely.

Kolby: Because there’s a reason that kings and dictators become the worst people often because they have the ability to be the worst people.

Jessica: That’s an interesting question. In particular in the story with Christopher, he’s tempted and he immediately throws somebody in front of the bus. But, then when he gets power, he ditches Polly who was there to help him. So, there’s not even the redeeming quality there where he’s at least loyal.

Kolby: No, no, no. I think once you’ve gone evil, you’re… I think he’s entirely self-centered. And I don’t even think it’s about the music, it’s about...

Jessica: The power.

Kolby: It's about being loved. Being loved, adored, being rich, being... and so you look, and I think this is relevant in the situation we've been in the last couple of years: if you look at R. Kelly, if you look at Bill Cosby, if you look at all these people who are...

Jeremy: Empowered and entitled.

Kolby: Yeah, they are empowered and entitled and they are rich and there are people who sort of fix the things that….

Jeremy: That go wrong.

Kolby: … that go wrong and it happens again and again. And over a period of time you think…

Jeremy: “I can do anything.”

Kolby: Yeah. “I can do…

Jeremy: “I can kill somebody in the street and…”

Jessica: “… and nobody would care.”

Kolby: Wow. Where’d you get that quote from I wonder?

Jeremy: I don’t know.

Kolby: Yeah, but that's exactly it, right? And so the question is, when you get that ability... I think there are more Christophers than we would like to think. And the only thing that keeps people from doing these things, and not in all cases but in many cases, is just: I'm not a bajillionaire, I'm not worth a million dollars, I'm not...

Jeremy: Or there's a process through growing up where you develop a strong moral character. And even though people can be tempted, and can do wrong doing things that they think are right, if you don't have that strong moral character, or that hasn't been developed in you, I think we all fall back on selfishness.

Jessica: Okay, so walk me back just a little bit with that statement. So how does one develop a strong moral character?

Kolby: I’m going to follow up with something, how do you know you have one until you’re given power instead of adversity?

Jeremy: That’s true. I don’t know.

(laughter)

Jeremy: Personally.

Kolby: As a person that doesn’t have tons of power, yeah.

Jeremy: I think in a previous episode you talked about your daughter, about showing her, or that we should teach, the consequences of what people are doing, the good and the bad of everybody, and you realize...

Jessica: Yes, when we talk about history, don't just tell a hero's story; tell the whole story of that hero so they understand that everybody is a complicated mess.

Kolby: Know that even good people are complicated.

Jeremy: There are consequences to all the actions. Everything you do. So, understanding that and there’s good and bad in everything, I think helps develop the ability to choose a moral path.

Jessica: So, okay, so, I’m following you. But, one of the things I would have to say, especially with our heroes, there are lots of times when there are no consequences for all the bad they did.

Kolby: Particularly historically.

Jessica: Especially historically, especially during what we would now consider bad. Slavery, that wasn't necessarily considered bad back then, at least by the people that were doing it. And so there weren't consequences, so how do you teach consequences? How do you teach that moral barometer when... I mean, there are lots of people now that are doing horrible things that have no consequences. R. Kelly still gets record contracts. Bill Cosby, is he in jail now?

Jeremy: Yeah.

Kolby: Probably still selling records.

Jessica: Probably still selling records. Somebody once said, they didn't know me well, that I had a very strong moral barometer. But they asked, how did you learn that? And they assumed I was raised Catholic. I am not Catholic, but I was raised Catholic, and they wondered how my daughter would be raised to have a strong moral barometer without being raised with religion. And when we talk about...

Kolby: We were texting about this last night. This drives me nuts.

Jessica: I agree.

Kolby: This idea that people think that religion provides morals, and without religion you can't have morals. And I would argue just the opposite, and this is a little bit of a tangent. I would argue just the opposite: that if I'm not pulling the lever because I'm afraid I'm going to get shocked, that's not a moral, that's a fear. So, if I follow Christianity, or whatever faith I believe in, because I'm worried about someone watching and judging and scoring me, those aren't morals, those are fears. Those are fears of retribution. A moral is, to me, when you're like, "Look, nobody is ever going to know. It's the perfect crime. No one will ever know. But no..."

Jeremy: But it’s still wrong.

Kolby: “I’ll know.” It’s wrong to me, so it doesn’t matter at all. To me that’s morality. And so, I think Christians can be moral, but if their reason for being moral is just…

Jeremy: The fear of punishment.

Kolby: ... is the fear of punishment. That's not morality. That's not morality. Which is why it drives me nuts when people say non-Christians or non-religious people can't be moral. I'm like, no, they can only be moral, because they don't believe there is a punishment waiting for them except their own self-identity and their disappointment in themselves and their choices.

Jessica: I totally agree. That’s a nice rant. Good job.

 Kolby: Thank you. I’ve been storing that one up for a while.

(laughter)

Jeremy: Just for this story.

Kolby: Just for... that's part of the reason for these podcasts, right? It's the idea that you can have these conversations about the ethics and morals of these characters, and the things they're doing, and the choices they're making. And I think the way sometimes you develop those morals is, one, you just inherit them from your parents until you can judge them, but also, you just live life and you're like, "Wow, I was treated that way and I don't like it."

Jeremy: And I treated other people this way and now I have to deal with the consequences of that.

Kolby: Right. And now that I'm seeing it for the third time, I'm like, "Look, I'm not doing this because it is or isn't allowed, I'm doing it because it's just not who I choose to be, because I know what it's like to be the kid that's taunted on the playground or..."

Jessica: Or the kid that's excluded.

Kolby: Or the kid that’s fat shamed or teased about being LGBTQ or whatever the case. And I think the interesting thing about this story, see the way I tied it all together, is if you take someone who’s in their, I assume the musician’s late teens/early 20s, who’s living on the street, who’s drug addicted or at least has a drug addicted girlfriend, and you say “Here is everything.” I am terribly disappointed that he just goes in this spiral of evil, but I am not the least bit surprised. And I think that he is the norm and not the exception.

Jessica: Really? I think I disagree. I'm a humanist at heart. I know. So evil and still so humanist.

(laughter)

Jessica: I think it's interesting, I think Christopher's spiral can be indicative of some people, and especially, we could talk forever about Christopher's decision and why he made the choices and how those choices are not necessarily choices. You're living on the street, everything is horrible, and somebody says, "You can have all the money you want if you just do this one thing..." It's really hard to say no to that when life is so terrible.

Kolby: Particularly if nobody’s ever… if you’ve never had the experiences and sort of mentors explain to you why this is the wrong choice.

Jessica: I don't know, I mean, I think society explains to you why it's not the right choice. It's not like Christopher pushed that kid in front of the bus and was like, "Oh, I'm sure everybody will be okay with this."

Kolby: That’s true. I mean, laws are the codification of the community morality.

Jessica: And I think we can extend this metaphor into larger… like, this is why crime is committed in terrible neighborhoods because there is no way out unless we do this thing. What surprises me about Christopher’s character and what I would probably argue is that, there was no shame spiral, there was no…

Kolby: He doesn’t die realizing like, oh…

Jeremy: Right. We don’t really get Christopher’s perspective.

Jessica: That’s true.

Kolby: We get the witches perspective.

Jessica: And what I will say is, I think it would be… I think more people are apt to push the person in front of the bus to get out of a terrible situation and then feel shame and regret and not just fear but like, just terrible sadness, but also try to receive redemption in other ways. And I think that’s why the witches get back into the coven is that Christopher didn’t go that route. He didn’t have the “Okay, I’m now in a place of power where I have solved my issues and I can be a better person.”

Kolby: “I can lay off the horrible because now I can eat.”

Jessica: Right.  “Now I can eat. Now my Maslow’s pyramid of hierarchy of being an okay human being is taken care of, so now I can work on self enlightenment.”

Kolby: So, do you put these into two tiers of issue?  Like, one is the “I need, and so I have to steal or do whatever”, versus “I live without fear and so I do as I choose.”

Jessica: Right. Right.

Jeremy: I would agree with that.

Jessica: Yeah, I think that’s true. And we see a little bit of Christopher’s character in the very beginning where he’s trying to get enough money for Polly to get a fix. This isn’t just him feeding himself; this is him trying to take care of someone else.

Kolby: You ready to have your mind blown? I’m about to blow your mind.

Jessica: Blow my mind.

(laughter)

Kolby: What if...

Jessica: What if.

Kolby: ...they’re actually the same in that in the beginning for Christopher, in the beginning the issues are I’m hungry, I’m whatever, I’m whatever, I want people to like me but the real issue is I don’t like myself. I don’t feel loved. I have a hole in me I can’t fill. Once that is in him, whether you’ve got a little bit of money or a lot of money or a couple of groupies or a lot of groupies or a couple people to sleep with you or a hundred of people to sleep with you, that hole never gets filled.  That’s the sort of Maslow’s pyramid where, yes, you’ve got food and shelter, but what you don’t have is a sense of love.

Jeremy: Emotional support.

Kolby: Emotional support and love and all those things. You want to talk about a list of people that are not ever going to get married? Rockstars, because that hole is still not filled in the way it can be, and so he’s continuing to try and meet his sort of pyramid of needs, but he doesn’t understand that money and groupies are not the thing that fills the last spot in the pyramid. Boom.

Jessica: It did not blow my mind.

Kolby: It did not blow your mind?

Jessica: Sorry.

Jeremy: But again, does that come down to choice? That comes down to a choice where what would fill the hole is enlightenment and...

Kolby: Yoga.

Jessica: Right. Sting.

Jeremy: ...  becoming a good person at that point.

(laughter)

Kolby: So, yes, so here’s one I’m always amazed by, and I’m broad brush stroking this, right? So, Harrison Ford is a carpenter and gets picked up for a couple of movies and Star Wars and Indiana Jones and all these things. Then at one point, in like 1989 or something, he had starred in like 7 of the 10 highest-grossing movies. And what does he do? He marries Calista Flockhart, or whatever her name is, and goes to a ranch in Wyoming where he just lives. Right? And that’s it. And he does a few movies once in a while that are pretty mediocre, but he’s full. That to me is... that’s our person that could’ve been Christopher, because he’s literally Han Solo. That guy is not hurting for lady friends or money or jobs or anything. And he’s like, “No, I’m perfectly happy learning to fly a Cessna, living in Wyoming, and doing cowboy stuff.” That, to me, is what you wish Christopher were. A person who maybe doesn’t have to remove themselves from society, but who understands, “I am enough without the fact that I’m Han Solo or I’m whatever.”

Jessica: Yes. I think you’re correct in that Christopher has a need greater than Maslow’s pyramid and that is what makes him continue to be such a jerk for the rest of the story.

Kolby: Right. He never has enough. He never has enough, where he’s like, “Maybe I should go to a cancer shelter dressed up as Iron Man or whatever.”

Jeremy: And is this why they choose him?

Jessica: Oh, yes! Okay.

Kolby: That’s an interesting question.

Jessica: Oh, now I get it. Yes. And that’s why he gets back, why they get back in: they can pick that out.

Jeremy: And enable the...

Jessica: The, yes.

Kolby: ... the worst in a person who’s unable to fill that worst.

Jessica: Yes. Or, you know prevent them from finding the people that would fill that.

Kolby: Yeah. I’m going to wrap this up after one last question. I’m going to give it to you, Jeremy, since you’ve been a little bit quieter on this one. Do you think the witches are evil? Or do you think the person that sort of hands you the machete is evil, or is it just a machete, and you’re the one that chooses to make it a tool?

Jeremy: Morally ambiguous, for sure. It depends, again, on what their goals are. Is it really to foster evil? Then yes, I would say they are evil in that goal. It isn’t just a machete; it’s “let me hold your purse, here’s a machete.”

Kolby: Yeah, the reason I bring it up is that, particularly in the climate we have right now, there are some that would argue that words are the equivalent of handing someone a machete. By telling someone it’s okay to hate, it’s okay to dislike, it’s okay to be xenophobic, you’re metaphorically putting that machete in many people’s hands. And some of them may never choose to swing that machete, but you are at the very least giving them the green light to have the machete.

Jeremy: Exactly. You know, who do I have to kill? You’ll know.

Kolby: You’ll know. Who do I have to hate? You’ll know.

Jeremy: And it’s really in that statement they’re enabling him to be his worst.

Jessica: And I would argue that the witches are maybe just a little bit different than the idea that words are the machete in that, I think, if...

Kolby: They’re a little more encouraging.

Jessica: Well, what I would say is the witches are enabling one person. I think if we wanted to see a truer, at least a truer metaphor, it would be if the witches were making it normal that we push people in front of buses. That is the words in our society. Being able to say, “Yes, these words are terrible, but also we’re not going to judge you if you use them. We’re not going to judge you if you push people in front of buses. Go for it.” That is where the metaphor becomes more parallel.

Kolby: Yeah, that’s fair. You have been listening to After Dinner Conversation with myself, Kolby, and Jeremy and Jessica. We were talking about “Lay On.” If you enjoyed this story and haven’t read it yet, you can get it on Amazon as an e-book, or you can go to the website afterdinnerconversation.com. If you enjoyed this podcast, please like or subscribe. It’ll make us happy and able to do more of these. And once again, we are at La Gattara, where they have cats that are available for adoption for $20, $30, or whatever. Or you can just pay $10 and come hang out with cats, and they have Wi-Fi, and why buy a $10 espresso so you can use their Wi-Fi when you could perfectly well hang out with cats and use their Wi-Fi, and there are cats.

Jessica: Bring your own espresso. You make it the way you want it anyway.

Kolby: Bring your own espresso. Whatev. And if you have a story idea like this in your head, you’ve got your narrative version of the trolley problem, then email it in to us from our website afterdinnerconversation.com and we will take a look at it, and if it’s good enough, yours will be one of the ones that we discuss here. Thank you.
