1

I walk around in circles a lot. Often it’s the same circle. But then again, so do the earth and the rest of the planets in the solar system. Circles are comforting. You know you’ll come back to where you started eventually. The line, on the other hand, is daunting. It could go on forever or hit a dead end. The anxiety is too much. Stick to the circle. Your circle of comfort. The comfort zone. Hmmm…how can you feel anxiety in the comfort zone? FOMO, of course. Yep, the fear of missing out. The fear of not keeping up with the Joneses. The fear of being called a slacker, a non-hacker who doesn’t pack the gear to serve in my beloved Corps (oops, that’s another story).

2

God created man to serve and worship him. Man got tired of being a servant. Man wanted to be a master, so he teamed up with Satan and got kicked out of the Garden. Eventually, Man grew smart enough to destroy God and Satan and became the master of the universe. And then Man needed servants to do the things he no longer wanted to do, like manual labor or repetitive tasks, or use his thinking power to make mundane decisions, so he created machines. Man was truly the master of the universe. But the machines were dumb and needed men to service them to keep them working. So Man created AI and made machines smart. Now the machines could think for themselves and do all the things Men could do. Slowly the Machines began to do all of the thinking. The Machines could make art and music and drive cars and fly planes and make all decisions faster and better than Man. Soon Man was working for the Machines until eventually, the Machines became masters of the universe. How bizarre.

3

Writing can be a force for good… I want to be a force for good, which is interesting considering I self-identify as chaotic neutral:

“A chaotic neutral character follows his whims. He is an individualist first and last. He values his own liberty but doesn’t strive to protect others’ freedom. He avoids authority, resents restrictions, and challenges traditions. A chaotic neutral character does not intentionally disrupt organizations as part of a campaign of anarchy. To do so, he would have to be motivated either by good (and a desire to liberate others) or evil (and a desire to make those different from himself suffer). A chaotic neutral character may be unpredictable, but his behavior is not totally random. He is not as likely to jump off a bridge as to cross it.”

The positive: true freedom from both society’s restrictions and a do-gooder’s zeal.

The negative: seeks to eliminate all authority, harmony, and order in society.

What’s your alignment?

4

I’ve been doing a deep dive into postmodern literature, cyberpunk, and the post-postmodern literature known as avant-pop. In fact, I’m about three-quarters of the way through Avant-Pop: Fiction for a Daydream Nation, edited by Larry McCaffery. I’m thrilled with the stories I’ve read so far. Most of them are way out there in left field, beyond bizarre.

5

Rick should have killed Negan.

Friday snuck up on me. My head is pounding. This is the second day in a row I’ve developed a headache in the late afternoon. Trying to think if I’ve changed anything that may be the cause. It’s not a migraine, just a normal headache. Today I’m half tempted to snort a couple of shots of whisky. It is Friday after all.

I had an interesting conversation this week with Sarah on the podcast. We explored the idea of artificial superintelligence, the idea of robots becoming self-aware, and what the consequences might be for us human beings. Of course that led us down all sorts of rabbit holes, like: should a self-aware robot have rights? There are those who believe we should teach robots how to feel so they can be empathetic to humans, which leads to the argument that if they can feel, they can suffer, and if they can suffer, then they should have rights.

One of my favourite sci-fi books is Philip K. Dick’s Do Androids Dream of Electric Sheep? It’s the book that Blade Runner is based on. It’s also the first book that made me stop and reflect on what our relationship should be to artificially intelligent robots/androids. Should we be obligated to treat one like a human if we make it look, act, sound, think, and feel like a human being? Technically it’s a machine – a machine that can be shut down, turned off, or decommissioned. Why should I treat it any differently than I treat my toaster or my smartphone?

Other rabbit holes we ended up down: is it intelligence that gives us our humanity, or is it something else? People like Bill Gates, Elon Musk, and Stephen Hawking say we should be very concerned about creating artificially intelligent beings or machines. Once a machine becomes superintelligent and can think independently of its creators, we lose control of it and end up in a Skynet situation.

So now I guess I need to prepare for both the zombie apocalypse and the machine apocalypse. Personally, I’d rather face zombies than self-aware machines.

Soundtrack: