What’s Your Point of View?

Written by: Pam Lau on January 15, 2024

“. . . The crew never believed they had failed. Instead they believed that each idea led them a bit closer to finding the better option. And that allowed them to come to work each day engaged and excited even in the midst of confusion. This is key.”[1]

 

In 2014, Ed Catmull wrote a fascinating story about how a handful of smart people built something that profoundly changed the animation business and popular culture. The book is called Creativity, Inc.: Overcoming the Unseen Forces That Stand in the Way of True Inspiration. It’s not just a book about being a creative; rather, it’s a story about how to build a creative culture. Catmull describes how he and his team at Pixar developed methods to root out and destroy barriers to creativity while maintaining excellence. The gold of the book is the wisdom he offers for sustaining a culture of disciplined creativity in the face of failures and setbacks. Catmull writes that experimentation should be seen as necessary and productive, not as a frustrating waste of time, and that people will enjoy their work even when it is confounding.

In Robot Souls: Programming in Humanity, Eve Poole asks practical questions like “What Is AI?” “How Should We Relate to AI?” and “Will AI Replace Us?” She then asks a curious question that reflects my own thoughts: “Would We Want to Design Perfect Beings?”

Would we? Would you want to be perfect? If experimentation is necessary and productive for us as humans, helping people enjoy their work, then what do our future students work toward if failure is taken off the table?

Perhaps there is an emotional consequence at play here, one I’d like to explore for the rest of this post.

Contemporary writers[2] who contribute to the AI conversation raise similar concerns about “Junk Code”[3]: in light of what AI can produce for us, it seems paramount to preserve our messy emotions, our unshakable ability to keep making mistakes, to tell stories, and to find meaning all around us. Even legendary music producer Rick Rubin confesses that something transpires in an artist’s soul through the agony of the creative process.[4]

What will education become if it’s not teaching students the ability to cope with uncertainty? Will we merely become a society training people what to do and no longer inviting them to think? If I am following Eve Poole’s logic accurately (please correct me if I’m misunderstanding), she believes we’ve left out all the important parts of human beings in our robots, and she fully believes we have the capacity to make soulful robots.

Three Instructive Questions Come to Mind:

  1. Will Students Forfeit Meaningful Knowledge? Last year, we learned about threshold concepts, the point of entry or beginning. We read how a certain magnitude or intensity must be exceeded before a reaction, phenomenon, or result occurs and new learning is manifested. Meyer and Land[5] admit we’ve lost the essence of discovery when questions are no longer asked. What I learned from their work is that “breaking through” to understanding and interpretation requires transformation in order to progress. Like Ed Catmull’s crew at Pixar, who believed that every idea led them a bit closer to finding a better option, overcoming barriers in learning must come through deep questions riddled with uncertainty about self, identity, and purpose. Eve Poole notes how we will need to teach AI the “etiquette of the strategic request for advice, and the range of interpersonal moves that resemble a request for information even when the answer is already available.”[6] In our quest for perfection and making life easier with AI, will our future students forfeit meaningful knowledge and lose a sense of depth?
  2. What Happens to a Student’s Point of View? In simple terms, point of view is a set of perspectives and deep-seated beliefs that, when held together, help form a person’s opinions.[7] Rick Rubin goes a step further when he emphasizes how songs, plays, stories, and books are all written from a person’s point of view, and he believes that is what we lose to AI: point of view. A point of view is formed through our childhood experiences, relationships, imaginations, disappointments, and romantic breakups. As Eve Poole poignantly iterates, “. . . If we are to establish an [emotional] baseline that is more AI centered, we need to prioritize self-awareness. . . . This raises an ethical question about whether we should program emotions into AI . . . ?”[8] It seems our future students must have a stronger sense of their point of view, not a weaker one, if they are to work with the emotions of AI. What happens to a student’s point of view if failure and setback are no longer a part of their learning or childhood?
  3. Would Fear Replace an Invitation to Think in Our Educational Systems? Martyn Percy spoke to us last fall during our time at Oxford about teaching, learning, and educating. His words have stayed with me longer than I expected. He reminded us that we must teach people suspicion by asking, “Could it be different?” Remember when he asked what the difference was between theological training and theological educating? We could apply that question to a variety of pillars of learning. The agenda of training is based on fear, while the agenda of educating is an invitation to think.[9]

Wait. . . 

Might my point of view be all misguided? My questions may be all wrong. What if a robot were sitting in one of our classrooms with meaningful questions, uncertainty, a point of view? Could we invite a robot to think and not train it in fear?


[1] Catmull, Edwin E. “Creativity, Inc.: Overcoming the Unseen Forces That Stand in the Way of True Inspiration.” Toronto, Ontario: Random House Canada, 2014.

[2] Turkle, Sherry. “Reclaiming Conversation: The Power of Talk in a Digital Age.” New York, New York: Penguin Books, an imprint of Penguin Random House LLC, 2016.

[3] Poole, Eve. “Robot Souls: Programming in Humanity,” 142. Poole defines junk code: “In computing, redundant code that could be deleted or rewritten in shorter syntax without affecting the execution of the program; in this book, those human characteristics that have been deliberately excluded from AI as being irrelevant, distracting, or dangerous, like the emotions, mistakes, story-telling, Sixth Sense, uncertainty, free will and meaning.”

[4] Rubin, Rick. “The Creative Act: A Way of Being.” Edinburgh: Canongate, 2023.

[5] Meyer, Jan, and Ray Land, eds. “Overcoming Barriers to Student Understanding: Threshold Concepts and Troublesome Knowledge.” London: Routledge, 2006. https://doi.org/10.4324/9780203966273.

[6] Poole, “Robot Souls,” 123.

[7] Oak and Reeds. “Can AI-Powered Robots Have a Point of View?,” January 23, 2019. 

[8] Poole, “Robot Souls,” 117-18.

[9] Percy, Martyn. Oxford Advance, October 23, 2023.

About the Author


Pam Lau

Pamela Havey Lau brings more than 25 years of experience in speaking, teaching, writing, and mediating. She has led a variety of groups, both small and large, in seminars, trainings, conferences, and teachings. Pam’s passion is to see each person communicate with their most authentic voice with a transparent faith in Jesus Christ. With more than 10,000 hours of writing, researching, and teaching, the heart and soul of Pam’s calling comes from decades of walking alongside those who have experienced healing through pain and peace through conflict. As a professor and author, Pam deeply understands the role of mentoring and building bridges from one generation to another. She has developed wisdom in how to connect leaders with their teams. Her skill in facilitating conversations extends across differences in families, businesses, schools, universities, and nonprofits. Pam specializes in simplifying complex issues and, as a business owner, has helped numerous CEOs and leaders communicate effectively. She is the author of Soul Strength (Random House) and A Friend in Me (David C. Cook) and is a frequent contributor to online and print publications. You can hear Pam’s podcast, Real Life with Pamela Lau, on iTunes. Currently, Pam is a mediator for families, churches, and nonprofits. You can contact Pam through her website: PamelaLau.com. Brad and Pam live in Newberg, Oregon; they have three adult daughters and one son-in-law. One small, vocal dog, Cali, lives in the family home, where she tries to be the boss! As a family they enjoy worshiping God, tennis, good food, and spending time with family and friends.

14 responses to “What’s Your Point of View?”

  1. Tim Clark says:

    Pam, you had me at Ed Catmull, and then you kept me with Rick Rubin. Creativity Inc. is one of my favorite books (me being a huge Pixar nerd and creative organizational fan), and Rick Rubin has produced some of my favorite work (Run DMC? Beastie Boys? Red Hot Chili Peppers? Tom Petty? JOHNNY CASH???? Come on!!!)

    I THINK (I read the book quickly) Poole is suggesting we program failure into AI, and honestly I don’t know what is more frightening. If we don’t program failure and other junk code into it we end up with psychopathic machines who can have no sense of empathetic context and will only make ‘efficient’ decisions, and since humans aren’t that efficient, that can be bad for us.

    On the other hand if we program junk code failure into AI those failures can be pretty big. One air traffic controller failing might bring a plane down, but one air traffic control system AI failing could bring every plane in the sky down.

    I don’t have answers, just a lot of questions. I too wonder how advanced AI will impact students’ sense of self, learning by failing, and the wonderful creative process in which junk code is absolutely essential.

    • Pam Lau says:

      Tim,
      I am remembering that in a former life you were a rock star, right? When I was listening to an interview with Rick Rubin, I was so impressed with his worldview, routine of life and sobriety considering all the musicians he has worked with for more than 40 years!

      When you really think about it, do you perceive that a robot can have a soul, sense of right/wrong, and emotions??

      • Jennifer Vernam says:

        Jumping in this thread with my two cents: when I read (also quickly) the book, I read the section on Junk code more as observations like:
        – how humans are different from AI (which was reassuring)
        – how telling it is about ourselves that we are trying to get mistakes OUT of AI… disregarding the benefits of Junk code
        – I also inferred that she was suggesting that we would never be able to program souls into robots.
        I am now wondering if I missed the place where she said this was actually achievable.

        • Pam Lau says:

          The best part about these threads is the thoughts we co-create! Yes, at the end of her book (Chapter 10) she writes a chapter on how programming robots with human-like qualities should be where we land. I must admit that final chapter was a bit mystical to me.

  2. Travis Vaughn says:

    Pam, I agree that it does seem that Eve Poole believes we have the capacity to program soul-like stuff (the junk code she writes about) into A.I., but I think the “how” is the question she doesn’t know about. She writes, “Of course it is not currently possible to code for soul or for consciousness, given the lack of agreement on definitions, and on whether either are codifiable categories in any case.” (I reference that quote in my post.) She also mentioned that she had “no idea how one would program intuition into AI” (pg 190 of 266 on the Kindle version). This would be a good conversation between Dr. Poole and Dr. Kahneman. If anyone has a template for System 1 thinking, it would be him, and if I were a coder attempting to code the human soul into A.I., I’d want Kahneman on a retainer.

    Question: What would be one or two ways you might use AI in your practice of mediation (or not)?

    • Pam Lau says:

      Travis,
      Your question came at the exact moment I was finishing my research plan, when I realized how AI could help me with my NPO project if I coded my questions accurately. I asked myself if I could use AI in my practice of mediation. The first thought that comes to mind is that as a mediator, I help people talk, and I help them calm down in the midst of trauma. Would I want to rely on any form of technology or a robot as I work with people who need me to listen attentively?

      If I were to program or code responses ahead of time, I could use AI to help people in conflict move faster through the emotions and blockades. The problem with that is I remember the author of The Body Keeps the Score writing how when people know they are being deeply listened to by another human being, healing begins.

      In my NPO project, I am working on a Two-Way Communication Model for leaders to use when they need to have a difficult conversation with a direct report or when a leader needs to talk about a sensitive topic by “managing up” to their supervisor. I can see where AI would be helpful and informative if I created a digital document for each participant to plug in the questions and responses as they are somewhat guided in the conversation. My goal is to help leaders talk with truthful humility and to do so with integrity.

      Could AI help me help leaders with self-awareness? Self-reflection? Currently, in my family law mediation work, I do not use any form of AI. Thanks for the question–it’s got me thinking!

  3. Esther Edwards says:

    Pam,
    Your post makes me realize I need to revisit my Advance notes. So much great content! The thought of the difference between theological training and theological educating is one to think about, as well as Martyn Percy’s question, “Could it be different?” This question makes me think of the difference between strategic foresight and strategic planning. When we plan, we already know the outcomes that we want. When we dream of what could be, it opens the mind to innovation. That is something that AI has not quite been able to replicate.
    I’m curious. What are some of the ways AI has helped you to have more time to be creative yourself?

    • Pam Lau says:

      Esther~ Yes! Reading over the notes from both advances has been helpful to me through the months. I write everything down because I think in images, not linearly, and writing helps me to remember.

      As a writer, I had not relied on AI at all until 2019, when I wrote a parable that I turned into an animated video. I should say, my student interns turned the parable into animation. They used incredible forms of technology to bring the characters to life. It’s still my one writing “baby” who needs to find a home to grow up in! But to answer your question, AI has helped me save time for my own creativity in practical ways.

      Twice a day, I turn off all technology for 90 minutes so I can focus. AI helps me because it silences my notifications.

      Because iTunes and Spotify hold almost every song ever recorded, when I listen to a playlist, songs I have never heard before will be in my ears. Music inspires me to certain kinds of writing.

      My favorite form of AI is when I am writing a talk and I need to understand the root of a word, Scripture or context. It blows my mind what I can find within 10 minutes!

      When I record for my podcasts, I no longer have to be in person (although I prefer an in-person interview); Zoom, podcasting platforms, and my iPhone give me simple ways to share my podcast.

      What about you? How has AI helped you? Great question!

      • Esther Edwards says:

        An animated parable! How fun. We are continually stretched in the realm of technology!

        As far as my using AI – I’m sure I’ve been unintentionally using AI for many years with Zoom and automatic responses through email and our website, but intentionally, just since the summer. Now I use ChatGPT for much of my formal email writing and anything descriptive I need to use in promo. It is also a great way to take something I’ve written for church use and make it more concise if needed. The use is endless. I do need to use it more in my research. The seminar with Mario was very helpful.

  4. Cathy Glei says:

    Thank you for your post, Pam. I watched a YouTube interview of Eve Poole discussing the content of Robot Souls. I think more clarity about her position is needed for me. I read the book understanding that our junk code is what makes humans irreplicable and cannot be programmed into AI. This would be a good thought for discussion in our chat. What gold (wisdom) from Catmull’s book does he offer to sustain a culture of disciplined creativity? Creatives may not think of creativity in terms of needing to be disciplined. Thank you for your post.

    • Pam Lau says:

      Cathy~
      I highly recommend listening to Ed Catmull’s book. The voice (I think it’s him) is mesmerizing. The wisdom he offers is in how he tells the stories of failure over and over again. Then, suddenly, because they all stayed in the game for the long haul, someone gets inspired and they create a blockbuster animated film! I would love to discuss the discipline of being a creative at length with our cohort. We could all toss in our gold and learn so much from one another.

  5. Russell Chun says:

    AI as a student.
    Thanks for the perspective. I heard that Dr. Mario Hood was actually challenging his ChatGPT AI with moral questions.

    So I thought I would give it a try. Pretty much I was deflected, but I wondered if it was having an impact. Your post makes me want to continue to challenge my AI. Teach it, perhaps?

    Poole writes, “Faith is global in its reach and spread, and has not yet been categorically superseded as an explanatory narrative” (p. 106).

    Amen/Shalom

    • Pam Lau says:

      Russell, Did you watch Mario Hood’s webinar? What did you take from it?

      • Russell Chun says:

        Hi Pam,
        It appeared to me that Mario was an aggressive user who sees AI as a tool to be used.

        When I first moved to Hungary, I used email to contact visiting U.S. missions teams. There was a long delay in responses, and I was discouraged. I resisted the newfangled Facebook, but when I sent messages, I received almost instant feedback from university-age teams. Apparently, they were constantly on the app.

        So, for me, lesson learned. I am reminded of the old adage, “meet them where they are.”

        Another lesson…during high school I watched my son use some app that connected his football teammates. The goal was to always respond to pictures, stories, etc. (not Facebook) for the entire school semester. Each day, without missing a beat. Sigh.

        I wonder if there is a dopamine aspect to using social media and now AI.

        Ready or not……

        Shalom.
