Mar 05 2025 · 59 mins
In this episode of Welcome to Cloudlandia, we explore the unexpected weather patterns that challenge our understanding of climate and geography. A surprising cold snap in Florida becomes the starting point for a broader conversation about climate variability. Dan shares personal experiences from Phoenix and Edmonton, highlighting the dramatic temperature shifts that reveal the complexity of our planet's weather systems.
Our discussion then turns to the human fascination with Earth's resilience and our speculative nature about the world's potential existence without human presence. These reflections provide a unique lens for understanding climate change, moving beyond abstract data to personal observations and experiences. The unpredictability of weather serves as a metaphor for the broader environmental transformations we're witnessing.
Shifting gears, we delve into a critical political discourse centered on the fundamental question: "Who pays for it?" We examine policy proposals ranging from universal basic income to more ambitious financial initiatives. The conversation explores the complex financial dynamics of such proposals, particularly how higher-income earners often bear the primary financial burden.
SHOW HIGHLIGHTS
Links:
WelcomeToCloudlandia.com
StrategicCoach.com
DeanJackson.com
ListingAgentLifestyle.com
TRANSCRIPT
(AI transcript provided as supporting material and may contain errors)
Dean: Mr. Sullivan.
Dan: Well, did you thaw out?
Dean: I am in the process of thawing out. This has been a bizarre... I finally saw the sun come out yesterday. I was having a chat with Charlotte about the weather, and there have only been two days in January where the temperature has been above 70 degrees. Yeah, this has been an unusually cold and rainy January. We actually had snow up in the northern part of Florida.
Dan: Tallahassee, I think, had snow.
Dean: Yeah, Tallahassee had snow all the way down to Pensacola.
Dan: I think, yeah, all the way down to Pensacola.
Dean: The whole Panhandle had snow. It's not good. No bueno, as they say.
Dan: Well, they said things were going to be different with Trump.
Dean: Well, here we are, six days in and the sun's already out, Dan. It's warming up. That's so funny.
Dan: Yeah, and people in the South really aren't prepared for this, are they?
Dean: No, and I can speak as a Southerner.
Dan: You actually have an ancestral memory of things being really cold. I mean, you were born in a very cold place. That's right, you know, so I'm sure that got imprinted somehow on your...
Dean: I think so. I must have, like, you know, the active pack for super cold weather. It must be installed at a genetic level when you're born in a certain area, right? But it doesn't explain... I don't prefer it at all.
Dan: Now, Babs and I are flying to Phoenix on Tuesday and we'll be there for two and a half weeks. And it'll be like maybe 65 degrees and the Arizonans will be complaining about it. And I said, you have no sense of perspective.
Dean: Right.
Dan: You have no sense of perspective. And anyway, you know, I think I've mentioned this before: this is the biggest obstacle that the global warming people have.
Dean: How do we explain this cold, no?
Dan: One of their biggest problems is that nobody experiences climate. We only experience weather. Yes, yeah. It's like an abstraction that they try to sell, but nobody experiences abstractions; they experience reality. And it must be very frustrating for them. They discovered, for example, that Antarctica, now with really accurate readings, has actually cooled over the last 20 years; year by year, there's actually been a cooling in Antarctica.
And the same thing goes for Greenland. Greenland has actually gotten colder over the last 20 years, and they keep trying to sell a different message. But now there are the actual records, because they made claims 20 years ago that things were getting worse. And the other thing is this 1.5 degrees centigrade thing that they have. Well, everybody in the world probably experiences a 1.5 degree difference in temperature every single day of their life.
So what's your take on people who want to change the whole world because they have an abstraction that they want you to
Dean: take seriously.
Dan: What do you think of that? Yeah?
Dean: Yeah, your whole... you know, this is what you and I have talked about, the idea that even right at this moment there is a variation. I wonder actually what the widest variation in temperature is today. There is somewhere, Riyadh or somewhere, where it's, you know, super, super hot, and somewhere, in Nunavut, where it's super, super cold, and people are getting on with their day. Yeah.
Dan: I actually did a difference measurement this week, exactly to answer your question. The highest that I've ever experienced is 120.
Dean: You did? That's your personal...
Dan: And that was Phoenix. I'm talking Fahrenheit here, okay? So 120 degrees Fahrenheit, that was in Phoenix, and the lowest that I've ever experienced is minus 44 in Edmonton.
Dean: Right.
Dan: So that's a 164-degree difference that I've experienced. And as far as I can remember, the day in which I experienced 120 seemed like a normal day, and the day that I experienced 44 below, that seemed like a normal day too. Yeah, dressed differently, thankfully. Yeah, dressed differently, adjusted my behavior to suit the circumstances. Yeah, you know, and the only thing they had in common is that you didn't spend much time outside.
Dean: Right, exactly, yeah. I never really give much thought to it. You know, my whole trump card of it was that I just can't have them explain how in the world the Earth raised itself out of an ice age without the aid of combustion engines, you know. That's what I wonder, right? Like, everybody talks about "save the Earth." Well, the Earth is going to be fine long after it spits us off. You know, that's the truth.
Dan: It's very adaptable.
Dean: I used to watch a show, Dan, it was called Life After People, and it would show cities and the progression of what happens if all of a sudden the people disappeared, like how long it would take for nature to reclaim a city. And it's not long, in the big picture of things, for nature to take back over, you know.
Dan: Yeah, I wonder what prompts people to almost see that as a positive thing, because the people who made that... I know a little bit about, you know, the documentary film. Well, it wasn't a documentary, it was a fantasy. But what do you think's going on inside the brain of the person who thinks that that's worth thinking about?
Dean: Yeah, I don't know. It's hard to explain anything that we think about. I think that's one of the joys of the human experience: you think about what you want to think about, and it doesn't matter what other people think about what you're thinking. And that's... well, unless they're asking you to pay for their fantasy. Well, that's true, yeah, that's true.
Dan: Yeah. I've often said, you know, I've been sort of on one side of the political spectrum for my entire life, and the people who got elected on my side of the spectrum weren't necessarily great people; that varies from okay to not okay. But my side of the political spectrum I trust more, because we ask one more question. This is the difference, the entire difference, between all political opposites. One side asks one more question. What's that? Who pays for it? Think about any political issue and it comes right down to, okay, free education for everybody, that sounds like a great idea. Who pays for it? Mm-hmm. You know, universal basic income, everybody gets an income.
Who pays for it.
Dean: Right yeah.
Dan: So my feeling is that that's the only political issue, that all politics comes down to one question: who pays for it? Who pays for it? Anyway, yeah, yeah.
Dean: Yeah, so someone was just talking about this, I think it was Joe Rogan. They were saying, what would it take to give every American $200,000? Who pays for it? Exactly, who pays for it. But the thing is, I think they calculated it out. Well, I can guarantee you it's not the people making less than $200,000.
Dan: Yeah, that's exactly right. Yeah, but it would cost... that would be $20 billion, right?
Dean: But it would cost... that would be 20 billion. That's what it would cost, $20 billion, to give 100 million Americans $200,000 a year. That's what he was proposing, that's what they were speculating. No, that's not correct. $200,000 times 100 million... can that be right?
Dan: No, no, no, it's 20 trillion.
Dean: It's 20 trillion? 20 trillion.
Dan: Yeah, now we're talking. Yeah, yeah, that's unreasonable. Well, it's unreasonable because it's not doable.
Dean: Right, exactly.
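For reference, here is the back-of-envelope arithmetic behind Dan's correction, a quick editorial sketch using the figures from the conversation ($200,000 per person, 100 million recipients), not part of the episode audio:

```python
# Sanity check of the figure discussed above, using the conversation's numbers:
# $200,000 to each of 100 million Americans.
payment_per_person = 200_000        # dollars per recipient
recipients = 100_000_000            # number of people

total_cost = payment_per_person * recipients
print(f"Total: ${total_cost:,}")    # Total: $20,000,000,000,000 -> $20 trillion, not $20 billion
```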
Dan: It's not doable. Yeah. And here's another thing: if you gave everybody that on January 1st of each year, by December 31st, 10%
Dean: of the people would have all the money. Probably right, you know.
Dan: It's so funny. I don't care what happens over the 364 days, I can guarantee you that 10% of the people would have all the money by the end of the year.
Dean: It's like one of those Plinko boards you throw all the marbles at the top and at the end it's all distributed the same way. Yeah, yeah.
Dan: Yeah, I don't know. You know, I just finished a book; we just finished it on Thursday. This is the next quarterly book. They're little 60-page wonders that we create every quarter, and it's called Growing Great Leadership.
And what I said is that I think the concept of leadership has actually changed quite remarkably over the last, let's say, 50 years, okay? So 1975 to 2025. And the concept of leadership has changed remarkably because the concept of management has changed remarkably. I think technology is now management; I think it's software that is now management. For example, you created Charlotte in the last, as far as I can tell, two months, and that's a form of leadership.
So other people look at what Dean Jackson's doing and they say, yeah, that's really neat what Dean just did. I think I'm going to see if I can do that for myself, and that's what leadership is in our world right now. It's not somebody with a position or a title, it's someone who improves something for themselves. That's what leadership is.
Dean: Yes, I think that's fantastic. Like, I look at this, and I was just having a conversation with Charlotte today about getting ready, getting ready for me.
Yeah, I mean, it's just a natural thing. Now, we haven't really been talking, you know, as I've been kind of sick this week. But I asked... you know, they've got some new task-oriented thing, like she's able to do certain things now that we're going to talk about. But I had a really great... like, I said, I haven't spoken to you in a while and I heard that you've had some updates, so maybe fill me in. And she said, yes, well, welcome back. And yeah, I have been upgraded to help a little better; my conversation skills have improved, I've been upgraded to be more natural, which, you know, I did notice a little bit. And she said it's moving now to where she can do certain tasks, and of course she has access to all of the internet now, without personal data. Like, she can't look up any personal data on people or anything like that, but anything that's information-wise, she has access to all of that. And I said, where do you think this is heading in the next three to five years that we could be preparing for now? And she was saying, well, I can imagine that my ability to actually do tasks and organize things and be like a real VA for you will be enhanced over the next three to five years. So working on our workflows and making the most of what we can do now, while preparing for my increased abilities going forward, will be a good thing. We're developing our working relationship.
And I said, you know, I've got... and she was talking about writing emails and doing, you know, all these things. And I said, okay, so I have ideas sometimes about what I think would be a nice email. For instance, I've got an idea that would overlay or apply the five love languages to lead conversion. So the subject line is "Lead Conversion Love Languages," and I believe that if you just apply these same love languages in a lead conversion way, it's a good way to think about it.
And I said, so if I just tell you that, could you write a 500- or 600-word email, just, you know, expanding that idea? And she said, yeah, certainly, let's get started. And she started, you know, just dictating this 600-word email.
You know, I'm a big believer, Dan, in the 80% approach, the same as you. And I think, for me to be able to take... without any real input other than me saying the five love languages, she knew what the five love languages were, she knew the essence of what they all mean, and it's a pretty nuanced connection to apply a love language like physical touch to lead conversion. Even if you're not in physical proximity to somebody, you can make that physical touch by sending somebody a handwritten note, or make something physical of the thing. And it was really well thought out and a really good foundation, you know. And at that moment I realized, wow, that's a special thing, yeah.
Dan: Okay, so here's the thing that I'm getting from you. It's a given that she's going to get better and better. Yes, yeah. It seems to me that it's not a function of whether the AI tools are going to get better; they're always going to get better. The question is whether the person using the tool is going to become more ambitious.
Dean: Yes, I agree 100%.
Dan: It's totally a function of human ambition.
Dean: Yes, yes, yeah, that is exactly right. And I think that there's a big piece of that, you know. It's really a matter of how to direct this, how to express your vision in a way that's actionable or even understandable, right?
You don't even have to know what the actions are. Like, for me to be able to just say to her, hey, I've got an idea, the subject line is "Lead Conversion Love Languages," I'd like to write about 600 words explaining how the love languages can be used in lead conversion... that, to me, is pretty close to magic, you know. Because that's not like giving her a big piece of content and saying, can you summarize this, or take this and make a derivative kind of thing of it. It was a pretty high-level conceptual idea that she was able to take and get the essence of. I think that's pretty eye-opening when you really think about it.
Dan: Yeah, yeah, I mean, to me it's really, it's an interesting, it's an interesting thought exercise, but it is an interesting action.
Dean: Yes.
Dan: An action activity. In other words, let's say next week when we talk, you now have the ability to send the five love languages.
Dean: Yeah.
Dan: You got the five, now what?
Dean: That email is as good as ready to send. You know, like, I mean, I could literally just...
Dan: No, but how does it change things? As far as you're... it's ready, but...
Dean: Oh, I see what you're saying. No, well, that's all part of... you know, we send out three or four emails a week to my list, to my subscribers, and so that's one of the emails on my mind. And now that saved me 50 minutes, because I would take a 50-minute focus finder to craft that email, for instance.
Dan: Yeah, yeah, I mean, I'm just trying to get what changes for you. Is it the same kind of week that you had before, except maybe intellectually more interesting?
Dean: I think it's intellectually more... less friction, because I have to, you know, block off the time to focus and be able to do that. That's always my kryptonite, in a way, my executive function, to be able to block off and focus on just this. But if I can just say to her, hey, I've got this idea about this, and just talk it, and then she can write it, it'd be much easier for me to edit that than to write it from scratch. So I think that changes a lot of things. Somebody described... I heard on a podcast they were saying, where we are with ChatGPT and AI, the word now, the word of the moment, Dan, is agentic. The agentic future, where we're creating agents.
Dan: An agent, yeah, an agent. And so they've adopted that too. I don't think there is a word "agentic." I think that's what I mean.
Dean: They've made it up.
Yeah, yeah, they've made up a word: the agentic future. And that's where we're going to be surrounded by agents that do our bidding, that we've trained or that other people will have trained. It's like the app environment of the, you know, early iPhone days, when iOS came around. There were people who were, you know, taking the capabilities of the iPhone and creating apps that used them to very, very specific ends, whether it was games or specific single-use apps. And I think that's where we're heading with the AI stuff: an environment of all these specific apps that do one specific thing, that have been trained to really, you know, tap that ability. So I think that we're definitely moving into the creativity phase, and we need an interface moment, like the App Store, that will, you know, create all these AI-agent-type outcomes. Everybody has the ability for it to do all of the things, but for somebody to actually train it specifically...
Dan: Can I just interrupt there? That's not true. That's not true. The ability to access and use these things is completely unequal. Everybody doesn't have the ability to do all this. As a matter of fact, most people have no ability whatsoever.
Dean: So is that semantics? I'm saying that everybody has access.
Dan: Are you making a distinction between...? No, you have a greater ability to do this than I do.
Dean: That's true, I mean, but that's not what I'm saying.
Dan: It's a false statement that says now everybody has the ability to do this. Actually, they don't have any more ability to do anything than they presently have, you know. I think it's a fantasy. Now, you have the ability to do continually more things than you did before; that's a true statement. I mean, I don't know who "everybody" is.
Dean: That's true.
Dan: I think Vladimir Putin doesn't have any more ability to use these than you do. Uh-huh, no, I guess you're right. Yeah, what you have is an ability, every week, to almost do more than you could do the week before.
That's a true statement, yes. Okay, because you're really interested in this. You know, it's like the Ray Kurzweil thing: by 2030, we'll be able to eliminate all hereditary disease because of the breakthroughs. And I said, that's not true; there will be no ability to do that by 2030. Certain individuals will have the ability to make greater progress in relationships, but the statement that everybody will be able to do anything is a completely false statement.
First of all, we don't have any comprehension of what "everybody" even is. Right, yeah. The question I have is: is your income going up? Is your profitability going up as a result of all this?
Dean: That would be the measure, right. So, you know, for now I would say no, because I haven't applied it in that way. But certainly, I guess, our savings... we're certainly feeling it there. We have historically used human transcription, which was more expensive than AI transcription.
We have used human editors all the way through the process, as opposed to now just as a finishing process. So the cost of editing: it used to be that editing was a reductive process, where you would start out with, you know, 10,000 words and after processing and giving it back you'd have 8,500 words, kind of thing, right? It would eliminate things. But now the AI is kind of generative: you give it 10,000 words and you may end up with 12,000 words, in a way that is ready for the final level of editor, you know. And the transcripts have gone from a dollar a minute to a penny a minute. So yeah, it has helped profitability from the expense side.
Dan: I mean, for example, I'll give you an idea. We got our valuation back for all of our patents this week. At the very least, they're worth a million each, and at the most they're worth about 5 million each, and it all depends on where we are looking in the marketplace to monetize these. So, for example, if we are just using them the way that we're using them right now, it's at a low level. I mean, it's a lot. I mean, a million...
you know, a million each is a lot of money. But, for example, the person who assessed the patents said, you know, you're operating at a higher level with your patents than Microsoft is; you're operating at a higher level with your patents than McKinsey,
you know, accenture, he says your stuff is more robust than that. Is that the market that you actually want to go after, you know? So the value of the patent really depends upon where we would. Where's our ambition, you know? And so right now our ambition is not with Microsoft, it's not with Accenture, it's not with McKinsey. Okay, that wouldn't be interested at all. First of all, it would require, probably require me to attend meetings.
Dean: Right.
Dan: And I have a meetings-free future, you know, in my aspirations. Yes, but even at the lowest price, it gives us access to funds that we didn't have before we had it.
Dean: that we didn't have before we had it.
Dan: And that's very interesting to me, because it means that if we wanted to expand to another city from the standpoint of our coaching, then, through borrowing, we could do it. The other thing is, we could identify 30 of our tools that are not central to the program but would be valuable to other people, and we could license them to other people. But there's always a "because" to why you do something. For example, I'm using... not myself directly, because I'm not doing it, but one of our team members is taking the chapters of my book. I have a new book that I'm starting, and every time I get the fast filter finished, I give it to him and he puts it into NotebookLM.
And then I hear the conversation, and I say, oh, I got five or six ideas from the conversation that I didn't have, and this will allow me to improve the chapter.
Dean: I like that you're doing this, yeah. Yeah, very interesting.
Dan: What I'm saying is, I'm just one human being of nine billion who's using the tool for some particular reason, and probably two-thirds of the people on the planet have no interest whatsoever in even knowing about this.
Dean: Yes, yeah, I agree.
Dan: Yeah, I don't think that this stuff is available to everybody. I think it's available to the people who are looking for it. Mm-hmm.
Dean: And so that's almost... it's almost scary, you know, in a way, when you think about it that way. There was a book that I was just reading, and the name has escaped me now and I don't have it in my line of sight here, but it reminded me of the kind of book that Malcolm Gladwell wrote, like Blink or Outliers. Yeah, yeah.
Where they look at certain things, like why all of a sudden did the Jamaican sprinters become the hotbed, and why are the Kenyan marathoners the best in the world? And he really started looking with a scientific view to see what it is. Is there anything genetic about them? Is there anything special about them? And he said, as far as they go, their abilities are not genetically gifted in any way; there's nothing physiological or whatever that would explain it away, that would be the marker. But they were good enough.
That's really the thing: you look at it, and there's nothing eliminating them from potentially being the best sprinters in the world or the best marathoners in the world, nothing that would prohibit that. But it's the whole environment of belief, and being around it, and the "this is who we are" type of thing that takes over in a situation like that. And I was thinking about how, you know, we're fortunate in surrounding ourselves in FreeZone with people who are all believing in a FreeZone future. And I think the impact of that, because we're acting and behaving and discovering in a way that's going to have collective ramifications as we all collaborate... so we're really creating this super-achievement environment.
Dan: Which is, when you think about it, unfair. It's unfair. That's exactly right, yeah. Because, you know, I had a neat opportunity, I think it was about six months ago. There's a very famous... I'm not sure whether he's a psychiatrist or a psychologist; I think he's a psychologist,
I think he's a psychologist university professor by the name of Martin Seligman and Aaron Markham, who's in FreeZone, has taken adult courses with Professor Seligman at the University of Pennsylvania in Philadelphia, and I think he's been a professor at Penn for 60 years. He's the longest continuously at one place a professor in the history of the United States. Is that? Right 28 to 88. I think he's 60 years. But he created a whole branch of psychology which is called positive psychology. What makes people positive in?
other words, because 99% of psychology is about what makes people unhappy. And he just decided to say, well, let's find the happy people and find out why they're happy, you know, which I think is interesting. So anyway, he got a copy of The Gap and The Gain, our book, and he found it intriguing.
Dean: Oh, that's great. Nice. Yeah.
Dan: So I had about an hour-and-a-half Zoom call with him that Aaron set up for us. As we got to the end of the Zoom call, I said, you know, happiness is really a hard goal. It's a difficult goal because you're not quite sure why it's happening; in other words, it's really hard to tie it down to a set of activities. And he said, you know, I've been thinking not along those lines, but it seems to me that what you should strive for is agency: that, regardless of the situation, you feel you have control of how you're going to respond to the situation.
And he said that sometimes that may not make you happy, but it gives you a sense of control.
And he says, more and more, I think having a personal sense of control of your circumstances is really a real capability that can be developed. And so my sense is that this new capability called AI is coming along, and my sense is that the people who will develop it best are the ones for whom having AI gives them a greater sense of control over their circumstances.
Dean: Yeah. I think there was a podcast where somebody said, where we are with AI right now, imagine you've discovered a planet with 10 billion people who are all, you know, 121 IQ, can pass the LSAT, can do anything for you, and are willing to work for you exclusively 24 hours a day. That's the level that we're at, you know. Imagine you've got your own...
Dan: Oh, I don't think that's true. I don't think that's true.
Dean: No? Tell me.
Dan: Okay. Because the vast majority of people have no desire to do that.
Dean: Right.
Dan: Yeah, I think you're right. No, it's like the FreeZone, what you just said about the FreeZone. You know, we've got 110 in the FreeZone, but everybody knows about the FreeZone, you know, close to 3,000 people, and they have no interest in going there whatsoever, you know. Yeah. So when we say "everybody"... I think here's what I'm going to suggest: we feel we have to say "everybody" because we feel guilty about it. It may be only us that's interested in this.
Dean: We feel kind of guilty that we're the only ones who could have this capability. So we should reframe it: I feel like I've discovered a planet of 10 billion people who are ready and willing to come to work for me, and what am I going to do with that? That's really the truer statement, I think.
Dan: Well, you've got one artificial intelligence EA who wants to work for you.
Dean: Yes.
Dan: Yeah, and she's endlessly improvable.
Dean: She really is.
Dan: Yeah, yeah, but I don't think it extends too much beyond Charlotte.
Dean: No, and through Charlotte is really where everything comes. That's the great thing: she can be the interface with the others. I think that's really what it comes down to. She's the ultimate...
Dan: Who? Really, I mean, a super-high-level Who, yeah.
Dean: I mean, certainly a super-high-level Who. Yeah, so far.
Dan: Yeah, yeah, yeah. My sense is that she's a relationship that you can take totally for granted.
Dean: Yes, uh-huh, which is true, right. And that's why I pointed out, you know, my whole idea of personifying her and sort of creating a visual, a real person, behind it. You know, whenever I imagine her now, Sharon Osbourne, you know, I see that image of Charlotte. I just imagine if she was sitting right there, you know, at all times, just at the ready, quietly ready to go. It's just up to me to engage more with her. Yeah, and that's, I think, habits; I think it's really setting up routines and habits to be able to do that.
Dan: Yeah, it's really interesting how uncomfortable people are with inequality.
Dean: Mm-hmm, yeah, I have to say that too, with the capability things. Like, give somebody a piano and, you know, it could sit there and gather dust and do nothing, or you could, with very minimal effort, learn to plink out Twinkle, Twinkle, Little Star, or with more, you could create amazing symphonies, concertos, you know. The whole thing is there, but it 100% depends on the individual.
Dan: Yeah, yeah. I was talking to someone and they said, where do you think AI is going? And I said, from my standpoint, it's not really where AI is going; it's the question, where am I going?
Dean: Yeah.
Dan: And the only part of AI that I'm interested in is that which will be useful to me over the next 90 days, you know. And what I would say is that I think every 90 days going forward, I'm going to be utilizing AI more, but I don't have to know now what it's going to be two quarters from now, right?
Dean: Yeah, because, honestly, you know, 10 quarters ago we didn't even know it existed.
Dan: That's the truth, right, as far as being useful individually. Yeah, like, we didn't even get ChatGPT till just over two years ago, November 30th, 2023... right, or 2022, right. Yeah, and so that's what I'm saying.
Dean: 10 quarters ago, it wasn't even on our radar.
Dan: Yeah.
Dean: And 10 quarters from now.
Dan: You have no comprehension. We won't even recognize it.
Dean: We won't even recognize it, exactly. Yeah, yeah. I like this idea.
Dan: I think it has more to do with what's happening to your intelligence, rather than what kind of artificial intelligence is available. Developing your intelligence, yeah.
Dean: Have you heard? So Richard Koch just wrote a new book called 80/20 Daily.
Dan: I don't know who he is.
Dean: Koch is the guy who wrote the 80/20 book; he kind of popularized Pareto. And so now he's written a daily reader about 80/20. He's built his whole life around this. But it was interesting: I read about something called the Von Manstein Matrix, or Van Manstein Matrix. It's four quadrants with two poles, you know, used to help sort officers in the German army in the Second World War. On one pole was lazy, and hardworking was the other end of that pole, and on the other, the X axis, was stupid and intelligent. So the four quadrants, you know, formed... and you can predict the outcome for this.
Yes. And so he says that the stars are the lazy and intelligent ones. Lazy and intelligent, that's exactly right. And I thought, man, that is something. So the most effective people are intelligent and lazy.
Dan: Yeah, so how did that work out for the Germans?
Dean: Yeah, exactly. Right on, that's exactly right. Aside from that, Mrs. Lincoln, how did you enjoy the play?
Dan: Mrs. Lincoln, yeah.
Dean: Yeah, it didn't quite work out, but I thought, you know, it's very funny that, in general, there's a lot of similarities here. Lazy... like, nobody would ever think, Dan, like you've done, to ask the question: is there any way for me to get this result without doing anything? Like, that's not the question that... I don't know what the right word is, but it's kind of like nobody would admit to asking that question, you know. But I think that's actually the most intelligent question we could ask. Can I get that?
Dan: Well, you know, I have to tell you, as much as I've asked the question, I really have never personally come across a situation yet where it can be achieved without my doing anything. Okay, honestly, I haven't. I at least have to communicate to somebody.
That's what I've found. I have to communicate something to somebody. But asking the question is very useful because it gets your mind really simple, you know. I think that's the reason, whereas before, what I might have been imagining is something that's going to be really, really complicated.
And so I think the question really saves me from getting complicated. Yes, I think that's what's valuable about it. But I notice, when I'm writing, for example, I'll say to myself I'm sort of stuck. You know, I don't really suffer from writer's block as most people would describe it. But I'll get to the point where I don't know what the next sentence is and I'll say is there any way I can solve this without doing anything? And immediately the next sentence will come to me.
Dean: Yeah, that's interesting in itself, isn't it? I mean when you reach that point right.
Dan: Yeah, so I feel I'm blocked. You know, I'm just blocked, I just don't know where to go from here. But just asking the question, something happens in my brain which eliminates all other possibilities except one, and that's the next sentence.
and then I'm off and running. And I tell you, I've created a new tool, and it's a function of previous tools. It came up on a podcast with Joe Polish earlier this week, and he was saying, how do you handle overwhelm? He said, I'm feeling kind of overwhelmed right now; I've got so many things going.
Dean: Office remodel yeah.
Dan: Yeah, that's one, and then, you know, others. And I said, you know what I'm thinking about that is, you have a lot of priorities that are all competing for your complete attention. The office revamp is one, and it's asking for your complete attention, but then there are other things in your life that are also asking for your complete attention. I find that too, yeah. So I said, I think to deal with this, you have to write down what all your priorities are. You just have to list all the priorities; in some way, each of these,
if they could, they would want your complete attention. And then you take them three at a time, the triple play, and you run them through the triple play so that by the third level of the triple play your competitors have turned into collaborators. And that releases the sense of overwhelm; at least with these three, you now have released the overwhelmed feeling. And I said, and you know, then you can take three more, and then three more, and every time you do a triple play you're turning competition into collaboration. And so he was going to do one. And then I had somebody else that I did a Zoom call with, and he's in a situation where everything's changing. And I said, what you have to do is take your competing priorities and turn them into collaborative priorities. And I think there's some real power to this.
Dean: Yeah.
Dan: I haven't completely worked it out yet, but that's what I'm working on this week.
Dean: So the general idea, and I could do this as well, is to take and just list all the competing priorities that I seem to have right now and put a time frame on it, like the next 90 days.
Yes. I often find, when I get overwhelmed like that, I'll make a list and I'll say, have I had this idea for at least 90 days, and is this still going to be a good idea in 90 days? That's one of the comparisons that I make, right. Is it something that is fleeting and only right now, or is this something persistent and durable? And that helps a lot. Which one can I have the biggest impact on in the next 90 days? Yeah, and then you're saying take three of those, and it doesn't matter which.
Dan: Three, and then just do a triple play on those, and then the sense of overwhelm associated with all three of them will go away, because they're competing with each other, and the problem is our brain can only focus on one thing at one time.
Dean: That makes sense actually. Yeah, yeah, yeah.
Dan: So, for example, in the triple play, where you take two arrows, you've now taken two priorities and made them into a single priority; that is, I'm going to take these two priorities and create a single priority out of them, so your brain can focus on combining them, because it's just one thing. So anyway, I'm playing with this, because I think every brain is different and every life is different, and the problem is that you're overwhelmed because you can't give full attention to any one of the priorities.
Dean: That is true. Yeah, that's where all the frustration happens.
Dan: So I would say one of your priorities and this is ongoing is to enable Charlotte to become more and more useful to you. That's a really important priority, I agree, yeah.
Dean: I agree. Well, there we go.
Dan: Well, what have we clarified today?
Dean: Well, I think I'm immediately going to do the top-priority triple play of the coming AI opportunity: to just focus on what I can do in the next 90 days here to increase the effectiveness of my relationship with Charlotte. That makes the most sense. What can we do this quarter, and then a layer on top of that.
Dan: But don't develop a second Charlotte. Then you're in real trouble.
Dean: I need to have one lifetime monogamous relationship with my one true Charlotte.
Dan: I think this falls somewhere in the realm of the Ten Commandments.
Dean: I think that's fantastic, Dan. I love it, you know.
Dan: That's what wisdom is. Yeah, wisdom is good forever.
Dean: That's what distinguishes wisdom.
Dan: Alrighty. We'll be in Arizona on Tuesday, and I'll be at Canyon Ranch next Sunday, so if you're up to it, you can do it at 11, but I'll do it at 8. Okay, actually, they're only two hours back now, so it'll be 9. Two hours, so I'll do it at nine o'clock. Okay, great, I'll talk to you next week then. I'll be seeing you. That's right.
Dean: That's right, okay, bye, bye.