Cameron explains his MANIFESTO FOR AUSTRALIAN AGI, Steve explains why he supports the Australian government’s ban on social media for kids, and they discuss NVIDIA’s downloadable local LLM.
FULL TRANSCRIPT
FUTURISTIC 33
[00:00:00] CR: Welcome back to Futuristic. It is episode 33. It’s the 29th of November 2024. Steve Sammartino, welcome back.
[00:00:16] SS: Reilly.
[00:00:19] CR: We have, uh, some big things to talk about, Steve. Big ideas that I want to get into today about the future of AGI in Australia. Social media bans, personal AIs from NVIDIA, but before we do that, why don’t you tell everybody how you have been futuristic since our last conversation.
[00:00:44] SS: One was with VUT, and the brief that I got was, you've got to help convince the university that we need to embrace AI more, which is kind of crazy and interesting. You know, they're still mostly focused on stopping people cheating or using the tool in some capacity. And I just had two things to say.
[00:01:11] SS: I said, not using it is insane. And I said, I don’t know the answer, but I know the question. And the question is, how do we help our curriculum rise above the AI? Because that’s what we do with every tool. You’ve got to rise above it so that you’re working on top of it. Now, of course, you need to know what good looks like.
[00:01:30] SS: So there's that. So that was the one thing that was interesting. And I did a workshop with Cbus Property, which is the property arm of Cbus, the big building-industry superannuation fund. It went really well. The workshop was on MVPs, Minimum Viable Products, and I want to say that for any company trying to innovate, or a startup,
[00:01:50] SS: MVPs are still the ticket. That is still the secret sauce. And it's really, really hard to do. You know, I really had to thin down some of the ideas for these guys, who just cannot think small enough to get momentum. And that's an insight that's been true for 20 years, and I can't see it ever changing.
[00:02:09] SS: If you do something small, you can find a path to success and iterate. And I was really driving that home.
[00:02:17] CR: Think small.
[00:02:19] SS: Think small.
[00:02:19] CR: Think small, 2024. Well, I've done a lot of coding, uh, as usual, just coming up with ideas and coding stuff. Um, just trying to automate a bunch of things. Like, I'm still trying to automate the, um, investing checklist that I do each week. So it'll be one click of a button and off it'll go. I had some success with that this week.
[00:02:39] CR: Did a first phase of it that worked. First test worked. Now I’m trying to expand. And again, think small. So it’s like, I wrote all the scripts around one stock. That I had the data for. I said, right, generate a score for this stock. Got that working. Now I’m like, trying to give it a list of stocks and get it to do scores for a list, like 10.
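(For listeners who code, here's roughly what that "prove it on one stock, then scale to a list" approach looks like as a minimal Python sketch. The scoring rules, field names and tickers are invented for illustration only; they're not Cameron's actual QAV checklist.)

```python
# Hypothetical sketch: build the single-stock scorer first, then map it over a list.
# All metrics, thresholds and tickers here are made up for the example.

def score_stock(metrics: dict) -> float:
    """Turn a handful of invented checklist metrics into a single score."""
    score = 0.0
    if metrics.get("pe_ratio", 999) < 15:
        score += 1
    if metrics.get("dividend_yield", 0) > 0.04:
        score += 1
    if metrics.get("price_to_book", 999) < 1.5:
        score += 1
    return score

# Step 1: prove it works for the one stock you already have data for.
one_stock = {"ticker": "ABC", "pe_ratio": 12.0, "dividend_yield": 0.05, "price_to_book": 1.2}
print(one_stock["ticker"], score_stock(one_stock))

# Step 2: only then wrap the same function over a list of 10 (and later 200) stocks.
watchlist = [
    one_stock,
    {"ticker": "XYZ", "pe_ratio": 28.0, "dividend_yield": 0.01, "price_to_book": 3.4},
]
scores = {s["ticker"]: score_stock(s) for s in watchlist}
print(scores)
```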
[00:03:01] CR: And then I'll give it 200 and see if it still works. But it's just, yeah, starting off building it piece by piece. But, um, yeah, I think that's, that's mostly it from a technology perspective. Just a lot of using the latest coding tools, and it's frustrating. You know, I've been using Claude 3.5 Sonnet, which a week or two ago was freaking amazing.
[00:03:28] CR: The last couple of days just really sucked. Uh, and I see people talking about that on Reddit. It’s this constant frustration where these tools, uh, have weeks where they’re just like genius level, amazing one shot solutions for everything. And then all of a sudden they’ll just be dumb and, you know, like they’ve had half their brain removed.
[00:03:50] CR: They’ve been lobotomized.
[00:03:51] SS: Why do you think that is? What do you think
[00:03:53] SS: causes that?
[00:03:55] CR: I think they're... I don't know. I see lots of theories, but it seems to be some version of: they've got a certain number of GPUs that they're using to build their frontier models and maintain their existing models. They're quantizing things. They're trying to, um, save money at different stages. They're moving the, you know, we've got 50,000 GPUs applied to this.
[00:04:23] CR: We're going to take 10,000 and move them over to this thing, and so this thing becomes dumber. I don't know. They're just messing around on the back end as they're continually building new models, training new models, trying to save money, trying to, I think, optimize the algorithms as well. Like, okay, if it's costing them a dollar to give you an answer that they're charging you a cent for, they're trying to figure out ways to reduce the cost of that by optimizing things, and it doesn't always work. There's a lot of experimentation. Like, a lot of people get really shitty on Reddit: um, I'm paying you my $20 a month and this service is terrible.
[00:05:06] CR: And I’m like, dude, you have an artificial intelligence for 20 bucks a month. Like shut the fuck up.
[00:05:12] SS: The greatest tool ever invented, and you have it for less than the price of a pizza?
[00:05:20] SS: Look, let's complain here. We've got the most incredible AI, mostly free, or the super version for 20 bucks a month. You should so be complaining. Your lot in life is horrible.
[00:05:30] CR: Yeah. Shut the fuck up and stop complaining. Yes. Yeah. It’s going to be up and down. I just accept it. It’s going to be good days and bad days, but you know, the good makes up for the frustrating things. Um,
[00:05:42] SS: Those are plausible ideas. The one that, uh, all of a sudden, let's say, a few people get onto the new version, there's a couple of tabs open, next thing you know it's amazing and there's thousands of tabs open and the machine slows down. I mean, that's highly plausible. The idea that, you know, GPTs are generative pre-trained transformers is a big one, because one of the things I see on Reddit is often a new version has, let's call them scary capabilities, where they do things that are a little bit, oh my God, wow.
[00:06:11] SS: And they throttle those back or change them to an extent, which, knowing the way the algorithms work, that they're imperfect and it's not an if-this-then-that protocol, could affect other parts of the system, the answers that it gives. So those are really plausible, and the cost one, the third one that you mentioned, that's another great one.
[00:06:31] SS: Gee, this is great. We want to launch and show everyone how amazing it is. But now that everyone knows how amazing it is, let's just, we've launched the plane, let's pull back the throttle and save a bit of fuel in cruise mode.
[00:06:45] CR: A couple of other things. Uh, I know you introduced me to an AI video editing tool this week,
[00:06:52] SS: Yeah.
[00:06:53] CR: uh, that can take your video and just turn it into TikTok clips and give you ratings on how engaging they are. I ran my investing podcast through it and it gave them all very low engagement ratings. So I didn't, didn't post any of them, but it's an interesting, uh, tool.
[00:07:12] SS: I'll, I'll, I want to talk about that, because one of the things that, you know, I noticed, it's called Klap, klap.app for listeners, and it is a really great way to get shorts. It can also be used just to generate one of the most accurate forms of, uh, captions that I've seen. It's really good. And it actually has an editor inside the browser. So if you haven't got software like iMovie or the Apple software, it actually is a pretty handy little piece of software to do the editing in. Klap with a K, by the way, like Ku Klux Klan. Klap on your own there, Cameron, I'll just go klap with the guys.
[00:07:55] SS: Uh, what I noticed is that it does a generic version of it. It gives a summary when you run something through it, and it'll say, uh, whether or not it thinks it's, uh, a viral clip, and it gives it a score out of a hundred, and it'll say this is viral because it's got a good hook or it's quite funny, and so on. And it actually does get it contextually.
[00:08:15] SS: What it doesn't take into account is your audience typology, because it doesn't really know the context of where you're posting it. And I've had a couple that had ratings of 25 out of a hundred that did better than ones that had ratings of 75 out of a hundred that I posted on my TikTok, uh, which I thought was interesting.
[00:08:34] SS: So it also, it chooses some that are good, but you're better off, I still think, having a look at it, seeing the chunk that it chose, and going, actually, I'm gonna extend this a bit, cut that a bit, and make it start earlier or finish earlier. It still needs a bit of a human eye on it, but I tell you what, it really can save you an hour and a half going through a live video to find the good bits.
[00:08:57] CR: Yeah, and I think it’s just a good example as to where we’re heading with these tools where, you know, it will just do all of the editing for you. It’ll be all done on the fly. The AI will do all of the editing, puts editors out of a job. It just makes it easier for everyone to produce more and more engaging content.
[00:09:19] SS: Yeah, so we'll eat our own dog food on this one, Cam, and we will see what it picks, and we'll see if it picks itself. We'll see how smart this AI is. If this AI is as smart as it thinks it is, it should pick itself and showcase Klap exactly. This is the test of tests, Cam. Might be the greatest thing we've done
[00:09:39] SS: on The Futuristic to date.
[00:09:42] CR: I think that's when it becomes self-aware, when it starts to pick stories about itself to promote. Um, on our last show, I said that I thought Elon Musk was going to be Trump's AI czar, and that he would be, um, you know, determining the future of AI. A story came out after that, he must have been listening, saying President-elect Trump is considering naming an AI
[00:10:05] CR: czar in the White House to coordinate federal policy and governmental use of the emerging technology, Trump transition sources told Axios. Elon Musk won't be the AI czar, but is expected to be intimately involved in shaping the future of the debate and use cases, the sources said. Now, uh, I didn't expect him to actually be appointed the official AI czar, but I do expect him to have a lot of influence over how that runs and the policies that they implement.
[00:10:40] CR: And, uh, this goes on to say that, um, Musk, who owns a leading AI company, xAI, has feuded publicly with rival CEOs including OpenAI's Sam Altman and Google's Sundar Pichai. Rivals worry Musk could leverage his Trump relationship to favor his companies. By the way, I listened to, um, do you know, um, Diamandis, Peter Diamandis?
[00:11:07] SS: I do. The, uh, he's a Singularitarian from the Kurzweil school. He does the, uh, Abundance 360.
[00:11:12] CR: Hmm. Do you listen to his podcast at all or watch his YouTube?
[00:11:16] SS: No, I don’t. I’ve seen a lot of his
[00:11:17] SS: YouTubes and his talks.
[00:11:20] CR: I saw two things of him this week. One was a chat with Musk from a Singularity conference in Riyadh that he was hosting just before the election. And that's where Musk said he thinks we'll have AGI by 2025, 2026 at the latest. And he also talked about a future of abundance. He said if we get this right, it won't be a universal basic
[00:11:46] CR: income. It'll be a universal, uh, a universal luxury income. You'll basically have everything you need. It'll be full abundance. But then I also watched, uh, Diamandis do an interview with the Mooch, Scaramucci.
[00:12:03] SS: Scaramucci.
[00:12:05] CR: And it was, the first half of it was really good. The Mooch was talking about the fact, he says, he said, I have a Trump decoder ring.
[00:12:11] CR: He was talking about Elon. They were talking about how long Elon's going to last with Trump, something we also talked about in that last episode. And Scaramucci famously, when he was Trump's communications director in the first term, lasted 11 days. And he said he calls that one Scaramucci. So the amount of time that you last working for Trump is, is measured in Scaramuccis, it's, it's counted in units of 11 days. You gotta love it, it is genius. And he said,
[00:12:40] SS: I love that more than Moore's Law. That's how much I love it. Wow.
[00:12:45] CR: He said he thinks Musk is going to last 30 Scaramuccis, so it's about a year. But then they were having the question about, if and when Musk and Trump have a falling out, who survives, who is the more powerful of the two, and I think we all probably know the answer to that question, and it's, uh, not the orange demon.
[00:13:06] CR: I think Musk is probably the one who comes out on top, so it'll be interesting to see what that looks like. But anyway, he was saying, he was talking about the fact that, uh, Musk is already on the outs with Trump. And he said you can tell that because he didn't make Musk the head of the DOGE, the Department of Government Efficiency, by himself, it's him and Vivek Ramaswamy.
[00:13:29] CR: He said so, he's already, Trump is already starting to white-ant Musk's power by putting two guys in charge. Now the way I'd read that is that Musk wasn't, he doesn't have time to actually run this thing. He's just going to swing in, probably via Zoom calls once a week, and say, yeah, fire this department, fire these 50 guys, and then let Vivek actually implement it.
[00:13:56] CR: Uh, Mooch, who's got the Trump decoder ring, is interesting. He said, there are 40 senior ranking members of his first administration that hate his guts now and want nothing to do with him and don't think he should be president. What makes you think the new people in his administration are going to have a different experience
[00:14:16] CR: to the first administration, who mostly all hate his guts, including his former vice president, and want to see him in jail? Um, yeah, he's like, it's, it's going to be exactly the same experience. He's going to, you know, shit all over everyone, piss everyone off, not listen to anyone. Anyway, it was pretty fake. But then the second half of the chat was them, all of them, Diamandis and Mooch and the other guy, pumping Bitcoin as the greatest investment known to man.
[00:14:45] CR: And Mooch was plugging his own, his new book on Bitcoin, The Little Book of Bitcoin. Now I, uh, am well on record for five years, longer actually, but since I started the QAV podcast, saying that Bitcoin is just nonsense as an investment. But I got his book and read it just to see what his arguments were, and he doesn't have any.
[00:15:07] CR: Um, there were no arguments, just the usual thing. It’s going up, so therefore it’s a good investment. Um, and the usual
[00:15:16] SS: Really interesting. Just on that, on the Bitcoin investment, here's my rule with investing: just because something is going up and you can make money out of it
[00:15:25] SS: doesn't make it a good investment. That is, that is the fundamental...
[00:15:29] CR: Doesn't make it an investment.
[00:15:31] SS: ...or even an investment. Exactly.
[00:15:33] CR: It's not investing. It's gambling.
[00:15:35] CR: Yes.
[00:15:37] SS: And I will die on that hill. And, and people have got rich gambling, on shares and on all forms of speculation, no doubt. And it will continue, but I would rather, I would rather get rich slow and get rich surely.
[00:15:55] CR: Yes. And be able to sleep at night.
[00:15:57] CR: Like, what I've learned from Tony over the years...
[00:16:00] SS: I was going to ask you Tony's opinion on it.
[00:16:03] CR: It's the same as Charlie Munger's and Warren Buffett's. Charlie Munger, before he died, used to call Bitcoin rat poison squared. Like,
[00:16:12] SS: That's such a Charlie.
[00:16:13] CR: Yeah, such a Charlie, love Charlie. Um, like the conversation Tony and I have had a lot, you know, the basic premise of investing is always to buy something for less than it's worth and wait for the market to readjust
[00:16:30] CR: And to value it correctly. Uh, so you need to do that. You need to know what the intrinsic value of something is to know if you can buy it at less than its intrinsic value. So the question I always have for Bitcoin people is explain to me how you calculate the intrinsic value of one Bitcoin today. And then I can tell you what it’s worth.
[00:16:50] CR: And they can’t do that because it has no intrinsic
[00:16:52] SS: Well, it's the same with all currencies. And this is the point. And yes, people speculate on currency and have done for millennia, but no, no investor would tell you that currency is an investment. Even people on Wall Street, they say, well, it's a speculative tool and we use it and we can make money out of it.
[00:17:10] SS: And we're happy to click tickets along the way. But Bitcoin is not a currency nor an investment, because it doesn't pass the six rules of a currency.
[00:17:18] CR: And neither’s gold.
[00:17:20] CR: They always say Bitcoin is digital gold. And gold, you know, again, Buffett and Munger say gold isn't an investment. It's, it has no intrinsic value. It's just a pretty...
[00:17:30] SS: It can be a store of value, as can grain, as can a barrel of oil,
[00:17:36] CR: At least you can eat grain.
[00:17:37] SS: That's right, and you can
[00:17:38] CR: burn oil.
[00:17:40] SS: Really, gold has some micro uses, you know, small electronics and other bits and pieces, and jewelry, but it's, it's, again, it goes back to the Yuval Noah Harari thing, that money is a myth that we all collectively buy into because we need a tool for trade.
[00:17:57] CR: Anyway.
[00:17:59] SS: Moving on.
[00:18:00] CR: Let's move on.
[00:18:01] CR: Steve, I wrote a manifesto this week.
[00:18:04] SS: I don't think you write enough manifestos. What is it on?
[00:18:10] CR: one thing I’ve always said about myself is I don’t write enough manifestos. I need to write more manifestos.
[00:18:15] SS: You need to as well. Are you going to go live in a shed in the woods? I'm just asking. And do you know how to...
[00:18:22] CR: I've got the crazy hair, I just need to grow the crazy beard and I'm good to go.
[00:18:27] SS: And, and the post office now, you know, only delivers every second day, Cameron, just to let you know, for when you're mailing off to some universities.
[00:18:36] CR: Um, you know, Chrissy’s nickname for me is already The Wiz because she says I have crazy wizard hair. So, uh, yeah,
[00:18:43] CR: the Wiz. She's like, hey, hey Wiz. Um, let me, let me read the preamble. Crossroads. Australia stands at a pivotal moment in history. The rise of Artificial General Intelligence, AGI, and robotics offers an extraordinary opportunity to build a future of abundance where work
[00:19:05] CR: as we know it becomes optional, technology eliminates scarcity, and everyone has access to the essentials of life. But who will shape this future? Who will decide how this technology is used? It's been over two years since the release of ChatGPT by OpenAI in November 2022, and since then AI models have continued to improve at a rapid rate.
[00:19:33] CR: Many senior industry leaders are predicting AGI could appear as soon as 2025. Leaders in the field of robotics are predicting that by 2040 there could be a billion bipedal robots in society, and they will cost roughly the same as an entry-level motor vehicle does today. In the last two years, our government and corporate leaders have done very little to prepare Australia for what's happening.
[00:20:00] CR: If we leave the preparation to governments and corporations, we risk walking into a nightmare. The same governments that failed to prepare for the COVID-19 pandemic, despite decades of warnings from scientists, will fail to prepare for the AGI revolution. During COVID, we saw the consequences of short-term thinking, poor planning, and a political system more focused on protecting profits than saving lives.
[00:20:27] CR: Hospitals were overwhelmed, supply chains collapsed, and millions of people were left to fend for themselves. Now imagine the same incompetence on a larger scale, with AGI and robotics disrupting every job, every industry, and every aspect of society. Corporations, driven by profits above all else, will use AGI to maximize efficiency at the expense of workers.
[00:20:53] CR: Entire industries will disappear overnight, with no safety nets in place for those displaced. Housing, healthcare, education, and even food could be controlled by algorithms designed to extract wealth, not serve humanity. Without intervention, this technology will deepen inequality and concentrate power in the hands of a small elite, while millions of Australians are left behind, struggling to survive in a system that no longer values their existence.
[00:21:22] CR: But there is another path. A path where we, the people, take control of our future, where AGI and robotics are used to empower communities, not exploit them. A society where technology guarantees universal access to housing, healthcare, education and sustainable energy. A society where every person’s dignity is respected and every community has the tools to thrive.
[00:21:49] CR: Governments have shown us time and time again that they cannot be trusted to act in the public's best interest. If we leave this to the politicians and the corporations, we know what will happen. They will act too late, too timidly, and in service of the powerful. It is up to us to seize this moment and take responsibility for shaping the future.
[00:22:10] CR: We envision an Australia built on dignity, equity and abundance. A society where the benefits of AGI and robotics belong to all of us, not just the few. But this future will not happen by accident. It will only happen if we organize, act, and build it ourselves. The stakes could not be higher. The future is ours to create or ours to lose.
[00:22:36] CR: Let us rise to this challenge together.
[00:22:39] SS: By the way, I very much liked it. It was getting a little bit Unabomber until the end, when you said there is an alternative. And then I breathed. I breathed, I said, I'm on a podcast with this guy, I'm not gonna have to explain his letters in some true crime documentary. And that's a good thing for me. Or is it? I really liked it, and I wrote a couple of notes when you were reading the first half that sprung to my mind.
[00:23:05] SS: Based on what you were writing. Would you like to hear them, Cameron?
[00:23:09] CR: That’s why we’re here, Steve.
[00:23:10] SS: Okay. The first thing that I wrote was that the real question is, can and will we all have access to these tools of abundance? And then when I wrote that, I thought about, uh, the abundance that we have now, right? Because you could argue, and I'm just about to go to ChatGPT now and ask it, and you might even be able to do this for me if you've got it open, Cameron, how much wealth is there in the world per person.
[00:23:43] SS: I think it's around about a hundred grand, and in America it might even be over a million. Can you check out those two numbers? Because we could argue, despite it not being well distributed, and a lot of people in underdeveloped parts of the world, the food and the wealth and the electricity and the clean water, they don't all have that yet.
[00:24:06] SS: So that's one thing. Um, but we already have incredible abundance now, where there are more than enough calories and energy to feed everyone. Everyone can be housed. There are, there are enough, uh, homes, certainly in Australia, there's a lot of empty homes, uh, there's an incredible number of resources, uh, and they could be distributed if we had some sort of a system which was not capitalist, you know, potentially some sort of a communist system, or where wealth is distributed so everyone has dignity and access to the resources they need to live a healthy and, you know, moderately well-off life. But I feel like the monopoly on violence that governments have, and the rules and taxation that governments put in place, even though they redistribute some of the wealth, really is just part of the capitalist system, where capture and control of resources is already there.
[00:25:02] SS: And if we didn't have capture and control of resources, we could already feed, house, clothe, educate and give medical care to everyone, because we've certainly got the resources. And where we had corporate control, now we're just going to have another form of algorithmic control. And the legal system might thwart people from having true control over their robot, like an Apple phone, where it can be bricked if you break some of the laws.
[00:25:28] SS: Uh, it seems as though we've got an oligopoly of companies that control the AI, and even though the AI could potentially lead us to abundance, it's controlled by the same corporations, and I can't see them distributing their wealth to create the level of abundance in the back half of your essay, as much as I'd like that to be the case.
[00:25:47] CR: I’ll answer that in a second, but let’s go to GPT.
[00:25:49] CR: The global wealth per adult averaged approximately USD 84,718. However, this figure doesn't reflect the stark disparities in wealth distribution worldwide. The median wealth per adult, a more accurate indicator of typical individual wealth, was around USD 8,360, highlighting significant inequality. In Australia, the average wealth per adult stood at USD 546,184, with a median wealth of USD 261,805, indicating a relatively equitable distribution compared to global standards.
[00:26:31] CR: These figures underscore the vast differences in wealth both between and within countries, emphasizing the importance of considering median values to understand the economic reality of the average person.
[00:26:42] SS: Averages matter more than medians here, Cameron, and the reason is we're talking about if we distributed it evenly. Right? So, so, so the averages are more important than the median in this case, because the median is more about how, how well distributed is it? This is a question of if we split up the resources and gave everyone access to abundance.
[00:26:59] SS: I'd love to know what the American number is. I reckon it'd have to be over, it'd have to be over a million per person, surely, wealth per person in America. Because if Australia is, you know, 500 grand, yeah, you'd have to think America would be at least that, maybe more.
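(A quick aside on why the average and the median tell such different stories here: a few very large fortunes drag the mean far above the median, and the mean is the "share it all out evenly" number Steve is after. The figures in this little Python sketch are invented purely to show the gap; they're not real wealth data.)

```python
from statistics import mean, median

# Invented wealth figures (in USD) for ten hypothetical adults:
# nine modest holdings and one very large fortune.
wealth = [5_000, 8_000, 10_000, 12_000, 15_000, 20_000, 30_000, 50_000, 80_000, 5_000_000]

print(mean(wealth))    # 523000.0 -- what an even split would give each person
print(median(wealth))  # 17500.0  -- what the typical person actually holds
```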
[00:27:16] CR: Um, interesting, getting back to the, getting back to the,
[00:27:20] CR: the chat that Elon was having with Peter Diamandis, um, they were talking about it being the end of capitalism. And Elon, which I thought was fascinating, the richest man in the world, was saying, yes, this is going to bring about, if we get this right, this will bring about the end of capitalism as we know it, because everyone will have everything that they need, uh, for no cost, everything will just be provided to you.
[00:27:45] SS: Are we talking robotics creating things and extracting resources, or are we talking about nanobots creating a BMX bike from scratch, organizing things at the molecular level?
[00:27:54] CR: He didn't get into that level of detail, but yeah, all the stuff that Kurzweil talks about and Diamandis talks about in his books. In my, um, manifesto, I refer to it as Universal Basic Services, which isn't a term I came up with, but it's something that has been growing, I've been seeing people start to talk about it.
[00:28:16] CR: UBS instead of UBI. I wrote: We envision a society where everyone has access to the essentials of life, guaranteed as human rights. It should be a future of dignity, equity, and abundance. Now, I agree with you, and Tony and I have had this debate for the last six months. His view is that the establishment, the powers that be, won't allow this future to happen.
[00:28:43] CR: They won't allow AI and robotics to achieve this vision because they won't give up control. And I agree with that in part. I think that's basically, this is what I talked about in The Psychopath Epidemic to a large extent. If you're in a position of power and wealth and you don't want to give it up, you're going to use your power and wealth to keep your power and wealth.
[00:29:07] CR: Uh,
[00:29:08] SS: It's an emotional construct, mostly, was what I took from the book.
[00:29:13] CR: What, what's the emotional construct? The idea of keeping wealth and power, because you've got abundance, and you've got way
[00:29:21] SS: more than you need in a lot of those circumstances.
[00:29:24] CR: You know, why would you give it up willingly? You’re not. You’re going to fight to keep it. And you’re going to use it to maintain control by manipulating or influencing the political system, the legal system, the system of media and education and all the things that influence the people because, you know, the 99 percent are always more powerful than the 1%.
[00:29:48] CR: The 99 percent can rise up at any time and overthrow the 1%.
[00:29:52] SS: And the 1 percent know this, they know.
[00:29:54] CR: And they know this, and they need to... The 99 don't though. The 99 actually don't. You know, 5 percent of the 99 know it, because the 1 percent have spent a fortune telling the 99 percent that violence isn't the answer.
[00:30:09] CR: Satyagraha, non-violence, think about Gandhi. Gandhi was a great man, he was non-violent. Did Gandhi get rid of the British from India? No, not really, it was the violent Sikhs who rose up and got rid of Britain.
[00:30:27] SS: I didn't know that. But there's a great show that I'm going to sit down and watch called Mr. Inbetween. I haven't seen it.
[00:30:32] CR: Oh, oh my god, the greatest show ever to come out of Australia by a land mile. I can't believe you haven't seen it.
[00:30:51] SS: I've never heard a bad report. Now, I've seen clips and I love it. Don't, don't spoil it for me, because when I have my last gig this week, I'm going to sit down joyously and just binge the whole thing.
[00:30:52] SS: But there's one scene that I saw, I've seen a few of the scenes, enough to have a taste that I know I'm going to love it.
[00:31:00] SS: And, uh, in one of the scenes, he’s sitting around in some round table, um, so he doesn’t have to go to jail. And he’s talking about a situation where he had an altercation with someone and was violent towards them. And the, uh, person sitting with all of the attendees to help with their violence, he said, can you give me an example, Ray, of a time where violence has actually worked?
[00:31:25] SS: And he said, World War II? I mean, he said, you know, we didn’t just negotiate with the Nazis. I don’t think they would have left. And I’m like, Oh yeah.
[00:31:37] CR: It, like, the story, do you know about the story behind it?
[00:31:40] SS: I do. And I'm watching, I'm halfway through, uh, The Magician. I'm halfway through that.
[00:31:41] CR: Oh, where did you find that?
[00:31:47] SS: On YouTube.
[00:31:47] CR: No, really?
[00:31:47] SS: Yes. And I'm halfway through and I'm loving it. And I know the story about how he made the five-minute film, then the longer version at the St Kilda Film Festival. And then some Australian kind of actor, one of the guys, I forget his name, sort of put some money behind it.
[00:32:04] SS: And then it got picked up in the US.
[00:32:05] CR: Yeah. So, for people who don't know the story...
[00:32:12] SS: I know this is by the by, but fuck it. Um, it's media, and it's new forms of media, and it's all that, Cameron, just a great story.
[00:32:24] CR: So Scott Ryan, the creator and the star of the show, is a, um, oh man, like. The story is,
[00:32:26] SS: Don't ruin it for me.
[00:32:29] CR: No, not the story of the show, the story of him.
[00:32:30] SS: Of the thing, yeah. Um, he has never done anything in his life. I've seen him talk about it.
[00:32:37] CR: He grew up in, grew up in Melbourne, um, Richmond, um, and didn't really do well at high school. Um, you know, they think he had sort of some disorder, oppositional defiant disorder or something like that. Um, his mental health declined at 17. He had agoraphobia, and basically lived as a recluse, never left home, didn't do drugs, but lived with people who did a lot of drugs, and, you knew this world a bit.
[00:33:12] SS: A bit.
[00:33:14] CR: But, you know, he basically just didn't leave home for 12 years, didn't have a job, didn't study, didn't do anything, practiced, as a recluse,
[00:33:21] CR: yoga and tai chi and meditation, didn't drink, didn't do drugs, just did nothing. And then he started to think about, he goes, oh, I got to do something. And the only thing he ever enjoyed at school was writing. So he tried to write something, and he was like a fan of The Sopranos and things like that. So he tried to write a film about a, you know, criminal.
[00:33:47] CR: And he made The Magician, um, then he did start studying filmmaking at RMIT, shot it, I think he spent like $3,000 on it, in 2003, low budget, and it got attention at the film festival, noticed by Nash Edgerton, Joel Edgerton's brother.
[00:34:05] SS: I knew it was one of the Hollywood Aussie guys.
[00:34:07] CR: They helped re-edit it for a theatrical release, and it was released as a film.
[00:34:14] CR: And then, um, nothing happened with it. He said, we're going to, we're going to get this, you know, we're going to get a budget for it. We're going to get it picked up. We're going to turn it into a series. This is like the early 2000s, right? Nothing happened. They couldn't sell it. They spent like 15 years trying to sell this, to get it picked up.
[00:34:37] CR: No one would touch it. And so he did nothing again for the next 15 years.
[00:34:37] SS: Yes, because I've noticed how young he is in the movie, and it's like, what happened? I was actually calibrating.
[00:34:49] CR: So, uh, he just does nothing for another 15 years. Finally, they're able to sell the show to a US, uh, network. I can't remember which one. Oh, FX picked it up, yeah, in 2018. And they make this series, which he writes and stars in. Nash Edgerton directs, and they have some other directors, but Nash... and Nash Edgerton appears in it as well.
[00:35:16] CR: He plays a minor character, and his daughter, Nash Edgerton's real-life daughter, plays Ray's daughter in it, and she's fantastic. But it's just, it is like the greatest... this guy, he, he is a fucking genius. The writing is brilliant. The stories are brilliant. The acting, he's terrifying. He's basically a hitman, a low-end hitman, an enforcer.
[00:35:40] CR: He is just fucking brilliant. But he's never done it. This is the only thing he's ever done. And then he did like three seasons and said, I ran out of stories, and he stopped. And they're just like...
[00:35:54] SS: Good. Stop.
[00:35:54] CR: And the final episode, like the final scene of the final episode, Chrissy and I still talk about, it was just, it was like perfect.
[00:36:06] SS: One of my boys, Scott Kilmartin. Scott Kilmartin is a massive fan who got me onto it. Big Scotty.
[00:36:06] CR: Oh, really? Yeah. Wow. That's, that's my... Yeah. Tony Kynaston got me onto it too.
[00:36:06] SS: Oh, I'll watch it next Thursday night.
[00:36:15] CR: I'm je... I'm jealous, man. You gotta, uh... It's like, that's actually one of the great things about any form of content, or, you know, discovering a band or a movie or whatever. And you say,
[00:36:25] SS: I'm jealous, 'cause you get to experience it for the first time.
[00:36:30] CR: Yeah. Wow. That was a pivot, wasn't it?
[00:36:34] CR: How did we get onto that? Um, we were talking about violence. And in the interim, I did look up the US, and it's really, really similar to Australia in terms of the amount of wealth per person.
[00:36:48] SS: It's, uh, what was it over here?
[00:36:54] CR: It's 579,000 USD.
[00:36:54] SS: There you go. That's probably about 700,000 AUD.
[00:37:01] SS: Yeah.
[00:37:03] CR: So anyway, getting back to the subject. So my basic idea is that I think we need to start a movement.
[00:37:12] SS: Really? Like, are you serious when you say that?
[00:37:12] CR: Yeah, because I don't have any confidence that our governments or our business leaders are going to manage this.
[00:37:12] SS: Well, let's, let's think about it as well, because I think, and I do this a lot in my career, and it's, it's been 20 years, is that we have so often been here before. And I think it is fair to say that the industrial revolution has brought about significant abundance and possibility. We just went through the numbers and we could have fed everyone.
[00:37:54] SS: We could have housed everyone and we could have given everyone access to healthcare with the industrial revolution. And we can now, we’ve just seen those numbers and yet we haven’t. And I would be flummoxed because in the transition, it’s not like you arrive at UBI day.
[00:38:13] SS: We don’t, we don’t arrive at abundance day.
[00:38:16] SS: It kind of creeps up. It's a, it's, it's a little bit like Billy Gibson, you know, William Gibson. It's like the future has already arrived, it's just not equally distributed.
[00:38:25] CR: Well, that's kind of... they're the members, they're the guys in ZZ Top, Billy Gibson.
[00:38:35] SS: Oh, is it a William Gibson, Billy Gibson? I didn't know that. Fluffy guitars, give them a spin. Who knew?
[00:38:37] CR: Billy Gibbons? No. Billy, Billy Gibbons is, uh, ZZ Top.
[00:38:43] SS: Oh, there you go, I didn't know. Well, William Gibson, I was pretty sure I had William Gibson right, but anyway, uh, so, I just can't see it. I can't see it happening
[00:38:53] CR: So here's my, my...
[00:38:53] SS: Without a revolution, without a real revolution, and I think blood would need to be spilled.
[00:38:59] CR: Okay. So maybe. But my, the, the way I mapped it out is what we need to do. First of all,
[00:39:11] SS: Have you mapped out the revolution?
[00:39:13] CR: No, no, I haven't mapped out the revolution.
[00:39:16] CR: I basically see us building together, putting together a movement, where we bring together the best and the brightest from around the country and we build a shadow government, basically. So we figure out, we put together a vanguard,
[00:39:36] SS: Is it bad that I'm laughing at this? Is it bad that I'm laughing at this? Keep going.
[00:39:42] CR: You're the first person I've talked to about this, man. Like, I need you on board.
[00:39:47] CR: We put together a vanguard of experts in public policy, technology, urban planning, economics, community organizing, voices from marginalized communities to make sure it's inclusive and equitable, and then we create portfolios. We have ministers or working groups that focus on figuring out the vision for using AGI and robotics in things like housing, healthcare, education, energy, technology, and we develop detailed, actionable policies for implementing universal basic services, develop a strategic blueprint that could be picked up and rolled out by a future government, because they won't be doing this effectively
[00:40:36] CR: by themselves. Somebody needs to be building this out. Somebody needs to put the blueprints into place so they can be rolled out by a future government, whether it's one of the existing political parties or one that we create ourselves, because there's no longer any faith in the existing parties to guide us into this future.
[00:41:01] CR: Like this is, we’ve talked about this for the last year on the show, but we’re, this is the biggest thing that’s ever happened to humanity. AGI robotics, the AGI revolution, the artificial intelligence, and people say it’s the last human invention and they’re probably right.
[00:41:15] SS: I agree
[00:41:16] SS: too.
[00:41:17] CR: This could be
[00:41:19] CR: the greatest time for humanity, or it could be the end of humanity, or anything in between, right, in the next 10 years.
[00:41:28] CR: I don't trust our politicians to run it effectively. I don't trust our corporate leaders or our religious leaders or, you know, any of our institutions to manage this process, policing, legal, et cetera. We need to put together the best and the brightest from around the country to lead us through this. And there's plenty of, really, people way smarter than me out there.
[00:41:52] CR: I, I'm not smart enough to do this, but I think we need to bring together the smartest people who actually know how housing and healthcare and education and energy and technology, whatever, work at a ground level, what the challenges are, what the implications of things are, to use AGI and robotics and nanotech to solve all of these problems.
[00:42:18] CR: What are the practical realities of trying to solve these problems? Um, then we need to secure the resources. You know, we need to secure funding and support to drive this thing. We need, you know, some combination of crowdfunding, major donors, ethical investors, in-kind contributions of people who want to donate land and technology and expertise to support pilot programs around the country.
[00:42:44] CR: How do we, how do we solve the housing crisis? You know, how do we start to build cheap, uh, housing that can house people that are sleeping in tents? We've got tent cities and caravan cities growing around Australia. How do we solve that problem, starting with the people that are urgently in need, for their children, of having housing, uh, near places where they can work and go to school?
[00:43:12] CR: Somebody needs to be... our governments aren't addressing the housing crisis. Somebody needs to.
[00:43:17] SS: If we have abundance, do they need to be near anything?
[00:43:22] CR: Well, you need to...
[00:43:23] SS: Do they need to work?
[00:43:24] CR: Well, wait a minute, not right now, right? So this isn't like, you don't flip a switch and you have abundance, right? We need to figure out how to use it to, yeah,
[00:43:32] SS: It's a...
[00:43:35] SS: The shadow government would need significant funding before you even did anything. That would be pretty significant funding. Good organization. You know, it's kind of like, it's not even a think tank. It's, yeah, shadow government, it's, it's pretty, it's a pretty involved...
[00:43:51] CR: well, not really. Not from day one.
[00:43:54] CR: You're building, you're, you're building strategic blueprints first. You're pulling together experts, building strategic blueprints. Then, as that's progressing, you're figuring, okay, how do we do pilot programs? How do we find some land where we can build cheap housing using the latest cutting-edge technology to design and construct 3D printed houses?
[00:44:17] SS: Interesting. So, my farm. I've got a 110-acre farm. Do you know how hard it is for us to get approval to build a second dwelling on it, because the law says only one house per 80 acres? And like, yeah, but we're trying to build another house to prove our technology so that we can build houses 50 percent cheaper.
[00:44:31] SS: Do you think the local government is going to help us? You wouldn't believe it, Cameron.
[00:44:34] CR: Yeah, Musk said on the Diamandis interview, he said it takes longer to get approval to build a new rocket than it does to actually build the rocket.
[00:44:49] SS: I think that's also the case with housing and many other things. One point that's really interesting, actually, when you mentioned the segments, obviously healthcare, housing, one of them was education.
[00:45:04] SS: And I would start there, and I'll tell you why. I'll tell you why I would start with education, because we already have the technology, and it's not AGI, but it's good enough in every category to have educational abundance right now.
[00:45:19] SS: Like, gentlemen, we have the technology to build him better, stronger, faster. Steve Austin, a man, half man, half machine, let loose to come and help a society, a society in trouble. I'm just making that up.
[00:45:43] CR: For people that weren't around in the seventies, or the early eighties, who dunno what we're talking about, you missed it. It was The Six Million Dollar Man.
[00:45:50] CR: Yeah, 6 million. Like, you can't even get a house in Brisbane for that now. Yeah, no.
[00:45:56] SS: I think education, so right now...
[00:45:57] CR: Yes, I would say that we have, let's call it, relative abundance with education.
[00:45:57] SS: I once spoke to RMIT University and they said, on the future of education, we want you to come in, we want you to rip the band-aid off, tell us everything you think, no holds barred, on what the future is, right?
[00:46:20] SS: The executive there were out in the Yarra Valley and I said, okay, the first, they said, I said, so you, you really want me to, they said, yes. I said, the first thing you need to do is to get rid of the ATAR and let everyone study. Any course they want to study, regardless of the course that they got, or the score that they got.
[00:46:41] SS: And they said, oh, we can't do that because they might not be able to pass the exams. I said, well, that doesn't matter. And they said, we won't be able to fit them in the building. I said, okay, if you didn't get the full score, you can do it from home and do the same exam. Well, we wouldn't be able to afford that. Every reason I gave them, we couldn't afford it.
[00:46:58] SS: I said, I had a look at your endowment. It's 5 billion. You're a hedge fund disguised as an educational institution. Like, what are you here for? Before I did that, I said, why do you exist? They said, for the betterment of society, for the education of people. And then I went, get rid of the ATAR and let everyone in.
[00:47:16] SS: No, no, no, no, no. That’s just not going to work. Seriously. They hated me so much over the next two hours. We had a family photo where we all lined up at the end of the session. And when I stood there, everyone stood two meters to the side and I’ve got the photo of me on my own, right? Crazy. And when we had lunch afterwards, I sat down at a table and everyone got up and changed tables.
[00:47:40] SS: This is a hundred percent the truth.
[00:47:42] CR: Oh, that's gold.
[00:47:42] SS: We could do abundance right now with education.
[00:47:47] CR: What's ATAR? What's ATAR?
[00:47:51] SS: ATAR is the score, which really is a false limitation on the amount of... your university entrance score, your entrance score, which mattered when you could only fit 30 people in the building. But no one goes to the lectures anymore. No one goes to the building and we don't need it.
[00:48:05] SS: So that's... that is just false scarcity.
[00:48:08] CR: Yeah. Just stream them online. You’ll give them an AI.
[00:48:11] SS: if they want to do it now. So, so already the challenges here are political, not technological. As you say, you need a movement, because we can do it right now with education, today, globally.
[00:48:19] SS: Yeah. So we go, fuck universities, we'll start our own university. Anyone can study any subject, anywhere, anytime. And we'll get the world's preeminent experts, or experts in this country. Here's your engineering guy. Here's your whatever, here's your whatever.
[00:48:34] CR: We'll just get the AI to do most of it, yeah.
[00:48:36] SS: Exactly. Within a year.
[00:48:38] CR: Yeah. And we’re going to build a system for how you start from knowing nothing about medicine to being a doctor with AI teaching you everything, you know.
[00:48:48] SS: and, and so, and, and the AI robotics. So the robot will show you how to do the surgery and the, all of that sort of stuff. And Cameron will, he will donate his body to science. He’s that type of guy, but you don’t only need a shadow government. You actually need shadow systems. So then you build shadow education.
[00:49:07] SS: And it's almost, it's kind of like any form of industry disruption, but it's happening a category at a time. It's kind of like what happened with media. They all went across to digital, and legacy media is a rounding error with Murdoch's just hanging on
[00:49:21] SS: somehow at 93 years of age.
[00:49:24] CR: One of the other pieces of low-hanging fruit is mental health access. You know, um, down in the pilot program section I've got: deploy AI-powered mental health platforms offering 24/7 free on-demand support for depression, ADHD, anxiety, and crisis intervention. Pair AI solutions with community hubs staffed by local volunteers and visiting professionals.
[00:49:47] CR: You know, trying to get kids diagnosed with ADHD in Australia at the moment, as everyone's heard, takes forever, costs a fortune. Depression, anxiety, trying to get people in to, uh, be able to see a therapist is absolutely ridiculous. And you know, it's deliberately manipulated by the industry bodies to make it as expensive and as difficult as possible.
[00:50:12] CR: I read a, there's a big article on this in Australia a month or two ago that I read, you know, the GPs are trying to get permission for GPs to be able to diagnose kids with ADHD, and the psychiatry or the psychology industry bodies are trying to stop it from happening, because some, some of their members are getting paid $900,000 a year, and I'm not just making that up, $900,000 a year, to work in AI, sorry, ADHD diagnosis specialty clinics, because people are willing to pay a fortune to get their kids diagnosed so they can get them on medication, and it's just a massive payday for psychiatrists and psychologists.
[00:51:05] CR: Yeah, I’ll send you a link.
[00:51:06] CR: It’s insane.
[00:51:09] CR: But we need to build solutions for this sort of stuff. Education, mental health, uh, et cetera, et cetera. But it’s not gonna, like, here’s my, my, my realization was that it’s just, we, we, this is too important. We can’t allow business as usual to shepherd us through this period.
[00:51:28] CR: They're not gonna, they're not gonna do a good job. We need to put together a shadow government to prepare the, prepare the ground for this stuff.
[00:51:39] SS: It seems to me very much like a, you know, a modern revolution. I don't know, you would know this better than me, whether the world has had relative peace for the last 50 years, I don't know if that's a bad statement or not, uh, since World War II. I don't know if, when you have large tech upheavals, that results in wars over resources and new systems of governance, not sure, but it certainly feels as though nothing other than a revolution would make this possible, unless we somehow managed to second resources to build out these things. And I just can't see the donors being there, because the donors are typically those with the wealth who are beneficiaries of the existing system.
[00:52:34] SS: The Saudis love to tap into...
[00:52:35] CR: Look, I agree with you. That's going to be a challenging aspect of it, but, um, you know, you'll find some of them out there that, that are thinking about their children and their grandchildren and want to see a better world built for them. Um, I think we'll, we'll be able to find, we'll be able to find people to fund it.
[00:52:54] CR: They may not even be domestic. Uh, you know, they may be international, tapping into sources of philanthropy.
[00:53:00] SS: That do some, some brand washing. They like to do some sort of greenwashing on their brand. The Saudis are big on that.
[00:53:07] CR: Well, it's not only that, they know that crude oil has got a shelf life, man, of, uh, how long they can make money out of it. Don't know exactly how long it's going to be, but they need to prepare for, they need to prepare for the days when they're not able to sell oil anymore. What are they going to do next?
[00:53:27] CR: Well, I had a final thought on this. It's just, it's just that, um, I started this process by, by thinking about what I call techno-communism. Um, you know, I, I've said on the QAV podcast recently that I'm a communist, and, um, people sent me some emails, surprised that a guy hosting a podcast about value investing is a communist. And what I've always said about myself and communism is that if I think about the world that I want my children to live in, it's a world where everybody is looked after, everybody has their needs taken care of, everybody, um, can have abundance and happiness and peace and prosperity and get to live their best life.
[00:54:20] CR: Which system of socio-economic cooperation that I've heard about or read about over the course of my life do I think has the possibility of delivering that? It's not capitalism.
[00:54:33] SS: Possibility is a good word too, because we've never really achieved it yet. I would say that's a fair statement. Not on a large scale, not on a large scale.
[00:54:41] CR: No system has achieved that. And capitalism, see, the difference between capitalism and communism is that communism has a vision that it's aiming towards, and it's a vision of a better world where everyone is taken care of. Capitalism doesn't have a vision. There's no vision for capitalism apart from the capitalists control everything.
[00:55:03] CR: That's the only vision. There's no... I've read Milton Friedman. I've read all of the foundational books on capitalism. Um, and there's nothing. There's no great vision for a better society.
[00:55:19] SS: But there is one book that defines what it is. I can't remember who the author is, but it's called Finite and Infinite Games. Right? And Finite and Infinite Games talks about the two. It's a really weird book. It's from the Stewart Brand school of crazy books.
[00:55:36] CR: Oh, I love Stewart Brand. Yeah. And so, uh, Stewart Brand, for listeners, was someone who, he ran the Whole Earth Catalog, is that right? Yeah, real sort of techno...
[00:55:48] SS: Utopianism, you know, in the early computer era. And he was part of that, that movement, and he's done some really interesting things. And he also was famous for saying, information wants to be free. But he also said, information also wants to be expensive, and the balance between those two things is sort of where we are.
[00:56:05] SS: Uh, Finite and Infinite Games by James
[00:56:08] CR: P. Carse, an American academic who was Professor Emeritus of History and Literature of Religion at New York University. A review of the book summarizes Carse's argument. There are at least two kinds of games, finite and infinite. A finite game is played for the purpose of winning, an infinite game for the purpose of continuing the play.
[00:56:29] CR: Finite games are those instrumental activities, from sports to politics to wars, in which the participants obey rules, recognize boundaries and announce winners and losers. The infinite game, there is only one, includes any authentic interaction from touching to culture that changes rules, plays with boundaries and exists solely for the purpose of continuing the game.
[00:56:52] CR: A finite player seeks power; the infinite one displays self-sufficient strength. Finite games are theatrical, necessitating an audience; infinite ones are dramatic, involving participants.
[00:57:06] SS: so it’s interesting. I would put capitalism as an infinite game because the people who are playing it just want to keep playing and just want to keep the power. When you read it, it’s kind of clear. And to me, communism seems like a finite game. The game is, is that we distribute and find a system to distribute the resources so that we all have them.
[00:57:27] SS: And more resources come in and out, but the game has an end. The end is everyone has access to everything they need. Whereas capitalism is this infinite game where I just want to win, and I will manipulate and change the rules to keep playing the game. And I want to be seen as in front of everyone else and maintain that power structure.
[00:57:45] SS: For me, it was the clearest thing. And I obviously had my economics head on it when I read it. But, um, for me, that was one of the ones that really defines economic systems. If you read it with that context in your mind, you can really see certain patterns. Uh, but I, I think you’re right. I think capitalism, uh, is, is, is about the game itself.
[00:58:06] SS: Whereas communism is about a sense of equality, which could or should be achieved, they feel like two different types of games.
[00:58:13] SS: And hence, I think that’s why there’s a tension
[00:58:15] SS: there. And people will always say, well, we tried communism, it didn’t work. And my response to that is,
[00:58:22] CR: my response to that is threefold. Well, no, we’ve never had communism. That’s another point. We, we, communism is the end point. You have to go through socialism to get to communism and the so called communist countries then and now didn’t get past socialism.
[00:58:40] CR: Um, but the other, the main three points are: number one, capitalism hasn't worked either. It's brought the world to the brink of destruction. Um, and, and there's greater economic inequality, et cetera, et cetera, than we've ever had before. Right now, there's consolidation of wealth in the hands of the 1 percent or the 0.1 percent.
[00:58:59] CR: So, okay. So that's a, that's a nil argument, the 'we tried it and it didn't work' one. Um, but the key reasons why I, as a historian who's studied 20th century communism for decades, if I had to put down the two main reasons why communism didn't work, actually three main reasons why communism didn't work in the 20th century, they are as follows.
[00:59:26] CR: Number one, the countries that tried it skipped a step. The countries that tried it were all, um, mostly pre-industrial-revolution or barely industrial-revolution countries. Russia, um, China, uh, Cuba, Vietnam, Korea, North Korea. These are countries that had been either oppressed by colonials like Indochina and the French, China had had their century of humiliation under the British and, you know, their emperors hadn't really adopted the Industrial Revolution, had their own Industrial Revolution in a way, but it wasn't the same as the British Western one.
[01:00:10] CR: Um, and they tried to, and the Russians had the czars and all that kind of stuff that were very late to the Industrial Revolution. And by the time they started their socialist-slash-communist experiments in the early to mid 20th century, they were in a desperate situation. They were starving. They were being attacked by capitalist countries that were far wealthier and far more powerful militarily as well as economically.
[01:00:41] CR: And they tried to speed up the process. Now, if you go back and you read Marx and Engels, the process that they envisioned for a feudal society, getting to a communist society, was you had to go from feudalism to capitalism, then to socialism, then to communism. And the capitalist period was important because that would build literacy, uh, and infrastructure.
[01:01:08] CR: You can't really have a society where the people control the means of production unless the people are literate. And you've already invested in the means of production. You have, you know, roads and so on to get to that. And it's almost like capitalism leads you through the innovation and the incentive process to get the resources and infrastructure required
[01:01:30] SS: to have an abundance for all.
[01:01:32] CR: Exactly. And then you evolve beyond capitalism into socialism, and then you evolve beyond socialism into communism. And the main difference, the way I break it down for people, is: socialism is where you still have a state that is controlling the division of, uh, money and responsibility and power and monitoring it, a vanguard.
[01:01:55] CR: When you get to communism, you no longer need the state. Everything is just functioning by itself in communes, little communities. They're running everything. The people have taken control of the means of production and everything just runs by itself. We never got there. And one of the problems Russia and China and Cuba, et cetera, et cetera, had is that when they tried this, their populations were mostly illiterate.
[01:02:18] CR: They hadn't had an opportunity to build the infrastructure. So they tried to do a speed run in five-year time windows to catch up, because the capitalists were trying to destroy them while they were doing this experiment, threatening them with nuclear weapons, invading them, et cetera, et cetera.
[01:02:38] CR: And the speed run didn't work, generally, which is why Deng Xiaoping is the great genius of China, because in the 70s, Deng said: we tried to go too fast, too hard, we need to take a step back, we need to reintroduce some capitalism into the country, but we need to manage it.
[01:02:59] CR: Carefully manage it so they don’t become too powerful, but we need capitalism to invest in the infrastructure. And then after we do that for 50 years, we’ll be the most powerful country in the world, and then we can go get back to where we were. And that explains the rise of China since 1979.
[01:03:19] SS: Although that’s now a pending disaster. There’s a video I watched last night on China’s economic
[01:03:27] SS: future and it is super, super,
[01:03:29] SS: super bleak.
[01:03:31] CR: I've, I've been hearing those stories for... No, no, no. I saw some numbers that blew my mind. We're going to talk about it next Futuristic and we're both going to watch it and it's crazy. That said, we call that the Jack Ma clause.
[01:03:43] SS: We don’t want anyone to get too much power. Uncle Jack, listen, you’re coming on a little holiday with us. We’re going to, uh,
[01:03:50] SS: we're going to reprogram your brain. But also you've got Xi Jinping, who is becoming incredibly,
[01:03:56] SS: he’s an interesting cat,
[01:03:59] CR: Man, Lee Kuan Yew, when he was still around, said that Xi Jinping was the most impressive human he'd ever met. I mean, and Lee Kuan Yew was loved in the West for what he did in Singapore. And he said that Xi Jinping was a Nelson Mandela-level leader of a country who was going to completely reshape China.
[01:04:19] CR: He gets a lot of negative press over here, but anyway, I know you've got Chinese family and all that kind of stuff.
[01:04:25] SS: we’ll go to that another time, I mean the fundamental anyway, Alright
[01:04:28] SS: quickly because we’ve got to get back to a couple of other
[01:04:30] SS: topics on the
[01:04:30] CR: well, techno communism, so they tried to move too fast, but also, the technology wasn’t there to centrally manage hundreds of millions of people and their economic and business and social affairs, right? We have the technology now to do that, and we centrally manage Western economies all the time. You know, the way we roll out COVID vaccines is all done by technology, right?
[01:04:54] CR: And our economies are centrally managed. Um, the RBA centrally manages our economy by tweaking levers. So this vision of techno-communism is how I started with this. So anyway, that's my view. That's my movement. This is going to be the next 20 years of my life. Well, the next two years, because it'll all be over in two years, whether we win or we lose.
[01:05:16] CR: So,
[01:05:17] SS: Well, look, um, the idea of abundance for all I think is just wondrous. And I believe that the technology could provide abundance, absolutely. And I agree with you that the challenge is the power structures around that abundance and the distribution of it, not whether or not it's technologically
[01:05:36] SS: capable. So I agree with the
[01:05:38] SS: capability. the third reason communism failed is that psychopaths fuck everything up. That’s one of the big
[01:05:45] CR: yeah. You need to have
[01:05:48] CR: frameworks in place, whether it’s capitalism or communism or religion or police, whatever it is, to stop psychopaths from being able to take complete power, because they will. That’s what psychopaths do.
[01:06:03] CR: So part of all of this is how do we go through this process and don’t let the psychopaths Infiltrate and fuck it up. Psychopaths have their purpose and they have their uses, don’t get me wrong, but you need to carefully manage the psychopaths, because there’s one in ten people that are psychopaths, and if they can get, if they can get access to power, they will, they’re very good at that, the functional ones, and they’ll fuck everything up.
[01:06:30] CR: So that is another problem we had with communism, is the psychopaths got involved. And capitalism's fatal flaw is that power structures emerge, and its fundamental flaw is that inequality grows over time; you need a redistribution of wealth. That's the number one problem with it.
[01:06:46] SS: Capitalism, I
[01:06:47] SS: think. wealth and the power that the wealth
[01:06:50] CR: buys. Yeah, exactly. Yeah, it facilitates that and it needs to be circumvented. Moving right along, right along, Cameron, to uh, social media
[01:07:00] SS: in Australia.
[01:07:03] CR: People, man, I'm getting... I've seen a lot of, a lot of people are very angry. For our international listeners: the Australian government has just passed legislation that says they're going to ban social media for anyone under the age of 16 in Australia. TikTok, Facebook, Twitter, Snapchat, I think, are included in there.
[01:07:22] CR: Um, it doesn't include some things, like online gaming, for some reason. Um, there are some things that are not going to be, uh, included. How are they going to do it? Not really clear at this juncture.
[01:07:37] SS: I’m gonna write a manifesto on how easy it is to police. It’s so easy. It’s embarrassing as to how easy it is to know exactly, nearly exactly how old a person is on social media. Uh, there’s been a lot of talk, and this is a classic example of
[01:07:54] SS: techno, uh, utopianism,
[01:07:56] SS: and this is, you say, name this theme song. Da da da da da da da da da da da da. You can't answer it? Okay. You, you're obviously underage. Get off. You're banned.
[01:08:08] SS: that’s actually a good one. That’s fun. That’s actually a fun one. Name this theme song. We’ve got enough 80s stuff that we can really throw out there until they start memeing on 80s songs and then tell everyone what they are and they use it as an, a stealth
[01:08:20] SS: education process in pop culture. Yeah, right. Sorry. What’s your policing system?
[01:08:25] SS: well, let’s just go for the listeners first.
[01:08:28] SS: A lot of people have come out and said that, uh, it’s thwarting free speech. You can imagine Elon came out and all of the
[01:08:36] SS: complaints. Look, I think it’s, I think it’s super clear that it’s absolutely dangerous for young minds. I think we’ve spoken about it before. I’m really for this law. In fact, I wish it was 18 years of age.
[01:08:46] SS: Because I just cannot see any good coming from social media with kids, you know, young girls and self harm and bullying and all of these expectations on life, reducing their attention spans. The negative, and I’ve seen it personally with family members, where it’s incredibly damaging. And I’m all for this law.
[01:09:07] SS: And I think within a few years, it's going to go up to 18. Now, the fine: corporations can be fined up to 49.5 million Australian dollars for systemic failures to implement an age limit. Well, first of all, what is 'systemic'? Which means, can they just do it all year and go, oh yeah, about that, and fight it in court a year later after they've made zillions of dollars?
[01:09:33] SS: 49 million is nothing for big tech, even in Australia.
[01:09:38] SS: It’s, it’s not even a parking
[01:09:40] SS: ticket.
[01:09:41] CR: Why is
[01:09:41] SS: give you the
[01:09:42] SS: numbers.
[01:09:43] CR: 5?
[01:09:44] SS: Sorry?
[01:09:45] CR: Why is it 49.5? Is that like, you know, 99 cents instead of a dollar? Is it
[01:09:50] SS: It's to make it feel as though they really considered it and didn't just throw out the round number of 50, is what that is, Cameron. So let's just be clear on the psychology of
[01:09:59] SS: governance.
[01:10:00] CR: Right, of marketing
[01:10:02] CR: a fine.
[01:10:02] SS: yeah, exactly. So Facebook's revenue in Australia is 1.26 billion, right? So the fine is, what, 4 percent of revenue, something like that.
[01:10:12] SS: Uh, Alphabet's revenue in Australia is 7 billion, so it's not even 1 percent. Again, parking ticket. The only one it could affect in Australia is TikTok, whose revenue is growing but was last reported at only 375 million, so it's reasonably impactful on TikTok's revenue. So we have a bunch of companies here where the fines are inconsequential.
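For anyone checking the maths on those percentages, a quick sketch using the revenue figures quoted in the conversation (taken as stated, not independently verified):

```python
# Fine-to-revenue ratios for the quoted figures (AUD).
MAX_FINE = 49.5e6

revenue = {
    "Meta (Facebook)": 1.26e9,
    "Alphabet": 7.0e9,
    "TikTok": 375e6,
}

for company, rev in revenue.items():
    print(f"{company}: maximum fine is {MAX_FINE / rev:.1%} of Australian revenue")
# Meta ~3.9%, Alphabet ~0.7%, TikTok ~13.2%
```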
[01:10:39] SS: Another mistake they made is: neither underage users nor their parents will face punishment for violations. However, if I'm caught underage drinking, or I'm supplying someone underage with drink, I get a fine of $2,000 to $10,000. Why wouldn't they do the same thing if they're serious about it? And I do think that social media is more dangerous for children than underage drinking or driving a car.
[01:11:03] SS: I really do. Uh, now here’s how they can know. I actually wrote like a 30 point plan on the ways that you can know exactly how old someone is on social media. The first one
[01:11:14] SS: is.
[01:11:16] SS: Yes, Cameron.
[01:11:17] CR: before before you go on, I just wanted to point out, what did you say Google’s revenue was in Australia? Do
[01:11:22] SS: Alphabet
[01:11:23] SS: in total is 7 billion.
[01:11:26] CR: you know how much of that they paid tax on?
[01:11:28] SS: Oh yeah, I know they pay about 20 million
[01:11:30] SS: tax a year or something ridiculous. Almost
[01:11:31] SS: nothing.
[01:11:33] CR: 90 million in tax they paid, based on income of 1.892 billion, because they claimed that around 80 percent of their income is not taxable.
[01:11:46] SS: That’s right. Because you know what they do? They lease their brands and their assets off themselves, which is the easiest loophole to close in the
[01:11:54] SS: world. That's like me saying I'm gonna put my brain in, um, a zero-tax haven and rent my brain from myself. This is insane. It's just
[01:12:04] SS: insanity.
[01:12:05] SS: It’s a joke.
[01:12:07] CR: They took 6.3 billion from Australian customers during fiscal 2021-2022 and told the ATO that 93.6 percent of their income was not taxable.
[01:12:18] SS: allow that. But this is why removing multinational tax avoidance is so, so simple. And again, Cameron, we've been here before. Land tax. Why was land tax introduced? Because people were finding ways to say, yeah, but I had to put the fences up and the cows and the this and the that.
[01:12:34] SS: And then they said, yeah, we're just going to put a value on your land and send you a tax bill. Congratulations, you're my newest taxpayer. We do the same thing. It's called a revenue assessment tax for any international company: we put a valuation on your company, you did 7 billion in revenue, here's your 10 percent tax on that.
[01:12:48] SS: Congratulations or leave. We
[01:12:49] SS: don’t care. It’s real easy. You
[01:12:51] SS: need the courage. Yeah, we'll just nationalize all of your assets and
[01:12:54] CR: you can fuck off.
[01:12:57] SS: I’m not joking. This is exactly what
[01:12:59] SS: we should do. It’s so easy. It’s an
[01:13:01] CR: Same with mining companies. Oh, you don't want to pay tax on your mine? Then fuck off. We'll, we'll nationalize your mine. Can't take it with ya. Well, we should nationalize the mines anyway, but that's another part of our communist plan. I agree. That is insane. Zero value add. Back to, how do you, how do you
[01:13:16] SS: error. It’s a rounding error. Um, now, here are some of the ways.
[01:13:22] SS: I won’t list them all because it’ll bore listeners, but I’m going to write a manifesto. First one is Despite what they say, they are looking at the face of every single person who is on social media, every single time. They are doing that. And when you’re posting, you’re taking a photo, and they can very accurately with AI, guess your age.
[01:13:38] SS: That’s number one. Number two, finger size. They know exactly how big your finger size is. Number three is the tone of your voice, whether or not you’ve been through puberty or not. If you haven’t been through puberty, they can tell in your voice whether or not you have been through puberty. There’s three.
[01:13:53] SS: We haven’t even
[01:13:53] SS: started. Which room you are in.
[01:13:56] CR: bio authentication. BioPrints. There’s about 20, right? I’ve just given you three. Location, Geo. All right, let’s go with location. The first one is, Which bedroom are you in in the house? Every single house is on Google Maps. Every single house which has been sold, sold in the last 15 years, would have a map of that house and the address knowing where the bedrooms are.
[01:14:20] SS: If the phone is next to a bed which isn’t the main bedroom, you know how old they are. You take that phone, if that phone is in a classroom, we know the classroom and the ages of the people in that classroom. Here’s another one. The most common message on social media is Happy Birthday. Happy Birthday, Steven.
[01:14:39] SS: Great, hope you enjoy your 14th birthday. You can cross-reference this with all of your contact lists, the same contacts you can run the same triangulation test on, to work out exactly how old they are. Like, what about the content that they look at? What about the things that they share?
[01:14:55] SS: There is an insane number of indicators. I actually asked ChatGPT and it gave me more than 50, and it said, we don't know if they actually have access to these, but it is highly plausible and probable that they do have access to all of this information. They know exactly how old the person is. And here's what we need to do: find someone who is on there who is underage, and say, we're going to come for you, and the fine is 5 percent of your revenue. Your fine is 10 percent of your revenue. Congratulations, here's your fine. Or we'll take your directors and put them in an orange suit. It's real easy. Either way, parents who give it to their kids should risk a fine of $10,000 as well.
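A toy illustration of the cross-referencing idea above: combine several weak age signals into one estimate by weighting each by how much you trust it. Every signal name, age value and confidence weight below is hypothetical, invented purely to show the mechanics; it is not how any platform is known to work.

```python
# Toy sketch: fusing weak, independent "age signals" into one estimate.
# All signal names, ages and confidence weights are hypothetical.
from dataclasses import dataclass

@dataclass
class Signal:
    name: str
    estimated_age: float   # age this signal suggests
    confidence: float      # 0..1, how much we trust this signal

def combine(signals: list[Signal]) -> float:
    """Confidence-weighted average of the individual age estimates."""
    total_weight = sum(s.confidence for s in signals)
    if total_weight == 0:
        raise ValueError("no usable signals")
    return sum(s.estimated_age * s.confidence for s in signals) / total_weight

signals = [
    Signal("face_photo_model", estimated_age=14.0, confidence=0.7),
    Signal("voice_pitch_model", estimated_age=13.5, confidence=0.5),
    Signal("birthday_messages", estimated_age=14.0, confidence=0.9),
    Signal("bedroom_location", estimated_age=15.0, confidence=0.4),
]

print(f"estimated age: {combine(signals):.1f}")  # ~14.1 with these made-up numbers
```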
[01:15:37] SS: Because we need to treat it the same way as drink driving, as driving on the road illegally, and alcohol if we’re serious about it, and tobacco. If we’re serious about it, it’s easy to define who is on the phone. So easy. I’ve just given you a whole bunch
[01:15:53] SS: of reasons that you can cross
[01:15:55] SS: reference those. Well, there's a difference. You're talking about the ability of the technology companies to determine the age of the participants, but how do you police it? How does the government police the social media companies, how well they're monitoring and managing the process? The way you police it is, yeah, you find people who have broken the law. So this is the needle in the haystack. You've got to show the
[01:16:25] SS: needle. How do you find that? You get 15-year-olds to confess to the government? Go to a confessional with a Catholic priest, and after he finishes touching them up he just mentions that they've... You make the fine, here's, here's what I think too. If you make the fine and the downside significant enough, you'll get an incredible amount of compliance from corporations. Cause they're, they've only got one
[01:16:47] SS: motivation,
[01:16:48] SS: Cameron. Yeah, but I’m not talking about the corporation, I’m talking about how does the government police that the corporations are doing it? How do they monitor it to make sure they’re doing the right job?
[01:17:04] SS: I think they will do the right job if the risk is high enough for them financially.
[01:17:10] CR: Yeah, I get that argument, but that’s That’s my argument.
[01:17:19] CR: It’s like talking to my wife. Um, okay, so
[01:17:24] SS: What does that mean? You really love me,
[01:17:25] SS: Cam. Thank you. Well, you know
[01:17:27] CR: that’s true. Um,
[01:17:30] SS: well, there would be
[01:17:31] SS: ways that you could do it. Let me, let me
[01:17:33] SS: digest that and in the you can just you can just say I don’t know.
[01:17:37] SS: not yet, but I’m sure there is a way. I don’t know yet. I
[01:17:39] SS: actually, I don't know. So you don't have a policing strategy at all, really. Well, you're talking about a compliance strategy, not a policing strategy.
[01:17:48] SS: yeah, a compliance strategy, I think we have, the policing I think is, I mean, you could do the same way you do things with cars.
[01:17:57] SS: You could do random
[01:17:58] SS: checks. Yeah, that was where I was
[01:18:00] CR: going to go with it. So random checks and how do you do that? We’re just going to rock up to Facebook’s offices and
[01:18:06] SS: But you could do that. You, and also, well, wait a minute, you could look at people's profiles and then determine whether or not they're of age,
[01:18:14] SS: Well, well, if you’re gonna have a law. Maybe, maybe you could determine whether or not people are of age on their profiles, but maybe you take phones out of kids hands in the same way that you get someone on the street, uh, in a car, in a public place. Maybe that’s it. In the same way that it’s what’s in your, I don’t know.
[01:18:31] CR: cops just randomly tap kids on the shoulder on the train on the maybe, Yeah, maybe, maybe, that’s it.
[01:18:39] SS: Why, what’s wrong with that?
[01:18:40] CR: look, cause that sounds very police statey.
[01:18:44] SS: Well, we’ve already got police state with alcohol and, and cars. What’s the difference?
[01:18:49] CR: I don't want more police statey. I want less police statey. But look, let me, let me throw some at you. I've had a number of people in the last couple of weeks, including my boys, uh, who are 24, say this is just ridiculous and it's bullshit and it's never gonna work and kids will find a way around it, they'll, they'll find a way to hack it, like they do with alcohol and like they do with driving cars
[01:19:10] SS: illegally. Yeah, that's what they say. You know, the most important thing on it for me, and I know that people laugh at me for this, but in my mind the most important thing that this does is it gives parents a reason to say it's against the law and we are
[01:19:29] SS: not doing it. You’re not
[01:19:31] SS: having it. And I think, and I, and I do think that in countries like Australia and parents who care about their kids. In the same way that they don’t give their kids alcohol, well you hope that they don’t, when they’re young, uh, this becomes like that. And a large part of it is really about parenting, but this gives the parents tacit approval and legal approval to come and say, hey, wait a minute, no, we’re not breaking the law.
[01:19:58] SS: I think that’s the most powerful
[01:19:59] SS: thing on this law.
[01:20:02] CR: I’ve had, so I’ve had a range of conversations and I’ve seen a lot of stuff online about this over the last couple of weeks. Um, one of the arguments that I heard against it from a friend was that what about LGBTQ kids in rural towns that, uh, the only source of support that they get is from social media, being online in communities of other LGBTQ people.
[01:20:30] CR: They’re going to be cut off. Uh, it’s going to be bad for them. Do you have a response to that?
[01:20:37] SS: No, because I was just, um, typing into ChatGPT, uh, ways that it could be policed, because it might have a better answer
[01:20:45] SS: than ours. Sorry, can you ask me again, please, mate?
[01:20:48] CR: So one of the arguments that I heard against the ban is: what if there are LGBTQ kids in rural towns where social media is the only place they get any sort of community and support, and they will now be cut off from that?
[01:21:05] SS: Yeah. Look, that’s a concern and people, uh, who need social connection for a variety of reasons, whether they’re mental health or their personal status or all of those. Things like you’ve mentioned. Um, we should be able to, as a part of this, provide some kind of a forum where they can find each other that is outside of the realms of traditional social media.
[01:21:33] SS: I don’t think that would be incredibly difficult to build a forum. As far as I understand, WhatsApp and a few other messaging groups are still available and Reddit and other places. I think that there is still places for them to connect and communicate. But if it is that government needs to invest in this.
[01:21:51] SS: Then they should, and they should get the money off big tech to do it, right? So build a safe place for people of that ilk to be able to connect with each other and communicate, and that’s actually even more of a healthcare issue than a social media issue, if you want my honest opinion on it, and that there should be services that the government is investing in to help people with specific.
[01:22:13] SS: Um, needs for socialization and connection
[01:22:16] SS: who, you know, don’t fit into the
[01:22:20] CR: Yeah, we already have Kids Helpline. According to Guardian, four hours ago, uh, the laws will apply to Facebook, Instagram, Snapchat, Reddit, and X. Exemptions will apply for health and education services, including YouTube, Messenger Kids, WhatsApp, Kids Helpline, and Google Classroom. So there are, uh,
[01:22:44] SS: Yeah.
[01:22:45] CR: Ha ha ha. There are going to be avenues and platforms for kids.
[01:22:50] CR: Um, so look, I think we’ve talked about this before, but this is how I’ve articulated it to people in the last couple of weeks. I am personally against censorship as a general rule. I think in a, in a free society, you want to keep the government out of censorship as much as possible. You also want to keep them out of interfering with what people can do in their private lives, their personal lives.
[01:23:13] CR: I disagree with bans on tobacco. I disagree with, uh, duties on tobacco, and all that kind of stuff. That said,
[01:23:24] SS: Oh, right. What? That surprises me,
[01:23:26] SS: to hear that you disagree with that. I, I think in a free economy, if something’s legal, you should be able to use
[01:23:31] SS: it, but
[01:23:31] SS: yeah, it’s either
[01:23:32] SS: legal or it’s I’m happy for
[01:23:36] CR: there to be restrictions on where you can smoke in public. But I'm not able to buy cigars to smoke for myself at home on my deck, because the price went up from like five bucks a cigar to 60 bucks a cigar with the duties on the things, right?
[01:23:55] CR: that kind of stuff, um, I, I take offense to,
[01:23:59] SS: Yeah, No, I agree but here’s the
[01:24:00] CR: thing. I disagree with that, and I think they should be limited. It should be a tool of last resort. That said, I do believe governments have a responsibility of care, particularly for, uh, minors and minorities and people that, um, get the rough end of the stick in a society, where they're on the fringes and the margins.
[01:24:25] CR: And if, We, we know there’s increasing evidence that social media is just bad for everybody, generally speaking, but particularly for kids because their brains aren’t fully developed. They don’t have the control, the executive function to know what’s good for them, what’s not good for them. We know that there’s a lot of bullying, there’s a lot of suicide, there’s a lot of horrible stuff that happens just in terms of self esteem and all that kind of stuff that comes from what, looking at Instagram that kind of stuff.
[01:24:56] CR: If corporations know this but aren't doing enough to stop it, if parents know this and aren't doing enough to stop their kids being on it, then there is a point in time where governments do have a responsibility of care to step in and go, enough's enough. We're gonna have a generation or generations of kids that are more fucked up than they need to be.
[01:25:23] CR: We need to put an end to this. We need to stop this. We need to step up. It's difficult. Um, I'm, I'm in a sort of mixed position on it. Um, I don't trust governments to do this stuff well. Um, but I do understand the impetus to protect the most vulnerable in society if the forms of protection that exist outside of that aren't doing a good enough job to protect them.
[01:25:58] SS: Yeah. It was interesting when I put into ChatGPT some of the policing. I mean, the, the most obvious one, which got thwarted because of privacy concerns, which is insane by, I think the coalition was, uh, to put in, um, verified digital ID systems, which would be the easiest way, some form of ID to prove that you’re over the age of 16, whether it’s a learner’s permit or whatever.
[01:26:22] SS: And you can do that with end-to-end encryption, which, which actually
[01:26:24] SS: has zero privacy
[01:26:25] SS: issues as well. All right. So, everyone’s like, oh, they want to get all your ID, yeah, they know who you fuckin are exactly. And they know that as well. So they should have had that. And that’s the easiest way to do it. And I would envisage, I would envisage that at some point that’ll come in. I do think that, like a lot of laws, you put them in for good reason, but there probably will be an amendment.
[01:26:47] SS: Um, some of the other ones was device level restrictions, built in settings. Um, Guardian Approval, all those
[01:26:53] SS: kind of things as well. Um,
[01:26:56] CR: Now, there's also the argument that this has all been orchestrated by the Murdoch media, and the other big media, Kerry Stokes, et cetera, as a payback
[01:27:08] SS: it may well have.
[01:27:09] CR: against Google and Facebook for, you know, refusing to, um, be bribed in order to let people have access to the news. And I think that’s probably right too.
[01:27:24] CR: I’m sure I think there’s a high
[01:27:25] SS: chance of that. Doesn’t make me
[01:27:26] SS: against it. They pushed hard for this. Exactly. Both things can be true. It doesn't change my mind on it. If it came from people who have done
[01:27:33] SS: nefarious, iniquitous things
[01:27:35] SS: in the past, it And are doing it for nefarious
[01:27:38] CR: reasons.
[01:27:39] SS: exactly, for their own power and wealth, doesn't make me against it. Um, but I think that, yeah, the law will have to be adapted, but putting it in is really important and it gives parents something really important. The policing of it: random checks, uh, was another one that was suggested by ChatGPT, with the companies providing the services, and it said the fines would need to be significant enough for them to enforce it.
[01:28:08] SS: I mean, it’s a classic algebra of, do you make more money by not doing it and paying the parking fine? I mean, it’s really simple algebra, right, uh, on
[01:28:15] SS: that. It's the seatbelt recall argument from Fight Club. It is the Fight Club thing, it's exactly that. And that's all they need to do. I think there are ways; the easiest one is proof of ID on the device. Uh, big tech: absolutely not. If you make the fine big enough and do random checks, then, uh, if they're smart, they would comply with it and shut people off, because then they can... oh man, I was going to say, then they would maybe thwart the laws in other countries by not drawing attention to themselves. Hmm. Because they would not
[01:28:50] SS: want it in other countries,
[01:28:51] SS: surely. Anyway, I mean, I’m doubtful that this will actually get implemented.
[01:28:59] SS: When you say implemented, passed
[01:29:01] SS: in the Senate, or, no,
[01:29:03] CR: actually, executed. It's been passed in the Senate, I think. Yeah. Um, I'm, I'm... wait, this quote, it says: if they can target you for advertising, they can use the same technology to know exactly who you are and verify the age of a child. We know that. It's whether or not we can police it, like you say. The enforcement is the hard part, but I think that parents, they need to be... it needs to be the same as alcohol and driving.
[01:29:26] SS: But I’m, I’m for it.
[01:29:29] SS: Gee, this has been the longest podcast we’ve ever done in
[01:29:30] SS: history. Nah, my, uh, my guess is when Trump is in power, uh, he and Elon will make a couple of phone calls to Albo, or whoever our government is at the time, and it'll all disappear or get watered down with some sort of bullshit excuses. That would be incredible. Or it'll be, you can do it to Facebook and Google and TikTok and Snapchat, but you can't do it to X.
[01:29:59] SS: Because X is really important for journalists and news, and it’s more of a news forum.
[01:30:04] SS: Platform for free speech
[01:30:05] SS: and news, so, Did you want to talk about Nvidia or have we run out of time?
[01:30:09] SS: I can do it, we might as well. We’ve gone this long, anyone who’s here, thanks for staying with us, and send us a little note, an email, a tweet, an X, or something on LinkedIn and say, I, I stayed with you, the whole way, JP, I’m talking to you, James Peterson, I know you stay true, and Jon Yo.
[01:30:28] CR: Johnny Yo.
[01:30:30] CR: Used to work with John at Microsoft like 20 years ago. Okay, get into it. NVIDIA, what's NVIDIA doing? NVIDIA has launched an AI chatbot, and what I love about this is that it runs locally on your PC: ChatRTX. And I feel like we've spoken about this before. A custom chatbot which is trained on your own local data, including documents, videos, and more. For me, this is like the start of digital twinism, which isn't a word, but I've used it.
[01:31:01] SS: And it’s kind of like, I feel like this is what Apple missed.
[01:31:05] SS: This is what Apple should be doing. They’ve got the device with all of the stuff, your voice, your print, everything. You know, I, I feel like this is, I haven’t tested it, but it feels like the first move towards digital twins, local client learning on, on your device about you and what you believe.
[01:31:22] SS: Uh, I don't know how that would integrate when you're offline, cause it can work when you're offline as well as a chatbot. Um, how much data you would need and whether or not it can work without access to the web. For me, this has kind of gone under the radar. It's really significant, and it's also significant because NVIDIA has teamed up with OpenAI to do the Figure 1 and Figure 3 robots, humanoid robots, which I get the sense that this will go into that local version as well.
[01:31:50] SS: So,
[01:31:51] SS: your thoughts Cameron?
[01:31:54] CR: well, um, to run this, look, I think Apple are doing this, this is their vision, um, but they're rolling it out slowly. I mean, one of the challenges: I can't run this, because it only runs on Windows 11. And you need an NVIDIA GeForce RTX 30 or 40 series GPU, or an NVIDIA RTX Ampere or Ada generation GPU with at least 8 GB of VRAM.
[01:32:21] CR: So, I don’t have a time machine to get a Windows operating system as well. That’s the other thing you need,
[01:32:26] SS: is Yeah. Yeah. So, um... But! I do think this is, um, I think this is great stuff and I do think this is a hint of where the future is. So the point of this is it can read all of your own docs and notes and all your own data on your machine, and you can chat with it about all of your data on your machine, and you're not sharing it with, uh, something outside, with the greater world. You can chat with your files.
[01:33:00] CR: Um, and developers can use it to build their own RAG-based applications for RTX. It's sort of a demo, I think, of how to do this. NVIDIA, uh, NVIDIA are trying to build their own tools, like they're building their own robotics, um, operating system, their own LLM operating system. Uh, you have, you have the LLM platforms like OpenAI and xAI talking about building their own chipsets because they can't get them fast enough from NVIDIA.
[01:33:31] CR: Then you’ve, they want to go down the stack. You’ve got NVIDIA going, well, we can go up the stack too. We can both play at that game. So they’re going to be building more and more. I mean, most people had never, including me, had never really paid any attention to NVIDIA two years ago. NVIDIA,
[01:33:49] CR: I mean, I had heard of
[01:33:50] CR: them, but course. Yeah. Yeah,
[01:33:52] CR: They were guys that built gaming chipsets for Xboxes and shit, like it was, or high end gaming computers, right? Um, so now Nvidia are going up the market stack. And these other, the LLM companies are trying to go down. I, I think Nvidia’s play is probably easier, seeing as they’ve kind of been the partners of all of the LLM platforms in the building of their platforms in the first place. I think they understand the software layer better than the software guys probably know how to build chip fabs.
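For readers curious what a "chat with your own files" pipeline of the kind described above looks like in outline, here is a minimal sketch. The retrieval is a deliberately naive bag-of-words overlap, and local_llm_generate is a hypothetical placeholder for whatever locally hosted model you call; ChatRTX ships its own app and reference code rather than this exact API, so treat this as an illustration of the pattern, not NVIDIA's implementation.

```python
# Minimal local-RAG sketch using only the standard library.
# Retrieval here is a crude bag-of-words overlap; real systems use embeddings.
import os
import re
from collections import Counter

def tokenize(text: str) -> Counter:
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def load_documents(folder: str) -> dict[str, str]:
    """Read every .txt/.md file in a folder into memory."""
    docs = {}
    for name in os.listdir(folder):
        if name.endswith((".txt", ".md")):
            with open(os.path.join(folder, name), encoding="utf-8") as f:
                docs[name] = f.read()
    return docs

def top_k(question: str, docs: dict[str, str], k: int = 3) -> list[str]:
    """Return the k document names whose word overlap with the question is highest."""
    q = tokenize(question)
    scored = []
    for name, text in docs.items():
        d = tokenize(text)
        overlap = sum(min(q[w], d[w]) for w in q)
        scored.append((overlap, name))
    return [name for _, name in sorted(scored, reverse=True)[:k]]

def local_llm_generate(prompt: str) -> str:
    # Hypothetical stand-in: swap in your local model call
    # (llama.cpp, TensorRT-LLM, etc.). Nothing leaves the machine.
    raise NotImplementedError

def answer(question: str, folder: str) -> str:
    docs = load_documents(folder)
    context = "\n\n".join(docs[name] for name in top_k(question, docs))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return local_llm_generate(prompt)
```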
[01:34:26] CR: But, um, I expect to see more of this out of
[01:34:29] CR: Nvidia. who could encroach into the other person’s
[01:34:31] SS: space easier is a really good point.
[01:34:33] SS: Right. Yeah. A little bit. I expect to see NVIDIA rolling out more of these sorts of tools, generative AI, enterprise level as well as consumer level, uh, full-stack ecosystems built on top of their platform, and start to develop more of a consumer-slash-business brand on the software level of the stack as well, and doing it for free, because they make money out of the backend tech, you know, they can give it away.
[01:35:04] CR: It’s a bit like,
[01:35:05] SS: like what Google did with the search, right? They made money in a, in a different
[01:35:08] SS: way. And, and Facebook as
[01:35:10] SS: well. Um, pools. I did a deep dive on an American swimming pool company, actually the world’s biggest swimming pool company on QAV this week. It’s called Pool Corporation. And guess what percentage of their revenues comes from pool sales? 13%.
[01:35:28] SS: I have no idea.
[01:35:32] SS: What? The rest of it?
[01:35:34] CR: maintenance on your pool. It’s the razor blade model. You get the pool cheap. Hey, take a pool. Everyone gets a pool. You get a pool. You get a pool. You get a pool.
[01:35:47] SS: You get a pull!
[01:35:49] CR: We make money off you for the rest of your life by paying for maintenance and maintenance products. They own the whole ecosystem for pools, including in
[01:35:57] CR: Australia. But, Pool Corporation. I love a company that's called Pool Corporation and says, what do we do here? We make computers, we're called Computer Corporation. We make sunglasses, therefore we are called Sunglass Corporation. We make BMX bicycles, therefore we are called BMX
[01:36:14] SS: Corporation.
[01:36:15] SS: Okay, Cameron.
[01:36:17] CR: Yeah, I love it. Branding 101. Just... that's why I called the... hey, why did I call my... We're a body of people. Corporate, as in body, Latin; people; corporation. That's what we are,
[01:36:28] SS: Cameron. By the way, today is the 20th anniversary of my first podcast actually coming out, being published today, 29th of November. Published my very first podcast, G’day World Number One. Um, what did I call my podcast network when I launched it a couple of months later?
[01:36:48] SS: The Podcast Network.
[01:36:51] CR: Yeah, because I’m with you on that. Just call it what it is. It’s Especially if you’re early, especially if you’re
[01:36:56] SS: early. It's Bauhaus, Steve. The principles of Bauhaus, the design movement from the interwar period in Germany. Form follows function, and just, just call it what it is. Don't fuck around, don't put frilly things on it, just say what it is. It's a podcast network, we're gonna call it The Podcast Network.
[01:37:13] CR: It’s a pool corporation, we are the pool corporation. Let’s not, let’s not pretend. Anyway, congratulations I like it a lot. Um, there you go. Well,
[01:37:23] CR: If anyone out there is downloading NVIDIA Chat with RTX and running it, uh, let us know. Tell us what you think. All right, we done? We’re done. That’s uh, an hour and 38
[01:37:37] CR: minutes, Steve. might be the
[01:37:39] SS: longest one ever. Wow. are you going to join my
[01:37:41] CR: movement?
[01:37:42] CR: Are we starting a movement together?
[01:37:43] SS: was, I was stimulated. I was really intellectually stimulated and
[01:37:48] CR: I put you down? Put you down for a contribution?
[01:37:51] SS: put me down, put me down. I, uh, am very interested in reading. Well, I’ve heard the manifesto,
[01:37:57] SS: but... You can be Engels to my Marx. Sure. Engels was actually the, he was the smart one. Well, then I can't be that. You don't need to be that. You wonder, you wonder,
[01:38:09] SS: your breadth of knowledge.
[01:38:10] CR: just had the beard. I mean, Engels had a
[01:38:12] CR: beard too, but Marx had the better beard. That's really... All right. Cheers. Thank you everyone.
[01:38:20] SS: Thank you. That was, uh... I really liked that techno-utopian thing that we were on, cause it's, it's really, really interesting and important. Like, nah, like, well, I, I kind of agree. It's like, stop.