Foresight, Complexity and Human Adaptability

with John Smart



In Episode 14, our guest is John Smart, founder of Foresight University. John shares insights about the practice and practical application of foresight, the distinction between foresight and futurism, and their relationship with complexity and human adaptability. Follow @johnmsmart

The adaptation curve: 1st-gen tech sucks and is dehumanising; with luck and good design, 3rd-gen tech can be net humanising.

Breaking Banks

Hosted by Brett King, Jason Henrichs, & JP Nicols
The #1 global fintech radio show and podcast. Every week we explore the personalities, startups, innovators, and industry players driving disruption in financial services; from incumbents to unicorns, and from the latest cutting-edge technology to the people who are using it to help create a more innovative, inclusive and healthy financial future.

This week on The Futurists: "There's a concept in the book called the adaptation curve. To simplify it: first-generation technologies always suck. They're dehumanizing; we don't get the interface right; they're too primitive. Second generation continue to be dehumanizing, and with luck and good design, third generation can actually be net humanizing. They can actually be more adaptive. So think of first-generation cities, first-generation factories. First-generation calculators made us stupid; first-generation video games put us in basements and isolated us. Now we're close to second-generation video games. We have TeamSpeak, the kids can talk to each other, they're building some community skills, but a lot of the things they're doing are not really what we call serious games. I can't have my kid learn, say, equities trading or how to build a city. There is SimCity, there are a few tools that snap to physical reality, but we can imagine a future that's AI-enabled, where the games are improving us in all the ways we care about."

Hey Brett, welcome back. Great to have you back from your travels. I wanted to tell you that we have a really good show teed up this week, because one of the very first forecasters, or foresight professionals, that I ever met in my entire life is joining us today: we have John Smart. John is a man of many virtues. He is an educator, an entrepreneur, and a complex-systems researcher who founded Foresight University, a professional services consultancy and training firm that helps companies develop the ability to forecast, predict, and anticipate the future. He's also the author of a really useful book called Introduction to Foresight, which sets forth not only his methodology but how he arrived at it, and what he's learned from many people going back in time, all the way to Alvin Toffler and the earliest first generation of forecasters and scenario planners. So welcome to the show, John. It's great to have you.

Robert, Brett, it's an honor to be here, and thank you for starting this podcast to get people thinking about all the ways we look to the future. You guys may not know this, but after Alvin Toffler wrote probably the most famous book on the future in the 20th century, Future Shock (1970), which sold six million copies and introduced the concept of accelerating change to the world, and all the psychological and coping issues around that, well, two years after that he wrote The Futurists. It was a collection of folks just like the ones you've had on your pod, and it's a wonderful snapshot of what we call the first foresight spring, which ran from the 60s until about 1980. That was the first time we really took thinking about the future seriously in this country. Then we went into a bit of a winter, and now we're back, because of COVID and AI and all the changes, and you guys are really at the cutting edge of that new foresight spring. So thank you for doing that.

It's funny you mention the 70s. If you look back at that time, it took an environmental crisis and a global strategic crisis to get people to take foresight and planning seriously, right? The original RAND Corporation, the Herbert Weininger kind of thing, but also The Limits to Growth, right? The Club of Rome put together that very durable model, which to this day has been remarkably successful in its predictive power. Unfortunately it's been ignored. There was a second half to that idea of limits to growth, which is that you have to do something. And people read it; the book was successful in the 70s, it really made an impact, it got environmental awareness on the agenda in a pretty big way, but companies failed to act. Sometimes there's a Cassandra syndrome, where you can say exactly what's going to happen next but not everybody will listen to you.

So as a professional forecaster... but you prefer the term foresighter?

That's right, yeah.

Give us the distinction between a futurist and a foresighter.

A futurist is just somebody who talks about any aspect of the future to others. Whether you want to be called that or not, someone is going to look at you and say, hey, you're a futurist. "I don't talk about the future, I don't think about the future, I don't care about the future." A foresighter is anyone who's paid or tasked to look to and analyze any aspect of the future, far enough ahead, or in enough detail in the short term, that uncertainty matters. If you're engineering a building and you're applying known models, that's not foresight, that's just engineering. But if you have to wrangle uncertainty and put that into your strategy, that's foresight. And there are probably six
orders of magnitude more people who are foresighters, foresight professionals, than are futurists. We need them both, and we love our futurists, because they think differently. They're constantly thinking about multifactorial aspects of the future, near and long term, and they're doing it in a group, right?

When you talk about those time frames, what are the time frames you're typically dealing with? Is it five years out? Is it 20 years out?

It could be equity futures trading in the next 10 minutes, and that's one kind of foresight. You could be climate modeling hundreds of years out, and that's another kind of foresight. Foresight is vast, but it's always dealing with uncertainty. As we'll discuss, and as Toffler said, it has three fundamental dimensions: the possible future, the probable future, and the preferable, what you want. Different people in your futurist community, like the ones in Toffler's book, have a passion for one of those three things, or two; a few have a passion for all three. Some people want to create the future. Some people want to see what's coming, whether we want it or not, and get to the puck first, as Gretzky said. Some people really just want to preference: they want to vision, and get a shared vision, or a vision that "I'm going to ram down your throat whether you want it or not," and then just work towards that vision. That would be the preferred, the strong aspirational approach. The predictive is called anticipation, the anticipatory future. And the creative is called the innovative future: an innovator, an entrepreneur, an artist, they just want to make things; they want to see what the universe can allow. So we have these three fundamental motivations, and as we'll describe, there are values for those three corners. Those are actually the three corners of a pyramid we discuss in our book, and we'll talk about that in a minute.

But so, what is strategic foresight? It's anything you do prior to strategy. That's it. It's just a beautiful elevator pitch. If you don't do anything before you do strategy, you're just jumping into strategy; you're doing the standard stuff, the stuff you'd find in the management books since the 1930s, when strategic planning was invented. If you look at trends, if you create scenarios, if you get a prediction market going where people have some skin in the game for predicting what's actually going to happen next, if you survey a landscape to see who's made what bets, who's already committed, who's got their skin in the game already, if you do forecasting, if you do modeling, any of that is foresight, and you can do a lot of that prior to sitting down and creating a strategy.

So now, John, your book contains much that's useful for people who are interested in this, who want to get better at forecasting or develop their own ability to do foresight. Folks who are listening: if you get Introduction to Foresight by John Smart, you can learn these techniques. In the book he sets it up very clearly; it's a nice, concise history of the field, plus each of the key insights, learnings, and techniques. John, why don't you share some of those with us? You talked about Toffler, you talked a little about the three Ps, the possible, probable, and preferable, but there's another neat overlay that you do in your book, where you overlay evo-devo. Many people have heard this term evo-devo; sometimes they think about the punk rock band.

That's exactly what I was going to say. Do you know why? They call it evo-devo for exactly that reason: evo for evolution and devo for development. No one wants to think about "Gates of Steel" and how great it was. But for people who are listening, evo-devo stands for
evolutionary processes versus developmental approaches to the future, and it's not one or the other. Both occur, but there's a little bit of dynamic tension between them. And you made a parallel: the developmental is more akin to the probable in your triangle, whereas the evolutionary part is more about the possible, right?

That's right.

Talk a little bit about evo-devo.

Right. So I trained under the great systems theorist James Grier Miller at UCSD. He's one of the founders of the field of systems theory, a small field that tries to understand complex adaptive systems, which existed 30 or 40 years before the Santa Fe Institute and the modern complexity sciences. When you look at complex systems, by far the most interesting and complex are living systems. So if you can understand how living systems adapt, you can understand a lot about how to make a good team, a good organization, a good society. So there's this question of what adaptiveness is, and how living systems are so good at adapting. Well, it turns out that in the 1990s a new philosophy of biology, a new set of systems theories of biology called evo-devo, emerged. What they said is this: Darwin talked so much about the tree of life and how evolution loves to create variety. That's the artist's possible side, right? Think of the incredible diversity of life. Well, it turns out you have in your body two sets of genes. There are genes that create that variety: different species, children who are different from you, both genetically and in the way they think. And you have another set of genes that actually herd all the chaos happening at the molecular scale toward a predictable future order. Those are cat-herder genes, and that's actually about five percent of your genes, in development and the associated regulatory systems. They are not evolutionary genes; they don't create variety; they're not exploring. They're actually protecting you, and they're keeping you on a life cycle.

So it turns out there are actually three fundamental ways you can look at living systems. You can talk about their exploratory, evolutionary ability; you can talk about their protection, and all the values associated with that; and then you can talk about the networks they're embedded in. Because, I've got to tell you, individuals, groups, they're not immortal. They're constantly being selected out by change. Networks always win. Life as a network has been immortal, and it has been improving its capacity since the very first cell emerged three and a half billion years ago. How is that possible? It's because networks are adapting. Think of all the great catastrophes in history, the fall of Rome, the K-T meteorite which wiped out the dinosaurs. Networks are always winning. In those cases the actual genetic complexity didn't go down at all in any of the major extinctions of the past; it was just reassorted into different gene assortments. And you actually find this kind of catalysis: stress that doesn't kill you, as Nietzsche said, can make you stronger. It's called anti-fragility, right? We're trying to create teams and organizations that have anti-fragility, stresses that strengthen us, just like when we go to the gym and lift weights, or when we read a book and we're tired but we push our way through. We have these anti-fragile capabilities in the networks in our brains, the networks in our societies, and the networks of our technologies. So there are three fundamental things you should think about when you think about complex systems: their exploratory ability, their protective ability, and the way they tend their networks. That's the evo-devo, the intersection of exploration and prediction. Individuals are the exploratory actors in a complex system. Groups are the ones protecting the system and themselves. And networks,
which sit at the intersection between individuals and groups, are the ones that are always best at adapting. So how we tend to our own networks, professionally and within our organizations, our relationships with everyone else, starting with our loved ones: network tending is really the highest value, if you will, of a living system from this systems-theory perspective. And of course it's the networks that generate the preferences. Individuals make them, communities make them, but it's the network-level ones that are selected at the top, to create the greatest adaptiveness and progress. Look at the history of humanity: it's these fantastic networks we're constantly building using our tools, right?

But let's just compare a human-built network to a natural network. What crossed my mind when you were speaking just now, about the adaptiveness and resiliency of a naturally developing network, is our global supply chain, which is very rigidly defined and optimized for efficiency, so it's very lean, and optimized for time, to the extent that it can be. But by doing so, we're minimizing other virtues, and one of the virtues that seems to be minimized is resilience. So when there's a disruption, the supply chain creates all kinds of havoc.

Well, we saw that with COVID and hospital beds, right? That's a classic example.

Right, that's one. But then there were 60 ships parked off the coast of Los Angeles for about a year, a backlog there. They finally got that sorted out, and now the backlog is in the North Sea, off the coast of Rotterdam and Hamburg and other North Sea ports. So this problem, this bubble, is moving through the global supply chain. What I'm saying is, basically, the human-built networks aren't quite as resilient as the organically evolving ones. What do you think, John?

I think that's exactly right, and there's a concept in the book called the adaptation curve. To simplify it: first-generation technologies always suck. They're dehumanizing; we don't get the interface right; they're too primitive. Second generation continue to be dehumanizing, and with luck and good design, third generation can actually be net humanizing. They can actually be more adaptive. So think of first-generation cities, first-generation factories. First-generation calculators made us stupid; first-generation video games put us in basements and isolated us. Now we're close to second-generation video games. We have TeamSpeak, the kids can talk to each other, they're building some community skills, but a lot of the things they're doing are not really what we call serious games. I can't have my kid learn, say, equities trading or how to build a city. There is SimCity, there are a few tools that snap to physical reality, but we can imagine a future that's AI-enabled, where the games are improving us in all the ways we care about. So say you're modeling a supply chain. We were talking earlier, off show, about the explosion of digital twins, and how all the big companies and industries are now creating these 3D models. That's a fantastic example of a third-generation use of those tools: when they get sophisticated enough and have the AI behind them, we can actually set them to advance all the values we care about. Just recently, Apple finally has these tools that let me watch my kid's use of the iPad. Wonderful. But I want those to be even more fine-grained, so that I know she's actually improving her exploratory capabilities, I'm improving her predictive capabilities, and I'm improving her understanding of the networks she's embedded in. I can actually craft those things if I have the right set of
tools. But there's this adaptation curve where things go down first, and I think we made a lot of very specific choices with that first wave of IT globalization to push all our stuff to China. Now we've had this COVID disruption, and China has gone much further toward surveillance values than we are comfortable with, so now we're looking at this new cold war perspective and saying, hey, we've got to reshore, we've got to strengthen internally. I'm looking back to the 1960s and saying, that's great, let's get that cycle going again, let's fix a lot of those things, and let's hopefully have a catalytic catastrophe, a catastrophe that strengthens us. Because it is a catastrophe for some people.

Or maybe we should use the word disruption, right? It's a forced change that some people do not like.

Yeah, but in every major disruption there are always big winners, if we use foresight in the correct way and we're seeing the systems that matter. And we started with evo-devo thinking because, like I said, there are actually values behind those three things: exploratory, protective, and network-tending values. All three of those values matter in life.

But John, listen, the folks who are listening right now are going: wait, wait, wait, these three people are talking about all this globe-spanning stuff, the supply chain and the evolution of the internet, and I can't do anything about that, because that stuff already happened. So let's bring it back to practical futurism. Offer for us, if you would, some thoughts about practical foresight. What are some steps that the person listening today can take to improve their own skills at foresight?

Sure. Well, one of the things we can do, as you were saying, is ask ourselves, about whatever we're talking and thinking about: how impactful is that to my life? Everything we just said could be valuable if you are in the market, and I recommend everybody here be in the market. Everybody should have some equity. People in Australia retire richer than Americans, because they are nudged into the market through their superannuation funds, whereas for Americans it's a free choice, and many of us don't invest as much as we should. But there are fantastic disruptions...

You mean like investing in the stock market?

That's right, everybody. Even though right now it's like we've crashed, 25 percent of the value has been erased, now's the time to invest in the stock market, in crypto. Reversion to the mean, man; that's Warren Buffett's number one strategy.

Yeah, amen. Hey John, I want to go back to one element of this. You talked a lot about the historical development of these networks, and some of them obviously are biological, natural networks, while some of them are human-led. What about information retention and loss in those networks? Let's just look at things like the Egyptians' ability to build pyramids: we lost the information about how that was possible. So how do these networks correlate with human learning and evolution, in terms of retention of information and techniques?

Well, I think you're getting to one of the really interesting questions of network tending, which is memory. How do we have a memory of the past? There are three fundamental orientations we have mentally: some of us are past-oriented, some of us are present-oriented, and some are future-oriented. We need all three, and memory of the past is critical to being better at understanding the future and at being oriented to the present, this practical foresight that Robert keeps trying to get us focused on. What are we doing today, in the present, with this future thinking? That's the key question.

So there's a book called The Guardian of All Things by Michael Malone (this is in my book), and the subtitle is The Epic Story of Human Memory. It's all about how major advances in human culture and organization happened when we came up with new tools: oral culture, written language, books, recorded media, and now these computational algorithms and deep learning machines. AIs can be thought of as actual kinds of memory. A trained AI that's been trained up for a particular pattern recognition is an example of technological memory and adaptation for that thing, and we're just building all of these. Your listeners may have heard of GitHub, the single largest code base on the planet. It's the Facebook for coders; it's all freely shared. It's probably 65 million pieces of code, with about 30 million programmers around the world using it. That's a massive collective memory, a planetary memory: any kid in a dorm room can pull that code down and do something interesting with it. And about three percent of it is this AI, this deep learning code, that's actually being gardened. It's not actually being engineered anymore; it's gardened. We pull the stuff down and we try to train it, like a parent trains a kid: oh, I didn't train it very well, okay, I've got to try a different data set. It's amazing that these natural features, these evo-devo features, are actually coming now into our computers. I'm actually writing a Substack on that; I'll put it in the show notes. It's called Natural Alignment, a series of eight posts about the future of AI: is it going to become this biomimicry future? I think it is. But maybe we should get back to this practical foresight, Robert.
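John's image of "gardening" deep-learning code rather than engineering it (train it, check how it turned out, swap the data set and try again) can be sketched in a few lines. This is a hedged illustration, not anything from his book or from real GitHub code: the toy `train_perceptron`, `accuracy`, and `garden` functions and both data sets are invented for the example.

```python
def train_perceptron(data, epochs=50, lr=0.1):
    """Train a single perceptron on (features, label) pairs; labels are 0/1."""
    w = [0.0] * len(data[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def accuracy(model, data):
    """Fraction of examples the trained model labels correctly."""
    w, b = model
    hits = sum(
        (1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0) == y
        for x, y in data
    )
    return hits / len(data)

def garden(datasets, target=0.95):
    """'Gardening': don't edit the model by hand; keep trying data sets
    until the grown model behaves well enough."""
    for name, data in datasets:
        model = train_perceptron(data)
        score = accuracy(model, data)
        print(f"{name}: accuracy {score:.2f}")
        if score >= target:
            return name, model
    return None, None

# A contradictory data set (same input, both labels) vs. a clean one (AND).
noisy = [([0, 0], 1), ([0, 0], 0), ([1, 1], 1), ([1, 1], 0)]
clean = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
name, model = garden([("noisy", noisy), ("clean", clean)])
print("kept:", name)
```

The point of the sketch is the loop in `garden`: the gardener never adjusts the model's weights directly, only the data it grows from, which is the "try a different data set" move John describes.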

So one practical thing is this big-picture thinking we're talking about. You can use it in how you invest, and it is a very valuable thing over your lifespan to think that way. There is some value to this big-picture thinking in how we vote, of course; it's helpful for that too, and in how we vote with our dollars, what we spend on, and what creates the greatest good as we see it. So having our values first, and trying to have them be adaptive, is probably the fundamental thing to think about. Realize your values are your North Star. You may be stuck in a culture or in an organization that doesn't have values you share, but you can tend to your own values, and create your team values, to be adaptive and to care about your network. And we didn't say it yet, but the most fundamental values for network tending are empathy and ethics, and those are the top topics of the last 20 years, right? The Gen Z kids are all about fairness, and I think they over-apply it in some dangerous ways, which I know you get into in your Technosocialism writing, Brett, but it's a huge issue:

How do I model the other, right? How do I act in a way that improves the adaptiveness of the network as a whole, and what self-sacrifice might I need to make to do it?

So John, let's take a quick break; that'll give us a chance to get some face time for our sponsors. After we come back, what I'd like to talk about is that aspect of adaptability you spoke of: how society at large adapts, how we as individuals can adapt, and how active we should be in responding to these foresights. You're listening to The Futurists podcast with myself, Brett King, and Rob Tercek. Just before we go to break: we've been talking to John Smart. He's a global futurist, foresight consultant, and entrepreneur, and he's the CEO of Foresight University. We'll be right back after this quick break.

Welcome to Breaking Banks, the number one global fintech radio show and podcast. I'm Brett King, and I'm Jason Henrichs. Every week since 2013 we've explored the personalities, startups, innovators, and industry players driving disruption in financial services, from incumbents to unicorns, and from cutting-edge technology to the people using it to help create a more innovative, inclusive, and healthy financial future. I'm JP Nicols, and this is Breaking Banks.
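The supply-chain fragility discussed before the break (lean, time-optimized chains losing resilience) can be made concrete with a toy inventory simulation. All numbers here are hypothetical, and the model is a deliberately crude sketch: one node, fixed daily demand, an order-up-to restocking policy, and random disruptions that block restocking.

```python
import random

def simulate(buffer_stock, days=365, demand=10, disruption_chance=0.05, seed=42):
    """Toy single-node supply chain with an order-up-to policy: on normal days
    inventory is restocked to buffer_stock + demand; a disruption blocks that
    day's restocking. Returns total unmet demand (lost sales) over the horizon."""
    rng = random.Random(seed)
    stock = buffer_stock + demand
    lost = 0
    for _ in range(days):
        served = min(stock, demand)   # sell what we have, up to daily demand
        lost += demand - served       # unmet demand is lost for good
        stock -= served
        if rng.random() > disruption_chance:  # restock unless disrupted
            stock = buffer_stock + demand
    return lost

lean = simulate(buffer_stock=0)        # optimized purely for efficiency: no slack
buffered = simulate(buffer_stock=30)   # three days of demand held as a buffer
print("lost sales, lean chain:    ", lean)
print("lost sales, buffered chain:", buffered)
```

With the buffer, a single missed shipment is absorbed by the safety stock, and sales are lost only when disruptions hit several days in a row; the lean chain loses sales on every disruption. The cost of that resilience is carrying 30 extra units every day, which is exactly the efficiency-versus-resilience trade-off described in the conversation.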

Welcome back to The Futurists. I am your host Brett King, along with Rob Tercek, and we're talking to John Smart, a foresight consultant and futurist. You said "a foresighter," John; is that how you prefer to be called?

Yeah, foresight professional, or foresighter. You have insurers and officers; there are lots of "-er" words. A foresighter is somebody who's paid or tasked to look to and analyze the future for others. If you have any client, even a single client, if you're a science fiction author writing for an audience, if you're doing something for your family, you're a foresighter in that capacity.

Yeah. So let's talk about the disruption aspect a little bit, and this speaks to the adaptability of humanity. When you look at the most disruptive innovations throughout history, the invention of the telephone, electricity, the telegraph lines and networks, the steam engine, the combustion engine, and now electric vehicles and their implications, do they share certain characteristics, so that we're then able to identify emerging disruptors in a better way?

Yeah. Steven Johnson, who wrote Future Perfect and several other great books, the technology historian, the technology scholar, talks about the adjacent possible, the convergence of multiple trends, right? COVID was a trend that was anticipated, and it converged with all these new capabilities that we had on the software side, and it was pretty clear to some people. Ray Dalio gave his team a book on the 1918 Spanish flu within the first month of COVID, and he said, I want you to think through how society dealt with this, because this is another obvious, similar example. So a lot of those lessons were there in history. The disruptions, you know, are forced changes that some people don't want, but they're opportunities for others, and they do involve this convergence of trends. Robert and I were talking off-pod about his fantastic book Vaporized, which talked about dematerialization, and you were saying you need to write a follow-on about the specifics of how governments and other powerful actors can use these tools, social media for example, and how they can be weaponized. There's an adaptation curve there: when these things first come out, things can go really badly with them, and we need to anticipate that as well. And Cambridge Analytica, for instance: we actually did a futures piece called The Future of Facebook, and one of our people who recorded the videos (they're online) anticipated this whole weaponization, the Cambridge Analytica thing, the whole fake-news thing, how bad it could go if there was nobody at the top whose job was to be a content moderator, and if there were no fairness-and-balance rules, which all got dismantled in the 80s, right? With these particular trends you can see that something's going to come, and then I think you can do a lot to respond to it. At the very least, you can let people know that these are problems, and even if you can't change them, you and your team can adapt better to them.

So why is it that, with all that planning we had around pandemic response, at the end of the day, particularly in the United States, but you could even say China now with its zero-COVID policy, why did we adapt so badly to the pandemic, even though we had that foresight and we had plans mapped out for how we could tackle it?

Well, so, we're getting to some of the most interesting, juicy issues of what's called futures in
our field so strategic foresight we didn’t say this at the beginning but you can get a degree in it there’s 27 places around the world now where you can get a master’s or a phd the oldest is the university of houston started in 1975 and strategic foresight like i said is anything you do prior to strategy and you can do it at six levels you can do it for yourself you can do it for teams do it for organizations you can do it with respect to the future of societies of the planet and you can do it at what’s called the universal or the science and systems level right where’s the whole system the whole universe going so global would be all the issues that we have to globally agree on like transnational crime climate climate everything yeah and traditionally our field breaks those six those are called the six domains traditionally our field breaks the first three into foresight and the last three into futures right and what we really to be honest with ourselves in the last three those systems are so big and so complex what we’re really doing is we’re trading stories with each other and trying to make them as well critiqued as possible so what we’ve been doing throughout this interview is we’ve been we started with foresight some of the foresight tools and those three those first three individual you know personal team and organizational they’re the they’re the um what my what my first book introduction to foresight our first book is about we have a second book coming out called big picture foresight it really should be called big picture futures and it’s all about these big picture things that we’re having fun talking about now so for our readers or our listeners should know that you know we are going to when we think about the future we’re going to naturally drift back and forth between this foresight space and the tools we use and the future space and the stories we share what we’re doing right now is we’re trading stories so with that whole uh preamble i’m going to 
give you my story model for why we responded to COVID the way we did. For me, polarization and plutocracy are the two big system changes that our grandparents didn't have and that we have now. Look at American history: we had a very strong period of polarization in the Civil War, and we recovered from that; we had a very strong period of plutocracy in the Gilded Age of the 1890s, and we recovered from that, going back to a much more equitable, strong middle class by about the '60s. What's happened since the '60s, in the very simple story I'm sharing here, is that we've had growth of both polarization and plutocracy, slow at first, but now accelerating, and now we're seeing a peak of it; a lot of people are writing books about these issues. So I would say that when we talk about societal futures, the reason COVID was so difficult for us to respond to in a coordinated way, and less difficult in the non-democratic countries (though we can argue whether that was effective), is that we are dealing right now with high levels of both of those two issues, and we have to ask ourselves how we're going to get out of them. I know, Brett, you've talked about universal basic income as one possible answer; Andrew Yang, of course, ran on UBI as a platform. Yeah, he makes some pretty good points, Yang does. Do you believe it would get us out of our current high levels of polarization and plutocracy? Well, not necessarily; it could act as a permanent stratification of economic classes. Interesting. The big problem you've got is that, as you know, we have seen a dramatic retooling of the market. The largest, most profitable companies today in terms of market cap and share price are companies that have already introduced high levels of automation. If you look at
Google, Apple, and Facebook compared to the biggest companies of the '60s and '70s, they employ far fewer people to produce the same economic return and profits than those blue-chip companies of the past, and that's only going to accelerate with the adoption of AI. Here's the problem: when we talk about truck drivers learning to code and things like that, the reality is that, historically, our ability to actually retrain people like that is not very good. So you're going to need a social safety net; otherwise the alternative is revolution due to high levels of techno-unemployment. I don't really see any other outcomes. The one thing you can do is allow UBI to create entrepreneurship and community-based activities that we can't currently support. In Technosocialism, one of the really useful talking points on UBI is that, across the global studies, we see that people in UBI trials create their own businesses at three or four times the rate of the general population, because they have the freedom to. So that's the one area where we could really see humanity evolve around the things people are passionate about; once you don't have to worry about putting food on the table, it gives you a lot of time and energy to focus on new things. Well said. I would say they would both evolve and develop; they would protect the things they care most about more and better, and they'd have more evolutionary freedom to create things that don't exist, that are just beautiful and wonderful. Yang's The War on Normal People has a beautiful passage about what small towns could be like if everybody had that UBI, and the economies and local creativity it would support. I can see that vision. I think I
do think it's inevitable; the timing, of course, is the key question. And it'll be different in different cultures, won't it, because they all have different values, so America might wait and watch it happen elsewhere first. Well, this is democracy, right? I actually think that if you look at China and you look at Europe, they're not going to have an issue with it, and the transition will be fairly straightforward. The US, I think, is going to go kicking and screaming; it will get to the point of a near-crisis period, or it might actually get to some form of mass class disruption before it happens. There's a scenario where the Europeans can't afford to continue to provide this lavish social safety net; that's a fear in some places already. Yeah, and there's another scenario as well, one the Chinese central government is very concerned about, which is that China fractures from the middle. As you know, the wealth is not equally divided, and automation is going to hit China as hard as or harder than it hits the United States. You're going to have a lot of people who were uprooted from the central lands where their families were, out on the periphery in the coastal cities where they're rootless, and they're going to be out of a job. This is going to unfold in the next 10 or 15 years; it's not too far down the road. So it's not entirely clear that either of those systems is going to be nimble enough or resilient enough to respond in a timely way. One of the things I've noticed, John, is that we're not great at allocating resources toward the stuff that matters most to people. If you think about the professionals in the United States who get the rewards from the network activities, the automation, the technology, the AI, and all that, those folks are an increasingly smaller group. Yeah, that's right. And there are groups
that don't feel beholden, that have no empathy for the people who are displaced. And then there are the people who deal with the people who are displaced: teachers, nurses, social workers, the people who give care to the elderly. These are the jobs that pay the least but are in some respects the most important for quality of life for the most people, and it would be really interesting to see some kind of mechanism to redirect wealth toward the people who provide care to other humans, since it looks like robots are going to be doing a lot of the producing, and ultimately a lot of the optimizing of the distribution of goods throughout the economy, and ultimately they might do services as well. Yeah, and we have a big issue on the timing of those things too, so that is an uncertainty. It's quite possible that the difficulty of resolving those final issues that make them safe enough to be used with humans, and their ability to handle complex things like logic and common-sense reasoning, which we're trying to put into them now, could take a lot longer, and we might actually slow them down. This is really key, John: grassroots education in adaptability has to be central. I use the Socrates and Plato ship-of-state analogy in Technosocialism, and this is something that George Washington, for example, was a big fan of, as was Alexander Hamilton: the idea that a core public education system was needed for proper governance and for people to be adaptable. We've attacked the quality of the education system in the US, driving it to the lowest common denominator from a cost perspective, and that doesn't appear to have been particularly successful, because a lot of the polarization you're talking about comes from things like cognitive bias and the Dunning-Kruger effect, which could theoretically be helped with education, right? Yes, although I do think a lot of it comes from plutocracy, where you get so much concentration at the top that you start dismantling the systems that kept the middle class strong and created reasonably fair competition, with what Schumpeter called creative destruction inside every industry that matters. So I'd like to share an idea about the future; we're still on the futures side, and this is from our second book, Big Picture Foresight, which comes out next year, so it's not out yet, but I touch on it in our Natural Alignment Substack, which will be in the show notes. It's called the personal AI, and I'm curious how both of you think it will impact the nature of our democracy and our planet over the next 30 years. So as a
futurist, I'd love to ask anyone I can: what are you most worried about, and what are you most optimistic about, over the next 30 years, for yourself, for your society, and for the planet? It's a great question because you'll get so many answers. We do that every time on the show, the same thing, because futurists tend to be quite optimistic creatures. We're in a hurry to get to the future. Why? Because we see the benefits of these things. Entrepreneurs as well. So John, let me turn that back on you: what are you most excited about over the next 30 years? Well, I'm both optimistic about it and concerned about the way it could be misused, and it is AI, and the specific way it spreads into society. What we have seen over the last 10 years has been nothing short of astounding in terms of how fast AI has improved after decades of not improving, and the way it did it is by copying key aspects of how our own brain works; that's called neuro-inspired design. A deep neural network is an artificial neural network that has copied key aspects of how the brain works, going back to perceptrons. The reason DeepMind's AlphaGo and AlphaGo Zero were so good is that they copied some key ways dopamine works in the human brain to distribute reward based on the predictions your brain makes about what's going to happen next. There's a policy network and a value network in all these AI minds, and the ones that do reinforcement learning have an actual emotional, intuitive model of what looks promising and plausible and what does not, and they focus their logic on the things that intuitively, as a gut feeling, seem correct. So we have AIs that are starting to build some deeper bio-neuromimicry, and I'm arguing that they're
going to develop biomimicry as well: an analog to the way genes unfold neural networks in living systems is going to come into these AIs over the next 20 to 30 years. But here's the most interesting thing for me. Yes, the AIs are going to be big, they're going to be disruptive, they're going to become more and more like us, and there's some kind of human-machine merger coming in the long term, as some of the transhumanists talk about. But how democratized is that going to be? What is it going to be like when your cell phone has a model of your values and your goals and your intentions? I told you about GitHub, but I didn't tell you that all the top companies, Google, Facebook, Microsoft, Baidu, put their best AI tools, their deep learning tools, up on GitHub for any kid to play with. Why do they do that? Because they know that the 50 million coders on GitHub are a development environment that Google's 50 thousand coders can't hold a candle to. So the network is already winning in AI and technology. Networks always win, as I told you; the network is already the most powerful thing over any individual or any group. That's a consistent message we're getting. We had Brad Templeton on a couple of weeks ago, and Brad talked about the keynes and the stewards, and again his message was that, as we've seen over the last 300 years (you could go back further), technology always wins. So I think there's an element of the foresight and futures work where we can tend to spend a lot of time debating, for example, whether AI is going to be good or bad, but based on that history we'd be far better off asking how we transition AI into society. And we're not actually very good at that, are we? No, but we're giving ourselves permission to have these conversations. It's platforms like this, conversation spaces like this, where we will get better at developing those aspirational futures. We didn't talk about a lot of the practical tools; they're in this book, the practical futurism, so maybe we'll do that on another pod. But we learn, we see, we do, and we review: that's called the do loop, and that's the fundamental way we look ahead. Foresight is only one of those steps. First we have to learn about the present and past, then we have to look ahead, then we do (that's called heads-down: we're not looking at anything else, we're just getting something done), and then we review. That do loop is fundamental to
how all complex systems adapt. Foresight is just one piece; we have to act, and that's how we get that adaptiveness. So we're giving ourselves permission to discuss this whole question of how we adapt to AI. One thing that has not been discussed much in the conversation space is that those AI tools start expensive and sit at the top, where the powerful actors use them, and eventually they democratize, like I said the base tools already have. I would like to ask our listeners to think about what it's going to be like when you are training an AI that has a model of your values and of what you're going to do next, and you're training that model by poking and swiping and talking to it, saying more of this, less of that. That model is yours; it's sitting on an encrypted cloud, just like your texts and your email. For the very first time in the history of the information revolution, you are going to have a data model and an AI that can nudge you better than the marketers can. From the very beginning they've been able to trade all the best data, and now we're at the bottom of what I would call an adaptive valley, where there's maximum ability for them to micro-target and nudge and manipulate you, and minimum ability for you to respond. You can't ban certain types of ads forever on YouTube or Facebook or wherever. When you have that PAI, the personal AI, that's what it will do: it will sit as a personal OS. I recommend everyone here see the movie Her, which is probably the only good and entertaining movie (it's not great, but it's good) about a personal OS, a personal AI, and the interface, and how that will change the way you interact with the world. When you have this thing that you've trained up, it's recommending what you buy, what you watch, what you read, who you connect with; it's smart enough to know the six other people who have put public information up on their LinkedIn, or wherever, saying they're interested in starting the same business as you, and it'll throw down a possible profit-sharing partner. AI-based tribalism. Yes, and think of the activism that will come from that, where groups will get together around consensus, and consensus building too, right? Yeah, we could use it for real-time democracy. So what is that doing? Well, it's empowering the network, and it's empowering
individuals, yes. But it's also empowering the whole network, and there will be all kinds of interesting selection for adaptiveness. There will be maladaptive ways you can use those tools, retreating into filter bubbles even further than we do today, but there will be people who skate to the center. As we all know, the best managers, marketers, politicians, and strategists have to be able to think with people who think differently from them and find that subset of common values they share, even though they don't share other values. So everything goes back to values, right? There are these universal values, the three I mentioned to you, exploring, predicting, and tending to your network; those are the three most obvious ones, but there are others, and you can explode each of those. So how do we talk to each other in a common language of things we care about? I think we're going to be doing that with those kinds of tools, and they're going to be educating us, back to your point about education. My friend Thomas Frey, who runs the DaVinci Institute (you might want to get him on), talks about teacherless education: what happens in a world where your personal AI is actually more of a lifelong educator for you about how the world works, protecting your values, than all the physical groups you're part of? So I think the world's going to continue to get more interesting and complex. Foresight is important; it's your superpower, and at the very least the message everyone can take home is that by having these conversations we get better. That's pretty positive; that's a good way to finish it off. Thanks so much, John. Yeah, I know we need to wrap this up, so where can people find out about you? You mentioned you've got a new book coming out; is it ready for prime time? Where can we find out more
information about your books and your work as a foresight consultant? Well, thank you. Foresight University, foresightu.com, is our web presence, and there's a newsletter there, All Things Future, that you can sign up for. We're going to be running a conference on the West Coast, in Detroit, and on the East Coast next year, and we would love some of you folks to come to that if you're interested. At the very least, see what you think of the newsletter, and see what you think of this idea that foresight is your top superpower, one you can get better at just by giving yourself permission to have these conversations. And of course my book, Introduction to Foresight, is on Amazon; you can get it there if you're interested. I guess that's it. Well, John, thank you so very much for joining us; those were some great insights, and thanks for the practical suggestions too, I'm always keen to hear those. That's it for another week of The Futurists. If you like the show, don't forget to tweet it out or post it on your favorite social media, and leave us a five-star review on iTunes, Google Podcasts, Facebook, or wherever you listen to our show. This episode was produced by our US-based production team, including producer Elizabeth Severance and audio engineer Kevin Hersham, with support from our social media team, including Carlo Navarro and Sylvie Johnson. But the main thing you can do is tune in to The Futurists every week. We're trying to talk about the future, we're trying to explore this, but we're only going to get to the future together, and on that point, we will see you in the future. Well, that's it for The Futurists this week. If you liked the show, and we sure hope you did, please subscribe and share it with people in your community, and don't forget to leave us a five-star review; that really helps other people find the show. And you can ping
us anytime on Instagram and Twitter at @futuristpodcast with the folks you'd like to see on the show or the questions you'd like us to ask. Thanks for joining, and as always, we'll see you in the future.