“In the absence of trust, people will disengage. No amount of technology or process can replace the human need for belief and belonging.”
-Julian Stodd
This episode is a must-listen for those looking to expand their ideas about leadership, technology, and the psychology of work!
In this episode of “Psych Tech @ Work,” I welcome Julian Stodd, founder of Sea Salt Learning and prolific author whose work challenges conventional views on leadership, work, and technology by looking at them through new paradigms. Our conversation was refreshing because Julian offered thoughtful and noteworthy reframing of my ideas about psychology in the workplace.
In our conversation Julian reframes the way we think about the evolving workplace, focusing on the shift from traditional hierarchical structures to social leadership and distributed power. His perspectives offer a compelling critique of the conventional social contract between organizations and employees, arguing that this contract has been fundamentally broken by modern work dynamics.
Julian highlights the role of technology, particularly AI, as a tool that both amplifies and disrupts traditional power dynamics within organizations. He also emphasizes the need for organizations to adapt, not just by automating tasks but by fostering trust, belief, and a sense of belonging to truly engage employees. This conversation is a powerful reminder that organizations must evolve beyond transactional relationships and embrace the social elements that drive real engagement and productivity.
Throughout the episode, Julian emphasized the critical balance between formal organizational structures and the informal, social dynamics that truly drive engagement and innovation. He suggested that many organizations over-rely on formal systems—hierarchies, policies, and processes—while underestimating the power of social structures, such as networks of trust, influence, and collaboration.
Julian repeatedly showed me a new way to look at my ideas about the psychology of the workplace. For instance:
The Traditional Leadership Model:
- I mention: I spoke about the role of leaders in providing clear direction and overseeing teams to ensure productivity.
- Julian’s Reframe: Julian challenged this conventional view, arguing that leadership in the modern era should be about social authority rather than formal hierarchy. He explained that leadership is increasingly granted by the community based on trust and belief, not just a title or position, and that successful leaders must earn this trust by engaging meaningfully with their teams.
The Psychological Contract Between Employers and Employees:
- I mention: The importance of the traditional idea of the psychological contract as a way to understand the employer-employee relationship, where employees trade their time and skills for compensation and job security.
- Julian’s Reframe: Julian reframed this concept by stating that the traditional social contract has been broken. He argued that organizations need to move beyond seeing employees as resources or commodities. Instead, they should recognize that people are investing their belief and engagement in return for trust, growth, and belonging. Julian emphasized that organizations must rethink this relationship to thrive in the modern, social age.
The Importance of Individual Differences:
- I mention: The importance of the concept of individual differences, focusing on the traditional scientific approach that measures and assesses traits like personality, intelligence, and abilities to predict workplace behaviors and outcomes.
- Julian’s Reframe: Julian suggested a shift away from solely focusing on the science of individual differences as fixed traits to considering individuals as dynamic and adaptable within social systems. He argued that while the science of individual differences is important, it often overlooks the relational and contextual factors that influence behavior. Julian emphasized that people are shaped by their environment, relationships, and the social dynamics they are part of. He proposed that instead of just measuring traits in isolation, organizations should focus on understanding how individuals interact with the system around them and how they can grow and evolve within that system.
“Take it or Leave it”: Julian and I give our opinions on two articles that suggest a declining need for human workers in the workplace of the future.
- “AI-Powered Companies with No Employees Coming Soon”
- Summary: This article from Sifted discusses the rise of fully autonomous companies that leverage AI agents to run without employees. While the article presents a futuristic view of e-commerce and automation, Julian and Charles critique its narrow focus and overhyped narrative, emphasizing that while AI may automate some functions, people will remain central to innovation and decision-making.
- “The Rise of Autonomous Enterprises”
- Summary: This article from HFS explores how data-driven decision-making and automation are shaping the future of organizations. Julian and Charles discuss the potential for organizations to become more efficient through data governance and AI, but caution that organizations still need human judgment and emotional engagement to make meaningful decisions.
Full transcript:
Charles: Hello, everybody. Welcome to the latest edition of the Psych Tech @ Work podcast. I am your host, Dr. Charles Handler, and I am here with another amazing guest, Julian Stodd of Sea Salt Learning. Julian is the author of over twenty books and brings to this show, and to the world, an amazing viewpoint on things related to people, organizations, and technology. Today, Julian is gonna answer some questions for myself, and ones that I think you're gonna wanna hear about too.
What do we actually mean by social organizations, and what is the role of social leadership in facilitating the design of organizations? What kind of paradigm shifts are gonna be required as tech and work evolve? Julian's got a great take on these things. Also, we're gonna talk about what people really want from work versus what organizations want from people, and how these things are changing.
What paradigm shifts in how organizations configure themselves, visualize themselves, and execute are needed to keep workers engaged and motivated? And what's the role of the psychology of individual differences in all of this? Right? We have some differing views that actually are the same from a different viewpoint.
I think it's really interesting. And as the role of technology in running organizations evolves, will organizations ever really be able to run themselves and be successful? What's the role of people gonna be far off in the future? So we're gonna answer all these questions through an amazing, stimulating conversation. So without further ado, here is my guest, Julian Stodd. Welcome. Welcome to the show, Julian. How are you?
Julian: I’m very well. Thanks. Thanks for having me along.
Charles: Yeah. And where are we speaking to you from?
Julian: So I’m down in Bournemouth on the south coast in the UK.
Charles: Nice. So are you on the beach? On the water?
Julian: Yeah, I'm about a ten-minute walk from the beach. So I get to head out in the mornings. We did the school run cycling along the promenade in the beautiful sunshine this morning.
Charles: Oh, yeah.
Julian: It’s a nice part of the world.
Charles: Yeah. Yeah. I walk my kid to school. We're on a lake, but there's a big levee, so you can't really see it. But, yeah, being close to water is always a very enriching thing for the mind and soul, to be able to just kinda look out and wonder what's going on out there.
Right? It's pretty amazing stuff. Well, I'm really glad to have you on the show today. I was thinking about it, and I feel like the theme is "and now for something completely different."
I don't know how much of a Monty Python fan you are. I don't wanna stereotype that all Brits are Monty Python fans. I certainly am. Right. So
Julian: You can't grow up in the UK and be my age without having grown up with Monty Python.
Charles: And you were, I can't recall, do you work for the Ministry of Funny Walks? Or are you in the cheese business? Like, I can't remember.
Julian: Oh, nice. Yeah. Yeah. More towards the funny walks, I think.
Charles: Yeah. Yeah. I know. Man, oh, man. That stuff’s great.
So good stuff. Well, look, I wanna share with our audience that you are a little bit different than a lot of our guests, but nonetheless important. In fact, maybe even more so, because you look at things from a very different perspective, but at the same time, there's a lot of psychology, there's a lot of workplace, and of course, a lot of technology.
I know you've written, I think, what, twenty-one books. So you're a prolific writer. I get your daily blogging and everything. So, man, as a writer myself, I wish it flowed that freely for me. But why don't you, before we even launch in, and it's probably a good transition, tell us a little bit about whatever book you feel you're most proud of? That might be hard with twenty-one. I've read the AI one, the curious book about generative AI. I probably forgot the title specifically, but that was a really cool book. Talk about that one, or any other one that you think our audience, people who work in organizations to help make things better for both the company and the people, would wanna know about.
Julian: Oh, so you see, you started with a tough question, because the important thing is you shouldn't mistake being prolific for being high quality. And I'm kind of okay with that. It's embedded in my craft that my work is adaptive. It's evolutionary. You know, sometimes there's a moment that I think is brilliant. And then, you know, a year later, I think, well, it was okay. And three years later, I think, oh, did I really write that? So some books are very much of their time. You know, the book I wrote on mobile learning is probably not what I'd put at the front of the bookshelf now. And some of them, like the book on community building, had a sort of huge audience, and I'm really proud of that. But I think my favorite book is probably my least successful if you measure it by number of copies out in the world. And that's the book I wrote on the fiftieth anniversary of the Apollo missions. It's called To the Moon and Back: Leadership
Charles: Yeah.
Julian: Reflections from Apollo. And I love that one partly because it let me say to my son that my job was drawing spaceships, and he liked that. So yeah, I got to draw lots of spaceships. But it's really about some of the most challenging elements of my work, considering failure, complexity, and the ways that we seek to control it.
So it's dressed up in a nice way, but in terms of my own thinking, it was a pretty good book to write. So I kinda like that one. And, of course, you mentioned Engines of Engagement, which is, there you are, got it in there. It's great. I wrote that book with Geoff Stead and Dr.
Sae Schatz. Yeah, two co-authors. And I just have to say it was a joy from start to finish, which is quite unusual for me. I normally don't really enjoy writing with other people. I love the company of other people, but I find writing with other people really difficult. And that book just kind of came out of our, you know, our friendship, but also our challenge to each other. And it's been a real highlight of writing for me to get that book out in the world. And it is a beautiful book. You know,
Charles: It is. Yeah. And Sae is how I met you. I'm gonna nail her down for this show as well at some point. I saw her give a keynote at a conference I was at. I had a bunch of ideas floating around in my head kind of loosely, and with her talk, I kinda had an epiphany moment where a lot of things connected for me.
So I thought that was really powerful, and, you know, I kinda reached out to her after that. I feel like it is kind of a happy, positive read, and it reads pretty quickly, with some nice little pictures. So for the audience, could you summarize that book quickly, so they can feel compelled to go out there and get it and receive the message?
Julian: My goodness. If I'd known you were gonna ask me about the content of the book, I'd have chosen one of my books that I'm really confident and sure about. Wow. But that book, you know, Engines of Engagement, it's a book of uncertainty and curiosity. So I'll do my best.
We'll...
Charles: You don't have to get too deep.
Julian: Two sentences. Yeah.
Charles: However you want, but you don't have to get too deep on it.
Julian: It's about the fracture and evolution of things that we are certain about. And I think it's a book of confidence, but a confidence in our uncertainty. It's a book about our confidence that we will only survive this change if we work together, if we remain curious, and if we sort of continuously learn. So I kind of love that about it. We describe it as a book to inform the debate, but it's not a book of answers.
Charles: Well, I mean, do we even have answers? We do in some sense, but when you think about just the technology and where it's bringing us, etcetera, it's hard to know the answers. I don't know if you follow Ethan Mollick very much. He's a pretty amazing writer about AI, both philosophically and practically.
And he and I have said the same thing in different ways: that if we never had any more advancement in generative AI, it would still feel miraculous to me what we can do every day. But he says the AI you're using today is the worst AI you'll ever use. Right? So from a pure AI standpoint, we're gonna be just dog-paddling in some sense to keep up; just when we think we might have one problem in hand, the technology can do new things. So it's exciting to me. I'm a humanist and very much a positive person.
So I feel like it's gonna be positive for us, but it just magnifies a lot of the dualities that technology has brought to begin with, from the perspective of organizations. So while you didn't talk about it specifically, I will continue to give you openings for your writings. Talking about social organizations, I think that's a really powerful topic for you. When you say social organization to people, I don't know if anybody even really calibrates to exactly the meaning you have. Right?
I mean, you've got, oh, it means people and companies, it means how, you know... Even when I put it with no context into, let's say, ChatGPT to do some more research, because I'm not afraid to say that I use those tools to do a little research for the show, it didn't exactly know what I was talking about until I was able to put your name next to it, and then it opened up all kinds of good stuff. But share with the audience, kind of philosophically, how you think about organizations, because we think about organizations and the people there all the time from a lot of different perspectives. What's that perspective?
I think it's an important foundation for us as we continue our conversation.
Julian: Yeah. Well, you know, my work is very diverse, but it is unified under one core idea, which is the notion of the social age. So, essentially, it says technology is brilliant. You know, we all love different gadgets and widgets and so on. But primarily, what I'm interested in is the evolving social context that they bring about.
So at that highest level, my work looks at how social and collaborative technologies make us radically connected beyond systems. It looks at how that impacts structures of power, how it enables systems of sense-making and curiosity. It looks at the evolution of learning and the notion of knowledge itself. It looks at a whole range of factors that sit at the top, and it says, okay, you can look at the context of the social age and say one of two things. You either say this work documents a series of aberrations, but essentially everything is how it has always been; it's interesting, it's quirky, it's peripheral, but thank you very much, we'll just carry on. Or, at some point, you look at the world and say, actually, everything is a little bit different, which means that the whole world is different; the context, the ecosystem, it's paradigmatic, everything has shifted. And I tend towards that view, and not to be dramatic, you know.
I mean, I don’t
Charles: Yeah.
Julian: Sort of position the work like that. It's intended to be held very lightly, but I do say that if the ecosystem that we and our organizations exist within has changed, then that will put unique pressures upon our organizations. And hence, we should look at how we adapt them. And that's really what my work explores. How will we reconceptualize the organization? How will we lead within it? How do we learn within it? How does it change as a social movement? So all of this causes us to look at the intersection of social systems and formal systems. That's where the interesting stuff happens: between the formal and the social, enabled by technology.
But essentially, it describes an evolution of the sociology, a different experience of being human.
Charles: Yeah. So you're a perfect guest for the show when I think about it, since my theme is psychology, technology, and the future of work. Right? You've brought all three of those elements in. And it is a difficult thing.
I think we always struggle with too much structure versus too little. So think about anarchy. Right? I mean, anarchy can't really, truly allow us to get anything done, I wouldn't think. There might be people who argue that with some very precise theories that I don't know about.
But you can't do that. And then, you know, over-bureaucracy and over-governance is also something that is quite difficult and paralyzing in a lot of ways as well. So it's all about balance. Being human and thriving is all about balance, you know. And this is another example of it.
And from a technology standpoint, is there a core tenet or foundation of your beliefs around how those things intersect, and how we work within shifting paradigms, that everybody needs to know about? If you're just thinking about the evolution of work and people, you know, paradigms are gonna change in a lot of ways. For me, just philosophically though, until the machines completely take over, you've got what's powerful to me: the psychological contract. That is the idea that the organization provides security and engagement of your interests, your drive, your motivations, fulfillment in your life from a psychological standpoint, but also economically, to take care of your family. And in the same vein, you're providing the organization with your effort and energy to help it achieve its goals, and it's a symbiotic type of relationship.
There's a lot of reasons why. So to me, that's immutable. But there's a lot of change around that, and I think that the way you keep that going might change. So talk about that a little bit.
Julian: Well, I'm a bit worried now, because you said it's immutable, and I would argue it's broken. So, you know, in my work, I chart a progression which says we went into the industrial age to build organizations that were entities of collectivism. In that time, we were digging stuff out of the ground, bringing it together, heating it up, making stuff, shipping it around the world, making complex financial systems to manage it. And in the process of doing so, we created these infrastructure organizations. The industrial organization came to be led by principles of scientific management. It came to be optimized through automation and production lines and so on, and methodologies of efficiency and effectiveness. And, you know, I'd kind of argue, well, we got through that with organizations spending the last forty years or so very deliberately fracturing the social contract. They came to treat people as things,
as commodities. They gained the last vestiges of their profit and efficiency by treating people as things, and then the context shifted around them. So if you look now, what most organizations want is exactly what you described. They want our discretionary investment. They want people to go the extra mile, to bring themselves in, at the very moment where we understand that career has gone.
So we've seen this incredible democratization of space. And a lot of what would have been, historically, the boundaries, the walls, the girders, the physical structure, a lot of it has shifted. Take infrastructure, for example. Infrastructure used to be big and expensive and owned, and it sat within a factory, or it was a network or a fax machine or a telephone, and, you know, you couldn't have it yourself.
You had to go to an organization to have it. But today, you can have an organization with no infrastructure. It will rent or access what it needs. But interestingly, you or I can access infrastructure without having an organization. So the relationship between infrastructure and
organization has gone.
The relationship between space and power is really important and very relevant to the conversation around generative AI. What many organizations failed to recognize when the pandemic arrived was that as people fled the office, they took their laptop and the pot plant with them. But what they also did, as a parting shot, was turn around and shatter the relationship between space and power. Because whilst the chief exec could sit in an office with a gatekeeper in front of it, when you're on Zoom or on Teams, everybody's box is the same size.
You know, there's no sliding scale that says, oh, you're important, let me make your box bigger and your voice louder. Everybody has the same space. The technology is inherently democratizing and connected. And that's never coming back; power has rebalanced. It's one of the core features of the social age.
Now, that's relevant because if you look at generative AI as a technology, you can also look at it as a dialogic technology: it is fully mobile and distributed. So it doesn't respect the boundaries of power that came before. So you see, again, yet more pressure
around what organizations are giving us. You know, they're giving us a dollar to do some work for a day, and that's okay.
That gets my utility. But what's gonna get my invested engagement is a different type of currency. It's gonna be a social currency. And, you know, my work is research-based; I don't just sort of sit there with a coffee and make it up. We see people describe this very clearly. We engage where we have trust and belief, where we feel we belong, where we hold pride, where we find validation. Now, not all of this is uniformly good, but these are nonetheless the forces that drive our social behavior.
So again, we see a new ecosystem emerging where people may be able to buy your time, but they need to earn your invested engagement. And we see newly emergent structures, which can be structures of belief and belonging, and indeed structures of productivity and effectiveness, none of which would fit the model of the traditional organization.
Charles: Yeah. Wow. Yes. So what is that? And I agree; I haven't thought about it as precisely as that, but, you know, I think there was a new kind of level of humanism that entered in during the pandemic, when we did have to distribute ourselves, and where organizations, it seemed like, really cared. I mean, there's always a little bit of how much lip service are you giving this just so you look good outwardly, and I think there are variations there. But so what is that currency then? Right? What is it that stimulates people in this world? In my notes, I just wrote down learning: the opportunity to learn and grow within the organization. Being nurtured as an actual asset and a person that the company cares about, I think, is important. There's an additional pressure, though, in democratization, in that people can jump ship pretty easily now, go somewhere else, or even open up their own business. The ability to build your own presence, even through no-code, low-code stuff, and do your own thing
has never been stronger. Even if it's not in the corporate world, you could knit doilies and sell them on eBay or on Etsy, and you could make a living. So how is that engagement working now? What are the tentacles, or maybe that's a bad way to see it, what are the connection points?
Julian: In a sense, it's a proliferation of spaces. There's been a general disaggregation of the structures that we historically sat within. So, you know, you went into, first of all, a school, maybe a university, then an organization, within a community, within a society that sat there globally. Maybe you'd also be part of an organized religion, a church that you belonged to.
These were kind of structures and entities of effectiveness, of productivity, of governance, and of belief. Now a lot of that has kind of fallen apart, exactly as you said. People describe their identity becoming more nuanced, more multilayered. They can be employed by an organization, but they could have an identity as a DJ. They could be an Etsy seller, they could be a blogger and a podcaster, or indeed, they could be many bloggers and many podcasters, some of which are publicly visible, others of which are hidden and obscured.
People describe clearly how they are active within the arms of many different communities, using a wide range of different technologies, the majority of which, the ones they trust the most, will be distributed and social. So what we see is the ecosystem kind of breaking down and spreading out. So whilst you would once have joined an organization, maybe on a graduate scheme, maybe got a promotion, maybe wasted another couple of years for small paychecks, and maybe after twenty-five years you'd have been getting towards the top, what will happen now is that you're likely to bounce between organizations.
You're likely to find opportunity through your network, but crucially, you're also likely to find development and insight and understanding, and indeed capability, within your network. So if somebody employs me today, if they're lucky, what they also get is access to my network and that insight and understanding, and maybe they get my loyalty, my trust, my belief. So if you
look generally, what we see is that people today want greater agency. They don't want anarchy, and they don't want endless freedom. But neither do they feel that they are purely one cog with, you know, zero tolerance for movement.
We see this again and again in the Landscape of Trust research. People describe that if the organization they worked for trusted them, they would experience that as freedom. And if you ask, what will you do with that freedom? Will you just sneak off to Starbucks? They say no.
The number one thing I will do is help other people be effective. You know, that's like the number one thing that they will do. It describes a whole host of stuff, which is really what organizations want. They will be invested. They talk about it with language you can measure as language of effectiveness. They want to do a good job, but they also want a little bit of space to breathe.
Charles: Yeah.
Julian: And you see that time and again in the research that looks at how people believe in their leaders and believe in the organizations they work for: what people correlate with belief is opportunity. So again, they want some space. And if you don't give it to me, I'll find it elsewhere. Because you can be absolutely sure there's a community out there somewhere that is just gonna have all the space in the world for me to cosplay, to talk about growing cactuses, to do anything that I wanna do. There's gonna be a space in the world I can immerse myself in.
Charles: Yeah, we have so much more access to that stuff. You know, one of the things that I study as a psychologist, and I'm interested in your take on this, is individual differences. Right? We know everybody's different. And when you think about what motivates people at work, you know, some people like a lot of structure, other people like less structure. Some people are actually happy to just come in, punch the clock, and move the same widget from box A to box B all day long. I think those people are outliers, but there's all kinds, all shapes, all sizes, all mental maps. And so as an organization, what I do, especially when you're looking to hire people, is understand what these people's individual differences are, in a reliable, accurate way that we can measure, and how that overlays with what we're looking for, with what we need for success.
So when I hear what you're saying, it definitely resonates with me. How would you handle the idea that some people want more or less of these things, so that we have a whole, and it's usually like a normal distribution of some sort? Right? How does that get accounted for in the giving of space and the giving of opportunity?
Julian: Well, you know, I'd almost flip it around, because I don't say this stuff just because I think people want to hear it. The challenge sits with organizations. Because of this ecosystem pressure, most of our organizations are trying to learn how to do new stuff. You know, there are very few organizations that say, at the moment, we make a widget that does a thing, and we strongly believe that in thirty years' time we're gonna be making exactly the same widget doing exactly the same thing.
And for every single one of those thirty years, people are gonna continue to pay us exactly what they've paid us. Almost no organization in the world is saying that. What they are saying is, we make widgets that do a thing and we've got to protect that revenue stream. We're really good at doing it and we've optimized it and we've screwed every last bit of profit out of it. But, oh my goodness, other people are starting to make widgets that people like. We need to invent new widgets; don't drop the old ball, but we've got to make the new one. And then they're saying, so now you're busy, and I need you to come up with great ideas, but I need you not to drop the ball that you're holding. You know? So they're trying to do these two things. They're trying to maintain the old and do the new, and they're finding that this lever, you know, you used to pull this lever and it used to do a thing, and I keep pulling the lever, and it does less and less of what it used to do. Because we are exhausted, because the organization is sub-optimized. So there was a time when you could say to me, you know, do the thing, or I'll have to let you go. But today, I know you're gonna let me go.
You know, nobody's gonna get through their whole career without experiencing that betrayal of trust from organizations. It's a given; in the UK, you're likely to have forty-two different job roles, I think, across whatever passes for a career.
Charles: Right.
Julian: And you can be pretty sure it's not gonna be a voluntary exit from all of them. So our trust in organizations is generally lower. That doesn't mean we don't like them, by the way. It doesn't mean we don't turn up to work and, you know, have a good day and have some fun with our friends and colleagues. But it means, if push comes to shove, if the question is, are you going to do the right thing by me, or are you going to do the right thing to make that quarterly report look good?
You know, the report's going to win out. People understand that. So at this time, when our trust in organizations is lower, we're gonna make different types of decisions. So that's why I say that the real challenge isn't so much about individuals, you know, like how one individual acts or behaves. It's about organizations, and that they're looking to build a different type of capability.
Charles: Yeah.
Julian: So even for the people who don't want the difficulty of navigating this new context, it will most likely be enforced upon them. If you wanna sit there and, you know, shuffle widgets all day, good for you, but that relies on the organization continuing to pay you to shuffle the widgets. And the odds are, you know, it's trying to automate or replace you. So yeah, I kind of look at it the other way around.
Charles: Right. Well, yeah, I love you flipping my thoughts inversely there, because it's really important to have, you know, different perspectives. And I love that. To me, what popped into my head is, okay:
well, then there sure is gonna be a big onus on leadership. Think about when people struggle at work. I don't know the stats; I repeat the idea all the time, and I hope I could back it up if I looked for some actual data, but the number one reason people leave a job is their manager, their direct boss, whatever. Right?
And that is probably, a lot of times, because that person isn't giving a lot of freedom. Right? We don't like that. But from a bigger perspective, what you just said really speaks to organizational leadership. I mean, you're not gonna be able to give people space in an organization unless leadership is creating the pathways, the structure, for that to work. And, you know, that goes on a lot of levels, and it usually trickles down, you would hope, from the overall ethos that leaders set for an organization. Those changes can be difficult, but we've seen many examples of companies shifting how they do things to be successful, as you talked about, changing the paradigm to get with the times. So what's your take on that? The role of leadership in these kinds of changes?
Julian: Yeah. So, you know, my most established body of work is on social leadership. It says you have formal power: when you sign up for the job, you get a contract, and that contract gives you a position within the hierarchy and a boundary of your power. And formal leadership is great for driving consistency and conformity, replicability, and scale, the four pillars of the modern organization. But alongside that, you have social leadership: leadership in the social context, or in fact social contexts, because there's more than one. You have one formal organization but many parallel, concurrent social ones. Now, social leadership is a reputation-based form of authority. If your formal leadership is given to you by the organization, in your contract, your social leadership is given to you by your community. And a large part of my work charts the foundations of that power and the mechanisms by which it is effective.
And interestingly, you can have somebody who has all the formal power in the world but no social authority. And you can have somebody with very low formal power. They’re almost insignificant in the organization, but they have high social authority. And in that work, what I say is you need both. You need formal and social leadership.
And sometimes it will be the same person and sometimes it won't. But when I talked about that exhaustion that sets in, it's really founded in the fact that organizations continue to operate in the formal domain. They're pulling those levers that used to work. But that's now half of what they need. A truly socially dynamic organization is not one thing or the other.
It's both. So this work says you need both. You still need to be a fantastic, effective formal organization with an HR department and technologies and rules and systems of process. I mean, you'd better make sure they're good, but we know how to do that stuff. But you also need to be a social organization.
You need to foster individual agency and interconnection, and the work on the socially dynamic organization explores all of that in some depth. But I should probably say, you know, this is not meant to be a story of unalloyed excitement and inclusivity, because I think the pressures of the social age will cause significant challenges for people. It is likely that, unless we take action to the contrary, it will enable some people to move ever further ahead, and it will actually broaden inequality. There's nothing innately good about the things that I describe. And, you know, my work is shared with a liberal bias.
So I’m very clear on that.
Charles: Yeah. Yeah. Yeah.
Julian: I say to people, if you like, you could strike out swathes of my work which carry a liberal bias, like the book on the humble leader, which talks about the importance of humility in leadership. But it doesn't say you should do this to be nice. It says there's a business imperative for doing it.
So you could read that book and completely dismiss the liberal bias, but still conclude that you actually need to be more humble in your approach.
Charles: Sure. Sure.
Julian: But this is almost our new responsibility. You know, the way I describe it is that for a long time, we had organizations that have grown on top of society, at the cost of society. And in many ways, what we now need is organizations which act in service of society. And that's a big shift.
Charles: It is, and it's sometimes very counter to capitalism. And in some sense, with AI, and I've been wanting to kind of transition to technology, it's a little bit the same in some ways. In my mind, it's like when you had the Industrial Revolution: pollution and the byproducts of the impact. Those things weren't as well known then, but think about how workers were treated; I mean, you're pretty close to slavery being a worldwide phenomenon. Right?
And so you really talk about people not having any say; the company looking at them as, you know, just hands or brains to make things. At that point, we didn't stop any of that, and capitalism just kept going. And even now we've reached a point where, as much as we like to try to have our different climate summits and agreements and everything, and there's a lot of people on board, thank goodness, the execution of that, when there's currency one could yield by looking the other way, a lot of times gets overridden. And with the AI stuff, I think it could be the same when we think about regulation. I guess what I'm talking about is regulation and stuff.
I don't necessarily want to open that can of worms here on the regulation side, but from the standpoint of just AI being injected into what you just talked about: we could not talk about technology at all, and I think it's still a very relevant, meaningful conversation. You add technology in there, and it's obviously going to, or maybe not, you tell me, make changes or highlight some things that need to be addressed as we play out what you're talking about. So tell us a little bit about your thoughts on the role of technology. My guess is you're going to say it could go either way; it could be helpful and hurtful. I think that's the only way I can look at it. But anyway.
Julian: Yeah. Well, I'll try to be helpful in my response, but with the caveat that I think it's healthy to maintain some uncertainty. There are a couple of specific features of these emergent technologies that we can see clearly. The first is that generative AI is a dialogic technology. And we talk about this in the book; you know, dialogue is something which historically was a very human feature.
You would find an expert and talk to them. And, you know, that dialogue was hence rare. It was of high value. And now it is essentially commoditized. So I can have a conversation with Claude which gives me access to the world's knowledge, but it doesn't just give me access; it gives me the ability to carry out sense-making.
I can carry out almost every dialogic and sense-making feature of learning, social learning, and collaborative learning by myself. Now, that's a really, really important thing. The commoditization of dialogue means that learning becomes much more fully distributed. In parallel to that, you could say organizations know what they know, but what they want is insight and understanding. Now, our historic models of learning have driven us towards consistency and conformity.
In my work on social learning, I say it is a mechanism of divergence. Social learning, irrespective of technology, creates divergent narratives. If you add in the amplifying feature of individual and collective dialogue with these new tools, you see a kind of radical diversification. So then your decision, if you like, becomes: do I want an organization which is able to conform, which comes together, is easily measurable, has this kind of codified strength? Or do I want an organization which holds much more diverse knowledge and insight and understanding, but possibly at the cost of some of its certainty and structural stability? And that's tricky, because the answer to that is usually, I want both. But an important aspect of change is to understand the role of sub-optimization. You have to become less efficient for the promise of future efficiency. I normally describe this by saying: if you have a Formula One-winning race team that wants to be a World Rally Championship-winning team, could they do it? Well, probably; they've got some pretty good drivers.
They’ve got some pretty good mechanics. They know how an engine works. You know, they can shift stuff around the world. Could they do it next year? Probably not.
You know, you've got to sub-optimize. You've got to stop building a Formula One-winning car so that you can learn how to build a World Rally Championship-winning car. Now that's tricky, because if you're trying to hold on to the pride and power and status of winning, you're never gonna be able to adapt into the new space. And that's how I would view it. It's a dance between individuals and groups of people, our ability to learn, our ability to let things go, to sub-optimize, to shift in more of a model of a social movement, and then to rebuild, but to rebuild on new terms.
Charles: Yeah. And, you know, I'm hugely into cars, so I feel good about this one: then you just go hire engineers and drivers and people who know the rally space, and you fund them, and you use a lot of the infrastructure you already have. Right? So it's about acquiring the talent that you need to be able to execute that, versus converting the people that you have, or some mix of that. Right?
And I think that’s why companies hire people to help support their change. Right?
Julian: It could be. I mean, I'll give you an example; I'm less certain of that. Which is to say, well, you know, you might be right. But then how come Apple hasn't managed to get a self-driving electric car? And how come Dyson haven't managed to get self-driving electric cars? Both of them are fantastic, particularly rich organizations, both of them worked for ten years or so to build that capability, both of them hired all those great people in. And yet somehow, almost inexplicably, they're not world leaders today. Now why would that be? Because it's, you know, it's people plus culture plus system.
Something about their culture, I would argue, has perhaps inhibited them from truly operationalizing those technologies.
Charles: Yeah. They might have just figured out it's really, really hard and it's away from their core business, etcetera. But you're right. And, you know, big tech companies do have a history of doing that too. Like, Amazon was gonna make their own phones and go down their own phone route, and they decided, no, we're not gonna do that. So there are strategic decisions that have massive impacts. And, yes, those are typically gonna be driven by the way the organization functions and what they're looking to do. So I think a really good way to tie tech into this, as we move through our show here, is to do our "take it or leave it" segment with our articles. The first one is about profitable, AI-powered companies with no employees arriving next year. This is from a media outlet called Sifted. And, essentially, what it says is, hey, we've now got the ability to use agentic AIs to build companies that can generate revenue without people at all. They can basically use agents to automate everything, understand what needs to be automated, and continue to run a profitable business. Now, they definitely narrow that down to e-commerce; you know, it's not any company in the world. But the point is, these are gonna be fully autonomous businesses. And as I look really far into the future of how companies will work, there are people who think companies will be able to create their own vision of a company and do the whole thing on a much bigger, more complex scale. Right? So certainly, this has an impact on workers.
It has an impact on the type of things we've been talking about. Right? And this may just speak to fears about automation and how automation impacts people, and it brings up the same ethical, societal questions that we see. What are your thoughts about this? When you read this article, what comes to mind for you?
Julian: What I'm reminded of is that in the future, we could have all possible futures. You know, I think that there is certainty to be held nowhere. I've already argued in my work that the notion of nations is becoming redundant, the notion of organizations themselves; you know, everything is changing. But I'm also very pragmatic.
When people ask me, you know, where do I think generative AI will take us? I have a confident
answer, which says that the first thing we will see is a few people making a lot of money. And this, for me, falls firmly into that category. Will we see new types of organizations? Yes. And what will be the driving force behind them? You know, hopefully social good, but more likely the opportunity to make a ton of money. So yes.
This is what we will see: technology driving radical reinterpretation to make a lot of money. Is that the end state? Probably not. But short term, yes.
Interesting.
Charles: I didn't look at it so much like that. I looked at it as the first forays into what is possible in the future, and it starts to bubble up the issues. But in this case, probably not that much; it's such a narrow use case. So I thought to myself, well, this is primitive stuff compared to what we're really gonna be looking at in the future. And all it does is amplify some issues.
If you lay off an AI agent, I don't think they're gonna complain. Their family's not gonna go broke. They're not gonna feel badly about themselves. They don't have those emotions in some sense. Right?
And you can continually retrain. So when you take the humanness out of your employees, you can achieve a lot of things differently, but you also, I would like to say, are losing out on some of the qualities that you would get, especially when things get more complex.
Julian: So I agree with that. But, you know, what I would say is this. If you look at it in the abstract, could you create a company that relies on no people? Well, the answer is yes. But it depends where you put the boundary.
So if every company starts to be built like this, then sooner or later, somebody sat in their cupboard with a keyboard, building the thing, is gonna say, well, hang on a minute, there's a ton of money being made and none of it is flowing to me. Now, if we evolve our broader structures of society so we don't need money to go to the market to buy some life anymore, then fair enough. Maybe I'll say, okay, I'll do this work for the love of it. But up until that point, sooner or later, I'm gonna be reaching for my pitchfork.
And saying, you know, that nice car you've got, those nice restaurants you're going to: ultimately, I don't appear on your payroll, but if you dig deep enough, it comes back to me. And of course, these are the arguments that we see starting to go around the periphery. Do we need to look at new models for society, models that start to disaggregate a salary from doing work? These are interesting discussions, but part of me says there are some more immediate social challenges that we should face.
You know, if we're looking to solve inequality, there's plenty of opportunity to do so today, without looking into a distant and dystopian future before we engage with it.
Charles: Yes. Well said, jeez. So overall, you know, we're upvoting or downvoting this article based on its voice, tone, and take, really. Because I've found, as I do this, that every article presents one thing that may be kind of shocking, but always brings it back, so it's hard to form a concrete judgment and just say, this is a horrible article. But in general, if you read this, are you nodding your head, or are you saying, I don't know?
Julian: I'm downvoting. It's not radical enough. I think it's an easy narrative. And I think you should look for something else that doesn't present such an earnings story.
Charles: And I too downvoted it, just because I felt like they're skimming such a simple use case and they're overhyping it as well. I just don't see this happening. It seems a little bit sensationalist to me. I did like that it made me think about this and say, oh, people are actually now, in a concrete way,
companies are talking about doing this. There's no doubt we'll see it, but I didn't take it that way either. So let's take a look at the next one, which is a little bit more cerebral, or whatever. This is actually, and I don't usually do this, but I liked what the article was generally talking about in terms of being a useful and meaningful conversation starter, from a company called HFS and their summit. They had a conversation; you can watch the video, but I just read the article, about autonomous enterprises. Right? In some sense, there's a connection, because they're talking about autonomy in organizational functioning. But here, the focus is really about data, and saying, look, data is such an important part of this ecosystem. We can pull people out of this in a lot of ways and just allow the data, with some governance, to really help us automate and become the company that we really wanna be, without being slowed down by all this people stuff, you know. At the same time, it does speak to collaboration being important. So to me, it's a balancing act between collaboration and automation, and saying this is an inevitable thing that enterprises are doing. And to execute it, people are really gonna have to be reskilled.
People are gonna have to focus on different things and have, I think, different roles, because a lot of what they're doing is gonna be automated. Right? And leadership is gonna have to have a huge piece in this. And, you know, the question of when really comes up in this, and I think it takes a pretty measured approach of saying, we don't know when this is gonna happen, but jeez, it's really going to be a situation where we're going to have fewer and fewer people, less resistance to change, and more data making the decisions for us. It's almost like people are just going to be sitting there making sure the machinery is running okay, executing what it's supposed to, instead of, you know, being more involved.
I don't know. So what was your take on this one? What did you think?
Julian: So, you know, I like the narrative. For sure, if you look at what people do all day, very often it's lots of people all over the world kind of trying to navigate through the same data, disconnected, not understanding. There's clearly an awful lot, a great deal, of efficiency and filtering and sorting that can be done. But what we should never underestimate is that people don't engage with data per se. They engage with the story of it, and they form a belief around it.
And I wrote a whole chapter on this in The Socially Dynamic Organization. I said, you know, organizations are entities of story and belief. So in some ways, I could sell you a machine which gave you a crystalline view of the data, this perfect structure that gave you the foundations to make brilliant decisions, to do everything with perfection. And yet, you know, you, like me, would still manage to mess it up. Because of our biases, our beliefs; we would say, well, you know, we could do that, but I wanna do it where I can ride my bike to work.
You know, we just have opinions and views, and we weave ourselves into the story. So, kind of two tracks for me. One is, yes, organizations could doubtless take a pickaxe to a lot of what they do,
hack out a load of that inefficiency and mess, and gain great advantage from doing so. But will they become these perfectly oiled, self-sustaining machines that don't need humans? Well, maybe, but if they do, humans will just invent alternative messy organizations. If you look at what we do, we engage with people. We engage with experiences.
We engage with ideas. But we engage. The decision to engage is often an emotional rather than a utilitarian one.
Charles: So how do you feel about it? What's your take on this article?
Julian: I'm gonna give it an upvote, because I'm an optimist. But how useful it will be in a few years' time, I'm less sure.
Charles: Yeah. So for me, in some sense, I downvoted it, because I don't understand talking about data being its own thing so much that people aren't needed. I mean, I guess data runs AI; you could look at it that way. I feel like this organization might have had a little bit of an agenda around what they're selling here.
But I chose it anyway. So I'm actually gonna downvote it. As a psychologist, I keep continuing to advocate for people being an important piece of it, not that you aren't. But I still can't see the ways that companies will be completely automated. We need people to help us make decisions from data.
We don't want the data just making its own decisions. Maybe in supply chain, or things that are very amenable to automation, but that's not the whole organization and what it does. So, good. Well, look, thank you so much for your time and perspective. Again, such a different perspective; you could see that for everything I was saying, you had an inverse for it in some ways, not that you were being argumentative.
It's just another way to look at things. And I love that. I love my paradigms being changed and looking at things through a new lens. It's so important. So thanks for your time. I really appreciate it. And everyone, this man has a lot of books, writings, organizational consulting, etcetera. So on your way out, just tell people how to find you. What's the best way to track you down?
Julian: I'm easy to find online. So, julianstodd.wordpress.com. That's me. WordPress has been great for me; that's where all my writing sits.
And Sea Salt Learning is my organization. But if you just Google my name, I'm not difficult to find. So thanks for having me along, Charles.
Charles: No problem. Thank you. And companies out there, people out there, you need to find this guy. He has got some really good perspective. So thanks, Julian.
Julian: Thanks, Charles. Take care.