My motivating question for the last 30 years has been:


How come our attempts to make things better so often make them so much worse at other scales?


This, of course, leads to:


What can we do about it?


I've done a deep dive into Human Agency in Complex Systems. How do we work together to steer systems towards more humane outcomes? What skills do we need to develop to monitor these systems, mitigate their harms, or even just be honest about what is happening?


To address this, I've had to develop my skills in three areas:

  • Accurate modelling of complex systems (physics, math, data science, computer science, ontology)
  • Understanding of human drives and motivations (sociology, ethics, meditation and embodiment practices)
  • Clear vision of the nature of technology (epistemology, science and technology studies)

Assessing these questions accurately requires working at a high level of abstraction, and it is currently *nobody's job to do that.*


Every system we create and train people on is locally epistemically valid (if it weren't, it would not work). But every such system imagines itself in a void, separate from the rest of the world, free to focus human and material energy entirely on local optimization.


My work is designed to help people seek ontological clarity: not just local understanding, but a continual shift to higher levels of abstraction to see what IS. I draw people into compassionate collaboration, helping teams hold onto a higher vision while skillfully executing on their immediate priorities.


What does it mean to "Think Globally, Act Locally"? If you want your work to create positive impact at scale, these are the skills you need to develop.
