Updated: June 1, 2025
Last week, I had the chance to attend the Business of Blockchain Technology (BBT) Conference at the Herbert Business School, University of Miami. It was my second time at the event, and once again, I walked away feeling inspired. The keynotes were a highlight – Patrick O’Meara and William Quigley brought sharp industry perspectives, while Guangzhi Shang and Youngjin Yoo offered some great academic depth. Together, they covered a range of blockchain applications, from enterprise systems to the evolving world of decentralization.
What I love most about BBT is the mix of people it brings together. You've got people from startups and big blockchain ecosystems like Polkadot chatting with professors from information systems, finance, and management. The energy that comes from that kind of cross-disciplinary crowd is hard to beat. It really pushes you to think beyond your own corner of the space.
This year was a little different for me – I wasn’t just attending. I was also invited to moderate and join a panel on “Decentralization: Promises, Reality, and Challenges.” We had four incredible panelists, and the discussion touched on everything from the technical architecture of decentralized systems to the social and organizational shifts they demand. I’ll share a recap of their main points below, but before that, I wanted to offer a behind-the-scenes look at how I prepped for the panel. If you’ve ever found yourself moderating, organizing, or even just nervously preparing for a panel appearance, this might be useful.
Here’s how I approached it:
Know your panelists. The panelists are the heart of the session, so I always start by digging into their work. I looked up public talks, interviews, and – when possible – their writing. For the academics, I read a few of their papers to get a sense of their research. For the industry folks, I tried to understand what their organizations are working on and what role they play. I also reached out directly. I asked for short bios and gently inquired about any topics they were particularly excited to cover. I made sure to frame it as a wishlist – something to guide the flow, not a strict roadmap.
Know your audience. Panels are conversations, but the real goal is to serve the audience. At BBT, the crowd usually breaks down into roughly 40% industry professionals, 40% business school faculty (mainly from IS, finance, and management), and about 10% students – from undergrads to PhDs. Knowing that mix helped me calibrate the depth and tone of the questions. It’s also why I believe panels work best when the audience knows what to expect. Sharing a bit about the speakers and the session ahead of time goes a long way.
Shape the agenda. Once I had a feel for both the panelists and the audience, I sketched out 3–5 key topics I wanted to explore. These were based partly on current trends in the space and partly on themes that kept coming up in the panelists’ work. We had an hour for the session, but once you factor in introductions and Q&A, that leaves about 45 minutes for the actual conversation. With four panelists, that means around 10 minutes per person – assuming no one goes too far off-track. Some moderators stick with one or two big questions for everyone. I chose a more tailored approach, assigning specific prompts to each panelist while giving space for the others to chime in. That seemed to work well and kept the conversation dynamic.
Time-keeping. Timing matters – especially when your panel is right before lunch! As a moderator, I saw my role as not just guiding the content but also keeping the rhythm of the session. It’s easy for someone to run long when they’re passionate about a topic (and let’s face it, most panelists are). I tried to strike a balance between structure and spontaneity, giving people enough room to elaborate but gently nudging things forward when needed. Having done Toastmasters in the past definitely helped with that.
With all that in place, the panel went really smoothly. Below, I’ve written up a recap of the major themes we covered and the insights each speaker brought to the table.
If you’re curious about decentralization – or just enjoy the art of a good panel discussion – read on.
Decentralization is no longer the exclusive concern of cypherpunks and ideological libertarians. What began as a fringe concept born out of mistrust in centralized institutions has evolved into a serious design challenge at the core of our digital infrastructure. As technologies like AI, blockchain, and zero-knowledge proofs mature, decentralization increasingly emerges not as a utopian goal but as a pragmatic tool for building resilient, equitable, and scalable systems.
This shift was at the heart of a recent panel discussion featuring Roman Beck (Professor, Bentley University), Robert Gregory (Associate Professor, University of Miami), Filippo Franchini (Educator, Web3 Foundation), and Youngjin Yoo (Professor, London School of Economics and Political Science). Each panelist brought a unique disciplinary lens – technical, economic, organizational, and philosophical – to the question of what decentralization is becoming.
The speakers converged on a shared conclusion: decentralization is not a finished product or a fixed ideology. It is an evolving socio-technical process – a dynamic response to shifting power structures, institutional inertia, and coordination challenges. This post unpacks six interwoven themes from the discussion: decentralization as layered infrastructure, governance philosophy, practical tool, organizational experiment, technical design choice, and counterbalance to centralized AI.
Decentralization as a Multi-Layered, Socio-Technical Phenomenon
Filippo Franchini has highlighted a persistent misconception: that decentralization is merely about technical infrastructure like nodes or blockchains. In reality, he suggests, decentralization encompasses multiple layers – technical, administrative, economic, and social – making it fundamentally a socio-technical concept.
At the infrastructure level, decentralization involves how data is replicated, validated, and secured across distributed systems. But that’s only one layer. Governance mechanisms – how decisions are made, who gets to participate, and how proposals are ratified – form another critical layer. Administration and fund allocation, like community treasuries and grant funding, constitute a third. And finally, user access shapes how inclusive and equitable the network really is.
Filippo referred to the Web3 Foundation’s sponsorship of Inter Miami CF, noting that the decision emerged not from a centralized marketing team but through on-chain governance funded by a community treasury. This example illustrates decentralization in action: smart contracts, collective decision-making, and treasury management converging in real-world applications.
Ultimately, decentralization is about legitimacy. Infrastructure can be decentralized in appearance, but without transparent governance, equitable funding mechanisms, and open access, such systems risk becoming decentralized in name only. As Franchini emphasizes, we must stop thinking of decentralization solely as a technical design pattern – it is, fundamentally, a socio-technical process of coordination, trust, and participation.
From Ideology to Incentives: Governance Needs Hayek and Buchanan
Roman Beck has argued that decentralized governance models inspired solely by Elinor Ostrom may be insufficient. While Ostrom’s work on common-pool resources has informed many DAO frameworks, Beck questions the assumption that participants in such systems naturally align around shared goals.
He contrasts this with the perspectives of Friedrich Hayek and James Buchanan. Hayek emphasized the dispersion of knowledge and the need for systems that accommodate emergent order. Buchanan focused on constitutional processes that define how rules are created and legitimized, especially in environments where interests diverge.
Professor Beck has cautioned that most decentralized systems fail not because of malicious actors, but because their governance frameworks rely on idealized cooperation. Real-world environments are adversarial: incentives conflict, information is asymmetric, and motivations vary widely. Thus, governance should be designed to absorb friction, not avoid it.
From Beck’s perspective, decentralization should be understood through the lens of conflict management and institutional design. Rather than fostering harmony, successful governance mechanisms must handle disagreement, allow for iteration, and facilitate the redistribution of power. This reorientation – from utopian idealism to structural pragmatism – may be essential for resilient decentralized systems.
Decentralization Survives as a Practical Tool, Not an Ideological End
Youngjin Yoo has suggested that the value of decentralization is increasingly recognized not for ideological reasons but for its practical advantages. This reflects a shift from early blockchain narratives that treated decentralization as a moral or political imperative.
In contemporary contexts, decentralization succeeds where it provides tangible benefits. For instance, privacy-preserving AI models leverage decentralized infrastructure to enable federated learning without centralizing sensitive data. Sovereign data wallets empower users to control the use and distribution of their personal information. Similarly, decentralized storage services offer resilience against market volatility and state censorship.
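To make the federated learning point concrete, here is a minimal sketch of the idea in Python (using NumPy, invented client datasets, and a plain linear model – none of this reflects any specific system discussed on the panel). Each client fits a model on its own data and shares only the fitted parameters; the coordinator averages them, weighted by dataset size, without ever seeing the raw records.

```python
import numpy as np

# Toy federated-averaging sketch: each client fits a linear model on its own
# data and only shares model weights, never the raw records.

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

def make_client_data(n):
    # Synthetic private dataset held by one client.
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y

clients = [make_client_data(n) for n in (50, 80, 120)]  # data stays local

def local_fit(X, y):
    # Ordinary least squares on the client's private data.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w, len(y)

def federated_round(clients):
    fits = [local_fit(X, y) for X, y in clients]
    weights = np.array([w for w, _ in fits])
    sizes = np.array([n for _, n in fits], dtype=float)
    # The coordinator only ever sees weight vectors, aggregated by data size.
    return (weights * sizes[:, None]).sum(axis=0) / sizes.sum()

global_w = federated_round(clients)
print("aggregated model:", np.round(global_w, 3))  # close to [2, -1]
```

Production federated learning stacks add secure aggregation and differential privacy on top of this basic loop, but the structural point is already visible here: the clients coordinate on a shared model without centralizing the sensitive data.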
Professor Yoo’s observation points to a broader trend: investors, enterprises, and governments are no longer swayed by ideological rhetoric alone. They are interested in decentralized technologies because they perform better in specific use cases, whether in security, efficiency, or user empowerment.
This pragmatic approach signals a maturation of the ecosystem. The question is no longer whether a system is “decentralized enough,” but whether it solves real-world problems effectively. Decentralization, in this view, is not an end-state – it is a design strategy grounded in utility, not dogma.
DAOs as Laboratories of Post-Corporate Governance
Robert Gregory has proposed that DAOs are pioneering a shift in institutional logic by redefining fundamental questions of governance: who makes decisions, how rules are enforced, and how value is distributed. These are the same questions that underpin traditional corporate governance, but DAOs seek to answer them differently.
In contrast to firms, where authority is vertically structured and codified in legal contracts, DAOs use token-based voting, smart contracts, and communal deliberation to coordinate decisions. This decentralization of authority introduces a networked model of organization that operates outside the traditional boundaries of the corporation.
Professor Gregory emphasized that DAOs combine off-chain deliberation (such as Discord debates and governance forums), on-chain voting (which may use token-weighted or quadratic schemes), and automated enforcement via smart contracts. These mechanisms together create a dynamic interplay between social input and algorithmic execution.
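As a rough illustration of how the choice of on-chain voting scheme shifts power, here is a toy Python sketch (with invented token balances and a simplified tally – it does not describe any particular DAO’s contracts). Token-weighted voting counts one token as one vote; quadratic voting counts the square root of a voter’s balance, which dampens the influence of large holders.

```python
import math

# Hypothetical ballots on a single proposal: all balances are invented.
ballots = {
    "whale":  {"tokens": 1_000_000, "vote": "yes"},
    "fund":   {"tokens":   400_000, "vote": "no"},
    "dev_1":  {"tokens":    90_000, "vote": "no"},
    "dev_2":  {"tokens":    40_000, "vote": "no"},
    "user_1": {"tokens":    10_000, "vote": "no"},
}

def tally(ballots, weight_fn):
    """Sum voting weight per option under a given weighting rule."""
    totals = {"yes": 0.0, "no": 0.0}
    for ballot in ballots.values():
        totals[ballot["vote"]] += weight_fn(ballot["tokens"])
    return totals

token_weighted = tally(ballots, lambda t: t)        # 1 token = 1 vote
quadratic = tally(ballots, lambda t: math.sqrt(t))  # sqrt damping

print("token-weighted:", token_weighted)  # the whale wins on its own
print("quadratic:     ", quadratic)       # the four smaller holders prevail
```

In practice, quadratic schemes need some form of identity or sybil resistance (otherwise a large holder can split its balance across many addresses), and both tallies typically feed into quorum thresholds and timelocks before a smart contract executes the outcome – exactly the interplay of social input and algorithmic enforcement described above.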
What makes DAOs noteworthy, according to Gregory, is not the technology alone but how these elements co-evolve. Tensions between adaptability and rigidity, or transparency and efficiency, are not bugs but features – areas for innovation in governance design. Rather than treating DAOs as unfinished products, they should be seen as governance experiments exploring what digital-native institutions might become.
Over-Engineering Kills Trust: Simplicity Is a Political Value
Roman Beck has been critical of overly complex decentralized architectures, using the example of IOTA’s smart contract environment to illustrate how convoluted technical design can undermine trust. His critique highlights a deeper issue: complexity is not merely a technical flaw, but a political liability.
In Beck’s view, the so-called “blockchain trilemma” – which posits an inherent trade-off between decentralization, scalability, and security – is not a law of nature but a consequence of design decisions. Systems that prioritize novelty or performance over clarity often create opaque environments that are difficult to govern or audit.
From a governance standpoint, simplicity is not optional. Transparent systems are easier to inspect, understand, and trust. Over-engineering introduces barriers to entry, exacerbates power asymmetries, and ultimately erodes legitimacy.
Professor Beck advocates for minimalist architecture – not simplicity for its own sake, but as a way to foreground intelligibility, accountability, and inclusivity. Simpler systems are easier to adopt, especially in non-technical communities. Thus, architectural simplicity becomes a political stance, affirming the values of transparency and democratic access.
Decentralization as a Counterweight to AI-Driven Centralization
Youngjin Yoo has raised concerns about AI’s inherent centralizing tendencies. The development and deployment of advanced AI models require large datasets, immense computational resources, and highly specialized talent – factors that concentrate power among a few dominant platforms.
He sees decentralization as a structural counterbalance to this consolidation. By distributing data ownership, enabling federated learning, and advocating for interoperable protocols, decentralized systems offer an alternative to monolithic AI architectures.
This tension between AI and decentralization is more than technical – it is political. Professor Yoo suggests that we are witnessing two competing megatrends: one consolidating power, the other distributing it. To maintain democratic agency and preserve user autonomy, systems must be designed to resist enclosure and centralization.
Decentralized technologies provide this resistance by embedding pluralism, transparency, and modularity into their architectures. In a world increasingly mediated by intelligent systems, decentralization functions as both a design principle and a societal safeguard.
Conclusion
Decentralization is no longer an ideological abstraction or fringe experiment. It has become a central concern in how we design systems for governance, identity, and innovation. As the digital world becomes increasingly complex and centralized through AI and platform monopolies, decentralization offers a necessary counterbalance.
The insights shared by Filippo Franchini, Roman Beck, Youngjin Yoo, and Robert Gregory underscore that decentralization is not a static condition or final destination – it is an evolving socio-technical strategy. It must be designed for resilience, not idealism, for inclusion, not purity.
Whether you’re a developer building new infrastructure, a policymaker drafting regulation, or an investor assessing risk and opportunity, understanding the layered, incentive-driven, and politically potent nature of decentralization is now a prerequisite. It is how we ensure that the next epoch of digital society remains open, accountable, and future-ready.
Author’s Note: These materials were prepared based on a recap of a conference panel by Mariia Petryk. The final text has been refined and edited with the assistance of digital tools, including generative AI technologies, to enhance clarity, structure, and coherence. Any misattribution of the opinions expressed and any remaining errors are the responsibility of the author.