Sunday, March 11, 2018


Chantal Westby's Maelstrom
Recently there has been a controversy over a tweet from one of IOHK's employees named Darryl McAdams. The tweet commented on her desire for IOHK to recruit more female and transsexual employees. This opinion has raised broader questions about IOHK's hiring practices and overall philosophy on diversity, inclusion and other social topics. 

First, to state very directly, IOHK does not maintain or endorse quotas, bias or an archetypal vision of an ideal contractor or employee. We are- and forever will be- a merit-based organization. I have a fiduciary obligation as the leader of my company to always hire the most qualified person for the job regardless of where they come from.

Second, as the CEO of IOHK, I have never wanted to lead an organization that takes it upon itself to promote a particular political cause such as social justice. It isn't our place or even within our power in a global free market to somehow cure the evils of racism, inequity or other sins perceived or actual.

Third, IOHK- and frankly all adherents to a free society- need to reserve the right to be offensive to others. Rational thought, change and challenges to existing power structures require the ability to irritate and provoke wrath. I feel that I must elaborate on this point in more detail.

It wasn't too long ago when concepts such as evolution, a heliocentric view of the universe, free speech and democracy were considered preposterous, revolutionary ideas that were inherently dangerous. Those who held these beliefs were persecuted, and some still are within certain circles.

The reality is that all of humanity has a common journey. We are chained by biological shells programmed by a process we have little control over. The genetics we inherited have tremendous influence over our intelligence, appearance, preferences and overall ability to succeed in life. 

Some win the genetic lottery being given profound gifts. Others are cursed to suffer the indignity of physical and mental disorders so severe that they can never enjoy the world as most of us do.

While fortune toils away, another byproduct of evolution is our cognitive powers. Properly harnessed, they have allowed us to transcend parts of our biological cages to collectively become more than we were meant to be. 

In under 10,000 years, mankind has enjoyed an ascendancy that now moves to the stars and mastery over life. Soon we will be making modifications to our genetic code, adding new senses and merging our minds with computers. Nature's paintbrush is slipping into our hands. 

Another byproduct of the powers of our cognition is that some have become unhappy with the hardware and cultural programming that nature and their respective societies have endowed them with. Some have developed exotic sexual tastes (see furries). Some have embraced lifestyles that range from foreign at best to utterly alien and even repulsive at worst. As an extreme example, one could look to the flesh eating Aghori monks in Varanasi.

As a matter of pragmatism, there is what we are comfortable with and what technological advancements and globalism will force us to accept as we travel this century. For example, millions of people are living digital fantasy lives in MMORPGs like World of Warcraft, more comfortable with their avatars and their virtual connections than their own flesh and blood lives. Hollywood is even kind enough to give us Ready Player One as a visual case study.  

Characters such as Joi in Blade Runner 2049 or Her's Samantha appeal to legions of fans. To this end, capitalism has been summoned to attempt to build a crude simulacrum (see Azuma Hikari [1][2]). Should we be so naive as to believe that this trend is just a fad on par with the pet rock?

When is this Love?

The reality is that we are using our cognition to change ourselves and redefine relationships. And like prior centuries having to decide whether to embrace other cultures, ideas and religions, we are facing the equivalent of our time, but now armed with computers and profoundly advanced technology.

Thus it's reasonable to assume that mankind is going to explore depths that we haven't seen as a species before. Exploration of this nature cannot be familiar or painless. It's going to break conventional society and force a fundamental re-evaluation of concepts like relationships, gender and even physical presence.

Did Snowden attend the conference? What if the robot had a female face?

I am a builder of digital infrastructure: the protocols that could eventually yield control over our identity, financial lives, voting rights and property. These protocols cannot belong to a particular culture or group. They also cannot discriminate against the weak and misunderstood. Roads cannot be biased against the creatures who walk them.

Therefore, I've attempted to construct a company that welcomes a diverse group of opinions, beliefs and geographies. We never censor our employees nor ask them to remain silent on the issues that are most important to them. 

As a company, IOHK tries to embrace neutrality. It frankly isn't IOHK's place to choose sides in these debates. It's just our place to ensure they don't consume our business operations and fiduciary obligations.

Part of this creed is also accepting that people associated with my company could say things (myself especially included) that will, at times, deeply offend others. For example, I have repeatedly- at times harshly- expressed my dismay over police brutality within the United States. I have no doubt that this position is hurtful to police officers and their families. 

And this brings me to my final point: I've become gravely concerned over the attempts to de-platform opinions. Those with ideas, beliefs or even objective data contrary to particular agendas are often maligned, ostracized, banned from speaking and even physically threatened or attacked at times.

These tactics are nothing new. They have been employed by radical movements as a means of silencing critics and rational thought in order to inflict a fanatical philosophy upon society as a whole. In my mind, there is no difference between the communist commissars and the student protesters shouting down the latest conservative speaking at a college campus. Both are trying to prevent us from hearing an opposing argument.

Part of the reason why I so admire blockchain technology is that it protects us against the revision of history, the censorship of inconvenient truths and the power of centralized actors to sculpt our view of reality. Citing fake news or social justice, I can imagine a time when Facebook or YouTube become weaponized tools of a regime to deploy well-crafted propaganda in order to preserve power and social order.

I would be an utter hypocrite to say such things ought to be stopped, but then ask my employees to censor their opinions. I just ask for respect, dignity and reason. But I cannot ask them to avoid offending others. 

Politically I'm a libertarian; I loathe taxes, regulation and socialism, but I will not mock those who collaborate with me for having a difference of opinion. By the same token, while I at times cannot fully understand particular preferences or lifestyle choices, all I ask is that they be conducted with respect, dignity and empathy for others.

Leading an organization, I can fully appreciate why some CEOs have chosen the easy road of attempting to hide behind empty platitudes and vacuous diversity theater. It's simply better for business and one's partnerships to be as inoffensive as possible. But that's not reality; it's a Dilbert cartoon.

The reality of life is that as a condition of our culture, upbringing, religions (or lack thereof) and geography we are going to act in ways that create strife. While the curse of cognition is that we must endure the pain this strife brings, its gift is that in embracing the maelstrom, we often find a creative destruction of old ideas refreshed with far superior ones.

By avoiding this process, we are losing part of what has made humanity so collectively strong and also draining authenticity from the workplace. IOHK collaborates with some of the brightest minds- Darryl McAdams included. They simply don't have to be here if they don't want to be. If given a choice, would you rather work somewhere that accepts you or forces you to live in a gilded cage like an amusement park character?

I didn't sign up to build Disneyland; I signed up to change the world. So that's what we are going to do as we march towards an esoteric, ever more authentic and I hope better future. Forgive us for breaking a few vases along the way.  







Sunday, March 4, 2018

An Ode to Critics (IOTA and DCI)

Recently I heard that one or more actors associated with the IOTA project may have suggested some form of legal action against members of the DCI responsible for an unfavorable analysis of IOTA's core technology. Rather than rehash the entire affair here, I'd recommend these sources as reference points (DCI Audit Report)(Blog Post)(IOTA Response) to bring everyone up to speed.

What is provoking me to draft a blog post on this topic is that I offered to pay any legal fees DCI actors would incur as a result of their audit of IOTA in the event an agent of the IOTA Foundation or its associates decides to sue a member of the DCI. This offer was immediate and without preconditions. It also isn't connected to an opinion of the soundness- or potential lack thereof- of IOTA's technology.

To be frank, I couldn't care less whether IOTA works, accomplishes its commercial goals or how it manages its ecosystem and community. What concerns me far more as a developer of cryptocurrencies is the relationship between security and cryptographic researchers and the protocols we develop for our space.

The reality is that we have a symbiotic relationship. Researchers enjoy spending countless hours attempting to find flaws (theoretical and practical) in the philosophy, design and implementation of our work. These hours are seldom glorified or even compensated. They are generally ignored by the mainstream public outside of an occasional sensational headline by a low information journalist. But they are absolutely necessary to evolve our work.

For the researchers, they gain academic credit, the occasional job and the intellectual joy of resolving a problem. These perks aren't exclusive to a particular protocol or even the cryptocurrency space. Inflicting havoc on Ed25519 yields just as many brownie points as finding an issue in Ethereum's network protocol.

Having paid private firms literally hundreds of thousands of dollars in consulting fees to audit code IOHK writes, I fully appreciate the value of this foundational work. In fact, often one simply cannot hire the top minds as they are only interested in university affairs. Thus their time and effort are not only valuable; they can be simply irreplaceable.

If a member of our space begins to attack researchers he feels have been unfair in their assessment or criticism, then this event cascades far beyond the immediate actors involved. It fundamentally damages the vital symbiotic environment between researchers and protocol developers. In other words, it directly hurts Cardano, Ethereum, Zcash and every other project.

Most graduate students, postdocs and professors do not have extensive resources to defend themselves against well capitalized cryptocurrency projects that don't actually have to win a case in order to massively disrupt the lives of these researchers. Going to court is expensive, emotionally exhausting and takes a huge amount of time. If a security researcher feels his work could provoke this event- even if it's objectively true- then he will simply choose a different topic.

I also can fully appreciate the discomfort of criticism that members of the IOTA community and the developers themselves are enduring. I have first hand experience with the blatant unfairness of constant attacks over social media, blog posts, at events and through other channels where lies, half truths and baseless innuendo replace an effective dialogue. It's always painful and often crosses the threshold to malicious slander.

But it's extremely important to understand that not all criticism is unfair and even within the set that is unfair, the actors levying it ought to be considered. The academic world is tightly regulated via credentials, unspoken rules and a strong emphasis on reputation. Attacking someone unfairly isn't a pattern that can be repeated without severe career consequences.

Thus the most common response to attacks coming from academia is to prepare a fact based rebuttal. It doesn't necessarily mean the attack will be deflected or withdrawn, but it forces the critic to acknowledge your rebuttal and provide additional context and clarity.

This process is on display for the entire academic community to form opinions. If a researcher is dishonest, has conflicts of interest or is omitting/missing key points, then it will eventually be discovered. If it's a common pattern, the researcher will be socially exiled from academia.

A prominent example in the cryptographic world comes from Dr. Neal Koblitz. He levied an aggressive series of attacks on the concept of provable security. Neal's credentials are impeccable: he created elliptic curve cryptography and is a Harvard-educated Putnam fellow. Despite his enormous contributions to the field of cryptography, he wasn't given a pass on what many feel is unfair criticism. And it has had career consequences.

Escalation to courts is generally only done in cases of known fraud and institutional cover-up. For example, the falsification of collected data to skew results to some desired outcome. The consequences are always brutal once discovered. As particular examples, one can review the Schön scandal and also Paolo Macchiarini affair.

Nothing in this audit seems to deserve an escalation of this nature. A researcher made a claim and provided an argument with a set of evidence. The developer says this claim is false. It's an argument and it has an objective answer for the world to see.

Thus, I have no choice but to apply some of my personal resources as a counterbalance to protect the integrity of the system I have so benefited from throughout my academic and professional career. I would recommend that the IOTA community exercise the stoicism of the person who created the heart of their protocol as he continued to teach while students rudely interrupted his class.

I'd also like to remind them that MIT and the broader academic community isn't going away. Direct attacks- even if victorious- will have Pyrrhic consequences.

I hope the matter is closed and everyone can move on to better things. 


Saturday, January 6, 2018

The Price of Craftsmanship and the Zen of Protocol Design

Having read the comments of a former business partner of mine, which will be addressed later in a dedicated blog post, I’ve decided to draft my thoughts on how IOHK approaches the design of Cardano and by extension all the cryptocurrencies it works on. As this space has become polarized with the politics of personal destruction, financial incentives to lie and a stunning lack of respect for critical analysis, I’ll try not to mention project names- just my opinion on what good design principles ought to look like. I freely admit, I could be misguided or stuck in my ways.

First, we have to define what is the point of our labor. What goals are we trying to achieve and who needs the solution? It's stunning to me how littered we are with solutions seeking problems, each connected to an actively trading token. Decentralized computation, storage and other services need an audience in order to be useful and they need a necessary edge in order to survive beyond hype.

For example, replicated computation that is byzantine resistant is what Ethereum brought to the table. We freely admitted that market demand was unclear and that use cases would materialize after we launched (the field of dreams gamble). Whether those uses are economically viable or optimal was and still is an open question being explored and forcing enhancements.

What is undeniably valuable was the beginning of a conversation about outsourcing computation in a way where the server couldn't be trusted; either couldn't be trusted to return a correct result or couldn't be trusted not to de-prioritize a particular program due to some agenda. The net neutrality debate is highlighting this concern most directly.

There seems to be an audience that likes the problems that Ethereum is trying to solve. Thus it raises the question: what is the best way of doing so? What tools do we have in our bag and who are the craftsmen who ought to wield them?

The point of the Cardano project has always been to build something from first principles using a functional programming approach, embracing formal methods, and checking our progress through peer review. We chose these three pillars because experience tells us that humans are good at self-deception, forming personality cults and making extremely subtle mistakes that eventually cascade (heartbleed is a great example).

Functional programming is about getting code close to math. It's saying: let scientists draft a beautiful blueprint, then pull that blueprint directly into reality. There are some wonderful lectures from the Clojure community on the elegance of functional programming techniques (1)(2), but the broader point is that simplicity, modularity and conciseness matter more than performance.

Machines keep getting faster; legacy code is like a tattoo. You’re going to have to live with it so make it pretty. We chose Haskell because it has the perfect intersection between practicality and theory. It gives us wonderful libraries like Cloud Haskell and a community that’s extremely smart and supportive of new techniques and ideas as they become necessary.
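As a toy illustration of what "code close to math" buys (my own illustrative snippet, not IOHK code), consider a ledger balance: the math says "the balance is the sum of the transaction deltas," and the Haskell transcribes that definition almost verbatim.

```haskell
-- Illustrative only: a ledger balance defined exactly as the math reads,
-- balance = sum of the deltas of its transactions.
newtype Tx = Tx { delta :: Integer }

balance :: [Tx] -> Integer
balance = foldr ((+) . delta) 0
```

The whole definition is the specification; there is no loop counter or mutable accumulator to audit separately, which is precisely the property that makes this style attractive for protocol code.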

Formal methods are an acknowledgement of the semantic gap. Humans and computers are fundamentally different animals and until Ray Kurzweil delivers us to the Singularity, we will be quite distinct. This axiom extends down to the computer’s understanding of our intent versus our own.

The DAO hack is a recent textbook example. The engineers who wrote the contract had a clear understanding of intent, but it differed slightly in code and as a consequence a hacker could cause havoc. The point of formal methods is to close the gap between man and machine.
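To make the intent gap concrete, here is a hedged toy sketch (my own example, not the DAO contract and not IOHK code): the intent- transfers never create or destroy funds- is written down as a checkable property right next to the implementation, so any divergence between the two becomes mechanically detectable rather than discovered by a hacker.

```haskell
-- Toy model: two account balances and a transfer between them.
transfer :: Integer -> (Integer, Integer) -> (Integer, Integer)
transfer amt (from, to)
  | amt > 0 && amt <= from = (from - amt, to + amt)
  | otherwise              = (from, to)   -- reject invalid transfers

-- The intent, stated explicitly: a transfer conserves total funds.
conserves :: Integer -> (Integer, Integer) -> Bool
conserves amt accounts =
  uncurry (+) (transfer amt accounts) == uncurry (+) accounts
```

Real formal methods go much further- machine-checked proofs over a full specification- but even this small step of stating the invariant alongside the code narrows the gap between what the engineer meant and what the machine does.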

Specification captures the intent of the scientists who spend countless hours of rigorous labor carefully writing mathematical proofs. These proofs are riddled with ideal functionality and ambiguity from an implementation perspective. Basically, such papers are the inky equivalent of the spirit Billiken- the god of things as they should be instead of what they are.

A formal specification process is slow, uncomfortable, pedantic and requires exotic languages and skills. As a consequence, it’s also terribly expensive and not fun for most people. But such techniques save lives (think planes and trains), money (think Mars Rover) and dramatically enhance our understanding of the protocols we wish to deploy (lies melt).

We’ve written some blog posts on techniques (1)(2)(3) and the philosophy we follow as well as have done a whiteboard video. As most of our work is transparent, the specification of Ouroboros Praos is no different. The repo can be seen here. Like a fine painting requiring exhaustively small brush strokes to gradually make the whole, we are paying that price of craftsmanship.

Finally, there is peer review. It is somehow conflated with elitism, misunderstood or in some cases discarded as an unnecessary formality to appease irrelevant ivory towers out of touch with the plight of the normal man. I counter every single one of these attacks with a single question: can you understand the papers cryptographers write?

Humility will yield an answer of no for the majority of the populace. This statement isn't self-serving arrogance. It's respect for a language born from decades of careful study. Medicine has research. Physics has research.

Why is it so controversial to state that a paper like this is outside of the ken of most people? It isn’t elitism; it’s an acknowledgement that the people who wrote it spent decades of their lives learning how to write that paper and think like they do.

Somehow in the cryptocurrency space, we have forgotten that our underlying technology is constructed upon foundations of cryptography, distributed systems, game theory and programming language theory amongst other considerations. The people who study these fields literally have invested tens of thousands of hours to become proficient- meaning they can read and understand the papers- and that’s not even making a statement about meaningful and original contributions.

The question we ought to ask isn’t can I understand the papers. That’s like asking the public to understand the US Federal Budget. The question ought to be what process should this work go through in order for it to be considered correct?

Peer review via IACR conferences is an excellent option. The conferences are managed by domain experts who don’t have a financial incentive to like or dislike any particular work. The review process is double blind. The conferences hold high standards where most submissions are rejected. And acceptance means you have to show up and discuss the work with your peers.

It isn’t a perfect process by any means, but it’s a standard of quality that is objective. It’s a benchmark to start a conversation with and provide some assurance that the work meets basic standards. That someone who actually can read the paper, has read the paper and thinks it’s ok.

Like all good science, one needs to continue evolving, continue pushing the boundaries and continue asking difficult and often uncomfortable questions. The ultimate point of peer review is to acknowledge you aren’t an island and you don’t want to go on that journey alone. It’s asking for help from fellow travelers who are just as capable if not more so than you.      

Outside observers, many of whom are directly investing their hard earned money, should be actively asking about the processes that produce truth. We chose peer review because it's the best tool in our box that we know how to use to check our claims. It has given us modern medicine. It has given us modern physics. There is no reason it can't be used to help give us better money.

As a final point, both Algorand and Snow White carry similar structural properties to Ouroboros. The exact same criticisms that my former business partner naively applies to Ouroboros could be applied to them- meaning that a Turing Award winner and Cornell are both inferior as well given that logic.

There also was a lack of appreciation for the holistic nature of protocol design. Raw TPS isn’t an end, it’s a necessity of large scale use. Yet there are other considerations such as network performance and the ability to store with high availability the eventual exabytes of data these systems will demand.

The Zen of protocol design is understanding that all things have to flow from a common source. That this source needs to be on bedrock, simple and secure. That this source needs to be perfectly balanced and grow naturally to meet the needs of its users. When a protocol achieves this state, like TCP/IP did, the results are magical. Others like PGP have failed despite their brilliance.

The point of how we have gone about designing Cardano is to seek this balance in our design. Ouroboros was built very carefully and in the most general way we could understand. It can be tuned to operate like many conventional protocols or run in new modes.

It will eventually include modifications to dramatically scale performance when it is necessary. We’ve even broadened the discussion to include topics like RINA and Delta-Q because they are absolutely required for natural scaling.

Yet in all these things, we are doing it with principles, craftsmanship and honesty. It’s a long and very hard journey, but has been a fun one.

Thanks for reading


Sunday, December 24, 2017

A Crypto on the Edge of Forever

Now that the dust has settled from more than twenty countries of travel, dozens of conferences, major events and community meet and greets this year, I’ve finally had the time to reflect on the progress of the Cardano project as well as some of the lessons I’ve learned. It’s honestly been the most challenging year of my life filled with drama, stress, death and some unbelievably cruel people.

It’s also been one of the most rewarding and joyful having the chance to meet thousands of passionate and kind fans, technologists and scientists- I can see the inspiration that Charles Dickens had when he said it was the best of times and the worst of times.

The reality is that the internet and in particular the cryptocurrency space can be a really toxic place if you allow it to get to you. There were times after reading some blog post or comment on reddit that I seriously questioned if this effort was worth it. I can understand why Mike Hearn left Bitcoin.

But I’ve never been here for the short term, it’s always been the dream of finding a way to get financial services to the three billion people who don’t have them using technology that was only a dream a generation ago. And I think we are making great progress there.

In January of 2017, Cardano was still mostly in a very early alpha stage. We had tremendous engineering difficulty getting Haskell, our devops and the new protocols such as Ouroboros and Scrape to play nicely together. It was a constant learning curve of how to tame the three headed dragon of research, decentralized teams and exotic programming languages while managing the expectations of a huge community.

As an aside, Cardano has one of the fastest growing and most intelligent fanbases. We actively invited people who care about formal methods, peer review and functional programming to come see what we are working on. These people aren't swayed by jargon or flashy marketing. They were born with bullshit detectors in their cribs.

I’ve gained significant strength and a much needed boost in morale from interacting with our community. For example, one member asked about how we were verifying the proofs in the Ouroboros paper and I posted a link to Kawin’s Isabelle repo. Most would simply say that’s nice and move on. This member took the time to read the code and mentioned we have a long way to go with specific examples.

For most people, Isabelle is a name followed by a lake in Minnesota. For our community, some can actually read the code and comment on it. That's a rare gift and it's the privilege of a lifetime to be in this kind of environment (we ended up hiring the person who commented on the code).

Moving through the months, Cardano moved from the lab to a series of testnets to eventually being released in September. Dealing with these transitions gave us a newfound appreciation for just how many different computer and network configurations exist. I can almost feel a Windows force ghost whispering "I told you so" in a smug voice.

We designed Byron (the September release of Cardano) to be the minimum viable product necessary to test the concepts Cardano is built upon. We wanted to run Ouroboros in a production setting to see epochs function properly. We wanted extensive logging of both the edge nodes and relays to see how our network is being used. We wanted to have third parties play with our APIs and tell us where we screwed up (boy did they ever!). We wanted to test the update system a few times.

Overall, the experiment has been a tremendous success. There are several thousand edge nodes concurrently connected to the network. There are several exchanges and other third parties using our software in the harshest possible way. There is a wealth of data flowing in that is giving us a much better sense of what we need to do to make Cardano better.

Since launch, we’ve already pushed three updates to the network without incident. We’ve started a very rapid redesign of our middleware and its associated APIs to make it easier for third parties to integrate. We’ve started a series of systematic improvements to our network stack that will be finished with the Shelley release that should dramatically improve things.

However, what excites me most about 2018 is that Cardano is starting to open up to the world. Delegation and staking will be rolled out all throughout Q1 and Q2 in coordination with the community. Soon we’ll have a testnet running IELE allowing developers to play around with our smart contract model for the first time. And we’ll be deploying our first verified protocol with Praos thereby engaging the formal methods community.

Constantly living in the moment, one tends to lose sight of Cardano's vast scope in exchange for the problem of the week. But looking at our ever growing whiteboard series demonstrates how many brilliant people wake up every morning thinking about how to solve the problems of scalability, interoperability and sustainability. These aren't just hypothetical lectures. They are backed by papers, funding and developers working full time.

Then there are the new things. Professor Rosu’s and Runtime Verification’s work on K and semantics based compilation isn’t just really smart competitive differentiation, it’s literally moving the chains of the entire field of programming language theory. The Cardano project is creating a financial incentive to have correct by construction infrastructure from virtual machines to compilers. Our success means you don’t have to hand write this code ever again- not just in a cryptocurrency context; in a general context.

Our research efforts at Tokyo Tech under Professors Mario Larangeira and Bernardo David with multiparty computation is rapidly bringing these protocols into practical use. Kaleidoscope and Royale are case studies on how to achieve everything that Ethereum does off chain, in a low latency setting, privately and at a scale of millions of concurrent users each in their own domain. Further abstractions will push this work into more useful domains like decentralized exchange. And eventually DApp developers will be able to integrate these protocols into their code via libraries.

Professor Bingsheng Zhang’s research on treasuries and voting is groundbreaking. It’s giving our project the ability to have a discussion about how should changes to cryptocurrencies be proposed, debated, approved and funded. What’s most special here is the interdisciplinary nature of the effort that can draw from political science, game theory, sociology, open source software governance and computer science. There is something for everybody.    

Moving into 2018, we are going to open this discussion up by both engaging the community directly and by holding a conference in Switzerland. More details will be published later, but the basic idea is that this area isn’t a Cardano problem. It’s a cryptocurrency problem. And there are many great projects from Dash to Pivx who are trying to solve it in a novel way. We ought to talk to each other.

I could continue to enumerate our research efforts (there's a lot more to write), but I think the point has been made. Cardano isn't a cryptocurrency as much as it is a movement of minds who are frustrated with the way technology works in practice.

The functional programming community has had for decades great solutions to many of the problems plaguing modern developers, but they have been historically ignored. Our RINA guys if given a chance could build a much better and more fair internet. Layering protocol development with formal methods extracts a much cleaner and more meaningful design process where ambiguity and hand waving is slain.

What Cardano has given us is a chance to answer “if only the world worked this way” with “why not?” We have the freedom to dream again and the freedom to try new things without asking permission. I even have a chance to work with my heroes like Phil Wadler. 2018 is going to be one hell of a year.

Thanks for reading


Phil Wadler and me hanging out in Edinburgh

Saturday, December 2, 2017

My Thoughts on the Tezos Issue

As a long-time lurker in the Tezos community forums, and having read the Goodman whitepaper back in 2015, I’ve always admired and respected what the project is attempting to achieve. It’s a nice blend of governance, formal methods and functional programming. Given that Arthur is French, I can forgive the obvious mistake of using OCaml instead of Haskell (yes, I’m biased), but I’ve been a bit puzzled by the crisis that has befallen the project.

First, a brief summary of my understanding of Tezos. It’s a cryptocurrency that seeks to define itself in terms of three subprotocols, say <Network, Transaction, Consensus>(0), and to describe these subprotocols in a machine understandable format. It then introduces an on-blockchain voting mechanism V that allows holders of Tezos tokens to propose a new <Network, Transaction, Consensus>(k) to fork the ledger. Should it pass, this becomes the new Tezos.
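To make the amendment idea concrete, here is a toy sketch of a protocol triple and a stake-weighted vote on a proposed replacement. All names (Protocol, vote_on_amendment, the simple-majority threshold) are my own illustration, not taken from the actual Tezos implementation:

```python
from dataclasses import dataclass

# Toy model: a protocol is a triple of subprotocols, and token holders
# vote, weighted by stake, on whether a proposed triple replaces it.
# Names and the majority rule are illustrative assumptions only.

@dataclass(frozen=True)
class Protocol:
    network: str      # stand-in for a formal network subprotocol spec
    transaction: str  # stand-in for a transaction subprotocol spec
    consensus: str    # stand-in for a consensus subprotocol spec

def vote_on_amendment(current, proposed, balances, votes, threshold=0.5):
    """Return the winning protocol given stake-weighted yes votes.

    balances: holder -> stake; votes: holder -> True for yes.
    The proposal passes if yes-stake exceeds `threshold` of total stake.
    """
    total = sum(balances.values())
    yes = sum(stake for holder, stake in balances.items() if votes.get(holder))
    return proposed if yes / total > threshold else current

v0 = Protocol("net-v0", "tx-v0", "pos-v0")
v1 = Protocol("net-v0", "tx-v1", "pos-v0")   # proposed amendment
balances = {"alice": 60, "bob": 30, "carol": 10}
votes = {"alice": True, "carol": True}        # 70% of stake says yes
assert vote_on_amendment(v0, v1, balances, votes) == v1
```

The interesting design questions (quorums, vote delegation, multi-round exploration periods) all live inside what this sketch flattens into a single threshold.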

It’s an elegant idea and one that promotes discussion along two axes. First, the notion of formal specification of a cryptocurrency. There have been numerous attempts to do this academically, as a whole via IC3 or in part via Bartoletti et al.’s work, but building a system around it is both ontologically interesting and practical for cross comparisons. IOHK has pursued less ambitious work by collaborating with Alex Chepurnoy on Scorex.

Second, there is the idea of creating an ideal voting system that is both secure and fair as well as incentivized enough to expect reasonable participation of the eligible voters. Here we are confronted with who is actually eligible and what is reasonable. Who gets to vote, and how, is more important than what is being voted on.

As fun as it is to discuss the technology and ideas, the vogue topic is the governance crisis at Tezos. They apparently opted for a fairly strange structure where capital pools in one place and is precommitted to buy IP from another place to benefit some group. Semantics and other vagaries aside, a fight in the heavens and some bad press have spilled out into the public domain, yielding threats of lawsuits and some small class action lawsuits.

It’s somewhat ironic- in a dark Irish way- that the venture focusing most on governance is running into a governance crisis, but the good news is that it’s actually completely solvable. It appears the structures chosen were designed for the following reasons:

  1. Ensure insiders get fair compensation for their early work and risk taking
  2. Minimize regulatory risk
  3. Protect project capital in a safe jurisdiction
  4. Provide project oversight that is independent of the will of a single group      

Well, as people aren’t getting along, it seems that the structures are deadlocked. Trying to fire people isn’t going to make things any better. Furthermore, the longer the fight goes on, the angrier (i.e. more litigious) the Tezos buyers will become. Calls for refunds will convert into class action participants. A rush to deliver Tezos to market as a bastardized product won’t likely solve the problem. Nor will hiding behind purchasing agreements that claim everything is a donation, we have no relationship with you and expect nothing. People wouldn’t threaten to sue if there wasn’t some pot of gold at the end of the rainbow.

So, to clean up the mess, here’s what I would do if I were the arbitrator. First, there is a loss of faith and trust in the Tezos Foundation. Whether this is fair or unfair is completely irrelevant. Funds from the Foundation and its assigns need to be transferred to an independent trust subject to an audit that covers both accounting and conduct. The audit report must be made public alongside all contractual obligations of the Tezos Foundation with third parties and employment agreements.

Next, there is the matter of the IP transfer. The sale of software isn’t a bright idea. Transfer pricing considerations aside, there is software and there is specification. Tezos is a concept that can be formalized on paper in a machine understandable way; then there is the software that actually runs the concept. Arthur and company should write formal specs and sell those specs to the Foundation.

Then the Foundation can submit a request for bids from software development firms to build Tezos from this spec and accept the best three bids. There is more than enough capital in the Foundation to diversify the development, and it de-risks the project from over-reliance on a single vendor. A third party, such as the Gallium team at Inria, could be retained to select the top three; they are among the most qualified in the world for such a task.

The best part of this process is that the community can have a lot of input about their expectations in terms of communication, accountability and transparency of the development process. With ETC, we have made it a point to always broadcast our development standups weekly on YouTube via Hangouts. With Cardano we have monthly updates and weekly reports. These metrics could be built into the contracts.

Finally, there is the issue of liquidity. Tezos buyers are effectively trading futures at the moment. I’d recommend issuing an ERC20 token that will be an airdrop target and getting it listed on as many exchanges as possible. This will provide liquidity for those who have lost faith and allow new actors to enter the ecosystem through the secondary market. It also considerably reduces the pressure to deliver a product for the sake of delivering one, so a more natural process can take place.

As a parting note, there has been a lot of tough press about the project. I’ve personally gotten some and I understand how hard and frustrating it can be. But also understand that the investigative journalists didn’t raise a quarter billion dollars. Tezos did. They also didn’t start the board fight.

So it would be good to hold a media roundtable with Johann and the rest of the foundation board as well as the other relevant parties and let the media ask their questions. Silence and obfuscation are only going to promote deeper inquiry and more aggressive stories.

At the end of the day what is driving the stories is a concern that people who bought the token are being defrauded or treated unfairly in some way. The only way to clean this up is to explain why they are not. Hiding solves nothing.        

I doubt my opinion matters much, but thanks for reading and I hope it all gets resolved.


Monday, March 6, 2017

Some Thoughts Towards an Ontology for Smart Contracts

The concept of smart contracts has grown considerably since the birth of Ethereum. We've seen an explosion of interdisciplinary research and experimentation bundling legal, social, economic, cryptographic and even philosophical concerns into a rather strange milieu of tokenized intellect. Yet despite this digital Cambrian explosion of thought, there seems to be no unified Ontology for smart contracts.

What exactly is an Ontology? Eschewing the philosophical sense of the word, an Ontology is simply a framework connecting concepts or groups, alongside their properties, to the relationships between them. It's a fundamental word that generally represents the attempt at bedrock for a topic. For example, it's meaningful to discuss the Ontology of democracy or the Ontology of mathematics.

Why exactly would one want to develop an Ontology for smart contracts? What is gained from this exercise? Is it mostly an academic exercise or is there a prescriptive value to it? I suppose there are more questions to glean, but let's take a stab at the why.

Smart contracts are essentially two concepts mashed together. One is the notion of software: cold, austere code that does as it is written and executes for the world to see. The other is the idea of an agreement between parties. Both have semantic demands that humans have traditionally had issues with, and both have connections to worlds beyond the scope in which the contract lives.

Much of the focus of our current platforms such as Ethereum is on performance or security, yet abstracting to a more Ontological viewpoint, one ought to ask about semantics and scope.

From a semantic perspective, we are trying to establish what the authors and users of smart contracts believe to be the purpose of the contract. Here we have consent, the potential for non est factum style circumstances, a hierarchy of enforceability and other factors that have challenged contract law. What about cultural and linguistic barriers? Ambiguity is also king in this land.

Where normal contracts tend to pragmatically bind to a particular jurisdiction and set of interpretations, with the escape hatch of arbitration or courts to parse purposeful ambiguity, decentralized machines have no such luxury. For better or worse, there is a pipeline with smart contracts that amplifies the semantic gap and then encapsulates the extracted consensus into code that again suffers from its own gap (Loi Luu demonstrated this recently using Oyente).

Then these structures presume dominion over something of value, whether that be data, tokens or markers that represent real life commitments or things such as deeds or titles. For the last category, like software giving recommendations to act on something in the physical world, the program can tell one what to do, but someone has to do it.

So we have an object combining software and agreements that has deep semantic and scope concerns, but one could add more dimensions. There is the question of establishing facts and events. The relationship with time. The levels of interpretation for any given agreement. Should everything, strictly speaking, be parsed by machines? Is there room for human judgement in this model (see Szabo's wet and dry code and this presentation)?

One could make a fair argument that one of the core pieces of complexity behind protocols like Ethereum is that they aren't just flirting with self-enforcing smart contracts. There are inherited notions from the Bitcoin ecosystem, such as maximizing decentralization, maintaining a certain level of privacy and the use of a blockchain to order facts and events. Let's not even explore the native unit of account.

These concepts and utilities are fascinating, but contaminate attempts at a reasonable Ontology that could be constructive. A less opinionated effort has come from the fintech world, with both Clack's work on Smart Contract Templates and Brammertz's work on Project ACTUS. Here we don't need immutability or blockchains; the execution environment doesn't matter as much. It's more about consensus on intent and evaluation to optimize processes.

What about the relationship of smart contracts with other smart contracts? In the cryptocurrency space, we tend to be blockchain focused, yet this framing actually obscures that there are three data domains in a system that uses smart contracts.

The blockchain accounts for facts, events and value. There is a graph of smart contracts in relation to each other. Then there is a social graph of nodes or things that can interact with smart contracts. These are all incredibly different actors. Adding relays into the mix, one could even discuss the internet of smart contract systems.
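One way to make these three domains concrete is as three distinct structures: a ledger of facts and events, a graph of contract-to-contract dependencies, and a graph of which actors touch which contracts. This is a hypothetical sketch with made-up names, not a description of any deployed system:

```python
# Hypothetical sketch of the three data domains around smart contracts.

# Domain 1: the blockchain as a ledger of facts, events and value.
ledger = [
    {"event": "deploy", "contract": "escrow"},
    {"event": "transfer", "from": "alice", "to": "escrow", "amount": 5},
]

# Domain 2: the graph of smart contracts in relation to each other
# (contract -> contracts it calls).
contract_graph = {
    "escrow": ["oracle", "token"],
    "oracle": [],
    "token": [],
}

# Domain 3: the social graph of actors that interact with contracts.
social_graph = {
    "alice": ["escrow"],
    "bob": ["escrow", "token"],
}

def reachable_contracts(start, graph):
    """Transitively collect every contract `start` may end up calling."""
    seen, stack = set(), [start]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(graph.get(c, []))
    return seen

# Alice's interaction with escrow transitively touches oracle and token,
# even though neither appears in her social graph entry.
assert reachable_contracts("escrow", contract_graph) == {"escrow", "oracle", "token"}
```

The point of separating the three is that each domain has different actors, different consistency guarantees and different reasons to be queried.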

Perhaps where an Ontology could be most useful is on this last point. There seems to be economic value in marrying code to law for at least the purpose of standardization and efficiency, yet the hundreds of implicit assumptions and conditions upon which these systems are built need to be modelled explicitly for interoperability.

For example, if one takes a smart contract hosted on Rootstock and then via a relay communicates with a contract hosted on Ethereum and then connects to a data feed from a service like Bloomberg, then what's the trust model? What assumptions has one made about the enforceability of this agreement, the actors who can influence it and the risk to the value contained? Like using dozens of software libraries with different licenses, one is creating a digital mess.
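Under the (simplifying) assumption that a composed system inherits the union of its components' trust assumptions, the example above can be sketched as follows; the component names and assumption strings are mine, chosen for illustration:

```python
# Hypothetical sketch: the trust model of a composed system is at least
# the union of the assumptions of every component in the pipeline.

trust_assumptions = {
    "rootstock_contract": {"Bitcoin miner honest majority",
                           "RSK federation behaves honestly"},
    "relay":             {"relay operator neither censors nor forges messages"},
    "ethereum_contract": {"Ethereum miner honest majority",
                          "EVM semantics behave as expected"},
    "bloomberg_feed":    {"feed reports accurate data",
                          "feed signing key is not compromised"},
}

def composed_trust_model(components):
    """Union of the trust assumptions of every component used."""
    model = set()
    for c in components:
        model |= trust_assumptions[c]
    return model

pipeline = ["rootstock_contract", "relay", "ethereum_contract", "bloomberg_feed"]
model = composed_trust_model(pipeline)

# The composed agreement silently depends on all seven assumptions at once,
# even though each component advertised only its own.
assert len(model) == 7
```

Making such a union explicit, rather than implicit in the plumbing, is exactly the kind of metadata an Ontology could standardize.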

To wrap up some of my brief thoughts, I think we need to do the following. First, decouple smart contracts conceptually from blockchains and their associated principles. Second, come to grips with the semantic gap and also the scope of enforcement. Third, model the relationships of smart contracts with each other, the actors who use them and related systems. Fourth, extract some patterns, standards and common use practices from already deployed contracts to see what we can infer. Finally, come up with better ways of making assumptions explicit.

This approach seems to prepare the ground for sorting out how one will design systems that host smart contracts and how such systems will relate to each other. There is a profound lack of metadata for smart contracts floating around. Perhaps an Ontology could provide a coherent way of labeling things?

Thanks for Reading,