This blog has moved!

I have decided to move my blog to a server that I control. WordPress.com is great, but many features which a power user would want require costly upgrades.  For example, just to use custom CSS one has to pay.  Furthermore, WordPress.com’s mathematics support is poor.

I decided to use my money to obtain a server to install the WordPress management system on and to use as my personal webpage and blog.  I went with Linode.  Linode provides Linux VPSes to the masses.  They sell a VPS to a client who has free rein of the machine.  Yes, I have root access and even the ability to reinstall the OS if I choose. In less than 24 hours I was able to get my Linode up and running with WordPress.

Using the WordPress management system is great, because it allows one to install a plugin called Jetpack which connects my WordPress system with WordPress.com.  This gives my site the ability to track stats and connect to social networks.  It is great!  Using the WordPress system also allows me to export all of my blog’s content at WordPress.com and import it at its new home.  Hence, all of my data is moved over and the new blog looks identical to this one.  Lastly, I have installed the MathJax plugin which gives my new blog the ability to render beautiful mathematics.

Our new home is now at

So repoint your RSS feeds over to the new place.  I hope everyone enjoys my future posts.

Posted in Goodbyes | 2 Comments

Coq, Category Theory, and Steve Awodey’s Book.

There is a great book on category theory called, you guessed it, “Category Theory” by Steve Awodey. I really enjoy it. I am interested in category theory because it provides a possible interpretation of type theory. Actually, I have been thinking of category theory and type theory as different perspectives on the same thing, and that thing is computation. As I learn more category theory, I thought it would be useful to start thinking about how to formalize results involving category theory in Coq.

To my surprise I found that Adam Megacz has started a nice library called coq-categories. His formalization follows Awodey’s book exactly; in fact, to find a concept in the formalization one has to know which chapter of Awodey’s book it appears in. It is a great formalization. It uses type classes to define categories and functors, and type classes are such a nice fit for categories.
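
To give a flavor of why type classes fit so well, here is a minimal sketch of a category as a Coq type class. To be clear, this is my own illustration, not Megacz’s actual definition from coq-categories, which differs in its details.

    (* A category: objects, hom-sets, identity, composition, and the laws.
       An illustrative sketch, not the coq-categories definition. *)
    Class Category (Ob : Type) (Hom : Ob -> Ob -> Type) := {
      ident   : forall {a}, Hom a a;
      compose : forall {a b c}, Hom b c -> Hom a b -> Hom a c;
      ident_left    : forall a b (f : Hom a b), compose ident f = f;
      ident_right   : forall a b (f : Hom a b), compose f ident = f;
      compose_assoc : forall a b c d (f : Hom a b) (g : Hom b c) (h : Hom c d),
        compose h (compose g f) = compose (compose h g) f
    }.

Once an instance is declared, instance resolution lets one write compose g f and have Coq figure out which category is meant, which is exactly the kind of overloading categorical constructions want.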

There are two issues with this formalization. First, it does not formalize anything beyond chapter 7 of the book. Second, it has not been worked on in over a year: the last commit was 2011-04-24, and the version of Coq it was developed in was 8.3pl2.

I decided to clone Adam’s git repo and get it working with Coq 8.4. It took only about fifteen minutes. The only problems that I ran into were related to unification of implicit arguments.

This post just serves as a place to put up a patch in case anyone is looking for one. I plan to develop this a bit further. I would like to formalize control categories and play around with the semantics of a few different type theories. Here is the patch.

I plan to post more about my extensions in the near future.

Posted in category theory, Coq, Fun, Logic, metatheory, Semantics, Type Theory | 2 Comments

Cartesian Closed Comics

There is a new comic called Cartesian Closed Comics. It is pretty good.

Below is my favorite one:

Find more by clicking the above link.

Posted in Fun | Leave a comment

Impredicativity and the Human Mind

I have not posted in a while, so I thought I would post something both short and interesting to get things going again. I defended my comprehensive exam this week. It went really well, and I plan to post the report on my website, but before I do I want to make a few corrections. Greg Landini, a philosophy professor here at UIowa, was an external member of my committee for my comprehensive exam. I am also doing a readings course with him on relevant logic. During our weekly meeting this week we got into a really interesting conversation comparing the meaning of impredicativity and predicativity according to Bertrand Russell with the definitions used in type theory. The conversation eventually turned to Greg’s work on understanding human thought and consciousness. In this post I would like to discuss the details of my conversation with Greg. First, I will introduce how I define impredicativity and predicativity. Then I will talk about how Russell defined these terms. Following these I will discuss Greg’s ideas on animal consciousness.

I define an impredicative formula to be a universally quantified formula — the complexity of the formula does not matter — such that the domain of quantification contains the formula being defined. A logic or type theory is impredicative if all universal formulas are impredicative. An example is the universal type of the Girard-Reynolds system F. A predicative formula, in contrast, is a universal formula such that the domain of quantification does not contain the formula being defined. We can make the universal type of system F predicative by stratifying types using kinds.
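
A concrete way to see the impredicativity of system F is that a universal type can be instantiated at itself; the notation below is mine. The polymorphic identity \mathsf{id} : \forall X. X \to X may be applied to its own type:

\mathsf{id}\ [\forall X. X \to X]\ \mathsf{id} : \forall X. X \to X

In a predicative, stratified version the quantifier at a given level may only be instantiated with types from lower levels, which rules out this self-instantiation.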

Consider the following formula in naive set theory: \exists y\forall x.(x \in y \iff x \not \in x). Greg points out that Russell did not call this an “impredicative formula”, because there is only one domain of objects to quantify over, namely the universe of all sets. Now I would call this an extreme case of impredicativity. That is, this formula is the most impredicative a formula can be; in fact, it is so impredicative that it is inconsistent. Other than this case, Russell’s views on impredicativity vs. predicativity are in line with the definitions we use in type theory. That is, types can be impredicative in the same way they are in system F and remain consistent, or, just as in stratified system F, they can be predicative. Indeed, this is the difference between the simple theory of types and the ramified theory of types. From here the conversation drifted over to Greg’s work on consciousness.
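
To see the inconsistency, let y be the set the formula asserts to exist, and instantiate the universal quantifier with y itself:

y \in y \iff y \not \in y

This is a contradiction: Russell’s paradox. The quantifier over all sets ranges over the very set being defined.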

Greg believes that there are two types of reasoning going on in the minds of animals. Some animals use predicative reasoning, while the more gifted use impredicative reasoning. He gave a great example involving a frog.

A frog sits and eats flies. The frog has the innate ability to spot a fly when it is near. If the frog hears a clicking noise which sounds like a fly, it will try to eat the object making the clicking. If the frog sees a flying object, it will also try to eat it. Now, if we use a tape recorder and a small toy which makes the same sound a fly makes, then we can easily trick the frog into eating the toy. Thus, a frog cannot first identify the object making the sound and determine whether or not it is actually a fly before deciding to eat it. This kind of thought, Greg believes, is predicative. This suggests that most animals use predicative reasoning. He says that the animals with impredicative reasoning got lucky during the evolutionary process. Now what do we mean by lucky?

The brain is believed to be constructed from layers of mesh which make up synaptic networks. This is depicted by the following figure.

The previous figure shows a stratification of the layers of mesh; this results in predicative reasoning. Impredicative reasoning results when the layers have connections between them, as in the following diagram.

This suggests that the more these layers are collapsed the more powerful reasoning becomes.

This fits with the views of type theory almost exactly. Think of Coq, where the type universe is stratified into levels. This stratification is much less powerful than the full collapse of the type universe, which results in \mathsf{Type}:\mathsf{Type}. The more we collapse the type universe the more powerful the theory becomes; think of the difference between stratified system F and system F. We can gain incremental power by making connections between the levels. One way of doing this is adding universe polymorphism. This is like the connections between the mesh layers discussed above. I found these connections quite surprising, but also not so surprising. The brain is essentially the world’s most powerful computer, so it makes sense that computer science could help in figuring out how it works. Computation is what we do, after all.
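
As a small illustration, consider the following Coq snippet. Coq appears to print \mathsf{Type}:\mathsf{Type}, but the two occurrences secretly live at different levels:

    (* Coq prints 'Type : Type', but this is really Type@{i} : Type@{i+1};
       universe levels are inferred and kept stratified behind the scenes. *)
    Check Type.

    (* Accepted only because the inner Type sits at a strictly lower level
       than the outer one; forcing the two levels to coincide is rejected
       as a universe inconsistency. *)
    Definition U : Type := Type.

Universe polymorphism, once added, lets a single definition be used at many levels at once; those are the connections between the levels in the analogy above.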

I think this is all I am going to say about this right now. I found this to be really cool. I thought maybe others might as well.

Posted in Logic, philosophy, Type Theory | Leave a comment

On the Dual Calculus with a Twist

Recently, my advisor (Aaron Stump) and I have been trying to unify induction and coinduction using a single operator and an explicit ordering. I know what you are thinking: awesome! While I could talk for hours about our ideas on how to do this, I am instead going to introduce a type theory I think everyone should know of.

While reading over the literature covering the duality between induction and coinduction and between call-by-name and call-by-value, we had the joy of learning about Philip Wadler’s dual calculus (DC) [1,2]. This type theory is the computational view of Gentzen’s classical sequent calculus LK; for more information on LK see [3]. One interesting property of the dual calculus is that its dual can be embedded in itself. This dual embedding can then be used to show that the call-by-value reduction strategy is dual to the call-by-name reduction strategy. Following on the previous point, DC can also be used to show that the call-by-value \lambda\mu-calculus is dual to the call-by-name \lambda\mu-calculus, by first embedding the call-by-value \lambda\mu-calculus into DC, taking the dual, and then translating back out of DC [1,2]. Cool stuff! In this post I would like to simply define DC and give an intuitive meaning to its judgments. However, there is a twist: I have come up with a new syntax for the judgments of DC. While DC is a beautiful theory, I am not a fan of the syntax for judgments in the original presentation, nor of that in Kimura’s paper [4]. I find them ugly, confusing, and hard to read. I hope the reader will find my syntax a bit more pleasing.

The syntax of DC is defined by the following set of grammars:

The first thing the reader who has at least a basic understanding of type theory will notice is that there is no function type or \lambda-abstraction! It turns out that these can actually be defined! I hope the reader’s head did not just explode. Here are the definitions:
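
For readers viewing this without the figures, the standard encodings from Wadler’s papers [1,2] are in this spirit; the precise definitions depend on the evaluation strategy. At the type level, roughly:

A \to B \triangleq \neg(A \wedge \neg B) (call-by-value)
A \to B \triangleq \neg A \vee B (call-by-name)

The \lambda-abstraction and application are then encoded using the introduction and elimination forms of these connectives; see [1,2] for the term-level definitions.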

Using these definitions, the usual typing rules are now derivable, as well as the usual \beta-reduction rule. As for the remaining syntax of DC, it is pretty routine, so I only comment on a few aspects of it. I call variables x input variables and variables \alpha output variables or output ports. The output ports will be replaced by the output of (co)terms. Statements t \bullet k can be viewed as commands where k is computed with input t. The term (s).\alpha binds the output port \alpha to the return value of the statement s. Dually, the coterm x.(s) binds the input variable x in s.

DC consists of three judgments. I will first give my new syntax for these judgments, compare my syntax with the original syntax, give an intuitive explanation of the judgments, and finally define them. The syntax of the three judgments is as follows:

Explanation of the syntax. The arrows in the judgments above indicate the flow of input and output. The first left-pointing arrow \leftarrow indicates that each expression t, k, and s can assign output to an output port in \Delta. The right-pointing arrow \rightarrow in the typing judgment indicates that t may return a value of type A; dually, the second left-pointing arrow in the cotyping judgment indicates that k takes an input of type A. Recall from above that, across the propositions-as-types correspondence, DC corresponds to Gentzen’s classical sequent calculus LK. LK consists of left and right rules, so the second arrow in the typing and cotyping judgments also indicates the kind of rules which define the judgment: the rules defining the typing judgment are right rules, hence the second arrow points right, while the second arrow in the cotyping judgment points left, indicating that its rules correspond to the left rules of LK.
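
Since the figures give the official syntax, the following is only a rough rendering, reconstructed purely from the prose description above; the exact arrow placement is my guess:

\Delta \leftarrow \Gamma \vdash t \rightarrow A (typing)
\Delta \leftarrow \Gamma \vdash k \leftarrow A (cotyping)
\Delta \leftarrow \Gamma \vdash s (cut)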

The syntax used by Kimura was the following:

This syntax is very similar to the original syntax of Wadler, but instead of \vdash he used an arrow pointing to \Delta, which is close to my syntax, but still not as pretty.  I found this style of syntax hard to read, though it does make sense once understood. I have this annoying belief that the syntax of a judgment should be sexy enough to make writing out derivations somewhat manageable.

Intuitive meaning of the judgments. Let’s first expand the syntax out a bit.
The typing judgment means that when each x_i has a value and t is computed, then t may either return a value of type A or assign a value to one of the output ports o_i of type B_i. Dually, the cotyping judgment means that when each x_i has a value, then k takes an input of type A and, after computation, assigns a value to one of the output ports. Finally, the cut judgment means that when each x_i has a value, then s is computed and assigns a value to one of the output ports. This intuitive explanation helped and continues to help me understand DC. I do not claim authorship of such a great explanation; I learned it from [4], where they did a great job explaining the theory.

Definition of the judgments of DC. Finally, I move on to define each judgment. The following is the definition of the typing and cotyping judgments:
Lastly, we have the sole rule of the cut judgment, the cut rule:

A nice feature of this type theory is that it is easy to translate it to a logic equivalent to LK. Let’s try this out. We define the following erasure:
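
In a nutshell, the erasure forgets the (co)terms and keeps only the formulas. Reading off the intuitive meaning of the judgments above, it should act roughly as follows (a sketch of the idea, not the exact figure):

\Gamma \vdash A, \Delta (from typing)
\Gamma, A \vdash \Delta (from cotyping)
\Gamma \vdash \Delta (from cut)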

Using this erasure we obtain the following logic, which is equivalent to LK.

We have the cut rule last:

I might prove this logic equivalent to LK (that is, all truths deducible in this logic are deducible in LK) in a later post. To finish off the definition of DC, let’s take a look at the reduction rules. I do not pin down any specific reduction strategy here. The reduction rules of DC are:
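
For readers without the figure, the two key rules, written here in Wadler’s notation [1,2] rather than mine, cut a binder against an arbitrary (co)term:

(s).\alpha \bullet k \rightsquigarrow s[k/\alpha]
t \bullet x.(s) \rightsquigarrow s[t/x]

A statement of the form (s).\alpha \bullet x.(s') matches both rules, and this critical pair is exactly where the strategies diverge: roughly, call-by-value gives priority to the first rule and call-by-name to the second.
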
Well, that is the dual calculus. It is a really interesting theory of classical logic. There are quite a few papers on the dual calculus; I sprinkled some throughout the post, but if the reader knows of any others I would love to hear about them.

Posted in Logic, Type Theory | 3 Comments

The 2012 Oregon Programming Languages Summer School

The lecture videos and some slides are up! You can find them here.

I am particularly excited about Steve Awodey’s talk.

Dig in.

Posted in Logic, metatheory, Semantics, Type Theory, Videos | Leave a comment

The Seventh ACM SIGPLAN Workshop on Programming Languages meets Program Verification (PLPV 2013)

I am on the program committee for PLPV 2013. The first call for papers is out. I thought I would share it here just in case anyone reading this doesn’t know.

The Seventh ACM SIGPLAN Workshop on Programming Languages meets Program Verification (PLPV 2013)

22nd January, 2013
Rome, Italy
(Affiliated with POPL 2013)

Call for Papers


The goal of PLPV is to foster and stimulate research at the
intersection of programming languages and program verification, by
bringing together experts from diverse areas like types, contracts,
interactive theorem proving, model checking and program analysis. Work
in this area typically attempts to reduce the burden of program
verification by taking advantage of particular semantic or structural
properties of the programming language. One example is dependently
typed programming languages, which leverage a language’s type system
to specify and check rich specifications. Another example is
extended static checking systems, which incorporate contracts with
either static or dynamic contract checking.

We invite submissions on all aspects, both theoretical and practical,
of the integration of programming language and program verification
technology. To encourage interaction between different
communities, we seek a broad scope for PLPV. In particular,
submissions may have diverse foundations for verification (based on
types, Hoare-logic, abstract interpretation, etc), target
different kinds of programming languages (functional, imperative,
object-oriented, etc), and apply to diverse kinds of program
properties (data structure invariants, security properties, temporal
protocols, resource constraints, etc).

Important Dates

Submission 8th October, 2012 (Monday)
Notification 1st November, 2012 (Thursday)
Final Version 8th November, 2012 (Thursday)
Workshop 22nd January, 2013 (Tuesday)


We seek submissions of up to 12 pages related to the above
topics; shorter submissions are also welcome. Submissions may describe
new work, propose new challenge problems for language-based
verification techniques, or present a known idea in an elegant way
(i.e., a pearl).

Submissions should be prepared in the SIGPLAN two-column conference
format. Submitted papers must adhere to the SIGPLAN republication
policy. Concurrent submissions to other workshops, conferences,
journals, or similar forums of publication are not allowed.

To submit a paper, access the online submission site at


Accepted papers will be published by the ACM and will appear in the
ACM Digital library.

Program Committee

Andreas Abel Ludwig-Maximilians-University Munich (co-chair)
Robert Atkey University of Strathclyde
Harley Eades The University of Iowa
Chung-Kil Hur Max Planck Institute for Software Systems
Brigitte Pientka McGill University
Andrew Pitts University of Cambridge
François Pottier INRIA
Tim Sheard Portland State University (co-chair)
Makoto Takeyama Advanced Industrial Science and Technology

Posted in Conferences/Workshops | Leave a comment