
1. Bells 101

To get into the Christmas spirit I thought I’d write a post about the Bell numbers. The Bell numbers {B_{n}} count the number of partitions of a set of size {n} where a partition means what you would guess it does: a decomposition of the set into disjoint non-empty parts. In detail, writing {[n]=\{1,2,\ldots,n\}}, we say {A_{1},\ldots,A_{k}\subset [n]} form a partition of {[n]} if they are disjoint (that is {A_{i}\cap A_{j}=\varnothing} for all {i\neq j}), cover {[n]} (that is {\cup_{i=1}^{k}A_{i}=[n]}) and are non-trivial ({A_{i}\neq\varnothing} for all {i}). Again surprising no one, given such a partition, we call the {A_{i}} the parts.

While there are plenty of things we could ask about partitions I’m solely interested here in counting how many partitions there are of {[n]}, calling this the {n}th Bell Number and written {B_{n}}. Specifically I’m interested in getting an idea of how quickly {B_{n}} grows with {n} and to this end will be trying to find some half-decent bounds on {B_{n}}.
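Before chasing bounds it’s handy to be able to compute {B_{n}} exactly for small {n}. The sketch below (my own illustration, not part of the original argument) does this with the Bell triangle: each row starts with the last entry of the row above, each later entry is the sum of its left neighbour and the entry above that neighbour, and {B_{n}} is the first entry of row {n}.

```python
def bell_numbers(n_max):
    """Return [B_0, B_1, ..., B_{n_max}] via the Bell triangle."""
    bells = [1]      # B_0 = 1: the empty set has exactly one partition
    row = [1]        # row 0 of the triangle
    for _ in range(n_max):
        # each row starts with the last entry of the row above...
        new_row = [row[-1]]
        # ...and each later entry adds its left neighbour to the
        # entry above that neighbour
        for entry in row:
            new_row.append(new_row[-1] + entry)
        row = new_row
        bells.append(row[0])
    return bells
```

For instance this gives {B_{5}=52}: there are 52 partitions of a 5 element set.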

2. The motivating problem

Last year I found myself discussing the following problem with one of my students (after they had their supervision I might add!) from one of their problem sheets — but perhaps I won’t say which lest I give students an easy way to spoil a question for themselves. It asked students to prove two bounds on the Bell numbers (though the question was stated in terms of counting equivalence relations).

Specifically they were asked to show that

\displaystyle 2^{n-1}\leq B_{n}\leq 2^{(n^{2}-n)/2}.

Where might these bounds come from?

Well the left hand side is {\frac{1}{2}2^{n}}, or half the number of subsets of {[n]} — perhaps that gives a clue? Given any single subset of {[n]} we can define a partition by taking it and its complement as the two parts (I suppose strictly if one is empty we take the whole of {[n]} as the only part). But then of course every such partition arises in this way exactly twice since initially choosing a subset or its complement generates the same bipartition (again I’m lying here in the case that the set or its complement is empty). So there certainly are at least {2^{n-1}} such partitions.
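This construction is easy to check mechanically for a small {n}. The sketch below (illustrative only) builds, for each subset {S} containing the element 1, the bipartition consisting of {S} and its complement, and confirms that all {2^{n-1}} resulting partitions are distinct; insisting that {1\in S} is exactly what kills the factor-of-two double count.

```python
from itertools import combinations

n = 6
ground = frozenset(range(1, n + 1))   # the set [n] = {1, ..., n}

seen = set()
for k in range(n):
    # subsets S of [n] with 1 in S: pick the k other elements of S
    for rest in combinations(range(2, n + 1), k):
        s = frozenset((1,) + rest)
        comp = ground - s
        # the bipartition {S, S^c}, dropping the complement if empty
        partition = frozenset(p for p in (s, comp) if p)
        seen.add(partition)

# every subset containing 1 gives a different partition
assert len(seen) == 2 ** (n - 1)
```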

On the other hand the right hand side is {2^{\binom{n}{2}}}, the number of collections of (unordered) pairs in {[n]}. So what could choosing a collection of pairs have to do with counting partitions? Well this is where the equivalence relation perspective is useful: every partition corresponds to a unique equivalence relation and an equivalence relation is just a collection of unordered pairs with some extra restrictions. If you haven’t seen equivalence relations before then instead, given a partition, generate a unique collection of pairs in {[n]} by including every pair of elements lying in the same part. But there are {2^{\binom{n}{2}}} ways of choosing a collection of pairs in any way whatsoever, so certainly there can be at most this many partitions.
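Again this can be sanity checked by brute force for small {n}. A sketch (illustrative, not from the original argument): enumerate every partition of a 5 element set, map each to its collection of same-part pairs, and confirm that no two partitions produce the same collection, which is exactly the injectivity the upper bound needs.

```python
from itertools import combinations

def partitions(elements):
    """Yield every set partition of the list `elements`,
    each as a tuple of frozensets."""
    if not elements:
        yield ()
        return
    first, rest = elements[0], elements[1:]
    # choose the part containing `first`: any subset of the rest joins it
    for k in range(len(rest) + 1):
        for others in combinations(rest, k):
            part = frozenset((first,) + others)
            remaining = [x for x in rest if x not in part]
            for sub in partitions(remaining):
                yield (part,) + sub

def pair_set(partition):
    """All unordered pairs of elements lying in the same part."""
    return frozenset(frozenset(p) for part in partition
                     for p in combinations(part, 2))

ps = list(partitions(list(range(1, 6))))
images = {pair_set(p) for p in ps}
# no two partitions share a pair set, so B_5 <= 2^binom(5,2)
assert len(images) == len(ps)
```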

So there we are: we’ve got both bounds. The original question however went on to ask: “can you give a stronger upper bound?”. For the rest of this post we’ll give a few refinements to both bounds until we hopefully come away with a reasonable idea how big {B_{n}} is.

2.1. A cheap upper bound

So far then we have

\displaystyle 2^{n-1}\leq B_{n}\leq 2^{(n^{2}-n)/2}.

We’d like to avoid a can’t-see-the-wood-for-the-trees situation and so introduce a little bit of notation to allow us to state the results with a little less precision but without too much extraneous information. We write {o(1)} for something which goes to zero as {n} goes to infinity. So for example since we have

\displaystyle \log_{2}{B_{n}}\geq n-1

and of course {n-1 = n(1-\frac{1}{n})} we can write

\displaystyle \log_{2}{B_{n}}\geq (1-o(1))n.

This allows us to discard a bit of the nuance of the error terms and write our bounds so far as

\displaystyle \left(1-o(1)\right)n\leq\log_{2}{B_{n}}\leq \left(1-o(1)\right)\frac{1}{2}n^{2}.

There’s clearly quite a gap.

It turns out it’s not actually as hard as you’d expect to get a pretty substantial improvement on the upper bound. Indeed we already know how to count partitions carrying a little extra structure: if we additionally put a cyclic order on each of the parts then what we get is exactly a permutation of {[n]} (the parts being its cycles), and there are {n!} of those. So {B_{n}\leq n!}, and by Stirling’s formula this gets the upper bound down to:

\displaystyle \log{B_{n}}\leq \left(1-o(1)\right)n\log{n}.

Now since {n!} is superexponential, in other words since our new upper bound on {\log{B_{n}}} grows faster than linearly, our next objective might as well be to try and find a superexponential lower bound on {B_{n}} (or equivalently a superlinear lower bound on {\log{B_{n}}}).
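The correspondence between permutations and cyclically ordered partitions can also be seen concretely: a permutation’s cycle decomposition is precisely a partition with a cyclic order on each part, so forgetting the orders maps the {n!} permutations onto all the partitions. The sketch below (illustrative only) performs this forgetting map for {n=5} and recovers every one of the {B_{5}=52} partitions.

```python
from itertools import permutations

def cycle_partition(perm):
    """The partition of {0,...,n-1} into the cycles of `perm`,
    where `perm` is a tuple sending i to perm[i]."""
    seen, parts = set(), []
    for i in range(len(perm)):
        if i in seen:
            continue
        cycle, j = [], i
        while j not in seen:    # walk the cycle through i
            seen.add(j)
            cycle.append(j)
            j = perm[j]
        parts.append(frozenset(cycle))
    return frozenset(parts)

n = 5
images = {cycle_partition(p) for p in permutations(range(n))}
# the n! permutations between them hit every partition, so B_n <= n!
assert len(images) == 52
```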

2.2. Ripping off the last lower bound

Here the idea behind the last lower bound gives a good idea of how we might make some improvements. Once we’ve chosen a subset, why just take what’s left as the other part when we could partition what’s left? Sure this doesn’t give an explicit bound, but it could give us a way to lower bound {B_{n}} in terms of {B_{m}} for some values of {m} smaller than {n}.

For instance, if we fix the size of the set we choose in the first instance as {k} then the number of ways we can partition what’s left over is just {B_{n-k}}. The slight wrinkle in this plan is that one of the parts we chose in the partition of the {n-k} elements left over could have size {k} and then we could have ended up with the same partition by choosing that subset as the initial subset of size {k}. This issue can be sidestepped by just insisting that we always include a fixed element, say 1, in the initial part or subset and selecting say {k} other elements to place in the part with it from among the {n-1} elements which aren’t 1. Strictly this construction gives {B_{n}\geq\binom{n-1}{k}B_{n-k-1}}, but we can squeeze out a touch more: the partitions in which the part containing 1 has size {k} rather than {k+1} contribute a further {\binom{n-1}{k-1}B_{n-k}\geq\binom{n-1}{k-1}B_{n-k-1}}, and since {\binom{n-1}{k}+\binom{n-1}{k-1}=\binom{n}{k}} we get

\displaystyle B_{n}\geq\binom{n}{k}B_{n-k-1}.

Even for moderately small {k} this gives an improvement on what we had before. Say for example if we take {k=1} then we get

\displaystyle B_{n}\geq nB_{n-2}.

That is to say compared to our lower bound {2^{n-1}} in which we pick up a factor of 4 when {n} is increased by 2, from this bound we see that we actually pick up at least a factor of {n}. We’re ultimately not too concerned by exactly what we can get out of the {k=1} case. However, since it will serve as a nice warm up for what we will do next we give some of the details. Before doing this it’s worth pointing out that since we get a factor of {n} going from {B_{n-2}} to {B_{n}}, the best lower bound we could get from this can’t be better than {n^{\frac{1}{2}n}}.
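The factor-of-{n} claim is easy to watch numerically. The sketch below (my own check) computes exact Bell numbers via the standard recurrence {B_{n+1}=\sum_{k}\binom{n}{k}B_{k}} (a well-known identity, not derived in this post) and verifies {B_{n}\geq nB_{n-2}} over a range of {n}.

```python
from math import comb

def bells(n_max):
    """[B_0, ..., B_{n_max}] via B_{n+1} = sum_k C(n,k) * B_k."""
    b = [1]
    for n in range(n_max):
        b.append(sum(comb(n, k) * b[k] for k in range(n + 1)))
    return b

b = bells(20)
# the k = 1 recursive lower bound from above
assert all(b[n] >= n * b[n - 2] for n in range(2, 21))
```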

On the other hand, suppose we had {B_{n-2}\geq (n-2)^{(\frac{1}{2}-\varepsilon)(n-2)}} for some {\varepsilon>0} (that is we pretty much have the {n^{\frac{1}{2}n}} bound for {B_{n-2}}, but just for a fraction slightly less than {\frac{1}{2}}) then our recursive bound would give

\displaystyle  \begin{array}{rcl}  B_{n}&\geq& nB_{n-2}\\ &\geq& n (n-2)^{(\frac{1}{2}-\varepsilon)(n-2)}\\ &=& n^{2(\frac{1}{2}-\varepsilon)} n^{2\varepsilon} n^{(\frac{1}{2}-\varepsilon)(n-2)} \left(1-\frac{2}{n}\right)^{(\frac{1}{2}-\varepsilon)(n-2)} \end{array}

simply by breaking the exponent of {n} up as {2(\frac{1}{2}-\varepsilon)+2\varepsilon} and writing {n-2=n(1-\frac{2}{n})}. But then

\displaystyle  \begin{array}{rcl}  B_{n}&\geq& n^{2(\frac{1}{2}-\varepsilon)} n^{2\varepsilon} n^{(\frac{1}{2}-\varepsilon)(n-2)} \left(1-\frac{2}{n}\right)^{(\frac{1}{2}-\varepsilon)(n-2)}\\ &=& n^{(\frac{1}{2}-\varepsilon)n} n^{2\varepsilon} \left(1-\frac{2}{n}\right)^{(\frac{1}{2}-\varepsilon)(n-2)} \end{array}

by collecting terms. And this in turn exceeds {n^{(\frac{1}{2}-\varepsilon)n}} for {n} sufficiently large since {n^{2\varepsilon}\to\infty} and {\left(1-\frac{2}{n}\right)^{(\frac{1}{2}-\varepsilon)(n-2)}\to e^{-(1-2\varepsilon)}} as {n\to\infty}. All this is to say that we can prove a lower bound of the shape {n^{(\frac{1}{2}-\varepsilon)n}} by induction once it actually starts to be true. The good news is that for any {\varepsilon>0} and any bounded range of {n} there has to be a constant {C>0} such that the bound {B_{n}\geq Cn^{(\frac{1}{2}-\varepsilon)n}} holds on that range, and thereafter the argument above shows that this bound will hold for all {n} by induction.

Clearly even for {k=1} we could do a bit better than this. Indeed the previous argument only actually requires of {\varepsilon} that {\frac{1}{\log n}\ll \varepsilon \ll 1}. But the overall improvement is pretty small fry compared to what we can pull off with a similar amount of laziness for larger (but bounded) values of {k}.

So far then we have

\displaystyle \left(\frac{1}{2}-o(1)\right)n\log{n}\leq\log{B_{n}}\leq \left(1-o(1)\right)n\log{n}.

By looking at bounded {k} it’s possible to show that

\displaystyle \log{B_{n}}\geq (1-o(1))n\log{n}.

Indeed we have

\displaystyle B_{n}\geq \frac{(n)_{k}}{k!}B_{n-(k+1)}

Now for {k} bounded we have {n^{k}\geq(n)_{k}\geq n^{k}(1-\frac{k-1}{n})^{k}}, so {(n)_{k}=(1-o(1))n^{k}} (in fact the error term here is {O(\frac{1}{n})}). Moreover, since {k} is bounded, {k!} is potentially quite large but certainly bounded in {n}. At this point it would be handy to introduce a little bit of new notation.

You’ll remember that in the {k=1} case there was a huge amount of slack. Indeed it wouldn’t have mattered if there were some (even potentially very small) constants (that is maybe depending on {k} but not on {n}) kicking around since we had a factor of {n^{2\varepsilon} \left(1-\frac{2}{n}\right)^{(\frac{1}{2}-\varepsilon)(n-2)}} in there which went to infinity with {n}. With this in mind we write {\gtrsim} when the bound holds but perhaps up to a multiplicative constant.

Thus we have

\displaystyle  \begin{array}{rcl}  B_{n}&\geq& \frac{(n)_{k}}{k!}B_{n-(k+1)}\\ &\gtrsim& n^{k}B_{n-(k+1)} \end{array}

and from here the argument is much as in the {k=1} case. For completeness’ sake I include it below, but if you’re trusting then feel free to skip ahead. Since we’re picking up {n^{k}} when going from {B_{n-(k+1)}} to {B_{n}}, that is incrementing the subscript {k+1} times, we’re expecting to get something around {n^{\frac{k}{k+1}n}} but certainly no better. As before we can proceed by induction: supposing that {B_{n-(k+1)}\geq (n-(k+1))^{(\frac{k}{k+1}-\varepsilon)(n-(k+1))}} for some {\varepsilon>0}, our recursive bound would give

\displaystyle  \begin{array}{rcl}  B_{n}&\gtrsim& n^{k} (n-(k+1))^{(\frac{k}{k+1}-\varepsilon)(n-(k+1))}\\ &=& n^{(\frac{k}{k+1}-\varepsilon)(k+1)} n^{(k+1)\varepsilon} n^{(\frac{k}{k+1}-\varepsilon)(n-(k+1))} \left(1-\frac{k+1}{n}\right)^{(\frac{k}{k+1}-\varepsilon)(n-(k+1))}\\ &\geq& n^{(\frac{k}{k+1}-\varepsilon)n}\\ \end{array}

for {n} sufficiently large as {n^{(k+1)\varepsilon}\to\infty} and {\left(1-\frac{k+1}{n}\right)^{(\frac{k}{k+1}-\varepsilon)(n-(k+1))}\to e^{-(k-(k+1)\varepsilon)}} as {n\to\infty}. As in the last case we could get away with {\frac{1}{\log n}\ll \varepsilon \ll 1}.
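As a quick numerical sanity check of the general-{k} bound (again my own sketch, computing exact Bell numbers via the standard recurrence {B_{n+1}=\sum_{j}\binom{n}{j}B_{j}}):

```python
from math import comb

def bells(n_max):
    """[B_0, ..., B_{n_max}] via B_{n+1} = sum_j C(n,j) * B_j."""
    b = [1]
    for n in range(n_max):
        b.append(sum(comb(n, j) * b[j] for j in range(n + 1)))
    return b

b = bells(30)
# B_n >= C(n,k) * B_{n-k-1} for several bounded values of k
for k in range(1, 6):
    assert all(b[n] >= comb(n, k) * b[n - k - 1]
               for n in range(k + 1, 31))
```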

3. Taking stock

All in all then we’ve made pretty decent progress, going from

\displaystyle 2^{n-1}\leq B_{n}\leq 2^{(n^{2}-n)/2}

or slightly more loosely

\displaystyle \left(1-o(1)\right)n\leq\log_{2}{B_{n}}\leq \left(1-o(1)\right)\frac{1}{2}n^{2}

to

\displaystyle \left(1-o(1)\right)n\log{n}\leq\log{B_{n}}\leq \left(1-o(1)\right)n\log{n}.

That is to say we know that up to lower order terms {\log{B_{n}}} is {n\log{n}}, in the sense that their ratio tends to 1. It all looks a bit too good to be true really, but as always the devil is in the details and it turns out that the error terms we have are quite far apart. By Stirling’s formula we know that

\displaystyle \log{n!}=n\log{n}-n+O(\log{n})

whereas the situation with the error term in the lower bound is a little less clear. Strictly what we showed was that for any {\varepsilon} satisfying {\frac{1}{\log n}\ll \varepsilon \ll 1} there exists a {C_{\varepsilon,k}} such that {B_{n}\geq C_{\varepsilon,k}n^{(\frac{k}{k+1}-\varepsilon)n}} for all {n}. So without going to the trouble of working out how the constant depends upon the choices of {k} and {\varepsilon} we don’t really get any useful quantitative information on the error term in the lower bound. That is we have

\displaystyle n\log{n}-o(n\log{n})\leq\log{B_{n}}\leq n\log{n}-n+O(\log{n}).

The plan for next time then is to try to work out how big {n\log{n}-\log{B_{n}}} is.
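To get a feel for that gap ahead of time, here’s a rough numerical probe (a sketch using the same recurrence as before; nothing here is a proof): for {n=100} the quantity {n\log{n}-\log{B_{n}}} already comfortably exceeds {n}, yet sits far below {n\log{n}} itself.

```python
from math import comb, log

def bells(n_max):
    """[B_0, ..., B_{n_max}] via B_{n+1} = sum_k C(n,k) * B_k."""
    b = [1]
    for n in range(n_max):
        b.append(sum(comb(n, k) * b[k] for k in range(n + 1)))
    return b

n = 100
b = bells(n)
gap = n * log(n) - log(b[n])   # how far log B_n sits below n log n
# the gap already exceeds n, yet is far below n log n
assert n < gap < n * log(n)
```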

Merry Christmas!