This document gives a self-contained presentation of the proof of the (improved) sunflower lemma.
Definition. A sunflower with $r$ petals is a collection of $r$ distinct sets $S_1, \ldots, S_r$ where for all $i \ne j$, the intersection $S_i \cap S_j$ has the same value $K$, called the core (the sets $S_i \setminus K$ are the petals). Equivalently: for any three sets $S_i, S_j, S_k$ we have $S_i \cap S_j = S_i \cap S_k$, and no element in the universe is contained in at least two but not all of the sets.
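To make the pairwise-intersection condition concrete, here is a small Python sketch (the helper name `is_sunflower` is ours, not standard):

```python
from itertools import combinations

def is_sunflower(sets):
    """True iff the given distinct sets all share the same pairwise
    intersection (the core)."""
    sets = [frozenset(s) for s in sets]
    if len(set(sets)) != len(sets):
        return False  # the sets of a sunflower are distinct
    if len(sets) < 2:
        return True
    core = sets[0] & sets[1]
    return all((a & b) == core for a, b in combinations(sets, 2))

print(is_sunflower([{1, 2}, {1, 3}, {1, 4}]))  # True: core is {1}
print(is_sunflower([{2, 3}, {3, 4}, {4, 5}]))  # False: intersections differ
```

Note that pairwise disjoint sets form a sunflower with empty core, consistent with the definition.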
Lemma 1. Original sunflower lemma. [ER60] A set system containing more than $w!\,(r-1)^w$ sets, each of size $w$, must contain a sunflower with $r$ petals.
Example 2. Application: Erdős and Sárközy. Given $n$, let $k$ be a parameter. For any $A \subseteq \{1, \ldots, n\}$ with $\binom{|A|}{k} > kn \cdot k!\,2^k$, there are subsets $B_1 \subseteq B_2 \subseteq B_3 \subseteq A$ whose element sums form an arithmetic progression. Proof: we have more than $k!\,2^k$ different subsets of $A$ of size $k$ with the same sum, since there are only $kn$ possible values of the sum. By the sunflower lemma (Lemma 1), we have a sunflower with three petals in here, $S_1, S_2, S_3$; let $Y$ be the common core. The sums of $S_1 \setminus Y$ and $S_2 \setminus Y$ are equal; call this common value $d$. Let the subsets be $B_1 = Y$, $B_2 = S_1$, and $B_3 = S_1 \cup S_2$: their sums are $\sigma(Y)$, $\sigma(Y) + d$, $\sigma(Y) + 2d$, an arithmetic progression. (With $r$ petals, the same construction, $Y$, $S_1$, $S_1 \cup S_2$, $S_1 \cup S_2 \cup S_3$, and so on, gives sums forming an arithmetic progression of length $r + 1$.)
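A toy instance of this construction in Python (the specific numbers are illustrative assumptions, not part of the example):

```python
# Three 2-element subsets of {1,...,9}, all with sum 7; they are pairwise
# disjoint, hence a sunflower with empty core Y.
S1, S2, S3 = {1, 6}, {2, 5}, {3, 4}
Y = S1 & S2 & S3                  # the core: empty here
B1, B2, B3 = Y, S1, S1 | S2       # the chain from the proof
sums = [sum(B1), sum(B2), sum(B3)]
print(sums)                       # [0, 7, 14]: an arithmetic progression
```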
Some history:
1960: Erdős and Rado [ER60] prove that set systems containing more than $w!\,(r-1)^w$ sets of size $w$ contain sunflowers with $r$ petals.
2019: Ryan Alweiss, Shachar Lovett, Kewen Wu, and Jiapeng Zhang, “Improved Bounds for the Sunflower Lemma” [ALW+20], reduce the bound to $(r \log w)^{O(w)}$.
2020: Anup Rao, “Coding for Sunflowers”, in Discrete Analysis [Rao20]: a simplified proof using the converse of the noiseless coding theorem.
2020: Terence Tao, “The sunflower lemma via Shannon entropy” [Tao20]: a proof using entropy.
2020: Tolson Bell, Suchakree Chueluecha, and Lutz Warnke, “Note on sunflowers” [BCW21]: a small adjustment improving the bound to $(Cr\log w)^w$.
2021: Lunjia Hu, “Entropy Estimation via Two Chains: Streamlining the Proof of the Sunflower Lemma” [Hu21]: a simpler version of the entropy proof, which also slightly reduces the constant.
2022: I present this, mostly copied from [Hu21; Tao20], but with slightly different bookkeeping and avoiding the use of binomial coefficients.
Theorem 3. There exists a universal constant $C$ so that for any $r \ge 2$, $w \ge 2$, every set system containing more than $(Cr\log_2 w)^{2w}$ sets of size $w$ must contain an $r$-petaled sunflower. (The proof below works with $C = 48$; the sharper bound $(Cr\log w)^w$ of [ALW+20; Rao20; BCW21] follows from the same outline with a sharper form of the refinement inequality, Lemma 7.)
Definition. Spread set: for $R \ge 1$, a random set $A$ is $R$-spread if for all nonempty sets $S$, $\Pr[S \subseteq A] \le R^{-|S|}$. A set family $\mathcal{F}$ is $R$-spread if uniformly picking a random set in the family yields an $R$-spread random set.
For example, if the set system has the property that every set contains at least one of a special collection of $k$ elements $x_1, \ldots, x_k$, then it can be no more than $k$-spread: some $x_j$ lies in at least a $1/k$ fraction of the sets, so $\Pr[\{x_j\} \subseteq A] \ge 1/k$.
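The spread parameter of a small family can be computed by brute force; the following sketch (our own helper, exponential in set sizes) returns the largest $R$ for which the family is $R$-spread:

```python
from itertools import combinations

def spread(family):
    """Largest R such that Pr[S <= A] <= R**(-|S|) for every nonempty S,
    where A is uniform over the family. Only subsets of members can have
    positive probability, so it suffices to scan those."""
    fam = [frozenset(s) for s in family]
    cands = {frozenset(c) for a in fam
             for k in range(1, len(a) + 1)
             for c in combinations(sorted(a), k)}
    return min((len(fam) / sum(1 for b in fam if s <= b)) ** (1 / len(s))
               for s in cands)

# Every set contains the special element 1, so at most 1-spread:
print(spread([{1, 2}, {1, 3}, {1, 4}]))                        # 1.0
# All 2-subsets of {1,2,3,4}: each singleton appears in 3 of 6 sets.
print(spread([set(c) for c in combinations(range(1, 5), 2)]))  # 2.0
```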
Remark 4. Given a random $R$-spread set $A$, let $B$ be a random set for which $B \subseteq A$ almost surely. Then $B$ is also $R$-spread, since $\Pr[S \subseteq B] \le \Pr[S \subseteq A] \le R^{-|S|}$. (The same holds for conditional distributions: if $A$ is $R$-spread after conditioning on some event or variable, so is $B$.)
Lemma 5. Locating a single petal. Let $w \ge 1$ and $0 < \delta_0 \le 1/2$. Assume the subset family $\mathcal{F}$ on the universe $X$, whose members are nonempty sets of size at most $w$, is $R$-spread with $R \ge 64\,\lceil\log_2(2w)\rceil^2/\delta_0^2$. Let $W$ be a random subset of $X$ drawn where each element is independently included with probability $\delta_0$. Then with probability at least $1/2$, $W$ contains some $S \in \mathcal{F}$.
We now show that the sunflower lemma reduces to locating a single petal (Lemma 5).
Proof. Let $\mathcal{F}$ be the given set system, with universe $X$ and all sets of size $w$; we are given that $|\mathcal{F}| > \kappa^w$, where $\kappa := (Cr\log_2 w)^2$. First, we show that it suffices to prove the sunflower lemma in the case where $\mathcal{F}$ is $\kappa$-spread. Assume that $\mathcal{F}$ is not $\kappa$-spread. For each set $T$, define
$$\mathcal{F}_T := \{S \in \mathcal{F} : T \subseteq S\}, \qquad \mu(T) := \kappa^{|T|}\,|\mathcal{F}_T|,$$
and let $T$ be a set maximizing $\mu(T)$. As $\mu(T) \ge \mu(\emptyset) = |\mathcal{F}| > \kappa^w$, it must be that $|\mathcal{F}_T| > \kappa^{w - |T|}$. Also, $|T| < w$, since a set $T$ of size $w$ has $|\mathcal{F}_T| \le 1$ and hence $\mu(T) \le \kappa^w < \mu(\emptyset)$. Let $\mathcal{F}' := \{S \setminus T : S \in \mathcal{F}_T\}$, a family of $|\mathcal{F}_T|$ distinct sets of size $w - |T| \ge 1$ on the universe $X \setminus T$. Note that $T$ must be nonempty: since $\mathcal{F}$ is not $\kappa$-spread, some nonempty $S$ has $\mu(S) > \mu(\emptyset)$. Consider now the family of sets $\mathcal{F}'$. Given a sunflower in $\mathcal{F}'$, we can form a sunflower in $\mathcal{F}$ by adding $T$ to each of its sets. Also, $\mathcal{F}'$ is $\kappa$-spread on $X \setminus T$, because for any nonempty $S' \subseteq X \setminus T$, and $F'$ drawn u.a.r. from $\mathcal{F}'$,
$$\Pr[S' \subseteq F'] = \frac{|\mathcal{F}_{T \cup S'}|}{|\mathcal{F}_T|} = \frac{\mu(T \cup S')}{\mu(T)}\,\kappa^{|T| - |T \cup S'|} \le \kappa^{-|S'|}.$$
(The inequality comes from the fact that $T$ maximizes $\mu$. The fact that $|T \cup S'| = |T| + |S'|$ holds because $S'$ and $T$ are disjoint.)
Thus, it suffices to prove the sunflower lemma when $\mathcal{F}$ is $\kappa$-spread, with sets of size at most $w$. Randomly partition $X$ into subsets $X_1, \ldots, X_{2r}$, where for each element of $X$, we sample an independent uniform integer in $\{1, \ldots, 2r\}$ and use it to determine which subset contains the element. The distribution of each individual $X_j$ will be the same as if $X_j$ were sampled from the product distribution where each element is present with probability $1/(2r)$. By Lemma 5 (applied with $\delta_0 = 1/(2r)$; its hypothesis holds since $\kappa = (Cr\log_2 w)^2 \ge 256\,r^2\lceil\log_2(2w)\rceil^2$ for a suitable universal $C$, e.g. $C = 48$), each $X_j$ contains some $S \in \mathcal{F}$ with probability at least $1/2$. Since the expected number of $X_j$ containing a set in $\mathcal{F}$ is at least $r$, there must be a specific partition where at least $r$ of the subsets contain a set of $\mathcal{F}$. Let $S_1, \ldots, S_r$ be these contained sets. They are all in $\mathcal{F}$, and all disjoint, because the $X_j$ are disjoint; hence they form a sunflower with $r$ petals and empty core. __
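One step of the pass-to-a-popular-core reduction above can be written out directly; this sketch (`refine` is our own name) finds a maximizing $T$ and forms the quotient family:

```python
from itertools import combinations

def refine(family, kappa):
    """Find T maximizing kappa**|T| * #{S in family : T <= S} and return
    (T, the quotient family {S \\ T : T <= S}). An empty T means this
    test found no witness against kappa-spreadness."""
    fam = [frozenset(s) for s in family]
    cands = {frozenset()} | {frozenset(c) for a in fam
                             for k in range(1, len(a) + 1)
                             for c in combinations(sorted(a), k)}
    T = max(cands, key=lambda t: kappa ** len(t)
            * sum(1 for s in fam if t <= s))
    return T, [s - T for s in fam if T <= s]

T, quotient = refine([{1, 2}, {1, 3}, {1, 4}], kappa=2)
print(T, quotient)   # T = frozenset({1}); quotient = singletons {2},{3},{4}
```

On this toy family, the popular core $T = \{1\}$ is extracted and the quotient family of singletons is as spread as possible.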
Definition 6. Given a random variable $Y$ and a deterministic function $f$, a conditionally independent copy of the variable $Y$ subject to the condition $f$ is a random variable $Y'$ identically distributed to $Y$, where $f(Y') = f(Y)$ always holds, and the variables $Y$ and $Y'$ are conditionally independent given $f(Y)$.
Lemma 7. Refinement inequality. Let $R > 1$ and $0 < \delta \le 1/2$. Let $A$ be an $R$-spread random subset of the finite universe $X$. Let $W$ be a random subset of $X$ in which each element is included with probability $\delta$, which is independent of $A$. Let $A'$ be a conditionally independent copy of $A$ subject to $A \cup W$ (formally: apply Lemma 15 to the pair $(A, W)$ and the function $(A, W) \mapsto A \cup W$, and keep the first component of the copy). Then $A'$ is distributed identically to $A$, satisfies $A' \subseteq A \cup W$ always, and satisfies:
$$\mathbf{E}\,|A' \setminus W| \;\le\; \frac{\log_2(1/\delta) + 3}{\log_2 R}\;\mathbf{E}\,|A|.$$
We now reduce Lemma 5 to the preceding Lemma 7:
Proof. Let $m := \lceil\log_2(2w)\rceil$ and $\delta := \delta_0/m \le 1/2$, and let $W_1, \ldots, W_m$ be independent random subsets of $X$ where each element of $X$ is present independently with probability $\delta$. Then $W_1 \cup \cdots \cup W_m$ contains each element independently with probability $1 - (1-\delta)^m \le m\delta = \delta_0$, so we may couple the $W_i$ to $W$ so that $W_1 \cup \cdots \cup W_m \subseteq W$ always; it then suffices to show $\Pr[\exists S \in \mathcal{F} : S \subseteq W_1 \cup \cdots \cup W_m] \ge 1/2$.
Let $A_0$ be uniformly randomly chosen from the sets of $\mathcal{F}$, independent of the $W_i$, and construct $A_1, \ldots, A_m$ iteratively: $A_i$ is a conditionally independent copy of $A_{i-1}$ subject to the union $A_{i-1} \cup W_1 \cup \cdots \cup W_i$ (formally, apply Lemma 15 conditionally on the values of $W_1, \ldots, W_i$). Write $U_i := A_i \setminus (W_1 \cup \cdots \cup W_i)$, so that $U_0 = A_0$ and $\mathbf{E}|U_0| \le w$.
We maintain two invariants. First, conditioned on any values $(W_1, \ldots, W_i) = (\omega_1, \ldots, \omega_i)$, the set $A_i$ is distributed identically to $A_0$; in particular it is $R$-spread and always lies in $\mathcal{F}$. (Induction on $i$: given the values of $W_1, \ldots, W_i$, the set $A_i$ is drawn from the conditional distribution of $A_{i-1}$ given the union $A_{i-1} \cup \omega_1 \cup \cdots \cup \omega_i$; averaging over the value of that union recovers the conditional distribution of $A_{i-1}$, which by induction, and by the independence of $A_{i-1}$ from $W_i$ given $W_1, \ldots, W_{i-1}$, is the distribution of $A_0$.)
Second, with
$$\varepsilon := \frac{\log_2(1/\delta) + 3}{\log_2 R},$$
we have $\mathbf{E}|U_i| \le \varepsilon\,\mathbf{E}|U_{i-1}|$. To see this, condition on $(W_1, \ldots, W_{i-1}) = (\omega_1, \ldots, \omega_{i-1})$ and write $\bar\omega := \omega_1 \cup \cdots \cup \omega_{i-1}$. The conditioned random set $U_{i-1} = A_{i-1} \setminus \bar\omega$ is $R$-spread on the universe $X \setminus \bar\omega$ (by the first invariant and Remark 4), and $W_i \setminus \bar\omega$ is an independent random subset of $X \setminus \bar\omega$ of density $\delta$. Conditioning on the union $A_{i-1} \cup \bar\omega \cup W_i$ is the same as conditioning on $U_{i-1} \cup (W_i \setminus \bar\omega)$, so $A_i \setminus \bar\omega$ is a conditionally independent copy of $U_{i-1}$ subject to $U_{i-1} \cup (W_i \setminus \bar\omega)$. Lemma 7, applied on the universe $X \setminus \bar\omega$, then gives
$$\mathbf{E}\big[\,|U_i| \;\big|\; W_1, \ldots, W_{i-1}\big] = \mathbf{E}\big[\,|(A_i \setminus \bar\omega) \setminus W_i| \;\big|\; W_1, \ldots, W_{i-1}\big] \le \varepsilon\;\mathbf{E}\big[\,|U_{i-1}| \;\big|\; W_1, \ldots, W_{i-1}\big];$$
averaging over the conditioning proves the claim.
Finally, since $R \ge 64\lceil\log_2(2w)\rceil^2/\delta_0^2 = 64/\delta^2$, we have $\log_2 R \ge 2\log_2(1/\delta) + 6$, and hence $\varepsilon \le 1/2$. Combining the second invariant over all $i$ gives
$$\mathbf{E}|U_m| \le \varepsilon^m\,\mathbf{E}|U_0| \le 2^{-m}\,w \le 1/2,$$
using $2^m \ge 2w$. Since $|U_m|$ is integral, $\Pr[U_m \ne \emptyset] \le \mathbf{E}|U_m| \le 1/2$, so with probability at least $1/2$ we have $A_m \subseteq W_1 \cup \cdots \cup W_m$; and $A_m$ always lies in $\mathcal{F}$. __
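The parameter bookkeeping in this proof is easy to sanity-check numerically (the constants below are the ones chosen in this note):

```python
from math import ceil, log2

def petal_parameters(r, w):
    """m rounds of density delta = 1/(2*r*m) each; with R = 64/delta**2,
    the per-round shrink factor eps is at most 1/2."""
    m = ceil(log2(2 * w))                 # number of refinement rounds
    delta = 1 / (2 * r) / m               # density per round
    R = 64 / delta ** 2                   # spreadness hypothesis of Lemma 5
    eps = (log2(1 / delta) + 3) / log2(R)
    return m, delta, R, eps

for r, w in [(2, 2), (3, 8), (10, 1000)]:
    m, delta, R, eps = petal_parameters(r, w)
    assert eps <= 0.5 + 1e-9              # shrink factor per round
    assert eps ** m * w <= 0.5 + 1e-9     # E|U_m| <= 1/2
print("parameter check passed")
```

With $R = 64/\delta^2$ the shrink factor is exactly $1/2$, so $m = \lceil\log_2(2w)\rceil$ rounds drive the expected uncovered size below $1/2$.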
Definition 8. Given a random variable $Y$ with values in a finite set $\mathcal{Y}$, its Shannon entropy is
$$H(Y) := \sum_{y \in \mathcal{Y}} \Pr[Y = y]\,\log_2\frac{1}{\Pr[Y = y]}.$$
The conditional Shannon entropy on an event $E$ is
$$H(Y \mid E) := \sum_{y \in \mathcal{Y}} \Pr[Y = y \mid E]\,\log_2\frac{1}{\Pr[Y = y \mid E]},$$
and conditioning on a random variable $Z$ with range $\mathcal{Z}$ gives
$$H(Y \mid Z) := \sum_{z \in \mathcal{Z}} \Pr[Z = z]\,H(Y \mid Z = z).$$
Monotonicity: if $Z = g(Y)$ for a fixed function $g$, then $H(Z) \le H(Y)$. If $Z = g(Y)$, then $H(X \mid Y, Z) = H(X \mid Y)$. Conditioning never increases entropy: $H(X \mid Y, Z) \le H(X \mid Y) \le H(X)$.
Subadditivity: $H(X, Y) \le H(X) + H(Y)$, with equality if independent.
Chain rule: $H(X, Y) = H(Y) + H(X \mid Y)$. Also have: $H(X, Y \mid Z) = H(Y \mid Z) + H(X \mid Y, Z)$.
Jensen’s inequality: for convex $\varphi$, $\varphi(\mathbf{E}[X]) \le \mathbf{E}[\varphi(X)]$. For entropy, this means that if $Y$ is supported on a set $S$, then $H(Y) \le \log_2 |S|$. Similarly, if conditioned on $Z = z$ the variable $Y$ is supported on a set $S(z)$, then $H(Y \mid Z) \le \mathbf{E}\,\log_2 |S(Z)|$.
Binary entropy: $h(p) := p\log_2\frac{1}{p} + (1-p)\log_2\frac{1}{1-p}$, and a random subset $W$ of a finite set $Y$ containing each element independently with probability $\delta$ has $H(W) = |Y|\,h(\delta)$. Also, $h$ is concave, so $h$ lies below each of its tangent lines: $h(q) \le h(p) + (q - p)\log_2\frac{1-p}{p}$ for all $p, q \in (0, 1)$.
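These definitions translate directly into code; here is a quick numeric check of the chain rule on a toy joint distribution (the helpers `H` and `h` are our own):

```python
from math import log2

def H(dist):
    """Shannon entropy (bits) of {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def h(p):
    """Binary entropy."""
    return H({0: 1 - p, 1: p})

# Joint distribution of (Y, Z); the marginal of Z is uniform on {0, 1}.
joint = {("a", 0): 0.5, ("a", 1): 0.25, ("b", 1): 0.25}
H_Z = H({0: 0.5, 1: 0.5})
# H(Y | Z) = sum_z Pr[Z=z] * H(Y | Z=z)
H_Y_given_Z = 0.5 * H({"a": 1.0}) + 0.5 * H({"a": 0.5, "b": 0.5})
# Chain rule: H(Y, Z) = H(Z) + H(Y | Z)
print(H(joint), H_Z + H_Y_given_Z)   # 1.5 1.5
print(h(0.5))                        # 1.0
```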
Lemma 10. Subsets of small sets have small conditional entropy. Let $B_1, B_2$ be random finite sets. If $B_1 \subseteq B_2$ almost surely, we have $H(B_1 \mid B_2) \le \mathbf{E}|B_2|$; more generally, $H(B_1 \mid Z) \le \mathbf{E}|B_2|$ whenever $B_2$ is determined by $Z$.
Proof. This follows from Jensen’s inequality: conditioned on $B_2 = b$, the set $B_1$ is supported on the subsets of $b$, of which there are $2^{|b|}$, so $H(B_1 \mid B_2) \le \mathbf{E}\,\log_2 2^{|B_2|} = \mathbf{E}|B_2|$ (and likewise conditioned on $Z$). __
Lemma 11. Spread sets have large entropy. If $A$ is an $R$-spread random finite set, then $H(A) \ge \mathbf{E}|A|\,\log_2 R$.
Proof. Write $p_S := \Pr[A = S]$. Because $A$ is $R$-spread, for any $S$ in the support of $A$, we have
$$p_S \le \Pr[S \subseteq A] \le R^{-|S|},$$
so $\log_2(1/p_S) \ge |S|\,\log_2 R$. Taking expectations,
$$H(A) = \mathbf{E}\left[\log_2\frac{1}{p_A}\right] \ge \mathbf{E}\big[\,|A|\,\log_2 R\,\big] = \mathbf{E}|A|\,\log_2 R.$$
This completes the proof. __
Corollary 12. Let $W$ be a random subset of $Y$ with each element included independently with probability $\delta$. Let $B$ be a random subset of $Y$ for which $B \subseteq W$ almost surely. Then $H(B) \ge \mathbf{E}|B|\,\log_2(1/\delta)$.
Proof. $W$ is $(1/\delta)$-spread, because for any nonempty $S$, $\Pr[S \subseteq W] = \delta^{|S|}$. By Remark 4, $B$ is also $(1/\delta)$-spread; then apply Lemma 11. __
Lemma 13. Info properties of uniformly random sets. Let $Y$ be a finite set, and $W$ a random subset of $Y$ with each element included independently with probability $\delta \le 1/2$. Let $A$ be a random subset of $Y$, not necessarily independent of $W$. Then $H(A \cup W) \le |Y|\,h(\delta) + \mathbf{E}|A|\,\log_2(1/\delta)$.
Proof. Write $V := A \cup W$. We will show that for any element $x \in Y$, with $p_x := \Pr[x \in A]$,
$$H(\mathbf{1}[x \in V]) \;\le\; h(\delta) + p_x\,\log_2(1/\delta). \tag{1}$$
Summing this over all $x$ and using subadditivity gives:
$$H(V) \le \sum_{x \in Y} H(\mathbf{1}[x \in V]) \le |Y|\,h(\delta) + \mathbf{E}|A|\,\log_2(1/\delta).$$
To prove Eq. 1, write $q_x := \Pr[x \in V]$, so that $H(\mathbf{1}[x \in V]) = h(q_x)$. Since $W \subseteq V$, and by the union bound, $\delta \le q_x \le \delta + p_x$. Because $h$ is concave, it lies below its tangent line at $\delta$:
$$h(q_x) \le h(\delta) + (q_x - \delta)\,\log_2\frac{1-\delta}{\delta} \le h(\delta) + p_x\,\log_2(1/\delta),$$
which completes the proof, as $0 \le q_x - \delta \le p_x$ and $0 \le \log_2\frac{1-\delta}{\delta} \le \log_2(1/\delta)$ when $\delta \le 1/2$. __
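Lemma 13 can be checked by exact enumeration on a toy distribution (the particular choice of $A$ and $\delta$ below is an arbitrary test case):

```python
from itertools import chain, combinations
from math import log2

def powerset(xs):
    xs = list(xs)
    return chain.from_iterable(combinations(xs, k) for k in range(len(xs) + 1))

def h(p):
    return -p * log2(p) - (1 - p) * log2(1 - p)

Y, delta = (0, 1, 2), 0.25
A_dist = {frozenset(): 0.5, frozenset({0}): 0.5}  # A independent of W here
dist_V = {}
for a, pa in A_dist.items():
    for w in powerset(Y):
        w = frozenset(w)
        pw = delta ** len(w) * (1 - delta) ** (len(Y) - len(w))
        v = a | w
        dist_V[v] = dist_V.get(v, 0.0) + pa * pw
H_V = -sum(p * log2(p) for p in dist_V.values())
E_A = sum(p * len(a) for a, p in A_dist.items())
bound = len(Y) * h(delta) + E_A * log2(1 / delta)
print(H_V <= bound)   # True  (H_V is about 2.58, the bound about 3.43)
```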
Lemma 14. Let $Y$ be a finite set, and $W$ a random subset of $Y$ with each element included independently with probability $\delta$. Let $B$ be a random subset of $Y$, independent of $W$. Then $H(W \setminus B \mid B) = (|Y| - \mathbf{E}|B|)\,h(\delta)$.
Proof. Conditioned on $B = b$, by independence $W \setminus b$ is still a random subset of $Y \setminus b$ containing each element independently with probability $\delta$; so for any set $b$, we have $H(W \setminus B \mid B = b) = (|Y| - |b|)\,h(\delta)$, and averaging over $b$ completes the proof. __
Lemma 15. Properties of “conditionally independent copy.” Given a random variable $Y$ with finite range and a deterministic function $f$, we can construct a conditionally independent copy $Y'$ with the exact same distribution as $Y$, but linked so that $f(Y') = f(Y)$ always, and $Y'$, $Y$ are independent when conditioned on $f(Y)$. Moreover, $H(Y' \mid Y) = H(Y' \mid f(Y))$.
Proof. The joint distribution for $(Y, Y')$ has
$$\Pr[Y = y, Y' = y'] := \Pr[f(Y) = v]\cdot\Pr[Y = y \mid f(Y) = v]\cdot\Pr[Y = y' \mid f(Y) = v]\quad\text{for } v := f(y) = f(y'),$$
and $0$ when $f(y) \ne f(y')$. Summing over $y'$ (or over $y$) shows that both marginals equal the distribution of $Y$; by construction $f(Y') = f(Y)$ always, and $Y$, $Y'$ are conditionally independent given $f(Y)$.
Then, since $Y$, $Y'$ are conditionally independent given $f(Y)$, and $f(Y)$ is a function of $Y$,
$$H(Y' \mid Y) = H(Y' \mid Y, f(Y)) = H(Y' \mid f(Y)).$$
Adding $H(Y)$ to both sides and using the chain rule gives $H(Y, Y') = H(Y) + H(Y' \mid f(Y))$, since $f(Y') = f(Y)$. __
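The construction in Lemma 15 is just a reweighting of pairs; in code, with exact rational arithmetic on a toy distribution:

```python
from collections import defaultdict
from fractions import Fraction

def cond_indep_copy(dist, f):
    """Joint law of (Y, Y'): both marginals equal dist, f(Y') == f(Y)
    always, and Y, Y' are independent conditioned on f(Y)."""
    pf = defaultdict(Fraction)
    for y, p in dist.items():
        pf[f(y)] += p
    return {(y, y2): p * p2 / pf[f(y)]
            for y, p in dist.items()
            for y2, p2 in dist.items() if f(y) == f(y2)}

dist = {1: Fraction(1, 4), 2: Fraction(1, 4), 3: Fraction(1, 2)}
joint = cond_indep_copy(dist, f=lambda y: y % 2)   # condition on parity
marginal = defaultdict(Fraction)
for (y, y2), p in joint.items():
    marginal[y2] += p
print(dict(marginal) == dist)   # True: Y' has the same distribution as Y
```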
Proof. We are given a conditionally independent copy of the pair $(A, W)$ subject to the function $(A, W) \mapsto A \cup W$; writing $V := A \cup W$ and keeping the first component $A'$ of the copy, equivalently, this ensures
$$A' \subseteq V, \qquad (A', V) \text{ distributed identically to } (A, V), \qquad A' \text{ independent of } (A, W) \text{ conditioned on } V.$$
We now prove $\mathbf{E}|A' \setminus W| \le \frac{\log_2(1/\delta) + 3}{\log_2 R}\,\mathbf{E}|A|$.[1] We will estimate $H(A' \mid V, W)$ in two different ways, and prove lower and upper bounds for it; some light algebra afterwards reveals the result.
For the lower bound: by conditional independence, and since $(A', V)$ is distributed identically to $(A, V)$,
$$H(A' \mid V, W) = H(A' \mid V) = H(A \mid V) = H(A) + H(V \mid A) - H(V).$$
Now, conditioned on $A$, the union $V$ is determined by $W \setminus A$ (and conversely), so Lemma 14 gives $H(V \mid A) = H(W \setminus A \mid A) = (|X| - \mathbf{E}|A|)\,h(\delta)$; and Lemma 13 gives $H(V) \le |X|\,h(\delta) + \mathbf{E}|A|\,\log_2(1/\delta)$. Thus
$$H(A' \mid V, W) \;\ge\; H(A) - \mathbf{E}|A|\,\big(h(\delta) + \log_2(1/\delta)\big).$$
The upper bound uses a more complicated decomposition. Write $D := A' \setminus W$; since $A'$ is determined by the pair $(D, A' \cap W)$, the chain rule and monotonicity give
$$H(A' \mid V, W) = H(D \mid V, W) + H(A' \mid V, W, D) \le H(D \mid V, W) + H(A' \mid D).$$
Since $D \subseteq V \setminus W = A \setminus W$, and $V \setminus W$ is determined by $(V, W)$, Lemma 10 implies
$$H(D \mid V, W) \le \mathbf{E}|A \setminus W| = (1 - \delta)\,\mathbf{E}|A|.$$
By the chain rule (expanding $H(A', D)$ both ways), $H(A' \mid D) = H(A') - H(D) + H(D \mid A')$; by Lemma 10 again, as $D \subseteq A'$, $H(D \mid A') \le \mathbf{E}|A'| = \mathbf{E}|A|$. By Remark 4 and Lemma 11, since $D \subseteq A'$ and $A'$ is $R$-spread (being distributed identically to $A$),
$$H(D) \ge \mathbf{E}|D|\,\log_2 R.$$
Combining these upper bounds gives:
$$H(A' \mid V, W) \;\le\; H(A) + (2 - \delta)\,\mathbf{E}|A| - \mathbf{E}|A' \setminus W|\,\log_2 R.$$
Combining this with the lower bound on $H(A' \mid V, W)$ gives, after canceling the identical $H(A)$ terms and using $h(\delta) \le 1$ and $2 - \delta \le 2$:
$$\mathbf{E}|A' \setminus W|\,\log_2 R \;\le\; \mathbf{E}|A|\,\big(\log_2(1/\delta) + 3\big),$$
hence $\mathbf{E}|A' \setminus W| \le \frac{\log_2(1/\delta) + 3}{\log_2 R}\,\mathbf{E}|A|$. __
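The refinement inequality can be verified by exact enumeration on a toy family (the family of singletons below is an arbitrary $R$-spread test case, with $R = 6$ and $\delta = 1/2$):

```python
from collections import defaultdict
from fractions import Fraction
from itertools import chain, combinations
from math import log2

X = range(6)
family = [frozenset({x}) for x in X]     # uniform singleton: 6-spread
delta = Fraction(1, 2)

def subsets(xs):
    xs = list(xs)
    return chain.from_iterable(combinations(xs, k) for k in range(len(xs) + 1))

p_pair = defaultdict(Fraction)           # joint law of (A, V), V = A | W
p_V = defaultdict(Fraction)
for a in family:
    for w in subsets(X):
        w = frozenset(w)
        p = (Fraction(1, len(family)) * delta ** len(w)
             * (1 - delta) ** (len(X) - len(w)))
        p_pair[(a, a | w)] += p
        p_V[a | w] += p

# A' is drawn from the conditional law of A given V, independently of (A, W).
expected = Fraction(0)
for a in family:
    for w in subsets(X):
        w = frozenset(w)
        p = (Fraction(1, len(family)) * delta ** len(w)
             * (1 - delta) ** (len(X) - len(w)))
        v = a | w
        for a2 in family:
            if a2 <= v:
                expected += p * (p_pair[(a2, v)] / p_V[v]) * len(a2 - w)

print(expected)                          # 21/128, about 0.16
bound = (log2(2) + 3) / log2(6)          # Lemma 7 bound; E|A| = 1
print(float(expected) <= bound)          # True
```

Here the bound is far from tight (it only becomes meaningful for large $R$), but the exact value $\mathbf{E}|A' \setminus W| = 21/128$ shows the refinement step shrinking the uncovered part.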
Remark 16. A remark of [BCW21] implies that the idea of an $R$-spread construction, combined with a random selection of a subset, cannot be substantially improved. They show the example of a set family on a universe partitioned into $w$ disjoint blocks $X_1, \ldots, X_w$, where each block has size $R$. The set family contains all sets $S$ of size $w$ which satisfy $|S \cap X_i| = 1$ for all $i$. This is $R$-spread, but the probability that a random set $W$ of density $\delta$ contains any set of the family is exactly the probability that $W$ meets every block,
$$\big(1 - (1 - \delta)^R\big)^w,$$
which is bounded away from zero only if $(1 - \delta)^R = O(1/w)$, i.e., if $\delta = \Omega(\log(w)/R)$. One would thus need $R = \Omega(r \log w)$ for the multiple disjoint random sampling trick (with per-block density $\delta = \Theta(1/r)$) to work.
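Numerically, the threshold in this example behaves as described (the values below are arbitrary test parameters):

```python
from math import log

def cover_prob(R, w, delta):
    """Probability that a density-delta random set contains some member of
    the [BCW21] family: it must hit each of the w disjoint R-blocks."""
    return (1 - (1 - delta) ** R) ** w

R, w = 1000, 100
print(cover_prob(R, w, log(2 * w) / R))   # about 0.61: density log(2w)/R works
print(cover_prob(R, w, 1 / R))            # astronomically small: 1/R fails
```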
Ryan Alweiss, Shachar Lovett, Kewen Wu, and Jiapeng Zhang. Improved bounds for the sunflower lemma. In Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing, STOC 2020, pages 624–630, Chicago, IL, USA. Association for Computing Machinery, 2020. isbn: 9781450369794. doi: 10.1145/3357713.3384234. url: https://doi.org/10.1145/3357713.3384234.
Tolson Bell, Suchakree Chueluecha, and Lutz Warnke. Note on sunflowers. Discrete Mathematics, 344(7):112367, 2021. issn: 0012-365X. doi: https://doi.org/10.1016/j.disc.2021.112367. url: https://www.sciencedirect.com/science/article/pii/S0012365X21000807.
P. Erdös and R. Rado. Intersection theorems for systems of sets. Journal of the London Mathematical Society, s1-35(1):85–90, 1960. doi: https://doi.org/10.1112/jlms/s1-35.1.85. eprint: https://londmathsoc.onlinelibrary.wiley.com/doi/pdf/10.1112/jlms/s1-35.1.85. url: https://londmathsoc.onlinelibrary.wiley.com/doi/abs/10.1112/jlms/s1-35.1.85.
Lunjia Hu. Entropy Estimation via Two Chains: Streamlining the Proof of the Sunflower Lemma. 2021. url: https://web.archive.org/web/20220502190641/https://theorydish.blog/2021/05/19/entropy-estimation-via-two-chains-streamlining-the-proof-of-the-sunflower-lemma/. Accessed: 2022-05-02.
Anup Rao. Coding for sunflowers, 2020. arXiv: 1909.04774 [math.CO].
Terence Tao. The sunflower lemma via Shannon entropy. 2020. url: https://web.archive.org/web/20220502185424/https://terrytao.wordpress.com/2020/07/20/the-sunflower-lemma-via-shannon-entropy/. Accessed: 2022-05-02.
[1] In fact, not only do we have $\mathbf{E}|A' \setminus W| \le \frac{\log_2(1/\delta) + 3}{\log_2 R}\,\mathbf{E}|A|$; with a more careful decomposition (as in [Rao20; Tao20]) it is possible to prove a bound of the form $\frac{O(1)}{\log_2(R\delta)}\,\mathbf{E}|A|$, which is what yields the sharper $(Cr\log w)^w$ bound of [ALW+20; BCW21; Rao20].