Motivated by the Beck-Fiala conjecture, we study the discrepancy problem in two related models of random hypergraphs on $n$ vertices and $m$ edges. In the first model, each of the $m$ edges is constructed by placing each vertex into the edge independently with probability $d/m$, where $d$ is a parameter satisfying $d \to \infty$ and $dn/m \to \infty$.
In the second model, each vertex independently chooses a subset of $d$ edge labels from $[m]$ uniformly at random. Edge $i$ is then defined to be exactly those vertices whose $d$-subsets include label $i$.
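As an illustrative sketch (an annotation for the reader, not part of the paper; it assumes standard NumPy, and the function names are ours), both models can be sampled as $n \times m$ vertex-edge incidence matrices. Recall that the discrepancy of a hypergraph is the minimum, over colorings $x \in \{-1,+1\}^n$, of the maximum edge imbalance $\max_i |\sum_{v \in e_i} x_v|$; the last helper below evaluates this imbalance for one fixed coloring.

# Illustrative sketch (not from the paper): the two random hypergraph
# models as n x m vertex-edge incidence matrices.
import numpy as np

rng = np.random.default_rng(0)

def model_one(n, m, d):
    # Model 1: each vertex joins each edge independently with probability d/m.
    return (rng.random((n, m)) < d / m).astype(int)

def model_two(n, m, d):
    # Model 2: each vertex picks a uniform d-subset of the m edge labels;
    # edge i consists of the vertices whose subset contains label i.
    H = np.zeros((n, m), dtype=int)
    for v in range(n):
        H[v, rng.choice(m, size=d, replace=False)] = 1
    return H

def coloring_imbalance(H, x):
    # Max edge imbalance |sum_{v in edge} x_v| of a coloring x in {-1,+1}^n;
    # the discrepancy is the minimum of this quantity over all 2^n colorings.
    return int(np.max(np.abs(x @ H)))

n, m, d = 12, 24, 6
H = model_two(n, m, d)
x = rng.choice([-1, 1], size=n)
print(coloring_imbalance(H, x))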
In the sparse regime, i.e., when $m = O(n)$, we show that with high probability a random hypergraph from either model has discrepancy at least $\Omega(2^{-n/m}\sqrt{dn/m})$. In the dense regime, i.e., when $m \gg n$, we show that with high probability a random hypergraph from either model has discrepancy at least $\Omega(\sqrt{(dn/m)\log\gamma})$, where $\gamma = \min\{m/n,\, dn/m\}$.
Furthermore, we obtain nearly matching asymptotic upper bounds on the discrepancy. Specifically, we apply the partial coloring lemma of Lovett and Meka to show that, in the dense regime, with high probability the two random hypergraph models each have discrepancy $O(\sqrt{(dn/m)\log(m/n)})$.
In fact, in a significant parameter range we can tighten our analysis to obtain an upper bound that matches our lower bound up to a constant factor. This result is algorithmic, and together with the work of Bansal and Meka [On the discrepancy of random low degree set systems, in Proceedings of the 2019 Annual ACM-SIAM Symposium on Discrete Algorithms, 2019, pp. 2557--2564] it characterizes how the discrepancy of each random hypergraph model transitions from $\Theta(\sqrt{d})$ to $o(\sqrt{d})$ as $m$ increases from $m = \Theta(n)$ to $m \gg n$.
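To see the transition arithmetic concretely (a quick check that follows from the bounds stated above, not an additional result), the dense-regime bound factors as
\[
\sqrt{\frac{dn}{m}\log\frac{m}{n}} \;=\; \sqrt{d}\cdot\sqrt{\frac{n}{m}\log\frac{m}{n}},
\]
and $(n/m)\log(m/n) \to 0$ as $m/n \to \infty$, so the bound is $o(\sqrt{d})$ whenever $m \gg n$, while at $m = \Theta(n)$ the sparse-regime lower bound $\Omega(2^{-n/m}\sqrt{dn/m})$ is already $\Omega(\sqrt{d})$.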