# Global Rates of Convergence in Log-Concave Density Estimation

Kim, Arlene K. H.; Samworth, Richard J. (2016)

Article

The estimation of a log-concave density on $\mathbb{R}^d$ represents a central problem in the area of nonparametric inference under shape constraints. In this paper, we study the performance of log-concave density estimators with respect to global loss functions, and adopt a minimax approach. We first show that no statistical procedure based on a sample of size $n$ can estimate a log-concave density with respect to the squared Hellinger loss function with supremum risk smaller than order $n^{-4/5}$ when $d = 1$, and order $n^{-2/(d+1)}$ when $d \geq 2$. In particular, this reveals a sense in which, when $d \geq 3$, log-concave density estimation is fundamentally more challenging than the estimation of a density with two bounded derivatives (a problem to which it has been compared). Second, we show that for $d \leq 3$, the Hellinger $\epsilon$-bracketing entropy of a class of log-concave densities with small mean and covariance matrix close to the identity grows like $\max\{\epsilon^{-d/2}, \epsilon^{-(d-1)}\}$ (up to a logarithmic factor when $d = 2$). This enables us to prove that when $d \leq 3$ the log-concave maximum likelihood estimator achieves the minimax optimal rate (up to logarithmic factors when $d = 2, 3$) with respect to squared Hellinger loss.
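The comparison with twice-differentiable densities can be made concrete by comparing rate exponents. Under squared Hellinger loss, the classical minimax rate for a density with two bounded derivatives is $n^{-4/(d+4)}$ (a standard fact assumed here, not stated in the abstract); the lower bound above for log-concave densities is $n^{-4/5}$ for $d = 1$ and $n^{-2/(d+1)}$ for $d \geq 2$. A minimal sketch checking where the log-concave exponent falls strictly below the smooth one:

```python
from fractions import Fraction

def log_concave_exponent(d):
    """Lower-bound exponent from the abstract: risk of order n^{-4/5}
    when d = 1, and n^{-2/(d+1)} when d >= 2."""
    return Fraction(4, 5) if d == 1 else Fraction(2, d + 1)

def two_derivatives_exponent(d):
    """Classical exponent n^{-4/(d+4)} for densities with two bounded
    derivatives (assumed standard rate, not from the abstract)."""
    return Fraction(4, d + 4)

for d in range(1, 6):
    lc, sm = log_concave_exponent(d), two_derivatives_exponent(d)
    # A smaller exponent means a slower rate, i.e. a harder problem.
    status = "strictly harder" if lc < sm else "same order"
    print(f"d={d}: log-concave n^-{lc}, two derivatives n^-{sm} -> {status}")
```

Running the loop shows the exponents coincide for $d = 1, 2$ and diverge from $d = 3$ onwards, matching the abstract's claim that the problems separate only when $d \geq 3$.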