We develop a discrete analogue of Hamilton-Jacobi theory in the framework of discrete Hamiltonian mechanics. The resulting discrete Hamilton-Jacobi equation is discrete only in time. We describe a discrete analogue of Jacobi's solution and also prove a discrete version of the geometric Hamilton-Jacobi theorem. Applied to discrete linear Hamiltonian systems, the theory yields the discrete Riccati equation as a special case of the discrete Hamilton-Jacobi equation. We also apply the theory to discrete optimal control problems and recover some well-known results, such as the Bellman equation (the discrete-time Hamilton-Jacobi-Bellman equation) of dynamic programming and its relation to the costate variable in the Pontryagin maximum principle. This relationship between the discrete Hamilton-Jacobi equation and the Bellman equation is exploited to derive a generalized form of the Bellman equation that has controls at internal stages.
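The connection between the Bellman equation and the discrete Riccati equation stated above can be illustrated concretely for a discrete-time linear-quadratic problem. The sketch below is not taken from the paper; the dynamics and cost matrices are illustrative choices, and the recursion shown is the standard finite-horizon LQR Riccati sweep, which is what the quadratic value-function ansatz $V_k(x) = x^\top P_k x$ produces when substituted into the Bellman equation.

```python
import numpy as np

# Hedged sketch: the discrete Riccati equation as the Bellman
# (discrete-time Hamilton-Jacobi-Bellman) equation specialized to a
# linear-quadratic problem.  All matrices are illustrative, not from
# the paper.
#
# Dynamics: x_{k+1} = A x_k + B u_k
# Cost:     sum_k (x_k' Q x_k + u_k' R u_k) + x_N' Qf x_N
#
# The quadratic ansatz V_k(x) = x' P_k x turns the Bellman equation
#   V_k(x) = min_u [ x' Q x + u' R u + V_{k+1}(A x + B u) ]
# into the backward Riccati recursion implemented below.

A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
Q = np.eye(2)
R = np.array([[0.5]])
Qf = np.eye(2)
N = 20

def riccati_backward(A, B, Q, R, Qf, N):
    """Backward sweep: value matrices P_k and gains K_k (u_k = -K_k x_k)."""
    P = [None] * (N + 1)
    K = [None] * N
    P[N] = Qf
    for k in range(N - 1, -1, -1):
        S = R + B.T @ P[k + 1] @ B
        K[k] = np.linalg.solve(S, B.T @ P[k + 1] @ A)
        P[k] = Q + A.T @ P[k + 1] @ (A - B @ K[k])
    return P, K

P, K = riccati_backward(A, B, Q, R, Qf, N)

# Consistency check encoding Bellman optimality: the predicted
# cost-to-go V_0(x_0) = x_0' P_0 x_0 must equal the cost actually
# incurred by the optimal feedback u_k = -K_k x_k.
x0 = np.array([[1.0], [-1.0]])
x = x0.copy()
cost = 0.0
for k in range(N):
    u = -K[k] @ x
    cost += float(x.T @ Q @ x + u.T @ R @ u)
    x = A @ x + B @ u
cost += float(x.T @ Qf @ x)
predicted = float(x0.T @ P[0] @ x0)
```

The check at the end is the discrete analogue of the fact that a solution of the Hamilton-Jacobi equation generates the optimal cost-to-go: the matrices $P_k$ obtained from the Riccati recursion define the value function, and simulating the closed-loop optimal policy reproduces exactly that value.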