We introduce a new method for numerically evolving the full Einstein field equations in situations where the spacetime is dominated by a known background solution. The technique leverages knowledge of the background solution to subtract its contribution to the truncation error, thereby achieving a desired level of accuracy more efficiently. We demonstrate the method by applying it to the radial infall of a solar-type star into supermassive black holes with mass ratios $\geq 10^6$. The star's self-gravity is thus consistently modeled within the context of general relativity, and its interaction with the black hole is computed at moderate computational cost, despite the more than five orders of magnitude difference in gravitational potential (defined as the ratio of mass to radius). We compute the tidal deformation of the star during infall and the resulting gravitational wave emission, finding that the latter is close to the prediction of the point-particle limit.