In cool-core galaxy clusters with central cooling times much shorter than a Hubble time, condensation of the ambient central gas is regulated by a heating mechanism, probably an active galactic nucleus (AGN). Previous analytical work has suggested that certain radial distributions of heat input may converge to a quasi-steady global state that does not substantively change on the timescale for radiative cooling, even if heating and cooling are not locally in balance. To test this hypothesis, we simulate idealized galaxy cluster halos using the ENZO code with a spherically symmetric heat input kernel intended to emulate AGN heating. Thermal energy is distributed with radius according to a range of kernels, with total heating updated to match total cooling every 10 Myr. Some heating kernels can maintain quasi-steady global configurations, but none of the kernels we tested produces a quasi-steady state with central entropy as low as those observed in cool-core clusters. The general behavior of the simulations depends on the proportion of heating in the inner 10 kpc: low central heating leads to a central cooling catastrophe, high central heating creates a central convective zone with an inverted entropy gradient, and intermediate central heating results in a flat central entropy profile that exceeds observed values. The timescale on which our simulated halos fall into an unsteady multiphase state is proportional to the square of the cooling time of the lowest-entropy gas, so more centrally concentrated heating maintains a longer-lasting steady state.
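The heating scheme described above can be illustrated with a minimal sketch. This is not the paper's ENZO implementation; the exponential kernel shape, the scale radius, and all variable names are illustrative assumptions. Only the renormalization step, in which the kernel is rescaled so that total heating matches the total cooling luminosity at each update, follows the description in the abstract.

```python
import numpy as np

def heating_rate(r, L_cool, r_scale=10.0):
    """Distribute a total heating luminosity L_cool over radial shells r.

    The exponential kernel and r_scale (kpc) are assumed for illustration;
    the renormalization to L_cool mirrors the abstract's requirement that
    total heating match total cooling at each update (every 10 Myr).
    """
    kernel = np.exp(-r / r_scale)   # assumed radial shape of heat input
    kernel /= kernel.sum()          # normalize the kernel to unit total
    return L_cool * kernel          # total heating equals total cooling

r = np.linspace(1.0, 100.0, 200)    # radial shells in kpc
L_cool = 1.0e45                     # example cooling luminosity (erg/s)
H = heating_rate(r, L_cool)
assert np.isclose(H.sum(), L_cool)  # global balance holds by construction
```

A more centrally concentrated kernel corresponds to a smaller `r_scale`, placing a larger fraction of the heat input inside the inner 10 kpc, the quantity the abstract identifies as controlling the simulations' behavior.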