We present the results of a numerical analysis of wave drag reduction by single-pulse energy deposition in a supersonic flow field around a sphere. The wave drag on the sphere was reduced by the interaction between the low-density core trailing the blast wave produced by the energy deposition and the bow shock developed in front of the sphere. We investigated the drag reduction mechanism in terms of the unsteady flow field induced by this interaction. The effects of the deposited energy and the deposition location on the energy reduction were examined through a parametric study. From the obtained results, we refined the parameters by utilizing the baroclinic source term, which produces vorticity in the vorticity transport equation when the gradients of density and pressure are not parallel. The baroclinic vortex, driven by a Richtmyer-Meshkov-like instability, was strong enough to contribute to the temporary formation of a low-entropy shock that lowered the wave drag on the supersonic body. We determined that the reduced energy depended linearly on the radius of the low-density core formed in the blast wave and was proportional to the square of the freestream Mach number. These dependencies could be predicted under the assumption that the energy was consumed by baroclinic vortex generation and advected downstream without thermalization in an inviscid shock layer.
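To illustrate the baroclinic mechanism invoked above, the following is a minimal sketch (not the authors' solver) of the two-dimensional baroclinic source term in the vorticity equation, \(\dot{\omega}_{\mathrm{baroclinic}} = (\nabla\rho \times \nabla p)/\rho^{2}\), which vanishes when the density and pressure gradients are parallel. The grid, field shapes, and function name are illustrative assumptions.

```python
import numpy as np

def baroclinic_source(rho, p, dx, dy):
    """2D baroclinic vorticity source: (drho/dx * dp/dy - drho/dy * dp/dx) / rho**2.

    Nonzero only where the density and pressure gradients are misaligned,
    as in the interaction of the low-density core with the bow shock.
    """
    # np.gradient returns derivatives along axis 0 (y) then axis 1 (x)
    drho_dy, drho_dx = np.gradient(rho, dy, dx)
    dp_dy, dp_dx = np.gradient(p, dy, dx)
    return (drho_dx * dp_dy - drho_dy * dp_dx) / rho**2

# Toy fields on a uniform grid (illustrative, not from the paper's flow field)
x = np.linspace(0.0, 1.0, 32)
y = np.linspace(0.0, 1.0, 32)
X, Y = np.meshgrid(x, y)
dx, dy = x[1] - x[0], y[1] - y[0]

rho = 1.0 + 0.5 * X            # density gradient along x
p_parallel = 1.0 + 0.5 * X     # pressure gradient also along x: no vorticity source
p_crossed = 1.0 + 0.5 * Y      # pressure gradient along y: vorticity is generated

src_parallel = baroclinic_source(rho, p_parallel, dx, dy)
src_crossed = baroclinic_source(rho, p_crossed, dx, dy)
```

With parallel gradients the source is identically zero, while the crossed case yields a positive source everywhere, mirroring the vorticity generation described for the blast-wave/bow-shock interaction.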