The discovery of new inorganic materials in unexplored chemical spaces requires total-energy calculations that are both fast and sufficiently accurate. Machine learning models that provide this capability for both ground-state (GS) and higher-energy structures would be instrumental in accelerated screening. Here, we demonstrate the importance of a training dataset balanced between GS and higher-energy structures for accurately predicting total energies with a generic graph neural network architecture. Using ∼16,500 density functional theory calculations from the National Renewable Energy Laboratory (NREL) Materials Database and ∼11,000 calculations for hypothetical structures as our training data, we show that the model reliably ranks the structures of a given composition in the correct order of total energy. Furthermore, we present a thorough error analysis that explains the model's failure modes, covering both prediction outliers and occasional inconsistencies in the training data. Finally, by examining the model's intermediate layers, we analyze how it represents the structures and properties it has learned.
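The ranking task described above can be sketched with a toy message-passing network: atoms become graph nodes, bonds become edges, one round of neighbor aggregation updates the node features, and a sum readout yields a scalar total energy, so that candidate structures of the same composition can be ordered by predicted energy. Everything in this sketch (feature sizes, the single aggregation round, the two example adjacency matrices) is illustrative and hypothetical, not the paper's actual architecture or data.

```python
import math
import random

random.seed(0)

def rand_matrix(rows, cols, scale=0.1):
    return [[random.gauss(0, scale) for _ in range(cols)] for _ in range(rows)]

def gnn_energy(feats, adj, w_msg, w_out):
    """One round of mean-aggregation message passing, then a sum readout
    to a scalar "total energy". A toy stand-in for a generic GNN, not the
    model used in the paper."""
    n, d = len(feats), len(feats[0])
    per_atom = []
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j]]
        # mean over neighbor features (avoid division by zero for isolated atoms)
        agg = [sum(feats[j][k] for j in nbrs) / max(len(nbrs), 1)
               for k in range(d)]
        # combine self and aggregated features through one learned linear map + tanh
        h = [math.tanh(sum((feats[i][k] + agg[k]) * w_msg[k][m]
                           for k in range(d)))
             for m in range(d)]
        # per-atom energy contribution; summing makes the prediction extensive
        per_atom.append(sum(h[m] * w_out[m] for m in range(d)))
    return sum(per_atom)

# Hypothetical toy data: two "polymorphs" share atom features (same composition)
# but differ in their bond graphs (adjacency matrices).
feats = rand_matrix(4, 8, scale=1.0)
adj_ring = [[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]
adj_full = [[0 if i == j else 1 for j in range(4)] for i in range(4)]
w_msg = rand_matrix(8, 8)
w_out = [random.gauss(0, 0.1) for _ in range(8)]

energies = {name: gnn_energy(feats, adj, w_msg, w_out)
            for name, adj in [("ring", adj_ring), ("full", adj_full)]}
ranking = sorted(energies, key=energies.get)  # lowest predicted energy first
print(ranking)
```

With random (untrained) weights the ordering is arbitrary; the point of the sketch is only the pipeline shape — structure graph in, scalar energy out, candidates sorted by prediction — which is what a balanced GS/higher-energy training set is meant to make reliable.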