The effect of radiation on the semiconductor-oxide interface, where it induces interface trap states, has generally been characterized only experimentally, making it difficult to quantify the impact of this radiation on device electrostatics. For an Ultra-Thin-Body (UTB) MOS device, the 1-D band structure along the direction of confinement, solved self-consistently with the 1-D Poisson equation while varying the band edge energy (∆Eedge) at the Si−SiO2 interface, enables quantification of the effect of interface trap states on channel electrostatics. In this work, we present an approach to correlate the radiation dose with the band edge energy (∆Eedge), thus enabling the band structure-based approach to be used to quantify the effect of these radiation-induced traps on the device electrostatics. We show a methodology that correlates the interface charge induced by ∆Eedge with the charge yield at the Si−SiO2 interface for different incident particles. After identifying appropriate values of ∆Eedge for different particles and doses, the radiation-induced degradation of the channel electrostatics can be accurately simulated with the atomistic band structure-based methodology. We also show an approach to extend this methodology to lower device temperatures, thus effectively quantifying the effect of radiation dose on UTB device electrostatics over a wide range of device temperatures.
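To make the self-consistent confinement calculation concrete, the following is a minimal sketch of a 1-D Schrödinger–Poisson iteration across an ultra-thin Si body with hard-wall confinement. All numerical values (body thickness, effective mass, sheet density, subband occupation model) are illustrative assumptions, not parameters from this work, and the single-band effective-mass Hamiltonian stands in for the full atomistic band structure used here.

```python
import numpy as np

# Physical constants and assumed material parameters (illustrative only).
Q = 1.602e-19                  # elementary charge (C)
HBAR = 1.055e-34               # reduced Planck constant (J s)
M_EFF = 0.19 * 9.109e-31       # assumed Si transverse effective mass (kg)
EPS = 11.7 * 8.854e-12         # Si permittivity (F/m)
KT = 0.0259 * Q                # thermal energy at 300 K (J)

N = 101                        # grid points across the body
L = 5e-9                       # assumed body thickness (m)
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]

def solve_schrodinger(U):
    """Finite-difference effective-mass Hamiltonian, hard-wall BCs; U in J."""
    t = HBAR**2 / (2.0 * M_EFF * dx**2)
    H = np.diag(2.0 * t + U) - t * np.eye(N, k=1) - t * np.eye(N, k=-1)
    E, psi = np.linalg.eigh(H)
    psi /= np.sqrt((psi**2).sum(axis=0) * dx)   # normalize each eigenvector
    return E, psi

def electron_density(E, psi, n_sub=3, n_sheet=1e16):
    """Distribute a fixed sheet density (m^-2) over the lowest subbands
    using Boltzmann weights (a simplifying assumption)."""
    w = np.exp(-(E[:n_sub] - E[0]) / KT)
    w = n_sheet * w / w.sum()
    return (psi[:, :n_sub]**2 * w).sum(axis=1)  # volume density (m^-3)

def hartree_energy(n):
    """Solve Poisson's equation d2(phi)/dx2 = q*n/eps with phi = 0 at both
    interfaces; return the electron potential energy -q*phi (J)."""
    A = (np.eye(N - 2, k=1) - 2 * np.eye(N - 2) + np.eye(N - 2, k=-1)) / dx**2
    phi = np.zeros(N)
    phi[1:-1] = np.linalg.solve(A, Q * n[1:-1] / EPS)
    return -Q * phi

# Self-consistent loop: Schrodinger -> density -> Poisson -> mix -> repeat.
U = np.zeros(N)
for it in range(200):
    E, psi = solve_schrodinger(U)
    n = electron_density(E, psi)
    U_new = hartree_energy(n)
    if np.max(np.abs(U_new - U)) < 1e-6 * Q:    # 1 ueV tolerance
        break
    U = U + 0.5 * (U_new - U)                   # linear mixing for stability
```

In this sketch, the interface perturbation enters through the boundary conditions; shifting the band edge by ∆Eedge at the interface nodes would raise or lower the confining potential there, which is how a radiation-induced ∆Eedge would feed into the subband energies and charge distribution.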