I'm using Open Hardware Monitor to graph my Radeon HD 5770's temperature, and ATI Tray Tools to show it on my tower's LCD screen (because the LCD screen control software only gets GPU info from ATT). Their readings differ slightly.
Open Hardware Monitor's temp is identical to that of the Overdrive section in Catalyst Control Center. But ATI Tray Tools' temp seems to be 4-10°C hotter.
For example, right now, OHM/CCC says the GPU core temp is 50°C, while ATT says it's 54.5°C.
However, during a DX11 game stress test, OHM says the GPU maxed out at 70°C, whereas ATT says it maxed out at 80°C.
So unfortunately, the difference isn't a constant offset: the gap is 4.5°C at idle but 10°C under load, so ATT isn't just adding a fixed amount to the OHM/CCC temperature.
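A quick sanity check on the two pairs of readings above makes this concrete (a minimal sketch in Python; the idle/load labels are just the two scenarios described in this post):

```python
# The two reported (OHM/CCC, ATT) temperature pairs, in °C.
readings = [
    (50.0, 54.5),  # idle reading
    (70.0, 80.0),  # peak during DX11 stress test
]

# If ATT simply added a fixed offset to the OHM/CCC value,
# these differences would be identical.
deltas = [att - ohm for ohm, att in readings]
print(deltas)  # [4.5, 10.0] -> not a constant offset
```

With only two data points you can't say much more than that, but the growing gap under load suggests the two tools are reading different sensors (or applying different scaling), not the same sensor with a correction applied.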
So why are they different?