Did people forget that the CPU is actually a mobile processor?
Tony is correct: both the PS4 and Xbox One use an 8-core AMD processor based on the low-power (and therefore low-heat) Jaguar architecture. Jaguar processors are designed for the new ultra-slim laptops and slim electronics (tablets).
Quoting the Wikipedia overview of the Jaguar architecture (Kabini and Temash), from the "Jaguar (microarchitecture)" article:
In January 2013 the Jaguar-based Kabini and Temash APUs were unveiled as the successors of the Bobcat-based Ontario, Zacate and Hondo APUs. The Kabini APU is aimed at the low-power, subnotebook, netbook, ultra-thin and small form factor markets, while the Temash APU is aimed at the tablet, ultra-low power and small form factor markets. The 2 to 4 Jaguar cores of the Kabini and Temash APUs feature numerous architectural improvements in power requirements and performance, such as support for newer x86 instructions, higher IPC, a CC6 power-state mode and clock gating.

Kabini and Temash are AMD's first, and also the first-ever, quad-core x86-based SoCs. The integrated Fusion Controller Hubs (FCH) for Kabini and Temash are codenamed "Yangtze" and "Salton" respectively. The Yangtze FCH supports two USB 3.0 ports, two SATA 6 Gbit/s ports, and the xHCI 1.0 and SD/SDIO 3.0 protocols for SD-card support. Both chips feature DirectX 11.1-compliant GCN-based graphics as well as numerous heterogeneous system architecture (HSA) improvements. They were fabricated by TSMC on a 28 nm process in an FT2 BGA package and released on May 23, 2013.
So they shouldn't run very hot. They are also 8-core rather than the quad-core configuration the architecture was built around, which in theory means they can handle roughly twice as much parallel work as the next-gen slim laptops those chips normally ship in.
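To put that "twice as much" in rough numbers, here is a minimal back-of-the-envelope sketch (Python). Both clock speeds are assumptions for illustration, and it ignores IPC, memory bandwidth and imperfect multi-core scaling, so it only shows the idealised core-count ratio:

```python
# Idealised "total compute" comparison: cores x clock, assuming perfect
# scaling across cores. Clock speeds below are assumed, not official specs.
def aggregate_core_ghz(cores, clock_ghz):
    return cores * clock_ghz

laptop = aggregate_core_ghz(cores=4, clock_ghz=1.5)   # hypothetical 4-core Kabini laptop part
console = aggregate_core_ghz(cores=8, clock_ghz=1.6)  # hypothetical 8-core console configuration

print(f"4-core laptop : {laptop:.1f} core-GHz")
print(f"8-core console: {console:.1f} core-GHz")
print(f"Ratio: {console / laptop:.2f}x")
```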
The 2.75 GHz clock speed on the devkit processors is possibly for debugging, as mentioned earlier, but I don't think so; devkits are typically computers with specs similar to the console that are used to test games on.
When a devkit runs Windows, event loggers (for code, bugs, etc.) and emulators, that extra software eats into CPU time, so Sony must have measured how much those tools use and compensated with a higher clock speed on the CPU.
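As a rough illustration of that compensation idea, here is a minimal sketch (Python); the 1.6 GHz retail clock and the overhead fractions are assumptions for the example, not figures from this thread:

```python
# Rough sketch of the compensation idea: if OS, loggers and emulation
# layers eat a fraction of the devkit's CPU time, the devkit clock must
# be raised so games still see retail-equivalent CPU performance.
# The 1.6 GHz retail clock and the overhead fractions are assumptions.
def devkit_clock_needed(retail_clock_ghz, overhead_fraction):
    # Only (1 - overhead) of devkit CPU time is left for the game itself.
    return retail_clock_ghz / (1.0 - overhead_fraction)

retail_clock = 1.6  # assumed retail CPU clock in GHz
for overhead in (0.2, 0.3, 0.4):
    needed = devkit_clock_needed(retail_clock, overhead)
    print(f"{overhead:.0%} tooling overhead -> ~{needed:.2f} GHz devkit clock")
```

Under those assumed numbers, around 40% tooling overhead would already push the required devkit clock close to 2.75 GHz.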
Or it could just be that the CPU isn't a bottleneck for the PS4, so they pushed it to 2.75 GHz simply so it wouldn't cause any issues and to keep it in step with the GPU specs?
I'm more interested in what GPU was used in the devkits; does anyone have that info?