Orbis (PS4) devkit docs leak; 8GB RAM, 2.2GB VRAM, new controller


"British site CVG speculated last week that, because they'd heard the PS4's controller was "trying to emulate the same user interface philosophies as the PS Vita", that meant it would feature a touch screen. Instead, the Orbis' controller features a capacitive touch pad, like you find on the back of a Vita (presumably it's also on the back of the PS4's controller), that can recognise two-point multi-touch. The entire pad can also be "clicked" for an additional input button."

The PS4's controller will again be capable of motion-sensing, like its PS3 predecessors, only now with improved technology like tilt correction. It will also feature vibration, which Sony has thankfully learned is a next-gen feature you need to launch with. It'll also have an RGB LED light in it. While there have been reports of the PS4 controller featuring "biometric" technology, there was no mention of it in the information we were provided.

There's one other addition to the PS4's pad you won't find on a DualShock 3: a "Share" button. We're not exactly sure what it does. The most likely use would be to allow users to share some aspect of their gaming experience to Twitter or Facebook. Maybe a screenshot? We have no idea. But that Share button might have something to do with...

We'll begin with the specs. And before we go any further, know that these are current specs for a PS4 development kit, not the final retail console itself. So while the general gist of the things you see here may be similar to what makes it into the actual commercial hardware, there's every chance some (if not all) of it changes, if only slightly.

That being the case, here's what we know is inside PS4 development kits (model # DVKT-KS000K) as of January 2013. As you'll see, some things have changed since earlier kits became available in March 2012.

System Memory: 8GB

Video Memory: 2.2 GB

CPU: 4x Dual-Core AMD64 "Bulldozer" (so, 8x cores)

GPU: AMD R10xx

Ports: 4x USB 3.0, 2x Ethernet

Drive: Blu-Ray

HDD: 160GB

Audio Output: HDMI & Optical, 2.0, 5.1 & 7.1 channels

A lot more info here (including new user accounts): http://kotaku.com/59...ve-specs-so-far

edit: VGleaks, who put out the next Xbox specs, had this to say...

VG Leaks teasing us

http://www.vgleaks.com/what-do-you-e...e-poll-inside/

Comments section:

Anonymous 23 January, 2013 at 22:48 Reply

What are your thoughts on the Kotaku "leak"/clickbait?

And the answer:

Mr.X 24 January, 2013 at 0:23 Reply

Hi!

I think it's better to wait until tomorrow to answer your question. Please, we will publish a new Orbis article soon, a different one…

Thanks for your enquiry.

I read this yesterday and thought the actual documents had leaked onto the net, rather than just this website saying they have them. I'd like to read the actual documents.

I'm not so sure how the rear touch screen would work - but then again I'm still thinking of the DualShock controller style. Perhaps it'll simply be shaped more like the Vita. Still, the back touch is difficult to use on the Vita, so...

Must be 2013, with a proper unveiling of each console in the months ahead :)

I'll grant the controller certainly seems interesting on the PS4.

Dev kits have 2x the RAM of the actual console, right?

I wouldn't say all dev kits have twice the RAM, but as a rule of thumb, yeah.

I'm not so sure how the rear touch screen would work - but then again I'm still thinking of the DualShock controller style. Perhaps it'll simply be shaped more like the Vita. Still, the back touch is difficult to use on the Vita, so...

They said they're redesigning the PS4 pad, so I doubt it will be the same as the DualShock 3.

Don't they mean dual 4-core CPUs, not 4 dual-core CPUs...?

Knowing Sony, they might be going for the more complex four-socket option.

Does anyone know if the "Bulldozer" is a laptop/mobile CPU, or is it similar to a desktop CPU (or something else)? Although the Eurogamer leak says "Jaguar". Keep in mind this is apparently for devkits.

Hopefully VGleaks keeps to their word and releases their leak today.

Does anyone know if the "Bulldozer" is a laptop/mobile CPU, or is it similar to a desktop CPU (or something else)?

It's a desktop-class CPU, not mobile.

Does anyone know if the "Bulldozer" is a laptop/mobile CPU, or is it similar to a desktop CPU (or something else)? Although the Eurogamer leak says "Jaguar". Keep in mind this is apparently for devkits.

Hopefully VGleaks keeps to their word and releases their leak today.

Jaguar is the currently rumoured CPU for both consoles: a 2nd-generation Bobcat processor aimed at low-power mobile/tablet devices.

Unless the guy is going to fake a 90-page PDF file that goes into detail about the devkit, then yes, it is actually true. The specs aren't much different from the rumoured final spec for the console itself.

If it's anything like the PS3 dev kit, it just looks like a VCR. :)

Man, I wish consumer consoles would look like that, and then they would all stack nicely with proper-sized AV components.

The devkit Kotaku posted is apparently an older one:

Currently, there are 3 types of devkits:

1) R10 boards with special BIOS, running in generic PCs

2) "Initial 1" - early devkit

model number: DVKT-KS000K

SCE-provided PC equipped with R10XX board

Runs Orbis OS

Available July 2012

3) SoC Based Devkit: early version of the ORBIS hardware

Available January 2013

R10 board (with special BIOS) assembled in a generic PC

Requires Windows 7 64 bit edition

Recommended:

Sandy Bridge (Intel) or Bulldozer (AMD)

Minimum 8 GB RAM (system memory)

650 Watt PSU

VS2010 SP1

DWM (Desktop Window Manager) must be turned off

Application will use Windows services for everything except GPU interface

SCE will provide "Gnm", a custom GPU interface

Do you remember the first Durango pictures? This is a very early devkit based on Windows.

DVKT-KS000K ("Initial 1")

Runs Orbis OS

CPU: Bulldozer 8-core, 1.6 GHz

Graphics Card: R10 with special BIOS

RAM: 8 GB (system memory)

BD Drive

HDD: 2.5" 160 GB

Network Controller

Custom South Bridge allows access to controller prototypes

[image: DVKT devkit photo]

SoC Based Devkit

Available January 2013

CPU: 8-core Jaguar

GPU: Liverpool GPU

RAM: unified 8 GB for devkit (4 GB for the retail console)

Subsystem: HDD, Network Controller, BD Drive, Bluetooth Controller, WLAN and HDMI (up to 1920×1080 @ 3D)

Analog Outputs: Audio, Composite Video

Connection to Host: USB 3.0 (targeting over 200 MB/s)

ORBIS Dualshock

Dual Camera

The last devkit is the closest one to the retail console. Expect a machine with these specs, or similar to them. Obviously, Sony could introduce changes to these features, but don't expect deep modifications.

Source: http://www.vgleaks.c...s-roadmaptypes/

The last one is the one we're interested in. The Orbis DualShock is mentioned as well.

Interesting article from Timothy Lottes (creator of FXAA):

I'm working on the assumption that the Eurogamer article is mostly correct, with the exception of maybe exact clocks, amount of memory, and number of enabled cores (all of which could easily change to adapt to yields).

While the last console generation is around 16x behind in performance from the current high-end single chip GPUs, this was a result of much easier process scaling and this was before reaching the power wall. Things might be much different this round, a fast console might be able to keep up much longer as scaling slows down. If Sony decided to bump up the PS4 GPU, that was a great move, and will help the platform live for a long time. If PS4 is around 2 Tflop/s, this is roughly half what a single GPU high-end PC has right now, which is probably a lot better than what most PC users have. If desktop goes to 4K displays this requires 4x the perf over 1080p, so if console maintains a 1080p target, perf/pixel might still remain good for consoles even as PC continues to scale.
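(An aside, not part of Lottes's post: the "4x the perf over 1080p" figure is plain pixel-count arithmetic. A trivial check, assuming the common 3840×2160 definition of "4K":)

```cpp
// Pixel-count ratio behind the "4K needs ~4x the perf of 1080p" estimate.
#include <cstdio>

int main() {
    const long pixels_1080p = 1920L * 1080;  // 2,073,600 pixels
    const long pixels_4k    = 3840L * 2160;  // 8,294,400 pixels (UHD "4K" assumed)
    std::printf("4K / 1080p pixel ratio: %.1fx\n",
                static_cast<double>(pixels_4k) / pixels_1080p);  // prints 4.0x
    return 0;
}
```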

The real reason to get excited about a PS4 is what Sony as a company does with the OS and system libraries as a platform, and what this enables 1st party studios to do, when they make PS4-only games. If PS4 has a real-time OS, with a libGCM style low level access to the GPU, then the PS4 1st party games will be years ahead of the PC simply because it opens up what is possible on the GPU. Note this won't happen right away on launch, but once developers tool up for the platform, this will be the case. As a PC guy who knows hardware to the metal, I spend most of my days in frustration knowing damn well what I could do with the hardware, but what I cannot do because Microsoft and IHVs won't provide low-level GPU access in PC APIs. One simple example: drawcalls on PC have easily 10x to 100x the overhead of a console with a libGCM style API.

Assuming a 7970M in the PS4, AMD has already released the hardware ISA docs to the public, so it is relatively easy to know what developers might have access to do on a PS4. Let's start with the basics known from PC. AMD's existing profiling tools support true async timer queries (where the timer results are written to a buffer on the GPU, then async read on the CPU). This enables the consistent profiling game developers require when optimizing code. AMD also provides tools for developers to view the output GPU assembly for compiled shaders, another must for console development. Now let's dive into what isn't provided on PC but what can be found in AMD's GCN ISA docs:
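(An aside, not part of Lottes's post: for readers who haven't used them, here is a minimal sketch of what such async timer queries look like through the public OpenGL path on PC. It assumes a GL 3.3+ context and a loader like GLEW; the variable names and structure are illustrative.)

```cpp
// Hedged sketch: non-blocking GPU timing via OpenGL timestamp queries, the
// closest public analogue to the async timer queries described above.
// Assumes a GL 3.3+ context is current and GLEW (or similar) is initialised.
#include <GL/glew.h>
#include <cstdio>

GLuint beginQ = 0, endQ = 0;

void init_queries() {
    glGenQueries(1, &beginQ);
    glGenQueries(1, &endQ);
}

void timed_gpu_section() {
    glQueryCounter(beginQ, GL_TIMESTAMP);  // GPU records a timestamp when it gets here
    // ... issue the draw calls / dispatches being profiled ...
    glQueryCounter(endQ, GL_TIMESTAMP);
}

void try_read_timing() {
    GLint available = 0;
    glGetQueryObjectiv(endQ, GL_QUERY_RESULT_AVAILABLE, &available);
    if (available) {  // poll instead of stalling the pipeline
        GLuint64 t0 = 0, t1 = 0;
        glGetQueryObjectui64v(beginQ, GL_QUERY_RESULT, &t0);
        glGetQueryObjectui64v(endQ, GL_QUERY_RESULT, &t1);
        std::printf("GPU section took %.3f ms\n", (t1 - t0) / 1.0e6);
    }
}
```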

Dual Asynchronous Compute Engines (ACE) :: Specifically "parallel operation with graphics and fast switching between task submissions" and "support of OCL 1.2 device partitioning". Sounds like at a minimum a developer can statically partition the device such that graphics and compute can run in parallel. For a PC, static partition would be horrible because of the different GPU configurations to support, but for a dedicated console, this is all you need. This opens up a much easier way to hide small compute jobs in a sea of GPU filling graphics work like post processing or shading. The way I do this on PC now is to abuse vertex shaders for full screen passes (the first triangle is full screen, and the rest are degenerates, use an uber-shader for the vertex shading looking at gl_VertexID and branching into "compute" work, being careful to space out the jobs by the SIMD width to avoid stalling the first triangle, or loading up one SIMD unit on the machine, ... like I said, complicated). In any case, this Dual ACE system likely makes it practical to port over a large amount of the Killzone SPU jobs to the GPU even if they don't completely fill the GPU (which would be a problem without complex uber-kernels on something like CUDA on the PC).
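(An aside, not part of Lottes's post: a hedged sketch of the "abuse the vertex shader" trick he describes, written as a GLSL vertex shader embedded in a C++ string. The layout and names are illustrative, not taken from any shipping engine.)

```cpp
// Hedged sketch of the full-screen-triangle-plus-degenerates trick: vertices 0..2
// emit one oversized triangle covering the screen, every later vertex collapses to
// an off-screen point (so it rasterizes nothing) and can hide "compute-like" work
// keyed off gl_VertexID instead.
static const char* kUberVertexShader = R"GLSL(
#version 330 core
out vec2 uv;
void main() {
    if (gl_VertexID < 3) {
        // One oversized triangle that covers the whole screen.
        vec2 p = vec2((gl_VertexID << 1) & 2, gl_VertexID & 2);
        uv = p;
        gl_Position = vec4(p * 2.0 - 1.0, 0.0, 1.0);
    } else {
        // Degenerate: all extra vertices land on one off-screen point, so the
        // zero-area triangles rasterize nothing, but the lanes still execute and
        // side-effect work (e.g. image/buffer stores, GL 4.2+) could be done here,
        // spaced out by SIMD width as the post describes.
        uv = vec2(0.0);
        gl_Position = vec4(-2.0, -2.0, 0.0, 1.0);
        // ... "compute" work indexed by (gl_VertexID - 3) would go here ...
    }
}
)GLSL";
```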

Dual High Performance DMA Engines :: Developers would get access to do async CPU->GPU or GPU->CPU memory transfers without stalling the graphics pipeline, and specifically the ability to control semaphores in the push buffer(s) to ensure no stalls and low-latency scheduling. This is something the PC APIs get horribly wrong, as all memory copies are implicit without really giving control to the developer. This translates to much better resource streaming on a console.
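(An aside, not part of Lottes's post: the closest thing to developer-controlled async transfers in today's PC APIs is a fence-guarded copy. A rough OpenGL sketch under that assumption, with a GL 3.2+ context and buffer/framebuffer setup omitted; the names are illustrative.)

```cpp
// Hedged sketch: asynchronous GPU->CPU readback through a pixel-pack buffer,
// fenced so the CPU polls for completion instead of stalling the pipeline.
#include <GL/glew.h>

GLuint pbo = 0;     // pixel-pack buffer, pre-allocated to w * h * 4 bytes
GLsync fence = 0;   // signalled once the GPU has finished the copy

void kick_async_readback(int w, int h) {
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
    // With a PIXEL_PACK buffer bound, glReadPixels writes into the buffer
    // (offset 0) instead of client memory, so the call returns immediately.
    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

bool try_consume_readback(int w, int h) {
    if (!fence)
        return false;
    GLenum status = glClientWaitSync(fence, 0, 0);  // timeout of 0 = just poll
    if (status != GL_ALREADY_SIGNALED && status != GL_CONDITION_SATISFIED)
        return false;                               // copy not finished; retry later
    glDeleteSync(fence);
    fence = 0;
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
    void* pixels = glMapBufferRange(GL_PIXEL_PACK_BUFFER, 0, w * h * 4, GL_MAP_READ_BIT);
    // ... use `pixels` on the CPU ...
    glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
    return true;
}
```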

Support for up to 6 Audio Streams :: HDMI supports audio, so the GPU actually outputs audio, but no PC driver gives you access. The GPU shader is in fact the ideal tool for audio processing, but on the PC you need to deal with the GPU->CPU latency wall (which can be worked around with pinned memory), but to add insult to injury the PC driver simply just copies that data back to the GPU for output adding more latency. In theory on something like a PS4 one could just mix audio on the GPU directly into the buffer being sent out on HDMI.

Global Data Store :: AMD has no way of exposing this in DX, and in OpenGL they only expose this in the ultra-limited form of counters which can only increment or decrement by one. The chip has 64KB of this memory, effectively with the same access as shared memory (atomics and everything) and lower latency than global atomics. This GDS unit can be used for all sorts of things, like workgroup to workgroup communication, global locks, or like doing an append or consume to an array of arrays where each thread can choose a different array, etc. To the metal access to GDS removes the overhead associated with managing huge data sets on the GPU. It is much easier to build GPU based hierarchical occlusion culling and scene management with access to these kind of low level features.
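(An aside, not part of Lottes's post: the "ultra-limited form of counters" he refers to is OpenGL's atomic counter feature (GL 4.2+). A minimal illustrative shader, stored as a C++ string; the binding point and names are assumptions for the example.)

```cpp
// Hedged sketch of what OpenGL does expose: an atomic counter that can only be
// bumped by one per call, versus the general 64 KB GDS with full atomics that the
// post says low-level console access would open up.
static const char* kCounterFragmentShader = R"GLSL(
#version 420 core
layout(binding = 0, offset = 0) uniform atomic_uint visible_fragments;
out vec4 color;
void main() {
    atomicCounterIncrement(visible_fragments);  // increment-by-one is all you get
    color = vec4(1.0);
}
)GLSL";
```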

Re-used GPU State :: On a console with low level hardware access (like the PS3) one can pre-build and re-use command buffer chunks. On a modern GPU, one could even write or modify pre-built command buffer chunks from a shader. This removes the cost associated with drawing, pushing up the number of unique objects which can be drawn with different materials.

FP_DENORM Control Bit :: On the console one can turn off both DX's and GL's forced flush-to-denorm mode for 32-bit floating point in graphics. This enables easier ways to optimize shaders because integer limited shaders can use floating point pipes using denormals.

128-bit to 256-bit Resource Descriptors :: With GCN all that is needed to define a buffer's GPU state is to set 4 scalar registers to a resource descriptor, similar with texture (up to 8 scalar registers, plus another 4 for sampler). The scalar ALU on GCN supports block fetch of up to 16 scalars with a single instruction from either memory or from a buffer. It looks to be trivially easy on GCN to do bind-less buffers or textures for shader load/stores. Note this scalar unit has its own data cache also. Changing textures or surfaces from inside the pixel shader looks to be easily possible. Note shaders still index resources using an instruction immediate, but the descriptor referenced by this immediate can be changed. This could help remove the traditional draw call based material limit.

S_SLEEP, S_SETPRIO, and GDS :: These provide all the tools necessary to do lock and lock-free retry loops on the GPU efficiently. DX11 specifically does not allow locks due to fear that some developer might TDR the system. With low level access, the S_SLEEP enables putting a wavefront to sleep without busy-spinning on the ALUs, and the S_SETPRIO enables reducing priority when checking for unlock between S_SLEEPs.

S_SENDMSG :: This enables a shader to force a CPU interrupt. In theory this can be used to signal to a real-time OS completion of some GPU operation to start up some CPU based tasks without needing the CPU to poll for completion. The other option would be maybe an interrupt signaled from a push buffer, but this wouldn't be able to signal from some intermediate point during a shader's execution. This on PS4 might enable tighter GPU and CPU task dependencies in a frame (or maybe even in a shader), compared to the latency wall which exists on a non-real-time OS like Windows, which usually forces CPU and GPU task dependencies to be a few frames apart.

Full Cache Flush Control :: DX has only implicit driver controlled cache flushes, it needs to be conservative, track all dependencies (high overhead), then assume conflict and always flush caches. On a console, the developer can easily skip cache flushes when they are not needed, leading to more parallel jobs and higher performance (overlap execution of things which on DX would be separated by a wait for machine to go idle).

GPU Assembly :: Maybe? I don't know if GCN has some hidden very complex rules for code generation and compiler scheduling. The ISA docs seem trivial to manage (manual insertion of barriers for texture fetch, etc). If Sony opens up GPU assembly, unlike the PS3, developers might easily crank out 30% extra from hand tuning shaders. The alternative is iterating on Cg, which is possible with real-time profiling tools. My experience on PC is micro-optimization of shaders yields some massive wins. For those like myself who love assembly of any arch, a fixed hardware spec is a dream.

...

I could continue here, but I won't; by now you get the picture: launch titles will likely be DX11 ports, so perhaps not much better than what could be done on PC. However, if Sony provides the real-time OS with libGCM v2 for GCN, then one or two years out, 1st party devs and Sony's internal teams like the ICE team will have had long enough to build up tech to really leverage the platform.

I'm excited for what this platform will provide for PS4-only 1st party titles and developers who still have the balls to do a non-portable game this next round.

Source: http://timothylottes.blogspot.fr/2013/01/orbis-and-durango.html

I hope they bump it up to 6 GB for the retail console.

Interesting article indeed, Audioboxer. Going by the PS3, it's logical for the PS4 to use libGCM. I wonder if it'll also use PSGL. I think the next-gen Xbox will have an early advantage because of D3D11.1. A lot of developers are used to D3D, and if history repeats itself, the good titles (with Uncharted or Killzone visuals) will come out later on the PS4.

I hope they bump it up to 6 GB for the retail console.

Why? Just to match what the recent reports say about the Xbox 3 having more RAM? There's no reason to do that if those same reports are true and the Xbox 3's GPU is as far behind in raw horsepower as they claim.

They won't. GDDR5 is very expensive. 4GB is plenty.

That's most likely true but there's always hope. The more RAM, the better.

Why? Just to match what the recent reports say about the Xbox 3 having more RAM? There's no reason to do that if those same reports are true and the Xbox 3's GPU is as far behind in raw horsepower as they claim.

Developers want more RAM, regardless of the speed. Microsoft doubled the amount of RAM in the Xbox 360 prior to its release after Epic Games showed them what Gears of War looked like (with more memory). It cost a lot of money, but they did it. Can you imagine what the Xbox 360 would be like if it had 256 MB of RAM instead of 512 MB?
