Xbox One Silicon Talk



You could just look at the shots of the back of the XB1 and see what the Kinect v2 port looks like to compare it to the back of the 360s port.

It isn't the same; I already looked. A Bing image search will show it. (Probably Google too, if that's yer poison.)


Just watched the unboxing again. It looks to be a custom port. I also noticed that the power/data cable is much thicker on Kinect v2 than Kinect v1. I briefly owned a Kinect v1, and don't remember the cord being that thick. Even with power and data...


One thing of note: just because the diagram shows USB only means that the link protocol is USB, rather than something like PCI-X or whatever. The port itself, for all we know, could be doing a bit more and not just be your run-of-the-mill USB port. Back at the unboxing, did anyone see whether the Kinect v2 has its own power cord, or does it get fed power through the connection to the XB1?


The port is different probably because of power requirements (it is definitely a proprietary port). The Kinect for the Xbox 360 had the same problem: earlier Xbox 360 models required it to be plugged in through a separate power adapter.

While USB 3 supports up to 900mA (correct me if I'm wrong), which is quite a lot more than the USB 2 standard (500mA), it could still be the case that the USB 3 ports on the Xbox One don't provide enough for the Kinect to function, even though 900mA is quite a lot of power. The Kinect for the Xbox One has a lot of components (its own processor, among others) and even a small fan at the back to cool them, which could indicate that it requires some juice.
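To put very rough numbers on it (a quick sketch; the 5 V bus voltage and per-port current limits are the standard spec values, and the Kinect wattage is purely a guess for illustration):

# Rough USB power-budget comparison (hypothetical Kinect figure).
BUS_VOLTAGE = 5.0                     # volts, nominal USB bus voltage

usb2_watts = BUS_VOLTAGE * 0.500      # 2.5 W at the USB 2.0 limit of 500 mA
usb3_watts = BUS_VOLTAGE * 0.900      # 4.5 W at the USB 3.0 limit of 900 mA

kinect_guess_watts = 15.0             # pure guess; nobody outside MS knows the real draw

print(usb3_watts >= kinect_guess_watts)   # False -> a plain USB 3.0 port wouldn't cut it

If the sensor really does pull double-digit watts, a proprietary connector carrying a separate power feed alongside the data lines would make a lot of sense.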


This is what I was thinking. I'm sure they're feeding a second power line back through it to power the whole Kinect, so it's probably two different lines combined into one cord, not standard USB 3. If that is indeed how it works, then it makes sense for MS to have a USB 3/2-specific version for Windows later in 2014, since they'd have to go in and rework some of the connections. It's not just a matter of swapping the ports/plugs and calling it a day, I bet.

 

Besides, with all the processing going on, I'm sure the Kinect 2 for Windows will require a USB 3 connection and not USB 2.
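A rough bandwidth estimate points the same way (the stream formats below are the commonly reported Kinect v2 figures, and the throughput numbers are ballpark real-world rates, not spec maximums):

# Ballpark data rates for the commonly reported Kinect v2 streams.
color_bps = 1920 * 1080 * 2 * 8 * 30   # 1080p colour, ~2 bytes/pixel (YUY2), 30 fps
depth_bps = 512 * 424 * 2 * 8 * 30     # 512x424 depth, 16 bits/pixel, 30 fps
ir_bps    = 512 * 424 * 2 * 8 * 30     # active IR, 16 bits/pixel, 30 fps

total_mbit_per_s = (color_bps + depth_bps + ir_bps) / 1e6
print(round(total_mbit_per_s))          # roughly 1200 Mbit/s before any compression

# Real-world USB 2.0 tops out around 300 Mbit/s, while USB 3.0 can sustain
# several times the total above, so USB 3 looks like the only realistic option.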


There are ways to push more power via a USB 3.0 port than the basic spec of 900mA, I believe.

 

Several PC motherboard makers have been doing this for a few years now. I'm not sure of the method, but they are able to provide a single standard USB 3.0 port that is specially designed to handle much more power. It wouldn't be shocking if MS was employing a similar technique.


There's no special method to "push" more power to the USB port. They simply allow it to draw more power; the equipment you plug in won't draw more power than it requires anyway. So even if the USB port is capable of delivering 2.1 A, a device that only needs 0.5 A will only draw 0.5 A. Some motherboards do require special booster software to activate the extra power, though, as it's usually only used for charging.
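In other words, the port only advertises a ceiling and the device decides what it actually pulls. A trivial sketch of that idea (hypothetical numbers only):

def current_drawn(port_limit_amps, device_needs_amps):
    # A compliant device pulls what it needs, capped by what the port offers.
    return min(port_limit_amps, device_needs_amps)

print(current_drawn(2.1, 0.5))   # 0.5 -- a 0.5 A device on a 2.1 A "charging" port
print(current_drawn(0.9, 0.5))   # 0.5 -- the same device on a standard USB 3.0 port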


So they allow it to draw more power; sounds like a special method to me. You're talking more semantics than anything else. Like I said, I didn't know what the method was.

 

You're right that I worded it wrong, though. I meant that devices that need more power than the basic spec can draw that power via one of these special ports.


I mean, DirectX is objectively the best graphics API available.

Why is Microsoft allowing Sony to make use of this? Couldn't they just say "DirectX can be used to develop apps and games for Microsoft platforms only"?

That'd give them a huge advantage over Sony, as OpenGL is nowhere near as powerful as DirectX.


They're not; Sony isn't using DirectX. They're using DirectX terminology as a reference.


I had a discussion with a friend about this, and he swears up and down that Sony is able to use DirectX (I dispute it because it's proprietary Microsoft technology), but I saw this, and another thread on NeoGAF I think, which I can't find right now:

 

http://www.geek.com/games/sony-iimprove-directx-11-for-the-ps4-blu-ray-1544364/

 

How is Sony able to improve DirectX if they can't even use it? I saw something else about them being able to use DX11.2 and extend it. Again, I have no earthly idea how they're able to do that when it's proprietary to MS platforms. Any ideas on this?


No, Sony is using a custom OpenGL-based graphics API.

 

But the graphics card they use is DirectX compatible, and DirectX terminology is what's familiar today; that's why they use DirectX terminology in their marketing speak. There's not a shred of DirectX on the PS4, though.

 

And they're not saying they're improving DirectX; they're claiming they've improved on DirectX functions in the graphics card.


DirectX is a specification, just as OpenGL is. The APIs are coded against those specifications, which say what instructions they must include and what those instructions should do. The implementations often come from the GPU manufacturer, not from MS or the Khronos Group, since the API is what links the instructions directly to the hardware; that's why there is so much difference between drivers of different versions.
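Loosely, the relationship looks something like this (a conceptual sketch only; the class names are made up and this obviously has nothing to do with real driver code):

from abc import ABC, abstractmethod

class GraphicsSpec(ABC):
    # The "specification": every conforming implementation must provide this.
    @abstractmethod
    def draw_indexed(self, index_count):
        ...

class VendorADriver(GraphicsSpec):
    # One GPU maker's implementation of the spec.
    def draw_indexed(self, index_count):
        print("vendor A submits", index_count, "indices its own way")

class VendorBDriver(GraphicsSpec):
    # Another maker's implementation; same contract, different code underneath.
    def draw_indexed(self, index_count):
        print("vendor B submits", index_count, "indices differently")

for driver in (VendorADriver(), VendorBDriver()):
    driver.draw_indexed(3)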


Sony isn't using DirectX, not one bit; let me just make that clear.

 

The only thing they've said about it is that they've taken into consideration how DirectX works with the CPU/GPU (its "instructions") and implemented something equivalent in their custom APU. They HAD to do this this generation because of x86, or the PS4 would have been an absolute dog to code for and miles behind the X1. This isn't about raw performance but simply about how all modern engines work these days: they are all built and designed specifically around DirectX. That doesn't mean Sony has an implementation of DirectX; they're simply mirroring it to a degree at the low level in hardware. There is no DirectX software stack.

 

A prime example of this is the Frostbite engine, which is specifically designed and architected around DirectX. Hence why, in the current developer interviews, people are claiming the X1 looks better in games using that engine (Battlefield, NFS).

