20 posts in this topic

Posted

I removed the link due to its content. We don't condone piracy around here. (Y)


Posted

I removed the link due to its content. We don't condone piracy around here. (Y)

Updated link with references to piracy removed...

 

http://pastebin.com/YDKnHcTY

 

Just so people can see the BS.

 

Is that ok, Nick?


Posted

Is that ok, Nick?

As far as I know Jailbreaking isn't illegal, so it's all good in my mind. I will point out that other staff members may overrule me on this though. Thanks for clearing the pastebin file up. (Y)


Posted

Hello,

This was bound to happen sooner or later on an x86 platform.

If it's a fake, it's just delaying the inevitable...


Posted

lol.. Assigned/UnAssigned code? I assume he means Signed/Unsigned... and of course you could run SIGNED code.. all code that runs is signed.


Posted

Hello,

This was bound to happen sooner or later on an x86 platform.

If it's a fake, it's just delaying the inevitable...

The CPU type isn't indicative of whether it's likely to be hacked or not.


Posted

Hello,

The CPU type isn't indicative of whether it's likely to be hacked or not.

I highly disagree.

Programmers have been writing for the x86 architecture since about the '80s. Any software-related (or processor) exploit that depends on this arch would probably be easier to find.

Granted, more often it's a software bug that is exploited, but that doesn't rule out the architecture being at fault as well.

(I might have worded my previous comment incorrectly; this "hack" isn't related to the arch, it's related to the OS, so my apologies.)


Posted

Hello,

I highly disagree.

Programmers have been writing for the x86 architecture since about the '80s. Any software-related (or processor) exploit that depends on this arch would probably be easier to find.

Granted, more often it's a software bug that is exploited, but that doesn't rule out the architecture being at fault as well.

(I might have worded my previous comment incorrectly; this "hack" isn't related to the arch, it's related to the OS, so my apologies.)

The processor has nothing to do with it, unless there is a CPU-level exploit that works for all x86 CPUs. Any exploit will be through the OS, which is CPU-independent, or it will be a direct hardware hack (intercepted lines, memory dumps, etc.).


Posted

Good (that it ended up being a hoax)... I used to be all for consoles getting hacked, but now it really hurts the scene. I hope it never gets hacked, to be honest. If you want "homebrew", buy a Raspberry Pi or something. If you want to play copies of games for free, get a PC. lol


Posted

Hello,

I highly disagree.

Programmers have been writing for the x86 architecture since about the '80s. Any software-related (or processor) exploit that depends on this arch would probably be easier to find.

Granted, more often it's a software bug that is exploited, but that doesn't rule out the architecture being at fault as well.

(I might have worded my previous comment incorrectly; this "hack" isn't related to the arch, it's related to the OS, so my apologies.)

Erm, yeah... Things like intercepting memory buses, halting the processor, changing the memory (which is most likely encrypted, with the decryption most likely done INSIDE the processor to avoid the key sitting on a separate chip that you could just read it out of)... You can do that on any CPU: PowerPC, ARM, x86, SPARC, PIC, AVR, etc.

OS exploits are, again, completely independent of the CPU type: if you write a piece of C code that has a potential buffer-overflow bug, no matter what arch you compile it for, the bug will still be present in the assembly instructions for that CPU.
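The buffer-overflow point can be sketched in a few lines of C. The function name and buffer size below are invented for illustration; an unguarded strcpy() is the classic form of the bug being described, and it compiles to every target architecture alike:

```c
#include <string.h>

/* Hypothetical request handler: the overflow lives in the C source,
 * so it survives compilation to x86, ARM, PowerPC, SPARC -- any
 * target.  The classic bug would be a bare strcpy(dst, src) with no
 * bounds check, so a src longer than dstlen smashes adjacent memory
 * on every architecture alike.  The guarded version: */
int copy_name(char *dst, size_t dstlen, const char *src) {
    if (strlen(src) >= dstlen)
        return -1;          /* reject: would overflow dst */
    strcpy(dst, src);       /* safe only because of the check above */
    return 0;
}
```

The fix, like the bug, is architecture-independent: it happens entirely at the source level.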

 

Unless you're talking about a flaw or bug in how a CPU does something, like (this is made up) if you form an address wrong it allows unsigned code execution, in which case there are going to be even FEWER problems, because x86 has been the tried-and-tested mainstream CPU for over 30 years now. If a hardware exploit were suddenly discovered in the x86 architecture, it would affect pretty much every computer system on the planet, hence the very stringent testing.


Posted

Title changed.


Posted

Hello,

The processor has nothing to do with it, unless there is a CPU-level exploit that works for all x86 CPUs. Any exploit will be through the OS, which is CPU-independent, or it will be a direct hardware hack (intercepted lines, memory dumps, etc.).

Erm, yeah... Things like intercepting memory buses, halting the processor, changing the memory (which is most likely encrypted, with the decryption most likely done INSIDE the processor to avoid the key sitting on a separate chip that you could just read it out of)... You can do that on any CPU: PowerPC, ARM, x86, SPARC, PIC, AVR, etc.

OS exploits are, again, completely independent of the CPU type: if you write a piece of C code that has a potential buffer-overflow bug, no matter what arch you compile it for, the bug will still be present in the assembly instructions for that CPU.

Unless you're talking about a flaw or bug in how a CPU does something, like (this is made up) if you form an address wrong it allows unsigned code execution, in which case there are going to be even FEWER problems, because x86 has been the tried-and-tested mainstream CPU for over 30 years now. If a hardware exploit were suddenly discovered in the x86 architecture, it would affect pretty much every computer system on the planet, hence the very stringent testing.

Both of you misinterpreted me. Never mind.


Posted

Good (that it ended up being a hoax)... I used to be all for consoles getting hacked, but now it really hurts the scene. I hope it never gets hacked, to be honest. If you want "homebrew", buy a Raspberry Pi or something. If you want to play copies of games for free, get a PC. lol

 

I imagine at some point there will be an 'emulator' similar to WINE for the PS4/XBone given that the PC and the systems all share the same ISA. It is a different ball game from prior game system architectures. Reverse engineer the loading process, implement stub OS services, implement wrappers for the OGL/DX ABI calls, and you are probably well on your way there.
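A rough sketch of what that WINE-style approach might look like: none of the names below are real PS4/Xbox One APIs (they are invented for illustration), but they show the idea of binding a game's imported call to a host-side wrapper instead of emulating any hardware:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical console graphics call, as a loader might resolve it
 * from a game binary's import table.  Name and signature are made
 * up for illustration. */
typedef int (*present_fn)(int frame_id);

/* Host-side wrapper: rather than emulating a GPU, translate the
 * call into whatever the host graphics API offers (stubbed out
 * with printf here). */
static int host_present(int frame_id) {
    printf("host swap-buffers for frame %d\n", frame_id);
    return 0;   /* assume 0 is the console ABI's success code */
}

/* Toy import resolver: a real loader would parse the executable's
 * import table and bind every symbol by name like this. */
present_fn resolve_import(const char *name) {
    if (strcmp(name, "ConsolePresentFrame") == 0)   /* invented name */
        return host_present;
    return NULL;    /* unknown import: not yet implemented */
}
```

Since the game code itself is native x86, only the library boundary needs this treatment; nothing underneath it is simulated.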


Posted

I imagine at some point there will be an 'emulator' similar to WINE for the PS4/XBone given that the PC and the systems all share the same ISA. It is a different ball game from prior game system architectures. Reverse engineer the loading process, implement stub OS services, implement wrappers for the OGL/DX ABI calls, and you are probably well on your way there.

It took Sony/MS years to make the implementations from scratch and optimise them; having to implement an entire system from a guessed spec would take an incredibly long time and be incredibly inefficient. It'll take a lot longer than 10 years before something even starts: 12 years on, there's still no Xbox emulator for PC, and that's a puny P3 733 MHz system.


Posted

It took Sony/MS years to make the implementations from scratch and optimise them; having to implement an entire system from a guessed spec would take an incredibly long time and be incredibly inefficient. It'll take a lot longer than 10 years before something even starts: 12 years on, there's still no Xbox emulator for PC, and that's a puny P3 733 MHz system.

 

I'm not sure what exactly you are referring to when you say "it took Sony/MS years to make implementations from scratch ...", where you got your time estimates from, or why the emulation would be incredibly inefficient. What exactly are you referring to?

 

Emulating the Pentium 3 was never the issue with the original Xbox. That was a done deal, given that you don't need to emulate a processor with the same feature-set and ISA as the processor in your PC. Moreover, there are emulators for the original Xbox, but you don't hear much about them because they can't really play many games. The question is why. The answer is pretty simple: lack of GPU emulation support. This was, in part, difficult because the Xbox GPU had a distinct feature-set from the GeForce 3 cards of the era (and future cards, for that matter). One emulator exists that has done everything except that part of the equation (interestingly enough, it was actually done by a student for a class project). Evidently, development efforts were also hindered by a lack of documentation for how the Xbox worked.

 

The questions here are really how distinct the GPU feature-set was and how much interest there has been in Xbox emulators over the years. For the former, I'm not sure of the specific details, but it is evidently the case that the GPU feature-set is distinct from the NV20. For the latter, I imagine there hasn't been much interest, considering that there were not many exclusives for the system (even including bad exclusives). Without a community or interest, you are never going to get a finished emulator, even if it is a relatively simple thing to do.

 

For the new systems, we are talking about SoCs which are not really all that custom. They have largely taken existing IP cores and integrated those into a die (cheap and smart -- this is the way of the future). It isn't like the old days where feature-sets between components were custom developed and in turn drove future GPU development for PCs.

 

The question is how homogeneous the feature-set is and how good the documentation is for OS calls/GPU calls/etc. for these new systems. We already know the ISA is the same, so x86 code can run natively if the executables are loaded properly and ABI stubs are implemented for library calls. I'm assuming the developer documentation for those interfaces is not poor. It also seems to me that the feature-set for these on-die GPUs is fairly homogeneous with already existing AMD/ATI GPUs in the important parts you would need to emulate (e.g. custom shaders/ops). The less custom it is, the less you need to emulate. Remember, we are talking about technologies that are no longer in their infancy and that have matured: designs and interfaces are becoming more and more standardized in terms of GPU functionality, language specifications, and APIs for game programming. It is also worth noting that I have read that the early development kits were just high-end PCs, which further suggests the homogeneity between these systems. At a high level, for me, that really leaves one piece of the equation left: how to handle the ESRAM. I don't know for sure, but I would imagine it is accessed via MMIO on the system, and that simplifies how to go about the issue.
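To illustrate the MMIO guess at the end of that paragraph: if the ESRAM really were exposed as a fixed physical window, an emulator could route guest accesses in that range to an ordinary host buffer. The base address and (shrunk) size below are assumptions for illustration, not documented values:

```c
#include <stdint.h>

/* Invented address map for illustration only. */
#define ESRAM_BASE 0x80000000u
#define ESRAM_SIZE 0x10000u     /* 64 KB stand-in for the real 32 MB */

static uint8_t esram[ESRAM_SIZE];   /* host-side backing store */
static uint8_t dram[0x10000];       /* stand-in for normal RAM */

/* Guest byte load: unsigned subtraction makes the range check a
 * single compare, wrapping below-base addresses out of the window. */
uint8_t guest_load8(uint32_t addr) {
    if (addr - ESRAM_BASE < ESRAM_SIZE)     /* inside the ESRAM window */
        return esram[addr - ESRAM_BASE];
    return dram[addr % sizeof dram];        /* toy fallback path */
}

void guest_store8(uint32_t addr, uint8_t val) {
    if (addr - ESRAM_BASE < ESRAM_SIZE)
        esram[addr - ESRAM_BASE] = val;
    else
        dram[addr % sizeof dram] = val;
}
```

If the access pattern really is this simple, the ESRAM stops being a special problem and becomes just another region in the emulator's address dispatch.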

 

At the end of the day, I'm not saying you'll ever see an emulator, but it seems far more feasible for one to exist than with prior architectures, given the homogeneity of these systems and x86_64 PCs. If you want to debate the distinctions, can you give me something concrete to philosophize on? I really would be interested.


Posted

I'm not sure what exactly you are referring to when you say "it took Sony/MS years to make implementations from scratch ...", where you got your time estimates from, or why the emulation would be incredibly inefficient. What exactly are you referring to?

<snip>

 

Would it still be an emulator if the instruction set is so common, and the code will be executed on the same platform it was designed for? I could see 'emulator' being used for PPC or RISC to x86, anything where the hardware itself needs to be simulated in code... Which raises the question: think we could 'hackintosh' this generation? I.e., get the software to run on non-MS/Sony hardware?


Posted

Would it still be an emulator if the instruction set is so common, and the code will be executed on the same platform it was designed for? I could see 'emulator' being used for PPC or RISC to x86, anything where the hardware itself needs to be simulated in code... Which raises the question: think we could 'hackintosh' this generation? I.e., get the software to run on non-MS/Sony hardware?

 

Depends on the exact usage of the term 'emulator'. It originally meant hardware-assisted simulation, iirc. That went out the window at some point (probably the '90s, with system emulators that have no hardware support). I'm pretty lax in usage because I've seen the term tossed around to mean various things in HPC. I think a lot of the time it really just means 'simulation of some type of hardware system', whereas you could use the term 'simulation' in a more general context. People like to debate the term w.r.t. the WINE project ('Wine Is Not an Emulator' vs. 'WIndows EMulator'). That is just 'emulating' Windows, which was specifically designed to run on exactly the same hardware as Linux (more so than in the case of the XBone or PS4 vs. the PC) :-D

 

As for hackintoshing it, that seems like a holy grail. I don't know the details, but I imagine it is a firmware/OS combo bundled in one. If the high-level features are separated and not too hardware-specific, something like this could be possible. Any code that does low-level hardware access/driver-equivalent work would need to be replaced, though (which I suppose is akin to loading kexts like OS X, but I doubt these systems feature loadable kernel modules). I imagine it would be more of a frankenbuild with key things replaced (and many, many binary patches) instead. My gut feeling is that this is harder than just doing it the WINE-ish way.


Posted

<snip>

Emulating the Pentium 3 was never the issue with the original Xbox. That was a done deal, given that you don't need to emulate a processor with the same feature-set and ISA as the processor in your PC. Moreover, there are emulators for the original Xbox, but you don't hear much about them because they can't really play many games. The question is why. The answer is pretty simple: lack of GPU emulation support. This was, in part, difficult because the Xbox GPU had a distinct feature-set from the GeForce 3 cards of the era (and future cards, for that matter).

<snip>

The GPU of the original Xbox is almost a standard GeForce card; in fact it's based off the pre-production model. You can take various official NVIDIA cards, flash the pre-production firmware to them, and they'll work fine as part of an Xbox alpha dev kit if you put them inside, and they work fine with the official NVIDIA display drivers of the time, which is why the alpha Xbox allowed you to boot XboxOS or Windows 2000.

 

And like Wine with Windows, you'd need to find out ALL the function calls and all their possible arguments, and write your own code to process the functions the way the abstraction layer MS/Sony provides does. That's what I'm referring to, and it's why Wine is still vastly incomplete: it's a bloody huge project to undertake. And Windows APIs are documented for developers, whereas Xbox One/PS4 documentation is only available to registered developers, so people would know even less about how to implement it. Plus there's copy protection, and I don't mean copy protection as in something on the disc/system to stop you playing pirated games; I'm talking about specific reliances on and checks for hardware that, if missing, make it either refuse to operate or unable to operate.

It's 2013. SNES games haven't been commercially produced by Nintendo since 1996, and only NOW, 17 years later, has someone managed to create an actual, properly accurate SNES emulator (I'm not referring to Snes9x etc., which isn't an exact implementation; I'm referring to bsnes), and he's only just managed to get the few DSPs from SNES cartridges onto an FPGA. 17 years. The Xbox is vastly more complex, as are all more recent systems.


Posted

The GPU of the original Xbox is almost a standard GeForce card; in fact it's based off the pre-production model. You can take various official NVIDIA cards, flash the pre-production firmware to them, and they'll work fine as part of an Xbox alpha dev kit if you put them inside, and they work fine with the official NVIDIA display drivers of the time, which is why the alpha Xbox allowed you to boot XboxOS or Windows 2000.

 

<snip>

 

So your argument is that the alpha Xbox GeForce cards, which were just GeForce 3 cards, work with alpha Xbox firmware, which was just GeForce 3 firmware? Yeah, this is true. The alpha kit was a PC (and the BIOS firmware could be flashed so you could just run whatever you wanted). But it's also a terrible argument for the complexity of emulation had it stayed true for the final product, because it would mean that simulating the Xbox GPU would be incredibly easy feature-set-wise: they'd share their entire feature-set and you wouldn't have to emulate anything. Fortunately, MS and nVidia made a custom GPU in the end, with custom features and vertex/pixel shaders, and diverged the DX implementation, so all of this ended up being more difficult (this is based off of what emulator devs say and Wikipedia).

 

Interestingly enough, I just read that hackers got Windows 2000 running on the final version of the Xbox, so clearly the CPU, memory architecture, and basic GPU functionality weren't much different from a PC... I guess that really did just leave the more advanced GPU feature-set to emulate...

 

<snip>

 

And like Wine with Windows, you'd need to find out ALL the function calls and all their possible arguments, and write your own code to process the functions the way the abstraction layer MS/Sony provides does. That's what I'm referring to, and it's why Wine is still vastly incomplete: it's a bloody huge project to undertake. And Windows APIs are documented for developers, whereas Xbox One/PS4 documentation is only available to registered developers, so people would know even less about how to implement it. Plus there's copy protection, and I don't mean copy protection as in something on the disc/system to stop you playing pirated games; I'm talking about specific reliances on and checks for hardware that, if missing, make it either refuse to operate or unable to operate.

It's 2013. SNES games haven't been commercially produced by Nintendo since 1996, and only NOW, 17 years later, has someone managed to create an actual, properly accurate SNES emulator (I'm not referring to Snes9x etc., which isn't an exact implementation; I'm referring to bsnes), and he's only just managed to get the few DSPs from SNES cartridges onto an FPGA. 17 years. The Xbox is vastly more complex, as are all more recent systems.

 

The API is going to be documented (as is the ABI, unless it strays from x86_64) for these systems; otherwise, developers couldn't use it. Sure, it is only available to registered developers, but it is disingenuous to say that it won't ever surface or that hackers wouldn't be able to figure out the interfaces. Moreover, many of the GPU interfaces may differ in ABI only (if they differ at all...) from the PC. If worse comes to worst, these are going to be in the form of some kind of exported library call with signatures, so it probably isn't going to be too difficult to figure that part out. Honestly, the real issue is implementing the functionality of the APIs, _not_ the API interfaces.

 

There are a number of reasons why WINE is "incomplete" as you put it:

(1) It was implemented using black-box reverse engineering.

(2) It has to support more and more things with newer and newer releases of Windows and core library functionality.

(3) It has two decades of cruft and poorly documented APIs to implement (MS did an amazingly ****ty job of this in the '90s, from what I hear).

(4) It is a runtime layer implementing functionality that normally runs at Ring 0, and it has to work around the issues that come with that.

 

Honestly, it isn't even "vastly" incomplete. It works for A LOT of things (including actually playing Windows AAA games using a DX wrapper). It just sucks performance-wise for many Windows applications because of API issues with GDI (no direct access to buffers), and it doesn't have proper support for multi-threading.

 

Copy protection is a known issue; it existed even in early DOS games. Is it an issue? Yes. Is it foolproof? No. It isn't something new, or something that hasn't been employed on PS3/360/PC. What exactly is different here? You know how most hackers bypass copy protection (which necessarily includes any sort of software check for missing hardware)? They either modify the binaries to remove the checks or modify the checks to always succeed. That isn't the sort of thing that is going to deter hackers. But if you really want to argue this, I'll give you a better talking point: payload encryption. Encrypt the binaries, or even just certain segments of the code, and suddenly it isn't so easy to reverse engineer. Then reverse engineering requires either running a disassembler on the fly or taking a dump of the in-memory image of the binary. These things are not really all that fun, but they are sort of clever and stop disassemblers from making automated, easy work of a binary.
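A toy sketch of the payload-encryption idea: the code segment ships obfuscated on disk and is decrypted in memory just before execution, so a disassembler pointed at the file sees only noise. A real scheme would use a proper cipher (e.g. AES) with the key held inside the CPU; the single-byte XOR here only illustrates the load-time decryption step, not a serious cipher:

```c
#include <stddef.h>
#include <stdint.h>

/* Decrypt a code segment in place.  The on-disk image stays
 * encrypted; only the in-memory copy becomes readable, which is
 * exactly why reverse engineers end up dumping memory instead of
 * disassembling the file. */
void decrypt_segment(uint8_t *seg, size_t len, uint8_t key) {
    for (size_t i = 0; i < len; i++)
        seg[i] ^= key;
}
```

The same routine run twice restores the original bytes, which is why XOR is convenient for a demo and useless as real protection.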

 

As for your SNES analogy: sure, bsnes (or Higan, as it is called now) has come around as the most accurate SNES emulator, but you are being rather disingenuous here:

 

(1) The entire point of Higan is to run SNES games with 100% accuracy, down to the quirks of the original hardware. I.e., it is attempting to be a cycle-accurate simulator (i.e. something that is INSANELY SLOW and really, really, really hard to do). Seriously, it requires you to understand and simulate the exact functionality and timing of each co-processor/DSP/enhancement chip (including any quirks in the interaction of these things). While admirable, it is a sisyphean task. It is not something you get even if you emulate a platform using FPGAs and have the original specification from the people who designed the hardware. Systems can be so complex that not even the designers are aware of the exact function or timing of everything in combination with everything else (which, for example, is why you have an errata list for every processor that has ever been released).

 

(2) It did not take 17 years to get a 'proper' or 'accurate' SNES emulator. 99% of SNES games worked fine a decade ago (or more) on ZSNES and Snes9x (for example, I was playing emulated games for the system in 1999, possibly earlier). There is something to be said for being smart when emulating systems instead of being pure. The focus of other SNES emulators (and most emulators in general) is on getting games working and playable, NOT on making a 100% faithful and accurate emulation of the original hardware. One of these is not like the other, and infinitely harder than the other. Which brings me to (3):

 

(3) You don't need to emulate the instruction set architecture, or the hardware, or attempt a 100% accurate emulation for these modern systems. The former (ISA emulation) was _required_ for the emulation of the SNES. These things would be completely inane focuses and a genuine waste of time; emulating the ISA would be just plain stupid, to be honest.

 

(4) Your SNES analogy is a false analogy. Just because they are both video game systems doesn't mean that you would be required to 'emulate' them in the same manner. These modern systems would be 'emulated' in a completely different manner than the SNES; ergo, the development time for a SNES emulator (be it Higan at 17 years or Snes9x at 3 years) is not particularly relevant.

