High-level emulation

High-level emulation (HLE) is an approach to constructing emulators, especially for video game consoles, that attempts to reproduce the behaviour of the system rather than accurately recreating its internal design.

Instead of trying to recreate the hardware gate by gate, HLE builds a software platform on which the emulated code can run on a host computer with different hardware and a different instruction set. The effort focuses on recreating the functionality provided by the emulated system. The emphasis thus shifts from the most efficient method of processing data to producing the same (or comparable) results as if the native platform were used. By contrast, the traditional approach to emulation is termed low-level emulation (LLE), which is also used to develop new computer hardware and to execute legacy binary code.
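The contrast can be sketched in code. The following is a minimal, hypothetical illustration (the toy instruction set and routines are invented for this example, not taken from any real console): LLE interprets every individual instruction of the guest program, while HLE recognises what a whole routine is for ("clear a block of video memory") and replaces it with a single host-level operation that produces the same result.

```python
# Hypothetical sketch: LLE steps through each guest instruction,
# while HLE performs the equivalent operation in one host call.

def lle_run(program, memory):
    """Low-level emulation: interpret each instruction in turn."""
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "STORE":          # write a value to emulated memory
            addr, val = args
            memory[addr] = val
        pc += 1

def hle_clear_screen(memory, start, size):
    """High-level emulation: the same effect as one host operation."""
    for addr in range(start, start + size):
        memory[addr] = 0

# The "game" clears 4 bytes of video memory, one store at a time.
program = [("STORE", 0, 0), ("STORE", 1, 0), ("STORE", 2, 0), ("STORE", 3, 0)]
mem_lle = {0: 7, 1: 7, 2: 7, 3: 7}
mem_hle = {0: 7, 1: 7, 2: 7, 3: 7}

lle_run(program, mem_lle)
hle_clear_screen(mem_hle, 0, 4)
print(mem_lle == mem_hle)  # both approaches produce the same memory state
```

The HLE path never inspects individual guest instructions; it only needs to know what result the routine is supposed to produce.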

The term HLE originates from UltraHLE, the first emulator for the Nintendo 64 console capable of running commercial games. Early discussion of HLE arose to explain why some video games did not function properly with the emulator. The earliest (1962) high-level emulator was called a functional simulator; it executed military flight programs written in symbolic assembly language without generating binary code.

Criteria for high-level emulation

In order for the HLE approach to work, the platform in question must meet certain criteria: there must exist an abstraction level higher than the raw binary machine code to be executed, realised either directly in the hardware or by the operating system. In either case, it must exist outside of the software intended to run on the emulated platform, and it must have a certain amount of standardisation and well-defined semantics for HLE to succeed.
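One common form of such an abstraction layer is a set of standardised operating-system entry points. The sketch below is hypothetical (the address 0x8000 and the "print" service are invented for illustration): when the emulated program jumps to a documented OS entry point, the emulator calls a host-side implementation instead of executing the OS's own machine code.

```python
# Hypothetical sketch: an HLE dispatch table mapping documented OS
# entry-point addresses to host-side implementations of those services.

HOST_OUTPUT = []

def os_print_string(args):
    # Host-side replacement for the emulated OS "print" service.
    HOST_OUTPUT.append(args["text"])

# Documented entry-point address -> host implementation of the call.
OS_ENTRY_POINTS = {0x8000: os_print_string}

def hle_call(address, args):
    handler = OS_ENTRY_POINTS.get(address)
    if handler is None:
        raise NotImplementedError(f"no HLE handler for {address:#x}")
    handler(args)

# An application using the documented entry point works as expected.
hle_call(0x8000, {"text": "hello"})
print(HOST_OUTPUT)  # ['hello']
```

The table only works because the entry points are standardised and their semantics documented, which is exactly the criterion described above.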

Comparison to traditional models

Compared to LLE, HLE involves a very different set of design decisions and trade-offs. As the complexity of modern (fifth-generation and later) video consoles has rapidly increased, so has their computational power; more importantly, the gap in computational power between consoles and consumer PCs, the most common host systems for emulators, has shrunk over time. The requirements on the quality of the emulated services therefore increase, together with the difficulty of meeting them. Hardware chips in consoles are usually highly specialised towards the specific functionality needed by the games written for them, often in directions completely different from those taken by the hardware of an average PC. For example, 3D graphics might be realised by an extremely fast integer processor, coupled with the assumption that main system memory is the same as graphics memory, eliminating the separate step of loading textures.

Emulating such an architecture instruction by instruction on a PC, characterised by its emphasis on floating-point operations and by specialised graphics hardware with memory separate from system memory, would be extremely difficult, especially given the scarcity of documentation typical of specialised, proprietary hardware. Even if such an emulator could be created, it might be too slow to use. An HLE emulator instead takes the data to be processed, along with the list of operations to apply, and implements them using the means available on the host system. Floating-point maths and GPU operations can be performed natively. The result is not only a much better match with the host platform, but often visibly better output, as floating-point computation yields higher-quality graphics suited to the high-resolution displays available for PCs. Note, however, that differences in resolution, shading, or the processing of graphics memory, sound, and other subsystems will change the output from that of the native machine the emulator is trying to replicate. Besides being less authentic, this can be undesirable in some cases: it may render portions of the game that were never meant to be seen, make seams in textures more evident at higher resolutions or under bilinear filtering, and at worst cause software to crash or skip instructions because interrupts are not handled correctly by the HLE simulation.
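The data-plus-operations translation described above can be illustrated with a small hypothetical example (the 16.16 fixed-point format and the "display list" commands are invented for this sketch, not a real console's format): a console that represents vertex coordinates as fixed-point integers, replayed on a host that works in native floating point.

```python
# Hypothetical sketch: translating a console's fixed-point display
# list into host-side floating-point vertices.

def fixed_to_float(v):
    """Convert a 16.16 fixed-point value to the host's native float."""
    return v / 65536.0

def hle_process_display_list(commands):
    """Replay a (made-up) display list using host floating point."""
    vertices = []
    for op, *args in commands:
        if op == "VERTEX":
            vertices.append(tuple(fixed_to_float(c) for c in args))
    return vertices

# Two vertices at (1.0, 2.0) and (0.5, 0.25) in 16.16 fixed point.
dl = [("VERTEX", 65536, 131072), ("VERTEX", 32768, 16384)]
print(hle_process_display_list(dl))  # [(1.0, 2.0), (0.5, 0.25)]
```

The emulator never imitates the console's integer pipeline; it only needs to know what the commands mean, then hands the work to whatever the host does best.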

Advantages and disadvantages of HLE

Chief among the advantages of the HLE technique are the ability to use the existing host facilities better and more easily, the ability to optimise results as the code and host hardware improve, and the reduced (or eliminated) work needed to achieve the desired end result when an appropriate function is already provided by the host, as is common for 3D graphics functionality. Progress of an implementation is also far less dependent on detailed hardware documentation, relying instead only on the list of functions available to the programmer, which is already provided by the software development kit for each platform.

The disadvantages include a much greater reliance on standardisation among target applications and on the presence of sufficiently high-level mechanisms in the emulated platform. If no such mechanism exists, or if an application fails to use it in one of the already-supported ways, that application will not work correctly, even if other, superficially similar applications run without problems. A significant number of tweaks may therefore be required to get all of the desired titles running satisfactorily.
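This failure mode can be sketched concretely (the addresses and handler are hypothetical, invented for this illustration): an application that bypasses the documented entry points, for instance by jumping into the middle of an OS routine, presents the emulator with an address that has no high-level meaning.

```python
# Hypothetical sketch: a call-level HLE table cannot handle an
# application that jumps past the documented entry points.

KNOWN_ENTRY_POINTS = {0x8000: lambda: "printed"}

def hle_dispatch(address):
    handler = KNOWN_ENTRY_POINTS.get(address)
    if handler is None:
        # The emulator has no high-level meaning for this address.
        raise NotImplementedError(f"unhandled jump target {address:#x}")
    return handler()

print(hle_dispatch(0x8000))        # the documented call works
try:
    hle_dispatch(0x8004)           # a mid-routine jump does not
except NotImplementedError:
    print("needs a game-specific tweak")
```

Each such unhandled case is one of the per-title tweaks mentioned above.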

As a side effect, HLE removes a common source of legal issues by not requiring users to supply the bootstrap software that the original platform used to create an environment for applications to run in. Because the emulator itself provides such an environment, it no longer needs system ROM images, bootstrap cartridge images, or other software obtained from a physical copy of the emulated system, a process whose status under copyright law is usually unclear.

HLE is easier to start with, and when optimised it can achieve great speed even on weaker hardware, but it does so by sacrificing authenticity: the accuracy of the HLE approach cannot match that of proper LLE software. Speed is HLE's greatest advantage, but it is achieved by simulating the desired output rather than producing a mathematically correct, properly timed output. In the same emulator, one title may behave 90% as closely as the emulated machine, while another reaches only 50%, or fails to boot at all, because its software depends on very precise timings or on functions whose output is not reproduced correctly. With LLE, since the software replicates the original hardware chips down to their bugs and wait states, most software should work correctly without the extensive game-specific hacks and individual, sometimes per-game tweaks that become necessary once an error is spotted in HLE. Maintaining compatibility and accuracy in an HLE emulator targeting a machine with a large game library therefore requires far more work, and the testing of hundreds, sometimes thousands, of individual titles.
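The timing problem can be made concrete with a small hypothetical sketch (the 4-cycles-per-store cost and the memory-fill routine are invented for illustration): LLE bills cycles for every emulated instruction, while HLE finishes the operation "instantly", so any software that polls a cycle counter or waits on hardware timing observes a different machine state.

```python
# Hypothetical sketch: the cycle counter diverges between LLE and HLE
# for the same logical operation, which is why timing-sensitive
# software breaks under HLE.

class Machine:
    def __init__(self):
        self.cycles = 0

def lle_fill_memory(m, n):
    # Each emulated store costs the (made-up) documented 4 cycles.
    for _ in range(n):
        m.cycles += 4

def hle_fill_memory(m, n):
    # One host operation; per-instruction timing is not modelled.
    m.cycles += 1

lle, hle = Machine(), Machine()
lle_fill_memory(lle, 256)
hle_fill_memory(hle, 256)
print(lle.cycles, hle.cycles)  # 1024 1
```

A game that waits "until 1024 cycles have passed" behaves correctly under LLE, but under HLE the condition may never occur as expected, forcing a per-game workaround.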

Future outlook

As console systems grow ever more complex, the importance of the HLE approach increases. Modern (sixth- and seventh-generation) video consoles are already far too complex and powerful for emulation using the traditional approach to be practical. Additionally, some systems (notably the Xbox) are themselves little more than a standardised PC, making it wasteful to try to recreate that hardware with a PC as the host machine. HLE thus increasingly becomes the only sensible approach.

The state of consumer-level PCs has also changed: newer computers are much faster than those of 20 years ago, and LLE is at last becoming possible for some of the earliest consoles and CPUs that had to be emulated via HLE in the 1990s. As a result, many emulators can opt for accuracy and cycle-accurate replication of the original microchips, producing very precise software environments that can finally replace old consoles and computers. HLE, however, has found a new purpose on smartphones, handheld devices, and other electronic gadgets with much lower specifications than the average computer; for these devices, its speed and simulated functionality translate into higher frame rates.

This article is issued from Wikipedia - version of Friday, September 18, 2015. The text is available under the Creative Commons Attribution/Share Alike licence; additional terms may apply for the media files.