NAMCO MAGIC EDGE HORNET SIMULATOR HARDWARE
(Very) Basic Overview - Full Details Below
Hardware System : Silicon Graphics Onyx Reality Engine II.
Sound : 3D surround sound and radio intercom. 4-channel system plus sub channel.
Memory : 64 MB ECC Memory.
Storage : 2 GB SCSI-II system disk.
Video Output : Video Projector - 30x40 Inch screens.
Controls : Realistic aircraft controls including flight yoke and throttle.
Networking : Two CPUs dedicated to UNIX networking and overhead, passing control information between the six units and the central Onyx system over 10 Mbit Ethernet.
Players / Units : 6
Individual Unit Movement : Fully Hydraulic, 2 DOF in angle, 1 DOF in vertical motion.
(32 inches vertical motion, Roll: +/- 60 degrees, Nose up pitch: 45 degrees, Nose down pitch: 25 degrees)
Individual Unit Dimensions : 2.3 metres wide x 4.7 metres long x 2.2 metres high.
Individual Unit Weight : 4,000 lbs.
Price : $750,000 for the 6 Units and Onyx.
Software : There were three mission packs available: Hornet 1, F18, and X21 Hornet.
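The networking setup above - six pods passing control information to a central Onyx over Ethernet - can be sketched as a simple datagram exchange. Everything below (the wire format, field names, the use of UDP) is my own assumption for illustration; the real Magic Edge protocol was never documented.

```python
import socket
import struct

# Hypothetical wire format for one pod's control state: pod id plus
# yoke pitch, yoke roll and throttle as doubles. Invented for
# illustration - the real protocol is undocumented.
CONTROL_FMT = "!Bddd"

def pack_controls(pod_id, pitch, roll, throttle):
    return struct.pack(CONTROL_FMT, pod_id, pitch, roll, throttle)

def unpack_controls(data):
    return struct.unpack(CONTROL_FMT, data)

if __name__ == "__main__":
    # Loopback stand-in for pod -> Onyx traffic on the 10 Mbit LAN.
    onyx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    onyx.bind(("127.0.0.1", 0))

    pod = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    pod.sendto(pack_controls(3, 0.25, -0.1, 0.8), onyx.getsockname())

    data, _ = onyx.recvfrom(64)
    print(unpack_controls(data))  # (3, 0.25, -0.1, 0.8)
    onyx.close()
    pod.close()
```

At 25 bytes per update, even sixty updates per second from all six pods is a trivial load for 10 Mbit Ethernet, which is presumably why a shared LAN sufficed.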
|Magic Edge F18 / Magic Edge Hornet 1 / Magic Edge X21
|Notes : This has to be said: look at the date of the game - 1993. This is when Sega and Namco had just started making their 3D games with flat, untextured polygons. Bear that in mind when looking at this game: current PC hardware (as of 2001) has only just reached this level, eight years later...
This was a joint venture between Namco and Magic Edge to produce a commercially viable, fun flight simulator for Namco's theme parks. It failed spectacularly due to general and running costs; I suspect these were the only arcade machines in the world with a $100K/year support agreement!
The gameplay was actually disappointing. The graphics were mostly ok, but nothing much by today's standards. Being thrown around in the pods was cool, but I felt that the pods were lagging a bit behind the action, which was nauseating (and I don't usually get motion sick). Also, the cost of each game was ridiculous.
Full System and general overview
|Main Hardware :
Silicon Graphics Onyx System with Reality Engine 2 (REII) - The processors were R4400SCs running at 150MHz; four of them fit on a single CPU card.
This is an eight-processor configuration with two CPUs per pipe, using three MCOs (Multi-Channel Options) on the Onyx system and four Raster Managers per pipe. The three pipelines each drive six displays, providing 18 displays off a single Onyx system.
Graphics Pipes :
Each "pipe" of the Reality Engine was actually three huge boards.
First Board : The GE10 (Geometry Engine version 10), with 12 Intel i860 CPUs doing what is now known as T&L (transform and lighting). This fed into the second board(s).
Second Board : Rasterisation and texturing (the "Raster Manager" or RM boards), which fed into the final board.
Third Board : Video output.
The MCO was a separate board again and deserves some explanation.
Basically, it takes a single rendered frame and splits it up into multiple video outputs, so a single pipe can feed multiple displays. These were often used in military flight sims, where the lower-resolution side and back displays were fed from the same pipe. I'm unaware of any other product which does anything like the MCO - it's quite an amazing piece of technology.
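In software terms, the MCO's job can be mimicked by slicing one framebuffer into per-channel viewports. The real MCO did this in video hardware; the toy sketch below (channel layout and resolutions invented) only shows the idea.

```python
# Toy model of the MCO: carve one rendered frame into several
# independent video outputs. The frame is a list of pixel rows; each
# region is (x, y, width, height) for one output channel.
def split_frame(frame, regions):
    return [[row[x:x + w] for row in frame[y:y + h]]
            for x, y, w, h in regions]

frame = [[(x, y) for x in range(8)] for y in range(4)]  # tiny 8x4 "frame"
# One wide front channel plus two narrower side channels, all cut
# from the same rendered frame - as in the military sim setups above.
front, left, right = split_frame(
    frame, [(2, 0, 4, 4), (0, 0, 2, 4), (6, 0, 2, 4)])
print(len(front[0]), len(left[0]), len(right[0]))  # 4 2 2
```

The point is that the pipe renders once per frame, and the split into channels costs it nothing - which is what made one pipe driving six displays feasible.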
Operating System :
Finally, the whole system ran Irix (SGI Unix).
As I recall, they'd hacked it up so that their application booted straight out of the /etc/rc* scripts, but it was Irix underneath. I don't remember if they used processor affinity, which could have bound CPUs to specific tasks. I also don't recall if they used Performer for their app, although I'd almost bet they did. Performer is a programming layer over IrisGL (later OpenGL) which facilitates large-dataset 3D simulations; it was specifically designed for flight simulators and similar programs.
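Processor affinity of the kind mentioned above was done on IRIX with sysmp(MP_MUSTRUN, cpu). The sketch below uses the Linux analogue, os.sched_setaffinity, purely to illustrate the idea of pinning a process to one CPU; it is not the IRIX API and only runs on Linux.

```python
import os

# On IRIX, sysmp(MP_MUSTRUN, cpu) pinned a process to one CPU so that,
# say, a render task never migrated. This uses the Linux equivalent
# purely as an illustration of the same idea.
def pin_to_cpu(cpu):
    os.sched_setaffinity(0, {cpu})      # 0 = the calling process
    return os.sched_getaffinity(0)      # read back the allowed CPU set

if hasattr(os, "sched_setaffinity"):    # Linux-only API
    print(pin_to_cpu(0))  # {0}
```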
Even today, the Reality Engine is still faster than most PC graphics cards for big datasets. For smaller datasets a modern nVidia card will win, but if you've got a few gigabytes of data to render, the nVidia will choke.
Software and SGI Sillyness :
When SGI released a new graphics product, they also used to release a demo disk which often contained some really cool games. Paradigm, a company that writes tools for professional flight sims, used to have a pretty good flight sim game on there. Apparently it was much better than the Magic Edge software; even the Magic Edge guy confided in me that he thought their software was the problem.
SGI also used to hold contests with Indys as prizes, where they'd challenge people to write games for Irix. This is how Doom and Abuse got ported to Irix, plus a whole slew of really cool and unusual software. One of them even used the IndyCam to monitor your body position, and you played by squirming around to move your player on the screen. Just watching people play it was hilarious.
Of course, this was the old, fun, successful Silicon Graphics. The new SGI is a nasty, mean, unsuccessful organisation which is unlikely to still be around in 12 months' time.
Other Games :
There were a number of games which used Onyxes and Reality Engines at the time, and this one wasn't the weirdest of them. One I was also involved with didn't have individual pods like this one, but instead had a centrifuge about 7m across. There used to be one installed at Intencity at Hurstville in Sydney, and another at Intencity at Parramatta.
The one I played was a car racing simulator (and it was lame). Another was a fight through underground tunnels. Each pod held two players: one steered, and one shot.
The Onyx sat in the middle of the hub, going around and around with the game! These were systems designed for data centres and labs, so spinning around a couple of times a second was scary. Then again, we used to have Onyxes at coal mines bumping up and down every time they blasted, so perhaps it wasn't so bad.
Product Scope & Full Spec
|Product Scope : A flexible image generation system for low cost flight, ground, space, maritime and virtual reality applications.
Full Spec for Silicon Graphics Onyx System with Reality Engine 2 :
1 to 6 hi-resolution (1280 x 1024) outputs/system
Up to 18 independent channels per system with individual eyepoints
Up to 3 graphics pipelines per system each with 4 Raster Managers
Up to 320 million anti-aliased, trilinear mip-map textured pixels per second fill rate per graphics pipeline
Up to 7.8 million displayable pixels per system
Redefinable display line rates
Separate NTSC or PAL composite video output standard
Multi-channel capability in deskside and rack chassis
Up to 32-bit Z-buffer - fully integrated with anti-aliasing
Coplanar surface support
Texture decal support
Over 9000 polygons/pipeline @ 30 Hz
Polygons textured, anti-aliased and Z-buffered
Up to 27000 polygons/system @ 30 Hz
4MB texture capacity standard
Programmable texture map sizes from 2 x 2 to 1024 x 1024 texels
Trilinear mip-map capability standard
Up to 380 128 x 128 mip-mapped textures
From 4 to 48-bits per texel
Lighting and smooth shading blended with texture
Dynamic texture projection
Environment mapped texture
Up to 16 sub-sample anti-aliasing
8 or 12-bits per color component
Lighting, shading, reflection, Z-buffering, anti-aliasing and translucency all combinable on same surface
Database traversal, culling and rendering
Fade level of detail
Frame rate control
Height above terrain
Line of sight
Steerable light lobes
Articulation and geometry animation
Custom programmable special effects
Directional shading and lighting
8 Configurable light sources, with material properties control
Full color interpolation on surfaces
Lighting functions definable: ambience, diffusion, specularity, shininess, emissivity, position and color
Over 100 simultaneous moving models (6 DOF)
All models Z-buffered and anti-aliased at sub-pixel level
Fog, haze, clouds
Tunable fog functions, spline fog
Time of day
In synchronous mode:
Min 33ms @ 60 Hz frame rate and 60 Hz refresh
Min 50ms @ 30 Hz frame rate and 60 Hz refresh
Standard 21" multisync monitor
Real time analog and digital video I/O capability
Selectable display resolutions from VGA to HDTV, including field sequential output
Genlock synchronization standard
Separate composite video output standard
Custom line rates
Video image transfer from disk to texture or frame buffer at real-time video frame rates
Up to 4096 levels of translucency
Alpha to coverage function allows translucency, Z-buffering and anti-aliasing simultaneously
IRIS Performer: Simulation development library enables flexible and rapid application development
Debug and performance tuning tools
CASEVision software development environment
Raster-based round light point support
Brightness and size vary with distance
Independent lightpoint fog control
3:1 trade off with polygons
Strobes, beacons, flashing and rotating lights
Up to 24 multi-processing RISC CPUs
85 - 2880 MIPS and 16 - 528 Mflops (estimated with 150MHz R4400)
REACT real-time system kernel
1.2GB/sec internal system bus
Shared memory multiprocessor architecture
3 to 23 VME slots
High speed disk and peripheral I/O support
Up to 50MB/sec VME rate per VME bus
Up to 5 VME buses supported
Fast and wide SCSI-2 support at 20MB/sec per SCSI bus
Up to 32 SCSI buses per system
Up to 16GB of main system memory
ADA language support
Secure O/S - Trusted IRIX (Security level B1) available
Multiple color maps enable support for different sensor types
Custom outputs configurable to suit display device
Hardware support for image processing functions
Optional digital audio board
Programmable audio library
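The texture figures in the spec above hang together: at the minimum 4 bits per texel, a 128 x 128 texture with a full trilinear mip chain takes about 10.9 KB, so roughly 384 of them fit in the standard 4MB - consistent with the quoted "up to 380". A quick check (my own back-of-envelope arithmetic, not from SGI documentation):

```python
# Back-of-envelope check of the spec's texture figures: one 128x128
# texture with a full mip chain down to 1x1, at the minimum texel depth.
BITS_PER_TEXEL = 4                                        # spec minimum
texels = sum((128 >> level) ** 2 for level in range(8))   # 128x128 .. 1x1
bytes_per_texture = texels * BITS_PER_TEXEL / 8
count = int(4 * 1024 * 1024 // bytes_per_texture)
print(texels, count)  # 21845 texels per chain; 384 textures fit in 4MB
```

The mip chain adds roughly a third on top of the base level (21845 texels versus 16384), which is where the "about 10.9 KB" per texture comes from.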
|Even if the software does become available, emulation is very unlikely. The RE subsystem is probably one of the most complex pieces of hardware ever developed, and SGI never documented the hardware or microcode interfaces. It had at least ten extremely complex ASICs on it, some of which incorporated custom-designed CPUs themselves.
Of course, it *might* be possible to do an UltraHLE-like trap at the OpenGL API, and then use existing OpenGL hardware without bothering with RE emulation.