
Matrox Parhelia

Published by marco on

Don’t think the choice of an NVidia card is cut-and-dried yet. Sure, the latest Radeon offerings from ATI are slightly slower than the NVidia cards, but what about visual quality? Speaking of which, an old hand at making slow, nice-looking graphics has a new technology called Parhelia. The Matrox site has pages of information, with screenshots and videos galore. They hope to shed the “slow” reputation with this one.

Ars Technica has an article, Aiming High at Matrox, with some details about the new card:

  1. 80 million transistor (0.15 micron process) 512-bit GPU
  2. 256-bit DDR memory interface with up to 256MB DDR unified frame buffer
  3. AGP 8x with Fast Writes
  4. OpenGL 1.3 and DX 8.1 compliant
  5. Fourth Generation DualHead Technology (400MHz 10-bit RAMDACs up to 2048x1536 @32bpp)
  6. Support for 3rd RGB output (3 display desktop up to 3840x1024 @32bpp)
  7. Quad Vertex Shader Array, Quad texturing per pixel
  8. 36-Stage Shader Array
  9. 64 Super Sample Texture Filtering
  10. Hardware Displacement Mapping (see the sketch after this list)
  11. 16x Fragment Antialiasing
  12. 10-bit DVD playback, filtering, scaling and output
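
Item 10 deserves a closer look, since displacement mapping in hardware is the headline feature. As a rough illustration of the idea (not Matrox’s actual interface; every name and structure below is made up), here is what the card promises to do per vertex, sketched on the CPU in C: each vertex gets pushed along its normal by a height sampled from a texture, so a flat, low-polygon mesh gains real geometric detail.

    /*
     * CPU-side sketch of displacement mapping. The Parhelia claims to do
     * this sampling in the vertex hardware; everything here (structs,
     * function names) is illustrative, not Matrox's API.
     */
    #include <stddef.h>
    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    typedef struct {
        Vec3  position;
        Vec3  normal;   /* assumed unit length */
        float u, v;     /* coordinates into the height map */
    } Vertex;

    /* Nearest-neighbor lookup into a w-by-h height map with values in [0,1]. */
    static float sample_height(const float *map, int w, int h, float u, float v)
    {
        int x = (int)(u * (float)(w - 1));
        int y = (int)(v * (float)(h - 1));
        return map[y * w + x];
    }

    /* Push every vertex along its normal, scaled by the sampled height. */
    static void displace(Vertex *verts, size_t count,
                         const float *map, int w, int h, float scale)
    {
        for (size_t i = 0; i < count; i++) {
            float d = scale * sample_height(map, w, h, verts[i].u, verts[i].v);
            verts[i].position.x += d * verts[i].normal.x;
            verts[i].position.y += d * verts[i].normal.y;
            verts[i].position.z += d * verts[i].normal.z;
        }
    }

    /* Tiny demo: displace two vertices by a 2x1 height map. */
    int main(void)
    {
        float map[2] = { 0.0f, 1.0f };
        Vertex v[2] = {
            { {0, 0, 0}, {0, 0, 1}, 0.0f, 0.0f },
            { {1, 0, 0}, {0, 0, 1}, 1.0f, 0.0f },
        };
        displace(v, 2, map, 2, 1, 0.5f);
        printf("v1.z = %.2f  v2.z = %.2f\n", v[0].position.z, v[1].position.z);
        return 0;
    }

The win is bandwidth: the application ships a coarse mesh plus a small height map, and the GPU generates the detail.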

Hardware Zone has an even more thorough run-down in Matrox Parhelia-512: The Technology. The article mentions that Matrox has demonstration versions, but availability is still a ways off, with prices expected to be in the sub-$400 range for a 128MB DDR card.
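
While we wait for boards, the spec numbers already tell a story. A quick back-of-the-envelope check shows why even the triple-head mode fits comfortably on the 128MB card; the surface count below (front buffer, back buffer, and a 32-bit Z buffer) is my assumption about a typical double-buffered 3D setup, not a Matrox figure:

    #include <stdio.h>

    int main(void)
    {
        const long width  = 3840;   /* triple-head desktop, per the spec */
        const long height = 1024;
        const long bpp    = 4;      /* 32bpp */

        long surface = width * height * bpp;
        long total   = surface * 3; /* front + back + 32-bit Z (assumed) */

        printf("one surface:  %.1f MB\n", surface / 1048576.0);
        printf("front+back+Z: %.1f MB\n", total / 1048576.0);
        printf("left for textures on a 128MB card: %.1f MB\n",
               128.0 - total / 1048576.0);
        return 0;
    }

That works out to about 45MB of surfaces, leaving over 80MB for textures.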

That’s a lot of stuff for one card: 3-monitor support at insane resolutions, 10-bit DVD playback, four vertex shaders, quad texturing per pixel, a 36-stage shader array, and 16x fragment antialiasing. Whew! If Matrox can deliver, this could give NVidia a run for its money: NVidia always wins the speed crown, but lags behind ATI and Matrox at building an all-in-one card that renders as nicely.
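
That 16x figure is less frightening than it sounds, because the trick behind fragment antialiasing is to spend the 16 samples only on the pixels that sit on polygon edges instead of supersampling the whole frame. A rough cost comparison, where the 5% edge fraction is purely an illustrative guess:

    #include <stdio.h>

    int main(void)
    {
        const double pixels  = 1280.0 * 1024.0;
        const double samples = 16.0;
        const double edge_fraction = 0.05;  /* illustrative guess */

        double supersample = pixels * samples;
        double fragment_aa = pixels * ((1.0 - edge_fraction)
                                       + edge_fraction * samples);

        printf("full 16x supersampling: %.0f samples/frame\n", supersample);
        printf("16x on edges only:      %.0f samples/frame (%.2fx base)\n",
               fragment_aa, fragment_aa / pixels);
        return 0;
    }

With numbers like that, 16x edge quality costs closer to 2x fill than 16x, which is presumably how Matrox hopes to keep the “slow” label off.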

When you’re playing the latest and greatest game at 1280x1024 at 75FPS, who cares if the NVidia card will do 90FPS? That’s the difference between 13.3 and 11.1 milliseconds per frame. Still, the jury is out until Mr. Carmack weighs in with his opinion of the new technology. Here’s hoping they’ve made their shader language compatible with OpenGL and/or DirectX.