
The AMD Spider platform: Series-7 chipset, Phenom CPU, HD38x0 GPU

After a long wait AMD has finally unveiled its new Spider platform. For the first time since the merger with ATI the company can offer a complete package: a new chipset, a new CPU family and new GPUs. We were invited to the big launch event in Warsaw, Poland, and have both photos and some benchmarks of the new products.

INTRODUCTION

The computer industry is a funny place. One company can be on top for several years, only to be toppled and relegated to the back of the pack within months. In some ways that is what has happened to AMD and ATI. These once-separate companies were the top dogs, the kings of their respective fields. While NVIDIA struggled with the GeForce FX (5-series), ATI shined with the Radeon 9x00 series. And while AMD won the hearts of most enthusiast users with the Athlon 64, Intel struggled to offer the same performance/price ratio with the P4.

Today things look a lot different. NVIDIA and Intel have products that simply offer a lot more value than AMD, especially in the CPU segment.

Between the 14th and the 16th of November I flew to Warsaw, Poland, to be introduced to AMD's new set of components, codenamed Spider. The Spider platform consists of new GPUs, new CPUs and a new chipset, and is AMD's answer to both Intel's and NVIDIA's latest offerings. Is it enough? Read on and find out.


THE SPIDER IS HERE

The Spider is the codename for the whole CPU/chipset/GPU platform from AMD. It is, however, not something that AMD will market towards consumers. They did hint that some computer builders might use the Spider name in their marketing, but it is nothing AMD will push. I personally think that is a pity, as the spider image they are using is quite cool.

The idea behind the Spider platform is to offer a complete set of products that are guaranteed to work best together. This does not mean that each component only works with other AMD products. In fact, AMD also talks a lot about being open, and we will see motherboards using competitor chipsets that support CrossfireX, just like we have Intel-based motherboards today that support Crossfire.

The Spider platform consists of 3 different components:

Chipset
The new Series-7 chipset lays the foundation for the whole platform. It has 42 PCI-Express lanes, multi-monitor CrossfireX support and also supports the new AMD Overdrive software.

CPU
We've waited a long time for these, but finally it is time for AMD to release its new Phenom Quad-core CPUs. These CPUs are "true" Quad-Core designs, have a shared L3 cache as well as HyperTransport 3.0.

GPU
Last but not least AMD is releasing two new GPUs: the HD3850 and the HD3870. These GPUs are ready for Quad-CrossfireX, support DirectX 10.1 and have on-chip UVD.

On paper the Spider does seem to have some serious bite to it but what about the reality? Well, it was kind of a mixed bag to be honest.
 

OUR FIRST STOP – THE SERIES-7 CHIPSET

Chipsets are nowhere near as sexy to talk about as new GPUs or CPUs, and yet they are the foundation for everything. A bad chipset can affect the performance of a whole system. We all remember the poor USB performance of the earlier ATI chipsets, don't we?

The new Series-7 chipset offers some cool new features that I think will be appreciated by many.

CrossfireX
We all know Crossfire – the ability to take two cards and run them together for (almost) double the performance. With the Series-7 chipset and the new GPUs we now get CrossfireX. No longer do you have to settle for running two GPUs together; instead you can hook up one to four HD3xxx GPUs in a system. This of course depends on which motherboard you get. While top-end motherboards using the 790FX chipset will have four 16x PCI-E slots, cheaper versions will have fewer slots (the 790X will have two and the 770 just one 16x PCI-E slot). As the Series-7 chipset also has 42 PCI-E lanes there will be no shortage of lanes for the slots, and motherboard builders will be able to choose any configuration they want.
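To summarize the slot configurations mentioned above (as presented at the launch; actual board layouts will of course vary by manufacturer), here is a quick reference:

```python
# PCI-E x16 slot count per Series-7 chipset variant, as presented at the launch event.
# Individual motherboards may differ, so treat this as a rough guide only.
SERIES7_X16_SLOTS = {
    "790FX": 4,  # top-end boards, CrossfireX with up to four cards
    "790X": 2,   # mainstream, standard two-card Crossfire
    "770": 1,    # single-card boards
}

for chipset, slots in SERIES7_X16_SLOTS.items():
    print(f"{chipset}: up to {slots} x16 PCI-E slot(s)")
```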

In addition to allowing up to four cards in a CrossfireX configuration, AMD now makes it possible to run multiple monitors with Crossfire. Up until now you have not been able to use more than one monitor when using SLI/Crossfire, but this is no longer an issue: with the 790FX chipset you can run up to 8 monitors. I do not expect this to be useful for the majority of gamers, but imagine running two monitors with two HD3870 cards in a game like Supreme Commander. Yummy.

PCI-Express Gen 2 and HyperTransport 3.0
With new Quad-core CPUs and Quad-GPU CrossfireX there is a need for more graphics bandwidth. This is handled by PCI-Express 2.0, which increases the bandwidth from around 8 GB/s (bi-directional) for a PCI-E 1.1 x16 slot to 16 GB/s for PCI-E 2.0. Add the new HyperTransport 3.0 and you can gain even more performance.
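For those who like to see where those numbers come from, here is a quick back-of-the-envelope sketch in Python. It assumes the standard per-lane signalling rates (2.5 GT/s for PCI-E 1.1, 5.0 GT/s for PCI-E 2.0) and 8b/10b encoding, and it reproduces the 8 and 16 GB/s bi-directional figures quoted above for a x16 slot.

```python
# Rough PCI-Express bandwidth estimate for a x16 slot.
# PCI-E 1.1 signals at 2.5 GT/s per lane, PCI-E 2.0 at 5.0 GT/s per lane;
# 8b/10b encoding means only 8 of every 10 bits carry data.

def pcie_x16_bandwidth_gb_s(gt_per_s: float, lanes: int = 16) -> float:
    per_lane_gb_s = gt_per_s * 8 / 10 / 8   # GT/s -> GB/s per direction after 8b/10b
    return per_lane_gb_s * lanes * 2        # both directions combined

print(pcie_x16_bandwidth_gb_s(2.5))  # PCI-E 1.1: ~8 GB/s bi-directional
print(pcie_x16_bandwidth_gb_s(5.0))  # PCI-E 2.0: ~16 GB/s bi-directional
```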

AMD Overdrive
Finally AMD is offering a way to tweak the chipset/CPU on the AMD platform. NVIDIA has offered nTune for quite some time now and I’ve been surprised that AMD has not managed to do something similar until now.

In short, the AMD Overdrive application allows you to change all sorts of BIOS settings from within Windows. It has a "Novice Mode" where all you have to do is drag a slider to increase performance, and an "Advanced Mode" where you can change tons of settings (including clocking each core in a Phenom separately).

All Tier-1 board partners will support this utility right away, and it looks like other board makers will also support it. If there was one thing to complain about with the nTune application, it was that few motherboards supported it fully.

We had the opportunity to test the Overdrive software during our benchmark sessions, and while it definitely looks nice and the advanced mode seems useful for tweakers, I am a bit sceptical about the novice mode. If a regular user sees a slider where the best setting says "High Performance", who will not slide it all the way to the top? AMD assured us that these settings cannot damage the computer, but I am more worried that they can make the system unstable; something a regular or novice user might not realize is due to using too aggressive settings in Overdrive.

Overdrive also includes auto-tuning for the CPU and the HyperTransport bus. It finds the maximum speeds for both, and just like nTune it can take hours and lots of restarts to finally arrive at an overclocked speed. As soon as we have our own Phenom we will of course try it out.

We should soon see motherboards using the various Series-7 chipsets from all the major board makers. During the event we saw motherboards from Asus, MSI, Gigabyte, Foxconn, Sapphire and ASRock. In addition to these we will also get motherboards from ECS, Biostar, Jetway, Abit and J&W.

Most motherboard makers seem to use the SB600 SouthBridge but at least Foxconn told me that they will wait and release their motherboard with the SB700 SouthBridge in January.

Also rumoured to come out in early 2008 is the RS780 chipset, which will have integrated graphics. The rumoured specifications are DirectX 10.1 support, performance on par with the HD2400/GeForce 8500GT, and integrated UVD. Talk about a kick-ass motherboard for an HTPC.

Photos


A lot of motherboards were on parade.

 
Left: Gigabyte's 790FX motherboard should already be out and uses the SB600 Southbridge.
Right: As always, ASUS comes up with cool new features for its motherboards. How about a heatsink for the memory?

THE PHENOM CPU – FOUR CORES ARE BETTER THAN TWO

When AMD moved from Socket 939 to Socket AM2 a lot of users were upset; there did not seem to be enough of a difference between the Socket 939 and AM2 CPUs to warrant a change in socket. It was apparently a difficult decision for AMD to make this move, but it was done with the future in mind. This is obvious when looking at the new AM2+ socket that is introduced with the Series-7 chipset. Older AM2 CPUs will work fine in the AM2+ socket, while new Phenoms will work fine in older AM2 sockets; all that happens is that the new features will not work. The next update to the socket will also be backwards compatible with the current sockets, making it possible to upgrade to a new motherboard without having to change CPU.

So what are the stand-out features of the Phenom?

First “True” Quad-Core Desktop processor
AMD really likes to tell everyone that unlike the current Quad-Core from Intel, which is two Dual-Core dies put together, the Phenom is a true Quad-Core CPU built from the ground up. All cores, the memory controller and a separate I/O interface communicate through a high-performance crossbar switch.

Integrated memory controller
This is not a new feature but it is worth mentioning. The integrated memory controller can optimize memory performance both with matched and mismatched DIMMs. The physical address space has been increased to support up to 256 TB of memory.
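That 256 TB figure works out to 48-bit physical addressing, which makes for a trivial sanity check (a sketch of the arithmetic, not anything from AMD's documentation):

```python
# 2^48 bytes of addressable physical memory equals 256 TB
# (using binary terabytes of 2^40 bytes each).
addressable_bytes = 2 ** 48
print(addressable_bytes // 2 ** 40)  # -> 256
```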

Shared Third Level Cache
The shared L3 cache provides additional cache capacity, shortens the average cache latency and allows the cores to rapidly share information without going out to DRAM.

HyperTransport 3.0
HT 3.0 increases the link speed to up to 5.2 GT/s, for a raw bandwidth of up to 20.8 GB/s. The Phenom is of course backwards compatible with HT 1.0 and HT 2.0, although you get less bandwidth with them (up to 6.4 GB/s raw bandwidth for HT 1.0 and 8.0 GB/s for HT 2.0).
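Those raw figures follow directly from the transfer rate if you assume a 16-bit link counted in both directions; a small Python sketch of the arithmetic (the HT 1.0 and HT 2.0 transfer rates here are back-calculated from the bandwidth numbers above, so treat them as illustrative):

```python
# Aggregate HyperTransport bandwidth = transfer rate * bytes per transfer * 2 directions.
# A 16-bit link moves 2 bytes per transfer in each direction.

def ht_bandwidth_gb_s(gt_per_s: float, link_width_bits: int = 16) -> float:
    bytes_per_transfer = link_width_bits / 8
    return gt_per_s * bytes_per_transfer * 2  # both directions combined

print(ht_bandwidth_gb_s(1.6))  # HT 1.0 -> 6.4 GB/s
print(ht_bandwidth_gb_s(2.0))  # HT 2.0 -> 8.0 GB/s
print(ht_bandwidth_gb_s(5.2))  # HT 3.0 -> 20.8 GB/s
```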

Cool’n Quiet 2.0
In these environmentally conscious times it is nice to have your CPU take it easy when full power is not needed. In the Phenom AMD has updated Cool'n'Quiet, adding more features to lower idle power consumption and reduce fan noise. The processor has a C1E power state that activates when all the cores are inactive: it disconnects the HyperTransport link, places the memory in low-power mode and lowers the internal clocks.

A new feature called PSI (Processor Power Saving Indicator) helps reduce the amount of current fed to the processor. A new processor output can notify the voltage regulator that less current is needed, and the regulator can then reduce the number of output phases to improve efficiency when the cores are idle.

These new features bring the Phenom into compliance with the new Energy Star 4.0 requirements.

Overall it sounds pretty cool, doesn't it? Well – there is a bit of a problem. First of all, AMD is only bringing out a 2.2 GHz (AMD Phenom 9500) and a 2.3 GHz (AMD Phenom 9600) part. You might wonder where the 2.4 GHz Phenom (AMD Phenom 9700) is. Well, it turns out there is a problem with the first versions of the Phenoms. A bug has been found in the processor (actually in the L3 cache) which affects the stability of the CPUs in some extreme cases. The bug is present in all current Phenom CPUs, but AMD believes it only affects the 2.4+ GHz Phenoms: when the processor is running certain applications under a heavy workload, it can crash. AMD already has a BIOS fix, but that fix can induce up to a 10% speed penalty. In fact, the AMD Overdrive software currently lets you turn the BIOS fix on or off, depending on whether you want full speed but risk the bug or want to play it safe. AMD will of course fix this in the next revision of the chips, but it is still a blow that they are not even able to get a 2.4 GHz part out.

During the event we had access to Spider systems using the 2.4 GHz CPU, and we could overclock it quite easily with the AMD Overdrive software to around 3.0 GHz, so things are not as grim as they seem. Beginning next year we should hopefully see new, faster Phenoms, including FX Phenoms clocked at 3.0 GHz.

Performance

At the event we had access to some Phenom systems and a limited number of benchmarks. The only benchmark that I could run on both the Phenom system and my system at home was PCMark Vantage. Below are some of the sub-scores that might be of interest. As soon as I get my own Phenom at home, more relevant benchmarks will be run.


 

Intel System: Intel [email protected], 2 GB DDR3@1066 MHz, HD2900XT and HD3850 512 MB, 36 GB WD Raptor, Catalyst 7.10, Vista

Phenom System: Phenom [email protected], 2 GB DDR2@1066 MHz, single and dual HD3850 512 MB, 150 GB WD Raptor, unknown driver version, Vista

THE FINAL PIECE OF THE SPIDER PUZZLE: THE NEW HD38x0 GPUS

As this is AMD's first launch of a complete package, there is just one piece left: the new GPUs. If you are expecting something that will rival NVIDIA's GeForce 8800GTX Ultra then you will be disappointed. AMD has instead decided to target the enthusiast/mainstream market and, in my opinion, has released two GPUs that no doubt will be very popular. Instead of going up against the GTX and GTX Ultra, these two new GPUs are going up against the GeForce 8600GT and the GeForce 8800GT.

The main new features of the HD3800 series of GPUs are:

DirectX 10.1
This is actually a minor upgrade to DirectX 10 which will come to Windows Vista with SP1. It helps improve HDR lighting and makes real-time global illumination possible for more realistic lighting.

Global illumination is not a new thing, but it has previously been done by performing as much of the calculation as possible in a pre-process step and then storing the results for use at runtime. This produces nice effects but does not work for dynamic scenes with dynamic lighting. In DirectX 10.1 global illumination is done by rendering hundreds of cube maps to capture the lighting in the environment, and DX10.1 adds cube map arrays for this.

55nm manufacturing process
By moving to a 55nm manufacturing process AMD is able to decrease both the power usage and the price of the GPU. The number of transistors is slightly higher in the HD3870 than in the HD2900XT and slightly lower in the HD3850 (actually 666 million transistors, which would make this the GPU of the Beast?).

UVD
Last generation we had UVD on the HD2400 and HD2600, while the HD2900XT did everything in the shaders; now it will be present in both the HD3850 and the HD3870. The reason it was skipped in the HD2900XT was that AMD did not see it as an HTPC card, so you could use the power of the shaders instead, even though it meant a bit more CPU usage. With the HD38x0 AMD has two cards that both look like good candidates for media PCs, and thus the inclusion of UVD becomes more important.

While talking about DirectX 10.1, AMD's Richard Huddy got quite agitated when asked about the comments from Crytek and Microsoft that DirectX 10.1 was not an important update. He went as far as saying that Crytek was simply covering for its GPU partner and that the internal reaction at Microsoft was quite upset. He even claimed that Crytek had gotten money from NVIDIA for the Crysis development, saying that AMD knew this as they had also been in the bidding war.

Talking to Richard Huddy afterwards, I asked him about the lack of a high-end GPU part. This is the second launch where AMD does not have anything to counter the 8800GTX and 8800GTX Ultra with. His answer intrigued me. He said that AMD sees us at a crossroads: making single-chip video cards faster and faster is getting harder and harder, so AMD instead sees multi-GPU as the road for the future. His answer was that AMD's response to the 8800GTX is two HD3870 cards, and with the ability to support up to four cards in CrossfireX, he felt that AMD has products for the mainstream, enthusiast and hard-core gamer markets.

I think it is an intriguing idea that instead of buying one card you buy two or more cards if you want the performance, and I can see a point here. It makes it possible to easily add more performance six months down the road instead of having to buy a completely new card. A good example is going from the HD2600XT to a HD2900XT, or from a GeForce 8800GTS to a GeForce 8800GTX: they are upgrades within a generation, but they require quite a lot of investment. If you buy a HD3850 it would be cheaper to add one or two more HD3850 cards (this of course requires the right motherboard) and get a performance boost.
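To make the upgrade-path argument concrete, here is a small sketch using the recommended euro prices mentioned later in this article; the Crossfire scaling factor is purely an assumption on my part, not a measured number.

```python
# Hypothetical comparison: add a second HD3850 in CrossfireX versus the step up
# to a HD3870. Prices are AMD's recommended launch prices (in euro); the scaling
# factor is an assumption for illustration only.

HD3850_PRICE = 179
HD3870_PRICE = 229
ASSUMED_CF_SCALING = 1.7  # assumed: a second card rarely gives a full 2x boost

print(f"Add a second HD3850: {HD3850_PRICE} EUR for ~{ASSUMED_CF_SCALING:.1f}x "
      f"the performance of a single card")
print(f"For comparison, a single HD3870 costs {HD3870_PRICE} EUR")
```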


MSI showed a motherboard with four HD3850 cards in it.

The problem I see with this philosophy is that so far the state of the Crossfire and SLI performance has not exactly been impressive, especially on Vista. If Crossfire and CrossfireX are to become an important upgrade and performance route for AMD they need to get those drivers rock solid so that a normal user can just pop in another card and right away get the performance benefit from them. I am also a bit hesitant regarding the power usage with this model. These cards do not use much power but combine three or four of them and the power usage starts going through the roof.

At the end of the event we also got the chance to see a HD3870 X2 card running in a case. This is a card similar to the 2600XT Gemini, which was very rare in the wild. It has two HD3870 GPUs directly on the card and thus basically acts as a Crossfire HD3870 setup. With two of these you get Quad-Crossfire with just two cards.


Two HD3870 X2 cards running in Crossfire

I have no clue how well it will work, and the cards will not be out until the beginning of next year, but I see X2 cards as more viable competition to NVIDIA's high-end cards than two separate cards in Crossfire mode.

AMD is pricing these two GPUs aggressively. The HD3850 has a recommended price of around €179, while the HD3870 has a recommended price of around €229. A quick look at a few local Swedish web stores (Komplett, Webhallen) shows the cards already selling for less than this: a HD3850 can already be had for around the same price as a GeForce 8600GTS, while the HD3870 costs roughly $50 less than the new GeForce 8800GT.

Performance

We had the opportunity to run some benchmarks at the event, and I also brought a HD3850 home with me. These are just some quick benchmark runs; I will soon have the opportunity to run more tests on both a HD3870 and a HD3850 and compare them to other AMD and NVIDIA cards.

System: Intel [email protected], 2 GB DDR3@1066 MHz, HD2900XT and HD3850 512 MB, 36 GB WD Raptor, Catalyst 7.10, Vista

The HD3850 is the slower card of the new pair and costs around half of what the HD2900XT does, yet it is still almost as fast. Considering it draws much less power, is much quieter, runs cooler and has UVD on board, it is a clear winner in my book.

System: Phenom [email protected], 2 GB DDR2@1066 MHz, single and dual HD3850 512 MB, 150 GB WD Raptor, unknown driver version, Vista

This test was done at the event. As you can see, the HD3850 scales nicely as we move up in resolution. I cannot wait to test this in some real games.

System: Intel [email protected], 2 GB DDR3@1066 MHz, HD2900XT and HD3850 512 MB, 36 GB WD Raptor, Catalyst 7.10, Vista

I ran a few quick benchmarks with the full version of Crysis. I used the Crysis benchmark utility which can be found on the net and ran the gpubench.bat (included with Crysis) at low, medium and high quality settings. DX9 was forced. DX10 scores will follow later.
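Since the frame rate varies a little from pass to pass, I average a few runs per quality setting. A trivial helper for that is shown below; the fps values in it are placeholders for illustration, not my actual results.

```python
# Average the frame rates from repeated benchmark passes at each quality setting.
# The fps values here are placeholders, not measured results.
runs = {
    "low":    [62.0, 61.5, 62.3],
    "medium": [41.2, 40.8, 41.0],
    "high":   [24.9, 25.1, 24.7],
}

for setting, fps_values in runs.items():
    avg = sum(fps_values) / len(fps_values)
    print(f"{setting:>6}: {avg:.1f} fps average over {len(fps_values)} runs")
```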

We see the same tendency as with 3DMark06. Even though the HD3850 costs half as much as the HD2900XT, it still performs very well overall.

Photos

 
Left: ASUS HD3870. Right: Gigabyte HD3870

 
Left: A HD38x0 card from Sapphire with an interesting cooling solution.
Right: Two HD38x0 cards from GeCube, also with a slightly different cooling solution.
These cards are clocked higher than the standard HD38x0 cards.


This quad-Crossfire HD3870 system was actually running 3DMark06 in a demo loop.
No one was allowed to benchmark it, though we tried.

FINAL WORD

These events are always a bit fun. In a few days you get pumped full of information about how great the products are, hopefully with some good technical information hidden somewhere behind the PR fluff, while you just cannot wait to get home to test the new toys in a "real" environment. While the actual briefings can be interesting, the most important part is the chance to meet other journalists and discuss everything from the products we just saw to how to review properly. All this is of course done over lots of beers. The overall feeling I got from talking to my fellow journalists was that while the GPUs and the chipsets sound really good, there was real disappointment over the low clock speeds of the first Phenoms. One of my colleagues even used the title "Phenom fiasco" for his magazine article.

I can agree to a certain extent. It must be frustrating to prepare for a big launch like this and then, at the finish line, have to pull back some of the higher-clocked Phenoms because of a bug. It must also be frustrating for the chipset guys and the GPU guys that the whole Spider platform is getting crippled by the CPUs. To AMD's credit they have instead chosen to compete on price, and there is no doubt that the new Spider platform offers tremendous value for money for enthusiast and mainstream customers. It might not be enough to lure people back from the Intel fold, but it should be a great upgrade path for people with older Athlon 64 X2 CPUs who have been waiting for an AMD solution.

AMD might not have completely succeeded, but they have shown that there is still life left in the company and that Intel and NVIDIA will have to stay on their toes in the future.
 
