AMD ATI FirePro V8800 sets SolidWorks 2010 on fire

Published 08 April 2010

Posted by Greg Corke

Article tagged with: solidworks, amd, graphics, ati firepro

For the last 24 hours I’ve been sat in a dark room playing with AMD’s brand new professional graphics card, the ATI FirePro V8800. Oh what an exciting life I lead!

The fact is it’s an exceptional 3D card and one that ripped through our SolidWorks 2010 graphics benchmark, making incredibly light work of manipulating our RealView model. It also features some interesting new technology called Eyefinity, which supports multiple monitors and, if like me you’re frustrated by Alt-Tabbing your way through apps, could certainly do wonders for your productivity.

Unfortunately we didn’t get to test this feature out but hope to soon – contrary to popular belief we don’t have dozens of spare monitors lying around the DEVELOP3D office.

Over the next few months AMD is set to introduce other FirePro cards based on the same core technology and we’ll test these out as soon as we can get our hands on them. In addition, if the rumours are correct, some new Quadro FX cards from Nvidia are also due out soon, so we’ll hopefully add these into the mix for some sort of graphics card showdown.

If your workstation currently clunks away when rotating 3D models, and if you’ve got some spare cash, the next few months would be an excellent time to invest in some new 3D technology. And with the recent introduction of Intel’s new Xeon X5600 series six core processors, there’s a phenomenal amount of processing power to back it up if you can stretch to an all in one workstation.

For a full review of the ATI FirePro V8800, click here.

Comments:

I would be curious to see you do some benchmark comparisons using Windows 7 between the FireGL cards and the AMD high-end Radeon gaming cards, and the same for the Nvidia Quadro and Nvidia GeForce. Are the Quadro or FireGL cards really worth the $1,000 premium? I am in doubt about this when using DirectX drivers.

Posted by Kevin E on 09 April 2010 at 03:36 AM

Despite ATI's superior benchmark stats, I feel most SolidWorks users would still prefer the Quadro line of cards for Nvidia's driver stability and seemingly fewer graphics glitches. It'd be interesting to see a poll of SW users and readers of Develop3D to find out whether this is indeed true. I'm running an FX1800 myself, which so far has been great.

Posted by Daniel on 12 April 2010 at 02:26 PM

Kevin, while we appreciate that many 3D professionals use ‘gaming’ cards – and we know there are many who are happy doing this – we would not generally recommend using GeForce or Radeon cards for CAD, particularly in applications that use OpenGL (although, as you allude, this is less of an issue for DirectX). Most 3D CAD applications, including SolidWorks, use OpenGL, but not all features of OpenGL are supported in gaming cards. I’m not 100% sure, but I seem to remember that VBOs (Vertex Buffer Objects) are one of them. You can read about this in this DEVELOP3D article on SolidWorks graphics. In addition, gaming cards aren’t certified by CAD vendors, so if you do run into any problems you might not be supported properly. Also, in recent years both AMD and Nvidia have dropped the price of their entry-level professional cards, so you don’t have to spend a lot of cash to get a certified card.

Posted by Greg Corke on 13 April 2010 at 08:55 AM
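As an aside, the point about OpenGL feature support can be made concrete: an application can inspect the driver's GL_EXTENSIONS string at runtime to see whether a capability such as VBOs is advertised. Here's a minimal sketch in Python – the extension strings are hypothetical examples, and from OpenGL 1.5 onwards VBOs are core functionality, so real code would check the GL version too:

```python
def supports_vbo(gl_extensions: str) -> bool:
    """Check whether the space-separated GL_EXTENSIONS string a driver
    reports includes the Vertex Buffer Object extension."""
    return "GL_ARB_vertex_buffer_object" in gl_extensions.split()

# Hypothetical driver strings, purely for illustration:
consumer = "GL_ARB_multitexture GL_EXT_texture_compression_s3tc"
workstation = consumer + " GL_ARB_vertex_buffer_object"

print(supports_vbo(consumer))     # False
print(supports_vbo(workstation))  # True
```

When a driver doesn't advertise a feature like this, the application either falls back to a slower path or misrenders – which is one plausible mechanism behind the glitches discussed in this thread.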

Daniel, you raise a very interesting point and one that’s very hard for us to quantify as a magazine – it’s not all about frame rates. We’ve used both makes of card under SolidWorks and have never experienced any problems. However, we don’t use SolidWorks day in, day out like you guys. Currently, there are far more Nvidia Quadro FX cards out there than AMD ATI FirePro cards, so I think it would be hard to get a balanced view, but I’d certainly be very interested to hear about SolidWorks users’ experiences, good or bad. I’d also like to hear from anyone using gaming cards. As far as your choice goes, I agree the Quadro FX 1800 is a good all-round card, and it performed well when we tested it under SolidWorks 2009. The new cards from AMD certainly look great and it’ll be interesting to see how the lower-end cards stack up in terms of price/performance when they are released.

Posted by Greg Corke on 13 April 2010 at 09:07 AM

I think Autodesk made a great move making Inventor use DirectX drivers instead of OpenGL. From what I understand, there is not a lot of difference between a GeForce card and a Quadro card. In fact, people used to regularly "soft-Quadro" their GeForce cards, turning them into Quadro cards via software. I know you can buy cheaper Quadro cards, but to get some of the same specs as, say, the GeForce GTX 470 or GTX 480 you are going to have to buy the high-end Quadros, and there will be a significant price increase. Moving from OpenGL to DirectX doesn't seem to have caused any issues for Inventor. I would say it has brought some improvements, along with the ability to save a bundle of cash on graphics cards.

Posted by Kevin E. on 15 April 2010 at 05:29 PM

Greg, the one thing that has not been mentioned is that a lot of the new rendering tools around intend to use, or already use, Nvidia CUDA cores that only come with Nvidia cards. So if you use Hypershot, new versions of built-in Mental Ray-based renderers, or similar, Nvidia cards are likely to remain the best option.

Posted by Kevin Quigley on 16 April 2010 at 08:58 PM

Kevin E, it makes sense for Autodesk since they only support Windows anyway, and DirectX is Windows-only. For everyone else that supports multiple OSes... not so much.

Posted by Andy B on 19 April 2010 at 05:31 PM

Andy B, If I'm gonna use Autodesk Inventor, is an HD 5870 or an HD 5970 okay for it? Or should I use Nvidia cards?

Posted by D on 21 April 2010 at 11:43 AM

I'm not at all sure that Inventor being on DirectX means there aren't any problems. It's even stated in the release notes of Inventor 2011 that 'not all WHQL certified Direct3D graphics drivers are completely trouble free.' What's also interesting is that most CAD/CAM applications have continued with OpenGL, even the ones that only operate on Windows. I can only think of three that use DirectX – Inventor, MicroStation and Solid Edge – although I am sure there are others. What could be interesting with Autodesk is that it is one of the key CAD software developers actively looking at the Mac platform. If AutoCAD (more likely) or Inventor (less likely) were ported to OS X, it would have to develop a new graphics engine, or borrow one from one of its Mac-based programs.

Kevin, what you write about new rendering tools using iray is very true. I'm not really a fan of bespoke technologies, but Nvidia certainly looks to have some momentum with the Mental Ray-based renderer. AMD also has a bespoke GPU rendering technology through studioGPU.

Posted by Greg Corke on 21 April 2010 at 03:02 PM

D, couldn't really help you on that, as we've never tested Radeon cards, but you may be able to find some information here: http://www.inventor-certified.com/graphics/index.php

Posted by Greg Corke on 21 April 2010 at 03:09 PM

Also worth noting about Inventor: you can throw masses of graphics power at it and your frame rates won't necessarily increase. This is because the CPU is usually the bottleneck well before you get anywhere near hitting the limits of the GPU (the same is also true of some models under SolidWorks). Some models we have tested with Inventor show no performance difference between a £100 graphics card and a £2,000 one. This may have changed for Inventor 2011, but we haven't tested this yet.

Posted by Greg Corke on 21 April 2010 at 03:13 PM
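The bottleneck point above can be expressed as a simple pipeline model: each frame, the CPU spends some time preparing and submitting draw calls and the GPU spends some time rendering them, so the frame rate is bounded by the slower of the two stages. A toy illustration in Python, with made-up timings rather than benchmark data:

```python
def frame_rate(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work overlap: the pipeline
    runs at the pace of its slowest stage."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# A CPU-bound viewport needing 40 ms of CPU work per frame is capped
# at 25 fps whether the GPU takes 20 ms (cheap card) or 5 ms
# (expensive card) - the faster GPU buys nothing.
print(frame_rate(40.0, 20.0))  # 25.0
print(frame_rate(40.0, 5.0))   # 25.0
print(frame_rate(10.0, 20.0))  # 50.0 - only now does the GPU matter
```

This is why, in the tests described above, a £100 and a £2,000 card can post identical numbers on the same model.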

Due to all the confusion over OpenGL and DirectX GPU choice with SolidWorks, Inventor and others, could Develop3D do a definitive guide to what is needed for each API (and thus the possible choices for different CAD applications)?

I ask because we've recently looked into changing from XP 32-bit to Win7 64-bit for Inventor 2010 (now 2011), and during this change we decided to investigate new workstations (and thus new graphics cards). We spoke to a number of sellers who had systems reviewed by your good selves, and all were pushing the OpenGL cards, stating they were more suited to Inventor (with gaming cards not even on the list of available options). Knowing Autodesk had changed to DirectX a couple of releases ago, we asked about the performance-to-price differences between "CAD" and "gaming" cards, with benchmarks for each. The reply seemed pre-rehearsed, stating CAD cards are more accurate, have better drivers and are more suited to the job (quoting SPECviewperf benchmarks as "CAD-related"). Looking into this benchmark, I noticed it was OpenGL and thus moot for our Inventor needs – why are we being quoted OpenGL benchmarks when we specified DirectX hardware?! Wouldn't a 3DMark or PassMark benchmark be more suitable? How are these cards more accurate?! No answers given...

I felt I had to do more digging into the differences between CAD and gaming cards, and I'm honestly at a loss as to why Inventor users are still being pushed the more expensive cards. The only conclusion I can draw is that resellers are either up-selling (by stating potential untruths) to gain a better profit margin, or possibly just ignorant of the different needs Inventor/DirectX now has compared with SolidWorks/OpenGL. I can understand advantages for CAD cards in the OpenGL world, but Inventor's moved on (thank god) and surely the workstation vendors' options and advice should too?
I know this has been discussed many a time, but generally only within the OpenGL environment with respect to CAD products. As such, I am asking for a definitive guide to what is needed today, for each environment, to clear up any confusion end-users, vendors and I might have.

For reference, my recent findings: I have tested Inventor 2010 (on a Core 2 Duo, 32-bit XP) with a Quadro FX 1500, GeForce GTS 250 and ATI HD 4770 – no advantage for the Quadro. Strangely, the ATI was noticeably slower creating IDW 2D base views (for a specific assembly it was 6.5 mins for both Nvidia cards but 8.5 mins for the ATI). As for Inventor 2011 (on an i7, 64-bit Win7) with a Quadro FX 580 against the previous GeForce 250 and ATI 4770 – the Quadro struggled with frame rate and usability compared to the two other cards, with rotation and panning a chore and the model lagging behind the mouse. At least this time the ATI didn't suffer the slower 2D plotting speeds.

I have discussed my findings further with others on Sean Dotson's website: http://www.mcadforums.com/forums/viewtopic.php?f=5&t=11067

There is also a collection of quotes from an Autodesk Inventor programmer giving an insider's view of the matter: http://www.docstoc.com/docs/15647572/Autodesk-Inventor-OpenGL-to-DirectX-Evolution/ My impression is that for DirectX the most powerful card for your cash, whether CAD or gaming, is the best choice. There are also some details on how, with DX10, concerns over display accuracy have been solved (well, I think that's what is being explained at the end...).

Look forward to any thoughts from Develop3D, Sam

Posted by Sam M on 06 May 2010 at 02:57 PM

Sam – thanks for all your comments and links. It makes for an interesting read and there are lots of valid points. I can quite understand why you are having a problem getting a definitive answer to your question of whether to use a professional or consumer card for Inventor. The thing is, everyone has different opinions because they have different experiences. As a magazine we don't test CAD software nearly as much as someone who uses it day in, day out – otherwise we wouldn't have any time to produce magazines ☺. And someone who uses the software in a particular way may experience an issue that no one else does.

For me, while frame rates are important, this whole consumer-versus-professional debate comes down to support. While I'd love to believe that all software is perfect, we've all experienced glitches and faults in all sorts of code. When there's a problem with CAD software, the glitches are often first identified by users, who report the problem to the support team; the software developer then fixes the problem and issues a Service Pack. Both professional and consumer graphics cards can also have glitches in their drivers. BUT if there's a problem with a professional card's driver when running CAD, I'd be pretty confident it would get fixed in a new driver release. Conversely, if there was a problem with a consumer card's driver when running CAD, I wouldn't even expect it to register on their list of priorities. Why would it? Most of their customers play 3D games, and if I was in charge I'd certainly pool all my development resources into making sure Call of Duty, or whatever the latest game the kids (and adults) are staying up all hours playing, runs better, smoother, etc.

Yes, there is an argument that consumer cards are WHQL certified, so all DX applications should run fine. And while a certain level of testing is required for the WHQL stamp of approval, I don't believe for a minute that Microsoft tests the graphics card and drivers when running Inventor or any other DX CAD application – and even if it did, I doubt it would do so for any length of time. For me, knowing that I was fully supported in my choice of hardware would make me want to buy a professional graphics card every time – even for DX applications. But I can totally understand that some choose to go the consumer route and would rather save money. I guess in some ways it's a bit like buying insurance.

Incidentally, AMD's professional graphics team recently told me that one of its UK FirePro customers had been experiencing a display problem in Inventor 2011, and it has already fixed the problem in a new driver release.

Posted by Greg Corke (DEVELOP3D) on 07 May 2010 at 02:53 PM

Greg, thanks for the reply. I understand the support benefits of the workstation cards over consumer ones, but I honestly still think potentially false information is being fed to the end-user. We've been told so many reasons over the years for choosing the CAD cards when using OpenGL, so I fear the same are still being spouted, almost automatically, irrespective of whether the customer is using DirectX or OpenGL (and from my recent experiences I could personally question whether the hardware vendors even know the difference). I would honestly love someone to digest all the info and present it in a way that clears up all confusion and separates OpenGL concerns from DirectX ones. I know it would benefit not only myself but the majority of your readers, which is why I hope this discussion could be expanded into your magazine.

Yeah, I know, I can hear the groans and imagine the eye-rolling at any suggestion of another graphics-card and OpenGL v DirectX rehash, but I think we're in this murky position simply because old arguments (possibly false ones) are still being applied from the OpenGL world in the DirectX one (and vice versa) – especially in light of those comments from the Inventor programmer, which almost contradict any hardware vendor. I think this is important, not only to Inventor users already on DirectX, but because, from what I understand, the new OpenGL version possibly screws over the CAD market and could potentially either force a change to DirectX or cause even more confusion as to which graphics cards and drivers are suitable.
I know it's a year or so old, but Tom's Hardware's comparison between OpenGL 3 and DX11 suggests considerable problems with the new OpenGL and CAD (either in its short-term stability, or possibly a long-term concern over OpenGL's lack of respect/consideration for the existing CAD market): http://www.tomshardware.co.uk/opengl-directx,review-31330-4.html

"According to Carmack, OpenGL 3's falling short of what it was supposed to be is mainly the fault of certain CAD software developers who weren't really favorable to Longs Peak. They were afraid of problems with compatibility and their applications due to the disappearance of certain older functions. That version was tacitly confirmed by Lichtenbelt: 'During the Longs Peak design phase, we ran into disagreement over what features to remove from the API... The disagreements happened because of different market needs... We discovered we couldn't do one API to serve all.'"

As for your comments regarding premium support on the CAD cards despite WHQL certification, this is where I get lost. As far as I understand the situation, Nvidia (or whoever) creates a card and DX driver that goes to M$ for submission – if it meets the DX criteria, it is certified. On the other side of the coin, Autodesk writes their code to take advantage of the specified DX standard. If there is a bug, it's either in the way the CAD software uses the DX API (and thus a patch) or the graphics card hasn't truly met the criteria and thus shouldn't have been certified. I know this is possibly overly simplistic, and we all know bugs/mistakes get through any system, but I still think it's a better and more controlled system than OpenGL. Hell, looking at the box for the GeForce 250 next to me, it says it supports OpenGL 2.1 and thus should be on par with a CAD card within that environment – but I doubt it.

Even the old OpenGL arguments confuse me when I sit and think about them. I've been told previously that CAD cards are more "accurate" – how? I'm aware they have a more mature OpenGL library, which possibly explains this statement at a sub-pixel level, but does that mean the gaming card will produce an incorrect part? The 3D coordinates for that part are presumably held and manipulated by the CPU; adding a fillet, the tangential calculations are CPU-based; in a similar respect, FEA calculations are performed by the CPU from the 3D matrices held in RAM. Sure, as soon as it comes to displaying the model on screen, the 3D matrix is textured and flattened to 2D by the GPU, but how is the result any more "accurate"? It's not like a 1.6±0.1mm dimension on a print will suddenly come out as 1.5 when PDF'd or plotted.

Yes, I'm possibly overly cynical, but is it just in the interest of the GPU manufacturers to keep this information vague so we keep buying their premium range? Take, for example, the Quadro FX 580 (retails in the UK for about £130): as far as I understand, it's a glorified GeForce 9500 (which retails for about £40-50) with tweaked drivers. If they're able to make a profit from selling this hardware for £40, but now have a market persuaded to pay three times that, of course they're financially motivated to keep pushing these cards... If, hypothetically, Dassault or Autodesk have a part in this OpenGL driver support and thus a cut of the profits, then a true conspiracy theorist would say it's equally in their interest to keep end-users on the premium cards too. Hell, they're certainly not going to make any friends with GPU manufacturers if they admit the gaming cards are suitable, so I am surprised by the honesty of those comments by the Inventor programmer (and thus not surprised it's not an official statement) – yet another reason why I feel an independent publication on all of this is needed.

When I started to write this I had a few concerns and questions which I felt were worth asking, but skimming through it, it reads more like a rant, so I apologise.
Are there potential problems with OpenGL in the future? Will there be more interest in DirectX by cad developers? Have I got the wrong end of the stick and am wrong myself? Any thoughts?

Posted by Sam M on 11 May 2010 at 04:09 PM

Sam, I'd love to be able to do an in-depth article on this, as it's certainly an important topic for our readers – and one that comes up a lot. Unfortunately, we just don't have the resources to do it – or the in-house, code-level knowledge of the APIs, the way that CAD vendors have implemented them, or the graphics drivers. I hear a lot of things that go on behind the scenes that I wish I could print.

In terms of accuracy, I think maybe it's not the best word for the hardware vendors to use. I have seen examples in the past with consumer cards where the shaded face of a solid is not in the same place as the edges of the same model, or lines have simply not been drawn. I have also seen Autodesk's ViewCube disappear in AutoCAD – in DX mode – but it was then visible when Nvidia's dedicated OpenGL driver, Powerdraft, was used (which does not run on consumer cards).

I am meeting with Nvidia next week and will pose some of the issues you raise. I will also seek more information from both AMD and Nvidia (and possibly Dell) with a view to writing a more in-depth article in the future about exactly what goes into the certification process for OpenGL and DirectX applications. I'm sure you are aware that both Nvidia and AMD have higher profit margins on professional cards than on consumer cards, but I know they spend a significant amount of time and money on testing, optimisation and certification. I personally don't think there's a conspiracy out there. I do think there's a problem with education, and I hope to find out more that will help CAD users make better-informed decisions about their hardware purchases.

Cheers, Greg

P.S. We like rants on DEVELOP3D.com. That's what it's there for :)

Posted by Greg Corke (DEVELOP3D) on 12 May 2010 at 02:28 PM
