Graphics card for video editing
  • SpeedGrade is another beast. I can load 4K footage, grade it, and play it back in real time, butter smooth, on an older AMD Phenom II X6, 16GB, GTX 460 system.

    From another forum/GPU thread, I read:

    SpeedGrade is in a league of its own in terms of performance. Instead of using GPU compute, i.e. using the GPU to mimic CPU functions, it directly leverages the GPU's shaders through OpenGL. The result is performance an order of magnitude better than the likes of Premiere Pro or Resolve. Interestingly, the SpeedGrade team recommends Quadro GPUs.

  • I have several of these cards in different computers, and looking at the load in GPU-Z, even the GT 240 never runs at peak. That is, the program just never uses all of the GPU power, even on the 240. I have a passively cooled GT 240 in my video/audio DAW that is silent, and everything runs fine. The ray tracer in After Effects works if you hack the ray-tracer text file, exactly the way it is done in Premiere. If anything, playback is even smoother in CS6.

    I don't see a big difference using GDDR5 as opposed to DDR3, but the testing shows a slight increase in throughput. An i7 with hyperthreading of course makes a big difference, and you need 12GB of RAM; I went for 16GB because it is ridiculously cheap. I do believe that scaling, for all kinds of pan-and-scan work, etc., is better using CUDA. My GT 240s will also power the Catleap 27" monitor, no problem.

    Lastly, native GPU acceleration in SpeedGrade currently requires a real Quadro card; if there is a hack for this, I have not found it. Waiting for the budget Tesla clone :)

  • "It is always good to see some real benchmarks or measurements."

    For CS5.5: http://ppbm5.com/ For DaVinci I haven't seen any... but I would also like to see some.

    "Are you sure that you need 4Gb?"

    I don't know exactly how much memory Premiere needs for GPU 4K editing, but according to people testing it in the Adobe hardware forums it needs more than 1GB (sorry, I don't remember the exact amount, but according to them 1.5GB is enough), and having a lot of memory helps to improve performance.

    http://forums.adobe.com/message/4290928
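
    As a rough back-of-the-envelope check (a sketch only: I'm assuming the GPU pipeline holds frames as 32-bit float RGBA, which is my assumption, not a documented figure), a single 4K frame is already a sizeable chunk of a 1GB card:

    ```python
    # Rough per-frame VRAM estimate for 4K footage.
    # Assumption: frames sit on the GPU as 32-bit float RGBA.
    width, height = 4096, 2160              # 4K DCI frame
    bytes_per_pixel = 4 * 4                 # 4 channels x 4 bytes (float32)
    frame_mb = width * height * bytes_per_pixel / 2**20
    print(f"~{frame_mb:.0f} MB per frame")  # ~135 MB, so a handful of frames plus effect buffers fill 1GB fast
    ```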

    On the other hand (although I think it is not very important for this conversation), having a lot of memory is important for GPU rendering, as the whole scene must be stored in the card's memory. (I guess this can vary depending on the render engine.)

    http://jeffpatton.net/2010/11/gtxquadrotesla-my-opinion-on-todays-gpu-selections-for-rendering/

    I just wanted to point out that having that much memory can be useful for a lot of GPU computing applications, even if you don't need it now.

  • OK.

    Yep, I agree: if you need very good performance, the GTX 670 seems to be the sweet spot.

    But most probably, in real-world use, the GTX 640 won't be far behind.

    Looking at http://www.studio1productions.com/Articles/PremiereCS5.htm it seems that performance in Adobe products does not scale all that well with GPU power.

    Someone needs to do a proper test for DaVinci.

  • @Vitaliy, No, I definitely don't need 4GB at the moment. But in the near future, who knows!? I do know that one of the guys who set up the PPBM5 tests has reported that the GTX 680 slightly outperforms the GTX 580, and I'm pretty sure that was in CS6 (but don't quote me on it!). Bill Gehrke is his name.

    I'm intending to game as well as edit/composite when I get my new system next week, and I have heard that some games come close to eating up 2GB of VRAM. I guess I opted for the 4GB card for a bit of future-proofing, but to be honest it probably wasn't worth the extra money.

  • "I still think that the GTX 670 is a great choice right now."

    It is always good to see some real benchmarks or measurements.

    "... and 4GB of memory lets you do 4K editing, complex GPU 3D rendering."

    Are you sure that you need 4GB?

  • @ignatius

    I still think that the GTX 670 is a great choice right now. At the moment it is quite good for CS6, CUDA apps, gaming... and 4GB of memory lets you do 4K editing, complex GPU 3D rendering...

    Also, Quadro cards right now are either very limited and cheap, or very expensive.

  • It would be very interesting to see real benchmarks on real systems for the major editors and DaVinci 9.

  • @Pedro_ES, Yep, that sounds like a plausible prediction. They're moving to make a clear delineation between gamer cards and professional cards, and of course anything that has 'professional' in the title means premium price. The next generation Quadros will leave the GTX series behind, but you'll have to pay through the nose to get at them!

    I've just dropped about £2550 on a new system and opted for the 4GB GTX670. Won't have that sort of money ever again (now that I'm married with talk of kids!)

    All the best

  • "Yes!... everything I have said before refers to CUDA, but AMD is making some beasts for general computing; they are just not well supported yet by software vendors."

    See my two points on why :-)

    I'll try to talk with BM, but I am almost 100% sure that Adobe is getting money for OpenCL delays.

  • @Vitaliy_Kiselev

    That's why it is a small move :-)

    "Look on chart on top. AMD cards are much better for GPU applications. Especially for double precision."

    Yes!... everything I have said before refers to CUDA, but AMD is making some beasts for general computing; they are just not well supported yet by software vendors.

  • Also, for some CS5 benchmarks... (I think @ignatius was referring to this)

    http://ppbm5.com/

  • "Also Adobe made a small move supporting OpenCL in CS6 for some AMD cards (AMD Radeon HD 6750M and AMD Radeon HD 6770M)."

    This support is very limited, and it works for the Mac version only.

    "Nvidia will make it easier for us to justify spending a lot of money on these cards."

    Look at the chart at the top. AMD cards are much better for GPU applications, especially for double precision.

    "Yes... also they hit this market first and positioned CUDA very well."

    It is very hard to tell who was first. But it is true that ATI did not care about niche markets involving GPU applications. I think AMD managers made an error in hoping for the OpenCL standard; Nvidia could not survive if that happened.

  • @Vitaliy_Kiselev

    Yes... also they hit this market first and positioned CUDA very well.

  • @ignatius,

    Yes, they are still a good choice for CS6, but they are worse for most applications that use double-precision operations... and I think that if this doesn't change, the next generation won't be the best choice.

    Also, Adobe made a small but interesting move supporting OpenCL in CS6 for some AMD cards (the AMD Radeon HD 6750M and AMD Radeon HD 6770M).

    It is just a prediction, but I guess that, unfortunately, the next Quadro generation will be a better general computing card than the GTX line (for CUDA)... so, unfortunately again, Nvidia will make it easier for us to justify spending a lot of money on these cards...

  • "I hope software developers start to use OpenCL more and more, for better competition between Nvidia and AMD, where we can only win."

    I think two points prevent this:

    • Nvidia pays good money to delay OpenCL support, to retain sales in niche markets.
    • Developers at these firms are also happy, as they are not very good at OpenCL.
  • @Pedro_ES,

    Yeah, it does look like Nvidia wants to make that distinction, but at the moment the Kepler cards have been performing just as well as (if not slightly better than) the previous generation (570, 580). The fact that they run cooler and use less power (plus I don't think there's much of a price difference) could be reason enough to get them at the moment. The general consensus is that the current Quadro cards (the Quadro 4000, for example) are not worth the money because they are outperformed by the GTX cards, certainly according to the Premiere Pro CS5 benchmark results. The only reason to get a Quadro at this juncture is if you have a 10-bit monitor and/or do a lot of 3D work.

  • If you head over to the Adobe forums and check out the Premiere Pro 5 & 6 benchmarks, most systems that score high have a GTX 570 or 580. The Nvidia Kepler series is also reported to be working with the hack in Premiere Pro, but only the GTX 680 supports ray tracing in After Effects after a CS6 update.

    The 4GB version of the GTX 670 is considered the best bang for the buck at the moment because it performs similarly to the GTX 580 but runs cooler and uses less power. Also, I believe the Kepler cards support up to 4 monitors, and 4GB of VRAM could come in handy for 4K RED footage and multiple large images in Photoshop, etc. Most people will probably think 4GB of VRAM is overkill, though.

  • It is also important to note that the Kepler GTX line has worse compute (double-precision) capabilities than previous generations.

    It seems that Nvidia wants to make a clearer distinction between the consumer line (GTX), the professional line (Quadro) and the computing line (Tesla) by crippling them in some respects... the problem is that the professional and computing lines come with "professional prices".

    http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17

    http://www.chw.net/2012/03/kepler-a-fondo-explorando-sus-capacidades-en-computo-gpgpu/

    http://www.chw.net/2012/04/gpus-nvidia-y-amd-probados-en-aplicaciones-gpgpu-cudadirectcomputeopencl/?cp=all

    I hope software developers start to use OpenCL more and more, for better competition between Nvidia and AMD, where we can only win.

  • As far as I understand, Adobe uses a fast scaler when the GPU isn't used. If you want the same scaler as with CUDA, you have to switch it on in the sequence settings: it's called "Maximum Render Quality".

    It then uses the high-quality Lanczos scaling algorithm, which is computationally heavy. Without a good CUDA card this will slow down rendering almost 10 times, but it's vital if you have text or graphics in your edit.
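
    As a tiny illustration of that cost difference (nothing to do with Premiere itself; this just uses Pillow's CPU resamplers as a stand-in), Lanczos is noticeably slower than a plain bilinear resize at the same frame size:

    ```python
    # Compare bilinear vs Lanczos downscaling times with Pillow (CPU only),
    # as a stand-in for the cheap vs high-quality scaler trade-off.
    import time
    from PIL import Image

    src = Image.new("RGB", (3840, 2160))  # blank stand-in for a 4K frame; use a real still to see the quality gap
    target = (1920, 1080)

    for name, resample in [("bilinear", Image.BILINEAR), ("lanczos", Image.LANCZOS)]:
        t0 = time.perf_counter()
        src.resize(target, resample)
        print(f"{name}: {time.perf_counter() - t0:.3f} s")
    ```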

    You can get a used GTX 580 for around 300€. If you're on a laptop, the brand-new GTX 680M has 3-4 times the speed of the GT 650M (which is in the new MacBook Pros) and is similar in performance to the GTX 580, which is incredible for a laptop.

    Be aware that there is a text file in the Adobe Premiere Pro, After Effects etc. folders named 'cuda supported cards' or something like that. Not every CUDA-capable card is supported yet; there's a workaround where you just type in your Nvidia CUDA card's name. It didn't work for my current one (GTX 460), but others have reported that it worked for their unsupported cards.
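
    If you want to try it, here's a minimal sketch; the path and filename below are only what they look like on a typical Windows CS6 install (check your own Adobe folder, back the file up first, and run with admin rights), and the card name is a hypothetical example that has to match exactly what the Nvidia driver reports:

    ```python
    # Sketch: append a card to Adobe's CUDA whitelist text file.
    # The path/filename are assumptions for a Windows CS6 install; adjust for your version.
    from pathlib import Path

    whitelist = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS6\cuda_supported_cards.txt")
    gpu_name = "GeForce GTX 670"  # hypothetical example; use the exact name your driver reports

    lines = whitelist.read_text(encoding="utf-8").splitlines()
    if gpu_name not in lines:
        whitelist.write_text("\n".join(lines + [gpu_name]) + "\n", encoding="utf-8")
        print(f"Added {gpu_name} to {whitelist}")
    else:
        print(f"{gpu_name} is already listed")
    ```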

    EDIT: Vitaliy, yes, it is sad; ATI/AMD cards have usually performed much better than their equivalent Nvidia cards over the past few years. Unfortunately, the video software industry settled on Nvidia.

  • @nomad

    Yep, it is documented in the second link.

    Adobe uses a crappy scaler, so when it starts using the hardware things get much better.

  • Btw, the situation is quite humorous today:

    [images]

    Yet many software packages have bad OpenCL support or do not have any.

  • FYI: I noticed that scaling in final output looks much better when using CUDA.