AMD Radeon Pro W5700 7nm GPU Takes On NVIDIA In Workstations
AMD is anything but timid when it comes to its ambitions to conquer the computing market. Not satisfied with challenging CPU giant Intel, AMD acquired ATI to take on the GPU Goliath that is NVIDIA. After shaking up the CPU market with its Zen architecture, it is doing the same in graphics with its RDNA architecture and Navi GPUs, with moderate success so far. It is now bringing that momentum from the consumer graphics card market to the pro industry with the new Radeon Pro W5700, daring to encroach on a market long dominated by NVIDIA.
Those following the PC and GPU market closely will recognize the familiar "5700" number from AMD's Radeon RX 5700. That's no coincidence: the two share the same Radeon DNA, quite literally since AMD markets it as "RDNA", as well as the 7nm FinFET process used by its Navi GPUs. In other words, it is AMD's workstation version of the RX 5700, but it's more than just a fresh coat of paint on top.
In addition to improvements in performance-per-watt and support for GDDR6 memory, the Radeon Pro W5700 tacks on features geared towards the use cases and workflows of professional users. That includes PCIe 4.0 support, which AMD boasts is an industry first for a workstation graphics card, a high-speed USB-C port, and support for wireless VR headsets. On the spec sheet, the graphics card boasts 36 compute units and 8 GB of GDDR6 memory.
While those numbers might make PC gamers drool, AMD is more interested in getting the Radeon Pro W5700 into workstations. Specifically, it is aiming its guns at 3D designers, architects, and engineers whose work now demands real-time visual feedback rather than waiting minutes or even hours for renders to finish. And it has no qualms about comparing the graphics card to NVIDIA's Quadro RTX 4000, its closest rival.
As a pro piece of equipment, the AMD Radeon Pro W5700 unsurprisingly also carries a pro-level price tag. Pricing starts at $799, putting it firmly in enterprise territory. Whether those customers will bite is another question, one that will depend on the card's real-world performance against NVIDIA's equivalent.