AMD Radeon Pro WX 7100M vs NVIDIA GeForce RTX 2070 Max-Q
Overview

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| Configuration | 2304 Shaders, 8GB GDDR5, 1243MHz | 2304 Shaders, 8GB GDDR6, 1185MHz |
| Peak AI Performance | 5.73 TFLOPS (FP16) | 174.74 TOPS (INT4 Tensor) |
| FP32 | 5.73 TFLOPS | 5.46 TFLOPS |
| FP16 | 5.73 TFLOPS | 10.92 TFLOPS |
| Form Factor | Soldered | Soldered |
| TDP | 130W | 80W |

Benchmarks

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| GB6 OpenCL | N/A | - |
| GB6 Metal | N/A | N/A |
| GB6 Vulkan | N/A | N/A |
| GB5 OpenCL | N/A | 73,970 (24%) |
| GB5 CUDA | N/A | 85,865 (24%) |
| GB5 Metal | N/A | N/A |
| GB5 Vulkan | N/A | 63,555 (31%) |
| OCT 2020.1 | N/A | 185 (24%) |
| OCT Metal | N/A | N/A |
Theoretical Performance

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| Peak AI Performance | 5.73 TFLOPS (FP16) | 174.74 TOPS (INT4 Tensor) |
| FP16 | 5.73 TFLOPS | 10.92 TFLOPS; 43.68 TFLOPS Tensor (FP16 Accumulate); 21.84 TFLOPS Tensor (FP32 Accumulate) |
| FP32 | 5.73 TFLOPS | 5.46 TFLOPS |
| FP64 | 360 GFLOPS | 170 GFLOPS |
| BF16 | - | 21.84 TFLOPS Tensor |
| TF32 | - | 21.84 TFLOPS Tensor |
| INT4 | - | 174.74 TOPS Tensor |
| INT8 | - | 87.37 TOPS Tensor |
| Ray Tracing | - | 16.5 TOPS |
| Pixel Fillrate | 39.776 GPixel/s | 75.84 GPixel/s |
| Texture Fillrate | 178.992 GTexel/s | 170.64 GTexel/s |
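The shader-rate figures above match the usual peak-throughput estimate of shaders × boost clock × 2 ops (one FMA per shader per clock), scaled by a per-format rate (FP16 at 1× the FP32 rate on GCN 4 and 2× on Turing; FP64 at 1/16 and 1/32 respectively), and the tensor figures match 288 Turing tensor cores at 64 FP16 FMAs per clock. The Python sketch below reproduces the table's numbers under those assumptions; it is illustrative arithmetic, not vendor-published data.

```python
# Peak throughput estimate: shaders * boost clock (GHz) * 2 ops (FMA) * per-format rate.
# Shader counts, clocks and tensor-core count are taken from the tables on this page;
# the per-format rate multipliers are the commonly cited ones for GCN 4 and Turing.

def shader_tflops(shaders: int, boost_ghz: float, rate: float = 1.0) -> float:
    """Peak TFLOPS = shaders x clock x 2 (FMA) x per-format rate."""
    return shaders * boost_ghz * 2 * rate / 1e3

# AMD Radeon Pro WX 7100M: 2304 shaders @ 1.243 GHz boost
wx7100m_fp32 = shader_tflops(2304, 1.243)             # ~5.73 TFLOPS
wx7100m_fp16 = shader_tflops(2304, 1.243, rate=1.0)   # GCN 4: FP16 at FP32 rate -> ~5.73 TFLOPS
wx7100m_fp64 = shader_tflops(2304, 1.243, rate=1/16)  # ~0.36 TFLOPS = 360 GFLOPS

# NVIDIA GeForce RTX 2070 Max-Q: 2304 shaders @ 1.185 GHz boost
rtx2070mq_fp32 = shader_tflops(2304, 1.185)             # ~5.46 TFLOPS
rtx2070mq_fp16 = shader_tflops(2304, 1.185, rate=2.0)   # Turing: FP16 at 2x FP32 -> ~10.92 TFLOPS
rtx2070mq_fp64 = shader_tflops(2304, 1.185, rate=1/32)  # ~0.17 TFLOPS = 170 GFLOPS

# Turing tensor cores: 64 FP16 FMAs per core per clock.
tensor_fp16     = 288 * 64 * 2 * 1.185 / 1e3  # ~43.68 TFLOPS (FP16 accumulate)
tensor_fp32_acc = tensor_fp16 / 2             # ~21.84 TFLOPS (FP32 accumulate, half rate)
tensor_int8     = tensor_fp16 * 2             # ~87.37 TOPS
tensor_int4     = tensor_fp16 * 4             # ~174.74 TOPS

print(round(wx7100m_fp32, 2), round(rtx2070mq_fp16, 2), round(tensor_int4, 2))
```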
Chip

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| Manufacturer | AMD | NVIDIA |
| Chip Designer | AMD | NVIDIA |
| Architecture | GCN 4 | Turing |
| Family | Radeon Pro WX | GeForce 20 |
| Codename | Ellesmere (Polaris 10), Variant Polaris 10 XTM GL | NV166 (TU106), Variant N18E-G2 |
| Market Segment | Laptop | Laptop |
| Release Date | 3/1/2017 | 1/29/2019 |
| Foundry | GlobalFoundries | TSMC |
| Fabrication Node | 14LPP | 12FFN |
| Die Size | 232 mm² | 445 mm² |
| Transistor Count | 5.7 Billion | 10.8 Billion |
| Transistor Density | 24.57M/mm² | 24.27M/mm² |
| Form | Soldered | Soldered |
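The transistor-density row is simply transistor count divided by die area; a quick sanity check of the listed figures:

```python
# Transistor density = transistor count / die area (values from the table above).
wx7100m_density   = 5.7e9 / 232    # ~24.57 million transistors per mm^2
rtx2070mq_density = 10.8e9 / 445   # ~24.27 million transistors per mm^2
print(round(wx7100m_density / 1e6, 2), round(rtx2070mq_density / 1e6, 2))
```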
Cores & Clocks

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| Shading Units | 2304 Shaders | 2304 Shaders |
| Texture Mapping Units | 144 TMUs | 144 TMUs |
| Render Output Units | 32 ROPs | 64 ROPs |
| Tensor Cores | - | 288 T-Cores |
| Ray-Tracing Cores | - | 36 RT-Cores |
| Streaming Multiprocessors | - | 36 SMs |
| Compute Units | 36 CUs | - |
| Graphics Processing Clusters | - | 3 GPCs |
| Clock | 1188MHz Base, 1243MHz Boost | 885MHz Base, 1185MHz Boost |
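The fillrates in the performance table follow from these unit counts: pixel fillrate is ROPs × boost clock and texture fillrate is TMUs × boost clock. A minimal sketch of that arithmetic, using the counts and clocks listed above:

```python
# Fillrates = unit count * boost clock (GHz); values from the tables above.
def gpixels(rops: int, boost_ghz: float) -> float:
    return rops * boost_ghz  # GPixel/s

def gtexels(tmus: int, boost_ghz: float) -> float:
    return tmus * boost_ghz  # GTexel/s

print(gpixels(32, 1.243), gtexels(144, 1.243))  # WX 7100M: 39.776 GPixel/s, 178.992 GTexel/s
print(gpixels(64, 1.185), gtexels(144, 1.185))  # RTX 2070 Max-Q: 75.84 GPixel/s, 170.64 GTexel/s
```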
Cache

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| L1 Cache | 16KB/CU | 32KB/SM (L1), 64KB/SM (Texture) |
| L2 Cache | 2MB Shared | 4MB Shared |
Memory

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| Memory | 8GB GDDR5 | 8GB GDDR6 |
| Bus Width | 256-bit | 256-bit |
| Memory Clock | 1250MHz | 1500MHz |
| Transfer Rate | 5GT/s | 12GT/s |
| Bandwidth | 160GB/s | 384GB/s |
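The bandwidth figures follow from bus width × per-pin transfer rate ÷ 8 (bits to bytes); a quick check against the listed 160GB/s and 384GB/s:

```python
# Memory bandwidth = bus width (bits) * transfer rate (GT/s) / 8 bits per byte.
def bandwidth_gbs(bus_bits: int, gt_per_s: float) -> float:
    return bus_bits * gt_per_s / 8

print(bandwidth_gbs(256, 5))   # WX 7100M (GDDR5): 160.0 GB/s
print(bandwidth_gbs(256, 12))  # RTX 2070 Max-Q (GDDR6): 384.0 GB/s
```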
Power & Board

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| TDP | 130W | 80W |
| Ports | No Ports | No Ports |
Display Outputs

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| Max Resolution | 5120x2880 | 7680x4320 |
| Max Resolution Refresh Rate | 60Hz | 60Hz |
| Variable Refresh Rate | FreeSync | G-Sync, FreeSync |
| Display Stream Compression (DSC) | Not Supported | Supported |
| Multi Monitor Support | 3 | 3 |
| Content Protection | HDCP 2.2 | HDCP 2.2 |
Encode & Decode

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| Encoder | VCE 3.0 | NVENC 7 |
| Encode Codecs | AVC (H.264), HEVC (H.265) | AVC (H.264), HEVC (H.265) |
| Decoder | UVD 6.3 | NVDEC 4 |
| Decode Codecs | MPEG-1, MPEG-2, MPEG-4, JPEG, VC-1, AVC (H.264), HEVC (H.265) | MPEG-1, MPEG-2, MPEG-4, VC-1, VP8, VP9, AVC (H.264), HEVC (H.265) |
API Support

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| DirectX | 12 | 12 |
| Direct3D Feature Level | 12_0 | 12_2 |
| OpenGL | 4.6 | 4.6 |
| OpenCL | 2.1 | 3.0 |
| Vulkan | 1.3 | 1.2 |
| Shader Model | 6.7 | 6.6 |
| Other | GFX 8 | CUDA 7.5, PureVideo HD VP10, VDPAU Feature Set J |
Interface

| | AMD Radeon Pro WX 7100M | NVIDIA GeForce RTX 2070 Max-Q |
| --- | --- | --- |
| Board | Not a Card | Not a Card |
| PCIe Version | 3.0 | 3.0 |
| PCIe Lanes | 16 | 16 |