NVIDIA GeForce RTX 2060 Max-Q vs Radeon Pro 5500M 8GB

| NVIDIA GeForce RTX 2060 Max-Q | Radeon Pro 5500M 8GB |
    | 1920 Shaders 6GB GDDR6 1175MHz | 1536 Shaders 8GB GDDR6 1300MHz | 
| Peak AI Performance 144.38 TOPS INT4 Tensor | Peak AI Performance 7.99 TFLOPS FP16 | 
| FP32 4.51 TFLOPS | FP32 3.99 TFLOPS | 
| FP16 9.02 TFLOPS | FP16 7.99 TFLOPS | 
| Form Factor Soldered - | Form Factor Soldered - | 
| TDP 65W | TDP 50W | 
| GB6 OpenCL N/A 0% | GB6 OpenCL N/A 0% |
| GB6 Metal N/A 0% | GB6 Metal N/A 0% | 
| GB6 Vulkan N/A 0% | GB6 Vulkan N/A 0% | 
| GB5 OpenCL 61,230 20% | GB5 OpenCL N/A 0% | 
| GB5 CUDA 66,700 19% | GB5 CUDA N/A 0% | 
| GB5 Metal N/A 0% | GB5 Metal N/A 0% | 
| GB5 Vulkan 54,345 26% | GB5 Vulkan N/A 0% | 
| OCT 2020.1 165 21% | OCT 2020.1 N/A 0% | 
| OCT Metal N/A 0% | OCT Metal N/A 0% | 
| Peak AI Performance 144.38 TOPS INT4 Tensor | Peak AI Performance 7.99 TFLOPS FP16 |
| FP16 9.02 TFLOPS, 36.1 TFLOPS Tensor (FP16 Accumulate), 18.05 TFLOPS Tensor (FP32 Accumulate) | FP16 7.99 TFLOPS |
| FP32 4.51 TFLOPS | FP32 3.99 TFLOPS |
| FP64 140 GFLOPS | FP64 250 GFLOPS |
| BF16 18.05 TFLOPS Tensor | - |
| TF32 18.05 TFLOPS Tensor | - |
| INT4 144.38 TOPS Tensor | - |
| INT8 72.19 TOPS Tensor | - |
| Ray Tracing 13.6 TOPS | - |
| Pixel Fillrate 56.4 GPixel/s | Pixel Fillrate 41.6 GPixel/s |
| Texture Fillrate 141 GTexel/s | Texture Fillrate 124.8 GTexel/s |
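The shader-throughput and fillrate figures in this table can be reproduced from the unit counts and boost clocks listed in the hardware section below. A minimal sketch of the usual back-of-the-envelope estimate (assuming 2 FLOPS per shader per clock via fused multiply-add, FP16 at twice the FP32 rate, and one texel per TMU and one pixel per ROP per clock; `peak_rates` is an illustrative helper, not part of any vendor tool):

```python
# Theoretical peak-rate estimates from unit counts and boost clock.
# Assumes: 2 FLOPS per shader per clock (FMA), FP16 packed at 2x FP32,
# 1 texel per TMU per clock, 1 pixel per ROP per clock.

def peak_rates(shaders, tmus, rops, boost_mhz):
    ghz = boost_mhz / 1000
    fp32_tflops = shaders * 2 * ghz / 1000  # GFLOPS -> TFLOPS
    return {
        "FP32 TFLOPS": round(fp32_tflops, 2),
        "FP16 TFLOPS": round(fp32_tflops * 2, 2),
        "Texture GTexel/s": round(tmus * ghz, 1),
        "Pixel GPixel/s": round(rops * ghz, 1),
    }

print(peak_rates(1920, 120, 48, 1175))  # RTX 2060 Max-Q  -> 4.51 / 9.02 / 141.0 / 56.4
print(peak_rates(1536, 96, 32, 1300))   # Radeon Pro 5500M -> 3.99 / 7.99 / 124.8 / 41.6
```

The Tensor and ray-tracing figures do not follow from this simple model; they depend on the dedicated Tensor and RT cores listed below.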
| Manufacturer NVIDIA | Manufacturer Apple |
| Chip Designer NVIDIA | Chip Designer AMD |
| Architecture Turing | Architecture RDNA 1 |
| Family GeForce 20 | Family Radeon Pro |
| Codename NV166 TU106 Variant N18E-G1 | Codename Fighter Navi 14 Variant Navi 14 XTA | 
| Market Segment Laptop | Market Segment Laptop | 
| Release Date 1/29/2019 | Release Date 11/13/2019 | 
| Foundry TSMC - | Foundry TSMC - | 
| Fabrication Node 12FFN - | Fabrication Node N7 - | 
| Die Size 445 mm² - | Die Size 158 mm² - | 
| Transistor Count 10.8 Billion - | Transistor Count 6.4 Billion - | 
| Transistor Density 24.27M/mm² - | Transistor Density 40.51M/mm² - | 
| Form Soldered | Form Soldered | 
| Shading Units 1920 Shaders - | Shading Units 1536 Shaders - | 
| Texture Mapping Units 120 TMUs | Texture Mapping Units 96 TMUs | 
| Render Output Units 48 ROPs | Render Output Units 32 ROPs | 
| Tensor Cores 240 T-Cores | - - | 
| Ray-Tracing Cores 30 RT-Cores | - - | 
| Streaming Multiprocessors 30 SMs | - - | 
| - - | Compute Units 24 CUs | 
| Graphics Processing Clusters 3 GPCs | - - | 
| Base Clock 975MHz Boost Clock 1175MHz | Base Clock 1000MHz Boost Clock 1300MHz |
| - - | L0 32KB/WGP | 
| L1 32KB/SM Tex 64KB/SM | L1 128KB/Array |
| L2 3MB Shared | L2 2MB Shared | 
| 6GB GDDR6 - | 8GB GDDR6 - | 
| Bus Width 192Bit | Bus Width 128Bit | 
| Clock 1375MHz Transfer Rate 11GT/s Bandwidth 264GB/s | Clock 1500MHz Transfer Rate 12GT/s Bandwidth 192GB/s | 
| TDP 65W | TDP 50W | 
| No Ports | No Ports |
| Max Resolution 7680x4320 | Max Resolution 7680x4320 | 
| Max Resolution Refresh Rate 60Hz | Max Resolution Refresh Rate 120Hz | 
| Variable Refresh Rate G-Sync FreeSync - | Variable Refresh Rate - FreeSync - | 
| Display Stream Compression (DSC) Supported | Display Stream Compression (DSC) Supported | 
| Multi Monitor Support 3 | Multi Monitor Support 4 | 
| Content Protection HDCP 2.2 | Content Protection HDCP 2.3 | 
| Model NVENC 7 | Model VCN 2.0 | 
| Codec AVC (H.264), HEVC (H.265) | Codec AVC (H.264), HEVC (H.265) |
| Model NVDEC 4 | Model VCN 2.0 | 
| Codec MPEG-1, MPEG-2, MPEG-4, VC-1, VP8, VP9, AVC (H.264), HEVC (H.265) | Codec MPEG-1, MPEG-2, MPEG-4, JPEG, VC-1, VP9, AVC (H.264), HEVC (H.265) |
| Direct X 12 Direct 3D 12_2 | Direct X 12 Direct 3D 12_1 | 
| OpenGL 4.6 OpenCL 3.0 Vulkan 1.2 | OpenGL 4.6 OpenCL 2.1 Vulkan 1.2 | 
| Shader Model 6.6, CUDA 7.5, PureVideo HD VP10, VDPAU Feature Set J | Shader Model 6.5, GFX 10.1 |
| Not a Card | Not a Card |
| PCIe Version 3.0 PCIe Lanes 16 | PCIe Version 4.0 PCIe Lanes 8 |
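Two other derived figures in the table above, memory bandwidth and transistor density, follow directly from the raw values listed for each chip. A minimal sketch reproducing them (the helper names are illustrative):

```python
# Peak memory bandwidth = effective transfer rate (GT/s) * bus width (bits) / 8 bits per byte.
def bandwidth_gb_s(transfer_gt_s, bus_bits):
    return transfer_gt_s * bus_bits / 8

# Transistor density = transistor count / die area, in millions of transistors per mm².
def density_m_per_mm2(transistors_billion, die_mm2):
    return round(transistors_billion * 1000 / die_mm2, 2)

print(bandwidth_gb_s(11, 192))       # RTX 2060 Max-Q: 192-bit GDDR6 @ 11 GT/s  -> 264.0 GB/s
print(bandwidth_gb_s(12, 128))       # Radeon Pro 5500M: 128-bit GDDR6 @ 12 GT/s -> 192.0 GB/s
print(density_m_per_mm2(10.8, 445))  # TU106: 10.8B transistors / 445 mm²  -> 24.27 M/mm²
print(density_m_per_mm2(6.4, 158))   # Navi 14: 6.4B transistors / 158 mm² -> 40.51 M/mm²
```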