NVIDIA GeForce RTX 2080 Max-Q vs NVIDIA GeForce GTX 1070 Max-Q
| Overview | GeForce RTX 2080 Max-Q | GeForce GTX 1070 Max-Q |
| --- | --- | --- |
| Configuration | 2944 Shaders, 8GB GDDR6, 1095MHz | 2048 Shaders, 8GB GDDR5, 1379MHz |
| Peak AI Performance | 206.32 TOPS (INT4 Tensor) | 22.59 TOPS (INT8) |
| FP32 | 6.45 TFLOPS | 5.65 TFLOPS |
| FP16 | 12.9 TFLOPS | 90 GFLOPS |
| Form Factor | Soldered | Soldered |
| TDP | 80W | Unknown |
| Geekbench 6 Metal | N/A (0%) | N/A (0%) |
| Geekbench 5 OpenCL | 87,315 (29%) | N/A (0%) |
| Geekbench 5 CUDA | 105,835 (30%) | N/A (0%) |
| Geekbench 5 Metal | N/A (0%) | N/A (0%) |
| Geekbench 5 Vulkan | 65,140 (32%) | N/A (0%) |
| OctaneBench 2020.1 | 200 (26%) | N/A (0%) |
| OctaneBench Metal | N/A (0%) | N/A (0%) |
| Compute Performance | GeForce RTX 2080 Max-Q | GeForce GTX 1070 Max-Q |
| --- | --- | --- |
| Peak AI Performance | 206.32 TOPS (INT4 Tensor) | 22.59 TOPS (INT8) |
| FP16 | 12.9 TFLOPS | 90 GFLOPS |
| FP16 Tensor (FP16 Accumulate) | 51.58 TFLOPS | - |
| FP16 Tensor (FP32 Accumulate) | 25.79 TFLOPS | - |
| FP32 | 6.45 TFLOPS | 5.65 TFLOPS |
| FP64 | 200 GFLOPS | 180 GFLOPS |
| BF16 Tensor | 25.79 TFLOPS | - |
| TF32 Tensor | 25.79 TFLOPS | - |
| INT4 Tensor | 206.32 TOPS | - |
| INT8 | - | 22.59 TOPS |
| INT8 Tensor | 103.16 TOPS | - |
| Ray Tracing | 19.4 TOPS | - |
| Pixel Fillrate | 70.08 GPixel/s | 88.256 GPixel/s |
| Texture Fillrate | 201.48 GTexel/s | 176.512 GTexel/s |
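The raw FP32 and fillrate figures above can be reproduced from the shader/TMU/ROP counts and boost clocks listed in the hardware table below, using the usual peak-throughput conventions: one FMA (two FLOPs) per shader per cycle, one texel per TMU per cycle, and one pixel per ROP per cycle. A minimal sketch of that arithmetic follows; the function and variable names are illustrative, not part of the source data:

```python
def peak_throughput(shaders, tmus, rops, boost_mhz):
    """Theoretical peak throughput from unit counts and boost clock.

    Assumes 2 FLOPs (one FMA) per shader per cycle, 1 texel per TMU per
    cycle, and 1 pixel per ROP per cycle.
    """
    ghz = boost_mhz / 1000.0
    return {
        "fp32_tflops": round(2 * shaders * ghz / 1000.0, 2),
        "texture_gtexel_s": round(tmus * ghz, 2),
        "pixel_gpixel_s": round(rops * ghz, 2),
    }

# GeForce RTX 2080 Max-Q: 2944 shaders, 184 TMUs, 64 ROPs, 1095 MHz boost
print(peak_throughput(2944, 184, 64, 1095))
# {'fp32_tflops': 6.45, 'texture_gtexel_s': 201.48, 'pixel_gpixel_s': 70.08}

# GeForce GTX 1070 Max-Q: 2048 shaders, 128 TMUs, 64 ROPs, 1379 MHz boost
print(peak_throughput(2048, 128, 64, 1379))
# {'fp32_tflops': 5.65, 'texture_gtexel_s': 176.51, 'pixel_gpixel_s': 88.26}
```

The FP16 rows follow the same counts scaled by each architecture's FP16 ratio: Turing runs FP16 at twice the FP32 rate (12.9 TFLOPS), while consumer Pascal runs FP16 at roughly 1/64 of the FP32 rate, which is where the ~90 GFLOPS figure comes from.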
| Hardware | GeForce RTX 2080 Max-Q | GeForce GTX 1070 Max-Q |
| --- | --- | --- |
| Manufacturer | NVIDIA | NVIDIA |
| Chip Designer | NVIDIA | NVIDIA |
| Architecture | Turing | Pascal |
| Family | GeForce 20 | GeForce 10 |
| Codename | NV164 (TU104), Variant N18E-G3-A1 | NV134 (GP104) |
| Market Segment | Laptop | Laptop |
| Release Date | 1/29/2019 | 5/1/2017 |
| Foundry | TSMC | TSMC |
| Fabrication Node | 12FFN | 16FF |
| Die Size | 545 mm² | 314 mm² |
| Transistor Count | 13.6 Billion | 7.2 Billion |
| Transistor Density | 24.95M/mm² | 22.93M/mm² |
| Form | Soldered | Soldered |
| Shading Units | 2944 Shaders | 2048 Shaders |
| Texture Mapping Units | 184 TMUs | 128 TMUs |
| Render Output Units | 64 ROPs | 64 ROPs |
| Tensor Cores | 368 T-Cores | - |
| Ray-Tracing Cores | 46 RT-Cores | - |
| Streaming Multiprocessors | 46 SMs | 16 SMs |
| Graphics Processing Clusters | 6 GPCs | - |
| Clock Speeds | 735MHz Base, 1095MHz Boost | 1101MHz Base, 1379MHz Boost |
| L1 Cache | 32KB/SM + 64KB/SM Texture | 48KB/SM Texture |
| L2 Cache | 4MB Shared | 2MB Shared |
| Memory | 8GB GDDR6 | 8GB GDDR5 |
| Bus Width | 256Bit | 256Bit |
| Memory Clock | 1500MHz | 2000MHz |
| Transfer Rate | 12GT/s | 8GT/s |
| Bandwidth | 384GB/s | 256GB/s |
| TDP | 80W | Unknown |
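The bandwidth and transistor-density rows likewise follow from the figures above: peak bandwidth is the bus width in bytes times the effective transfer rate, and density is transistor count divided by die area. A short sketch, with illustrative function names:

```python
def memory_bandwidth_gb_s(bus_width_bits, transfer_rate_gt_s):
    """Peak bandwidth = bus width in bytes x effective transfer rate."""
    return (bus_width_bits // 8) * transfer_rate_gt_s

def transistor_density_m_per_mm2(transistors_billion, die_size_mm2):
    """Density in millions of transistors per square millimetre."""
    return round(transistors_billion * 1000 / die_size_mm2, 2)

# RTX 2080 Max-Q: 256-bit GDDR6 at 12 GT/s; TU104 = 13.6B transistors / 545 mm^2
print(memory_bandwidth_gb_s(256, 12))           # 384 GB/s
print(transistor_density_m_per_mm2(13.6, 545))  # 24.95 M/mm^2

# GTX 1070 Max-Q: 256-bit GDDR5 at 8 GT/s; GP104 = 7.2B transistors / 314 mm^2
print(memory_bandwidth_gb_s(256, 8))            # 256 GB/s
print(transistor_density_m_per_mm2(7.2, 314))   # 22.93 M/mm^2
```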
| Display, Codecs & APIs | GeForce RTX 2080 Max-Q | GeForce GTX 1070 Max-Q |
| --- | --- | --- |
| Ports | No Ports | No Ports |
| Max Resolution | 7680x4320 | 7680x4320 |
| Max Resolution Refresh Rate | 60Hz | 30Hz |
| Variable Refresh Rate | G-Sync, FreeSync | G-Sync, FreeSync |
| Display Stream Compression (DSC) | Supported | Not Supported |
| Multi Monitor Support | 3 | 3 |
| Content Protection | HDCP 2.2 | - |
| Video Encoder | NVENC 7 | 2x NVENC 4 |
| Encode Codecs | AVC (H.264), HEVC (H.265) | AVC (H.264), HEVC (H.265) |
| Video Decoder | NVDEC 4 | NVDEC 3 |
| Decode Codecs | MPEG-1, MPEG-2, MPEG-4, VC-1, VP8, VP9, AVC (H.264), HEVC (H.265) | MPEG-1, MPEG-2, MPEG-4, VC-1, VP9, AVC (H.264), HEVC (H.265) |
| DirectX | 12 (Direct3D 12_2) | 12 (Direct3D 12_1) |
| OpenGL | 4.6 | 4.6 |
| OpenCL | 3.0 | 3.0 |
| Vulkan | 1.2 | 1.3 |
| Shader Model | 6.6 | 6.7 |
| CUDA (Compute Capability) | 7.5 | 6.1 |
| PureVideo HD | VP10 (VDPAU Feature Set J) | VP8 (VDPAU Feature Set H) |
| Board Design | Not a Card | Not a Card |
| Interface | PCIe 3.0 x16 | PCIe 3.0 x16 |
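The CUDA rows refer to compute capability (7.5 for TU104, 6.1 for GP104). If PyTorch happens to be installed on the laptop in question (an assumption, not a requirement of this comparison), the value can be confirmed at runtime:

```python
import torch

if torch.cuda.is_available():
    # Query the first CUDA device's compute capability and name.
    major, minor = torch.cuda.get_device_capability(0)
    print(torch.cuda.get_device_name(0), f"compute capability {major}.{minor}")
    # RTX 2080 Max-Q (Turing) -> compute capability 7.5
    # GTX 1070 Max-Q (Pascal) -> compute capability 6.1
```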