AMD Radeon RX 7900 GRE vs AMD Radeon RX 7700 XT
| | AMD Radeon RX 7900 GRE | AMD Radeon RX 7700 XT |
| --- | --- | --- |
| Price | $549 | $449 |
| Shaders / Memory / Clock | 5120 Shaders, 16GB GDDR6, 2245MHz | 3456 Shaders, 12GB GDDR6, 2544MHz |
| Peak AI Performance | 183.91 TOPS (INT4 Tensor) | 140.67 TOPS (INT4 Tensor) |
| FP32 | 45.98 TFLOPS | 35.17 TFLOPS |
| FP16 | 91.96 TFLOPS | 70.34 TFLOPS |
| Form Factor | PCIe Card, 2.5-Slot | PCIe Card, 2.5-Slot |
| TDP | 260W | 245W |
| Power Connectors | 2x 8-Pin | 2x 8-Pin |
Benchmark scores (Geekbench 6 OpenCL/Metal/Vulkan, Geekbench 5 OpenCL/CUDA/Metal/Vulkan, OctaneBench 2020.1/Metal): N/A for both cards.
| | AMD Radeon RX 7900 GRE | AMD Radeon RX 7700 XT |
| --- | --- | --- |
| Peak AI Performance | 183.91 TOPS (INT4 Tensor) | 140.67 TOPS (INT4 Tensor) |
| FP16 | 91.96 TFLOPS | 70.34 TFLOPS |
| FP16 Tensor (FP16 Accumulate) | 45.98 TFLOPS | 35.17 TFLOPS |
| FP16 Tensor (FP32 Accumulate) | 45.98 TFLOPS | 35.17 TFLOPS |
| FP32 | 45.98 TFLOPS | 35.17 TFLOPS |
| FP64 | 1.44 TFLOPS | 1.1 TFLOPS |
| BF16 | 91.96 TFLOPS | 70.34 TFLOPS |
| BF16 Tensor | 45.98 TFLOPS | 35.17 TFLOPS |
| INT4 Tensor | 183.91 TOPS | 140.67 TOPS |
| INT8 Tensor | 45.98 TOPS | 35.17 TOPS |
| INT32 | 22.99 TOPS | 17.58 TOPS |
| Pixel Fillrate | 359.2 GPixel/s | 244.224 GPixel/s |
| Texture Fillrate | 718.4 GTexel/s | 549.504 GTexel/s |
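The throughput and fillrate figures above can be reproduced from the shader/TMU/ROP counts and the boost clocks listed in the specification table below. A minimal sketch of that arithmetic, assuming the per-clock rates implied by the table (dual-issue FP32 on RDNA 3, i.e. 4 FLOPs per shader per clock, twice that for FP16/BF16, one texel per TMU and one pixel per ROP per clock); the helper name is illustrative, not from any vendor API:

```python
# Back-of-the-envelope check of the headline numbers (illustrative helper, assumed rates).
def peak_metrics(shaders: int, tmus: int, rops: int, boost_mhz: int) -> dict:
    ghz = boost_mhz / 1000
    return {
        "FP32 TFLOPS": round(shaders * 4 * ghz / 1000, 2),   # dual-issue FP32
        "FP16 TFLOPS": round(shaders * 8 * ghz / 1000, 2),   # 2x FP32 rate
        "INT32 TOPS": round(shaders * 2 * ghz / 1000, 2),
        "Texture GTexel/s": round(tmus * ghz, 3),
        "Pixel GPixel/s": round(rops * ghz, 3),
    }

print(peak_metrics(5120, 320, 160, 2245))  # RX 7900 GRE: 45.98 / 91.96 / 22.99 / 718.4 / 359.2
print(peak_metrics(3456, 216, 96, 2544))   # RX 7700 XT:  35.17 / 70.34 / 17.58 / 549.504 / 244.224
```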
| | AMD Radeon RX 7900 GRE | AMD Radeon RX 7700 XT |
| --- | --- | --- |
| Manufacturer | AMD | AMD |
| Chip Designer | AMD | AMD |
| Architecture | RDNA 3 | RDNA 3 |
| Family | Radeon RX 7000 | Radeon RX 7000 |
| Codename / Variant | Plum Bonito (Navi 31), Navi 31 XL | Wheat Nas (Navi 32), Navi 32 |
| Market Segment | Desktop | Desktop |
| Release Date | 7/27/2023 | 8/25/2023 |
| Foundry | TSMC (GCD), TSMC (Memory Cache Die) | TSMC (GCD), TSMC (Memory Cache Die) |
| Fabrication Node | N5 (GCD), N6 (Memory Cache Die) | N5 (GCD), N6 (Memory Cache Die) |
| Die Size | 304 mm² (GCD), 4x 38 mm² (Memory Cache Die) | 200 mm² (GCD), 3x 38 mm² (Memory Cache Die) |
| Transistor Count | 45.4 Billion (GCD), 4x 2.1 Billion (Memory Cache Die) | 28.1 Billion (GCD), 3x 2.1 Billion (Memory Cache Die) |
| Transistor Density | 149.17M/mm² (GCD), 54.64M/mm² (Memory Cache Die) | 140.50M/mm² (GCD), 54.64M/mm² (Memory Cache Die) |
| Form | PCIe Card | PCIe Card |
| Shading Units | 5120 Shaders | 3456 Shaders |
| Texture Mapping Units | 320 TMUs | 216 TMUs |
| Render Output Units | 160 ROPs | 96 ROPs |
| Tensor Cores | 160 T-Cores | 108 T-Cores |
| Ray-Tracing Cores | 80 RT-Cores | 54 RT-Cores |
| Compute Units | 80 CUs | 54 CUs |
| Clock | 1880MHz Base, 2245MHz Boost | 1900MHz Base, 2544MHz Boost |
| L0 Cache | 64KB/WGP | 64KB/WGP |
| L1 Cache | 256KB/Array | 256KB/Array |
| L2 Cache | 6MB Shared | 4MB Shared |
| L3 Cache | 64MB Shared, 2.25TB/s | 48MB Shared, 1.69TB/s |
| Memory | 16GB GDDR6 | 12GB GDDR6 |
| Bus Width | 256Bit | 192Bit |
| Memory Clock / Transfer Rate / Bandwidth | 2250MHz, 18GT/s, 576GB/s | 2250MHz, 18GT/s, 432GB/s |
| TDP | 260W | 245W |
| Display Outputs | 2x DisplayPort 2.1, 1x HDMI 2.1, 1x USB-C + DP | 3x DisplayPort 2.1, 1x HDMI 2.1 |
| Max Resolution | 15360x8640 | 15360x8640 |
| Max Resolution Refresh Rate | 165Hz | 165Hz |
| Variable Refresh Rate | FreeSync | FreeSync |
| Display Stream Compression (DSC) | Supported | Supported |
| Multi Monitor Support | 3 | 3 |
| Content Protection | HDCP 2.3 | HDCP 2.3 |
| Video Encoding | VCN 4.0: VP9, AVC (H.264), HEVC (H.265), AV1 | VCN 4.0: VP9, AVC (H.264), HEVC (H.265), AV1 |
| Video Decoding | VCN 4.0: MPEG-1, MPEG-2, MPEG-4, JPEG, VC-1, VP9, AVC (H.264), HEVC (H.265), AV1 | VCN 4.0: MPEG-1, MPEG-2, MPEG-4, JPEG, VC-1, VP9, AVC (H.264), HEVC (H.265), AV1 |
| DirectX / Direct3D Feature Level | DirectX 12, 12_2 | DirectX 12, 12_2 |
| OpenGL / OpenCL / Vulkan | OpenGL 4.6, OpenCL 2.2, Vulkan 1.3 | OpenGL 4.6, OpenCL 2.2, Vulkan 1.3 |
| Shader Model / ISA | Shader Model 6.7, GFX 11 | Shader Model 6.7, GFX 11 |
| Cooling | 3x Fans | 2x Fans |
| Power Connectors | 2x 8-Pin | 2x 8-Pin |
| Slots Required / PCIe Version / PCIe Lanes | 2.5, PCIe 4.0, x16 | 2.5, PCIe 4.0, x16 |
| Multi GPU Support | Supported (CrossFire XDMA) | Supported (CrossFire XDMA) |
| Dimensions (H x W x D) | 110 mm (4.33 in) x 276 mm (10.87 in) x 51 mm (2.01 in) | 111 mm (4.37 in) x 267 mm (10.51 in) x 50 mm (1.97 in) |
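The memory bandwidth figures likewise follow from bus width and effective transfer rate. A quick sketch of that check (illustrative helper name, not a vendor API):

```python
# Bandwidth (GB/s) = bus width in bytes x effective transfer rate (GT/s).
def mem_bandwidth_gb_s(bus_width_bits: int, transfer_rate_gts: float) -> float:
    return bus_width_bits / 8 * transfer_rate_gts

print(mem_bandwidth_gb_s(256, 18))  # RX 7900 GRE: 576.0 GB/s
print(mem_bandwidth_gb_s(192, 18))  # RX 7700 XT: 432.0 GB/s
```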