Intel Arc A380 vs GIGABYTE Radeon RX 6500 XT Gaming OC

| Intel Arc A380 | GIGABYTE Radeon RX 6500 XT Gaming OC | 
| $149 | $170 | 
| 1024 Shaders, 6GB GDDR6, 2450MHz Boost | 1024 Shaders, 4GB GDDR6, 2825MHz Boost | 
| Peak AI Performance 160.56 TOPS (INT4 Tensor) | Peak AI Performance 11.57 TFLOPS (FP16) | 
| FP32 5.02 TFLOPS | FP32 5.79 TFLOPS | 
| FP16 10.04 TFLOPS | FP16 11.57 TFLOPS | 
| Form Factor PCIe Card, 2.1 Slots | Form Factor PCIe Card, 2.0 Slots | 
| TDP 75W | TDP 107W | 
| Power Connectors - | Power Connectors 1x 6-Pin | 
| Geekbench 5 OpenCL 21,840 (7%) | Geekbench 5 OpenCL 53,340 (17%) | 
| Geekbench 5 Vulkan N/A | Geekbench 5 Vulkan 40,335 (20%) | 
| OCT Metal 75 (12%) | OCT Metal 75 (13%) | 
| Peak AI Performance 160.56 TOPS (INT4 Tensor) | Peak AI Performance 11.57 TFLOPS (FP16) | 
| FP16 10.04 TFLOPS, 40.14 TFLOPS Tensor (FP32 Accumulate) | FP16 11.57 TFLOPS | 
| FP32 5.02 TFLOPS | FP32 5.79 TFLOPS | 
| FP64 1.25 TFLOPS | FP64 360 GFLOPS | 
| BF16 40.14 TFLOPS (Tensor) | BF16 - | 
| INT4 160.56 TOPS (Tensor) | INT4 - | 
| INT8 80.28 TOPS (Tensor) | INT8 - | 
| Pixel Fillrate 78.4 GPixel/s | Pixel Fillrate 90.4 GPixel/s | 
| Texture Fillrate 156.8 GTexel/s | Texture Fillrate 180.8 GTexel/s | 
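The shader-based throughput and fillrate figures above follow directly from the unit counts and boost clocks listed in the specifications below. A minimal sketch of that arithmetic, assuming one FMA (2 FP32 ops) per shader per clock, packed FP16 at twice the FP32 rate, and one pixel/texel per ROP/TMU per clock:

```python
def gpu_theoretical_rates(shaders, rops, tmus, boost_mhz):
    """Rough theoretical throughput from unit counts and boost clock."""
    clock_ghz = boost_mhz / 1000
    return {
        "fp32_tflops": shaders * 2 * clock_ghz / 1000,  # 2 FP32 ops (FMA) per shader per clock
        "fp16_tflops": shaders * 4 * clock_ghz / 1000,  # packed FP16 at 2x the FP32 rate on both cards
        "pixel_gpix_s": rops * clock_ghz,               # 1 pixel per ROP per clock
        "texel_gtex_s": tmus * clock_ghz,               # 1 texel per TMU per clock
    }

print(gpu_theoretical_rates(1024, 32, 64, 2450))  # Arc A380:   ~5.02 / 10.04 TFLOPS, 78.4 GPixel/s, 156.8 GTexel/s
print(gpu_theoretical_rates(1024, 32, 64, 2825))  # RX 6500 XT: ~5.79 / 11.57 TFLOPS, 90.4 GPixel/s, 180.8 GTexel/s
```

The Tensor-labelled figures (40.14 TFLOPS FP16, 80.28 TOPS INT8, 160.56 TOPS INT4 on the A380) come from the card's 128 XMX tensor units rather than the shaders, so they are not captured by this shader-only estimate.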
| Manufacturer Intel | Manufacturer GIGABYTE | 
| Chip Designer Intel | Chip Designer AMD | 
| Architecture Alchemist | Architecture RDNA 2 | 
| Family Arc A | Family Radeon RX 6000 | 
| Codename Xe HPG ACM-G11, Variant DG2-128 | Codename Beige Goby (Navi 24), Variant Navi 24 XT | 
| Market Segment Desktop | Market Segment Desktop | 
| Release Date 6/28/2022 | Release Date 1/19/2022 | 
| Foundry TSMC | Foundry TSMC | 
| Fabrication Node N6 | Fabrication Node N6 | 
| Die Size 157 mm² | Die Size 107 mm² | 
| Transistor Count 7.2 Billion | Transistor Count 5.4 Billion | 
| Transistor Density 45.86M/mm² | Transistor Density 50.47M/mm² | 
| Form PCIe Card | Form PCIe Card | 
| Shading Units 1024 Shaders | Shading Units 1024 Shaders | 
| Texture Mapping Units 64 TMUs | Texture Mapping Units 64 TMUs | 
| Render Output Units 32 ROPs | Render Output Units 32 ROPs | 
| Tensor Cores 128 T-Cores | Tensor Cores - | 
| Ray-Tracing Cores 8 RT-Cores | Ray-Tracing Cores 16 RT-Cores | 
| Compute Units - | Compute Units 16 CUs | 
| Execution Units 128 EUs | Execution Units - | 
| Graphics Processing Clusters 8 GPCs | Graphics Processing Clusters 1 GPC | 
| Clock Base 2000MHz, Boost 2450MHz | Clock Base 2610MHz, Boost 2825MHz | 
| L0 - | L0 32KB/WGP | 
| L1 Unknown | L1 128KB/Array | 
| L2 4MB Shared | L2 1MB Shared | 
| L3 - | L3 16MB Shared, 0.42TB/s | 
| Memory 6GB GDDR6 | Memory 4GB GDDR6 | 
| Bus Width 96-Bit | Bus Width 64-Bit | 
| Clock 1938MHz, Transfer Rate 15.5GT/s, Bandwidth 186GB/s | Clock 2250MHz, Transfer Rate 18GT/s, Bandwidth 144GB/s | 
| TDP 75W | TDP 107W | 
| Max Temp - | Max Temp 110°C | 
| Outputs 3x DisplayPort 2.0, 1x HDMI 2.1 | Outputs 1x DisplayPort 1.4, 1x HDMI 2.1 | 
| Max Resolution 7680x4320 | Max Resolution 7680x4320 | 
| Max Resolution Refresh Rate 60Hz | Max Resolution Refresh Rate 120Hz | 
| Variable Refresh Rate FreeSync | Variable Refresh Rate FreeSync | 
| Display Stream Compression (DSC) Not Supported | Display Stream Compression (DSC) Supported | 
| Multi Monitor Support 4 | Multi Monitor Support 2 | 
| Content Protection - | Content Protection HDCP 2.3 | 
| Encoder Model Arc | No Encoders | 
| Encode Codecs AVC (H.264), HEVC (H.265), AV1 | Encode Codecs - | 
| Decoder Model Arc | Decoder Model VCN 3.0 | 
| Decode Codecs MPEG-2, JPEG, VP9, AVC (H.264), HEVC (H.265), AV1 | Decode Codecs MPEG-1, MPEG-2, MPEG-4, JPEG, VC-1, VP9, AVC (H.264), HEVC (H.265), AV1 | 
| DirectX 12, Direct3D 12_2 | DirectX 12, Direct3D 12_2 | 
| OpenGL 4.6, OpenCL 3.0, Vulkan 1.3 | OpenGL 4.6, OpenCL 2.2, Vulkan 1.3 | 
| Shader Model 6.6 | Shader Model 6.6, GFX 10.3 | 
| Cooling 1x Fan | Cooling 3x Fans | 
| Power Connectors - | Power Connectors 1x 6-Pin | 
| Slots Required 2.1 PCIe Version 4.0 PCIe Lanes 8 | Slots Required 2.0 PCIe Version 4.0 PCIe Lanes 4 | 
| Multi GPU Support - | Multi GPU Support Supported, Type Bridgeless | 
| Height 114 mm (4.49 in) Width 222 mm (8.74 in) Depth 42 mm (1.65 in) | Height 114 mm (4.49 in) Width 282 mm (11.1 in) Depth 40 mm (1.57 in) | 
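
The memory bandwidth and transistor density rows are likewise derived values. A minimal sketch of that arithmetic, assuming the listed memory clock maps to an 8x effective transfer rate (as the GT/s figures in the table imply) and the bus width in bits converts to bytes per transfer:

```python
def derived_memory_and_density(mem_clock_mhz, bus_width_bits, transistors_billion, die_mm2):
    """Derive transfer rate, bandwidth, and transistor density from the raw spec values."""
    transfer_gt_s = mem_clock_mhz * 8 / 1000              # effective data rate is 8x the listed clock (matches the GT/s row)
    bandwidth_gb_s = transfer_gt_s * bus_width_bits / 8   # bus width in bits -> bytes moved per transfer
    density_m_mm2 = transistors_billion * 1000 / die_mm2  # millions of transistors per mm²
    return transfer_gt_s, bandwidth_gb_s, density_m_mm2

print(derived_memory_and_density(1938, 96, 7.2, 157))  # Arc A380:   ~15.5 GT/s, 186 GB/s, ~45.9 M/mm²
print(derived_memory_and_density(2250, 64, 5.4, 107))  # RX 6500 XT:  18.0 GT/s, 144 GB/s, ~50.5 M/mm²
```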