Intel Arc B580 vs Intel Arc A580
| | Intel Arc B580 | Intel Arc A580 |
| --- | --- | --- |
| Price | $249 | $179 |
| Specification | Intel Arc B580 | Intel Arc A580 |
| --- | --- | --- |
| Shading Units | 2560 Shaders | 3072 Shaders |
| Memory | 12GB GDDR6 | 6GB GDDR6 |
| Boost Clock | 2670MHz | 2000MHz |
| Peak AI Performance | 437.45 TOPS (INT4 Tensor) | 393.22 TOPS (INT4 Tensor) |
| FP32 | 13.67 TFLOPS | 12.29 TFLOPS |
| FP16 | 27.34 TFLOPS | 24.58 TFLOPS |
| Form Factor | PCIe Card, 2.0 Slots | PCIe Card, 2.0 Slots |
| TDP | 190W | 185W |
| Power Connectors | 1x 8-Pin | - |
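As a sanity check, the headline compute figures above can be reproduced from the shader counts and boost clocks. This is a sketch assuming the usual convention of 2 FP32 operations per shader per clock (one fused multiply-add), with FP16 running at twice the FP32 rate on both architectures; the helper name is illustrative, not from any vendor tool.

```python
def peak_tflops(shaders: int, boost_mhz: int, ops_per_clock: int = 2) -> float:
    """Theoretical peak throughput in TFLOPS: shaders x clock x ops/clock."""
    return shaders * boost_mhz * ops_per_clock / 1e6

# Values from the summary table above.
b580_fp32 = peak_tflops(2560, 2670)                   # ~13.67 TFLOPS
a580_fp32 = peak_tflops(3072, 2000)                   # ~12.29 TFLOPS
b580_fp16 = peak_tflops(2560, 2670, ops_per_clock=4)  # ~27.34 TFLOPS
a580_fp16 = peak_tflops(3072, 2000, ops_per_clock=4)  # ~24.58 TFLOPS
```

Note that despite having fewer shaders, the B580 comes out ahead on paper purely on clock speed.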
No benchmark results (Geekbench 5/6 OpenCL, Vulkan, Metal, CUDA; OctaneBench) are available for either card.
| Theoretical Performance | Intel Arc B580 | Intel Arc A580 |
| --- | --- | --- |
| Peak AI Performance | 437.45 TOPS (INT4 Tensor) | 393.22 TOPS (INT4 Tensor) |
| FP16 | 27.34 TFLOPS | 24.58 TFLOPS |
| FP16 Tensor (FP32 Accumulate) | 109.36 TFLOPS | 98.3 TFLOPS |
| FP32 | 13.67 TFLOPS | 12.29 TFLOPS |
| FP64 | 3.42 TFLOPS | 3.07 TFLOPS |
| BF16 Tensor | 109.36 TFLOPS | 98.3 TFLOPS |
| INT4 Tensor | 437.45 TOPS | 393.22 TOPS |
| INT8 Tensor | 218.73 TOPS | 196.61 TOPS |
| Pixel Fillrate | 213.6 GPixel/s | 192 GPixel/s |
| Texture Fillrate | 427.2 GTexel/s | 384 GTexel/s |
| Specification | Intel Arc B580 | Intel Arc A580 |
| --- | --- | --- |
| Manufacturer | Intel | Intel |
| Chip Designer | Intel | Intel |
| Architecture | Battlemage | Alchemist |
| Family | Arc B | Arc A |
| Codename | Xe2 HPG (BMG-G21) | Xe HPG (ACM-G10, variant DG2-512) |
| Market Segment | Desktop | Desktop |
| Release Date | 12/3/2024 | 10/10/2023 |
| Foundry | TSMC | TSMC |
| Fabrication Node | N5 | N6 |
| Die Size | 272 mm² | 406 mm² |
| Transistor Count | 19.6 Billion | 21.7 Billion |
| Transistor Density | 72.06M/mm² | 53.45M/mm² |
| Form | PCIe Card | PCIe Card |
| Shading Units | 2560 Shaders | 3072 Shaders |
| Texture Mapping Units | 160 TMUs | 192 TMUs |
| Render Output Units | 80 ROPs | 96 ROPs |
| Tensor Cores | 160 T-Cores | 384 T-Cores |
| Ray-Tracing Cores | 20 RT-Cores | 24 RT-Cores |
| Execution Units | 160 EUs | 384 EUs |
| Graphics Processing Clusters | 20 GPCs | 24 GPCs |
| Base Clock | - | 1700MHz |
| Boost Clock | 2670MHz | 2000MHz |
| L1 Cache | Unknown | Unknown |
| L2 Cache | 18MB Shared | 8MB Shared |
| Memory | 12GB GDDR6 | 6GB GDDR6 |
| Memory Bus Width | 192-bit | 96-bit |
| Memory Clock | 2375MHz | 1938MHz |
| Memory Transfer Rate | 19GT/s | 15.5GT/s |
| Memory Bandwidth | 456GB/s | 186GB/s |
| TDP | 190W | 185W |
| Display Outputs | 3x DisplayPort 2.1, 1x HDMI 2.1 | 3x DisplayPort 2.0, 1x HDMI 2.1 |
| Max Resolution | 7680x4320 | 7680x4320 |
| Max Resolution Refresh Rate | 60Hz | 60Hz |
| Variable Refresh Rate | FreeSync | FreeSync |
| Display Stream Compression (DSC) | Not Supported | Not Supported |
| Multi Monitor Support | 4 | 4 |
| Encoder Model | Arc | Arc |
| Encode Codecs | AVC (H.264), HEVC (H.265), AV1 | AVC (H.264), HEVC (H.265), AV1 |
| Decoder Model | Arc | Arc |
| Decode Codecs | MPEG-2, JPEG, VP9, AVC (H.264), HEVC (H.265), AV1 | MPEG-2, JPEG, VP9, AVC (H.264), HEVC (H.265), AV1 |
| DirectX | 12 (Direct3D 12_2) | 12 (Direct3D 12_2) |
| OpenGL | 4.6 | 4.6 |
| OpenCL | 3.0 | 3.0 |
| Vulkan | 1.3 | 1.3 |
| Shader Model | 6.6 | 6.6 |
| Cooling | 2x Fans | 1x Fan |
| Power Connectors | 1x 8-Pin | - |
| Slots Required | 2.0 | 2.0 |
| PCIe Version | 4.0 | 4.0 |
| PCIe Lanes | 8 | 16 |
| Dimensions (H x W x D) | 115 mm (4.53 in) x 272 mm (10.71 in) x 40 mm (1.57 in) | - |
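The fillrate and memory-bandwidth figures follow directly from the unit counts, clocks, and bus widths listed here. A minimal sketch of the arithmetic (helper names are mine; values are copied from the table):

```python
def pixel_fillrate_gpix(rops: int, boost_mhz: int) -> float:
    """GPixel/s: one pixel per ROP per clock."""
    return rops * boost_mhz / 1e3

def texture_fillrate_gtex(tmus: int, boost_mhz: int) -> float:
    """GTexel/s: one texel per TMU per clock."""
    return tmus * boost_mhz / 1e3

def bandwidth_gbs(transfer_gts: float, bus_bits: int) -> float:
    """GB/s: transfer rate times bus width in bytes."""
    return transfer_gts * bus_bits / 8

# Arc B580: 80 ROPs, 160 TMUs at 2670MHz; 19GT/s on a 192-bit bus.
b580_pix = pixel_fillrate_gpix(80, 2670)     # 213.6 GPixel/s
b580_tex = texture_fillrate_gtex(160, 2670)  # 427.2 GTexel/s
b580_bw  = bandwidth_gbs(19, 192)            # 456.0 GB/s

# Arc A580: 96 ROPs, 192 TMUs at 2000MHz; 15.5GT/s on a 96-bit bus.
a580_pix = pixel_fillrate_gpix(96, 2000)     # 192.0 GPixel/s
a580_tex = texture_fillrate_gtex(192, 2000)  # 384.0 GTexel/s
a580_bw  = bandwidth_gbs(15.5, 96)           # 186.0 GB/s
```

The bandwidth gap (456 vs 186 GB/s) is the largest single difference between the two cards: the B580 pairs a wider 192-bit bus with faster 19GT/s GDDR6.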