Intel graphics processing units

http://en.wikipedia.org/wiki/Comparison_of_Intel_graphics_processing_units

Comparison of Intel graphics processing units

From Wikipedia, the free encyclopedia
 
 

This page contains information about Intel's GPUs and motherboard graphics chipsets in table form. In 1982 Intel licensed the NEC µPD7220 and announced it as the Intel 82720 Graphics Display Controller.[1]

First generation[edit]

Graphics Launch Market Chipset Code name Device id. Core render
clock(MHz)
Pixel pipelines Shader model
(vertex/pixel)
API support Memory bandwidth(GB/s) DVMT(MB) Hardware acceleration
DirectX OpenGL OpenCL MPEG-2 VC-1 AVC
Intel740 1998 Desktop stand-alone Auburn 7800 220 2 N/A 5.0 1.1 1.3
0.533(AGP 2×)
2–8 via VMI No No
752 1999 Portola 1240 250 6.0 (hardware)
9.0 (software)
1.2 0.8(PC100)–1.067(PC133)
1.067(AGP 4× forAIMM)
8–16 MC
3D graphics with Direct AGP 810
810-DC100
810E
810E2
Whitney 7121
7123
7125
230 32
2000 815
815E
815G
815EG
Solano 1132

Second generation[edit]

Intel marketed its second generation under the Extreme Graphics brand.

Graphics Launch Market Chipset Code name Device id. Core render
clock(MHz)
Pixel pipelines Shader model
(vertex/pixel)
API support Memory bandwidth(GB/s) DVMT(MB) Hardware acceleration
DirectX OpenGL OpenCL MPEG-2 VC-1 AVC
Extreme Graphics 2002 Desktop 845G
845GL
845GV
Brookdale 2562 200 2 N/A 7.0 (hardware)
9.0 (software)
1.3 2.1 64 MC No No
2001 Mobile 830M
830MG
Almador 3577 166[2] 1
Extreme Graphics 2 2003 Desktop 865G
865GV
Springdale 2572 266 6.4
Mobile 852GM
852GME
852GMV
854
855GM
855GME
Montara 3582
358E
133–266 2.1–2.7

Third generation[edit]

Graphics Launch Market Chipset Code name Device id. Core render
clock(MHz)
Pixel pipelines Shader model
(vertex/pixel)
API support Memory bandwidth(GB/s) DVMT(MB) Hardware acceleration
DirectX OpenGL OpenCL MPEG-2 VC-1 AVC
GMA 900 2004 Desktop 910GL Grantsdale 2582
2782
333 4 2.0 (SW) / 2.0 9.0 1.4 on Windows
2.1 on Linux[3]
3.2 128 MC[4][5] No No
915GL
915GV 8.5
915G
2005 Mobile Mobile 915
Family
Alviso 2592
2792
133–333
GMA 950 Desktop 945GZ Lakeport 2772
2776
400 3.0 (SW) / 2.0 9.0c 256[5][6]
945GC 10.7
945G
2006 Mobile Mobile 945
Family
Calistoga 27A2
27A6
27AE
166–400[7]
GMA 3100 2007 Desktop Q33 Bearlake 29D2
29D3
400[8][9] 1.5 on Windows
2.1 on Linux[3]
12.8
Q35 29B2
29B3
G31 29C2
29C3
G33 12.8 (DDR2)
17 (DDR3)
Full[5]
GMA 3150 2010 Nettop Atom D4xx
Atom D5xx
Pineview A001
A002
2 6.4 384
Netbook Atom N4xx
Atom N5xx
A011
A012
200 5.3

Fourth generation[edit]

Graphics Launch Market Chipset Code name Device id. Core render
clock(MHz)
Execution units Shader model
(vertex/pixel)
API support Memory bandwidth(GB/s) DVMT(MB) Hardware acceleration
DirectX OpenGL OpenCL MPEG-2 VC-1 AVC
GMA 3000 2006 Desktop 946GZ Lakeport 2972
2973
667[10][11] 8[10][12] 3.0 (SW) / 2.0 9.0c /9_3 2.0 on Windows
2.1 on Linux
10.6 256 MC[5] No No
Q963 Broadwater 2992
2993
Q965 12.8
GMA X3000 G965 29A2
29A3
3.0 384 Full[5] MC + (LF −
WMV9only)
GMA X3500 2007 G35 Bearlake 2982
2983
4.0 10.0 MC + LF
GMA X3100 Mobile GL960 Crestline 2A02
2A03
2A12
2A13
400[13] 8.5 MC + (LF −
WMV9only)
GLE960
GM965 500[13] 10.7
GME965
GMA 4500 2008 Desktop B43 Eaglelake 2E42
2E43
2E92
2E93
533 10 2.1 12.8 (DDR2)
17 (DDR3)[14]
1700 MC + LF MC + LF
Q43 2E12
2E13
Q45
GMA X4500 G41 2E32
2E33
800
G43 2E22
2E23
GMA X4500HD G45 Full Full
GMA 4500MHD Mobile GL40 Cantiga 2A42
2A43
400[15] 12.8
GS40
GM45 533[15] 12.8 (DDR2)
17 (DDR3)
GS45

Fifth generation[edit]

Specifications of Intel HD Graphics series [16] [17] [18] [19] [20] [21] [22]
Graphics Launch Market CPU Code name Device ID Core clock(MHz) Execution units Shader model API support Memory bandwidth
(GB/s)
DVMT(MB) Hardware acceleration
DirectX OpenGL OpenCL CVT HD QSV
HD Graphics 2010 Desktop Celeron G1101
Pentium G69xx
Ironlake (Clarkdale) 0042 533 12 4.0 10.0 2.1 No 17 1720 No No
Core i3-5x0
Core i5-6x0
Core i5-655K
733 21.3 Yes
Core i5-661 900
Mobile Celeron U3xxx
Pentium U5xxx
Ironlake (Arrandale) 0046 166–500 12.8 No
Core i3-3x0UM
Core i5-5x0UM
Core i7-6x0UE
Core i7-6x0UM
Yes
Core i7-620LE
Core i7-6x0LM
266–566 17.1
Celeron P4xxx
Pentium P6xxx
500–667 No
Core i3-330E
Core i3-3x0M
Yes
Core i5-4x0M
Core i5-520E
Core i5-5x0M
Core i7-610E
Core i7-6x0M
500–766

Sixth generation[edit]

Specifications of Intel HD Graphics series [16] [17] [18] [19] [20] [21] [22]
Graphics Launch Market CPU Code name Device ID Core clock(MHz) Execution units Shader model API support Memory bandwidth
(GB/s)
DVMT(MB) Hardware acceleration
DirectX OpenGL OpenCL CVT HD QSV
HD Graphics 2011 Desktop Celeron G4x0
Celeron G5x0
Celeron G530T
Pentium G6xx
Pentium G6x0T
Pentium G8x0
Sandy Bridge 0102
0106
0112
0116
0122
0126
010A
650–1100 6 4.1 10.1 3.1 on Windows[17]
3.3 on MacOS X[18]
3.1 on Linux[23]
No 21.3 1720 No No
Mobile Celeron B7x0
Celeron 8x7
Celeron B8xx
Pentium B9x0
Pentium 9x7
350–1150
HD Graphics 2000 Desktop Core i3-2102
Core i3-21x0
Core i3-21x0T
Core i5-2xx0
Core i5-2x00S
Core i5-25x0T
Core i7-2600
Core i7-2600S
650–1250 Yes Yes
Workstation Xeon E3-1260L
HD Graphics 3000 Desktop Core i3-21x5
Core i5-2405S
Core i5-2500K
Core i7-2x00K
850–1350 12
Mobile Core i3-23x0E
Core i3-23xxM
Core i5-251xE
Core i5-2xxxM
Core i7-2xxxQM
Core i7-271xQE
Core i7-29x0XM
650–1300
HD Graphics P3000 Workstation Xeon E3-12x5 850–1350

Seventh generation[edit]

Specifications of Intel HD Graphics series [16] [17] [18] [19] [20] [21] [22]
Graphics Launch Market CPU Code name Device ID Core clock(MHz) Execution units Shader model API support Memory bandwidth
(GB/s)
DVMT(MB) Hardware acceleration
DirectX OpenGL OpenCL CVT HD QSV
HD Graphics 2012 Desktop Celeron 16x0
Celeron 1610T
Pentium 2xx0
Pentium G2xx0T
Ivy Bridge 0162
0166
016A
0152
0156
015A
650–1050 6 5.0 11.0 4.0 on Windows[24]
4.1 on MacOS X
3.3 on Linux[25]
1.2[20] 25.6 1720 No No
Mobile Celeron 10x0M
Celeron 10x7U
Pentium 2117U
Pentium 20x0M
350–1100
HD Graphics 2500[19] Desktop Core i3-32x0
Core i3-32x0T
Core i5-3xx0
Core i5-3xx0S
Core i5-3xx0T
650–1150 Yes Yes
HD Graphics 4000[19] Core i3-3225
Core i3-3245
Core i5-3475S
Core i5-3570K
Core i7-3770
Core i7-3770x
16
Mobile Core i3-3110M 650–1000
Core i3-3120M
Core i5-3210M
650–1100
Core i3-3120ME 650–900
Core i3-3217U
Core i5-3317U
350–1050
Core i3-3217UE 350–900
Core i3-3229Y
Core i5-3339Y
Core i5-3439Y
350–850
Core i5-3320M
Core i5-3360M
650–1200
Core i5-3337U 350–1100
Core i5-3427U 350–1150
Core i5-3610ME 650–950
Core i7-3520M
Core i7-3xx7U
Core i7-3xxxQM
Core i7-3920XM
350–1300
HD Graphics P4000[26] Workstation Xeon E3-12x5V2 650–1250
HD Graphics 2013 ? ? Haswell 0402
0406
040A
0412
0416
041A

ULT:
0A04
0A16
0A22
0A26
0A2A
? 10(GT1) 5.0 11.1[21] 4.0 on Windows[22]
4.1 on MacOS X
3.3 on Linux
1.2 25.6 ? ? ?
HD Graphics 4200 ULT
Mobile
20(GT2)
HD Graphics 4400
Desktop Core i3-4130 350-1150 Yes
Core i3-4130T 200-1150
HD Graphics 4600 Mobile ? ? ?
Desktop Core i3-4330  ?-1150
Core i3-4330T 200-1150 Yes
Core i3-4340 350-1150
Core i5-4430
Core i5-4430S
Core i5-4440
Core i5-4440S
350–1100 Yes
Core i5-4570
Core i5-4570S
350–1150
Core i5-4570T 200–1150
Core i5-4670
Core i5-4670K
Core i5-4670S
Core i5-4670T
350–1200
Core i7-4765T
Core i7-4770
Core i7-4770S
Core i7-4770T
Core i7-4771
Core i7-4770K 350–1250
HD Graphics 5000 ULT
Mobile
? ? 40(GT3) ? ?
Iris Graphics 5100
Iris Pro Graphics 5200 Mobile 40+eDRAM
(GT3e)
Desktop Core i5-4570R 200-1150 Yes Yes
Core i5-4670R
Core i7-4770R
200-1300
HD Graphics 2013 ? ? ValleyView (Bay Trail) 0F30
0F31
0F32
0F33
[27]
? ? 5.0 11.1 4.0 on Windows
3.3 on Linux
1.2 ? ? ? ?

PowerVR based[edit]

Specifications of Intel PowerVR-based series [4] [5] [6] [10] [11] [13] [14] [15] [28] [29] [30] [31] [32] [33] [34] [35] [36] [37] [38] [39] [40] [41] [42] [43] [44] [45] [46] [47] [48] [49] [50] [51] [52] [53]
Graphics Launch PowerVR core Market Chipset Code name Device id. Core render
clock(MHz)
Pixel pipelines Shader model
(vertex/pixel)
API support Memory bandwidth(GB/s) DVMT(MB) Hardware acceleration
DirectX OpenGL OpenCL MPEG-2 VC-1 AVC
GMA 500 2008 SGX535 CE CE3100 Canmore 2E5B ? 2[54] 3.0 9.0c 2.0 4.2 256 Full Full Full
2009 Atom CE4100 Sodaville
Atom CE4110 200
Atom CE4130
Atom CE4150 400
Atom CE4170
2010 Atom CE4200 Groveland ?
2008 MID UL11L Poulsbo (Menlow) 8108
8109
100
US15L 200
US15W
GMA 600 2010 MID Atom Z6xx Lincroft(Moorestown) 4102 400[55] 2 3.0 9.0c 2.0 6.4 759 Full Full Full
GMA 2012 SGX540 Smartphone Atom Z2460 Penwell (Medfield) ? 400[56] 4 ES 2.0 6.4 1024 Full Full Full
GMA 3600 2011 SGX545 Nettop
Netbook
Atom D2500
Atom N2600
Cedarview(Cedar Trail) 0BE0
0BE1
0BE2
0BE3
400[57] 4 3.0 9.0c 3.0 8.5
6.4
1024[58] Full Full Full
GMA 3650 Atom D2700
Atom N2800
640[57] 8.5
GMA 2012 Tablet Atom Z2760 Cloverview(Clover Trail) 08C7
08C8
08C9
08CA
08CB
08CC
08CD
08CE
08CF
533 4 3.0 9.0c 2.0 6.4 2048 Full Full Full

Notes[edit]

Acronyms
The following acronyms are used throughout the article:
  • DVMT: Dynamic Video Memory Technology
  • EU: execution unit
  • MC: motion compensation
  • LF: in-loop deblocking filter
  • VLD: variable-length decoding
  • iDCT: inverse discrete cosine transform
  • CVT HD: Intel Clear Video Technology HD
  • QSV: Intel Quick Sync Video
Full hardware acceleration techniques
Intel graphics processing units employ the following techniques in hardware acceleration of digital video playback.
  • MPEG-2 Part 2: VLD, iDCT, and MC
  • VC-1: VLD, iMDCT, MC, and LF
  • H.264/MPEG-4 AVC: VLD, iMDCT, MC, and LF
Calculation
The raw performance of an integrated GPU, in single-precision FLOPS, can be calculated as follows:
  • Fourth generation (GMA 3, 4) - EU * 2 * 2 [multiply + accumulate] * clock speed
  • Fifth generation (HD Graphics) - EU * 2 * 2 [multiply + accumulate] * clock speed
  • Sixth generation (HD Graphics 2000, 3000) - EU * 4 [dual-issue x 2 SP] * 2 [multiply + accumulate] * clock speed
  • Seventh generation (HD Graphics 4000, 5000) - EU * 8 * 2 [multiply + accumulate] * clock speed
For example, the HD Graphics 3000 is rated at 125 GFLOPS,[59] which is consistent with the formula: 12 EUs × 4 × 2 × 1,300 MHz ≈ 124.8 GFLOPS.
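As a worked illustration, the short Python sketch below (illustrative only; the generation labels and per-EU multipliers are taken directly from the list above) computes these theoretical peak figures:

    # Illustrative sketch: theoretical single-precision GFLOPS of an Intel IGP,
    # using the per-EU FLOPs-per-clock multipliers listed above.
    FLOPS_PER_EU_PER_CLOCK = {
        "gen4 (GMA)": 2 * 2,            # 2 * 2 [multiply + accumulate]
        "gen5 (HD Graphics)": 2 * 2,
        "gen6 (HD 2000/3000)": 4 * 2,   # 4 [dual-issue x 2 SP] * 2 [multiply + accumulate]
        "gen7 (HD 4000/5000)": 8 * 2,
    }

    def peak_gflops(generation: str, execution_units: int, clock_mhz: float) -> float:
        """Theoretical peak single-precision GFLOPS for an integrated Intel GPU."""
        return FLOPS_PER_EU_PER_CLOCK[generation] * execution_units * clock_mhz / 1000.0

    # HD Graphics 3000: 12 EUs at a 1,300 MHz maximum clock -> ~124.8 GFLOPS (rated 125)
    print(peak_gflops("gen6 (HD 2000/3000)", 12, 1300))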

 

 

 

http://en.wikipedia.org/wiki/Intel_GMA

Intel GMA

From Wikipedia, the free encyclopedia
 
 
Intel GMA
Direct3D support Direct3D 10.0[1]
Shader Model 4.0[1]
OpenCL support na[1]
OpenGL support OpenGL 2.1[1]
Predecessor Intel Extreme Graphics
Successor Intel HD Graphics
GMA X3000 on Intel DG965WHMKR motherboard (only heat sink visible)

The Intel Graphics Media Accelerator, or GMA, is a series of integrated graphics processors introduced in 2004 by Intel, replacing the earlier Intel Extreme Graphics series.

This series targets the market of low-cost graphics solutions. The products in this series are integrated onto the motherboard, have limited graphics processing power, and use the computer's main memory for storage instead of a dedicated video memory. They are commonly found on netbooks, low-priced laptops and desktop computers, as well as business computers which do not need high levels of graphics capability. In early 2007, 90% of all PC motherboards sold had an integrated GPU.[2]

 

 

History[edit]

The GMA line of GPUs replaces the earlier Intel Extreme Graphics, and the Intel740 line, the latter of which was a discrete unit in the form of AGP and PCI cards with technology that evolved from the companies Real3D and Lockheed Martin. Later, Intel integrated the i740 core into the Intel 810 chipset.

The original architecture of GMA systems supported only a few functions in hardware, and relied on the host CPU to handle at least some of the graphics pipeline, further decreasing performance. However, with the introduction of Intel's 4th generation of GMA architecture (GMA X3000) in 2006, many of the functions are now built into the hardware, providing an increase in performance. The 4th generation of GMA combines fixed-function capabilities with a threaded array of programmable execution units, providing advantages to both graphics and video performance. Many of the advantages of the new GMA architecture come from the ability to flexibly switch as needed between executing graphics-related tasks or video-related tasks. While GMA performance has been widely criticized in the past as being too slow for computer games, the latest GMA generation should ease many of those concerns for the casual gamer.

Despite similarities, Intel's main series of GMA Integrated Graphics Processors (IGPs) is not based on the PowerVR technology Intel licensed from Imagination Technologies. Intel used the low-power PowerVR MBX designs in chipsets supporting their XScale platform, and since the sale of XScale in 2006 has licensed the PowerVR SGX and used it in the GMA 500 IGP for use with their Atom platform.

With the introduction of the Platform Controller Hub, the Graphics Media Accelerator series ceased, and the CPU-based Intel HD Graphics series was created.

Graphics cores[edit]

Intel GPU based[edit]

Generation three based[edit]

GMA 900[edit]

The GMA 900 was the first graphics core produced under Intel's Graphics Media Accelerator product name, and was incorporated in the Intel 910G, 915G, and 915Gx chipsets.

The 3D architecture of the GMA 900 was a significant upgrade from the previous Extreme 3D graphics processors. It is a 4 pixel per clock cycle design supporting DirectX 9 pixel shader model 2.0. It operates at a clock rate ranging from 160 to 333 MHz, depending on the particular chipset. At 333 MHz, it has a peak pixel fill rate of 1332 megapixels per second. However, the architecture still lacks support for hardware transform and lighting and similar vertex shader technologies.
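The quoted peak fill rate follows directly from the pixels-per-clock figure and the render clock; a minimal illustrative calculation (Python, using the GMA 900 numbers from the paragraph above):

    # Peak pixel fill rate = pixel pipelines * render clock.
    # GMA 900: 4 pixels per clock at 333 MHz -> 1332 megapixels per second.
    pixels_per_clock = 4
    render_clock_mhz = 333
    print(pixels_per_clock * render_clock_mhz, "Mpixel/s")  # 1332 Mpixel/s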

Like previous Intel integrated graphics parts, the GMA 900 has hardware support for MPEG-2 motion compensation, color-space conversion, and DirectDraw overlay.

The processor uses separate clock generators for the display and render cores. The display unit includes a 400 MHz RAMDAC, two 25–200 Mpixel/s serial DVO ports, and two display controllers. In mobile chipsets, up to two 18-bit 25–112 MHz LVDS transmitters are included.

GMA 950[edit]

The GMA 950 was the second graphics core produced under Intel's Graphics Media Accelerator product name, and was incorporated in the Intel 945G chipsets.

The processor includes a 400 MHz 256-bit core supporting up to 10.6 GB/s of memory bandwidth with DDR2-667 system RAM, up to 224 MB of video memory through the DVMT scheme, a 1.6 GPixel/s and 1.6 GTexel/s fill rate, a maximum resolution of 2048×1536 for both analog and digital displays, and two SDVO ports for flat panels and/or TV-out via ADD2 cards or media expansion cards.

For 3D, the GMA 950 supports up to 4 pixels per clock, Microsoft DirectX 9.0 hardware acceleration with Vertex Shader 3.0, and OpenGL 1.4 with ARB extensions.

GMA 3000[edit]

The 946GZ, Q965, and Q963 chipsets use the GMA 3000 chip.[3][4] The GMA 3000 3D core is very different from the X3000, despite their similar names. It is based more directly on the previous-generation GMA 900 and GMA 950 graphics, and belongs to the same "i915" family. It has pixel and vertex shaders which only support Shader Model 2.0b features[citation needed], and the vertex shaders are still software-emulated. In addition, hardware video acceleration features such as hardware-based iDCT computation, ProcAmp (video stream independent color correction), and VC-1 decoding are not implemented in hardware. Of the GMA 3000-equipped chipsets, only the Q965 retains dual independent display support. The core speed is rated at 400 MHz with a 1.6 Gpixel/s fill rate in the datasheets, but was listed as a 667 MHz core in the white paper.[5]

The memory controller can now address a maximum of 256 MB of system memory, and the integrated serial DVO ports have an increased top speed of 270 Mpixel/s.

GMA 3100[edit]

Integrated graphics found on G31 and G33 chipset motherboards. It supports Pixel Shader 2.0 in hardware, with vertex shaders (Shader Model 3.0) emulated in software, and offers OpenGL 1.4 support.

GMA 3150[edit]

Found in Intel Atom N4xx and N5xx (codenamed Pineview) processors. Like the GMA 3100 and GMA 3000, this is a very close relative of the GMA 900/950, completely different from the GMA X3000 series.

Generation four based[edit]

GMA X3000[edit]

The GMA X3000 for desktop was "substantially redesigned" when compared to previous GMA iterations[6] and it is used in the Intel G965 north bridge controller.[7] The GMA X3000 was launched in July 2006.[8] The X3000's underlying 3D rendering hardware is organized as a unified shader processor consisting of 8 scalar execution units. Each pipeline can process video, vertex, or texture operations. A central scheduler dynamically dispatches threads to pipeline resources, to maximize rendering throughput (and decrease the impact of individual pipeline stalls). However, due to the scalar nature of the execution units, they can only process data on a single pixel component at a time.[9] The GMA X3000 supports DirectX 9.0 with vertex and pixel Shader Model 3.0 features.

The processor consists of different clock domains, meaning that the entire chip does not operate at the same clock speed. This causes some difficulty when measuring the peak throughput of its various functions. Further adding to the confusion, it is listed as 667 MHz in the Intel G965 white paper, but as 400 MHz in the Intel G965 datasheet. There are various rules that define the IGP's processing capabilities.[9]

The memory controller can now address a maximum of 384 MB of memory according to the white paper, but only 256 MB according to the datasheet.

GMA X3100[edit]

The GMA X3100 is the mobile version of the GMA X3000 used in the Intel GL960/GM965 chipsets and also in the GS965 chipset. The X3100 supports hardware transform and lighting, up to 8 programmable shader units, and up to 384 MB memory. Its display cores can run up to 333 MHz on GM965 and 320 MHz on GL960. Its render cores can run up to 500 MHz on GM965 and 400 MHz on GL960. The X3100 display unit includes a 300 MHz RAMDAC, two 25–112 MHz LVDS transmitters, 2 DVO encoders, and a TV encoder. In addition, the hardware supports DirectX 10.0,[10][11] Shader Model 4.0 and OpenGL 2.0.[11][12]

GMA X3500[edit]

The GMA X3500 is an upgrade of the GMA X3000 and is used in the desktop G35 chipset. Architecturally, the GMA X3500 is very similar to the GMA X3000,[13] with both GMAs running at 667 MHz. The major difference between them is that the GMA X3500 supports Shader Model 4.0 and DirectX 10, whereas the earlier X3000 supports Shader Model 3.0 and DirectX 9.[13] The X3500 also adds hardware assistance for playback of VC-1 video.

GMA X4500[edit]

The GMA X4500 and the GMA X4500HD for desktop[14] were launched in June 2008.[15] The GMA X4500 is used in the G43 chipset[16] and the GMA X4500HD is used in the G45 chipset.[14] The GMA X4500 is also used in the G41 chipset,[17] which was released in September 2008.[18]

The GMA 4500MHD for laptops was launched on July 16, 2008. Feature-wise, the 4500MHD is identical to its desktop cousin, the X4500HD.[citation needed] It had been previously rumored that a cost-reduced version, the GMA 4500, was to be launched in late 2008 or early 2009[19] and was to be used in the upcoming Q43 and Q45 chipsets.[17] In practice, however, the Q43 and Q45 chipsets also use the GMA X4500.[20]

The difference between the GMA X4500 and the GMA X4500HD is that the GMA X4500HD is capable of "full 1080p high-definition video playback, including Blu-ray disc movies".[14][21]

Like the X3500, X4500 supports DirectX 10 and Shader Model 4.0 features. Intel designed the GMA X4500 to be 200% faster than the GMA 3100 (G33 chipset) in 3DMark06 performance[22] and 70% faster than the GMA X3500 (G35 chipset).[23]

PowerVR GPU based[edit]

Intel developed a new set of low-power graphics cores based on PowerVR technology.

The available Linux drivers support only a limited subset of this hardware's functionality.[24]

PowerVR SGX 535 based[edit]

GMA 500[edit]

The Intel SCH (System Controller Hub; codenamed Poulsbo) for the Atom processor Z5xx series features the GMA 500 graphics core. Rather than being developed in-house, this is a PowerVR SGX 535 core licensed from Imagination Technologies.[25]

Intel describes this as "a flexible, programmable architecture that supports shader-based technology, 2D, 3D and advanced 3D graphics, high-definition video decode, and image processing. Features include screen tiling, internal true color processing, zero overhead anti-aliasing, programmable shader 3D accelerator, and 32-bit floating-point operations."[26]

GMA 600[edit]

A revised version of the previous Intel SCH (System Controller Hub), for the Atom Z6xx series CPUs, codenamed Lincroft. Essentially, this is the same graphics core as the GMA 500, but clocked at double the speed (400 MHz instead of 200 MHz).[27]

PowerVR SGX 545 based[edit]

GMA 3600[edit]

This integrated graphics system was released with the Intel Atom Cedar Trail platform (32 nm) and is based on the PowerVR SGX545. Unlike the original PowerVR-based solution, it is clocked at 400 MHz instead of 200 MHz.[28] It is specifically found in the Intel Atom N2600[29] and Atom D2500[30] processors. It supports DirectX 9.0c.

GMA 3650[edit]

The same as the GMA 3600, but clocked at 640 MHz.[28] It is found in the Atom N2800, Atom D2550, Atom D2500, Atom D2600 and Atom D2700 models. It supports DirectX 9.0c.

Specifications[edit]

Protected Audio Video Path[edit]

Protected Audio Video Path (PAVP) protects the data path within a computer during video playback (e.g., Blu-ray discs). It is supported by newer chipsets (e.g. Intel G45) and operating systems (since Windows Vista).[31]

PAVP can be configured in the BIOS. Different modes are supported:

  1. Disabled.
  2. PAVP Lite: Reserves buffer memory for encryption of compressed video data.
  3. Paranoid PAVP: Reserves memory during boot that is not visible to the operating system. This disables Windows Aero in Windows Vista.

The default setting in most BIOSes is PAVP Lite.

Within Intel HD Graphics, the successor of Intel GMA, a similar technology called Intel Insider exists.

Software support[edit]

Mac OS X[edit]

Mac OS X 10.4 supports the GMA 950, since it was used in previous revisions of the MacBook, Mac Mini, and 17-inch iMacs.[32] It has been used in all Intel-based Mac minis (until the Mac Mini released on March 3, 2009).[33] Mac OS X 10.5 Leopard contains drivers for the GMA X3100, which were used in a recent revision of the MacBook range.[34]

Late-release versions of Mac OS X 10.4 also support the GMA 900 due to its use in the Apple Developer Transition Kit, which was used in the PowerPC-to-Intel transition. However, special modifications to the kext file must be made to enable Core Image and Quartz Extreme.

Although the new MacBook line no longer uses the X3100, Mac OS X 10.5 (Leopard) ships with drivers supporting it that require no modifications to the kext file. Mac OS X 10.6 (Snow Leopard), which includes a new 64-bit kernel in addition to the 32-bit one, does not include 64-bit X3100 drivers.[citation needed] This means that although the MacBooks with the X3100 have 64-bit capable processors and EFI, Mac OS X must load the 32-bit kernel to support the 32-bit X3100 drivers.[citation needed] The 10.6.2 update, released on November 9, 2009, ships with 64-bit X3100 drivers.[citation needed]

Apple removed the 64-bit GMA X3100 drivers later, and thus affected Macs were forced back to the 32-bit kernel despite being 64-bit clean in terms of hardware and firmware. No 64-bit drivers were offered in OS X Lion. Subsequently OS X Mountain Lion dropped 32-bit kernel booting. The combination of these two changes in graphics driver code resulted in many Mac revisions being unable to upgrade to Mountain Lion, as their GPUs cannot be replaced.

For a while MacBook and MacBook Pro notebooks instead shipped with a far more powerful[35] NVIDIA GeForce 9400M,[36] and the 15" and 17" MacBook Pro notebooks shipped with an additional GeForce 9600GT supporting hybrid power to switch between GPUs. The NVIDIA GeForce 9400M chipset implemented in Apple MacBooks did not support composite or S-video output.[37]

FreeBSD[edit]

FreeBSD 8.0 supports the following Intel graphics chipsets: i810, i810-DC100, i810e, i815, i830M, 845G, 852GM, 855GM, 865G, 915G, 915GM, 945G, 945GM, 965G, 965Q, 946GZ, 965GM, 945GME, G33, Q33, Q35, G35, GM45, G45, Q45, G43 and G41. In practice, chipsets through the 4500MHD are supported with DRM and 3D using FreeBSD 9. Work to integrate GEM and KMS is currently adding support for i-series integrated graphics and improving support for earlier chipsets.

Linux[edit]

In August 2006, Intel added support to the open-source X.Org/XFree86 drivers for the latest 965 series that include the GMA (X)3000 core.[38] These drivers were developed for Intel by Tungsten Graphics.[39]

In May 2007, version 2.0 of the driver (xorg-video-intel) was released, which added support for the 965GM chipset. In addition, the 2.0 driver added native video mode programming support for all chipsets from i830 forward. This version added support for automatic video mode detection and selection, monitor hot plug, dynamic extended and merged desktops and per-monitor screen rotation. These features are built into the X.Org 7.3 X server release and will eventually be supported across most of the open source X.Org video drivers.[40] Version 2.1, released in July 2007, added support for the G33, Q33 and Q35 chipsets.[41] G35 is also supported by the Linux driver.[42]

As is common for X.Org drivers on Linux, the license is a combination of GPL (for the Linux kernel parts) and MIT (for all other parts).[43]

The drivers were mainly developed by Intel and Tungsten Graphics (under contract), since the chipsets' documentation was not publicly available for a long time. In January 2008, Intel released the complete developer documentation for its then-latest chipsets (the 965 and G35), allowing for further involvement by external developers.[44][45] In April 2009, Intel released documentation for its newer G45 graphics (including X4500) chipsets.[46] In May 2009, Intel employee Eric Anholt stated Intel was "still working on getting docs for [8xx] chipsets out."[47]

H.264 acceleration via VA-API[edit]

Linux support for GPU-accelerated H.264 playback is available and working for the X4500HD and X4500MHD using VA-API and the g45-h264 branch.[48][49]
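One practical way to confirm that the installed driver actually exposes H.264 decoding through VA-API is to inspect the profiles reported by the vainfo utility (from libva-utils). The sketch below is illustrative only and assumes vainfo is installed and on the PATH:

    # Illustrative check: does the active VA-API driver advertise H.264 decoding?
    # Assumes the vainfo utility (from libva-utils) is installed.
    import subprocess

    def has_h264_decode() -> bool:
        try:
            output = subprocess.run(["vainfo"], capture_output=True,
                                    text=True, check=False).stdout
        except FileNotFoundError:
            return False  # vainfo not installed
        # vainfo prints lines such as "VAProfileH264High : VAEntrypointVLD"
        return any("VAProfileH264" in line and "VAEntrypointVLD" in line
                   for line in output.splitlines())

    print("H.264 VA-API decode available:", has_h264_decode())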

PowerVR based chips on Linux[edit]

The GMA 500, GMA 600, GMA 3600, and GMA 3650 are PowerVR-based chips incompatible with Intel's GenX GPU architecture family. There are no Intel-supported FOSS drivers. The currently available FOSS drivers (included in Linux 3.3 onwards) only support 2D acceleration, not 3D acceleration.[24]

Ubuntu supports the GMA500 (Poulsbo) through the ubuntu-mobile and gma500 repositories on Launchpad. Experimental support is present for 11.10 and 12.04, but the installation procedure is not as simple as for other drivers and can lead to many bugs. Ubuntu 12.10 has 2D support included.[50]

Joli OS, a Linux based OS optimized for netbooks, has a driver for the GMA500 built in.

PixieLive, a GNU/Linux live distribution optimized for GMA500 netbooks, can boot from a USB pen drive, SD card, or hard disk.

Intel releases official Linux drivers through the IEGD (Intel Embedded Graphics Driver), which supports some Linux distributions dedicated to the embedded market.[51]

In November 2009, the Linux Foundation released the details of a new, rewritten Linux driver that would support this chipset and Intel's other upcoming chipsets. The Direct Rendering Manager and X.Org parts would be free software, but the 3D component (using Gallium3D) would still be proprietary.[52]

Solaris[edit]

Oracle Solaris 11 provides 64-bit video driver support for the following Intel graphic chipsets: i810, i810-dc100, i810e, i815, i830M, 845G, 852GM/855GM, 865G, 915G, E7221 (i915), 915GM, 945G, 945GM, 945GME, Pineview GM, Pineview G, 965G, G35, 965Q, 946GZ, 965GM, 965GME/GLE, G33, Q35, Q33, GM45, 4 Series, G45/G43, Q45/Q43, G41, B43, Clarkdale, Arrandale, Sandybridge Desktop (GT1), Sandybridge Desktop (GT2), Sandybridge Desktop (GT2+), Sandybridge Mobile (GT1), Sandybridge Mobile (GT2), Sandybridge Mobile (GT2+), Ivybridge Mobile (GT1), Ivybridge Mobile (GT2), Ivybridge Desktop (GT1), Ivybridge Desktop (GT2), Ivybridge Server (GT1), and Ivybridge Server (GT2).

The Solaris open-source community developers provide additional driver support for Intel HD Graphics 4000/2500-based chipsets (aka Ivy Bridge), OpenGL 3.0/GLSL 1.30, and the new libva/VA-API library, enabling hardware-accelerated video decoding for the prevailing coding standards of today (MPEG-2, MPEG-4 ASP/H.263, MPEG-4 AVC/H.264, and VC-1/WMV3).

Microsoft Windows[edit]

GMA 900 on Windows[edit]

The GMA 900 is theoretically capable of running Windows Vista's Aero interface and is certified as DirectX 9 compliant. However, no WHQL certified WDDM driver has been made available. Presumably this is due to the lack of a "hardware scheduler" in the GPU.[53]

Many owners of GMA900 hardware believed they would be able to run Aero on their systems as early release candidates of Vista permitted XDDM drivers to run Aero. Intel, however, contends that Microsoft's final specs for Aero/WDDM certification did not permit releasing a WDDM driver for GMA900 (due to issues with the hardware scheduler, as mentioned above), so when the final version of Vista was released, no WDDM driver was released.[54] The last minute pulling of OpenGL capabilities from the GMA drivers for Windows Vista left a large number of GMA based workstations unable to perform basic 3D hardware acceleration with OpenGL and unable to run many Vista Premium applications such as Windows DVD Maker.

In Windows 8, Aero effects are enabled with VGA compatibility driver via software rendering. There are no native GMA900 drivers available for Windows 8 since XDDM support is removed from this operating system. On GMA900 based laptops with Windows 7, users may experience a serious bug related to the chipset's native backlight control method failing to change brightness, resulting in the brightness becoming stuck on a particular value after driver installation. The bug did not occur when Windows 7 was initially released to the public and is commonly observed after running Windows Update.

GMA 950 on Windows[edit]

This IGP is capable of displaying the Aero interface for Windows Vista. Drivers have shipped with Windows Vista since beta versions became available in mid-2006. It can also run Windows 7's Aero interface since Intel released drivers for Windows 7 in mid-June 2009.

The GMA 950 is integrated into many netbooks built on Intel 945GSE Express chipset, and is able to display a resolution up to 2048×1536 at 75 Hz utilizing up to 224 MB of shared memory.[55]

Most reviews of this IGP were negative, since many games (such as Splinter Cell: Chaos Theory or Oblivion) need Pixel Shader 2.0 or higher, which is supported in hardware, and Vertex Shader 2.0, which is only software-emulated. Other games, such as Crysis, will start, but with unacceptably low frame rates.

GMA X3000/X3100 on Windows[edit]

T&L and Vertex Shader 3.0 are supported by Intel's newest 15.6 drivers for Windows Vista as of September 2, 2007. XP support for VS3 and T&L was introduced on August 10, 2007. Intel announced in March 2007 that beta drivers would be available in June 2007.[56][57] On June 1, 2007, "pre-beta" (or early beta) drivers were released for Windows XP (but not for Vista).[58] Beta drivers for Vista and XP were released on June 19.[59] Since hardware T&L and vertex shading were enabled in the drivers, individual applications can be forced to fall back to software rendering,[60] which improves performance and compatibility in certain cases. Selection is based on testing by Intel and preselected in the driver .inf file.

Intel has released production-version drivers for 32-bit and 64-bit Windows Vista that enable Aero graphics. Intel introduced DirectX 10 for the X3100 and X3500 GPUs in the Vista 15.9 drivers, though any release of DX10 drivers for the X3000 is uncertain. WDDM 1.1 is supported by the X3100, but DXVA-HD is not.

OpenGL 2.0 support is available since Vista 15.11 drivers[61] and XP 14.36 drivers.[62]

Windows 8 ships with a driver for the X3100.[63]

GMA 500 on Windows[edit]

As of September 2010, the latest available driver revisions from the Intel website for Windows XP, Vista and 7 are:[64][65]

  • IEGD Version 5.1 for Windows NT, 2000, and XP (OpenGL only)
  • Version 3.3.0 for Windows XP. (D3D only)
  • Version 4.0.2 for Windows Vista.
  • Version 5.0.0.2030 for Windows 7.

Modern Gaming[edit]

The performance and functionality of GMA processors are limited, attaining the performance of only low-cost discrete GPUs at best.[66] Some features of games and other 3D applications may be unsupported by GMAs, particularly older ones. The GMA X3x00's unified shader design allows for more complete hardware functionality, but the line still has issues with some games and has significantly limited performance.[67]

Intel has put up a page with 'Known Issues & Solutions' for each version.[68] For Intel Graphics Media Accelerator Software Development concerns, there is the Integrated Graphics Software Development Forum.[69]

Microsoft Windows performance reviews[edit]

GMA X3000 review[edit]

A review conducted in April 2007 by The Tech Report determined that the GMA X3000 had performance comparable to the Nvidia GeForce 6150.[70] During that review the GMA X3000 was unable to run the PC games Battlefield 2 and Oblivion.[71] However, the ExtremeTech review found that games which aren't as graphically demanding, such as Sims 2 and Civilization 4, "look good" when the GMA X3000 is used to run them.[72]

Reviews performed by The Tech Report, ExtremeTech, and Anandtech all concluded that AMD's Radeon X1250 integrated graphics solution, based on the AMD 690G chipset, was a better choice than the GMA X3000 based on the G965 chipset, especially when considering 3D gaming performance and price.[70][73][74]

GMA X3500 review[edit]

In a review performed by Register Hardware in December 2007,[13] author Leo Waldock argued that because the GMA X3500 is not capable of running any PC game that requires DirectX 10, the addition of DirectX 10 support to the GMA X3500 was "irrelevant".[75] During that same review, the GMA X3500 was used to run the PC games Crysis and FEAR Extraction Point, where it was able to render only 4 and 14 frames per second respectively.[76] In the end the review concluded that overall the X3500 made "minimal advances" over the GMA X3000.[75]

GMA X4500 review[edit]

In a review published in May 2008, the GMA X4500 showed superior gaming performance compared to the lowest-end, one-year-older GeForce 8400M graphics card in some CPU-bound tests, while losing to the still low-end GeForce 8400M GS when paired with a slower CPU.[77]


 http://en.wikipedia.org/wiki/Intel_HD_Graphics

Intel HD Graphics

From Wikipedia, the free encyclopedia
 
 
Intel HD Graphics
Direct3D support Direct3D 11.1[1]
Shader Model 5.0[1]
OpenCL support OpenCL 1.2[1]
OpenGL support OpenGL 4.0 on Windows[1]
OpenGL 4.1 on Mac OS[2]
OpenGL 3.3 on Linux[3]
Predecessor Intel GMA
Successor na
Core i5 processor with integrated HD Graphics 2000

Intel HD Graphics is a series of Intel integrated graphics processors introduced in 2010 that are manufactured on the same die as the processor, together forming an accelerated processing unit.

 

 

History[edit]

Before the introduction of Intel HD Graphics, Intel integrated graphics were built into the motherboard's northbridge, as part of the Intel's Hub Architecture. This included Intel Extreme Graphics and the Intel Graphics Media Accelerator. As part of the Platform Controller Hub (PCH) design, the northbridge was eliminated and graphics processing was moved to the central processing unit (CPU).

The previous Intel graphics solutions, such as the Intel GMA, had a reputation of lacking in performance and features, and therefore they were considered not to be a good choice for more demanding graphics applications, such as 3D gaming. The performance increases brought by Intel HD Graphics made the products competitive with integrated graphics adapters made by other rivals (Nvidia and ATI/AMD). Intel HD Graphics, featuring minimal power consumption that is important in laptops, was capable enough that PC manufacturers often stopped offering discrete graphics options in their low-end and mid-range laptop lines.

In January 2010, the Clarkdale and Arrandale processors were released with Ironlake HD Graphics, branded as Celeron, Pentium, or Core.

In January 2011, the Sandy Bridge processors were released, introducing the "second generation" HD Graphics:

  • HD Graphics1 (6 execution units)
  • HD Graphics 2000 (6 execution units and additional features3)
  • HD Graphics 3000 (12 execution units and additional features3)

On April 24, 2012, Ivy Bridge was released, introducing the "third generation" of Intel's HD graphics:[4]

  • HD Graphics2 (6 execution units)
  • HD Graphics 2500 (6 execution units and additional features3)
  • HD Graphics 4000 (16 execution units and additional features3)

1 Celeron and Pentium have Intel HD, while Core i3 and above have either HD 2000 or HD 3000.

2 Celeron and Pentium have Intel HD, while Core i3 and above have either HD 2500 or HD 4000.

3 Those include hardware video encoding and HD postprocessing effects.

HD Graphics on some low-power mobile CPUs has limited video decoding support, while desktop CPUs do not have these limitations.

On September 12, 2012, Haswell CPUs were announced, with the following models of integrated GPUs:

  • HD Graphics (GT1, 10 execution units)
  • HD Graphics 4200, 4400, 4600, P4600, P4700 (GT2, 20 execution units)
  • HD Graphics 5000 (GT3, 40 execution units, twice the power-performance of HD4xxx for compute-limited workloads, 15 W TDP SKUs)
  • Iris Graphics 5100 (the same as HD Graphics 5000, 28 W TDP SKUs)
  • Iris Pro Graphics 5200 (GT3e, the same as GT3 but with addition of a large 128 MB embedded DRAM (eDRAM) cache to improve performance of bandwidth-limited workloads)

The 128 MB of eDRAM is on the same package as the CPU, but in a separate die manufactured in a different process. Intel refers to this as a Level 4 cache that is available to both CPU and GPU, naming it Crystal Well. Linux support for this eDRAM is expected in kernel version 3.12, by making the drm/i915 driver aware and capable of using it.[5][6][7]

Specifications[edit]

Intel Insider[edit]

Beginning with Sandy Bridge, the graphics processors include a media and content protection service called Intel Insider, which provides protection for the media within the processor.[8] Previously there was a similar technology called Protected Audio Video Path (PAVP).

Three active displays[edit]

Ivy Bridge[edit]

HD2500 and HD4000 GPUs in Ivy Bridge CPUs are advertised as supporting three active monitors, but many users have found that this does not work for them due to the chipsets only supporting two active monitors in many common configurations. The reason for this is that the chipsets only include two phase-locked loops (PLLs); a PLL generates a pixel clock at a certain frequency which is used to sync the timings of data being transferred between the GPU and displays.[9]

Therefore, three simultaneously active monitors can only be achieved by a hardware configuration that requires only two unique pixel clocks (see the sketch after this list), such as:

  • Using 2 or 3 active DisplayPort connections.[10] DisplayPort requires only a single pixel clock for all active connections, regardless of how many there are (this is not the case for non-DisplayPort connections, which each require their own pixel clock).
  • Using two non-DisplayPort connections of the same connection type (e.g., two HDMI connections) and the same clock frequency (e.g., connected to two identical monitors at the same resolution), so that a single pixel clock can be shared between both connections.[11]
  • Using the Embedded DisplayPort on a mobile CPU along with any two other outputs.[10]
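A minimal sketch of this rule (Python, illustrative only; the output descriptions and pixel-clock values are hypothetical examples, and embedded DisplayPort is treated as not consuming one of the two PLLs, per the last case above):

    # Illustrative simplification of the Ivy Bridge two-PLL limit described above.
    # Each output is (connection type, pixel clock in kHz); values are hypothetical.
    def fits_two_plls(outputs):
        needed = set()
        for conn_type, pixel_clock_khz in outputs:
            if conn_type == "eDP":
                continue                      # embedded DisplayPort clocks separately
            if conn_type == "DisplayPort":
                needed.add("DP")              # all active DP links share one pixel clock
            else:
                # non-DP outputs share a clock only if type and frequency match
                needed.add((conn_type, pixel_clock_khz))
        return len(needed) <= 2

    # Two identical HDMI monitors plus a DisplayPort monitor -> fits
    print(fits_two_plls([("HDMI", 148500), ("HDMI", 148500), ("DisplayPort", 148500)]))
    # Three different non-DisplayPort monitors -> does not fit
    print(fits_two_plls([("HDMI", 148500), ("DVI", 162000), ("VGA", 108000)]))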

Haswell[edit]

ASRock Z87 and H87 motherboards support three simultaneous displays.[12] Asus H87 is also advertised as supporting three independent monitors at once.[13]


http://en.wikipedia.org/wiki/Graphics_hardware_and_FOSS

Free and open-source graphics device driver

From Wikipedia, the free encyclopedia
  (Redirected from  Graphics hardware and FOSS)
 

A free and open-source graphics device driver is software that controls computer graphics hardware and supports graphics-rendering APIs, and is distributed at no cost with openly shared source code. Graphics device drivers are written for specific hardware to work within the context of a specific operating system kernel and to support a range of APIs used by applications to access the graphics hardware. They may also control output to the display, if the display driver is part of the graphics hardware. Most free and open-source graphics device drivers are developed via the Mesa project.

All hardware developers provide device drivers for their products for a range of operating systems, but some developers of graphics hardware provide no free and open-source drivers for their hardware, and provide little or no technical documentation to support independent development of free and open-source device drivers for their products. The free and open-source device drivers available for hardware whose developers support independent driver development are generally of much higher quality in terms of completeness, stability, security and performance than drivers for hardware that lacks such support.

Drivers without freely (and thus legally) available source code are commonly referred to as binary drivers. Binary drivers used in the context of operating systems that are subject to ongoing development and change, such as Linux, create problems for both end-users and package maintainers. These problems affect system stability, overall system security, and performance, and are the main reason for the independent development of free and open-source drivers. When no technical documentation is available, an understanding of the underlying hardware is often gained by "clean-room reverse engineering". Based on this understanding, device drivers may be written and legally published under any chosen software license.

There are rare and special cases where a manufacturer's driver source code is openly available on the Internet, but not under a free license. This means that the code can be studied and altered for personal use, but the altered (and usually even the original) source code cannot be freely distributed, so solutions to problems cannot be shared, significantly reducing the utility of such drivers in comparison to completely free and open-source drivers.

 

 

Problems with binary drivers[edit]

Illustration of the Linux graphics stack

Viewed from the free and open-source software developer's perspective[edit]

There are a number of objections to binary-only drivers: philosophical and ethical objections, with some feeling that drivers distributed without source code are against the beliefs of the free software movement, and very pragmatic objections regarding copyright, security, reliability and development concerns.

As part of a wider campaign against binary blobs, OpenBSD lead developer Theo de Raadt has pointed out that with a binary driver there is "no way to fix it when it breaks (and it will break)" and that once a product which relies on binary drivers is declared to be end-of-life by the manufacturer, it is effectively "broken forever."[1] The project has also asserted that binary drivers "hide bugs and workarounds for bugs,"[2] a comment that has been somewhat vindicated by flaws found in binary drivers, including an exploitable bug in Nvidia's 3D drivers discovered in October 2006 by Rapid7. It is speculated that this bug had existed since 2004, although Nvidia has denied this, asserting that the issue was only communicated to them in July 2006 and that the 2004 bug was a bug in X.Org, not in Nvidia's driver.[3]

Another problem with binary drivers is that they often do not work with current versions of open-source software, and almost never support development snapshots of open-source software; e.g., it is usually not directly possible for a developer to use Nvidia's or ATI's proprietary drivers with a development snapshot of an X server or of the Linux kernel. Furthermore, features like kernel mode-setting cannot be added to binary drivers by anyone but the vendors, which prevents their inclusion if the vendor lacks the capacity or interest.

In the Linux kernel development community, Linus Torvalds has made strong statements on the issue of binary-only modules, asserting: "I refuse to even consider tying my hands over some binary-only module", and continuing: "I want people to know that when they use binary-only modules, it's THEIR problem".[4] Another kernel developer, Greg Kroah-Hartman, has commented that a binary-only kernel module does not comply with the kernel's license—the GNU General Public License—it "just violates the GPL due to fun things like derivative works and linking and other stuff."[5]

Writer and computer scientist Peter Gutmann has expressed concerns that the digital rights management scheme in Microsoft's Windows Vista operating system may limit the availability of the documentation required to write open drivers as it "requires that the operational details of the device be kept confidential."[6]

In the case of binary drivers there are also objections due to free software philosophy, software quality and security concerns.[7] There are also concerns that the redistribution of closed source Linux kernel modules may be illegal.[8]

By choice, the Linux kernel has no Binary Kernel Interface.[9]

Viewed from the hardware developer's perspective[edit]

Application-specific integrated circuits (ASICs) and microprocessors designed to accelerate the calculations required by the rasterisation of 3D wire-frame models efficiently, in terms of both power consumption and silicon area, are actually fairly simple and straightforward.

When applications such as a 3D game engine or 3D computer graphics software shunt calculations from the CPU to the GPU, they usually use a special-purpose API such as OpenGL or Direct3D and do not address the hardware directly (see also Cell (microprocessor)). All the translation from API calls to actual GPU opcodes is done by the device driver, so the device driver contains a considerable amount of know-how and is constantly being optimized. This takes time and involves considerable financial investment. Even the mere leakage of driver source code, not published under a free license, would give competitors an advantage: they would retain their own know-how while gaining the know-how accumulated in the leaked code. Potential newcomers to the graphics acceleration ASIC business in particular would gain a considerable amount of know-how without bearing the very high costs of developing it.

On the desktop, the hardware scene was long dominated by the x86/x86-64 instruction set, the PC, and the GPUs available for the PC, and for a long time there were only three major competitors: Nvidia, AMD and Intel. The main competitive factors were raw performance in 3D computer games (which is achieved not only by more silicon but also by the efficient translation of API calls into GPU opcodes) and price for office hardware. The display driver is an inherent part of the graphics card, and so is the video decoder, i.e. silicon designed to assist the calculations necessary for decoding video streams, such as the Daala video codec. The PC market has also been dwindling for the last couple of years. It seems highly unlikely that new competitors will enter this market, and it is hard to say how much more know-how one company would gain by glancing at the source code of another company's drivers. For Linux, Intel maintains only a free and open-source driver.

The mobile sector, however, presents a contrast:

  • The functional blocks (the display driver ASIC, the 3D acceleration ASIC, the 2D acceleration ASIC, and the video decoding and video encoding ASICs) are separate semiconductor intellectual property (SIP) blocks, since the hardware devices are very different from each other. A portable media player, for example, requires a display driver and benefits from video decoding acceleration, but does not require any sort of 3D acceleration.
  • The development goal is not only raw 3D performance, but also system integration, power consumption and 2D capabilities. There is also an approach to abandon updating the display only on VSync and make better use of the possibilities of sample-and-hold technology to significantly lower power consumption.
  • There are more competitors in the market and there are newcomers (e.g., while Imagination has been around since before 2000, Vivante entered this market in 2005, and Ingenic has experimented with its own SIP solutions for graphics).
  • Usually there is no dedicated graphics memory; the CPU and the GPU share the main memory, which is considerably slower than the dedicated graphics memory on graphics cards for desktop PCs. Yet this can greatly benefit power consumption and performance when supported by the software architecture, because data does not have to be copied from one memory to the other.

The growing mobile market, the unsatisfied requirements of mobile devices, and the advantages that can be gained by the development of new techniques leave much more room for the existing competitors and for new competitors entering the market. Thus the SIP blocks and the software supporting them (the device drivers and perhaps even the software infrastructure for these device drivers, i.e. the entire graphics stack) tend to be treated with more secrecy and rushed to market.

Given that during the second quarter of 2013, 79.3% of smartphones sold worldwide were running some version of Android,[10] the Linux kernel is clearly dominant on smartphones. Hardware developers therefore have a huge incentive to deliver excellent Linux drivers for their hardware but, due to the competition, no incentive whatsoever to make these drivers free and open-source.

Projects like libhybris try to reuse existing Android Linux drivers on platforms other than Android. Additionally, there are ongoing efforts by independent developers to write free and open-source drivers.

Performance Comparison[edit]

GLX gears is not a benchmark

A widely known source for performance information is the free3d.org site,[11] which collects 3D performance information—specifically glxgears frame rates—submitted by users. On the basis of what it concedes is an inadequate benchmark,[12] the site currently lists ATI's Radeon HD 4670 as recommended for "best 3D performance." Additionally, Phoronix routinely runs benchmarks comparing free driver performance.

A comparison from April 29, 2013 between the FOSS and the proprietary drivers on both AMD and Nvidia is found here: Phoronix

Software architecture[edit]

Mesa / DRI and Gallium3D have different driver models. Both share a lot of free and open-source code.
An example matrix for implementing the Gallium3D driver model. Through the introduction of the Gallium3D Tracker Interface and the Gallium3D WinSys Interface, only 18 instead of 36 modules are required. Each WinSys module can work with each Gallium3D device driver module and with each State Tracker module.

The framework for graphics device drivers on Unix-like operating systems has been the constant object of development and thus change.

Video Electronics Standards Association[edit]

In general, the vesa driver supports most graphics cards without acceleration and with display resolutions limited to a set of resolutions programmed in the video BIOS by the manufacturer.

Also see: Kernel Documentation/fb or the Phoronix Test Suite

Direct Rendering Infrastructure[edit]

Gallium3D[edit]

Gallium3D splits the user-space driver into three components by introducing two additional interfaces: the Gallium3D State Tracker interface and the Gallium3D WinSys interface. This has certain advantages as well as disadvantages. Gallium3D drivers share a lot of code with DRI drivers, and some vendors, for example Intel, still write only DRI drivers. A lot of source code is of course shared between drivers implementing DRI and Gallium3D respectively.

Free and open-source API implementations[edit]

Mesa 3D is the only available free and open-source implementation of OpenGL, OpenGL ES, OpenVG, GLX, EGL and OpenCL.

Wine contains an implementation of Direct3D version 9. Another Wine component formerly translated Direct3D calls into OpenGL calls, working on top of the OpenGL implementation in Mesa 3D. Since the Gallium3D State Tracker for D3D9 became available, Wine has been adapted so that this translation no longer takes place: with the Gallium3D State Tracker for D3D9, device drivers have a direct interface to the Direct3D 9 API, increasing performance by a factor of two or more.

Annotation: Gallium3D State Trackers are not themselves an implementation of a 3D API like OpenGL or Direct3D; they are a component of the device driver, meant as an interface to an API. As of this writing, there is a fully functional State Tracker for Direct3D version 9 written in C++ and an unmaintained one for Direct3D versions 10 and 11 written in C.

History[edit]

The Linux graphics stack has undergone a long evolution over the years, with some unnecessary detours due to the X Window System core protocol. Understanding this history helps in comprehending the current design and its late arrival.

Free and open-source drivers[edit]


ATI/AMD[edit]

The proprietary driver by AMD is called fglrx and is distributed as part of AMD Catalyst for Linux. A current version can be downloaded from the Internet, and some Linux distributions contain it in their repositories.
The FOSS drivers for AMD/ATI GPUs are called radeon (xf86-video-ati or xserver-xorg-video-radeon). The radeon drivers provide 2D and Xv acceleration, with 3D support available for almost all supported cards. However, the drivers still have to load proprietary microcode into the GPU to enable hardware acceleration.[13] Full mode-setting support is also available. There is also the RadeonProgram, which aims to provide a community-maintained application compatibility database similar to the Wine project's AppDB.

The radeon 3D code is split into five parts: the radeon and r200 classic Mesa drivers, and the r300g, r600g and radeonsi Gallium3D drivers.

On October 1, 2013, AMD released the missing documentation for their newest products.[15]

An up-to-date feature matrix is available at Freedesktop.org: Radeon Feature Matrix

ATI released programming specifications for a number of their chipsets in 2007, 2008 and 2009.[16][17][18][19][20] AMD also does some active development and support for the radeon driver.[21] This is in direct contrast to AMD's main competitor in the graphics field, Nvidia, which offers its own proprietary driver similar to AMD Catalyst but does not provide any support or assistance to any free graphics initiatives.[22] AMD had just two engineers working full-time on the free drivers, namely Alex Deucher and Richard Li, although they decided to expand their free graphics team.[23] They later hired three more developers, with one working on the desktop graphics stack and the other two on embedded open-source priorities.[24] The developers in question have now been confirmed to be veteran graphics coders Michel Dänzer (taking over from Richard) and Christian König,[25] as well as Tom Stellard.

Recently, work on performance optimizations has greatly enhanced the 3D performance of the radeon driver, especially for users of R300g.[26] In some select cases, the R300g driver can even outperform a legacy Catalyst driver in terms of 3D performance.[27] Various options and tweaks can also be enabled to optimize the free driver's speed and offer more competitive performance.[28]

Nvidia[edit]

A screenshot of REnouveau, which is a program that collects data for most of nouveau's reverse engineering work

The proprietary driver by Nvidia is called geforce. A current version can be downloaded from the Internet and some Linux distributions, e.g. Ubuntu, contain it in their repositories.
One FOSS driver for Nvidia GPUs is called nv.[29] It is 2D-only, and some people, e.g. Dirk Hohndel,[30][31] claim it is obfuscated.[32][33] It was maintained by Nvidia itself. It features neither 3D acceleration nor motion compensation.[33][34] On March 26, 2010, Nvidia announced that it would stop supporting new technologies and GPUs in nv, stating that

  • Nvidia will continue to support the existing functionality and existing level of acceleration in the nv driver for existing GPUs, on existing, and (within reason) future, X server versions.
  • Nvidia will not support the xf86-video-nv driver on Fermi or later GPUs.
  • Nvidia will not support DisplayPort, on any GPU, in the xf86-video-nv driver.
—Andy Ritger, message to the x.org mailing list[35][36]

In the past, Nvidia provided documentation for the older RIVA TNT series of chipsets.

While the proprietary Nvidia drivers support the GLX interface (e.g. libgl-nvidia-glx),[37] Nvidia announced no plans to support the EGL interface, upon which e.g. Wayland relies. However, the proprietary Nvidia driver 331.13 BETA from October 4, 2013 does support the EGL API.[38]
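
As an illustration of what "supporting the EGL interface" means in practice, the following minimal C sketch shows a client (or compositor) initializing EGL and asking the installed driver for an OpenGL ES 2.0-capable configuration. It is not taken from any particular driver; it assumes the driver exposes a default display, links against -lEGL, and abbreviates error handling.

    #include <EGL/egl.h>
    #include <stdio.h>

    int main(void)
    {
        /* Ask the installed driver for its default EGL display. */
        EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
        if (dpy == EGL_NO_DISPLAY)
            return 1;

        EGLint major, minor;
        if (!eglInitialize(dpy, &major, &minor))
            return 1;
        printf("EGL %d.%d, vendor: %s\n",
               major, minor, eglQueryString(dpy, EGL_VENDOR));

        /* Request an OpenGL ES 2.0 capable configuration. */
        static const EGLint attribs[] = {
            EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
            EGL_NONE
        };
        EGLConfig cfg;
        EGLint count = 0;
        if (eglChooseConfig(dpy, attribs, &cfg, 1, &count))
            printf("matching configs: %d\n", count);

        eglTerminate(dpy);
        return 0;
    }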

Another FOSS driver for Nvidia GPUs is called nouveau. This project aims to reverse engineer Nvidia's GPUs and produce 3D acceleration for X.Org/Wayland using Gallium3D. Even at an early stage of development, several Linux distributions, starting with Fedora,[39] chose nouveau as the default FOSS driver for Nvidia cards.[40][41] On lower-end Nvidia graphics hardware, the nouveau driver can already offer 3D performance competitive with the official binary driver.[42][43] On March 26, 2012, the nouveau driver was marked as stable and promoted out of the staging area of the Linux kernel.[44]

On September 23, 2013,[45] Nvidia publicly announced that it would release some documentation about its GPUs.

An up-to-date feature matrix is available at Freedesktop.org: Nouveau Driver Feature Matrix

tegra-re is a project working on reverse-engineering Nvidia's Tegra series of GPUs for embedded devices (GitHub: tegra-re).

Nvidia distributes proprietary device drivers for Tegra through OEMs and as part of its "Linux for Tegra" (formerly "L4T") development kit. As of April 2012, owing to different "business needs" from those of its GeForce line of graphics cards, Nvidia and one of its embedded partners, Avionic Design GmbH from Germany, have also been working on submitting free and open-source drivers for Tegra upstream to the mainline Linux kernel.[46][47] Nvidia's co-founder and CEO laid out the Tegra processor roadmap using Ubuntu Unity at the GPU Technology Conference 2013.[48]

Intel[edit]

Intel has a long history of producing or commissioning open-source drivers for its graphics chips, with the exception of its PowerVR-based GMA chips, which severely lack documentation.[21] Driver development has been outsourced to Tungsten Graphics (acquired by VMware on November 26, 2008).

There are currently two versions of the Intel X.Org driver: the xorg-video-i810 driver, which supports the i810 and a number of more recent chipsets,[49] and the updated xorg-video-intel driver, which supports the same hardware, with recent versions 2.0 and 2.1 adding support for later chips, including the G33, Q33 and Q35.[50]

In addition, the xorg-video-intel driver performs native mode-setting: it does not use the video BIOS for switching video modes. As some BIOSes include only a limited range of modes, this provides more reliable access to the modes actually supported by Intel display hardware.
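
Instead of calling into the video BIOS, a native mode-setting driver obtains the list of valid modes from the kernel through the DRM/KMS interface. The following userspace C sketch shows roughly what that query looks like using libdrm; it is illustrative only, the /dev/dri/card0 path is an assumption, and it builds with the libdrm pkg-config flags.

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main(void)
    {
        /* Open the first DRM device node (path is an assumption). */
        int fd = open("/dev/dri/card0", O_RDWR);
        if (fd < 0)
            return 1;

        drmModeRes *res = drmModeGetResources(fd);
        if (!res) { close(fd); return 1; }

        /* Walk every connector and print the modes the kernel reports. */
        for (int i = 0; i < res->count_connectors; i++) {
            drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
            if (!conn)
                continue;
            for (int m = 0; m < conn->count_modes; m++)
                printf("connector %u: %s @ %u Hz\n",
                       conn->connector_id,
                       conn->modes[m].name,
                       conn->modes[m].vrefresh);
            drmModeFreeConnector(conn);
        }
        drmModeFreeResources(res);
        close(fd);
        return 0;
    }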

Unlike the developers of the radeon and nouveau drivers, Intel does not intend to adopt the Gallium3D framework for its graphics drivers.[51] Intel has also recently put work into bringing the performance of its free Linux drivers closer to that of their Windows counterparts, especially on Sandy Bridge hardware, where various optimizations now allow the free Intel driver to outperform the proprietary Windows drivers at certain tasks.[52][53][54] Some of these enhancements also benefit users of older hardware.[55]

Support for Intel's LLC (last-level cache; the L4 cache on Crystalwell and Iris Pro parts) was added in Linux kernel 3.12.[56][57]

Intel has some 20 to 30 full-time Linux graphics developers.[58]

Matrox[edit]

Matrox develops and manufactures these products:

Matrox provides free and open-source drivers for all of its chipsets older than the G550; chipsets newer than the G550 are supported only by a closed-source driver.

S3 Graphics[edit]

S3 Graphics develops these products:

ARM[edit]

ARM Holdings is a fabless semiconductor company that licenses SIP cores. It is best known for licensing the ARM instruction set and CPUs based on it, but it also develops and licenses the Mali series of GPUs. On January 21, 2012, Phoronix reported that Luc Verhaegen was driving a reverse-engineering attempt aimed at the ARM Holdings Mali series of GPUs, specifically the Mali-200 and Mali-400 versions. The reverse-engineering project was presented at FOSDEM on February 4, 2012.[59][60] On February 2, 2013, Verhaegen demonstrated Quake III Arena in timedemo mode running on top of the Lima driver.[61]

Imagination Technologies[edit]

Imagination Technologies is a fabless semiconductor company that licenses SIP cores. It is known for licensing the MIPS instruction set and CPUs based on it, but probably even more for developing and licensing the PowerVR series of GPUs. There are a couple of PowerVR-based Intel GPUs, but PowerVR is most widely used in mobile SoCs. Imagination Technologies provides neither a FOSS driver nor any documentation for its products. Because of the PowerVR's wide deployment in embedded devices, the Free Software Foundation has put reverse engineering of the PowerVR driver on its high-priority project list.[62]

GNU high-priority projects: PowerVR

Vivante[edit]

Vivante is a fabless semiconductor company that licenses SIP cores and develops the GCxxxx series of GPUs. Vivante's proprietary, closed-source Linux driver consists of kernel-space and user-space parts. While the kernel component is available as open source (GPL), the user-space components, consisting of the GLES(2) implementations and a HAL library, are not; these contain the bulk of the driver logic.
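
As an illustration of that split, the closed user-space part is simply a set of shared libraries from which applications resolve the GLES2 entry points at run time. The C sketch below assumes the conventional libGLESv2.so name for the vendor blob (link with -ldl); the library name and its presence on the system are assumptions, not Vivante-specific facts.

    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        /* The GLES2 implementation is a user-space shared object; the
         * name "libGLESv2.so" is the conventional one and is assumed here. */
        void *blob = dlopen("libGLESv2.so", RTLD_NOW);
        if (!blob) {
            fprintf(stderr, "no GLES2 library found: %s\n", dlerror());
            return 1;
        }
        /* Entry points such as glCreateShader live entirely inside this
         * user-space driver, not in the GPL kernel component. */
        void *sym = dlsym(blob, "glCreateShader");
        printf("glCreateShader resolved at %p\n", sym);
        dlclose(blob);
        return 0;
    }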

Wladimir J. van der Laan worked out and documented the state bits, command stream and shader ISA by studying how these blobs work, examining and manipulating command stream dumps. The Gallium3D driver etna_viv is being written based upon this documentation. Wladimir's work was inspired by libv's Lima project. Due to the simplicity of the Vivante hardware, etna_viv has progressed more quickly than similar reverse-engineering projects; it has produced a functional but unoptimized Gallium3D driver that has surpassed Vivante's own proprietary code in some benchmarks. It supports Vivante's GC400, GC800, GC1000, GC2000, GC4000, GC5000 and GC6000 series.[63]

Qualcomm[edit]

Qualcomm develops and manufactures the Adreno (formerly ATI Imageon) GPU series, mostly as part of its Snapdragon system on a chip. In 2012, Phoronix and Slashdot reported that Rob Clark was working on reverse-engineered drivers for the Adreno GPU series.[64][65] In a referenced blog post, Clark wrote that he was doing the project in his spare time, that the Qualcomm platform was his only viable target for working on open 3D graphics since his employers (Texas Instruments and Linaro) were affiliated with the Imagination PowerVR and ARM Mali cores that would otherwise have been his primary targets, that he already had working command streams for 2D support, and that 3D commands seemed to have the same characteristics.[66] The driver code was published on Gitorious under the name freedreno.[67] It has since been moved to Mesa.[68][69] In 2012, a working shader assembler was completed,[70] and demonstration versions were developed for texture mapping[71] and Phong shading[72] using the reverse-engineered shader compiler. At FOSDEM on February 2, 2013, Clark demonstrated freedreno with Gallium3D support running desktop compositing, the XBMC media player and Quake III Arena.[73] As of September 2013, freedreno has been adopted into the mainline Linux kernel and will be part of Linux 3.12.[74]

Broadcom[edit]

Broadcom develops and designs the VideoCore GPU series as part of its SoCs. Because it is used in the Raspberry Pi, there has been strong interest in a FOSS driver for VideoCore.[75] On October 24, 2012, the Raspberry Pi Foundation, in cooperation with Broadcom, announced that it had open-sourced "all the ARM (CPU) code that drives the GPU". However, the released code is close to the minimum needed to make such an announcement: the open-source components allow message passing between the ARM CPU and the VideoCore, but offer little insight into the VideoCore itself and little additional programmability. The VideoCore GPU runs an RTOS that handles the real processing; all of the actual video acceleration is done by this RTOS firmware, coded for the proprietary GPU, and that firmware was not open-sourced.[76] Further, since there is neither a toolchain that can target the proprietary GPU nor documentation of its instruction set, no advantage could be taken even if the firmware source code were made available. The videocoreiv project[77] tries to document the VideoCore GPUs; based upon that documentation a driver could be written.
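
The message passing in question takes the form of a "property tag" mailbox: the ARM side fills a buffer of 32-bit words and hands it to the firmware, which writes its answers back into the same buffer. The C sketch below is illustrative only; the /dev/vcio device node, the ioctl number and the "get firmware revision" tag are assumptions drawn from Broadcom's published examples and may differ on a given system.

    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    /* Assumed ioctl for the mailbox character device, per published code. */
    #define MBOX_PROPERTY _IOWR(100, 0, char *)

    int main(void)
    {
        uint32_t msg[7] = {
            sizeof(msg),   /* total buffer size in bytes        */
            0,             /* request code                      */
            0x00000001,    /* tag: get firmware revision        */
            4,             /* value buffer size in bytes        */
            0,             /* tag request code                  */
            0,             /* value (filled in by the firmware) */
            0              /* end tag                           */
        };

        int fd = open("/dev/vcio", 0);   /* assumed device node */
        if (fd < 0)
            return 1;
        if (ioctl(fd, MBOX_PROPERTY, msg) < 0) {
            close(fd);
            return 1;
        }
        printf("VideoCore firmware revision: 0x%08x\n", msg[5]);
        close(fd);
        return 0;
    }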

Other vendors[edit]

SiS and VIA have both shown limited interest in and communication regarding open-source drivers; however, both have released source code that FOSS developers have later integrated into X.Org.[33] In July 2008, VIA opened up documentation for its products to improve its image within the Linux and open-source communities.[78] However, VIA has so far failed to work with the open-source community to provide documentation and a working DRM driver, leaving expectations of Linux support unfulfilled.[79] On January 6, 2011, it was announced that VIA was no longer interested in supporting free graphics initiatives.[80]

DisplayLink has announced an open-source project called libdlo with the goal of bringing support for its USB graphics technology to Linux and other platforms. The code is available under the LGPL license,[81] but it has not yet been integrated into an X.Org driver. DisplayLink graphics support is available through the mainline kernel udlfb driver (with fbdev) and the udl DRM driver, which as of March 2012 was only available in the drm-next tree.
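
As a sketch of what "with fbdev" means, a program can open the framebuffer node that a driver such as udlfb registers and query its geometry through the standard fbdev ioctls. The /dev/fb0 path below is an assumption; a DisplayLink device often appears as a later node such as /dev/fb1.

    #include <fcntl.h>
    #include <linux/fb.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    int main(void)
    {
        /* Open a framebuffer device node (path is an assumption). */
        int fd = open("/dev/fb0", O_RDONLY);
        if (fd < 0)
            return 1;

        struct fb_var_screeninfo var;
        struct fb_fix_screeninfo fix;
        if (ioctl(fd, FBIOGET_VSCREENINFO, &var) ||
            ioctl(fd, FBIOGET_FSCREENINFO, &fix)) {
            close(fd);
            return 1;
        }
        /* Print the driver name and the current mode geometry. */
        printf("%.16s: %ux%u, %u bpp, line length %u bytes\n",
               fix.id, var.xres, var.yres, var.bits_per_pixel, fix.line_length);
        close(fd);
        return 0;
    }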

Other, non-hardware vendors also lend a hand to free graphics initiatives. Red Hat has employees working full-time on several free and open-source software projects, including two full-time developers on the free radeon driver (David Airlie and Jérôme Glisse[82]) and one full-time developer on the free nouveau driver.[citation needed] In addition, the Fedora Project runs a "Fedora Graphics Test Week" event before the launch of each new distribution version in order to test the free graphics drivers rigorously.[83] Other companies that have provided development or support include Novell and VMware.

Open hardware projects[edit]

Project VGA assembled PCB
Open Graphics Project assembled PCB

Project VGA aims to create a low-budget, open-source, VGA-compatible video card. All the information needed to build one is available, but at present there appears to be no active development. Some specifications:[84]

  • PCI bus interface (32-bit, 33/66 MHz, 3.3/5 V compatible)
  • Xilinx Spartan-3 S400 FPGA (aiming for ~100 MHz)
  • 16 MB SDRAM (aiming for ~166 MHz)
  • Onboard programmer with USB interface
  • Analog (HD15) VGA output connector.

The Open Graphics Project is another attempt to create an open-hardware GPU. The Open Graphics Device v1 features dual DVI-I outputs and a 100-pin IDC connector. In September 2010, the first 25 OGD1 boards were made available for grant application and for purchase ($750).[85]

The Milkymist system-on-chip, targeted at embedded graphics instead of desktop computers, supports a VGA output, a limited vertex shader and a 2D texturing unit.[86]

See also[edit]