
Shader

In computer graphics, a shader is a computer program that calculates the appropriate levels of light, darkness, and color during the rendering of a 3D scene—a process known as shading. Shaders have evolved to perform a variety of specialized functions in computer graphics special effects and video post-processing, as well as general-purpose computing on graphics processing units.

(Image caption) Shaders are most commonly used to produce lit and shadowed areas in the rendering of 3D models. Phong shading (right) is an improvement on Gouraud shading, and was one of the first computer shading models developed after basic flat shading (left), greatly enhancing the appearance of curved surfaces in renders.
(Image caption) Another use of shaders is for special effects, even on 2D images such as a photo from a webcam. The unaltered, unshaded image is on the left; the same image with a shader applied is on the right. This shader works by replacing all light areas of the image with white and all dark areas with a brightly colored texture.

Traditional shaders calculate rendering effects on graphics hardware with a high degree of flexibility. Most shaders are coded for (and run on) a graphics processing unit (GPU),[1] though this is not a strict requirement. Shading languages are used to program the GPU's rendering pipeline, which has mostly superseded the fixed-function pipeline of the past that only allowed for common geometry transforming and pixel-shading functions; with shaders, customized effects can be used. The position and color (hue, saturation, brightness, and contrast) of all pixels, vertices, and/or textures used to construct a final rendered image can be altered using algorithms defined in a shader, and can be modified by external variables or textures introduced by the computer program calling the shader.

Shaders are used widely in cinema post-processing, computer-generated imagery, and video games to produce a range of effects. Beyond simple lighting models, more complex uses of shaders include: altering the hue, saturation, brightness (HSL/HSV) or contrast of an image; producing blur, light bloom, volumetric lighting, normal mapping (for depth effects), bokeh, cel shading, posterization, bump mapping, distortion, chroma keying (for so-called "bluescreen/greenscreen" effects), edge and motion detection, as well as psychedelic effects such as those seen in the demoscene.

History

The term "shader" in this sense was introduced to the public by Pixar with version 3.0 of their RenderMan Interface Specification, originally published in May 1988.[2]

As graphics processing units evolved, major graphics software libraries such as OpenGL and Direct3D began to support shaders. The first shader-capable GPUs only supported pixel shading, but vertex shaders were quickly introduced once developers realized the power of shaders. The first video card with a programmable pixel shader was the Nvidia GeForce 3 (NV20), released in 2001.[3] Geometry shaders were introduced with Direct3D 10 and OpenGL 3.2. Eventually, graphics hardware evolved toward a unified shader model.

Design

Shaders are simple programs that describe the traits of either a vertex or a pixel. Vertex shaders describe the attributes (position, texture coordinates, colors, etc.) of a vertex, while pixel shaders describe the traits (color, z-depth and alpha value) of a pixel. A vertex shader is called for each vertex in a primitive (possibly after tessellation); thus one vertex in, one (updated) vertex out. Each vertex is then rendered as a series of pixels onto a surface (block of memory) that will eventually be sent to the screen.

Shaders replace a section of the graphics hardware typically called the Fixed Function Pipeline (FFP), so-called because it performs lighting and texture mapping in a hard-coded manner. Shaders provide a programmable alternative to this hard-coded approach.[4]

The basic graphics pipeline is as follows:

  • The CPU sends instructions (compiled shading language programs) and geometry data to the graphics processing unit, located on the graphics card.
  • Within the vertex shader, the geometry is transformed.
  • If a geometry shader is present in the graphics processing unit and active, it can modify the geometry in the scene.
  • If a tessellation shader is present in the graphics processing unit and active, the geometry in the scene can be subdivided.
  • The calculated geometry is triangulated (subdivided into triangles).
  • Triangles are broken down into fragment quads (one fragment quad is a 2 × 2 fragment primitive).
  • Fragment quads are modified according to the fragment shader.
  • The depth test is performed; fragments that pass are written to the frame buffer and may be blended with its existing contents.

The graphics pipeline uses these steps to transform three-dimensional (or two-dimensional) data into useful two-dimensional data for display, in general a large pixel matrix or "frame buffer".
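
For illustration, the two original programmable stages can be sketched in GLSL as a minimal vertex/fragment pair (the attribute and uniform names here are illustrative, not part of any standard):

    // Vertex shader: transforms each incoming vertex to clip space.
    #version 330 core
    layout(location = 0) in vec3 inPosition;   // per-vertex position from the CPU
    uniform mat4 modelViewProjection;          // combined transform supplied by the host program
    void main() {
        gl_Position = modelViewProjection * vec4(inPosition, 1.0);
    }

    // Fragment shader: gives every covered fragment a constant color.
    #version 330 core
    out vec4 fragColor;
    void main() {
        fragColor = vec4(1.0, 0.5, 0.2, 1.0);  // opaque orange
    }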

Types

There are three types of shaders in common use (pixel, vertex, and geometry shaders), with several more recently added. While older graphics cards utilize separate processing units for each shader type, newer cards feature unified shaders which are capable of executing any type of shader. This allows graphics cards to make more efficient use of processing power.

2D shaders

2D shaders act on digital images, also called textures in the field of computer graphics. They modify attributes of pixels. 2D shaders may take part in rendering 3D geometry. Currently the only type of 2D shader is a pixel shader.

Pixel shaders

Pixel shaders, also known as fragment shaders, compute color and other attributes of each "fragment": a unit of rendering work affecting at most a single output pixel. The simplest kinds of pixel shaders output one screen pixel as a color value; more complex shaders with multiple inputs/outputs are also possible.[5] Pixel shaders range from simply always outputting the same color, to applying a lighting value, to doing bump mapping, shadows, specular highlights, translucency and other phenomena. They can alter the depth of the fragment (for Z-buffering), or output more than one color if multiple render targets are active. In 3D graphics, a pixel shader alone cannot produce some kinds of complex effects because it operates only on a single fragment, without knowledge of a scene's geometry (i.e. vertex data). However, pixel shaders do have knowledge of the screen coordinate being drawn, and can sample the screen and nearby pixels if the contents of the entire screen are passed as a texture to the shader. This technique can enable a wide variety of two-dimensional postprocessing effects such as blur, or edge detection/enhancement for cartoon/cel shaders. Pixel shaders may also be applied in intermediate stages to any two-dimensional images—sprites or textures—in the pipeline, whereas vertex shaders always require a 3D scene. For instance, a pixel shader is the only kind of shader that can act as a postprocessor or filter for a video stream after it has been rasterized.
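
A minimal GLSL sketch of such a postprocessing pixel shader, along the lines of the webcam effect described in the caption above, might look as follows (the sampler names sceneTexture and patternTexture are hypothetical):

    #version 330 core
    in vec2 texCoord;                 // screen-space coordinate from the vertex stage
    out vec4 fragColor;
    uniform sampler2D sceneTexture;   // the rendered screen, passed in as a texture
    uniform sampler2D patternTexture; // a brightly colored pattern
    void main() {
        vec3 c = texture(sceneTexture, texCoord).rgb;
        float luma = dot(c, vec3(0.2126, 0.7152, 0.0722));  // perceived brightness
        // Light areas become white; dark areas are replaced by the pattern.
        fragColor = (luma > 0.5) ? vec4(1.0) : texture(patternTexture, texCoord * 8.0);
    }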

3D shaders

3D shaders act on 3D models or other geometry but may also access the colors and textures used to draw the model or mesh. Vertex shaders are the oldest type of 3D shader, generally making modifications on a per-vertex basis. Newer geometry shaders can generate new vertices from within the shader. Tessellation shaders are the newest 3D shaders; they act on batches of vertices all at once to add detail—such as subdividing a model into smaller groups of triangles or other primitives at runtime, to improve things like curves and bumps, or change other attributes.

Vertex shaders

Vertex shaders are the most established and common kind of 3D shader and are run once for each vertex given to the graphics processor. The purpose is to transform each vertex's 3D position in virtual space to the 2D coordinate at which it appears on the screen (as well as a depth value for the Z-buffer).[6] Vertex shaders can manipulate properties such as position, color and texture coordinates, but cannot create new vertices. The output of the vertex shader goes to the next stage in the pipeline, which is either a geometry shader if present, or the rasterizer. Vertex shaders can enable powerful control over the details of position, movement, lighting, and color in any scene involving 3D models.
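
For illustration, a GLSL vertex shader that both performs the standard transform and manipulates vertex positions (here with a simple wave displacement; the uniform names are illustrative) might be sketched as:

    #version 330 core
    layout(location = 0) in vec3 inPosition;
    layout(location = 1) in vec2 inTexCoord;
    uniform mat4 modelViewProjection;  // transform to clip space
    uniform float time;                // animation time supplied by the host program
    out vec2 texCoord;
    void main() {
        vec3 p = inPosition;
        p.y += 0.1 * sin(4.0 * p.x + time);  // displace vertices in a moving wave
        gl_Position = modelViewProjection * vec4(p, 1.0);
        texCoord = inTexCoord;               // pass through for later stages
    }

Note that the shader moves existing vertices but cannot emit new ones; that capability belongs to geometry and tessellation shaders.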

Geometry shaders

Geometry shaders were introduced in Direct3D 10 and OpenGL 3.2, and were formerly available in OpenGL 2.0+ through extensions.[7] This type of shader can generate new graphics primitives, such as points, lines, and triangles, from those primitives that were sent to the beginning of the graphics pipeline.[8]

Geometry shader programs are executed after vertex shaders. They take as input a whole primitive, possibly with adjacency information. For example, when operating on triangles, the three vertices are the geometry shader's input. The shader can then emit zero or more primitives, which are rasterized and their fragments ultimately passed to a pixel shader.

Typical uses of a geometry shader include point sprite generation, geometry tessellation, shadow volume extrusion, and single-pass rendering to a cube map. A typical real-world example of the benefits of geometry shaders is automatic mesh complexity modification: a series of line strips representing control points for a curve is passed to the geometry shader and, depending on the complexity required, the shader can automatically generate extra lines, each of which provides a better approximation of the curve.
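
As a GLSL sketch of the point-sprite case (the spriteSize uniform is illustrative), a geometry shader can expand each incoming point into a quad:

    #version 330 core
    layout(points) in;                             // one point in...
    layout(triangle_strip, max_vertices = 4) out;  // ...a four-vertex quad out
    uniform float spriteSize;
    out vec2 texCoord;
    void main() {
        vec4 center = gl_in[0].gl_Position;
        vec2 corners[4] = vec2[](vec2(-1,-1), vec2(1,-1), vec2(-1,1), vec2(1,1));
        for (int i = 0; i < 4; ++i) {
            gl_Position = center + vec4(corners[i] * spriteSize, 0.0, 0.0);
            texCoord = corners[i] * 0.5 + 0.5;  // map corners to [0,1] texture space
            EmitVertex();
        }
        EndPrimitive();
    }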

Tessellation shaders

As of OpenGL 4.0 and Direct3D 11, a new shader class called a tessellation shader has been added. It adds two new shader stages to the traditional model: tessellation control shaders (also known as hull shaders) and tessellation evaluation shaders (also known as domain shaders), which together allow simpler meshes to be subdivided into finer meshes at run time according to a mathematical function. The function can be related to a variety of variables, most notably the distance from the viewing camera, allowing active level-of-detail scaling. With this, objects close to the camera get fine detail while objects further away get coarser meshes, yet seem comparable in quality. It can also drastically reduce required mesh bandwidth by allowing meshes to be refined once inside the shader units instead of downsampling very complex ones from memory. Some algorithms can upsample any arbitrary mesh, while others allow for "hinting" in meshes to dictate the most characteristic vertices and edges.
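
For illustration, the distance-based level-of-detail decision can be sketched in a GLSL tessellation control shader (the cameraPosition uniform and the scaling constant are illustrative):

    #version 400 core
    layout(vertices = 3) out;  // one output patch of 3 control points
    uniform vec3 cameraPosition;
    void main() {
        // Pass the patch's control points through unchanged.
        gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
        if (gl_InvocationID == 0) {
            float d = distance(cameraPosition, gl_in[0].gl_Position.xyz);
            float level = clamp(64.0 / d, 1.0, 64.0);  // finer subdivision when closer
            gl_TessLevelOuter[0] = level;
            gl_TessLevelOuter[1] = level;
            gl_TessLevelOuter[2] = level;
            gl_TessLevelInner[0] = level;
        }
    }

A matching tessellation evaluation shader (not shown) would then position the newly generated vertices, for example by interpolating across the patch or sampling a displacement map.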

Primitive and Mesh shaders

Circa 2017, the AMD Vega microarchitecture added support for a new shader stage—primitive shaders—somewhat akin to compute shaders with access to the data necessary to process geometry.[9][10] Similarly, Nvidia introduced mesh and task shaders with its Turing microarchitecture in 2018 which provide similar functionality and like AMD's primitive shaders are also modelled after compute shaders.[11][12]

In 2020, AMD and Nvidia released the RDNA 2 and Ampere microarchitectures, which both support mesh shading through DirectX 12 Ultimate.[13] Mesh shaders allow the GPU to handle more complex algorithms, offloading more work from the CPU to the GPU, and in algorithm-intense rendering can increase the frame rate or the number of triangles in a scene by an order of magnitude.[14] Intel announced that Intel Arc Alchemist GPUs shipping in Q1 2022 would support mesh shaders.[15]
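
In GLSL, mesh shaders are exposed through vendor and cross-vendor extensions (NV_mesh_shader, later EXT_mesh_shader). A minimal sketch using the EXT builtins, emitting a single hard-coded triangle, might look as follows:

    #version 450
    #extension GL_EXT_mesh_shader : require
    layout(local_size_x = 1) in;  // one thread per workgroup, for simplicity
    layout(triangles, max_vertices = 3, max_primitives = 1) out;
    void main() {
        SetMeshOutputsEXT(3, 1);  // declare 3 vertices and 1 triangle
        gl_MeshVerticesEXT[0].gl_Position = vec4(-0.5, -0.5, 0.0, 1.0);
        gl_MeshVerticesEXT[1].gl_Position = vec4( 0.5, -0.5, 0.0, 1.0);
        gl_MeshVerticesEXT[2].gl_Position = vec4( 0.0,  0.5, 0.0, 1.0);
        gl_PrimitiveTriangleIndicesEXT[0] = uvec3(0, 1, 2);
    }

In practice a mesh shader would read compact mesh data from buffers and emit many vertices and primitives per workgroup, which is what lets this stage replace the separate vertex, geometry, and tessellation stages.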

Ray tracing shaders

Ray tracing shaders are supported by Microsoft via DirectX Raytracing, by the Khronos Group via Vulkan, GLSL, and SPIR-V,[16] and by Apple via Metal.
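
For illustration, a minimal ray generation shader in Vulkan-flavored GLSL (GL_EXT_ray_tracing) might be sketched as follows; the binding layout is illustrative, and the payload is assumed to be filled in by separate closest-hit and miss shaders (not shown):

    #version 460
    #extension GL_EXT_ray_tracing : require
    layout(binding = 0) uniform accelerationStructureEXT topLevelAS;
    layout(binding = 1, rgba8) uniform image2D outImage;
    layout(location = 0) rayPayloadEXT vec3 hitColor;  // written by hit/miss shaders
    void main() {
        // One shader invocation per output pixel.
        vec2 uv = (vec2(gl_LaunchIDEXT.xy) + 0.5) / vec2(gl_LaunchSizeEXT.xy);
        vec3 origin = vec3(0.0, 0.0, -2.0);
        vec3 dir = normalize(vec3(uv * 2.0 - 1.0, 1.0));
        hitColor = vec3(0.0);
        traceRayEXT(topLevelAS, gl_RayFlagsOpaqueEXT, 0xFF,
                    0, 0, 0, origin, 0.001, dir, 100.0, 0);
        imageStore(outImage, ivec2(gl_LaunchIDEXT.xy), vec4(hitColor, 1.0));
    }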

Compute shaders

Compute shaders are not limited to graphics applications but use the same execution resources for GPGPU. They may be used in graphics pipelines, e.g., for additional stages in animation or lighting algorithms (such as tiled forward rendering). Some rendering APIs allow compute shaders to easily share data resources with the graphics pipeline.
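
A minimal GLSL compute shader sketch, inverting an image entirely outside the graphics pipeline (the binding numbers are illustrative, and the dispatch is assumed to cover the image exactly):

    #version 430
    layout(local_size_x = 16, local_size_y = 16) in;  // 16x16 threads per workgroup
    layout(rgba8, binding = 0) uniform readonly image2D srcImage;
    layout(rgba8, binding = 1) uniform writeonly image2D dstImage;
    void main() {
        ivec2 p = ivec2(gl_GlobalInvocationID.xy);       // one thread per pixel
        vec4 c = imageLoad(srcImage, p);
        imageStore(dstImage, p, vec4(1.0 - c.rgb, c.a)); // invert the color channels
    }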

Parallel processing

Shaders are written to apply transformations to a large set of elements at a time, for example, to each pixel in an area of the screen, or for every vertex of a model. This is well suited to parallel processing, and most modern GPUs have multiple shader pipelines to facilitate this, vastly improving computation throughput.

A programming model with shaders is similar to a higher-order function for rendering: the shaders act as arguments, and the pipeline provides a specific dataflow between intermediate results, enabling both data parallelism (across pixels, vertices, etc.) and pipeline parallelism (between stages). (See also MapReduce.)

Programming

The language in which shaders are programmed depends on the target environment. The official OpenGL and OpenGL ES shading language is OpenGL Shading Language, also known as GLSL, and the official Direct3D shading language is High-Level Shader Language, also known as HLSL. Cg, a third-party shading language which outputs both OpenGL and Direct3D shaders, was developed by Nvidia; however, it has been deprecated since 2012. Apple released its own shading language, Metal Shading Language, as part of the Metal framework.

GUI shader editors

Modern video game development platforms such as Unity, Unreal Engine and Godot increasingly include node-based editors that can create shaders without the need for actual code; the user is instead presented with a directed graph of connected nodes that allow users to direct various textures, maps, and mathematical functions into output values like the diffuse color, the specular color and intensity, roughness/metalness, height, normal, and so on. Automatic compilation then turns the graph into an actual, compiled shader.

See also

  • GLSL
  • SPIR-V
  • HLSL
  • Compute kernel
  • Shading language
  • GPGPU
  • List of common shading algorithms
  • Vector processor

References

  1. ^ "LearnOpenGL - Shaders". learnopengl.com. Retrieved November 12, 2019.
  2. ^ "The RenderMan Interface Specification".
  3. ^ Lilly, Paul (May 19, 2009). "From Voodoo to GeForce: The Awesome History of 3D Graphics". PC Gamer – via www.pcgamer.com.
  4. ^ "ShaderWorks' update - DirectX Blog". August 13, 2003.
  5. ^ "GLSL Tutorial – Fragment Shader". June 9, 2011.
  6. ^ "GLSL Tutorial – Vertex Shader". June 9, 2011.
  7. ^ Geometry Shader - OpenGL. Retrieved on December 21, 2011.
  8. ^ "Pipeline Stages (Direct3D 10) (Windows)". msdn.microsoft.com.
  9. ^ "Radeon RX Vega Revealed: AMD promises 4K gaming performance for $499 - Trusted Reviews". July 31, 2017.
  10. ^ "The curtain comes up on AMD's Vega architecture". January 5, 2017.
  11. ^ "NVIDIA Turing Architecture In-Depth". September 14, 2018.
  12. ^ "Introduction to Turing Mesh Shaders". September 17, 2018.
  13. ^ "Announcing DirectX 12 Ultimate". DirectX Developer Blog. March 19, 2020. Retrieved May 25, 2021.
  14. ^ "Realistic Lighting in Justice with Mesh Shading". NVIDIA Developer Blog. May 21, 2021. Retrieved May 25, 2021.
  15. ^ Smith, Ryan. "Intel Architecture Day 2021: A Sneak Peek At The Xe-HPG GPU Architecture". www.anandtech.com.
  16. ^ "Vulkan Ray Tracing Final Specification Release". Blog. Khronos Group. November 23, 2020. Retrieved 2021-02-22.

Further reading

  • Upstill, Steve (1990). The RenderMan Companion: A Programmer's Guide to Realistic Computer Graphics. Addison-Wesley. ISBN 0-201-50868-0.
  • Ebert, David S.; Musgrave, F. Kenton; Peachey, Darwyn; Perlin, Ken; Worley, Steven (1994). Texturing and Modeling: A Procedural Approach. AP Professional. ISBN 0-12-228730-4.
  • Fernando, Randima; Kilgard, Mark (2003). The Cg Tutorial: The Definitive Guide to Programmable Real-Time Graphics. Addison-Wesley Professional. ISBN 0-321-19496-9.
  • Rost, Randi J. (2004). OpenGL Shading Language. Addison-Wesley Professional. ISBN 0-321-19789-5.

External links

  • OpenGL geometry shader extension
  • Riemer's DirectX & HLSL Tutorial: HLSL Tutorial using DirectX with much sample code
  • Pipeline Stages (Direct3D 10)
