What is a vertex/fragment shader, and what is mapping of peripheral vision to the screen?

Tags:    vr unity3d 3d

Hi :)

I'm reading up on graphics (3D/2D) and game development. I've run into a few things I don't quite understand, and I hope you can help with them.

What is a vertex shader and a fragment shader?

My guess: a vertex shader "plays" with the position of a game object, and its output is a fragment shader's input. A fragment shader "plays" with the color of each individual pixel.

Mapping of peripheral vision to the screen?

I have no guess as to what that is :)




1 answer posted in this thread is shown below
https://en.wikipedia.org/wiki/Shader#Vertex_shaders says:

Vertex shaders are the most established and common kind of 3d shader and are run once for each vertex given to the graphics processor. The purpose is to transform each vertex's 3D position in virtual space to the 2D coordinate at which it appears on the screen (as well as a depth value for the Z-buffer). Vertex shaders can manipulate properties such as position, color and texture coordinate, but cannot create new vertices. The output of the vertex shader goes to the next stage in the pipeline, which is either a geometry shader if present, or the rasterizer. Vertex shaders can enable powerful control over the details of position, movement, lighting, and color in any scene involving 3D models.
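
To make that concrete in Unity terms, here is a minimal sketch of the vertex half of a shader pass (Cg/HLSL inside a ShaderLab CGPROGRAM block). It only does the position transform the quote describes; the struct and function names are just the usual Unity conventions, not anything from a real project:

#include "UnityCG.cginc"

// Input: one vertex exactly as it is stored in the mesh (object space).
struct appdata
{
    float4 vertex : POSITION;
};

// Output: what gets interpolated and handed to the fragment shader.
struct v2f
{
    float4 pos : SV_POSITION;   // clip-space position = 2D screen position + depth
};

// Runs once per vertex.
v2f vert (appdata v)
{
    v2f o;
    // Object space -> world -> view -> clip space (the model-view-projection transform).
    o.pos = UnityObjectToClipPos(v.vertex);
    return o;
}

(the matching fragment half follows after the next quote)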


The same page says about fragment shaders:

Pixel shaders
Pixel shaders, also known as fragment shaders, compute color and other attributes of each "fragment" - a technical term usually meaning a single pixel. The simplest kinds of pixel shaders output one screen pixel as a color value; more complex shaders with multiple inputs/outputs are also possible. Pixel shaders range from always outputting the same color, to applying a lighting value, to doing bump mapping, shadows, specular highlights, translucency and other phenomena. They can alter the depth of the fragment (for Z-buffering), or output more than one color if multiple render targets are active. In 3D graphics, a pixel shader alone cannot produce very complex effects, because it operates only on a single fragment, without knowledge of a scene's geometry. However, pixel shaders do have knowledge of the screen coordinate being drawn, and can sample the screen and nearby pixels if the contents of the entire screen are passed as a texture to the shader. This technique can enable a wide variety of two-dimensional postprocessing effects, such as blur, or edge detection/enhancement for cartoon/cel shaders. Pixel shaders may also be applied in intermediate stages to any two-dimensional images—sprites or textures—in the pipeline, whereas vertex shaders always require a 3D scene. For instance, a pixel shader is the only kind of shader that can act as a postprocessor or filter for a video stream after it has been rasterized.
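
And here is the matching fragment half of the same sketch. It receives the interpolated v2f output of the vertex function above and just returns a constant color; _Color is an assumed material property, so treat this as illustration rather than a finished shader:

fixed4 _Color;   // assumed material property (would be declared under Properties { })

// Runs once per fragment (roughly: per covered pixel) after rasterization.
fixed4 frag (v2f i) : SV_Target
{
    // Simplest possible case: a constant color.
    // Real fragment shaders sample textures, apply lighting, fog, etc.
    return _Color;
}

Inside a ShaderLab pass the two functions are hooked up with #pragma vertex vert and #pragma fragment frag; Unity's built-in "Unlit Shader" template is essentially this plus a texture lookup.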



Regarding the last one, I found a few different things:
http://www.businessdictionary.com/definition/screen-mapping.html says:

A process used in integrating enterprise resource planning (ERP) systems with multiple applications for greater efficiency. It is often used for transferring data from one platform to another (such as from a Windows program into an ERP) or for combining data from multiple sources. Screen mapping solutions may be used in warehousing environments for data entry processes that involve the use of wireless applications.


On https://en.wikipedia.org/wiki/Peripheral, which also gave some explanation, I found a fancy picture:

https://en.wikipedia.org/wiki/Peripheral#/media/File:Linux_kernel_and_gaming_input-output_latency.svg

My guess is that it has to do with where on the screen your graphical output should "land", and possibly how (e.g. changes in size relative to the original, projection, rotation, etc.).
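
If it is along those lines, the standard mapping from a vertex's clip-space position (what the vertex shader outputs) to an actual screen pixel looks roughly like this. This is only a sketch; the names clipPos and screenSize are made up for illustration, and depending on the graphics API the Y axis may be flipped:

// Clip space -> screen pixel, i.e. "where on the screen it lands".
float2 ClipToScreenPixel(float4 clipPos, float2 screenSize)
{
    // Perspective divide: clip space -> normalized device coordinates (-1..1).
    float2 ndc = clipPos.xy / clipPos.w;

    // Viewport transform: map -1..1 to 0..1, then scale to pixels.
    float2 uv = ndc * 0.5 + 0.5;
    return uv * screenSize;
}

The scaling, projection and rotation mentioned above all happen before this point, in the matrices the vertex shader multiplies by.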


I hope I'm not completely off the mark :)


