Bake Normal Maps from meshes
Blender's default render engine, "Blender Render" or "Blender Internal" as it's often called, includes a dedicated sub-system that can be used to generate various types of image map, typically of an object in and of itself (direct or self-referencing), or by translating the details of one to another (indirect or inferred referencing). This latter approach is most often used to create Normal maps, a process whereby the detail of a high-resolution model is rendered down, or "Baked", to an image UV mapped to a low-resolution version of the same mesh.
The following tutorial discusses this process of using "Blender Render" and the "Texture Bake" sub-system to generate a Normal map from a high-resolution mesh. A basic understanding of Blender is useful but not specifically necessary to get the most from the below.
Design note: the "Texture Bake" procedure is much the same in principle as for Blender 2.49 or using Cycles Render, although it differs somewhat in process, i.e. which buttons are pressed, when and where.
Low-poly mesh preparation ^
Texture baking a mesh to produce a Normal map is usually a 'like-for-like' process in that structural data from a high-resolution mesh is rendered down to an image mapped to a low-resolution facsimile. Specifically for Normal maps the process essentially means translating surface data into a series of "R", "G" and "B" ('red', 'green' and 'blue') colour values, each representing the orientation of an individual face normal.
Design note: the "normal" component in "Normal map" refers to "X", "Y" and "Z" coordinate values indicating the orientation of a given 'face'. In Blender this can be displayed or visualised by activating "Normals" in "Mesh Display" ("View Properties", "N") and selecting any one or combination of "Vertex", "Edge" or "Face" Normals (activating normals on a high-resolution mesh may cause significant performance problems).
For texture baking to work, and RGB Normalised values to be properly calculated, it's important to ensure both high and low resolution meshes are correctly prepared beforehand. In essence both meshes need to be: 1) co-located, i.e. both in the same place; 2) have their respective Origin points similarly co-located; 3) have their respective "Scale:" and "Dimensions:" information set; and 4) be (approximately) the same size.
Design note: unfixed or otherwise disparate structures can cause issues for texture baking and other 'interpretive' processes where 'sameness' is the basis upon which operations are performed. As such both low and high resolution meshes should be 'set' or 'fixed' using "Apply" - "Ctrl+A » Rotation and Scale". It's also preferable to have all of the above preparation done before baking to avoid issues or the need to perform additional work after the fact.
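As a rough illustration, the same 'fixing' step can be scripted from Blender's Python Console; a minimal sketch assuming a 2.7x-era API and objects named "lowpoly" and "highpoly" (substitute the names used in the actual scene):

    import bpy

    # Hypothetical object names - use the names of the actual low/high meshes.
    for name in ("lowpoly", "highpoly"):
        obj = bpy.data.objects[name]
        # Make the object the sole selection and the active object...
        bpy.ops.object.select_all(action='DESELECT')
        obj.select = True                       # 2.7x API
        bpy.context.scene.objects.active = obj
        # ...then apply ("fix") Rotation and Scale, the equivalent of Ctrl+A.
        bpy.ops.object.transform_apply(location=False, rotation=True, scale=True)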
Before baking make sure to have both a Low and High resolution mesh available, both of which should occupy the same location, be the same size, have the same Origin points and be 'fixed' (shown side-by-side in the above for clarity - at render they will be co-located and sit atop each other. "Draw all Edges" is also active, found in "Object" Properties, to highlight structural or density differences between the two versions)
In addition, the low resolution mesh MUST be fully "UV Mapped" with an "Image" assigned using either 'generated' or 'bitmap' based data - the map itself acts as a guide within which image data is baked, the image as the 'substrate' or 'canvas' to which that is done.
Design note: the above assumes the bake process is just for the production of a Normal map independent of any Material(s) it might eventually be associated with: it can be generated without the need to assign a full Material to the mesh. If a Material is needed (because the resulting Normal map is to form part of a 'live' datablock immediately after baking) - with the low resolution mesh selected click the "Material" button in Properties. The panel that appears should be blank. If it is click "+ New" to generate a populated Material slot. Leave this as is then click the "Texture" button. Again if an entry does not exist click "+ New" to create one. Here change "Type:" to "Image or Movie", then in the "Image" sub-section click the "Browse Image to be linked" button (button with a landscape picture icon - don't click "+ New" or "Open"). From the drop-down list that appears, the image previously generated and assigned to the mesh should be available; select it to assign it to the Material, making it ready for baking. Further information on setting up Materials can be found here.
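For those preferring to script the optional Material set-up, the following is a minimal sketch of the same idea using Blender's Python API (2.7x); the object name "lowpoly" and image name "NormalBake" are assumptions, the image being the one generated and assigned to the mesh as described below:

    import bpy

    obj = bpy.data.objects["lowpoly"]           # hypothetical low-poly object name
    img = bpy.data.images["NormalBake"]         # hypothetical name of the baked-to image

    # Optional Material carrying the image (Blender Internal).
    mat = bpy.data.materials.new("BakedNormal")
    tex = bpy.data.textures.new("BakedNormalTex", type='IMAGE')
    tex.image = img                             # link the existing image, don't create a new one

    slot = mat.texture_slots.add()
    slot.texture = tex
    slot.texture_coords = 'UV'                  # map the texture via the mesh UVs

    obj.data.materials.append(mat)              # assign the Material to the low-poly mesh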
To create the 'image', in the "UV/Image Editor" click the "Image" menu option in the Header and then "New Image" to access the image properties popup (alternatively click the "+ New" button also visible in the Header). Set a "Name" (or leave the auto-generated value in place, usually "Untitled [.n]"), change the "Width:" and "Height:" as needed based on the shape and size of the mesh, the expected layout of the UV map (e.g. "1024" by "1024"), or the size of the image required. Set the 'style' of image to be used by selecting one of either "Color Grid", "UV Grid" or "Blank" from the "Generated Type" selector, then click "OK" to confirm and generate. The new image will immediately appear in the UV/Image Editor occupying what was previously a blank "Texture Grid".
Design note: the only significant difference between "Color Grid", "UV Grid" and "Blank" is whether a pattern or a uniform (single) colour is displayed; the latter tends to make it easier to see the UV when unwrapping and editing, whereas the former make it easier to see distribution and relative image density across the mesh - ideally their respective patterns should be uniformly distributed, particularly in areas immediately visible to the viewer/player. In addition, the different image 'types' can be swapped back-and-forth to make the UV editing process easier - for example using "Blank" to see the initial UV map, then "UV Grid" to check distribution. To do this, in the UV/Image Editor click "View » Properties" to access the Image Properties panel (or press "N" with the cursor over the active view), then click the "Color Grid", "UV Grid" or "Blank" buttons as needed.
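The same image can be generated in script form; a minimal sketch (2.7x API) in which the name "NormalBake" and the 1024 x 1024 size are simply examples:

    import bpy

    # Create a 1024 x 1024 image to bake into - adjust name and size as required.
    img = bpy.data.images.new("NormalBake", width=1024, height=1024)

    # Swap the generated pattern while editing the UVs as described above:
    # 'BLANK', 'UV_GRID' or 'COLOR_GRID'.
    img.generated_type = 'UV_GRID'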
Generating a new Image that's assigned directly to the UV without the use of Materials; the low-resolution mesh is shown in Edit mode to display the UV Map before texture application - the map is slightly distorted due to the default "Texture Space" (grid) being square (high resolution shown with "Draw all Edges" active for clarity and to highlight structural differences)
If the mesh has not yet been unwrapped, whilst selected press "Tab" to enter Edit Mode, "A" to select everything (or from the 3D View Header click "Select » (De)select All") before finally pressing "U" and selecting "Unwrap" from the "UV Map" pop-up menu. This generates an initial map to which the previously created image can be assigned before then editing and marking the mesh with Seams for a better Unwrap.
Design note: UV unwrapping is a selection based process, only surfaces highlighted at unwrap generate a map to which an image can be assigned, else surfaces appear 'white' (no Material/UV Map) or 'pink' (missing Image). Once the initial map is generated it can be edited as normal using Seams to split the UV and lay it flat.
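For reference, the basic unwrap can also be performed from script; a minimal sketch (2.7x API) assuming the low-poly object is named "lowpoly":

    import bpy

    obj = bpy.data.objects["lowpoly"]           # hypothetical low-poly object name
    bpy.context.scene.objects.active = obj

    # Enter Edit Mode, select everything, then perform a basic Unwrap (U » Unwrap).
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.uv.unwrap(method='ANGLE_BASED', margin=0.001)
    bpy.ops.object.mode_set(mode='OBJECT')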
Shown in "UV Editing" layout (selected from "Choose Screen Layout" in the upper Header/Info bar), the objects initial UV is mapped to a square grid, the UV/Prototype Editors default "Texture Space" - this automatically readjusts itself i an image is associated with the mapped UV's. Note also that every bit an paradigm has yet to be assigned to the object information technology appears white in the 3D View
For that, whilst the mesh is still selected, in the UV/Image Editor click the "Browse Image to be linked" button in the Header (icon displaying a 'photo') and select the previously created image from the 'browse' list that appears, assigning it to the UV map and the mesh (which will appear in the 3D View subject to being in "Texture" mode, "Alt+Z"). Once applied, continue to edit the UVs so they map to the image as needed - it's important the low-poly object be fully unwrapped before continuing because the map itself, its shape and coverage, is used to guide the process; it acts as a template of sorts into which the RGB information from the high-resolution mesh is baked.
Design note: the image can be assigned before or after the UV map has been generated and edited, doing one or the other is largely a matter of preference. If done after however, some adjustment to the UV map may be necessary if the image used is not square (the texture grid is square so the UV map will expand when assigned to a wider image). To aid the general process of editing the UV map, enabling "Snap to Pixels" helps by making sure mesh vertices align to texture pixels - with the entire mesh selected in Edit Mode to expose the UVs, click "UVs" then "Snap to Pixels" in the UV/Image Editor, then (optionally) with the entire UV selected make a minor "Scale" adjustment, "S", to nudge individual vertices to the pixel.
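Because Blender Internal reads the bake target from the image assigned to the mesh faces, the assignment can also be scripted; a minimal sketch (2.7x API) assuming the names used in the earlier examples:

    import bpy

    obj = bpy.data.objects["lowpoly"]           # hypothetical low-poly object name
    img = bpy.data.images["NormalBake"]         # the image created earlier

    # Assign the image to every face of the active UV layer so the bake has a
    # target to render into (equivalent to picking it in the UV/Image Editor).
    for face in obj.data.uv_textures.active.data:
        face.image = img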
Selecting the previously generated image in the UV/Image Editor to assign it to the initial UV map (shown contained within the default square "Texture Space" area) - click the "Browse Image to be linked" button and select the entry from the list (in this instance "Untitled")
Make sure the low-poly mesh is fully UV unwrapped and has an Image assigned (a "Generated" image in the instance shown above) before continuing (mesh shown in Edit Mode and selected to display the UV map and image in the UV/Image Editor using the "UV Editing" screen layout)
High-poly mesh preparation ^
The high-resolution mesh on the other hand needs little additional preparation other than ensuring its size/dimensions, position and Origin match the low-resolution facsimile - it does not require materials, images or to be UV unwrapped. High-resolution meshes can be used with or without modifiers so it's not strictly necessary the "Multires" or "Subdivision Surface" modifiers be applied to the mesh beforehand (click "Apply" within each Modifier panel). Mesh data can be presented in its original 'quadratic' form or optionally tessellated (triangulated), "Ctrl+T", before baking. This is not absolutely necessary for the process to work however.
Design note: for super-high resolution meshes tessellation may not offer any significant advantage because there would be far more surface volume on the mesh than is available on the assigned image for a 1:1 surface/pixel correlation (the accuracy of the resulting Normal map relative to the data baked is defined by the size of the image mapped to the model - smaller images mean less pixel space available for baking which reduces the accuracy represented in detail and structure). Where it may be prudent to triangulate is in instances where the reference mesh is not suitably dense, which can increase the risk of bake errors where the render process has difficulty determining whether an un-split face is concave or convex, leading to incorrectly baked RGB values.
The high-resolution mesh (selected and shown with "Draw all Edges" active) can be 'sub-divided' manually (i.e. subdivided through use of the "Subdivide" button in "Tools") or by assigning the "Subdivision Surface" or "Multires" modifier, either 'fixed' or 'unfixed' (the modifier properties being applied and 'made real') before being baked
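Where the optional steps above are preferred in script form, a minimal sketch (2.7x API); the object name "highpoly" and modifier name "Subsurf" are assumptions:

    import bpy

    high = bpy.data.objects["highpoly"]         # hypothetical high-poly object name
    bpy.context.scene.objects.active = high

    # Optionally 'fix' a Subdivision Surface/Multires modifier so the extra
    # geometry becomes real mesh data (the modifier name is an assumption).
    bpy.ops.object.modifier_apply(modifier="Subsurf")

    # Optionally triangulate (Ctrl+T) - not required, but can help sparse meshes.
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.mesh.quads_convert_to_tris()
    bpy.ops.object.mode_set(mode='OBJECT')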
Texture Bake ^
Once the mesh has been UV mapped and an image assigned, click the "Render" Properties button ('camera' icon). Scroll down to the "Bake" subsection at the bottom of the panel and click the black triangle to the heading's left to access its respective properties and options. Here change "Bake Type:" to "Normal" ("Bake Type:" displays "Full Render" by default) and activate "Selected to Active" by clicking the checkbox so Blender knows the Normal map is to be produced using the low/high resolution mesh pair.
Design note: when checking "Bake" settings it's also prudent to have "Clear" set so the process essentially wipes the image and re-bakes the data 'as new' when re-baking after adjustments are made where needed. Further, "Margin:" can be set to a higher value, i.e. "16 px", to compensate for UV positions relative to each other where the resulting image is to be used with mipmaps; as mipmaps decrease in size, the distance between UV islands diminishes so a larger initial value is used to compensate for any subsequent loss of pixels due to image size reduction - this value does depend on the texture's actual size, a 16px margin uses a significant amount of space on a 128 x 128 pixel image but very little on a 2048 x 2048 image. Adjust the "Margin:" value as appropriate.
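The same "Bake" panel options map directly onto scene render settings, so they can also be set in script form; a minimal sketch (2.7x API) mirroring the values discussed above:

    import bpy

    rend = bpy.context.scene.render

    # Blender Internal "Bake" panel settings.
    rend.bake_type = 'NORMALS'                  # "Bake Type: Normal"
    rend.use_bake_selected_to_active = True     # "Selected to Active"
    rend.use_bake_clear = True                  # "Clear" - wipe the image before re-baking
    rend.bake_margin = 16                       # "Margin:" in pixels, adjust to texture size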
Switch to "Render" Properties and scroll downwards to the very bottom to admission the "Broil" systems options and settings (defaults shown start paradigm above). Change "Bake Type:" to "Normal" and activate "Selected to Agile" so Blender knows how the Normal map is to be generated. Annotation meshes are shown positioned every bit they should be for rendering, moved together so they occupy the aforementioned position on the grid, important for 'like-for-like' texture baking
Finally in the 3D View make sure to double-check the high-resolution mesh is selected FIRST and the low-resolution mesh LAST - the order is important - then click the button marked "Bake" to generate the map. A progress bar will appear in the "Info" header atop the application, disappearing upon process completion.
Design note: the order objects are selected ensures the render takes place correctly; the low-resolution mesh, the item assigned the image, should always be the LAST item (multi) selected ("Shift+RMB") else the process will error out.
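A minimal scripted equivalent of the selection order and bake (2.7x API), again assuming objects named "lowpoly" and "highpoly":

    import bpy

    low = bpy.data.objects["lowpoly"]           # hypothetical object names
    high = bpy.data.objects["highpoly"]

    # Select the high-poly FIRST, the low-poly LAST so it becomes the active object.
    bpy.ops.object.select_all(action='DESELECT')
    high.select = True
    low.select = True
    bpy.context.scene.objects.active = low

    # Equivalent of clicking the "Bake" button (Blender Internal).
    bpy.ops.object.bake_image()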
Make sure the low-resolution mesh is selected LAST (it should be the active object) then click the "Bake" button to generate the Normal map - a progress bar appears in the "Info" header showing render status
The resulting Normal map baked to the image previously mapped to the low resolution mesh, which can now be saved and used as needed - mapped to an object for game use - note the image, once saved, will need to be re-normalised to remove the grey background and ensure it only contains normalised RGB colour values, else it may cause issues when used in-game ('grey' is not a normalised colour)
Save the Baked Texture ^
Once the bake has finished the resulting Normal map will have appeared in the UV/Image Editor where it can then be saved. To do this, from the UV/Image Editor Header click "Image » Save as Image" or "Image » Save a Copy", either choice opens the "File Browser". Select a suitable image format from the "Save as Image" properties section bottom-left, preferably 'loss-less' such as "BMP" or "Targa RAW". Browse to the location the file is to be saved, then click the "Save as Image" button top-right. Blender will pause as the temporary bake data is written to file then return to the previous view once done.
Design note: the difference between "Save as Image" and "Save a Copy" is that the former saves the baked data to a suitable format, using the new file to override whatever currently resides in the Material and is mapped to the mesh, whereas the latter saves a 'copy' of the same bake data leaving in place whatever is active until the main *.blend file is saved or reloaded - bake data is temporary in nature and as such will be lost when doing this (dumped into a temporary data/image buffer).
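Saving can likewise be done from script; a minimal sketch (2.7x API) in which the image name and output path are assumptions:

    import bpy

    img = bpy.data.images["NormalBake"]         # the image the map was baked to

    # Write the baked pixels out to a loss-less format ("Save as Image" equivalent).
    img.filepath_raw = "//lowpoly_normal.tga"   # "//" means relative to the .blend file
    img.file_format = 'TARGA_RAW'               # or 'BMP'
    img.save()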
From the UV/Image Editor click "Image*" then "Save as Image" or "Save a Copy" to save the data in a usable format (the "*" appended to the "Image" menu option in the Header indicates generated data has not yet been saved, removed once that has been done)
In the "File Browser" that opens select a loss-less format to save the image to from the options dialogue lower-left, "BMP" or "Targa RAW" for example, browse to a location to save the file, then click "Save as Image" top-right. Blender will then return to the previous screen once done
Broken renders ^
During the bake process it's not uncommon for renders to exhibit artefacts as a result of disparities between the high and low resolution meshes and the origin point from which the rays used to analyse the mesh are cast; if the clearance is insufficient the end result is usually some form of image corruption.
The difference between the Low and High resolution meshes needs to be considered for baking else the render process results in images being corrupt or exhibiting other forms of visual aberration, essentially making the Normal map unusable
In other words when making low/high resolution mesh pairings for Normal map baking, it's expected there will be some degree of co-planar surfacing, overlaps or protrusions - rivets, screws and other features can sit exactly at, above, or below, the low-poly mesh such that when the bake process comes across these types of structures they may be inadvertently clipped or improperly rendered because the point from which the bake is initiated is too close to a surface. The result is typically a broken or incomplete rendering of structure.
Using the default values to bake Normal maps may cause image artefacts (shown in the above as pattern interference, especially noticeable in the flat windowed area to the right) due to the origin point of the 'ray' used to determine structure being too close to surfaces; if the tolerance is too low features above or below might not bake correctly because they're essentially clipped from consideration
To compensate for this disparity the distance between surfaces and the origin point of the ray-cast can be increased or decreased by adjusting "Distance:" and "Bias:"; although each performs a particular role, in practice higher values mean more of the mesh can be captured for rendering. To make adjustments, in "Bake" Properties click either "Distance:", "Bias:", or both in turn, and type a suitable value depending on the significance of detailing that needs to be considered, then re-bake to rebuild the Normal map.
Design note: as a general rule-of-thumb the distance used tends to be reflective of the difference between the lowest point of the low-poly structure, meaning both "Distance" and "Bias" equal "0", and the height/depth of features needing to be captured on the high-resolution mesh, meaning "Distance" and "Bias" equal a value that allows the entire structure of both meshes to be fully evaluated. Although this can be determined by measuring the features in question, both 'Distance' and 'Bias' values are represented in Blender Units, so some experimentation may be necessary to find the happy medium - it should not be so low as to fail at preventing artefacts, but also not so high as to cause similar issues in reverse. Distance and Bias are also relative to mesh size and scale so higher values may be necessary for large objects.
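In script form the same two settings are plain render properties; a minimal sketch (2.7x API) where the values shown are only examples to be adjusted per scene:

    import bpy

    rend = bpy.context.scene.render

    # Values are in Blender Units and depend on mesh size/scale - examples only;
    # raise or lower them until the bake captures the full depth of the detail.
    rend.bake_distance = 0.2
    rend.bake_bias = 0.1

    bpy.ops.object.bake_image()                 # re-bake with the new settings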
To mitigate issues of bad or poor rendering, "Bias" and "Distance" values can be increased to include more geometry in the bake process, essentially expanding the area captured, compensating for the clipping that might inadvertently occur due to the proximity of both meshes to each other and the 'ray' origin used to calculate and render down those same structures
With "Distance:" and/or "Bias:" set to appropriate (higher) values a cleaner bake is possible because all the necessary mesh structure is being fully captured for processing
Baking and Anti-Aliasing ^
As with most forms of baking, "Aliasing" presents a particular problem because it's reliant upon the ability of the rendering system to interpolate and 'anti' alias pixel information, and the resolution of the image being baked to. If one or both are insufficient the resulting bake exhibits 'aliasing' or significant pixellation that often manifests as visible 'stepping' or 'jagged' edges around non-axial or non-perpendicular details and structures (e.g. rivets, screws and other curved details). In other words the bake process is not inferring or averaging pixel information with respect to generating clean edges around structural features that don't run along the horizontal or vertical axis.
Design note: aliasing issues tend to occur when image data is not correctly, or fully, interpolated when baked. In other words the render process is analysing structural information at face value, i.e. as it appears to the renderer absent, or with minimal, interpreted averaging between pixels to fill in the gaps, the 'anti' in "anti-aliasing".
Shown in "UV Editing" layout for clarity, Normal maps baked without "Anti-Aliasing" often exhibit 'jaggies', the 'stepped' or pixilated appearance of edges around non-perpendicular structures (that don't run along the horizontal or vertical axes)
In practice, absent proper "Anti-Aliasing" or "Sampling" options, the solution is to edit the Normal map outside Blender, manually painting in what's missing using a third-party image editing application like Photoshop, GIMP, PhotoPaint etc. (or whatever image/photo editing software is available). Alternatively the map can be baked larger than needed then down-sized, again using a photo-editor, taking advantage of the indirect anti-aliasing that occurs when images are re-sized. Further still, the map can be left in its raw state relying instead on anti-aliasing being performed at run-time when image data is re-sampled and re-drawn to differing sizes on screen.
Design note: when manually editing Normal maps through painting it's often better to split and edit the channels individually, correcting each of the 'R' (red), 'G' (green) and 'B' (blue) colour channels separately rather than as a combined RGB image - channels only contain saturation data pertaining to the colour they represent, i.e. "0%" or "0" saturation, often displayed as black, represents no channel colour, whereas "100%" or "255" saturation represents full channel colour, a total range of a single colour from 'black' to 'full' colour - 0% to 100%, or 0 to 255 saturation. Overall this makes it less likely colour aberrations will be introduced to the image, although care still needs to be taken to match edited tones exactly else the end result is a malformed Normalised surface in-game.
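Where a photo-editor isn't to hand, the 'bake larger then down-size' approach can be approximated inside Blender itself; a rough sketch (2.7x API) assuming a map baked at 4096 x 2048 that is to be halved:

    import bpy

    img = bpy.data.images["NormalBake"]         # map baked larger than needed, e.g. 4096 x 2048

    # Down-sizing averages neighbouring pixels, giving an indirect anti-aliasing effect.
    img.scale(2048, 1024)
    img.filepath_raw = "//lowpoly_normal_2k.tga"
    img.file_format = 'TARGA_RAW'
    img.save()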
Shown in "UV Editing" layout for clarity, the objects Normal map is baked to a 256x128, 2048x1024 and 4096x2048 texture respectively, each of which has a differing affect on rivet detail clarity - in essence larger images mean fine details can be better defined as more than pixel information is available for apply, distributing aliasing across more data and in effect pseudo 'anti-aliasing' the image
Mesh Smoothing & Normal Maps ^
When baking Normal maps using the "Selected to Active" approach, "Mesh Smoothing" is determined not by edge or face splits on the low-poly mesh but instead on the high-poly using "Control Loops" or other types of "Control Structures". Typically extra geometry placed strategically to modify the density of a feature or features, it's this data which, when baked as biased RGB Normalised colours, informs the behaviour of smoothing on the low-poly mesh with respect to the high-poly geometry it directly represents. In other words meshes are typically smoothed based on the continuity and colour of Normalised pixels across a UV mapped surface, not the surface structure itself - modification to the latter tends to have a detrimental effect because it interferes with the former.
Image-top shows how mesh density changes the way structure can be seen to curve and respond to lighting/shading. In this way hard (1) or soft edges and curves (2) can be controlled by changing the mesh which in turn, Image-bottom, is then represented in RGB as similarly hard (1) or soft edges or curves (2), or a combination (3), thus controlling the way smoothing behaves
What this means in practice is that the conditions for baking textures to a low-poly object, and using such objects in-game, differ; during bake, the low-poly mesh will typically have minimal smoothing in place, if at all, because edges, corners, crevices, bevels and so on are defined by the high-resolution mesh; whereas in-game some smoothing may be necessary to augment a structure, i.e. splitting unseen faces to aid the appearance of others that are.
Shown in "UV Editing" layout for clarity, the low-poly mesh is shown with its left side prepare to use uniform smoothing, and edges marked every bit "Precipitous" on the right. The baked Normal map, shown in the UV/Image Editor (left side of Blender), displays the result and what bear on smoothing, or not, has on Normal maps - the difference can exist significant
Displayed in "Texture" manner (over again shown using "UV Editing" layout) the Normal map affects the meshes appearance quite significantly - in-game this has a like affect frequently causing the completed model to exhibit surface issues
Expanded view (shown total screen in the UV/Paradigm Editor for clarity) of the same broiled Normal map with uniformly smoothed surfaces on the left of each section and edge splits on correct; the difference tin be quite significant depending on the type of object baked so special care should be given to Blistering where 'Smoothing' needs to be used
With this in mind special attention should be given to the use of Smoothing and Smooth Group like enhancements so they don't break Normals, which for all intents and purposes means "Mesh Smoothing", "Smooth Groups" or "Hard Edges" are to be avoided, certainly for baking purposes. In other words the low resolution mesh should be assigned a single 'smoothing' value or group, or where they are needed it should be done in a way that doesn't detract from the use of Normal maps, i.e. edges or faces marked in a way that's not immediately obvious to the viewer.
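As a simple scripted reminder of the above, a single uniform smoothing value can be applied to the whole low-poly mesh before baking (2.7x API, object name an assumption):

    import bpy

    low = bpy.data.objects["lowpoly"]           # hypothetical low-poly object name
    bpy.context.scene.objects.active = low

    # Assign a single, uniform smoothing value to the entire mesh before baking.
    bpy.ops.object.shade_smooth()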
Smoothing can be used but care should be taken as to the effect doing so has on the resulting Normal map (note also consideration should be given to aliasing issues where hard edges appear - another incidental reason to avoid using "Mesh Smoothing" if at all possible)
Renormalising Normal maps (renormalisation) ^
When the Bake completes Blender displays non-normalised areas of the map in the UV/Image Editor as a flat uniform grey colour. When the map is later saved it may include these depending on the coverage of what was rendered. If this is the case the entire image will need to be "re-normalised" to convert non-normalised colours to valid normalised RGB values. This is important because Normal maps should only contain RGB normalised values (each "R", "G" and "B" using only "0 » 255" saturation); any that are not will result in an unusable Normal map (subject to engine tolerance).
Baked Normal maps may include invalid colours which need to be removed before the map can be used in-game, else they can cause the appearance of aberrations and other types of malformed surface, if not make the map unusable (shown in Corel PhotoPaint)
The simplest way to re-normalise an image is to filter it using a tool or plugin designed specifically for that purpose (subject to availability for the application used). If one is not available some manual image editing is necessary to correct the problem. To do this open the image and, if not already masked as a result of the bake process in Blender, select all the grey areas, creating a mask or layer from the selection. Once done simply flood-fill the areas, masks or layers with the appropriate normalised colour, usually a 'flat' or 'mid-tone' value of RGB "127, 127, 255" or "128, 128, 255" depending on the engine used.
Before use Normal maps need to be "Re-normalised" wherever they contain invalid RGB colours. How this is done varies on tool or filter availability but can be done manually by masking off aberrant areas and flood-filling them with an appropriate 'mid-tone' normalised colour ("127, 127, 255" or "128, 128, 255" depending on the game engine used)
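Absent a dedicated filter, a rough scripted equivalent of the mask-and-fill approach described above can be run on the image inside Blender before saving; a sketch (2.7x API) in which the grey value being replaced and the tolerance are assumptions to be adjusted against the actual bake output:

    import bpy

    img = bpy.data.images["NormalBake"]         # the baked map
    px = list(img.pixels)                       # flat RGBA values in the 0.0-1.0 range

    # Replace the flat grey 'background' left by the bake with a neutral,
    # normalised colour (roughly RGB 128, 128, 255).
    for i in range(0, len(px), 4):
        r, g, b = px[i], px[i + 1], px[i + 2]
        if abs(r - 0.5) < 0.01 and abs(g - 0.5) < 0.01 and abs(b - 0.5) < 0.01:
            px[i], px[i + 1], px[i + 2] = 0.5, 0.5, 1.0

    img.pixels = px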
Summary ^
The following is a basic summary of the process as a quick check-list for use.
- Make sure both meshes are the same size and at the same location.
- UV map the low-poly mesh and assign an Image (with or without a Material).
- The high resolution mesh can have Subdivision or Multires modifiers active.
- Set the "Bake" options "Normal" and "Selected to Active".
- Click the "Bake" button to generate the Normal map.
- Save the result to a loss-less format e.g. *.tga.
Conclusion ^
Blender's internal render engine ("Blender Internal") has long been used to bake various types of image map for meshes using the "Texture Bake" system. Although there are one or two disadvantages to Blender Render (some of which are mentioned above) it's still a viable option provided consideration is given beforehand to problems that might occur. One important point to note however is that Blender Internal is no longer being actively developed or supported beyond critical bug fixes as the Blender Foundation's focus is on Cycles Render.
Video ^
The video below shows the basic process discussed above.
Source: https://www.katsbits.com/tutorials/blender/bake-normal-maps.php