Is it normal that the rendering quality with `r.HairStrands.RasterizationScale 0.3` is lower than with 1? Please confirm.
If a reproduction case is needed, after opening the project I provided, simply load the preset file and render. You will get three MP4s with different qualities, among which the 0.3 setting produces the worst result.
> Is it normal that the rendering quality with `r.HairStrands.RasterizationScale 0.3` is lower than with 1?
Yes, this is expected. You get more aliasing and noise with a lower `r.HairStrands.RasterizationScale` value because the scale controls the hair strand radius relative to the pixel size: thinner rasterized strands exhibit more temporal instability and aliasing (see how `ComputeMinStrandRadiusAtDepth1()` uses the RasterizationScale as a multiplier in `HairStrandsUtils.cpp`).
A value of 0.3 makes strands very thin relative to pixels, so more samples miss the strands, resulting in aliasing, flickering, and temporal instability. The default value of 0.5 provides what we consider good coverage and fidelity. Values of 1.0 and higher cover at least one pixel, which is more stable but thickens the hair.
Please let me know if you have additional questions about `r.HairStrands.RasterizationScale`.