Taming The Dimensions
Marie Brown
Marie Brown is an experienced writer and a less experienced 3D artist. Her interests include creating 3D content, experimenting with different software and render engines, and learning how to make 3D imagery so fabulous that no one will question whether it's art or not.

Instancing in DAZ Studio

October 27th, 2014 by Marie Brown

Have you ever wanted to do a crowded scene in DAZ Studio, but run into problems with limited memory? Well, there’s an easy way around that: instancing. There are some pretty severe limitations to this method, but it’s really light and easy on the memory.

First, how to do it. I’m in a fantasy sort of mood right now, so I’m going to create a fantasy warrior. Okay, here he is:


There you have it, a rather annoyed-looking Gianni dressed as a barbarian in a desert wasteland. Now for the next step, and an illustration of a hidden catch of instancing.

I’m going to select Gianni, then go up to the “Create” menu and select “New Node Instances”. It asks how many; I leave it at the default value of ten. Then voila, I have my fantasy raiding party! Or do I…?


Oops! There’s something missing. If you’re going to instance a figure, the clothing absolutely has to be parented to it, or you get a whole bunch of naked people.

So now I’ll parent everything properly, create my instances, and drag each one into position. Instances can be selected individually or as a group, so you can drag the whole group into a rough approximation of where you want them, then select and move each one.

Okay, here’s my group of barbarian warriors, illustrating the one major drawback of instancing: they all look the same.


The similarity of the figures isn’t the end of the world, though. There’s a way around it. If you create, say, five different warriors, all in different poses and maybe with variations of armor, and of course different hair and weapons, then you can create instances of each warrior. Vary the number, too, so there’s maybe five black-haired guys, and eight with their swords overhead, or whatever. Then intermix all the instances and their original figures, and you’ve got an active battle scene, without completely destroying your computer.
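None of this is the actual DAZ Studio code, of course, but if you like seeing the idea behind the memory savings spelled out, here's a little Python sketch. The class names are made up for illustration; the point is that every instance shares one copy of the heavy geometry and only stores its own transform.

```python
class Mesh:
    """Holds the heavy geometry data once (a toy stand-in, not DAZ's API)."""
    def __init__(self, vertex_count):
        self.vertices = [(0.0, 0.0, 0.0)] * vertex_count  # placeholder geometry

class Instance:
    """Stores only a reference to the shared mesh plus its own transform."""
    def __init__(self, mesh, position):
        self.mesh = mesh          # shared, not copied
        self.position = position  # per-instance data, tiny by comparison

warrior = Mesh(vertex_count=100_000)
raiding_party = [Instance(warrior, (x * 2.0, 0.0, 0.0)) for x in range(10)]

# All ten instances point at the same mesh object in memory:
assert all(inst.mesh is warrior for inst in raiding_party)
```

Ten warriors, one mesh: that's why a crowd of instances barely dents your RAM, and also why they all have to look alike unless you mix in several distinct originals.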

If you have Photoshop, or any other imaging program, you can add some final touches such as dust or lighting effects, and you’re done. Put your finished image online for the world to see, and enjoy the comments about your amazingly powerful computer and the questions of “How’d you do it?”

An example of instancing in DAZ Studio, postworked in Photoshop to tweak colors and add dust.


HDRI Lighting In LuxRender

September 28th, 2014 by Marie Brown

Just a quick, simple trick today. I’ve been obsessing over lighting for a while now, and I ran across a chance comment online that made me finally take a look at using HDRI lighting in Lux. I’ve made the attempt before, but was operating on less-than-accurate information, and never got it to work. So, just in case there’s anyone else out there that heard you need to use the goniometric light in Lux as an HDRI light source, well, you don’t.

Someone online mentioned using the HDRI images from the Bryce freebie Golden Lighting in Lux. Now, I admit, I never once thought of doing that, because in my compartmentalized little mind, Bryce HDRIs tend to light Bryce scenes, and Bryce doesn’t play nice with Lux. So, once I got over feeling a bit dumb, I decided to take a shot at lighting an image with one of the Golden Lighting HDRIs. Real clever, eh?

I’ve already discovered using the goniometric light for IBL (image-based lighting) is something of a pain. Not to say it can’t be done, it’s just not something I’m happy doing yet. So I dug around in the LuxRender docs and found a much simpler alternative to the goniometric route.

Ready for this? It’s real simple…

First, set up your scene in DAZ Studio.

Second, create a plane primitive (Create>New Primitive>Plane). Make it a big one; I used 1000m. Rotate it 90 degrees on the X axis and position it behind your scene, as a backdrop. You may have to adjust it more after the first test-render.

Third, turn the plane into a LuxRender light source (select plane, right-click Parameters tab, choose “Luxus – LuxRender Light”, then select “infinite” as the type).

And fourth, look under the parameters for “LuxRender Light”. Where it says “LuxRender Infinite Light – Environment Map”, load your HDRI image of choice.

Then set up your render settings the way you like them, and hit render. You’ll have nice HDRI-generated lighting bathing your scene.

Now wasn’t that easy?

Just for comparison, so you can see what a nice HDRI can do in Lux, I’ve rendered the exact same image with 3Delight, and then again with Lux. Here’s the 3Delight version:


And here’s what Lux has to offer:


There you go. Simple HDRI lighting in LuxRender.

Composition? Or Worldbuilding…

August 11th, 2014 by Marie Brown

Hello again! First off, I need to apologize for falling off the face of the web. I didn’t produce even a single post for the entire month of July, because I got thoroughly sidetracked by Camp NaNoWriMo, one of those crazy wild writing adventures I’ve grown to love. But I’m back now, and I have a few subjects waiting patiently in line for me to write them out. So here goes.

Composition is a huge word in the photography world, and in the 3D art world, as well. I don’t much like thinking about it, which tells me it’s something I probably should look into. I avoid stuff I don’t completely understand. To me, the word and the concept behind it just smacks of careful thought, and planning, and mapping out concepts, and all sorts of other stuff, when all I want to do is make pictures I think are cool looking. But to get those cool looking pictures, I really should know the rules of the trade. Right?

A very basic definition of composition, as applied to both photography and 3D art, is “the arrangement of visual elements within the visual frame.” Nice, bland, boring definition. It doesn’t take into account the vast amount of discourse on composition available online, in books, in videos and classes and pamphlets… You get the picture.

Now, I’m not about to take on the entire subject of composition in this blog post. That’d take half a lifetime. And I’m going to assume that if you’re reading this, you’ve probably run across the term before, and maybe even concepts like the “rule of thirds” and the “golden spiral.” What I will do, however, is share some thoughts I’ve had recently while I was supposed to be working, about how traditional composition fits together with the worldbuilding a writer does, and applies to 3D art.

A 3D artist, like an author or a movie director, is in complete control of the set. Of course, in this case, the set is a program window in DAZ Studio, empty and waiting to be filled with magic.

Biased vs. Unbiased: Oh No, Not Again…

June 29th, 2014 by Marie Brown

Biased or unbiased? It’s one of the great debates of the 3D world, right up there with “Is computer art real art?” and “Are DAZ Studio users real 3D artists?” There are many people who swear by biased render engines, and just as many who swear by unbiased render engines. Now, I don’t want to get into that debate for real in this post. (Of course, if you want to do so in the comments, fine. I’m always up for a good debate.) What I’ll do is attempt to explain what each type of render engine is, what it does, and show some strengths and weaknesses of each in a direct comparison of a basic scene.

Bias, when applied to 3D rendering, is defined as systematic error. This refers to the way light is handled. A biased renderer will introduce a certain amount of error, a bit of blur, into the way it calculates the light values. This results in dramatically faster render times than most unbiased render engines offer, and gives a somewhat less than perfectly real look most of the time. An unbiased renderer calculates the light without introducing any systematic error, resulting in a very realistic handling of light. Unbiased renders can take days to complete, or “clear”. But the resulting image can often be confused for a real-world photograph.
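Here's a toy Python sketch of the same idea, under some big simplifying assumptions: pretend a renderer estimates a pixel's brightness by averaging random light samples, where a rare sample bounces straight into a bright source (the spikes that show up as "fireflies"). A plain average is unbiased and converges to the true value; clamping the spikes before averaging kills the noise but systematically darkens the answer, which is the trade a biased engine makes.

```python
import random

random.seed(42)

def light_sample():
    # Toy scene: most light paths return a little light, but one path
    # in a hundred hits something very bright.
    return 100.0 if random.random() < 0.01 else 1.0

samples = [light_sample() for _ in range(10_000)]

# Unbiased estimate: plain average. Noisy, but converges to the
# true expected value (0.01 * 100 + 0.99 * 1 = 1.99).
unbiased = sum(samples) / len(samples)

# Biased estimate: clamp the bright spikes before averaging.
# Much less noise, but systematically darker than the truth.
biased = sum(min(s, 5.0) for s in samples) / len(samples)

print(unbiased, biased)
```

Real render engines are enormously more sophisticated than this, but the core distinction holds: "biased" means the estimate is deliberately nudged away from the true answer in exchange for speed and smoothness.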

Okay, there’s your simple explanation of the terms. Maybe even a hair on the over-simplified side, but overly technical terminology makes my eyes cross. DAZ Studio uses the 3Delight render engine, which is biased. It is also capable of using unbiased render engines such as LuxRender and Octane via plug-ins, which makes it the perfect place to compare biased vs. unbiased renders. I use LuxRender via the Luxus plug-in.

So, here we go. I’ve set up a basic scene, just a room with a few objects in it. Now let’s render out the baseline.

This render was produced with 3Delight, with default materials on all objects, using one Age of Armour Advanced Ambient light and one DAZ linear point light in the candle flame. My overall response to this render? Yuck. Biased or unbiased doesn’t matter one little bit when the end result just plain stinks. Oh well, that’s precisely why 3D content providers make shader sets, so you don’t have to stick with the default texture an item comes with if it’s less than satisfactory. A bit of experimenting revealed that the intense hot spot came from the ambient light, so it will just have to stay there and get taken out with Photoshop, because if I take away the light… well, you know what happens when you render a scene without light. Nothing. Nada. Blackness.

Okay, on to the next render, this time using Luxus. Here it comes…





In the interests of increasing render speed, I set the Sampler to “sppm.” But look what happened! The dreaded “fireflies,” those annoying bright spots, appeared almost immediately. Argh. So I stopped the render before it really cleared and switched to “metropolis,” which is my preferred Sampler anyway. The problem with metropolis, though, is it can take a long time to clear.

While Lux does its thing in the background, I’ll take a moment to look at the most obvious difference between the two renders: the ambient light. 3Delight likes ambient lights, which distribute light evenly over the entire scene, regardless of walls and doors and such. In the case of the AoA Advanced lights, you can adjust the falloff so the light blends into shadow around the edges, which is a really cool and useful feature. I didn’t do that here, though, because I just wanted even lighting overall.

Now, LuxRender does not like ambient lights, because they’re just plain not physically correct. The closest it comes to an ambient light is a “sun” light. This acts just like a real sun in that it won’t pass through walls. So in this scene, the sun light is falling at an angle through the window, producing real-life light and shadow effects. The candle flame is also a light source. I converted it into a meshlight. You can’t really tell here, though, because I only let the render run until the fireflies popped up.

The next glaringly obvious difference is in the wineglass and its contents. This particular product, Wine Me, was designed with both Studio and Luxus materials. The only difference between the two renders is that on the Lux one I switched the Luxus parameters to “on” for both glass and wine, which made a huge difference and shows a hint of why some people go nuts over unbiased renderers.

Okay, on to the better, firefly-free render.

Now I’m going to throw some numbers at you. Render time: 47 min, 20 threads, 88.08 S/p, 628% Efficiency. Yes, that’s right, this not-quite-cleared test render ran for 47 minutes. I set it to use 20 processor threads (Lux can handle up to 32) so I could keep messing around with my computer with no problems while the render ran. Lux took 88.08 samples per pixel, which is nowhere near what most final renders need, but serves for a test. Most final renders clear around 500-1000 S/p. Sometimes you get lucky and it’ll clear at 200, sometimes it’s worth running to 2000 (that’s the kind of render that can run for a week or so…). The 628% Efficiency means the renderer had plenty of light to “see” the scene and render it out. A low Efficiency gives terrible results, most of the time.

All right, so on to the overall effect of this render: still not all that great. I like the more natural lighting, and the wineglass is awesome, especially the way it catches the candlelight and has the beginnings of a caustic effect showing on the table, but some of the materials just plain stink. If you look at the candle holder, you’ll see that Lux didn’t quite know what to do with it, and it doesn’t cast a shadow. And that crystal ball is just plain dull. Some of that is due to the low S/p, but more of it is because the Luxus autoconverter didn’t do the greatest job switching out DAZ materials for Lux-friendly materials.

Moving on now. This time I’m going to tweak the materials and the lighting and do my best to make each render shine, so you can see a more true comparison between the two.



Oh, that dratted hotspot! It just won’t go away. But in this render, I’ve changed the materials of all the glass items, and also the Index of Refraction for the wine, and gave the little knobs on the mirror a more metallic texture. This is a pretty decent example of what a biased renderer can do with ambient lighting. I’d like to see some shadows coming off the objects, but that would require additional lighting, which would throw off the comparison with the unbiased image.

Next, the unbiased version.




Here you go. Other than the candle holder, which still has Lux scratching its head and wondering what to do with it despite its new parameters, this shows some of the power of unbiased rendering. I allowed it to run for an hour and twenty-three minutes, at which point a few fireflies popped up and I shut it down. It reached 166.71 S/p at 494% Efficiency. If you look at the crystal ball, you can see the reflections of all the windows in the room and the blue sky outside. There are caustic effects from the wineglass and the crystal ball starting to show up on the table. And the wine shows some reflections and variations in color, a result of the light bouncing around inside of the liquid.

So there’s my illustration of the basic difference between biased and unbiased render engines, the way they handle light. When I first got into 3D art, I was all about unbiased rendering, because you’ve got to admit, it does an amazing job with light. The way Lux handles glass and metals in particular just makes me drool. But now, there are a boatload of new toys specifically made for rendering in 3Delight, such as the AoA Advanced lights and subsurface shaders. Using the new lighting and materials means that, in my opinion, the biased renderer can hold its own against unbiased rendering any day. At least, it can in certain situations.

End result? I’ve come to rely heavily on both render engines. Sometimes I’ll even composite images from both of them at once. Biased is better for some situations, unbiased wins hands down when it comes to lighting effects.

That’s it for now. In the future, I intend to get into more of the fun stuff, such as subsurface scattering and volumetrics, and show how the biased vs. unbiased debate continues.


Content used: DAZ Studio, Advanced Ambient Light, Deco Mirror, table from The Living Room Collection, Wine Me, The Candle Collection, crystal ball from Opus Magnum, The Ultimate Shader Pack.


Playing With Render Settings: Pixel Filters

May 26th, 2014 by Marie Brown

When I first started using DAZ Studio, I freely confess that render settings meant nothing to me. I’d used Bryce casually for years, but generally left it on its default render settings, because I was kind of afraid to change anything lest I break the program. But using DAZ Studio forced me to get over that silly fear in a hurry, because the default render settings in Studio are pretty lousy. Here’s an example render using the default settings:

DAZ Default Settings

Max Ray Trace Depth: 1

Pixel Samples (X,Y): 4

Shadow Samples: 10

Gain: 1

Gamma Correction: Off

Gamma: 1

Shading Rate: 2

Pixel Filter: Sinc

Pixel Filter Width (X,Y): 6

Not the greatest image ever, I think you’ll agree. The default settings produce very blah and disappointing renders, especially when you’ve been looking through galleries and seeing all the amazing things people can do with this free program (and all the paid content they lure you into buying). After doing a bit of research into why my renders looked so bad compared to everyone else’s, I found out all sorts of interesting things about ray tracing, pixel and shadow samples, and shading rates. Those are the basic settings that need to change to produce a decent render. So here’s an example of a render done with my preferred final settings:

My Default Settings

Max Ray Trace Depth: 4

Pixel Samples (X,Y): 12

Shadow Samples: 32

Gain: 1

Gamma Correction: On

Gamma: 1

Shading Rate: 0.1

Pixel Filter: Sinc

Pixel Filter Width (X,Y): 6

Olympia now looks sharper, more in focus, not blown out. Her hair looks better, and the details on her dress are more visible. Still not absolutely fabulous, but definitely getting better.

All right, there you have a baseline for this experiment. That last pair of settings, Pixel Filter and Pixel Filter Width, are what snagged my attention this time. I’ve messed around with pixel filters in LuxRender enough to know that they change the quality of the image, but not how or why. I’ve also run across comments online that indicate using a pixel filter width higher than six is crazy, but again, I don’t know why. So here goes.

Box Pixel Filter

Yuck! She’s gone all blurry and out of focus. Okay, try again.

Triangle Pixel Filter

Okay, not as bad, but still not nice and crisp like I want. The next one I have high hopes for, because I’ve used it in Bryce and been happy with it.

Catmull-Rom Pixel Filter


Whew, she’s looking like herself again. Okay, one more try, and I know this one will be blurry, simply because of its name.

Gaussian Pixel Filter


Yep, more blur, although still not as bad as that box filter.

So, why do the box and triangle filters look so horrible? This entailed a bit of research. I went online and read up on pixel filters. A pixel filter turns out to be the calculation used to determine what each rendered pixel will look like. If you want the technical details, sorry, you’re on your own. Math makes my head hurt. But take the box filter for an example. It looks at a single pixel, then it takes all the pixels around that one in a box shape and calculates an average of all the values. Voila! Now I understand why the image is so blurry! Because the filter is looking at its surroundings and averaging them out, resulting in mush.

Now how to fix the mush problem? There has to be a way, otherwise nobody would ever use the box filter for anything. Hmm… calculates surrounding pixels… What about pixel filter width? Aha! That tells the render engine how many pixels to look at. So, if I’m right, modifying the pixel filter width will transform the box filter into something usable for purposes other than producing a nice blurry background to composite with another image and simulate depth of field, which is about the only use I can think of for something that blurry. So. Here we go.
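If you want to see the mush happen in slow motion, here's a little Python sketch. It's a one-dimensional stand-in for the real two-dimensional filter (and the real renderer's weighting is fancier than a plain average), but it shows exactly why a wide box filter smears a sharp edge and a width of one leaves it alone.

```python
def box_filter(pixels, width):
    """Average each pixel with its neighbors inside a window `width` wide.
    A 1-D toy version of the 2-D box filter a renderer uses."""
    half = width // 2
    out = []
    for i in range(len(pixels)):
        window = pixels[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

# A sharp edge: black pixels right next to white pixels.
edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]

print(box_filter(edge, width=1))  # window is a single pixel: edge stays crisp
print(box_filter(edge, width=6))  # each pixel averaged with neighbors: mush
```

At width 1 every pixel only "sees" itself, so nothing changes; at width 6 every pixel is dragged toward the average of its neighbors, which is the blur in those renders above.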

Box Filter, Width 3


That’s it, then. There’s a visible improvement in the image, just by changing the pixel filter width.






Box Filter, Width 1


There you have it, folks. Olympia is back to looking like herself again. Just to make certain, I tried the triangle filter out with different filter widths, and got the same result.

Hopefully you learned something from my little experiments. I know I sure did. Happy rendering!

Gamma Correction Is Your Friend

April 28th, 2014 by Marie Brown

I’ve been thinking about this subject for a while now, because it can have a rather startlingly huge impact on a 3D render. But you see, I just can’t come up with a way to explain it better than this article, Gamma Correction For A Linear Workflow. It covers all the whats, whys, and hows of gamma correction, and does it much better than I ever could. So what I’ll do is post a nice, clear illustration of what gamma correction will do for a DAZ Studio render. And I will also pass along the warning that you should always do test renders with gamma correction on, if you intend to use it, because otherwise your lighting will be all messed up.
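For the curious, the core of the idea fits in a few lines of Python. This sketch uses a simple power-law gamma of 2.2; real sRGB actually uses a slightly different piecewise curve, so treat this as the concept, not the exact spec.

```python
GAMMA = 2.2  # common display gamma; true sRGB is piecewise, but close to this

def to_display(linear):
    """Encode a linear light value for display (gamma correction)."""
    return linear ** (1.0 / GAMMA)

def to_linear(encoded):
    """Decode a display value back to linear light for render math."""
    return encoded ** GAMMA

# A mid-grey in linear light comes out much brighter once gamma-encoded,
# which is why flipping gamma correction on changes your lighting so much.
mid = 0.5
print(to_display(mid))  # roughly 0.73, not 0.5
```

The renderer wants to do its light math in linear values, while your monitor expects gamma-encoded ones; gamma correction is the translation step between the two, and skipping it is what makes renders look washed out or murky.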

Diving Underwater

April 7th, 2014 by Marie Brown

I’ve seen many underwater images online, and never really felt the desire to make any of my own. But I started experimenting with Bryce and making HDRI images. For some reason or other, the camera wound up underwater in the Bryce scene, and I liked the look so much I rendered it out as an HDRI. This is the image that came of using it.


First attempt at an underwater image in DAZ.

In Bryce, an HDRI image is really simple to make. First, you set up your scene the way you want it, then render it, then save as HDRI. Simple.

But what, you ask, is an HDRI, and why do I want one?

HDRI stands for High Dynamic Range Image. Both Studio and Bryce, and of course other 3D programs, can use this image to generate some really cool indirect lighting. Have a look at the fish-guy holding the spear. See the highlights, shadows, and the way his texture looks? That’s all from the HDRI. The base model is completely untextured, just the default grey.
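The "high dynamic range" part is easy to show with made-up numbers. An ordinary image format clamps every pixel at pure white, so a sun in the sky ends up no brighter than a white wall; an HDR image stores real floating-point brightness, which is what lets it actually light a scene. A tiny hedged sketch, with invented pixel values:

```python
# Toy pixel values from a sky capture: some blue sky plus a very bright sun.
scene = [0.2, 0.4, 50.0]  # HDR floats; the sun is 50x brighter than "white"

# An ordinary low-dynamic-range image clamps everything at 1.0:
ldr = [min(v, 1.0) for v in scene]

def lighting_energy(pixels):
    """Crude stand-in for how much light an environment map contributes."""
    return sum(pixels)

print(lighting_energy(scene))  # the HDR copy keeps the sun's real punch
print(lighting_energy(ldr))    # the clamped copy has lost almost all of it
```

That lost energy is the difference between an environment map that genuinely illuminates your figures and one that's just a pretty backdrop.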

Without the HDRI, the image is pretty bland and boring, without any real indication that it’s set underwater.


Same image, with HDRI.

Raw render, no HDRI, no postwork.

With it, the underwater feel has started to happen, but it’s still not what I was looking for. However, you can see how much light is added from the image, and you can begin to see the underwater look.

Enter Photoshop. I’ve used the program for a while, ever since PaintShop Pro got taken over by Corel. But mainly I used it for very basic stuff, adjusting light levels on photos and such. Now, I’m learning how to use the program for that mysterious thing called postwork.

In this case, I couldn’t see any real underwater look. Yes, the color suggests underwater, and yes, the dude with the spear looks like he belongs underwater. But where’s the depth? The ripples, the way everything looks submerged when you’re swimming around underwater? The weeds and fishes that come with the bottom of lakes and oceans alike?

To get these effects, I sent the original render in BMP format to Photoshop. I went online and located some free, unrestricted use fish and seaweed brushes, and used them to add in a bit of life. Then I started experimenting with layers of colors, blend modes, and opacity to bring more depth to the colors. If you’re curious, there’s an orange layer set to Vivid Light, a dark purple Hue layer, and a salmon-pink Overlay. The ripple effect came from a simple trick. I made a black layer, then applied “difference clouds” (Filter>Render>Difference Clouds) to it, and used the blend mode Vivid Light at 20% opacity. Voila! A more realistic underwater look.
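If you've ever wondered what a blend mode actually does, it's just per-channel math. Photoshop doesn't publish its formulas in so many words, but the widely documented Overlay formula (one of the modes used above) looks like this; values are per channel on a 0.0-1.0 scale:

```python
def overlay(base, blend):
    """Overlay blend for one channel: darkens the darks, brightens the
    brights. A mid-grey (0.5) blend value leaves the base untouched."""
    if base < 0.5:
        return 2.0 * base * blend
    return 1.0 - 2.0 * (1.0 - base) * (1.0 - blend)

# A salmon-pink-ish blend value (0.7 in one channel) over two base pixels:
pink = 0.7
print(overlay(0.2, pink))  # dark base pixel gets a gentle tinted lift
print(overlay(0.8, pink))  # light base pixel gets pushed toward bright
```

Vivid Light works along similar lines but uses a dodge/burn combination instead, which is why it punches contrast so much harder at low opacity.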

If I’d been using Bryce, I admit, it would have looked way better. Bryce has some fabulous caustic effects for underwater, which is the primary reason I’d never really thought of doing an underwater scene in Studio. Why bother, when it’s so easy and awesome in Bryce? But sticking with what you know never expands anyone’s horizons, and I learned a lot working on this image.


© 2017 Internet Business Systems, Inc.
595 Millich Dr., Suite 216, Campbell, CA 95008
+1 (408)-337-6870 — Contact Us
ShareCG™ is a trademark of Internet Business Systems, Inc.
