Linux and Radiance
I've been running Radiance, the lighting simulation renderer, with Blender, and so far so good.
I use Francesco Anselmo's Brad Python script to translate Blender scene geometry into Radiance format, then render straight from Blender.
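To give a flavour of what that translation involves, here's a toy sketch of turning one triangle into a Radiance "polygon" primitive. This has nothing to do with the real Brad code, and the material name "white_paint" is made up:

```python
# Illustrative sketch (not the actual Brad script): formatting a triangle
# as a Radiance "polygon" primitive, the kind of text an exporter emits
# before the scene goes to oconv/rpict.

def rad_polygon(modifier, name, vertices):
    """Format one Radiance polygon primitive from (x, y, z) tuples."""
    coords = [c for v in vertices for c in v]   # flatten to x1 y1 z1 x2 ...
    lines = [f"{modifier} polygon {name}", "0", "0", str(len(coords))]
    lines += [f"  {x:.6f} {y:.6f} {z:.6f}" for (x, y, z) in vertices]
    return "\n".join(lines)

# One triangle in the z=0 plane, with a hypothetical material "white_paint":
tri = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
print(rad_polygon("white_paint", "tri0", tri))
```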
My hardware is an old Pentium II running Ubuntu 5.04!
Just to let anyone know it can be done. Radiance actually beats the pants off many renderers by being physically accurate and rendering so efficiently that the software runs fine on old hardware like mine.
Now I plan on upgrading to an AMD X2 or Intel Core Duo and was wondering if anyone knows how to get Radiance to parallel render on multi-core processors? I know it's been done with SPARC stations.
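The closest thing I've found in the docs so far is rpiece, which apparently splits one picture into tiles so several cooperating processes can each render some. Something roughly like this -- untested by me, with placeholder filenames, so check the man page before trusting it:

```
# Placeholder filenames; -X/-Y set the tile grid and -F names the shared
# sync file that coordinates the two worker processes (one per core).
oconv scene.rad > scene.oct
rpiece -F sync.txt -X 4 -Y 4 -o out.hdr -vf view.vf scene.oct &
rpiece -F sync.txt -X 4 -Y 4 -o out.hdr -vf view.vf scene.oct &
wait
```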
Wow, this is a blast from the past, for me! I had a summer internship at LBL where I made a simple NeWS interface for Radiance so it could be used on Silicon Graphics machines. I also made a couple Radiance models for Greg Ward--a chair and a tennis ball. At the time, I don't think Radiance supported textures and in any case Greg wanted a "real" fuzzy model with all of the individual loose threads individually modeled.
Well, I'm just getting my teeth into Radiance, and I have to admit it is quite complicated! Luckily, materials and textures are now abundant if you trawl the net.
Are you familiar with MGF (the Materials and Geometry Format)? It seems to have promise, but the mgf2rad syntax has me stumped. From what I understand of that format, material textures can be 'scanned' with a spectrometer and the nano-scale readings converted into a format Radiance understands. That's new stuff, buddy; holler if it rings a bell!
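For anyone else puzzling over it, here's my guess at what a minimal MGF file looks like, pieced together from the spec -- quite possibly wrong in places, and all the names are invented:

```
# my-guess.mgf -- a grey material and one triangle (names invented)
m grey_paint =
	c
		cxy 0.33 0.33
	rd 0.7
v a = p 0 0 0
v b = p 1 0 0
v c = p 1 1 0
f a b c
```

and then something like `mgf2rad my-guess.mgf > scene.rad` should spit out a Radiance scene description.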
I'm sorry--while it was an honor to work for Greg Ward, it was a long time ago and I can hardly remember any specifics. When I made those models, I did the chair by manually plotting out cross sections on graph paper and typing in coordinates. In the curved areas, I wrote a small C program to calculate coordinates (approximating curves with numerous flat rectangles). That gave me enough practice to write a C program to create the fuzzy tennis ball model.
This was all tedious and done at a very low level. What I did wasn't really a practical thing to do, other than as a small research curiosity.
Doing procedural textures from scratch sounds tricky. Was that tennis ball trick an invoked 'cal' function? I'm no programmer, but it's easy to see how a texture like that could be layered with another to make interesting combinations.
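By a 'cal' function I mean something like this pattern, which modifies a material through a small formula file. This is just my guess at the syntax from examples on the net; the file name and the formula itself are invented:

```
# scene fragment: a brightfunc pattern defined in mydirt.cal
# modifies the plastic material beneath it
void brightfunc dirt_pat
2 dirt mydirt.cal
0
0

dirt_pat plastic dirty_paint
0
0
5 .5 .5 .5 0 0
```

```
{ mydirt.cal -- invented formula: darken toward low z }
dirt = .6 + .4 * Pz;
```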
Back when I was working on it, I don't think Radiance had any support for textures. Each "hair" of the tennis ball was individually modeled as three thin cylinders forming a shallow arc. I came up with a handful of interesting ideas on how to choose the starting point and hair direction randomly (with an even distribution), as well as a particular 48-fold symmetry which meshed well with the way Radiance worked. An interesting thing was that despite the fact that the model was symmetrical, it looked completely random because of the way light hits each symmetrical patch of "fuzz" differently.
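Roughly, the construction looked like this -- a Python toy rather than my original C program, with the material name "felt" and all the dimensions invented:

```python
import math, random

# Toy reconstruction of the idea: scatter hair roots uniformly over a
# sphere, then emit each hair as three short Radiance "cylinder"
# primitives bending away from the root in a shallow arc.

def uniform_sphere_point(rng, radius=1.0):
    """Uniformly distributed point on a sphere (uniform z is uniform area)."""
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (radius * r * math.cos(phi), radius * r * math.sin(phi), radius * z)

def hair_cylinders(name, root, rng, length=0.08, radius=0.004):
    """Three thin cylinders forming a shallow arc from a root point."""
    n = math.sqrt(sum(c * c for c in root))
    normal = tuple(c / n for c in root)      # outward normal at the root
    lean = uniform_sphere_point(rng)         # random direction to bend toward
    pts = [root]
    for i in range(1, 4):
        t = i / 3.0
        bend = t * t                         # quadratic offset = shallow arc
        pts.append(tuple(root[j] + length * t * normal[j]
                         + length * bend * lean[j] for j in range(3)))
    prims = []
    for i in range(3):
        a, b = pts[i], pts[i + 1]
        prims.append(
            f"felt cylinder {name}_{i}\n0\n0\n7 "
            f"{a[0]:.5f} {a[1]:.5f} {a[2]:.5f} "
            f"{b[0]:.5f} {b[1]:.5f} {b[2]:.5f} {radius}"
        )
    return prims

rng = random.Random(42)
scene = []
for k in range(1000):                        # 1000 hairs for a quick demo
    scene.extend(hair_cylinders(f"hair{k}", uniform_sphere_point(rng), rng))
print(len(scene), "cylinder primitives")
```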
Bear in mind that this was back when a 25 MHz 68030 was blazing fast...
Oh yeah, those 030s were hot stuff then. I remember waiting days for renders on my Amiga.
Now, with faster and faster processors, procedural modeling of textures with micro-scale meshes like you did might catch on. It'd be interesting to see a Blender Python script for such 'fuzzy' textures:
shag carpet (back in style now)
cactus or porcupine spines
real mesh tile arrays in any size with grout (instead of mere grids)
mesh orange peel textures with random-size spheres embedded in the surface
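That last one is easy to sketch: scatter random-size spheres over a patch and sink them partway into the surface. A toy Python version, emitting Radiance "sphere" primitives -- the material name "peel" and all the sizes are invented:

```python
import random

# Embed random-size spheres in a flat patch to fake an orange-peel bump
# surface.  Each primitive is: modifier sphere id / 0 / 0 / 4 cx cy cz r

def orange_peel(n, size=1.0, rmin=0.005, rmax=0.02, seed=1):
    rng = random.Random(seed)
    prims = []
    for k in range(n):
        x = rng.uniform(0.0, size)
        y = rng.uniform(0.0, size)
        r = rng.uniform(rmin, rmax)
        z = r * 0.3        # sink most of each sphere below the surface
        prims.append(f"peel sphere bump{k}\n0\n0\n4 {x:.4f} {y:.4f} {z:.4f} {r:.4f}")
    return prims

bumps = orange_peel(500)
print(len(bumps), "sphere primitives")
```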
Back in the late '80s and early '90s, I think most of us expected real time raytracing would become commonplace once processors got fast enough. But in fact, the closest we ever came to that was the "2d" version of ray tracing called "raycasting"--used in games like Wolfenstein 3d and Doom.
After that, 3d accelerated graphics hardware took over. It seems that high resolution textures give more "bang for the buck" than high resolution modeling, even now. We're still at the stage where 3d models are almost universally hollow shells made of flat triangles and fake bumps (bump textures). This is a surprise to me.
I had an Amiga back when I did that internship at LBL, and I wanted to port Radiance to it. However, at the time the Amiga OS had only named pipes, with a syntax completely different from the Unix-style <command>|<filter>|<filter>... syntax. The Amiga shell scripts were different enough from Unix-style scripts to make a port difficult to accomplish and impossible to maintain.
Radiance has a lot going for it: it can render credible results very fast on limited hardware. Unlike decorative renderers that strive to produce realistic-looking results, Radiance relies on physics to produce a close representation of the physical behavior of light.
Yet this rendering, on an old Pentium II @ 266 MHz, took only 15 minutes!
This one took just over 6 hours on the same hardware for enhanced image quality: http://dalani.com/StudioM_files/glass-dropplets2.png
Radiance, like all renderers, can easily eat up CPU cycles when more resolution and less interpolation (i.e., more accuracy) is required. On my old Amiga the same render would have taken weeks to calculate!
it's all relative: yesterday's high tech is today's anachronism...