Wednesday, September 19, 2012

Demo Reel v0.22

Updated the demo reel.

Too much homework recently. I don't have enough time for my personal project.

I hope I can find more time to update the fluid sim. I made lots of improvements over the summer but haven't integrated them into the reel yet.


Friday, September 14, 2012

New demo for the GPU tracer

The image rendered with tone mapping turned out really satisfying, so I made a new demo for the tracer. I turned off the AntTweakBar and the FPS counter for less distraction.

Here's the new demo:

Monday, September 10, 2012

Tone mapping

I made a slight change to the color transfer in my GPU path tracer.

Previously I was using gamma correction with gamma = 2.2; now I'm using the tone mapping operator proposed by Paul Debevec.

The difference is shown below:

In the first comparison group I turned off depth of field in order to focus on the color difference. The render time is only 150s, not sufficient for full convergence but good enough for a color comparison.

Image rendered with tone mapping operator
Image rendered with gamma correction

In the second comparison group I kept all the features on; it took 600s for the image to fully converge.

Image rendered with tone mapping operator

Image rendered with Gamma correction

For the gamma correction group I used a radiance of 16 for the light, but for the tone mapping group I used 75.

Personally, I prefer tone mapping. It's less blown out and looks way better!
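To make the difference concrete, here is a minimal sketch of the two mappings. The gamma curve is the one described above (gamma = 2.2); since the post doesn't give the exact tone-mapping formula, the second function uses a simple global L/(1+L) operator purely as an illustration of why the tone-mapped renders tolerate a much brighter light (radiance 75 vs. 16): high radiance is compressed smoothly toward 1 instead of clipping.

```cpp
#include <cassert>
#include <cmath>

// Plain gamma correction: maps a linear radiance value into display space.
float gammaCorrect(float c, float gamma = 2.2f) {
    return std::pow(c, 1.0f / gamma);
}

// A simple global tone-mapping operator of the form L/(1+L). This is an
// illustration only, not necessarily the operator used in the renders above:
// it never exceeds 1, so very bright lights don't clip to pure white.
float toneMap(float c) {
    return c / (1.0f + c);
}
```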

New tracer

Since I got stuck implementing the GPU kd-tree paper, I decided to turn to something else.

I've been reading about HDR and find it pretty interesting, and I'm not satisfied with my previous two tracers, so I want to rewrite my tracer.

The plan looks something like this:

Step 1: set up all the OpenGL stuff. Render everything to a frame buffer and display the frame buffer over time. Setting up the basic scene is necessary for this step, including the camera and the simplest possible ray trace function (it could just return a color based on the pixel coordinates).
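A step-1 "tracer" really can be this small; a hypothetical sketch (the names are mine, not the project's):

```cpp
#include <cassert>

struct Color { float r, g, b; };

// Step-1 placeholder tracer: returns a color derived purely from the pixel
// coordinates, so the frame buffer display loop can be exercised before any
// real ray tracing exists.
Color tracePixel(int x, int y, int width, int height) {
    return { float(x) / float(width), float(y) / float(height), 0.5f };
}
```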

Step 2: enrich the scene. Set up the object and material classes and implement the basic ray tracing algorithm, which is easy. The material class should be compatible with HDR, since that's what I'm trying to focus on in this project. The result of this step is a ray-traced image of a simple scene (could be a Cornell box with a single sphere).
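For the single sphere in step 2, the core piece is ray-sphere intersection. A self-contained sketch with illustrative names (not the project's actual code):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }

// Returns the nearest positive hit distance t along the ray o + t*d
// (d assumed normalized), or -1 on a miss.
float intersectSphere(Vec3 o, Vec3 d, Vec3 center, float radius) {
    Vec3 oc = sub(o, center);
    float b = dot(oc, d);                  // half of the usual quadratic b
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - c;
    if (disc < 0.0f) return -1.0f;         // ray misses the sphere
    float t = -b - std::sqrt(disc);        // nearer root
    if (t > 0.0f) return t;
    t = -b + std::sqrt(disc);              // farther root (origin inside)
    return t > 0.0f ? t : -1.0f;
}
```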

Step 3: build a CPU kd-tree for complicated objects like the Stanford bunny or armadillo, and replace ray tracing with Monte Carlo path tracing. My material class will have to be better modularized to use BRDFs. The result of this step should be a nicely rendered image that can serve as a reference, though the process will be really slow.
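Switching from plain ray tracing to Monte Carlo path tracing mostly means sampling random bounce directions. The standard cosine-weighted hemisphere sample looks like this (a generic sketch, not the blog's code; the returned direction is in a local frame with the surface normal along +z):

```cpp
#include <cassert>
#include <cmath>

struct Dir3 { float x, y, z; };

// Cosine-weighted hemisphere sampling: u1, u2 are uniform random numbers in
// [0,1). Concentric with the BRDF's cosine term, so it reduces variance
// compared to uniform hemisphere sampling.
Dir3 cosineSampleHemisphere(float u1, float u2) {
    float r = std::sqrt(u1);
    float phi = 2.0f * 3.14159265358979f * u2;
    return { r * std::cos(phi), r * std::sin(phi), std::sqrt(1.0f - u1) };
}
```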

Step 4: try implementing the GPU kd-tree paper again and ship everything onto the GPU, or try implementing photon mapping in the tracer. I haven't thought that far ahead yet.

Hopefully this project will keep me busy for a while. I'll keep the progress updated.

Friday, August 3, 2012

Cube sample

The most important thing today: I finished the cube sampling part.

A cube usually consists of 6 discontinuous surfaces, which makes it hard to construct a level set field. I've actually run several experiments creating level sets from discrete meshes like triangle meshes, but I failed every time because I didn't know how to deal with the discontinuity.

This time I'm using a trick to avoid the discontinuity: with spherical surfaces at the corners and cylindrical surfaces along the edges, I can create correct level set data for a "soft" cube.

This is important because in most cases the scene is just a box, and I need this cube sample to construct all the solid particles.
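The corner-sphere/edge-cylinder trick is exactly the classic rounded-box signed distance function: take the distance to a shrunken box, then subtract a rounding radius. A sketch with hypothetical names, not the simulator's code:

```cpp
#include <cassert>
#include <cmath>

struct P3 { float x, y, z; };

// Signed distance to a "soft" cube with half-extents b and rounding radius r.
// Outside the box the distance comes from the nearest face, edge (cylinder),
// or corner (sphere), so the field is smooth everywhere.
float roundedCubeSDF(P3 p, P3 b, float r) {
    float qx = std::fabs(p.x) - b.x;
    float qy = std::fabs(p.y) - b.y;
    float qz = std::fabs(p.z) - b.z;
    float ox = std::fmax(qx, 0.0f);
    float oy = std::fmax(qy, 0.0f);
    float oz = std::fmax(qz, 0.0f);
    float outside = std::sqrt(ox * ox + oy * oy + oz * oz);
    float inside = std::fmin(std::fmax(qx, std::fmax(qy, qz)), 0.0f);
    return outside + inside - r;
}
```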

Now everything new is pretty much finished; the only thing left is to assemble all the parts. SIGGRAPH is approaching, though, so I might not have enough time to finish this project. Anyway, I'll try my best.


Here's an image of the cube sample. My test scene would be a large spherical water drop falling into a cube, or maybe a water cube falling into a sphere. I'm so excited it's almost done!

Solid particle sample finished

The good news is that I finally understood the sampling process today.

I've read this paper around 10 times to finally figure out the main sampling process. I really wish the authors had explained the basic process more instead of focusing only on their contribution.

Solid particles are sampled only once, at the beginning of the simulation; after that they do not move, or move only with the solid object. The paper mentions a velocity for solid particles: it is a ghost velocity, used only when calculating the viscosity force for fluid particles near the solid surface. During integration, solid particles have no real velocity (as long as the solid object does not move).

Because the relaxation process is really, really slow for fluid volume samples, and because all the fluid and solid sampling only has to be done once per simulation, I decided to write a file exporter/importer. I can run the sampler once with a relatively high relaxation iteration count, getting a better distribution of all the particles, and export the particles to a file. Later, when I'm testing other parts, I can just import the initial particle state from the generated file instead of sampling again.
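A minimal exporter/importer along those lines could look like this; the file layout (a count followed by raw structs) and all names are my own assumptions, not the simulator's actual format:

```cpp
#include <cassert>
#include <cstdio>
#include <vector>

struct Particle { float x, y, z; int type; };  // type: e.g. fluid/solid/ghost

// Dump the relaxed particle state to disk so the slow relaxation only has
// to run once.
bool exportParticles(const char* path, const std::vector<Particle>& ps) {
    FILE* f = std::fopen(path, "wb");
    if (!f) return false;
    size_t n = ps.size();
    std::fwrite(&n, sizeof(n), 1, f);
    std::fwrite(ps.data(), sizeof(Particle), n, f);
    std::fclose(f);
    return true;
}

// Reload the initial particle state; returns an empty vector on failure.
std::vector<Particle> importParticles(const char* path) {
    std::vector<Particle> ps;
    FILE* f = std::fopen(path, "rb");
    if (!f) return ps;
    size_t n = 0;
    if (std::fread(&n, sizeof(n), 1, f) == 1) {
        ps.resize(n);
        if (std::fread(ps.data(), sizeof(Particle), n, f) != n) ps.clear();
    }
    std::fclose(f);
    return ps;
}
```

A real version would also want a header with a version number, since raw struct dumps break if `Particle` ever changes layout.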

Now the biggest problem for me is projection. In the sampling process, a random sample is supposed to be projected onto the surface, which is described by a level set. When I tried to use the level set grid to do the projection, I ended up with a crappy result. I think that's a fault in my implementation. Hopefully I'll fix it tomorrow, because I don't want to be limited to implicit surfaces only.
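For reference, projecting a point onto the zero level set usually means stepping along the gradient by -phi(p); for a true signed distance field one step is exact, and a few iterations mop up interpolation error. A sketch using an analytic sphere as a stand-in level set (a grid version would interpolate phi and its gradient instead):

```cpp
#include <cassert>
#include <cmath>

struct V3 { float x, y, z; };

// Stand-in signed distance field: a sphere of the given radius at the origin.
float phiSphere(V3 p, float radius) {
    return std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z) - radius;
}

// Project p onto the phi = 0 surface: p <- p - phi(p) * grad(phi)(p).
// For the sphere SDF the gradient is simply p / |p|.
V3 projectToSurface(V3 p, float radius, int iters = 3) {
    for (int i = 0; i < iters; ++i) {
        float d = phiSphere(p, radius);
        float len = std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
        if (len < 1e-8f) break;  // gradient undefined at the origin
        p = { p.x - d * p.x / len, p.y - d * p.y / len, p.z - d * p.z / len };
    }
    return p;
}
```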

I'll post a picture of all the kinds of particles once I've fixed the existing problems.

Thursday, August 2, 2012

Ghost SPH Sample Finished

Finished the sampling part of Ghost SPH.

The sampling process takes a few steps, as follows. The input is a level set field that describes the shape of the fluid.

1. Sample the surface, namely those voxels where the level set changes sign.

2. Relax the surface particles. This yields a better (blue noise) distribution of surface particles.

3. Sample the interior of the fluid, using the surface particles as initial seeds.

4. Apply volume relaxation. This is an extremely slow process: it takes multiple iterations, and within each iteration a neighbor search has to be run for every particle.

5. Sample air particles. This is similar to the volume sampling: using the surface particles as seeds, sample air particles outside the surface within a single-smoothing-kernel-wide layer, with the help of the level set.

The result so far looks like this:

The green particles are ghost particles, and the blue ones are fluid particles.

The next step is transplanting all the calculation parts of my previous SPH simulator into this new one.

My understanding of the paper might not be 100 percent correct; I still have a few points of confusion. I've been in contact with the authors, and hopefully they can resolve my questions.

I'll keep you updated.