Matt Faw
September 21st, 2012, 10:15 PM
Dear all,
I wanted to share a few recent experiments using particles in negative z-space. I was blown away by the falling ash scene in Avatar (and the rain in Meet the Andersons), and have been looking for a chance to play in that territory, myself.
Since particles can easily exist in that space without the risk of edge violations, they seem to be a useful way of keeping the negative z-space active. Of course, a deep particle system can also produce extreme differences in convergence, but the small size and the movement seem to keep the brain filling in the gaps and pulling off feats of convergence almost by assumption. What I mean is: still frames of some of these extreme negative-depth particles aren't easy to converge, but when they're moving, my brain is so busy making sense of it all that it fills in any gaps, and everything makes relative sense. This is probably especially true of the last shot in the following sequence:
3D Z-Space Particles tests - YouTube
Most of the above was generated in 3DS Max, using David Shelton's 3D Hippie Stereocam rig, and composited in After Effects. Of course, I recommend seeing this in HD, full-screen.
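To put some rough numbers on just how extreme those convergence differences can get, here's a quick back-of-the-envelope Python sketch. It's my own simplified parallel/off-axis rig model, nothing pulled from the Hippie Stereocam itself, and all the rig, screen, and viewing figures are made up purely for illustration:

# Back-of-the-envelope stereo parallax sketch (simplified parallel/off-axis model).
# Hypothetical numbers only; not derived from the Stereocam rig or the test footage.
#
# For a point at camera distance z, screen parallax is roughly:
#     p = interaxial * (1 - convergence_distance / z)
# Negative p means the point lands in negative z-space (audience space).
# Perceived depth for a viewer then follows from similar triangles:
#     perceived = eye_sep * viewing_dist / (eye_sep - parallax_on_screen)

def screen_parallax(z, interaxial, convergence):
    """Parallax (in scene units) for a point at camera distance z."""
    return interaxial * (1.0 - convergence / z)

def perceived_depth(parallax_mm, eye_sep_mm=65.0, viewing_dist_mm=3000.0):
    """Distance from the viewer (mm) at which the fused point appears."""
    return eye_sep_mm * viewing_dist_mm / (eye_sep_mm - parallax_mm)

if __name__ == "__main__":
    interaxial = 65.0            # mm, hypothetical rig interaxial
    convergence = 5000.0         # mm, screen plane set 5 m into the scene
    scene_width_at_screen = 4000.0   # mm of scene captured at the screen plane
    display_width = 2000.0           # mm, a 2 m wide display

    for z in (500.0, 1000.0, 2500.0, 5000.0, 10000.0):
        p_scene = screen_parallax(z, interaxial, convergence)
        p_real = p_scene * display_width / scene_width_at_screen
        print(f"particle at {z/1000:>4.1f} m: parallax {p_real:+7.1f} mm on screen, "
              f"perceived ~{perceived_depth(p_real)/1000:4.2f} m from viewer")

With those made-up numbers, a particle half a meter from the camera ends up with almost 30 cm of negative parallax on a 2 m screen, while one at the convergence plane sits right at zero, which is the spread the eyes are being asked to jump across within a single deep particle system.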
I'm curious about y'all's experience in this territory: what experiments or jobs have you had that deal with this, where have the particles just not worked, or what ways of approaching them have been novel and exciting?