Part 1
Most 3D and compositing packages
offer some sort of particle-system toolset. They usually come with
a nice set of examples and demos showing all the stunning things you
can do within the product. However, the way to really judge a
toolset's capabilities is often not by the things the software can do,
but by the things it cannot. And since practice shows it is not
so easy to think of everything one might miss in a real production
ahead of time, I have put together this checklist.
Flexible enough software allows for
quite complex effects
like this spiral galaxy, created with particles
alone.
Even if some possibilities are not
obvious out of the box, you can usually make particles do pretty much
anything with the aid of scripting or some application-specific
techniques. It often requires adjusting your thinking to a
particular tool's paradigms, and I personally find acquaintance
with different existing products a big help here.
So even if you have already made up your mind on a
specific program, you might still find the following list useful as a
set of exercises: figuring out how to achieve the described
functionality within your app.
Overall concerns
The first question is whether it is a
two- or three-dimensional system, or whether it allows both modes. A
2D-only solution is limited by definition; however, it can
offer some unique speed optimizations and convenient features like
per-particle blurring control, extended support for blending modes
and the ability to use data directly from images to control the particles.
The ability to dial in existing 3D camera motion is quite important
in a real production environment.
In general, it is all about control.
The more control you have over every conceivable aspect of a particle's
life, the better. And it is never enough, since the tasks at hand
are typically custom by their very nature. One distinctive aspect of
this control is the data flow. How much data can be transferred into
the particle system from the outside, passed along inside and output
back at the very end? Which particle properties can it affect? We
want it all.
The quest for control also means that
if the system doesn't have some kind of modular arrangement (nodes,
for instance), it is likely to be limited in functionality.
Examples of particle nodes in
Houdini (above) and Fusion (below)
Emission features
Following the good old-fashioned
tradition of starting at the beginning, let's start with particle
emission.
What are the predefined emitter types
and shapes and, most importantly, does it allow for user-defined
emitter shapes? You can only get so far without custom sources;
input geometry or image data increase the system's flexibility
enormously. Emitting from custom volumes opens up great
possibilities as well. What about emission from animated objects? Can
the emitter's velocity and deformation data be transferred to the
particles being born? For geometry input, particle birth should
be possible from both the surface and the enclosed volume of the
input mesh, and then we often want some way of restricting it to
certain areas only. To achieve real control, therefore,
texture information needs to be taken into account as well.
Geometry normally allows for cruder
control compared to image data, so we want all kinds of particle
properties (amount, size, initial velocity, age, mass, etc.) to be
controllable through texture data, using as many of the texture
image's channels as possible. For instance, you might want to set the
particles' initial direction with vectors stored in the RGB channels of
an image, use the alpha or any other custom channel to control their
size, and use the emitter's texture to assign groups to particles
for further manipulation. The same applies to driving particle
properties with different volumetric channels (density, velocity,
heat) or dialing an image directly into a 2D particle system as a
source.
Does your system allow you to create
custom particle attributes and assign their values from a source's
texture?
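As a minimal sketch of the idea, here is how RGB channels could drive initial direction and alpha could drive size at emission time. This is pure illustration in plain Python, not any package's API; the image layout and function names are assumptions.

```python
import random

# Hypothetical texture-driven emission: the image is a 2D list of
# (r, g, b, a) tuples with channels in 0..1. RGB encodes the initial
# direction (remapped from 0..1 to -1..1), alpha controls particle size.

def emit_from_texture(image, count, seed=0):
    rng = random.Random(seed)
    h, w = len(image), len(image[0])
    particles = []
    for _ in range(count):
        x, y = rng.randrange(w), rng.randrange(h)
        r, g, b, a = image[y][x]
        particles.append({
            "velocity": (r * 2 - 1, g * 2 - 1, b * 2 - 1),  # RGB -> direction
            "size": a,                                       # alpha -> size
        })
    return particles

# A 1x1 texture pointing straight up (+Y), with half-size particles:
tex = [[(0.5, 1.0, 0.5, 0.5)]]
print(emit_from_texture(tex, 1)[0])  # {'velocity': (0.0, 1.0, 0.0), 'size': 0.5}
```

A real system would sample the texture in the emitter's UV space rather than by pixel, but the data flow is the same: image channels in, particle attributes out.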
The look options
Now consider the options available for
the look of each individual particle. Both instancing of custom geometry
and sprites are a must for a full-featured 3D particle system*, and
it goes without saying that animation should be supported for both.
Are there any additional special particle types available which
allow for speed benefits or additional techniques? One example is
single-pixel particles, which can be highly optimized and thus
available in huge amounts (as in Eyeon Fusion, for instance),
allowing for a whole set of quite unique looks.
*Rendering a static
particle system as strands for representing hair or grass is yet
another possible technique which some software might offer.
An effect created with millions of
single-pixel particles
Another good example is metaballs:
while each one is merely a blob on its own, when instanced over a
particle system (especially if the particles can control their
individual sizes and weights) metaballs become a powerful effects
and modeling tool.
A particle system driving the
metaballs
Whether using sprites or geometry,
getting real power requires versatile control over the timing and
randomization of these elements. Can an element's animation be offset
for every individual particle to start when it is born? Can a
random frame of the input sequence be picked for each particle? Can
this frame be chosen based on the particle's attributes? Can the input
sprite's or geometry's animation be stretched to the particle's
lifetime? (So that if you have a clip of a balloon growing out of
nowhere and eventually blowing up, you could match it to every
particle in such a way that no matter how long a particle lives, the
balloon's appearance coincides with its birth and the blow-up
exactly matches its death.)
With a good level of randomization
and timing management, animated sprites and meshes are quite powerful
for creating many effects, from fire to crowds.
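The lifetime-stretching behavior described above boils down to one remapping: normalized age picks the frame. A sketch, with made-up names:

```python
# Map a particle's age onto an input clip so that frame 0 lands on birth
# and the last frame on death, whatever the particle's lifespan.

def sprite_frame(age, lifetime, num_frames):
    t = max(0.0, min(age / lifetime, 1.0))       # normalized age, 0..1
    return min(int(t * num_frames), num_frames - 1)

# A 48-frame balloon clip mapped onto particles with different lifespans:
print(sprite_frame(0.0, 2.0, 48))   # birth  -> frame 0
print(sprite_frame(1.0, 2.0, 48))   # midway -> frame 24
print(sprite_frame(5.0, 5.0, 48))   # death  -> frame 47 (last frame)
```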
Rotation and spin
The last set of controls we're going
to touch upon in this first part are the rotation and spin
options. Although an "always face camera" mode is very useful and
important, it is also important to have the option to exchange it for
a full set of 3D rotations, even for flat image instances like
sprites (think small tree leaves, snowflakes or playing cards). A
frequently required task is to have the elements oriented along their
motion paths (arrows in flight, for example). And of course, having
an easy way to add randomness and spin, or to drive those through
textures or other particle attributes, is of high value.
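Orienting along the motion path is usually just a matter of deriving angles from the velocity vector. A sketch under a Y-up convention (the function name and angle conventions are my own, not any package's):

```python
import math

# Derive yaw and pitch (in degrees) from a per-particle velocity vector,
# so an instanced element (an arrow, say) points where it is flying.

def aim_from_velocity(vx, vy, vz):
    yaw = math.degrees(math.atan2(vx, vz))       # heading around the Y axis
    horiz = math.hypot(vx, vz)                   # speed in the ground plane
    pitch = math.degrees(math.atan2(vy, horiz))  # tilt up or down
    return yaw, pitch

# An arrow flying along +Z and equally fast upward tilts up 45 degrees:
print(aim_from_velocity(0.0, 1.0, 1.0))  # (0.0, 45.0)
```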
Next time we'll look further at the
toolset required to efficiently drive particles later in their lives.
Part 2
Now let's take a look at the further life of a
particle. The key concept and requirement
stay the same: maximum control over every conceivable parameter, and
undisrupted data flow through a particle's life and between the
different components of the system.
The first question I would ask after
the emission is how many particle properties can be controlled
by age. Color and size are a must, but it is also
important for age to be able to influence any other arbitrary
parameter, and in a non-linear fashion (like plotting an
age-to-parameter dependency graph with a custom curve). For example,
when building a dust cloud with sprites, you might want to increase
their size while decreasing opacity towards the very end of the
lifetime.
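The dust-cloud example amounts to two age-to-parameter curves, one linear and one deliberately non-linear. A sketch with arbitrary curve shapes:

```python
# Non-linear age-driven look for a dust sprite: size grows over the whole
# lifetime, while opacity holds and then drops off sharply near the end.

def dust_look(norm_age):                  # norm_age in 0..1 (age / lifetime)
    size = 1.0 + 2.0 * norm_age           # linear growth, 1x -> 3x
    opacity = 1.0 - norm_age ** 4         # stays high, fades at the very end
    return size, opacity

print(dust_look(0.0))   # (1.0, 1.0)    freshly born
print(dust_look(0.5))   # (2.0, 0.9375) mid-life: bigger, barely dimmer
print(dust_look(1.0))   # (3.0, 0.0)    death: largest and fully faded
```

In a real package these curves would be drawn as ramps rather than written as formulas, but the mapping is the same.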
Can custom events be triggered at
certain points of a particle's lifetime? Can the age data be
transferred further to those events?
Spawning
Spawning (emitting new particles from
existing ones) is key functionality for a particle system.
Its numerous applications include changing the look of a particle
based on events like collisions or parameters like age, creating
different kinds of bursts and explosions, and creating all sorts of
trails. The classic fireworks effect is a good example where spawning
is used in at least two ways: it creates the trails by
generating new elements behind the leading ones, and it produces the
explosion by generating new leads from the old ones at the desired
moment.
In a fireworks effect spawning is
used to create both the trails and the explosion
Just like with the initial emission
discussed last time, it is paramount to be able to transfer as
much data as possible from the old particles to the new ones.
Velocity, shape, color, age and all the custom properties should be
easily inheritable when required, or settable from scratch as an
option.
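The "inherit or set from scratch" choice can be sketched as a copy-with-overrides operation. A hypothetical illustration, not any package's spawning API:

```python
# Spawn a child particle that inherits the parent's attributes by default,
# with any attribute optionally overridden (set from scratch).

def spawn_from(parent, **overrides):
    child = dict(parent)       # inherit everything by default
    child["age"] = 0.0         # a new particle always starts at age zero
    child.update(overrides)    # selectively set attributes from scratch
    return child

lead = {"velocity": (0.0, 5.0, 0.0), "color": (1.0, 0.8, 0.2), "age": 1.2}
# A trail element: inherits the lead's color, but starts at rest.
trail = spawn_from(lead, velocity=(0.0, 0.0, 0.0))
print(trail)  # {'velocity': (0.0, 0.0, 0.0), 'color': (1.0, 0.8, 0.2), 'age': 0.0}
```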
Last but not least among the spawning options
is recursion. A good software solution lets the user
choose whether newly spawned elements are themselves used as a base
for spawning in each subsequent time-step (spawning recursively) or not.
Although a powerful technique, recursive spawning can quickly get out
of hand as the number of elements keeps growing exponentially.
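A quick back-of-the-envelope comparison makes the danger concrete: with a spawn rate r per step, non-recursive spawning grows linearly, while recursive spawning multiplies the population by (1 + r) every step.

```python
# Compare particle counts with and without recursive spawning,
# given a per-step spawn rate per live (or original) particle.

def population(start, rate, steps, recursive):
    n, old = start, start
    for _ in range(steps):
        n += (n if recursive else old) * rate
    return n

print(population(10, 2, 5, recursive=False))  # 110  (linear: +20 per step)
print(population(10, 2, 5, recursive=True))   # 2430 (x3 per step: 10 * 3**5)
```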
Behavior control
Forces are the standard way of driving
motion in a particle system. The default set of directional,
point, rotational, turbulent and drag forces aside*, it is important
to have easily controllable custom force functionality with
visual control over its shape. The ability to use arbitrary 3D objects
or images as forces comes in very handy here.
*An often
overlooked drag force (sometimes called friction) plays a very
important role, as it counteracts the other forces and keeps them
from growing out of control.
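The balancing role of drag is easy to see in a one-line integrator: a drag term proportional to velocity opposes the driving force, so velocity settles at a terminal value instead of growing without bound. A sketch with arbitrary constants:

```python
# Explicit-Euler velocity update with a constant driving force and a
# drag force proportional to velocity. Velocity converges to the
# terminal value force / drag_k, where drag exactly cancels the force.

def step_velocity(v, force, drag_k, dt):
    return v + (force - drag_k * v) * dt

v = 0.0
for _ in range(1000):
    v = step_velocity(v, force=9.8, drag_k=2.0, dt=0.01)
print(round(v, 3))  # 4.9 -- the terminal velocity 9.8 / 2.0
```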
Forces raise the next question: how
much can the system be driven by physical simulations? Does it
support collisions? What types of collision objects are supported,
what are the options for post-collision behavior, and how much data
can a particle exchange with the rest of the scene in a collision?
Can further physical phenomena like
smoke, combustion or fluid behavior be simulated within the particle
system? Can this kind of simulation data be dialed into the system
from the outside? One efficient technique, for instance, is to drive
the particles with the results of a low-resolution volumetric
simulation, using them to increase its level of detail.
The particle effect above uses the
low-resolution
smoke simulation shown below as the custom force
|
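That technique reduces to advecting particles through the simulation's coarse velocity field. A 2D sketch with nearest-cell sampling (a real setup would interpolate between cells); the grid layout and names are assumptions:

```python
# Advect a particle through a coarse velocity grid standing in for a
# low-resolution smoke sim: sample the containing cell's velocity and
# step the position forward.

def advect(p, grid, cell_size, dt):
    i = min(int(p[0] / cell_size), len(grid) - 1)      # cell index from x
    j = min(int(p[1] / cell_size), len(grid[0]) - 1)   # cell index from y
    vx, vy = grid[i][j]
    return (p[0] + vx * dt, p[1] + vy * dt)

# A 2x2 grid: the left column pushes right, the right column pushes up.
grid = [[(1.0, 0.0), (1.0, 0.0)],
        [(0.0, 1.0), (0.0, 1.0)]]
print(advect((0.5, 0.5), grid, cell_size=1.0, dt=1.0))  # (1.5, 0.5)
```

Millions of such particles riding a grid of a few thousand cells is what lets the particles carry detail the simulation itself never resolved.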
The next things commonly required for
directing particles are follow-path and find-target
functionality. Support for animated paths and targets is of value
here, as is the ability to force the goal to be reached within a
certain timeframe.
Many interesting effects can be
achieved if the particles have some awareness of each other (such as
knowing their nearest neighbor). Flocking-style forces can then be
used to simulate collective behavior.
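The building block of such awareness is a neighbor query. A brute-force sketch (real systems use spatial acceleration structures for this, since brute force is quadratic in the particle count):

```python
import math

# Find the nearest neighbor of one particle among a list of positions --
# the raw ingredient of flocking forces like separation and cohesion.

def nearest_neighbor(index, positions):
    best, best_d = None, float("inf")
    for j, q in enumerate(positions):
        if j == index:
            continue
        d = math.dist(positions[index], q)
        if d < best_d:
            best, best_d = j, d
    return best, best_d

pts = [(0.0, 0.0), (3.0, 0.0), (0.0, 1.0)]
print(nearest_neighbor(0, pts))  # (2, 1.0) -- the point one unit above
```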
Limiting the effect
For any force or event (including
spawning) which may be added to the flow, let's now consider the
limiting options. What are the ways to restrict the effect of each
particular operator? Can it be restricted to a certain area only? A
certain lifespan? Can it be limited with custom criteria like
mathematical expressions, arbitrary particle properties or a
probability factor? How much control does the user have over the
active area of each operator: custom shapes, textures, geometry,
volumes? Is there a grouping workflow?
Groups are a powerful technique for
controlling particles. The concept implies that at the point of
creation, or further down a particle's life, it can be assigned to
some group, and then each single effect can simply be limited to
operate on the chosen groups only. For efficient work, all the
limiting options just discussed should be available as criteria for
group assignment. Plus, the groups themselves should be subject to
logic operations (like subtraction or intersection), should not be
limited in number, and should not claim a particle exclusively.
For example, you might want to group some particles
based on speed, others based on age, and then create yet another
group as the intersection of the first two.
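That workflow maps directly onto set operations over particle ids. A sketch with arbitrary thresholds:

```python
# Group assignment by criteria, then logic operations between groups,
# using plain Python sets of particle ids.

particles = {0: {"speed": 5.0, "age": 0.2},
             1: {"speed": 5.0, "age": 3.0},
             2: {"speed": 0.5, "age": 3.0}}

fast = {i for i, p in particles.items() if p["speed"] > 1.0}
old  = {i for i, p in particles.items() if p["age"] > 1.0}

print(sorted(fast & old))   # [1] -- intersection: fast AND old
print(sorted(fast - old))   # [0] -- subtraction: fast but not old
```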
Further considerations
The last set of questions I would
suggest may have less connection with the direct capabilities of a
given system, yet they can make a huge difference in a real
deadline-driven production.
What is the maximum number of
particles which the system can manage interactively and render? Are
there optimizations for certain types of elements? What kind of
data can be output from the particle system for further manipulation?
Can the results be used for meshing into geometry later, or in another
software package, for example? Can a particle system deform or
otherwise affect the rest of the scene? Can it be used to drive
another particle system?
Can the results of a particle
simulation be cached to disk or memory? Can it be played backwards
(is scrubbing back and forth across the timeline allowed)? Are there
options for a pre-run before the first frame of the scene?
Does the system provide good visual
aids for working interactively in the viewport? Can individual
particles be manually selected and manipulated? This last question
can often be a game-changer when, after days of building the setup
and simulating, everything works except for a few stray particles
which no one would really miss.
Aside from the presence of
variation/randomization options, which should be available for the
maximum number of parameters in the system, how stable and
controllable is the randomization? If you reseed one parameter,
will the rest of the simulation stay unaffected and preserve its
look? How predictable and stable is the particle solver in general?
Can the particle flow be split into several streams, or merged
together from several?
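One common way to get that stability is to give every parameter its own independent random stream, so reseeding one leaves the others untouched. A sketch of the idea (the string-keyed seeding scheme is my own illustration):

```python
import random

# Per-parameter random streams: reseeding "size" changes only the size
# values, while the "speed" stream -- and the look it drives -- is preserved.

def random_stream(parameter, seed, count):
    rng = random.Random(f"{parameter}:{seed}")   # one stream per parameter
    return [rng.random() for _ in range(count)]

sizes_a = random_stream("size", seed=1, count=3)
sizes_b = random_stream("size", seed=2, count=3)   # reseed size only
speeds  = random_stream("speed", seed=1, count=3)

print(sizes_a != sizes_b)                       # True: sizes changed
print(speeds == random_stream("speed", 1, 3))   # True: speeds unaffected
```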
And as the closing point in this list
for evaluating a particular software solution, it is worth
considering the quality and accessibility of the documentation,
together with the amount of available presets and samples and the
size of the user base. Trying to dig through a really complex system
like Houdini or Thinking Particles would be quite daunting without
those.