Content

  1. What is Motion Tools?
  2. Getting Started
  3. Special Cases
  4. Tutorials
  5. Node Reference

01 What is Motion Tools?

Motion Tools is a small collection of tools that aims to aid motion graphics work created inside Softimage. It is basically a set of preset tools that should make your work faster. Currently Motion Tools addresses the creation of points and instances, and the procedural animation and simulation of such points.

Since it is based on factory nodes and standard workflows, it allows you, and encourages you, to keep using ICE to push your effects further, so that they have a unique look.

02 Getting Started

Installation is pretty straightforward: just drag and drop MotionTools.xsiaddon into Softimage.

If you have Python correctly installed in Softimage (all Windows installations should; if yours doesn’t, click here), a new “Motion Tools” menu section should show up under ICE > Create, as shown in the image to the right.

In this menu you will find the 3 basic things you can do with Motion Tools:

  • Create Instances
  • Modify (and animate) these instances
  • Simulate these instances

First things first, you should create the instances you want to animate or simulate. When you create instances in any way, Motion Tools creates a new PointCloud object, and inside an ICETree it creates points and arranges them in the order you have selected. Any modifier you add to these instances is added to the same root of this ICETree. So the basic anatomy of a Motion Tools setup is as follows:

  1. In the first space we connect nodes that generate instances
  2. In the second node we set basic parameters like shape and color. In case you want to simulate your instances, this is where you set Mass, State and RBD properties.
  3. From here on you can plug as many modifications as you want into your instances. These may be the modifiers that come with Motion Tools, or any other ICE node that changes properties of points in a pointcloud.

The menus will get your basic setup going, but don’t be shy about using the ICETree yourself. It will help you get the most out of your work.

Apart from this basic setup there are two other special cases we need to go over: Controlling Polygons and Polygon Islands with instances, and Simulating Instances.

03 Special Cases

Controlling Polygons and Polygon Islands

There are some special cases in Motion Tools that deviate from the basic setup. We stumble upon such cases when, for example, you want to control polygon islands from a fracture object or a text object, or when you want to dice an object into many polygons and control those disconnected elements.

For both of these situations you can use the menu “ICE > Create > Motion Tools > Create Instances from Islands” and then choose the option that fits you best: either “Use Existing Polygon Islands” or “Dice Polygons into Polygon Islands”.

Both situations will yield a similar setup that we can describe this way. Firstly, the object in which the Polygon Islands exist (or in which the polygons will be sliced into Polygon Islands) receives an ICETree that looks like this:

  1. The ICETree is created at the uppermost position of the Modeling Stack
  2. In it you will find only one node, which gathers info about the geometry
  3. This node works in two ways, it either stores info on your current Polygon Islands, or it makes every polygon a Polygon Island and then stores the information.

The information created by this node then feeds another ICETree that lives in a Pointcloud and resembles a basic Motion Tools setup, with minor differences, as follows:

  1. Points are created solely based on the information provided by the geometry, using the “Create Points from Polygon Islands” node
  2. Constraint PolyIslands to Pointcloud feeds the information from your Motion Tools ICETree back to your geometry, keeping both in sync. In case you are using simulations, this node should be placed in the Post-Simulation area.

Simulations vs Procedural Animation

The basic Motion Tools setup is non-simulated. That is by design, in order to enhance speed and interactivity, as many effects can be tackled this way. That said, some effects and cumulative behaviours may only be achieved through simulation. The basic difference between the procedural animation approach and the simulation approach is that in the first, creation, positioning and modification of the effects are done at every frame, while in the second, creation and positioning are done once (at the first frame) while modifications are done at every frame and may, or may not, accumulate over time.
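
As a rough mental model of that difference (plain Python for illustration only, not actual ICE code; the rate value is made up):

    RATE = 0.1  # hypothetical offset per frame

    def procedural_position(frame):
        # Re-evaluated from scratch at every frame: the result depends only on
        # the current frame, so scrubbing the timeline always gives the same answer.
        return RATE * frame

    def simulated_position(previous, frame):
        # Evaluated incrementally: the result depends on the previous frame's
        # state, so modifications accumulate and frames must be played in order.
        return previous + RATE

    pos = 0.0
    for frame in range(1, 6):
        pos = simulated_position(pos, frame)
        print(frame, round(procedural_position(frame), 2), round(pos, 2))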

Therefore there are two standard setups when working with Motion Tools + Simulation:

  • Simulate Always: The first one is to use Motion Tools to create and arrange the instances in space and then leave all the animation up to the simulation. In this setup one can NOT procedurally animate the instances.
  • Simulate when Triggered: The second approach is to use Motion Tools to create and arrange the instances in space, using a special node to apply procedural animation inside the simulation framework, and using a trigger to kick the instances in and out of a cumulative behaviour in the simulation.

Simulate Always

In this setup an ICETree containing a basic Motion Tools setup is created in the modeling stack. Any modifiers used will be evaluated at the first frame only, and remain static for the rest of the simulation:

Remember that the Set Initial Shape and Color node also initializes parameters that are relevant to simulation, like Mass and RBD properties.

And an ICETree containing a basic Bullet RBD simulation is created in the simulation stack:

For more information on simulations and how to set them up please refer to the Softimage User Guide.

Simulate when Triggered

In this setup an ICETree containing a basic Motion Tools setup is created in the modeling stack. Any modifiers used will be evaluated at the first frame only, and remain static for the rest of the simulation:

Remember that the Set Initial Shape and Color node also initializes parameters that are relevant to simulation, like Mass and RBD properties. Most importantly, this node SETS THE STATE of these instances. The default state is 0.

An ICETree is created in the simulation stack. In order to kick particles in and out of the cumulative behaviour of simulations, we set states where things accumulate and states where they do not. The image below describes this setup:

  1. The first ports of the simulation root take care of what will be simulated in the default state 0.
  2. The “Apply Modifiers Inside Simulation” node will allow you to use your modifiers while avoiding a cumulative effect in their behaviour.
  3. Use a Per Point boolean to trigger a change in state. In this case we used Test Inside Null. There are many “Test…” nodes in Softimage, and all of them can be used as triggers.
  4. You can plug in as many states as you like.
  5. The states you plug in have their own simulation properties, allowing for very custom looks.

Tip: If you are going for this setup, remember to set your State particles as passive RBDs, so they won’t move when hit by particles in other states.

For more information on simulations and how to set them up please refer to the Softimage User Guide.

Concatenate Modifiers’ Modulation

There are two ways to have many modifiers respecting the same modulation (mapping and blending of values). The first one is to choose a scalar value, or a modulator, and input that as the blend value, or into any other green port in your modifier:

Another way to achieve this is to use the Modulate output port of any modifier to control values in subsequent modifiers. This has the same effect as the above, but is created in this manner:

Last, but not least, one should pay attention to the fact that the result of one modification may change the modulation of the next modifier. For example, let’s say you want to modulate transformations for points inside a volume, much like in the last two examples: if the first transformation you do changes the positions of the objects, they may no longer meet the criterion when evaluated by the second modifier. In order to evaluate and apply all modifications at the same time, use a “Delay Set Data” node, like so:
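
A minimal sketch of the issue in plain Python (illustrative only; the positions, the “inside the volume” threshold and the offset are all made up):

    points = [0.5, 0.75, 1.5]   # hypothetical X positions

    def inside(x):
        return x < 1.0          # stand-in for an "inside the volume" test

    OFFSET = 1.0

    # Sequential evaluation: the second modifier tests positions that the first
    # modifier has already moved, so points can fall out of the criterion.
    seq = [x + OFFSET if inside(x) else x for x in points]   # modifier 1
    seq = [x + OFFSET if inside(x) else x for x in seq]      # modifier 2
    print(seq)   # [1.5, 1.75, 1.5] -> the second modifier affected nothing

    # Delayed evaluation: both modifiers test the original positions and the
    # accumulated offsets are applied together at the end (the Delay Set Data idea).
    off = [OFFSET if inside(x) else 0.0 for x in points]                     # modifier 1
    off = [o + (OFFSET if inside(x) else 0.0) for o, x in zip(off, points)]  # modifier 2
    print([x + o for x, o in zip(points, off)])   # [2.5, 2.75, 1.5]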

04 Tutorials

The links below aim to show you how to achieve some common looks with Motion Tools:

05 Node Reference

Generators

Instancer Root

The central hub of all Motion Tools setups. There is nothing special about it; you could connect all your nodes directly into execution ports. It only helps you keep organized, as the steps are clearly stated (Create Points, Initial Values, Modifiers), and it separates modifications on a per-root basis.

  • Create Points: Just an execution port. Plug your point creation nodes here
  • Initial Values: Here you should plug the “Set Initial Shape and Color” node, to apply all of its basic parameters.
  • Modifiers: Here you should plug any node that will change position, rotation, scale, color, or any other attributes of your points. You can plug in Motion Tools modifiers, or any other ICE node that deals with particles.

Set Initial Shape and Color

Along with the Instancer Root, this is a node under the Generator category that does not actually generate points. This node sets all the relevant attributes of the instances that do NOT concern their position in space; position is set in the generator node itself. The reason for this is so that the user may experiment with different point creation nodes without messing up their look.

  • Color: RGBA color for all particles
  • Size: Size of the radius of a particle
  • Shape: Shape of a particle, all regular shapes available in ICE (point, disc, cube, etc…)
  • Hide Instance Master: This will change visibility parameters of your instance object or group, in order to hide or show the master object
  • Randomness: A value from 0-1, where 0 represents no randomness in the way instanced shapes are chosen, and 1 is total randomness
  • ID: The State ID in which all particles in this root are created; relevant for simulation
  • RBD properties: Please refer to the Softimage documentation, as these are standard Softimage RBD properties

Create Point Array

Create Point Array is the simplest point creation node in Motion Tools. It does not rely on input from any other objects, only on parameters set by the user. Although there are three different modes of point creation in this node, most of the parameters are shared amongst them.

  • Mode: There are three ways to arrange points in space with this node (Linear, Grid, Radial), as shown in the images below
  • Starting Point: Lowest X,Y,Z value for points (in Local Space). Points are emitted between the Start and End Points.
  • End Point: Greatest X,Y,Z value for points (in Local Space). Points are emitted between the Start and End Points. However, if you have Step Mode turned ON, End Point is the position of the second point, and the distance between the Start and End Points is kept for further repetitions in all axes.
  • SP and EP Control Objects: Connect nulls, or other objects, to control the Starting Point and End Point
  • Step Mode: When Step Mode is off, repetitions are contained between the Start and End Point; when Step Mode is on, the distance between the Start and End Point is used as an offset for each repetition (see the sketch after this list)
  • Align: Aligns particles to the orientation most relevant in that mode
  • Angle: The angle of rotation between the Start and End Points. In Step Mode it represents the rotation at each step
  • Offset: Offsets points along the polar coordinate
  • Radius Start and Radius End: Define the radius for the Radial mode
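
As a rough illustration of the Step Mode difference in Linear mode (plain Python, one axis only, not the node’s actual implementation):

    def linear_points(start, end, count, step_mode=False):
        if step_mode:
            # Step Mode on: End Point is the position of the second point and the
            # start-to-end distance is repeated as an offset for every next point.
            step = end - start
            return [start + step * i for i in range(count)]
        # Step Mode off: points are spread evenly between Start and End Point.
        return [start + (end - start) * i / (count - 1) for i in range(count)]

    print(linear_points(0.0, 9.0, 4))                   # [0.0, 3.0, 6.0, 9.0]
    print(linear_points(0.0, 9.0, 4, step_mode=True))   # [0.0, 9.0, 18.0, 27.0]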

Create Point Array on Geometry

Create Point Array on Geometry will create points based on a geometry’s elements (Vertex, Edge, Polygon), surface, or volume. It can adapt point scale and orientation according to the shape.

  • Mode: How to generate and arrange points. The modes are self-explanatory.
  • Align to Geometry: This will try to align your points to the elements they were emitted from, or to their closest point on the geometry.
  • Smart Scaling: Will try to scale points so they don’t hit each other. Modes that rely on elements will get the element size, while surface and volume modes will rely on the resolution parameter.
  • Resolution: Resolution of surface and volume modes when Smart Scaling is on
  • XYZ Step Mult: A multiplier for the steps in a Volume (Voxel Grid); values lower than one give you a “sliced” effect

Create Point Array from Texture Projection

This is an experimental (and slow) method for creating and distributing points in space according to the UVs of an object.

  • U and V: Offsets points in these two directions
  • Fit to UV Bounds: Disables the U and V parameters and tries to fit all the UV information into the point creation process
  • Texture projection: Uses the default name for most Softimage texture projections. You will need to change this if, for whatever reason, your texture projection has a different name

Create Points from Polygon Islands

This node creates points based on the polygon islands of a given geometry. It relies on data created by the Init PolyIsland node (which should be placed on the geometry itself).

  • Scale Points to Fit Island Bounding Box: As the name implies, it scales particles to the bounding box of the islands, which is especially useful when dealing with text and simulations

 

Modifiers

Modifiers are nodes that change attributes in your instances. Theoretically this could be anything, but most modifiers deal with transforms and colors. Also, as a design choice, no modifier will create or delete instances.

All modifiers share a basic structure; they all have these 3 elements, highlighted in the picture below:

  1. Blend: Blends the effect on or off. 0 is off and 1 is on. You can also plug the output of a modulator in here. The modulator will act as a filter, or falloff, of this effect, making it affect some instances more than others (see the sketch after this list).
  2. Modulate: Outputs the blend value so you can use it as a modulator for other modifiers. This is useful when you animate the blend and want to pass that value on to other nodes.
  3. Execute: An execution port, for plugging into the Instancer Root
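
Conceptually, the Blend port behaves like a per-instance linear interpolation between the unmodified and the fully modified value, and a modulator plugged into it simply scales that interpolation. A minimal sketch in plain Python (not the node’s actual internals):

    def blend_value(original, modified, blend, modulator=1.0):
        # blend = 0 leaves the instance untouched, blend = 1 applies the full
        # effect; a modulator plugged into Blend scales it per instance (falloff).
        weight = max(0.0, min(1.0, blend * modulator))
        return original + (modified - original) * weight

    print(blend_value(0.0, 10.0, 1.0))         # 10.0 -> effect fully on
    print(blend_value(0.0, 10.0, 0.5))         # 5.0  -> half blended
    print(blend_value(0.0, 10.0, 1.0, 0.25))   # 2.5  -> modulator acting as falloff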

Besides this basic structure, many modifiers are a combination of the most basic modifiers (Modify Transform and Modify Color) and some modulators or other operations. In the image below you can see the guts of the Modify by Linear Steps node; it basically consists of a Modify Transform and a Modify Color node modulated by a Modulate by Linear Steps node.

Therefore, modifiers have many parameters in common.

Modify Transform

The most basic of the modifiers. Present inside almost all other modifiers.

  • Mode: One can choose to transform Scale, Rotation or Translation
  • Space: One can do transformations in both Global or Local spaces
  • Additive: If this is disabled, new values are multiplied by the current values. If this is enabled, values are added to the current value. In the case of rotations, if Additive is disabled new values overwrite old values (see the sketch after this list)
  • XYZ Angle: Scale and Translate use regular XYZ input. Rotation uses XYZ and Angle (Quaternion) input to avoid flipping
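
A minimal sketch of the Additive switch for Scale and Translate, in plain Python (illustrative only):

    def modify(current, new, additive):
        # Additive off: the new value is multiplied with the current value.
        # Additive on:  the new value is added to the current value.
        # (Rotation is the exception: with Additive off it simply overwrites.)
        return current + new if additive else current * new

    print(modify(2.0, 3.0, additive=False))   # 6.0 -> multiplied
    print(modify(2.0, 3.0, additive=True))    # 5.0 -> added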

Modify Color

The most basic color modifier.

  • Color: The new color or color gradient you want to input
  • Blend Alpha: At 0 old alpha is used, at 1 new alpha kicks in

Modify by Linear Steps

This modifier enables you to transform properties along different axis in a linear or arbitrary f-curve fashion.

  • Axis: Choose the axis in which the mapping of values will take place. In a grid UVW will match XYZ. In a radial or texture projection distribution this will mean different, more relevant things
  • From, To: Cut out points you don’t want to affect. From 0 to 1 should affect all points
  • Transform and Colors: Refer to Modify Transform and Modify Color
  • Remmap: This will enable you to remap values, as 0 and 1 might not fit all cases. You can also use an f-curve to get behaviours other than linear

Modify by Harmonic Steps

Modify by Harmonic Steps will map a wavy look onto transformations and colors. You can control all the parameters and speed of this wave; the sketch after the parameter list below illustrates how they combine.

  • Axis: Choose the axis in which the mapping of values will take place. In a grid UVW will match XYZ. In a radial or texture projection distribution this will mean different, more relevant things
  • From, To: Cut out points you don’t want to affect. From 0 to 1 should affect all points
  • Frequencie: Higher frequencies will keep valleys and hills closer together
  • Freq Offset: Will move valleys and hills to the left or right
  • Amplitude: Higher amplitude will make valleys deeper and hills higher
  • Amp Offset: Moves valleys and hills up and down at the same time
  • Speed: How much the wave will move per second
  • Transform and Colors: Refer to Modify Transform and Modify Color
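
As a rough picture of how these parameters combine (plain Python; a plain sine is assumed here, and the node’s exact formula may differ):

    import math

    def harmonic(u, frequency, freq_offset, amplitude, amp_offset, speed, time):
        # u is the instance's normalized position (0..1) along the chosen axis.
        # Frequency packs valleys and hills closer together, Freq Offset slides
        # them sideways, Amplitude makes them deeper/higher, Amp Offset lifts the
        # whole wave, and Speed moves it over time.
        phase = frequency * (u + freq_offset) + speed * time
        return amplitude * math.sin(2.0 * math.pi * phase) + amp_offset

    # Sample the wave across ten instances at time 0.
    print([round(harmonic(i / 9.0, 2.0, 0.0, 1.0, 0.0, 0.0, 0.0), 2) for i in range(10)])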

 

Modify with Object

Will modify instances depending on whether they are inside or outside the volume of an object. The object can be either a Geometry or a Null. Nulls work faster.

  • Transform and Colors: Refer to Modify Transform and Modify Color
  • Remmap Output: If it begins at 0 and ends at 1, instances outside the object will be modified. The opposite will yield transformations inside the object. You can use the f-curve to make the modifications non-linear, and to clamp the effect or not

Modify over Time

Simulates a cumulative effect without the need for simulations.

  • Transform: Refer to Modify Transform

Point at

Makes all instances point in a certain direction.

  • Up vector: The axis that should point to the target
  • Target Position: An XYZ value that the instances should point at
  • Target inname: Overrides Target Position. Uses any object as the target at which the instances point

Randomize

Randomizes values in transforms and colors. 0 means no randomization, and any other number indicates the amount of randomness in SI units (see the sketch after this list).

  • Mode: In the Randomize node you can only choose between the 3 basic types of transformations, not the Space of the transformations themselves. In this node rotation is not in the XYZ and Angle format
  • Channel: You may choose to modify only R, G or B channels specifically. Or other parameters like hue, saturation and brightness, for example
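
A minimal sketch, assuming a uniform random offset measured in scene units (the node’s actual distribution isn’t documented here):

    import random

    def randomize(value, amount, seed=None):
        # amount = 0 leaves the value untouched; any other amount is read here
        # as a uniform offset of +/- that many units (distribution assumed).
        rng = random.Random(seed)
        return value + rng.uniform(-amount, amount)

    print(randomize(5.0, 0.0, seed=1))   # 5.0 -> no randomization
    print(randomize(5.0, 2.0, seed=1))   # somewhere between 3.0 and 7.0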

Modulators

Modulators output values, per instance, based on some criteria. These values can be plugged into any green input (green = scalar), enabling you to mix and match effects for a very custom look.

Linear Stepper

Maps values onto the relative position of an instance along a certain axis; the sketch after the parameter list below illustrates the mapping.

  • Axis: Choose the axis in which the mapping of values will take place. In a grid, UVW will match XYZ. In a radial or texture projection distribution this will mean different, more relevant things
  • From, To: Cut out points you don’t want to affect. From 0 to 1 should affect all points
  • Remmap: This will enable you to remap values, as 0 and 1 might not fit all cases. You can also use an f-curve to get behaviours other than linear
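
Conceptually, the stepper normalizes each instance’s position along the chosen axis into a 0-1 value and cuts it with From/To. A minimal sketch in plain Python (illustrative only; the exact handling of points outside the From/To window is an assumption):

    def linear_step(position, axis_min, axis_max, from_v=0.0, to_v=1.0):
        # Normalize the instance's position along the chosen axis to 0..1 ...
        u = (position - axis_min) / (axis_max - axis_min)
        # ... then stretch the From..To window over that range and clamp.
        return max(0.0, min(1.0, (u - from_v) / (to_v - from_v)))

    samples = [0.0, 2.5, 5.0, 7.5, 10.0]
    print([round(linear_step(x, 0.0, 10.0, 0.25, 0.75), 2) for x in samples])
    # [0.0, 0.0, 0.5, 1.0, 1.0]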

Harmonic Stepper

Maps a sine wave onto the relative position of an instance along a certain axis.

  • Axis: Choose the axis in which the mapping of values will take place. In a grid, UVW will match XYZ. In a radial or texture projection distribution this will mean different, more relevant things
  • From, To: Cut out points you don’t want to affect. From 0 to 1 should affect all points
  • Frequencie: Higher frequencies will keep valleys and hills closer together
  • Freq Offset: Will move valleys and hills to the left or right
  • Amplitude: Higher amplitude will make valleys deeper and hills higher
  • Amp Offset: Moves valleys and hills up and down at the same time
  • Speed: How much the wave will move per second

Modulate by Object

Maps a value onto instances based on their distance to the volume of an object. The object can be either a Geometry or a Null. Nulls work faster.

  • Remmap Output: If it begins at 0 and ends at 1, instances outside the object will be modified. The opposite will yield transformations inside the object. You can use the f-curve to make the modifications non-linear, and to clamp the effect or not

 

Simulations

Most of the nodes used for simulation and state changing are nodes that come with Softimage by default, so you should refer to Softimage’s own documentation. That said, these are the nodes within Motion Tools aimed at simulation:

Apply Modifiers Inside Simulation

This node will enable you to apply modifiers within the simulation framework while avoiding their cumulative effect. All you have to do is plug it into any execution port before the actual simulation node, and plug your modifiers into it.

Constraints

This category currently has only one node. Constraint PolyIslands to PointCloud will deform your geometry to match the positions of the points in your pointcloud, as long as the geometry has the Init PolyIsland node attached to it and the points in the pointcloud were created with Create Points from Polygon Islands.

Other

Nodes in this category are listed under tools, not tasks…

Init PolyIsland Data

This node initializes the data that supports the workflow for emitting points from polygon islands and constraining polygon islands to these points.

  • Mode: You can choose between using the existing polygon islands or slicing each polygon into its own polygon island