Lightact v3.0.6 Preview!


It’s finally here! Lightact v3.0.6 Preview features a massive upgrade of the 2D projection mapping toolset, rapid JSON parsing and packaging, much more versatile image and video playback, a bunch of new string-processing nodes, and a number of other improvements and fixes.

But let’s go one step at a time.

A Brand New 2D Projection Mapping Toolset

Projection mapping has been completely reworked! When designing the interface, we put a lot of effort into minimizing window switching and creating several shortcuts that speed up your workflow.

Check out this video to get a brief overview of all that’s new.

To summarise: the whole process is done from the Canvas Setup window (the window you open by double-clicking on a canvas). There you can now do all the warping, blending and color correction you need. You can also map Projectors directly to a canvas, just like video screens.

On top of that, you can still add additional Texture processing nodes for individual Projectors for even greater flexibility.

And we’ve created new Test image nodes in Layer Layouts, which generate the test patterns you need to map every pixel just so.

New Nodes (and they are easier to find)

We added even more nodes to our Layer Layouts system. The most important group of additions and upgrades concerns video and image playback. You can now play back all the videos in a folder (or its subfolders) in sequence, with nice fade-ins and fade-outs in between. Similarly, Lightact can now automatically create a slideshow out of all the images in a folder, with a fancy zoom-in or zoom-out effect.

Another, more utilitarian group of nodes allows Lightact to parse a JSON stream into Lightact values (and the other way around) or run a variety of string-processing operations.
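Lightact does this through its node graph rather than code, but to illustrate the idea of turning a JSON stream into named values and packaging them back up, here is a generic Python sketch (the key names and payload are made up for the example):

```python
import json

# A JSON message such as a show controller might send (example payload).
incoming = '{"brightness": 0.8, "scene": "intro", "loop": true}'

# Parse the JSON string into a dictionary of named values,
# analogous to mapping a JSON stream onto Lightact values.
values = json.loads(incoming)

# Package values back into a JSON string to send the other way.
values["brightness"] = 1.0
outgoing = json.dumps(values)
print(outgoing)
```

The same round trip works for nested objects and arrays, which is what makes JSON a convenient interchange format between a media server and external controllers.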

Since we have almost 300 nodes in Layer Layouts now, we’ve improved the search function as well.

RealSense and MIDI Integration

We’ve also added two new external device integrations: Intel RealSense and MIDI. You can now take RealSense’s RGB or depth stream and use it as a texture emitter in Lightact’s 3D rendering engine, pass it on to Unreal Engine, or use it in any other way.

MIDI, on the other hand, is a very popular way of controlling a media server during a live show or a VJ performance and Lightact can now react to any variable in a MIDI message. Just connect your favorite MIDI device and give it a go!
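In Lightact you wire this up visually, but as an illustration of what “any variable in a MIDI message” means, here is a stdlib-only Python sketch that unpacks the variables from a raw MIDI message, following the MIDI 1.0 specification (status byte = message type in the high nibble, channel in the low nibble, followed by data bytes):

```python
def parse_midi(data: bytes) -> dict:
    """Unpack the variables carried by a 3-byte MIDI channel message."""
    status, d1, d2 = data[0], data[1], data[2]
    msg_type = status & 0xF0   # e.g. 0x90 = note-on, 0xB0 = control change
    channel = status & 0x0F    # channels 0-15 (displayed as 1-16 on devices)
    if msg_type == 0x90:
        return {"type": "note_on", "channel": channel, "note": d1, "velocity": d2}
    if msg_type == 0xB0:
        return {"type": "control_change", "channel": channel, "controller": d1, "value": d2}
    return {"type": hex(msg_type), "channel": channel, "data": (d1, d2)}

# A fader on the first MIDI channel moving controller 7 (volume) to 100:
msg = parse_midi(bytes([0xB0, 7, 100]))
print(msg["value"])   # 100
```

Any of those fields, note number, velocity, controller value, could then drive a parameter during a show.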

Please note that this is the last Preview release before 3.1.0. This means that any subsequent release of Lightact will have a watermark (unless you purchase a license, of course).
