
Scripting Guide

Pictarize supports custom JavaScript for interactive AR effects. You can open the scripting panel at the bottom left of your scene. This is an advanced feature for developers.

Let's start with some basic concepts. Each target image corresponds to one AR scene. You can attach custom scripts to each of these individual targets (i.e. scenes) to control their behaviour and make them more interactive. Unlike traditional (game) programming, where you need to start a program and create a running loop, Pictarize has already created the main program for you, and that main program fires up your custom functions during the lifecycle of the scene. Inside your custom functions, you can modify the properties (e.g. position) of the contents. After your code finishes executing, control passes back to the Pictarize main program.
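
Below is a minimal sketch of what such a script looks like, assuming the event functions covered later in this guide (onActivate, onUpdate and onClick) and that the target object is available inside them; the exact parameter order shown here is an assumption, not the definitive signature.

// Pictarize calls these functions for you during the scene lifecycle.
// Parameter order and the availability of `target` inside them are assumptions.
function onActivate(data) {
  // runs once when the scene (target image) becomes active
}

function onUpdate(time, deltaTime, data) {
  // runs repeatedly while the scene is active
}

function onClick(object, data) {
  // runs when a content object is clicked
}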

Scene Life Cycle

Most of the time, you want your code to manipulate the content objects (e.g. 3D models, videos, audio, etc.). The first thing you will likely want to do is get the individual content objects. You do that with the target.getObject() method. This method takes a single parameter: the name of the content, which is the name you specified in the Targets Panel.
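
For example (a sketch; the content name 'robot' is hypothetical and must match a name in your Targets Panel):

function onActivate(data) {
  // look up the content named "robot" in this target's scene
  const robot = target.getObject('robot');
  // robot can now be read or modified (see Content Object below)
}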

Content Object

The content object allows you to read and modify the properties of the underlying content. It has the following properties and methods:

Depending on the content types, there might be extra properties and methods.
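
As a quick sketch of reading and modifying a property, the example below nudges a model's position when the scene activates. It assumes the position property behaves like a THREE.js Vector3, which is not confirmed by this guide, and the content name is hypothetical.

function onActivate(data) {
  const robot = target.getObject('robot');
  // Assumption: position is a THREE.js-style Vector3
  robot.position.y += 0.1; // lift the model slightly above the target image
}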

3D Models

If your 3D model has built-in animations, custom scripts allow you to control how you want to play them. There is a getAction() method to retrieve the underlying animation action. The return type is AnimationAction from THREE.js (see the THREE.js reference). It's possible that multiple animations are attached to the 3D model, so you need to pass in an index parameter.
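
For example, the sketch below plays the first animation as soon as the scene activates; reset() and play() are standard THREE.js AnimationAction methods, while the content name is hypothetical.

function onActivate(data) {
  const robot = target.getObject('robot');
  const action = robot.getAction(0); // index 0 = the model's first animation clip
  action.reset().play();             // THREE.js AnimationAction methods
}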

Uploaded Videos

For uploaded videos (different from embedded YouTube/Vimeo videos), you can get the underlying video object using getVideo(). The return type is an HTML video element, so you can do whatever HTML supports.
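
For example, a sketch that restarts the video whenever it is clicked (the content name and the onClick parameter order are assumptions):

function onClick(object, data) {
  if (object.name === 'intro-video') { // Assumption: the clicked object exposes its name
    const video = object.getVideo();   // a standard HTMLVideoElement
    video.currentTime = 0;             // rewind to the beginning
    video.play();
  }
}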

Uploaded Audio

For uploaded audio, you can get the underlying audio object using getAudio(). The return type is an HTML audio element, so you can do whatever HTML supports.
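
Similarly, a sketch that loops a background track once the scene activates (the content name is hypothetical):

function onActivate(data) {
  const audio = target.getObject('background-music').getAudio(); // a standard HTMLAudioElement
  audio.loop = true;
  audio.play();
}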

Embedded YouTube/Vimeo Videos

For embedded videos, we currently support three methods to control the playback. Note that this is different from uploaded videos.
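
As an illustration of the pattern only: the sketch below uses a hypothetical playVideo() name; the three actual method names are listed in the editor's reference and are not confirmed here.

function onClick(object, data) {
  if (object.name === 'youtube-clip') { // hypothetical embedded video content
    // Assumption: the embedded player exposes playback methods like this one
    object.playVideo();
  }
}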

onClick

The onClick function has an object input parameter: the content object being clicked. Most of the time, you will want to know which content it is by checking against its name. The example below shows how you can make a button (a button just means a content; it could be an image, text, or anything) and trigger a 3D model to start animating when it is clicked.
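
A sketch of that example follows; the content names 'play-button' and 'robot' are hypothetical, and the parameter order is an assumption.

function onClick(object, data) {
  if (object.name === 'play-button') {       // the clicked content is our button
    const robot = target.getObject('robot'); // the 3D model we want to animate
    robot.getAction(0).reset().play();       // start its first animation (THREE.js AnimationAction)
  }
}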

onUpdate

Two additional inputs, time and deltaTime, are present in the onUpdate call. time is the elapsed time (in seconds) since the scene was activated (i.e. since onActivate was called). deltaTime is the elapsed time since the last onUpdate call. They are very useful if you want to programmatically animate some contents (e.g. create transitional effects like fade in / fade out).
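
For instance, the sketch below raises a model into position over the first two seconds after activation; it assumes a THREE.js-style position vector and a hypothetical content name.

function onUpdate(time, deltaTime, data) {
  const robot = target.getObject('robot');
  // slide from y = -0.5 up to y = 0 during the first two seconds, then hold
  const progress = Math.min(time / 2, 1);
  robot.position.y = -0.5 + 0.5 * progress;
  // deltaTime could be used instead for incremental, frame-rate independent updates
}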

data

Finally, there is also a data input for all the event functions. This is a storage object for you to keep custom data across the lifecycle of the application. You can assign any custom data to it, and even custom functions. In the example below, we hide a 3D model at the beginning and make it appear after the user has clicked five times.
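
A sketch of that example follows. The THREE.js-style visible flag, the content names, and the parameter order are assumptions.

function onActivate(data) {
  data.clickCount = 0;                       // keep a counter on the data store
  target.getObject('robot').visible = false; // Assumption: visibility is a THREE.js-style flag
}

function onClick(object, data) {
  if (object.name === 'magic-button') {      // hypothetical button content
    data.clickCount += 1;
    if (data.clickCount === 5) {
      target.getObject('robot').visible = true; // reveal the model after five clicks
    }
  }
}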

Conclusion

There is a simulator right inside the editor. You can easily test your effects by running the simulator instead of building the project every time you make a change. Once you are satisfied, you can proceed to build the project and test it on real devices.