ArbiTrack Basics

This tutorial goes over the basics of using Kudan’s ArbiTrack feature.

This tutorial uses assets for the target node and image node. You can download these assets here.

In this asset bundle you should find:

  • Cow Target.png - This is the image we will use for our target node. This will move with the device's Gyroscope and act as a preview for the ArbiTracker.
  • Cow Tracking.png - This is the image we will use for our image node, which will be displayed when ArbiTrack starts tracking.

Once you have downloaded the file, unzip it and add the assets to your Android Studio project.

Initialise ArbiTrack

Before you can use Kudan's ArbiTrack feature, there are two things you need to initialise. One is ARArbiTrack, which locks a node in place by tracking a set of feature points in the environment. The other is the ARGyroPlaceManager, which positions a node using your device's Gyroscope. To do this, add the following code to your activity:

@Override
public void setup() 
{
  super.setup();
  
  // Initialise ArbiTrack.
  ARArbiTrack arbiTrack = ARArbiTrack.getInstance();
  arbiTrack.initialise();
  
  // Initialise gyro placement. 
  ARGyroPlaceManager gyroPlaceManager = ARGyroPlaceManager.getInstance();
  gyroPlaceManager.initialise();
}

This will initialise everything ArbiTrack needs to work, including the Gyroscope and KudanCV's ArbiTracker.

Low-Feature Environments

ArbiTrack relies on having a large number of feature points in the environment to track correctly. If you are in an environment with a low number of features, this can cause tracking to become less consistent.

Set up the target node

To position our model in the world, we will need to use a target node. This node determines the starting point of tracking. Since the target node's position changes depending on the orientation of the device, it is useful to have a graphical representation of where the target node is.

Because the target node is an ARNode, you can use anything as a preview, including a simple preview image, or even the same model you intend to track. In this tutorial we will be using an image, specifically the Kudan Cow, as our target.
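For example, a 3D model could act as its own preview. The sketch below is only illustrative: it assumes the SDK's ARModelImporter and ARModelNode classes and a hypothetical bundled model file named "cow.armodel"; adapt the names to your own assets.

```java
// Hypothetical sketch: using the tracked model itself as the target preview.
// Assumes a "cow.armodel" asset and the ARModelImporter/ARModelNode API.
ARModelImporter importer = new ARModelImporter();
importer.loadFromAsset("cow.armodel");
ARModelNode modelTarget = (ARModelNode) importer.getNode();

// Position it with the Gyro Placement Manager, just like the image target.
ARGyroPlaceManager.getInstance().getWorld().addChild(modelTarget);

// Assign it as the ArbiTracker's target node.
ARArbiTrack.getInstance().setTargetNode(modelTarget);
```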

To create a target node and add it to the ArbiTracker, add the following code at the end of the setup method:

// Create a node to be used as the target.
ARImageNode targetNode = new ARImageNode("Cow Target.png");
        
// Add it to the Gyro Placement Manager's world so that it moves with the device's Gyroscope.
gyroPlaceManager.getWorld().addChild(targetNode);
        
// Rotate and scale the node to ensure it is displayed correctly.
targetNode.rotateByDegrees(90.0f, 1.0f, 0.0f, 0.0f);
targetNode.rotateByDegrees(180.0f, 0.0f, 1.0f, 0.0f);

targetNode.scaleByUniform(0.3f);
        
// Set the ArbiTracker's target node.
arbiTrack.setTargetNode(targetNode);

This will create an image node, add it to the Gyro Placement Manager's world and assign it to the ArbiTracker.

Set up content with ArbiTrack

We have a target node now, but we still need something to display when ArbiTrack starts. As with the target node, you can use any node with ArbiTrack to display whatever content you wish. For this tutorial, we'll stick with image nodes. Add the following at the end of the setup method:

// Create a node to be tracked.
ARImageNode trackingNode = new ARImageNode("Cow Tracking.png");

// Rotate the node to ensure it is displayed correctly.
trackingNode.rotateByDegrees(90.0f, 1.0f, 0.0f, 0.0f);
trackingNode.rotateByDegrees(180.0f, 0.0f, 1.0f, 0.0f);

// Add the node as a child of the ArbiTracker's world.
arbiTrack.getWorld().addChild(trackingNode);

This will create an image node using the tracking image and add it to the ArbiTracker's world.

Implement touch input and start ArbiTrack

Now we have a target node and an image node for content, and ArbiTrack and the Gyroscope are set up and ready to go. But how do we start tracking? The image tracker starts automatically as soon as it detects a marker, but the ArbiTracker has no marker to look for, so we have to tell it when to start. Fortunately, that's very easy to do.

There are many ways to allow input. We could add a button to the screen, for example. But that requires a lot of setup and messing around with the activity layout. A much easier way is simply to implement a GestureDetector. Make the following changes to your activity:

public class MarkerlessActivity extends ARActivity implements GestureDetector.OnGestureListener
{
  ...
  private GestureDetectorCompat gestureDetect;
  ...
  
  @Override
  protected void onCreate(Bundle savedInstanceState) 
  {
      super.onCreate(savedInstanceState);

      // Create a gesture recogniser to start and stop ArbiTrack.
      gestureDetect = new GestureDetectorCompat(this, this);
  }
  
  ...
  
  @Override
  public boolean onTouchEvent(MotionEvent event) 
  {
    gestureDetect.onTouchEvent(event);
    return super.onTouchEvent(event);
  }
  
  @Override
  public boolean onSingleTapUp(MotionEvent e) 
  {
    ARArbiTrack arbiTrack = ARArbiTrack.getInstance();

    // If arbitrack is tracking, stop tracking so that its world is no longer rendered, and make the target node visible.
    if (arbiTrack.getIsTracking())
    {
      arbiTrack.stop();
      arbiTrack.getTargetNode().setVisible(true);
    }

    // If it's not tracking, start tracking and hide the target node.
    else
    {
      arbiTrack.start();
      arbiTrack.getTargetNode().setVisible(false);
    }
    
    return false;
  }

  // We also need to implement the other overrides of the GestureDetector, though we don't need them for this sample.
  @Override
  public boolean onDown(MotionEvent e) 
  {
    return false;
  }

  @Override
  public void onShowPress(MotionEvent e) 
  {
  }

  @Override
  public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) 
  {
    return false;
  }

  @Override
  public void onLongPress(MotionEvent e) 
  {
  }

  @Override
  public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) 
  {
    return false;
  }
}

The above changes will add a recogniser for touch input to your activity. When you tap the screen, ArbiTrack will start or stop, depending on whether it's already tracking.
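Alternatively, the button approach mentioned earlier works too: the same toggle logic can be wired to a standard Android Button instead of a gesture detector. This is only a sketch; it assumes your activity layout contains a Button with the hypothetical id start_button.

```java
// Hypothetical alternative: toggling ArbiTrack with a layout button.
// Assumes a Button with id "start_button" exists in the activity's layout.
Button startButton = (Button) findViewById(R.id.start_button);
startButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v)
    {
        ARArbiTrack arbiTrack = ARArbiTrack.getInstance();

        if (arbiTrack.getIsTracking())
        {
            // Stop tracking and show the target node again.
            arbiTrack.stop();
            arbiTrack.getTargetNode().setVisible(true);
        }
        else
        {
            // Start tracking and hide the target node.
            arbiTrack.start();
            arbiTrack.getTargetNode().setVisible(false);
        }
    }
});
```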

Build and run the app on an Android device, and you should see the target node in the centre of the screen, which will move around as you move your device. When you tap the screen, the target node disappears and the tracking node takes its place. If you tap again, the tracking node disappears and the target node returns.
