The availability of powerful consumer-level smart devices and off-the-shelf software frameworks has made augmented reality (AR) applications tremendously popular. However, because the built-in cameras typically have a rather limited field of view, AR tools built upon these devices usually must be positioned at a distance when large objects are to be tracked for augmentation. This arrangement makes it difficult or even impossible to physically interact with the augmented object. One solution is to adopt a third-person perspective (TPP), in which the smart device shows in real time the object to be interacted with, the AR information, and the user herself, all captured by a remote camera. By mentally transforming between the user-centric coordinate space and the coordinate system of the remote camera, the user can directly interact with objects in the real world. In this talk, I will present a study evaluating user performance in this cognitively demanding situation. I will first briefly introduce the AR system, followed by the experiment design and procedure. Finally, I will present the results and the conclusions we drew from them.