In general, argon.js apps can be structured however you want. This tutorial follows a single-page app structure, in which the entire app loads from one html page.
Note: One way to load argon.js is to download the argon.min.js library and include it manually in a script tag in your html, as you see here. However, if you are comfortable using a module loader such as browserify, you can do that instead (see the Quick Start guide).
As you see here, the application code is not included in the html file; instead, it is kept in one or more external files. To ensure that the body has loaded by the time your script executes, it is convenient to load your application script just before the end-body tag.
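A minimal page in this structure might look like the sketch below (the filenames argon.min.js, three.min.js, and app.js follow the tutorial's conventions; adjust the paths to match your project):

```html
<html>
  <head>
    <title>argon.js tutorial, part 1</title>
    <!-- the argon.min.js library, downloaded into this project -->
    <script src="argon.min.js"></script>
    <!-- three.js, used for 3D rendering in this tutorial -->
    <script src="three.min.js"></script>
  </head>
  <body>
    <!-- argon uses (or creates) this div as its view -->
    <div id="argon"></div>
    <!-- the application script loads last, so the body exists when it runs -->
    <script src="app.js"></script>
  </body>
</html>
```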
In addition to offering AR features (such as geolocation, video of the surrounding world, 3D graphics and image tracking), the Argon4 browser is a standard web browser (i.e., on iOS, it uses Apple’s Webkit engine) and can therefore render just about any web content. You can take advantage of these capabilities in two ways:
First, argon uses (or creates) a special div with the id “argon” as its
view. Anything you put in this div (or in divs nested inside this one) will be rendered on the screen in 2D, in front of the 3D AR content in the Argon view. See part 2 of the tutorial for an in-depth discussion on including HTML content in Argon applications.
Second, because Webkit is a full-featured HTML5 engine (used by Safari), Argon can render most web pages (e.g., including those without any AR features). Just type the url into the text box.
The application code is written in app.ts (TypeScript) and compiled to app.js (JavaScript); the app.js file is what is actually downloaded and used in executing the channel.
To display graphics with three.js, we need three things: a scene, a camera, and a renderer.
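Setting up those three pieces might look like the following sketch, modeled on the argon.js samples (the alpha and logarithmicDepthBuffer options are choices that suit camera-backed geospatial scenes, not requirements):

```typescript
// initialize argon.js; it finds (or creates) the #argon div for its view
const app = Argon.init();
app.context.setDefaultReferenceFrame(app.context.localOriginEastUpSouth);

// the three.js scene graph and a camera that argon will drive each frame
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera();
scene.add(camera);

// an anchor object for content placed relative to the user's location
const userLocation = new THREE.Object3D();
scene.add(userLocation);

// a WebGL renderer: alpha lets the camera video show through, and a
// logarithmic depth buffer copes with geospatial distances
const renderer = new THREE.WebGLRenderer({
    alpha: true,
    logarithmicDepthBuffer: true
});
renderer.setPixelRatio(window.devicePixelRatio);
app.view.element.appendChild(renderer.domElement);
```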
A simple box (cube) is created using methods provided by three.js and positioned in the world. This code creates the box and adds a texture to it:
The above code creates two objects (the box itself and boxGeoObject to which the box is attached).
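A sketch of that step, following the argon.js samples (the texture filename box.png is a placeholder):

```typescript
// the box itself, textured asynchronously once the image loads
const box = new THREE.Object3D();
const loader = new THREE.TextureLoader();
loader.load('box.png', (texture) => {
    const geometry = new THREE.BoxGeometry(2, 2, 2);
    const material = new THREE.MeshBasicMaterial({ map: texture });
    box.add(new THREE.Mesh(geometry, material));
});

// the geospatially-anchored parent; the box is attached to it so the
// pair can later be positioned from the Cesium entity's pose
const boxGeoObject = new THREE.Object3D();
boxGeoObject.add(box);
```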
Next, in order to give the box a geolocation, a Cesium
Entity will be created. Argon uses an enhanced set of the core libraries from Cesium to represent and manipulate its frames of reference, using a single coordinate system for every object, from geospatial coordinates down to items tracked with the camera.
Entity object has properties for the position and orientation of the entity. The default coordinate frame used by Argon is Cesium’s
FIXED reference frame, which is centered at the center of the earth and oriented with the earth’s axes. Unfortunately, this reference frame is inconvenient to use directly, as any point on the earth is very far from the center of the earth, and the orientation of the surface of the earth is not intuitive (as “up” is aligned with the axes between the north and south poles). Therefore, within the
FIXED reference frame, Argon defines a local coordinate frame that sits on a plane tangent to the earth near the user’s current location.
setDefaultReferenceFrame() is used to set this local frame. (See the initialization code above for an example.)
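Creating the entity might look like this sketch, modeled on the argon.js samples; the position starts at the origin of the default reference frame until it is geolocated in the update code below:

```typescript
const Cesium = Argon.Cesium;

// a Cesium Entity for the box; its position is (0,0,0) in the default
// (local east-up-south) reference frame set at initialization
const boxGeoEntity = new Argon.Cesium.Entity({
    name: "I have a box",
    position: Cesium.Cartesian3.ZERO,
    orientation: Cesium.Quaternion.IDENTITY
});
```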
Argon reports the position and orientation of entities relative to this local frame. While arbitrary (the programmer does not choose the origin of this coordinate system), this local reference frame has an intuitive orientation relative to the user’s location on the surface of the earth. This example uses
localOriginEastUpSouth as the default reference frame, so the positive x-axis is east, the positive y-axis is up and the positive z-axis is south. If the user moves more than a few kilometers from the origin of this local frame, Argon updates the origin to the new location, ensuring that the local reference frame is a reasonable approximation to what the user (and programmer) perceive as their local coordinates (i.e., as you move around the earth, positive y will remain “up”, positive x will point east and positive z will point south).
Since the local reference frame may change at any time, a programmer should not save and use the values in this frame for more than a single update and render step. If the values need to be saved and used over multiple frames, it is possible to be notified when the local frame of reference changes.
At this point, the box object (a textured cube) is attached to the scene and there is a Cesium Entity for it, but it is not yet actually located in the world. The position of boxGeoEntity is set to (0,0,0) by default. The geolocation of the boxGeoEntity will be computed (in the Update Event code below) after Argon has determined the location of the user.
Argon is designed to work in a variety of browsers and leverage different approaches to AR. Since different setups have different update requirements, Argon controls the update loop of an application. This is in contrast to most web frameworks, such as three.js, that directly use
requestAnimationFrame() to update their content. Internally, Argon may or may not use
requestAnimationFrame to trigger updates. For example, when live video is used as the background in the Argon4 browser, updates are triggered whenever a new video frame is available from the camera. To ensure your app renders the scene at the right time, you should always use Argon’s update methods.
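Concretely, instead of driving its own requestAnimationFrame loop, an argon.js app subscribes to Argon's events and lets Argon decide when each frame happens (a sketch; the listener bodies correspond to the update and render code discussed below):

```typescript
// a typical three.js app drives itself:
//   function animate() { requestAnimationFrame(animate); renderer.render(scene, camera); }
// an argon.js app instead subscribes to argon's events:
app.updateEvent.addEventListener(() => {
    // update application state (all update listeners run first)
});
app.renderEvent.addEventListener(() => {
    // draw the scene (render listeners run after all update listeners)
});
```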
Whenever the application should update and re-render the scene, two events (updateEvent and renderEvent) are triggered in turn, allowing the application state to be updated separately from rendering. Your application (and support libraries) may subscribe to these events in multiple places, and all update event listeners will be called before all render event listeners.
An update event listener is where your application should generally make changes to the scene (adding, manipulating, or deleting objects you have created). In this example, the first time the update event listener is called, the box’s geospatial position is set to be 10 meters to the east of the user (the local reference frame is
localOriginEastUpSouth so positive
x is east). To position the box, a new position
boxPos is created and its
x value incremented by 10. Next, the value of the
position property on the Cesium.Entity
boxGeoEntity is set to this new position.
It is very important to pay attention to the frame of reference for this property, which is our default reference frame: you do not want to leave the
boxGeoEntity in this frame of reference because the default reference frame may get reset at any time (if the user moves away from the origin of the frame). Therefore,
convertEntityReferenceFrame() is called to convert this entity to
ReferenceFrame.FIXED, Cesium’s earth-centered reference frame.
convertEntityReferenceFrame updates the position and orientation properties of the entity such that the entity appears to be in exactly the same position and orientation, but now these properties are expressed in the new reference frame. If you looked at the values of the properties after this call, the position would be very large, and the orientation would have changed to an angle corresponding to the tangent plane of the earth at your current location. At this point, the
boxGeoEntity is expressed in geospatial coordinates, independent of the location of the user and the (arbitrary) local reference frame.
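The one-time geolocation step described above might be sketched as follows, following the argon.js samples (boxInit is a flag assumed to be declared alongside the other globals, and frame is the frame state passed to the listener):

```typescript
let boxInit = false;

app.updateEvent.addEventListener((frame) => {
    // the user's pose in the default (local east-up-south) frame;
    // do nothing until argon has determined the user's location
    const userPose = app.context.getEntityPose(app.context.user);
    if (!(userPose.poseStatus & Argon.PoseStatus.KNOWN)) return;
    userLocation.position.copy(userPose.position);

    if (!boxInit) {
        const defaultFrame = app.context.getDefaultReferenceFrame();

        // place the box 10 meters east of the user (+x is east)
        const boxPos = userPose.position.clone();
        boxPos.x += 10;
        boxGeoEntity.position.setValue(boxPos, defaultFrame);
        boxGeoEntity.orientation.setValue(Argon.Cesium.Quaternion.IDENTITY);

        // re-express the entity in the earth-fixed frame, so the box keeps
        // its geospatial location even if the local origin is reset
        if (Argon.convertEntityReferenceFrame(boxGeoEntity, frame.time,
                Argon.Cesium.ReferenceFrame.FIXED)) {
            scene.add(boxGeoObject);
            boxInit = true;
        }
    }
});
```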
Next, the update listener sets the position and orientation of the three.js
boxGeoObject based on the pose of the
boxGeoEntity in the local reference frame. It also rotates the box each time through the loop for visual interest (and so you have some indication the application is running when you look at it).
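Inside the same update listener, that per-frame step might look like this (the rotation amount is an arbitrary small value chosen for the sketch):

```typescript
// get the box entity's pose in the local reference frame and mirror
// it onto the three.js object
const boxPose = app.context.getEntityPose(boxGeoEntity);
boxGeoObject.position.copy(boxPose.position);
boxGeoObject.quaternion.copy(boxPose.orientation);

// spin the box slowly so it is obvious the app is running
box.rotateY(0.01);
```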
The renderEvent listeners are called after the updateEvent listeners. Argon supports multiple subviews within its view (currently, just single or stereo), so the render event needs to handle an arbitrary set of subviews, rendering the scene appropriately in each one. This is straightforward for the WebGL renderer, which supports rendering into subviews within the canvas: each subview can simply be rendered independently.
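A render listener in that style, following the argon.js samples:

```typescript
app.renderEvent.addEventListener(() => {
    // size the renderer to argon's current view
    const viewport = app.view.getViewport();
    renderer.setSize(viewport.width, viewport.height);

    // render once per subview (one subview normally, two in stereo)
    for (const subview of app.view.getSubviews()) {
        // let argon drive the camera pose and projection for this subview
        camera.position.copy(subview.pose.position);
        camera.quaternion.copy(subview.pose.orientation);
        camera.projectionMatrix.fromArray(subview.projectionMatrix);

        // restrict drawing to this subview's region of the canvas
        const { x, y, width, height } = subview.viewport;
        renderer.setViewport(x, y, width, height);
        renderer.setScissor(x, y, width, height);
        renderer.setScissorTest(true);
        renderer.render(scene, camera);
    }
});
```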
With these two events, the code for the first example is complete. The code in the render event listener will be discussed in greater detail in part 3 of this tutorial in conjunction with the discussion of how an Argon application should handle Argon4’s Stereo Viewer Mode.
When you run the Live Demo of this part of the tutorial on a modern phone or tablet, you will likely notice the cube moving around in a very erratic fashion. The Global Positioning System (GPS) used in phones and (some) tablets is more than adequate for 2D mapping applications. However, when viewed from a first-person augmented reality perspective, its limitations become obvious. On most devices, the location is updated once per second, and the accuracy is relatively poor (in the 2-5 meter range). This means the position of the phone (the viewer) only changes once per second, no matter how the user is moving; in this example, this manifests as the cube appearing to move once per second (the cube isn’t moving; the viewer’s location is).
Worse, an accuracy of 2-5 meters means that the GPS system on the phone reports your position as being within 2-5 meters of your actual position, and that reported position could change each second even if you aren’t moving. Essentially, each second, the position of the phone is reported as being somewhere within a few meters of its true location, and it is impossible to predict whether that will be north or south, east or west, up or down from the true location. Furthermore, anything that blocks your device’s view of the GPS satellites in the sky will make these errors worse: dense tree foliage or tall buildings could cause the estimate of the device position to degrade to 10 meters or more from its true location. And if you move inside a building, the GPS system stops working entirely, causing the device to fall back to the crude location estimate provided by the operating system (typically based on the signal strength of WiFi).
This is not meant as a criticism of the location tracking capabilities of modern mobile devices; the fact that a tiny chip with a small antenna can determine its location on the earth to within a few meters based on the extremely low-power signals from a few dozen GPS satellites is amazing.
However, as an AR developer, you must be aware of the limitations of the location tracking hardware in the devices if you are to create compelling experiences for your users.