Most applications on Android are developed in Java, and Android provides a rich framework of classes to support this. It is, however, also possible to develop parts of an application in native C/C++ code using the Android NDK. This is intended for integrating existing C/C++ codebases or optimizing performance-critical functions.
The general approach is to build a native C/C++ shared library containing functions that are exposed using the JNI naming scheme. A Java application can then load the library and map the native functions to Java methods, which can then be called like any other method in Java. Using this approach it is now possible to create ARToolKit applications on Android.
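As a brief illustration of this pattern (the library, package, and class names here are hypothetical, not part of the SDK), the Java side loads the shared library and declares `native` methods; the JNI naming scheme then maps each method onto a C/C++ symbol derived from its fully qualified name.

```java
package com.example.arapp;

// Hypothetical example of the Java side of a JNI binding.
public class NativeInterface {

    static {
        // Loads libarnative.so from the APK's native library directory.
        System.loadLibrary("arnative");
    }

    // Implemented in C/C++ as:
    //   JNIEXPORT jboolean JNICALL
    //   Java_com_example_arapp_NativeInterface_nativeInit(JNIEnv *env, jclass clazz);
    public static native boolean nativeInit();
}
```

Calling `NativeInterface.nativeInit()` from Java then executes the native implementation, just like any other method call.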
Certain parts of these applications must be implemented in Java, while other parts can be written in C/C++. Applications will therefore typically be a combination of C/C++, Java, and the "glue" code in between.
This SDK includes components in both C/C++ and Java to permit the development of ARToolKit applications on Android. These components include the ARToolKitWrapper native library, the ARBaseLib Java library, the core ARToolKit static libraries, and a set of example projects, all described below.
With these components, several development strategies are possible, ranging in complexity from building on ARBaseLib and ARToolKitWrapper to working directly against the core static libraries from native code.
The ARToolKit port includes almost all of the core modules; the notable exception is the video capture module, which in other versions of ARToolKit provides a standard interface for accessing video capture across different platforms and hardware.
Android does not currently permit camera access from native code. Instead, only Java code can open the camera and capture frames. Additionally, a live camera preview must be included in the current Activity’s view for frames to be captured. This means that ARToolKit itself cannot initiate video capture, but must instead wait on the Java application to pass video information and frames using JNI.
Therefore, video capture requires coordination between corresponding libraries on either side of the JNI boundary. While this forces a slightly fragmented approach, the ARToolKitWrapper and ARBaseLib libraries are provided to handle the issue. Alternatively, the ARNative example included in the SDK demonstrates how to pass video independently of these libraries.
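The sketch below illustrates the general idea of forwarding preview frames across JNI using the (pre-Camera2) `android.hardware.Camera` preview callback. The `nativeVideoFrame()` entry point is hypothetical; in practice ARBaseLib and ARToolKitWrapper provide this plumbing for you.

```java
import android.hardware.Camera;

// Conceptual sketch only: forwards each camera preview frame to native code.
public class CameraFrameForwarder implements Camera.PreviewCallback {

    // Hypothetical native function, implemented on the C/C++ side of the app.
    public static native void nativeVideoFrame(byte[] frame);

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        nativeVideoFrame(data);          // hand the (typically NV21) frame across JNI
        camera.addCallbackBuffer(data);  // return the buffer for reuse (when using setPreviewCallbackWithBuffer)
    }
}
```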
Camera calibration is a separate topic and is covered in its own section.
You can copy the entire libs directory from: android/libs
Please ensure that you run build.sh and build_native.sh prior to copying the directory.
Read the Android Native Development documentation for more information. Note: there are subdirectories for each CPU architecture (armeabi, armeabi-v7a, mips, and x86). These contain the same library built for different instruction sets; the appropriate version is chosen automatically at runtime.
## ARBaseLib
ARBaseLib provides additional classes to simplify development.
ARBaseLib is an Android library, meaning that it isn't an Android application itself, but can make use of the Android framework. Android applications can reference the library, and Android Studio will take care of including the necessary files when the APK is built and deployed. This allows reusable components to be placed in the library and used in many different examples and applications.
To use ARBaseLib, import it as a new module into your Android Studio project:
1. Open **File/Project Structure…**
2. Add a new module with the **+** button at the top left, or by pressing ⌘+N (OS X) / Alt+Insert (Windows).
3. Select **Import .JAR/.AAR Package** and hit Next.
4. Select the file with the **…** button on the right of the first text field. The ARBaseLib.aar file is located in $ARTOOLKIT5ROOT/AndroidStudioProjects/ARBaseLibProj/arBaseLib/build/outputs/aar/
Referencing ARBaseLib gives the application access to several new classes. Some of the key ones are:
ARActivity takes care of setting up the view hierarchy that will display the live augmented reality view. The AR view is created by layering an OpenGL surface over the live camera preview surface. Because the OpenGL surface uses a transparent clear color, the live video shows through from below.
A FrameLayout is used to hold the views because children of a FrameLayout are stacked on top of each other – precisely the arrangement required. The following diagram illustrates how the user interface is composed to produce an AR view.
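As a conceptual sketch (not the actual ARBaseLib implementation; ARActivity performs this composition for you), the following code shows how such a stack might be assembled:

```java
import android.graphics.PixelFormat;
import android.opengl.GLSurfaceView;
import android.view.SurfaceView;
import android.widget.FrameLayout;

public final class ARViewComposer {

    // Conceptual sketch: stack a translucent GL surface over a camera preview surface.
    public static void compose(FrameLayout frameLayout, SurfaceView cameraPreview, GLSurfaceView glView) {
        glView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);          // request an alpha channel
        glView.getHolder().setFormat(PixelFormat.TRANSLUCENT);  // make the GL surface translucent
        glView.setZOrderMediaOverlay(true);                     // keep GL above the camera preview surface

        frameLayout.addView(cameraPreview);  // children of a FrameLayout are stacked,
        frameLayout.addView(glView);         // so the GL view ends up on top
    }
}
```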
ARActivity must be subclassed to be used. Abstract methods need to be overridden in the subclass to provide ARActivity with the objects it needs to work with.
The first object is a FrameLayout, mentioned above, which will contain the camera and OpenGL views.
protected abstract FrameLayout supplyFrameLayout();
The second required object is a renderer for displaying the AR scene. The renderer must inherit from ARRenderer, another class in ARBaseLib.
protected abstract ARRenderer supplyRenderer();
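A minimal ARActivity subclass might therefore look like the following sketch. The layout resource (R.layout.main), view ID (R.id.mainLayout), and MyRenderer class are hypothetical, and the ARBaseLib package names are assumed to be org.artoolkit.ar.base.*; check ARBaseLib for the exact imports.

```java
import android.os.Bundle;
import android.widget.FrameLayout;

import org.artoolkit.ar.base.ARActivity;            // assumed ARBaseLib package
import org.artoolkit.ar.base.rendering.ARRenderer;  // assumed ARBaseLib package

public class MyARActivity extends ARActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);  // hypothetical layout containing a FrameLayout
    }

    @Override
    protected FrameLayout supplyFrameLayout() {
        return (FrameLayout) findViewById(R.id.mainLayout);  // hypothetical view ID
    }

    @Override
    protected ARRenderer supplyRenderer() {
        return new MyRenderer();  // hypothetical ARRenderer subclass (see below)
    }
}
```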
Adding a marker and obtaining its ID:
int markerID = ARToolKit.getInstance().addMarker("single;/sdcard/AR/Data/patt.hiro;80");
For single markers: single;path_to_pattern_file;pattern_width
Example: single;/sdcard/AR/Data/patt.hiro;80
For multi markers: multi;path_to_multi_config_file
Example: multi;/sdcard/AR/Data/multi/marker.dat
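The returned marker ID is typically stored and queried each frame from an ARRenderer subclass, as in the sketch below. The methods configureARScene(), draw(GL10), queryMarkerVisible(), and queryMarkerTransformation() are assumed from the ARBaseLib API; check the ARBaseLib documentation for the exact signatures.

```java
import javax.microedition.khronos.opengles.GL10;

import org.artoolkit.ar.base.ARToolKit;             // assumed ARBaseLib package
import org.artoolkit.ar.base.rendering.ARRenderer;  // assumed ARBaseLib package

public class MyRenderer extends ARRenderer {

    private int markerID = -1;

    @Override
    public boolean configureARScene() {
        // addMarker() returns -1 on failure.
        markerID = ARToolKit.getInstance().addMarker("single;/sdcard/AR/Data/patt.hiro;80");
        return markerID >= 0;
    }

    @Override
    public void draw(GL10 gl) {
        if (!ARToolKit.getInstance().queryMarkerVisible(markerID)) return;

        // 4x4 column-major OpenGL model-view matrix placing content on the marker.
        float[] transformation = ARToolKit.getInstance().queryMarkerTransformation(markerID);
        // ... draw geometry using the transformation ...
    }
}
```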
For developers who want more control and direct access to ARToolKit functions, the core ARToolKit modules are also available as static libraries that can be linked directly from native (NDK) code.
The examples are divided into 3 sets.
Loads marker names from a configuration file. (Square markers only.) The tracking will automatically be set to match the types of square markers (template (pictorial) vs. matrix (barcode)) used in the configuration file. It is not recommended that template and matrix markers are mixed in the same application, as this lowers the tracking reliability of both types.
Management of OSG objects is encapsulated in a C-pseudoclass named VirtualEnvironment, which in turn acts through the API offered by the ARosg library. ARosg contains a reasonable amount of functionality for manipulating the scene graph. See the API documentation for libARosg.
Loads NFT dataset names from a configuration file.
The example uses the “Pinball.jpg” image supplied in the “Misc/patterns” folder. ARToolKit NFT requires a fast device, preferably dual-core for good operation, e.g. Samsung Galaxy SII or similar. Build/deployment for Android API 9 (Android OS v2.3) or later is recommended.
Shows an example of playback of a video file on a marker surface. The example is NDK (native)-based. Movie playback is only supported by Android OS v4.0 (“Ice Cream Sandwich”) and later (Android API level 14), and support varies in quality and reliability from device to device. It is highly recommended that you provide alternate playback mechanisms for devices where playback in the AR environment cannot proceed, e.g. full screen playback.
http://artoolkit.org/documentation/doku.php?id=4_Android:android_native
[Artoolkit] ARToolKit's SDK Structure on Android
Original post: http://www.cnblogs.com/jesse123/p/6400060.html