arcore-unity-sdk | ARCore SDK for Unity | Augmented Reality library
kandi X-RAY | arcore-unity-sdk Summary
This SDK provides native APIs for all of the essential AR features like motion tracking, environmental understanding, and light estimation. With these capabilities you can build entirely new AR experiences or enhance existing apps with AR features. Caution: The ARCore SDK for Unity is deprecated and no longer supported in Unity 2020 and later. This SDK should only be used by developers working on existing projects which are unable to migrate to Unity's AR Foundation. Developers starting new projects should instead use the [ARCore Extensions for AR Foundation](https://github.com/google-ar/arcore-unity-extensions).
Community Discussions
Trending Discussions on arcore-unity-sdk
QUESTION
I'm an IT student, and I would like to know (understand) more about the Augmented Faces API in ARCore.
I just saw the ARCore v1.7 release and the new Augmented Faces API. I can see the enormous potential of this API, but I haven't found any questions or articles on the subject. So I'm questioning myself, and here are some assumptions / questions that come to my mind about this release.
Assumption: the ARCore team is using machine learning (like Instagram and Snapchat) to generate landmarks all over the face, probably HOG face detection.
- How does ARCore generate 468 points all over the user's face on a smartphone? I couldn't find any answer to that, even in the source code.
- How can they get depth from a single smartphone camera?
- How could the face detection / tracking be adapted to a custom object or another part of the body, like a hand?
So if you have any advice or remarks on this subject, let's share!
...ANSWER
Answered 2019-Apr-20 at 07:01
- ARCore's new Augmented Faces API, which works on the front-facing camera without a depth sensor, offers a high-quality, 468-point 3D mesh that allows users to attach effects to their faces, such as animated masks, glasses, and skin retouching. The mesh provides coordinates and region-specific anchors that make it possible to add these effects. I firmly believe that the facial landmark detection is generated with the help of computer vision algorithms under the hood of ARCore 1.7. It's also important to say that you can get started in Unity or in Sceneform by creating an ARCore session with the "front-facing camera" and the Augmented Faces "mesh" mode enabled. Note that other AR features such as plane detection aren't currently available when using the front-facing camera.
AugmentedFace extends Trackable, so faces are detected and updated just like planes, Augmented Images, and other Trackables.
As you know, 2+ years ago Google released the Face API, which performs face detection: it locates faces in pictures, along with their position (where they are in the picture) and orientation (which way they're facing, relative to the camera). The Face API allows you to detect landmarks (points of interest on a face) and perform classifications to determine whether the eyes are open or closed and whether or not a face is smiling. The Face API also detects and follows faces in moving images, which is known as face tracking.
So ARCore 1.7 just borrowed some architectural elements from the Face API, and now it not only detects facial landmarks and generates 468 points for them, but also tracks them in real time at 60 fps and sticks 3D facial geometry to them.
See Google's Face Detection Concepts Overview.
Calculating a depth channel in a video shot with a moving RGB camera is not rocket science. You just need to apply a parallax formula to tracked features: if the translation amplitude of a feature on a static object is quite high, the tracked object is close to the camera, and if the amplitude is quite low, the object is farther away. These approaches to calculating a depth channel have been common in compositing apps such as The Foundry NUKE and Blackmagic Fusion for more than 10 years. Now the same principles are accessible in ARCore.
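As a toy illustration of that parallax idea (an illustrative sketch, not ARCore's actual implementation; all numbers are hypothetical), depth is proportional to how far the camera moved and inversely proportional to how far the tracked feature shifted on screen:

```java
public class ParallaxDepth {
    /**
     * Toy parallax/triangulation formula: depth = focal length * baseline / disparity.
     * focalPx     - camera focal length in pixels (hypothetical value)
     * baselineM   - distance the camera moved between frames, in meters
     * disparityPx - how far the tracked feature shifted on screen, in pixels
     */
    static double depthMeters(double focalPx, double baselineM, double disparityPx) {
        return focalPx * baselineM / disparityPx;
    }

    public static void main(String[] args) {
        // A feature that shifts only 50 px lies twice as far away as one that shifts 100 px.
        System.out.println(depthMeters(1000.0, 0.1, 50.0));  // 2.0 (meters)
        System.out.println(depthMeters(1000.0, 0.1, 100.0)); // 1.0 (meters)
    }
}
```

This matches the answer's observation: a large on-screen shift means the point is near, a small shift means it is far.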
You cannot adapt the face detection / tracking algorithm to a custom object or another part of the body, like a hand. The Augmented Faces API was developed for faces only.
Here is what the Java code for activating the Augmented Faces feature looks like:
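The snippet itself was not captured in this excerpt. Based on the ARCore 1.7 Java API, the setup would look roughly like this (a hedged sketch, not the answer's original code; it assumes an Android activity context and an ARCore-capable device):

```java
// Sketch: create an ARCore session on the front-facing camera and
// enable the Augmented Faces 3D mesh mode (ARCore 1.7+ Java API).
Session session = new Session(/* context= */ this, EnumSet.of(Session.Feature.FRONT_CAMERA));

Config config = new Config(session);
config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
session.configure(config);
```

Once the session is configured, detected faces can be read each frame via session.getAllTrackables(AugmentedFace.class).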
QUESTION
I've been getting my butt kicked trying to get a vertically placed 3D model (GLB format) placed properly on a vertical surface.
Just to be clear, I am not referring to the difficulty of identifying a vertical surface; that is a whole other problem in itself.
I'm removing the common setup boilerplate to minimize this post.
I am using a fragment that extends ArFragment.
...ANSWER
Answered 2019-Nov-22 at 20:48
Well, I finally got it. It took a while and some serious trial and error of rotating every node, axis, angle, and rotation before I finally got it to place nicely. So I'll share my results in case anyone else needs this as well.
Of course it is mildly subjective to how you held the phone and its understanding of the surroundings, but it's always pretty darn close to level now, without fail, in both the landscape and portrait testing that I have done.
So here's what I've learned.
Setting the worldRotation on the anchorNode will help keep the 3D model facing toward the camera view, using a little subtraction.
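The answer's snippet isn't reproduced in this excerpt, but the "little subtraction" idea can be sketched in plain Java: subtract the node's position from the camera's position, and derive the yaw that makes the node face the camera. (In Sceneform you would feed that same difference vector to Quaternion.lookRotation and pass the result to setWorldRotation; the names below are illustrative only.)

```java
public class FaceCameraYaw {
    /** Yaw in degrees that points a node's +Z forward axis from the node toward the camera. */
    static double yawTowardCamera(double nodeX, double nodeZ, double camX, double camZ) {
        // The "little subtraction": direction = camera position - node position.
        double dx = camX - nodeX;
        double dz = camZ - nodeZ;
        return Math.toDegrees(Math.atan2(dx, dz));
    }

    public static void main(String[] args) {
        System.out.println(yawTowardCamera(0, 0, 0, 1)); // camera straight ahead -> 0.0
        System.out.println(yawTowardCamera(0, 0, 1, 0)); // camera to the right -> ~90.0
    }
}
```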
QUESTION
I'm trying to build the 'arcore camera utility' library with ndk-build, from here: https://github.com/google-ar/arcore-unity-sdk/tree/master/Assets/GoogleARCore/Examples/ComputerVision/Plugins/src
Using this guide: https://yeephycho.github.io/2016/10/20/How-to-Use-NDK-to-Build-A-Shared-Android_Native-Library/ I was at least able to get it to compile into a libarcore_camera_utility.so file. Not only that, but it was actually recognized by my app on the phone, and instead of getting a DLL-missing error I got: "EntryPointNotFoundException: Unable to find an entry point named 'TextureReader_create' in 'arcore_camera_utility'." So it at least found the file now.
The file size of the .so is only 6 KB, so it seems I'm not compiling it correctly, as the already-working 32-bit version that comes with the package is 100 KB. Based on this question, it seems I'm leaving something out: Entry point not found, Android .so file
My Android.mk file is:
...ANSWER
Answered 2019-Oct-03 at 14:57
To compile arcore_camera_utility for the ARM 64-bit target:
1.) Create a new directory called 'arcorelibrary', then a subdirectory called 'jni'
2.) Download this zip: https://github.com/google-ar/arcore-unity-sdk/blob/master/Assets/GoogleARCore/Examples/ComputerVision/Plugins/src/arcore_camera_utility.zip
3.) Get the three .cc files and the three .h files and place them in the jni directory
4.) Create a file called 'Android.mk' and place it in the jni directory, with the following contents:
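The Android.mk contents were not captured in this excerpt. A minimal Android.mk along these lines might work (a sketch: the module name comes from the question, the wildcard picks up whichever three .cc files the zip contains, and the linked libraries are assumptions based on the library's use of OpenGL ES):

```makefile
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE    := arcore_camera_utility
# Build whatever .cc files were placed in this jni directory.
LOCAL_SRC_FILES := $(notdir $(wildcard $(LOCAL_PATH)/*.cc))
LOCAL_CPPFLAGS  := -std=c++11
LOCAL_LDLIBS    := -llog -lGLESv2 -lEGL

include $(BUILD_SHARED_LIBRARY)
```

For the 64-bit target, an accompanying Application.mk with APP_ABI := arm64-v8a (and an appropriate APP_STL setting) would also be needed.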
QUESTION
I'm using Sceneform with ARCore on Android and am unable to clearly understand the concepts from the documentation provided. I'm trying to modify the existing HelloSceneform app from GitHub to create an app where, as soon as it is started, the user sees a 3D object directly in front of them. This is very similar to what I found at https://github.com/google-ar/arcore-unity-sdk/issues/144, but I couldn't figure out how to improve the existing code to get it.
...ANSWER
Answered 2018-Dec-03 at 03:40
@Override
public void onUpdate(FrameTime frameTime) {
    Frame frame = playFragment.getArSceneView().getArFrame();
    if (frame == null) {
        return;
    }
    // Wait until the camera is actually tracking.
    if (frame.getCamera().getTrackingState() != TrackingState.TRACKING) {
        return;
    }
    for (Plane plane : frame.getUpdatedTrackables(Plane.class)) {
        playFragment.getPlaneDiscoveryController().hide();
        if (plane.getTrackingState() == TrackingState.TRACKING) {
            // Hit-test from the center of the screen and place the model on the first plane hit.
            for (HitResult hit : frame.hitTest(getScreenCenter().x, getScreenCenter().y)) {
                Trackable trackable = hit.getTrackable();
                if (trackable instanceof Plane && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
                    Anchor anchor = hit.createAnchor();
                    AnchorNode anchorNode = new AnchorNode(anchor);
                    anchorNode.setParent(playFragment.getArSceneView().getScene());
                    Pose pose = hit.getHitPose();
                    Node node = new Node();
                    node.setRenderable(modelRenderable);
                    // Lift the model 5 cm above the hit pose.
                    node.setLocalPosition(new Vector3(pose.tx(), pose.compose(Pose.makeTranslation(0.0f, 0.05f, 0.0f)).ty(), pose.tz()));
                    node.setParent(anchorNode);
                }
            }
        }
    }
}

private Vector3 getScreenCenter() {
    View vw = findViewById(android.R.id.content);
    return new Vector3(vw.getWidth() / 2f, vw.getHeight() / 2f, 0f);
}
QUESTION
I'm new to Unity and ARCore, and I am trying to launch the ARCore example project, but nothing is showing on my phone.
I'm using Unity 2018.2.2f1 and ARCore v1.4, but I had the exact same problem when I used the previous versions (2018.2.1f1 and 1.3). I run Unity on Windows 10.
Here is what I did:
- I created a new project named "AR"
- I added the "arcore-unity-sdk-v1.4.0" package: Assets > Import Package > Custom Package... and selected "arcore-unity-sdk-v1.4.0"
- I selected "All" to import the whole package, then "Import"
- I then had 3 CS0619 errors:
[...]
error CS0619: 'UnityEngine.Network' is obsolete:
[...]error CS0619: 'UnityEngine.Network.player' is obsolete:
[...]error CS0619: 'UnityEngine.NetworkPlayer.ipAddress' is obsolete:
[...] - I corrected them by following these instructions: https://github.com/google-ar/arcore-unity-sdk/issues/197
- I then had 1 CS0618 warning, in 5 different files:
[...]
warning CS0618: 'UnityEditor.Build.IPreprocessBuild' is obsolete: 'Use IPreprocessBuildWithReport instead'
[...] So I used "IPreprocessBuildWithReport" in the 5 different files: https://docs.unity3d.com/ScriptReference/Build.IPreprocessBuildWithReport.OnPreprocessBuild.html
- 1 warning was still remaining, another CS0618:
[...]
warning CS0618: 'UnityEngine.ScreenOrientation.Unknown' is obsolete:
[...] I simply replaced "ScreenOrientation.Unknown" with "ScreenOrientation.AutoRotation"
- Now that I didn't have any errors or warnings left, I continued to follow the ARCore tutorial: https://developers.google.com/ar/develop/unity/quickstart-android (I am now at the "Open the sample scene" part)
- So I double-clicked on "HelloAR": Assets > GoogleARCore > Examples > HelloAR > Scenes > HelloAR
- I kept following the tutorial (Configure build settings)
- I went to File > Build Settings to open the Build Settings window
- Selected Android and clicked Switch Platform
- Player Settings > Other Settings > Multithreaded Rendering : Disable
- Player Settings > Other Settings > Package Name : com.Help.HelloAR
- Player Settings > Other Settings > Minimum API Level : Android 7.0 'Nougat' (API level 24)
- Player Settings > Other Settings > Target API Level : Android 7.0 'Nougat' (API level 24)
- Player Settings > XR Settings > ARCore Supported : Enable
- Now that everything was ready, I enabled developer options and USB debugging on my phone (Samsung S8+, Android 8.0.0), connected it to my computer, and went to File > Build Settings > Build And Run.
- Saved the apk to my computer.
- But when I launch the application on my phone, the only thing I get is this: https://image.noelshack.com/fichiers/2018/32/2/1533648381-ar-error.jpg (the camera is allowed to be used by the application). (I also tried to "Build" and then copy the apk to my phone; it does the same thing.)
But the "fun" part is that if I go back to Unity, delete the "HelloAR" scene, and start putting 3D objects in front of the camera, I still get this grey and blue image (sky texture) when I build the project. No matter what modification I make, I still get it.
Does anyone know where this problem comes from and maybe how to fix it? Can you think of anything I could try?
Thanks in advance
...ANSWER
Answered 2018-Aug-07 at 19:58
I had the exact same issue; then I finally noticed that I was exporting the sample scene that Unity defaults to. Did you check to see whether you were in fact exporting the HelloAR scene for your build?
QUESTION
I tried the ARCore sample project HelloAR in Unity. Now my aim is to efficiently export the mapped point cloud for postprocessing.
In the sample HelloAR Unity project, I changed the "PointcloudVisualizer.cs" script. Now, on every Update, I copy the current point cloud into a Vector4 list.
...ANSWER
Answered 2018-Apr-11 at 22:43
- What's your definition of postprocessing? Are you attempting to run additional processing while your app is running, or are you trying to download the data to play with locally on your computer? If the former, you shouldn't write to a text file. Keep in mind the positions are relative to the camera transform.
- Yes, point clouds are relative to each frame, there's no global point cloud (or global mesh) support (yet?)
- It's based on some heuristic that ARCore computes. Higher is better. (think of it as a percentage, I'm 90% sure this is a valid point cloud point).
- Are you trying to reconstruct the room? Are you trying to do it in real time? It's the same problem as the lidar systems used in autonomous vehicles. Essentially, if you see multiple cloud points at the same spot over successive frames, there's a high likelihood that that point is real (probably the same heuristic that ARCore uses).
- Ah, this answers a few of the questions above. The toughest part will be aligning your cloud-point-generated version of the room with your lidar-scanned representation. But otherwise, other than the fact that single-camera visual-inertial odometry will give you noisier points than a lidar-based scan, you should be able to match one with the other with some error.
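The "seen at the same spot over successive frames" heuristic above can be sketched as a toy vote counter (illustrative only, not ARCore's implementation; the class and method names are made up): quantize points to a coarse grid and treat a cell as a real surface point once it has been observed in enough frames.

```java
import java.util.HashMap;
import java.util.Map;

public class CloudPointFilter {
    private final Map<String, Integer> votes = new HashMap<>();
    private final double cellSize; // grid resolution in meters

    public CloudPointFilter(double cellSize) {
        this.cellSize = cellSize;
    }

    // Quantize a 3D point to a grid cell so nearby observations land in the same bucket.
    private String key(double x, double y, double z) {
        return Math.round(x / cellSize) + "," + Math.round(y / cellSize) + "," + Math.round(z / cellSize);
    }

    /** Record an observation; returns how many times this cell has now been seen. */
    public int observe(double x, double y, double z) {
        return votes.merge(key(x, y, z), 1, Integer::sum);
    }

    /** A point is considered "real" once its cell has enough supporting observations. */
    public boolean isConfident(double x, double y, double z, int minVotes) {
        return votes.getOrDefault(key(x, y, z), 0) >= minVotes;
    }
}
```

Three slightly jittered observations of the same physical point (e.g. x = 1.00, 1.01, 1.02 m with a 5 cm cell) all fall in one cell, so that cell quickly accumulates enough votes to be trusted.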
QUESTION
Using unity, and the new 1.1 version of ARCore, the API exposes some new ways of getting the camera information. However, I can't find any good examples of saving this as a file to local storage as a jpg, for example.
The ARCore examples have a nice example of retrieving the camera data and then doing something with it here: https://github.com/google-ar/arcore-unity-sdk/blob/master/Assets/GoogleARCore/Examples/ComputerVision/Scripts/ComputerVisionController.cs#L212 and there are a few examples of retrieving the camera data in that class, but nothing around saving that data.
I've seen this: How to take & save picture / screenshot using Unity ARCore SDK?, which uses the older API way of getting data and doesn't really go into detail on saving either.
What I ideally want is a way to turn the data from Frame.CameraImage.AcquireCameraImageBytes() in the API into a JPG stored on disk, through Unity.
Update
I've since got it working mainly through digging through this issue on the ARCore github page: https://github.com/google-ar/arcore-unity-sdk/issues/72#issuecomment-355134812 and modifying Sonny's answer below, so it's only fair that one gets accepted.
In case anyone else is trying to do this I had to do the following steps:
Add a callback to the Start method to run your OnImageAvailable method when the image is available:
ANSWER
Answered 2018-Apr-05 at 16:57
In Unity, it should be possible to load the raw image data into a texture and then save it to a JPG using UnityEngine.ImageConversion.EncodeToJPG. Example code:
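The example code itself was not captured in this excerpt. A minimal C# sketch of what the answer describes might look like this (an assumption-laden sketch: `rgbaBytes`, `width`, and `height` are placeholders, and it assumes the bytes have already been converted from the YUV format that AcquireCameraImageBytes() returns into RGBA):

```csharp
using System.IO;
using UnityEngine;

public static class FrameSaver
{
    // Load raw RGBA pixel data into a texture and write it to disk as a JPG.
    public static void SaveJpg(byte[] rgbaBytes, int width, int height, string fileName)
    {
        var texture = new Texture2D(width, height, TextureFormat.RGBA32, false);
        texture.LoadRawTextureData(rgbaBytes);
        texture.Apply();

        byte[] jpg = ImageConversion.EncodeToJPG(texture, 75);
        File.WriteAllBytes(Path.Combine(Application.persistentDataPath, fileName), jpg);
    }
}
```

Note that the camera image arrives as YUV-420 planes, so a YUV-to-RGBA conversion step is needed before this will produce a sensible JPG.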
QUESTION
I am trying to create an app using arcore-unity-sdk-preview, which is supported by Google.
However, in order to use ARCore, arcore-preview.apk must be installed; otherwise, ARCore will stop working.
If I distribute the app I created on the store, users will not be able to use the app unless they have installed arcore-preview.apk. Is there a solution to this problem?
Or is this issue still present because ARCore is not fully released yet?
If you know about this, please help me.
...ANSWER
Answered 2017-Nov-22 at 12:50
As you said, distribution is still an issue because it's not fully released.
To work around this issue, you could upload the apk somewhere / ship it in your app's files and install it programmatically, but the user has to allow the installation of apps from unknown sources (Settings > Security > Unknown Sources).
QUESTION
I'm trying to modify the demo scene in the Unity ARCore SDK, and I've created a static bool variable isCreated to check whether the Andy prefab has been created.
In the following check:
...ANSWER
Answered 2017-Sep-19 at 03:29
I don't know how badly you want your boolean to be static, but I achieved the same result by doing something similar:
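The snippet that followed was not captured in this excerpt. The place-once guard the question describes might look like this in a HelloAR-style controller (a sketch; `AndyPrefab` and the placement method stand in for the example's actual code):

```csharp
using UnityEngine;

public class PlaceOnceController : MonoBehaviour
{
    public GameObject AndyPrefab;

    // static: shared across all instances, so the prefab is only ever created once.
    private static bool isCreated = false;

    public void PlaceAndy(Vector3 position, Quaternion rotation)
    {
        if (isCreated)
        {
            return; // Andy already exists; don't spawn another.
        }
        Instantiate(AndyPrefab, position, rotation);
        isCreated = true;
    }
}
```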
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported