arcore-unity-sdk | ARCore SDK for Unity | Augmented Reality library

 by google-ar | C# | Version: v1.25.0 | License: Non-SPDX

kandi X-RAY | arcore-unity-sdk Summary

arcore-unity-sdk is a C# library typically used in Virtual Reality, Augmented Reality, and Unity applications. arcore-unity-sdk has no bugs and no reported vulnerabilities, and it has medium support. However, arcore-unity-sdk has a Non-SPDX license. You can download it from GitHub.

This SDK provides native APIs for all of the essential AR features, such as motion tracking, environmental understanding, and light estimation. With these capabilities you can build entirely new AR experiences or enhance existing apps with AR features. Caution: The ARCore SDK for Unity is deprecated and no longer supported in Unity 2020 and later. This SDK should only be used by developers working on existing projects which are unable to migrate to Unity's AR Foundation. Developers starting new projects should instead use the [ARCore Extensions for AR Foundation](https://github.com/google-ar/arcore-unity-extensions).

            kandi-support Support

              arcore-unity-sdk has a medium active ecosystem.
              It has 1,398 stars, 432 forks, and 145 watchers.
              It had no major release in the last 12 months.
              There are 253 open issues and 521 have been closed. On average, issues are closed in 200 days. There are 6 open pull requests and 0 closed pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of arcore-unity-sdk is v1.25.0.

            kandi-Quality Quality

              arcore-unity-sdk has 0 bugs and 0 code smells.

            kandi-Security Security

              arcore-unity-sdk has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              arcore-unity-sdk code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

            kandi-License License

              arcore-unity-sdk has a Non-SPDX License.
              A Non-SPDX license can be an open-source license that is not SPDX-compliant, or a license that is not open source at all; review it closely before use.

            kandi-Reuse Reuse

              arcore-unity-sdk releases are available to install and integrate.
              Installation instructions are available. Examples and code snippets are not available.
              It has 7 lines of code, 0 functions and 259 files.
              It has low code complexity. Code complexity directly impacts maintainability of the code.


            arcore-unity-sdk Key Features

            No Key Features are available at this moment for arcore-unity-sdk.

            arcore-unity-sdk Examples and Code Snippets

            No Code Snippets are available at this moment for arcore-unity-sdk.

            Community Discussions

            QUESTION

            Augmented Faces API – how are facial landmarks generated?
            Asked 2020-Mar-11 at 18:41

             I'm an IT student and would like to know (understand) more about the Augmented Faces API in ARCore.

             I just saw the ARCore v1.7 release and the new Augmented Faces API. I get the enormous potential of this API, but I didn't see any questions or articles on the subject. So I'm questioning myself, and here are some assumptions / questions which come to my mind about this release.

             Assumption
             • The ARCore team is using machine learning (like Instagram and Snapchat) to generate landmarks all over the face, probably HOG face detection.
             Questions
             • How does ARCore generate 468 points all over the user's face on a smartphone? It's impossible to find any answer to that, even in the source code.
             • How can they get depth from a simple smartphone camera?
             • How can the face detection / tracking be adapted to a custom object or another part of the body, like a hand?

             So if you have any advice or remarks on this subject, let's share!

            ...

            ANSWER

            Answered 2019-Apr-20 at 07:01
             1. ARCore's new Augmented Faces API, which works on the front-facing camera without a depth sensor, offers a high-quality, 468-point 3D mesh that allows users to attach effects to their faces, such as animated masks, glasses, skin retouching, etc. The mesh provides coordinates and region-specific anchors that make it possible to add these effects.

             I firmly believe that the facial landmark detection is generated with the help of computer vision algorithms under the hood of ARCore 1.7. It's also important to note that you can get started in Unity or in Sceneform by creating an ARCore session with the front-facing camera and the Augmented Faces "mesh" mode enabled. Note that other AR features, such as plane detection, aren't currently available when using the front-facing camera. AugmentedFace extends Trackable, so faces are detected and updated just like planes, Augmented Images, and other Trackables.

             As you know, 2+ years ago Google released the Face API, which performs face detection: it locates faces in pictures, along with their position (where they are in the picture) and orientation (which way they're facing, relative to the camera). The Face API allows you to detect landmarks (points of interest on a face) and perform classifications to determine whether the eyes are open or closed, and whether or not a face is smiling. The Face API also detects and follows faces in moving images, which is known as face tracking.

             So, ARCore 1.7 just borrowed some architectural elements from the Face API; now it not only detects facial landmarks and generates 468 points for them, but also tracks them in real time at 60 fps and sticks 3D facial geometry to them.

            See Google's Face Detection Concepts Overview.

             1. Calculating a depth channel in a video shot by a moving RGB camera is not rocket science. You just need to apply a parallax formula to the tracked features. If the translation amplitude of a feature on a static object is quite high, the tracked object is closer to the camera; if the amplitude is quite low, the object is farther away. These approaches to calculating a depth channel have been standard in compositing apps such as The Foundry NUKE and Blackmagic Fusion for more than 10 years. Now the same principles are accessible in ARCore.
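
             To make the parallax idea concrete, here is a small illustrative sketch (not ARCore's actual implementation) of the classic stereo relation depth = focal length × baseline / disparity: a feature that shifts a lot between frames is close, and one that barely shifts is far away.

```java
// Illustrative only: the classic stereo/parallax relation, not ARCore's internal code.
public class ParallaxDepth {
    // depth (m) = focal length (px) * baseline (m) / disparity (px)
    static double depthFromDisparity(double focalLengthPx, double baselineMeters, double disparityPx) {
        if (disparityPx <= 0.0) {
            throw new IllegalArgumentException("disparity must be positive");
        }
        return focalLengthPx * baselineMeters / disparityPx;
    }

    public static void main(String[] args) {
        // A camera with a 1000 px focal length moved 0.1 m between frames;
        // a feature that shifted 50 px is about 2 m away.
        System.out.println(depthFromDisparity(1000.0, 0.1, 50.0)); // 2.0
        // The same feature shifting only 5 px would be ~20 m away.
        System.out.println(depthFromDisparity(1000.0, 0.1, 5.0)); // 20.0
    }
}
```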

             2. You cannot repurpose the face detection / tracking algorithm for a custom object or another part of the body, like a hand. The Augmented Faces API was developed for faces only.

             Here's what the Java code for activating the Augmented Faces feature looks like:
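
             (The original snippet did not survive the page extraction. A minimal sketch of the usual setup, per the ARCore Java API, is below; it is a fragment that assumes an Android Activity with the com.google.ar.core dependency, not the answerer's exact code.)

```java
// Sketch only: assumes `this` is an Android Context inside an Activity.
// Create a session that uses the front-facing (selfie) camera.
Session session = new Session(this, EnumSet.of(Session.Feature.FRONT_CAMERA));

// Enable the Augmented Faces 3D mesh mode on the session config.
Config config = new Config(session);
config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
session.configure(config);
```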

            Source https://stackoverflow.com/questions/54869965

            QUESTION

            How to orient GLTF in Android Sceneform level with gravity
            Asked 2020-Feb-20 at 21:08

            I've been getting my butt kicked trying to get a vertically placed 3D model (GLB format) placed properly on a vertical surface.

            Just to be clear, I am not referring to the difficulty of identifying a vertical surface; that is a whole other problem in itself.

            Removing common boilerplate of setup to minimize this post.

            I am using a fragment that extends ARFragment.

            ...

            ANSWER

            Answered 2019-Nov-22 at 20:48

            Well, I finally got it. It took a while and some serious trial and error of rotating every node, axis, angle, and rotation before I finally got it to place nicely. So I'll share my results in case anyone else needs this as well.

            End Result looked like:

            Of course it is mildly subjective to how you held the phone and its understanding of the surroundings, but it's always pretty darn close to level now, without fail, in both the landscape and portrait testing that I have done.

            So here's what I've learned.

            Setting the worldRotation on the anchorNode will help keep the 3D model facing towards the camera view, using a little subtraction.
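
            (A hypothetical Sceneform sketch of that idea, since the answer's code isn't included in this excerpt: subtract the node's position from the camera's to get a facing direction, then set the node's world rotation from it. `arFragment` and `anchorNode` are assumed names.)

```java
// Sketch only: assumes a Sceneform ArFragment and an existing AnchorNode.
Vector3 cameraPos = arFragment.getArSceneView().getScene().getCamera().getWorldPosition();
Vector3 nodePos = anchorNode.getWorldPosition();
// Direction from the model to the camera ("a little subtraction").
Vector3 direction = Vector3.subtract(cameraPos, nodePos);
// Face the model toward the camera while keeping it level with gravity.
Quaternion lookRotation = Quaternion.lookRotation(direction, Vector3.up());
anchorNode.setWorldRotation(lookRotation);
```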

            Source https://stackoverflow.com/questions/58868409

            QUESTION

            How do I build the arcore camera_utility shared library with ndk-build?
            Asked 2019-Oct-30 at 22:22

            I'm trying to build the 'arcore camera utility' library with ndk-build, from here: https://github.com/google-ar/arcore-unity-sdk/tree/master/Assets/GoogleARCore/Examples/ComputerVision/Plugins/src

            Using this guide: https://yeephycho.github.io/2016/10/20/How-to-Use-NDK-to-Build-A-Shared-Android_Native-Library/ I was at least able to get it to compile into a libarcore_camera_utility.so file. Not only that, it was actually recognized by my app on the phone: instead of getting a DLL-missing error, I got the error "EntryPointNotFoundException: Unable to find an entry point named 'TextureReader_create' in 'arcore_camera_utility'.", which means it at least found the file now.

            The file size of the .so is only 6 KB, so it seems like I'm not compiling it correctly, as the already-working 32-bit version that comes with the package is 100 KB. Based on this question, it seems like I'm leaving something out: Entry point not found, Android .so file

            My Android.mk file is:

            ...

            ANSWER

            Answered 2019-Oct-03 at 14:57

            To compile arcore_camera_utility for the ARM 64-bit target:

            1.) Create a new directory called 'arcorelibrary', then a subdirectory called 'jni'

            2.) Download this zip: https://github.com/google-ar/arcore-unity-sdk/blob/master/Assets/GoogleARCore/Examples/ComputerVision/Plugins/src/arcore_camera_utility.zip

            3.) Get the three .cc files and the three .h files and place them in the jni directory

            4.) Create a file called 'Android.mk' and place it in the jni directory, with the following contents:
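
            (The Android.mk contents were lost in extraction. A typical shared-library makefile for this setup would look roughly like the sketch below; the wildcard picks up the three .cc files from step 3 so no file names need to be assumed.)

```makefile
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
# The module name must match the library the app loads: libarcore_camera_utility.so
LOCAL_MODULE := arcore_camera_utility
# Compile the three .cc files placed in jni/ in step 3
LOCAL_SRC_FILES := $(subst $(LOCAL_PATH)/,,$(wildcard $(LOCAL_PATH)/*.cc))
LOCAL_LDLIBS := -llog -lGLESv2
include $(BUILD_SHARED_LIBRARY)
```

            Then build for the 64-bit target with something like `ndk-build NDK_PROJECT_PATH=. APP_BUILD_SCRIPT=jni/Android.mk APP_ABI=arm64-v8a`.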

            Source https://stackoverflow.com/questions/58195077

            QUESTION

            Sceneform: How do you disable surface detection and place an object in front of the camera?
            Asked 2018-Dec-03 at 03:40

            I'm using Sceneform with ARCore on Android and am unable to understand the concepts clearly from the documentation provided. I'm trying to modify the existing HelloSceneform app from GitHub to create an app where, as soon as it's started, the user sees a 3D object directly in front of them. This is very similar to what I found at https://github.com/google-ar/arcore-unity-sdk/issues/144, but I couldn't figure out how I can improve the existing code to get it.

            ...

            ANSWER

            Answered 2018-Dec-03 at 03:40
             @Override
                public void onUpdate(FrameTime frameTime) {
                    Frame frame = playFragment.getArSceneView().getArFrame();
                    if (frame == null) {
                        return;
                    }
                    if (frame.getCamera().getTrackingState() != TrackingState.TRACKING) {
                        return;
                    }
                    for (Plane plane : frame.getUpdatedTrackables(Plane.class)) {
                        playFragment.getPlaneDiscoveryController().hide();
                        if (plane.getTrackingState() == TrackingState.TRACKING) {
                            for (HitResult hit : frame.hitTest(getScreenCenter().x, getScreenCenter().y)) {
                                Trackable trackable = hit.getTrackable();
                                if (trackable instanceof Plane && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
                                    Anchor anchor = hit.createAnchor();
                                    AnchorNode anchorNode = new AnchorNode(anchor);
                                    anchorNode.setParent(playFragment.getArSceneView().getScene());
                                    Pose pose = hit.getHitPose();
                                    Node node = new Node();
                                    node.setRenderable(modelRenderable);
                                    node.setLocalPosition(new Vector3(pose.tx(), pose.compose(Pose.makeTranslation(0.0f, 0.05f, 0.0f)).ty(), pose.tz()));
                                    node.setParent(anchorNode);
                                }
                            }
                        }
                    }
                }
            
                private Vector3 getScreenCenter() {
                    View vw = findViewById(android.R.id.content);
                    return new Vector3(vw.getWidth() / 2f, vw.getHeight() / 2f, 0f);
                }
            

            Source https://stackoverflow.com/questions/53575489

            QUESTION

            Trying the example project "HelloAR": build succeeds but nothing shows on the phone
            Asked 2018-Aug-07 at 19:58

            I'm new to Unity and ARCore and I am trying to launch the ARCore example project, but nothing is showing on my phone.

            I'm using Unity 2018.2.2f1 and ARCore v1.4, but I had the exact same problem when I used the previous versions (2018.2.1f1 and 1.3). I run Unity on Windows 10.

            Here is what I did:

            • I created a new project named "AR"
            • I added the "arcore-unity-sdk-v1.4.0" package: Assets > Import Package > Custom Package ... and selected "arcore-unity-sdk-v1.4.0"
            • I selected "All" to import the whole package, then "Import"
            • I then had 3 CS0619 errors: [...] error CS0619: 'UnityEngine.Network' is obsolete: [...] error CS0619: 'UnityEngine.Network.player' is obsolete: [...] error CS0619: 'UnityEngine.NetworkPlayer.ipAddress' is obsolete: [...]
            • I corrected them by following these instructions: https://github.com/google-ar/arcore-unity-sdk/issues/197
            • I then had 1 CS0618 warning in 5 different files: [...] warning CS0618: 'UnityEditor.Build.IPreprocessBuild' is obsolete: 'Use IPreprocessBuildWithReport instead' [...] So I used "IPreprocessBuildWithReport" in the 5 different files: https://docs.unity3d.com/ScriptReference/Build.IPreprocessBuildWithReport.OnPreprocessBuild.html
            • 1 warning was still remaining, another CS0618: [...] warning CS0618: 'UnityEngine.ScreenOrientation.Unknown' is obsolete: [...] I simply replaced "[...] ScreenOrientation.Unknown" with "[...] ScreenOrientation.AutoRotation"
            • Now that I didn't have any errors or warnings left, I continued to follow the ARCore tutorial: https://developers.google.com/ar/develop/unity/quickstart-android (I am now at the "Open the sample scene" part)
            • So I double-clicked on "HelloAR": Assets > GoogleARCore > Examples > HelloAR > Scenes > HelloAR
            • I kept following the tutorial (Configure build settings)
            • I went to File > Build Settings to open the Build Settings window
            • Selected Android and clicked Switch Platform
            • Player Settings > Other Settings > Multithreaded Rendering: Disabled
            • Player Settings > Other Settings > Package Name: com.Help.HelloAR
            • Player Settings > Other Settings > Minimum API Level: Android 7.0 'Nougat' (API level 24)
            • Player Settings > Other Settings > Target API Level: Android 7.0 'Nougat' (API level 24)
            • Player Settings > XR Settings > ARCore Supported: Enabled
            • Now that everything was ready, I enabled developer options and USB debugging on my phone (Samsung S8+, Android version 8.0.0), connected it to my computer, and went to File > Build Settings > Build And Run.
            • Saved the apk to my computer.
            • But when I launch the application on my phone, the only thing I get is this: https://image.noelshack.com/fichiers/2018/32/2/1533648381-ar-error.jpg (The camera is allowed to be used by the application.) (I also tried to "Build" and then copy the apk to my phone; it does the same thing.)

            But the "fun" part is that if I go back to Unity, delete the "HelloAR" scene, and start to put 3D objects in front of the camera, I will still have this grey and blue image (sky texture) when I build the project. It doesn't matter what modification I make, I will still have it.

            Does anyone know where this problem comes from and maybe how to fix it? Can you think of anything I could try?

            Thanks in advance

            ...

            ANSWER

            Answered 2018-Aug-07 at 19:58

            I had the same exact issue, then I finally noticed that I was exporting the sample scene that unity defaults to. Did you check to see if you were in fact exporting the HelloAR scene for your build?

            Source https://stackoverflow.com/questions/51729519

            QUESTION

            ARCore: export point cloud for postprocessing
            Asked 2018-Apr-11 at 22:43

            I tried the ARCore sample project HelloAR in Unity. Now my aim is to efficiently export the mapped point cloud for postprocessing.

            In the sample HelloAR Unity project, I changed the "PointcloudVisualizer.cs" script. Now, on every Update, I copy the current point cloud into a Vector4 list.

            ...

            ANSWER

            Answered 2018-Apr-11 at 22:43
            1. What's your definition of postprocessing? Are you attempting to run additional processing while your app is running, or are you trying to download the data to play with locally on your computer? If the former, you shouldn't write to a text file. Keep in mind the positions are relative to the camera transform.
            2. Yes, point clouds are relative to each frame; there's no global point cloud (or global mesh) support (yet?).
            3. It's based on some heuristic that ARCore computes. Higher is better. (Think of it as a percentage: "I'm 90% sure this is a valid point cloud point.")
            4. Are you trying to reconstruct the room? Are you trying to do it in real time? It's the same problem as the lidar systems used in autonomous vehicles. Essentially, if you see multiple cloud points at the same spot over successive frames, there's a high likelihood that that point is real (probably the same heuristic that ARCore uses).
            5. Ahh, this answers a few of the questions above. The toughest part will be aligning your cloud-point-generated version of the room with your lidar-scanned representation. But otherwise, other than the fact that single-camera visual-inertial odometry will give you noisier points than a lidar-based scan, you should be able to match one with the other with some error.
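
            To make points 3 and 4 concrete, here is a small illustrative sketch (plain Java, not ARCore or Unity code; all names are hypothetical): filter raw (x, y, z, confidence) points by a confidence threshold, and count repeated observations of the same quantized voxel across frames to judge which points are likely real.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of the heuristics described above, not ARCore's code.
public class PointCloudFilter {
    // Keep a point only if its confidence value passes the threshold (point 3).
    static boolean isConfident(float[] xyzc, float threshold) {
        return xyzc[3] >= threshold;
    }

    // Quantize a point to a voxel key so repeated observations across frames
    // of (nearly) the same spot land in the same bucket (point 4).
    static String voxelKey(float[] xyzc, float voxelSize) {
        return Math.round(xyzc[0] / voxelSize) + ","
             + Math.round(xyzc[1] / voxelSize) + ","
             + Math.round(xyzc[2] / voxelSize);
    }

    public static void main(String[] args) {
        // One frame with a high-confidence point and a low-confidence one.
        float[][] frame = { {0.0f, 0.0f, 1.0f, 0.95f}, {0.01f, 0.0f, 1.0f, 0.2f} };
        Map<String, Integer> votes = new HashMap<>();
        for (float[] p : frame) {
            if (isConfident(p, 0.5f)) {
                votes.merge(voxelKey(p, 0.05f), 1, Integer::sum);
            }
        }
        // Voxels that accumulate votes over many frames are likely real geometry.
        System.out.println(votes);
    }
}
```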

            Source https://stackoverflow.com/questions/49772050

            QUESTION

            Save AcquireCameraImageBytes() from Unity ARCore to storage as an image
            Asked 2018-Apr-05 at 16:57

            Using Unity and the new 1.1 version of ARCore, the API exposes some new ways of getting the camera information. However, I can't find any good examples of saving this as a file to local storage, as a jpg for example.

            The ARCore examples have a nice example of retrieving the camera data and then doing something with it here: https://github.com/google-ar/arcore-unity-sdk/blob/master/Assets/GoogleARCore/Examples/ComputerVision/Scripts/ComputerVisionController.cs#L212 and there are a few examples of retrieving the camera data in that class, but nothing around saving that data.

            I've seen this: How to take & save picture / screenshot using Unity ARCore SDK? which uses the older API way of getting data, and doesn't really go into detail on saving, either.

            What I ideally want is a way to turn the data from Frame.CameraImage.AcquireCameraImageBytes() in the API into a stored jpg on disk, through Unity.

            Update

            I've since got it working mainly through digging through this issue on the ARCore github page: https://github.com/google-ar/arcore-unity-sdk/issues/72#issuecomment-355134812 and modifying Sonny's answer below, so it's only fair that one gets accepted.

            In case anyone else is trying to do this I had to do the following steps:

            1. Add a callback to the Start method to run your OnImageAvailable method when the image is available:

              ...

            ANSWER

            Answered 2018-Apr-05 at 16:57

            In Unity, it should be possible to load the raw image data into a texture and then save it to a JPG using UnityEngine.ImageConversion.EncodeToJPG. Example code:

            Source https://stackoverflow.com/questions/49579334

            QUESTION

            How do I distribute apps created using ARCore in the store?
            Asked 2017-Nov-22 at 12:50

            I am trying to create an app using arcore-unity-sdk-preview, which is supported by Google.

            However, in order to use ARCore, arcore-preview.apk must be installed. Otherwise, ARCore will stop working.

            If I distribute the app I created in the store, the user will not be able to use the app unless they install arcore-preview.apk. Is there a solution to this problem?

            Or is everyone still experiencing this issue because it's not fully released yet?

            If you know anything about this, please help me.

            ...

            ANSWER

            Answered 2017-Nov-22 at 12:50

            As you said, distribution is still an issue because it's not fully released.

            To work around this issue, you could upload the apk somewhere / ship it in your app's files and install it programmatically, but the user has to allow installation of apps from unknown sources (Settings > Security > Unknown Sources).

            Source https://stackoverflow.com/questions/47430058

            QUESTION

            How to set static variable and instantiate only one prefab for the AR Core demo
            Asked 2017-Sep-19 at 03:29

            I'm trying to modify the demo scene in the Unity AR Core SDK and I've created a static bool variable isCreated to check if the Andy prefab is created.

            In the following check

            ...

            ANSWER

            Answered 2017-Sep-19 at 03:29

            I don't know how badly you want your boolean to be static, but I achieved the same result by doing something similar:

            Source https://stackoverflow.com/questions/45979428

            Community Discussions, Code Snippets contain sources that include Stack Exchange Network

            Vulnerabilities

            No vulnerabilities reported

            Install arcore-unity-sdk

            See the [Getting Started with Unity](https://developers.google.com/ar/develop/unity/getting-started) developer guide.

            Support

            For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check and ask on the Stack Overflow community page.