ARKit | Place virtual objects using ARKit | Augmented Reality library

 by ignacio-chiazzo | Swift | Version: Current | License: MIT

kandi X-RAY | ARKit Summary

ARKit is a Swift library typically used in Virtual Reality and Augmented Reality applications. ARKit has no reported bugs or vulnerabilities, it has a permissive license, and it has low support. You can download it from GitHub.

Augmented reality offers new ways for users to interact with real and virtual 3D content in your app. However, many of the fundamental principles of human interface design are still valid. Convincing AR illusions also require careful attention to 3D asset design and rendering. By following this article's guidelines for AR human interface principles and experimenting with this example code, you can create immersive, intuitive augmented reality experiences.

            Support

              ARKit has a low active ecosystem.
              It has 340 star(s) with 79 fork(s). There are 18 watchers for this library.
              It had no major release in the last 6 months.
              There are 4 open issues and 1 has been closed. On average, issues are closed in 13 days. There are no pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of ARKit is current.

            Quality

              ARKit has no bugs reported.

            Security

              ARKit has no reported vulnerabilities, and neither do its dependent libraries.

            License

              ARKit is licensed under the MIT License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

            Reuse

              ARKit releases are not available. You will need to build from source code and install it.
              Installation instructions are available; examples and code snippets are not.


            ARKit Key Features

            No Key Features are available at this moment for ARKit.

            ARKit Examples and Code Snippets

            No Code Snippets are available at this moment for ARKit.

            Community Discussions

            QUESTION

            How can I record an ARKit scene but exclude UI elements?
            Asked 2021-May-31 at 02:54

            I'm using ARKit with Scenekit for rendering. I'd like to let users capture videos of the AR session so that they can save it to their photos or share it.

            Currently I'm using ARVideoKit for this, but the performance leaves something to be desired, and I've run into some difficult-to-work-around bugs. Other libraries I've found haven't been any better.

            ReplayKit seems like the ideal solution, but it records my entire app, including the user interface. Is there a way to get ReplayKit to record just the AR content while excluding the user interface?

            ...

            ANSWER

            Answered 2021-May-31 at 02:52

            You can use ReplayKit for this but it isn't very well documented. The key is that you render all of your UI elements in a separate UIWindow that is overlaid on top of a primary UIWindow that contains the AR content. ReplayKit only records the primary window, so with this structure the user interface elements will not show up in the recording.

            While there may be a better way to do this, here's an example of how I set up this window structure for my SwiftUI-based app. Here I use the UIWindow.level property to mark the AR content as the main window, while putting the UI into its own secondary window at a higher level:
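A sketch of that two-window setup follows. `ARContainerViewController` and `ControlsView` are placeholder names (not from the original answer) standing in for your AR view controller and SwiftUI overlay:

```swift
import UIKit
import SwiftUI

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var arWindow: UIWindow?   // primary window: AR content (what ReplayKit records)
    var uiWindow: UIWindow?   // secondary window: UI overlay (excluded from recording)

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }

        // Primary window at the normal level hosts the AR content.
        let arWindow = UIWindow(windowScene: windowScene)
        arWindow.rootViewController = ARContainerViewController()
        arWindow.windowLevel = .normal
        arWindow.isHidden = false
        self.arWindow = arWindow

        // Secondary window, one level higher, hosts the SwiftUI interface.
        let uiWindow = UIWindow(windowScene: windowScene)
        let host = UIHostingController(rootView: ControlsView())
        host.view.backgroundColor = .clear   // let the AR content show through
        uiWindow.rootViewController = host
        uiWindow.windowLevel = UIWindow.Level(rawValue: UIWindow.Level.normal.rawValue + 1)
        uiWindow.makeKeyAndVisible()
        self.uiWindow = uiWindow
    }
}
```

Because ReplayKit captures only the primary window, everything hosted in `uiWindow` stays out of the recording.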

            Source https://stackoverflow.com/questions/67767374

            QUESTION

            SceneKit / ARKit updating a node every frame
            Asked 2021-May-30 at 13:47

            I'm working with ARKit / SceneKit and I'm trying to have an arrow point to an arbitrary position I set in the world, but I'm having a bit of trouble. In my sceneView I have a scene set up to load in my arrow.

            ...

            ANSWER

            Answered 2021-May-30 at 13:47

            You could do so by using the renderer(_:updateAtTime:) delegate method, but I strongly recommend using an SCNConstraint.
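For instance, a sketch using SCNLookAtConstraint (the `arrowNode` and `sceneView` names are assumed, not from the original question):

```swift
import SceneKit

// An invisible target node marks the arbitrary world position to point at.
let targetNode = SCNNode()
targetNode.position = SCNVector3(1, 0, -2)
sceneView.scene.rootNode.addChildNode(targetNode)

// SCNLookAtConstraint re-orients the node toward the target every frame,
// so no per-frame delegate code is needed.
let lookAt = SCNLookAtConstraint(target: targetNode)
lookAt.isGimbalLockEnabled = true   // keep the arrow upright while turning
arrowNode.constraints = [lookAt]
```

Note that SCNLookAtConstraint aims the node's negative-Z axis at the target by default, so the arrow model's pivot may need adjusting to match.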

            Source https://stackoverflow.com/questions/67744721

            QUESTION

            Camera Intrinsics Resolution vs Real Screen Resolution
            Asked 2021-May-28 at 13:28

            I am writing an ARKit app where I need to use camera poses and intrinsics for 3D reconstruction.

            The camera Intrinsics matrix returned by ARKit seems to be using a different image resolution than mobile screen resolution. Below is one example of this issue

            The intrinsics matrix returned by ARKit is:

            [[1569.249512, 0, 931.3638306],[0, 1569.249512, 723.3305664],[0, 0, 1]]

            whereas the input image resolution is 750 (width) x 1182 (height). In this case the principal point appears to lie outside the image, which cannot be right; ideally it should be close to the image center. So the above intrinsics matrix probably corresponds to an image resolution of 1920 (width) x 1440 (height), which is completely different from the original image resolution.

            The questions are:

            • Whether the returned camera intrinsics belong to 1920x1440 image resolution?
            • If yes, how can I get the intrinsics matrix representing original image resolution i.e. 750x1182?
            ...

            ANSWER

            Answered 2021-May-28 at 13:28
            Intrinsics 3x3 matrix

            The camera intrinsics matrix converts between the 3D camera coordinate space and the 2D image plane. Here's a decomposition of an intrinsics matrix, where:

            • fx and fy are the focal lengths in pixels
            • xO and yO are the principal point offsets in pixels
            • s is the axis skew

            According to Apple Documentation:

            The values fx and fy are the pixel focal length, and are identical for square pixels. The values ox and oy are the offsets of the principal point from the top-left corner of the image frame. All values are expressed in pixels.

            So let's examine your data:
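As a rough sketch of how such a matrix could be rescaled to another resolution, assuming a pure resize with no crop (ARKit's display transform usually crops as well, so treat this as an approximation rather than the exact answer):

```swift
import simd

// Intrinsics ARKit reported for the 1920x1440 captured image (values from the question).
// simd_float3x3 is column-major: column 0 holds fx, column 1 fy, column 2 (ox, oy, 1).
var K = simd_float3x3(columns: (
    SIMD3<Float>(1569.249512, 0, 0),
    SIMD3<Float>(0, 1569.249512, 0),
    SIMD3<Float>(931.3638306, 723.3305664, 1)
))

// Pure rescale to 750x1182: multiply fx and ox by sx, fy and oy by sy.
let sx: Float = 750.0 / 1920.0
let sy: Float = 1182.0 / 1440.0
K.columns.0.x *= sx   // fx
K.columns.1.y *= sy   // fy
K.columns.2.x *= sx   // ox
K.columns.2.y *= sy   // oy
```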

            Source https://stackoverflow.com/questions/66893907

            QUESTION

            Adding multiple SCNNode(s) at the same time
            Asked 2021-May-21 at 04:36

            I have this function named addShapes. I want it to create 3 shapes

            ...

            ANSWER

            Answered 2021-May-21 at 04:36

            Your code works fine (the position was the problem):
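For illustration, here is one way to give each of the three nodes its own position so they don't overlap (the specific geometries and offsets are assumptions, not from the original question):

```swift
import SceneKit

// A possible addShapes: spread three shapes along the x axis, half a meter away.
func addShapes(to rootNode: SCNNode) {
    let geometries: [SCNGeometry] = [
        SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0),
        SCNSphere(radius: 0.05),
        SCNPyramid(width: 0.1, height: 0.1, length: 0.1)
    ]
    for (i, geometry) in geometries.enumerated() {
        let node = SCNNode(geometry: geometry)
        // Distinct positions: without this, all three render at the same point.
        node.position = SCNVector3(Float(i) * 0.3 - 0.3, 0, -0.5)
        rootNode.addChildNode(node)
    }
}
```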

            Source https://stackoverflow.com/questions/67629296

            QUESTION

            SCNNode is not showing up
            Asked 2021-May-18 at 21:05

            I'm new to Swift and ARKit. For some reason the SCNNode I'm trying to display is not showing up. I'm working with SwiftUI. In the next code block I define the function addNode, which should render the node.

            ...

            ANSWER

            Answered 2021-May-18 at 21:05

            Use this approach for SceneKitView:
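A minimal sketch of such a SceneKitView as a UIViewRepresentable (the scene setup and geometry are assumptions; the original answer's code was not preserved):

```swift
import SwiftUI
import SceneKit
import ARKit

struct SceneKitView: UIViewRepresentable {
    func makeUIView(context: Context) -> ARSCNView {
        let view = ARSCNView(frame: .zero)
        view.scene = SCNScene()
        view.autoenablesDefaultLighting = true
        view.session.run(ARWorldTrackingConfiguration())

        // Add the node once, when the view is created -- not on every SwiftUI update.
        let node = SCNNode(geometry: SCNSphere(radius: 0.05))
        node.position = SCNVector3(0, 0, -0.5)   // half a meter in front of the camera
        view.scene.rootNode.addChildNode(node)
        return view
    }

    func updateUIView(_ uiView: ARSCNView, context: Context) {}
}
```

A common pitfall is adding the node before the scene exists or inside updateUIView, where it can be added repeatedly or to a view that is about to be replaced.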

            Source https://stackoverflow.com/questions/67578800

            QUESTION

            Persistance with an ARWorldMap created with ARKit through Unity
            Asked 2021-May-17 at 07:21

            I am new to AR and using Unity, ARFoundation, and ARKit.

            Will my ARWorldMaps have persistence in an outdoor or indoor experience and will it be as effective as Azure? I will only be deploying on iOS so cross-platform is not important.

            ...

            ANSWER

            Answered 2021-May-17 at 07:21

            Saving an ARWorldMap is not rocket science. If this feature is supported in the ARKit extension for Unity, an ARWorldMap will be saved in any AR app the same way as expected. The main difference is that Unity builds for iOS are written in slower Objective-C, not in faster Swift for UIKit, and not in the fastest Swift for SwiftUI. On iOS, to store an ARWorldMap you must use NSKeyedArchiver, and to retrieve ARWorldMap data you must use NSKeyedUnarchiver.
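A sketch of that archive/unarchive cycle in native Swift (the function names and file URL are placeholders for your own persistence layer):

```swift
import ARKit

// Saving: capture the session's current world map and archive it to disk.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("No world map available: \(String(describing: error))")
            return
        }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true)
            try data.write(to: url, options: .atomic)
        } catch {
            print("Failed to save world map: \(error)")
        }
    }
}

// Loading: unarchive the map and rerun the session with it as initialWorldMap.
func loadWorldMap(from url: URL, into session: ARSession) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```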

            Source https://stackoverflow.com/questions/67560711

            QUESTION

            Disable AR Object occlusion in QLPreviewController
            Asked 2021-May-13 at 08:44

            I'm using QLPreviewController to show AR content. On the newer iPhones with LiDAR, it seems that object occlusion is enabled by default.

            Is there any way to disable object occlusion in QLPreviewController without having to build a custom ARKit view controller? Since my models are quite large (life-size buildings), they seem to disappear or get cut off at the end.

            ...

            ANSWER

            Answered 2021-May-13 at 08:44

            ARQuickLook is a library built for quick and high-quality AR visualization. It adopts the RealityKit engine, so all the features supported there, like occlusion, anchors, raytraced shadows, physics, depth of field, motion blur, HDR, etc., look the same way as they do in RealityKit.

            However, you can't turn these features on or off in QuickLook's API; they are on by default, if supported on your iPhone. If you want to turn People Occlusion on or off, you have to use the ARKit/RealityKit frameworks, not QuickLook.
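If you do move to your own view, a minimal RealityKit sketch of disabling scene-understanding occlusion might look like this (a sketch, not the QuickLook API):

```swift
import UIKit
import ARKit
import RealityKit

// Occlusion can only be toggled in your own ARView, not in QuickLook.
let arView = ARView(frame: .zero)

let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh   // available on LiDAR devices only
}
arView.session.run(config)

// Remove the occlusion option so reconstructed real-world geometry
// no longer hides virtual content.
arView.environment.sceneUnderstanding.options.remove(.occlusion)
```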

            Source https://stackoverflow.com/questions/67373239

            QUESTION

            SceneKit – Stretched texture on a Custom Geometry
            Asked 2021-May-12 at 13:01

            I want to tile the ground in ARKit using a custom polygon created from positions the user selects on a horizontal plane, but the tiles are stretched and won't display properly. Maybe the problem is with the texture coordinates. What's wrong with this code?

            ...

            ANSWER

            Answered 2021-May-12 at 12:13

            Texture stretching happens due to wrong texture mapping on the UV map. You have to use the m41 (translate X) and m42 (translate Y) elements, contained in the fourth column of SCNMatrix4. Here's how a stretch looks when the matrix element m41 equals zero:
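A sketch of adjusting those elements through the material's contentsTransform (the "tile" asset name and the scale/translation values are placeholders):

```swift
import SceneKit
import UIKit

// Control UV tiling and translation via the material's contentsTransform.
let material = SCNMaterial()
material.diffuse.contents = UIImage(named: "tile")
material.diffuse.wrapS = .repeat
material.diffuse.wrapT = .repeat

var transform = SCNMatrix4MakeScale(4, 4, 1)   // repeat the texture 4x4 across the UVs
transform.m41 = 0.5                            // translate X in UV space
transform.m42 = 0.25                           // translate Y in UV space
material.diffuse.contentsTransform = transform
```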

            Source https://stackoverflow.com/questions/67476504

            QUESTION

            Rounding positional data from ARKit
            Asked 2021-May-08 at 21:01

            I have this code that gets X, Y, Z positions from each frame in ARKit.

            ...

            ANSWER

            Answered 2021-May-08 at 21:01
            Rounding to Meters

            For that you can use three methods: round(_:), ceil(_:), and floor(_:).
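A small illustration (the sample coordinates are made up):

```swift
import Foundation

// Positions from ARKit are in meters; round(_:), ceil(_:) and floor(_:) work per axis.
let x: Float = 1.274, y: Float = -0.568, z: Float = 2.803

// Nearest whole meter:
let meters = (round(x), round(y), round(z))   // (1.0, -1.0, 3.0)

// Centimeter precision: scale up, round, scale back down.
let centimeters = ((x * 100).rounded() / 100,
                   (y * 100).rounded() / 100,
                   (z * 100).rounded() / 100)
```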

            Source https://stackoverflow.com/questions/67394150

            QUESTION

            ARKit color correction of captured image in low light scenes
            Asked 2021-Apr-30 at 03:18

            I have an ARKit app that does the following:

            • Render the frame's captured image to a texture.
            • Apply this texture to a scenekit object in the AR scene.

            I use this to create a virtual object that perfectly blends into the AR scene.

            My current approach works great for well-lit scenes, but in dark scenes the texture on the virtual object becomes subtly different from the current scene background. What causes this, and is there any way to fix it?

            Details

            I've created this branch on a demo project which demonstrates the issue.

            The project renders a face model that is textured with the frame's currentImage. The result should be that the face model effectively becomes invisible, even though it is still being rendered. However, in low-light situations you can clearly see where the background image ends and the face model starts.

            Here's an overview of the shader I use to capture the texture

            ...

            ANSWER

            Answered 2021-Apr-29 at 13:48

            You are using an approximate gamma correction; it's not the correct conversion between RGB and sRGB. What you are really trying to do is circumvent SceneKit's default pixel format (sRGB). In the fragment shader, after the YCbCr-to-RGB conversion you have linear RGB, but when you write to a texture from the fragment shader, that value is interpreted as sRGB, so an RGB-to-sRGB conversion happens (essentially pow(x, 1/2.4), which is why you tried to compensate with an approximate gamma correction). You need to apply the inverse to circumvent this, as if you were going from sRGB to linear. RGB/sRGB conversions can be confusing because things sometimes happen under the hood that you might not be aware of. So the fix is, instead of your gamma correction, do this:
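The two transfer functions involved, shown here in Swift for clarity (in the project itself this decode step would live in the Metal fragment shader):

```swift
import Foundation

// The exact sRGB -> linear transfer function: applying this before writing
// cancels out SceneKit's automatic linear -> sRGB encode on the texture write.
func srgbToLinear(_ c: Float) -> Float {
    c <= 0.04045 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
}

// Its inverse, linear -> sRGB, is what SceneKit applies on write.
func linearToSRGB(_ c: Float) -> Float {
    c <= 0.0031308 ? c * 12.92 : 1.055 * pow(c, 1.0 / 2.4) - 0.055
}
```

Applying srgbToLinear per channel in the shader, instead of an approximate pow-based gamma, makes the round trip through SceneKit's sRGB write a no-op, which is why the mismatch only shows up in dark scenes where the approximation error is largest.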

            Source https://stackoverflow.com/questions/67278934

            Community Discussions and Code Snippets include sources from the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install ARKit

            Just clone the repo and build it!

            Support

            Help users recognize when your app is ready for real-world interactions. Tracking the real-world environment involves complex algorithms whose timeliness and accuracy are affected by real-world conditions. The FocusSquare class in this example project draws a square outline in the AR view, giving the user hints about the status of ARKit world tracking. The square changes size to reflect estimated scene depth, and switches between open and closed states with a "lock" animation to indicate whether ARKit has detected a plane suitable for placing an object.

            Use the session(_:cameraDidChangeTrackingState:) delegate method to detect changes in tracking quality, and present feedback to the user when low-quality conditions are correctable (for example, by telling the user to move to an environment with better lighting). Use specific terms a user is likely to recognize. For example, if you give textual feedback for plane detection, a user not familiar with technical definitions might mistake the word "plane" as referring to aircraft.

            Fall back gracefully if tracking fails, and allow the user to reset tracking if their experience isn't working as expected. See the restartExperience button and method in this example's ViewController class. The use3DOFTrackingFallback variable controls whether to switch to a lower-fidelity session configuration when tracking quality is poor.

            Help users understand the relationship of your app's virtual content to the real world. Use visual cues in your UI that react to changes in camera position relative to virtual content. The focus square disappears after the user places an object in the scene, and reappears when the user points the camera away from the object. The Plane class in this example handles visualization of real-world planes detected by ARKit. Its createOcclusionNode and updateOcclusionNode methods create invisible geometry that realistically obscures virtual content.
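The tracking-state feedback described above can be sketched like this (statusLabel is a placeholder UILabel, not a name from the example project):

```swift
import ARKit

// ARSessionObserver callback: map each tracking state to user-facing guidance.
func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
    switch camera.trackingState {
    case .normal:
        statusLabel.text = ""   // tracking is good; no message needed
    case .notAvailable:
        statusLabel.text = "Tracking unavailable."
    case .limited(.initializing):
        statusLabel.text = "Initializing AR session."
    case .limited(.excessiveMotion):
        statusLabel.text = "Move the device more slowly."
    case .limited(.insufficientFeatures):
        statusLabel.text = "Point at a surface with more detail, or improve the lighting."
    case .limited:
        statusLabel.text = "Limited tracking."
    }
}
```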
            CLONE
          • HTTPS

            https://github.com/ignacio-chiazzo/ARKit.git

          • CLI

            gh repo clone ignacio-chiazzo/ARKit

          • SSH

            git@github.com:ignacio-chiazzo/ARKit.git


            Consider Popular Augmented Reality Libraries

            AR.js by jeromeetienne
            ar-cutpaste by cyrildiagne
            aframe by aframevr
            engine by playcanvas
            Awesome-ARKit by olucurious

            Try Top Libraries by ignacio-chiazzo

            Algorithms-Leetcode-Javascript by ignacio-chiazzo (JavaScript)
            ruby_whatsapp_sdk by ignacio-chiazzo (Ruby)
            ngImgCropTool-Examples by ignacio-chiazzo (HTML)
            ionic_seed by ignacio-chiazzo (JavaScript)