arkit | JavaScript architecture diagrams and dependency graphs | Architecture library
kandi X-RAY | arkit Summary
JavaScript architecture diagrams and dependency graphs
arkit Examples and Code Snippets
import UIKit
import ARKit
import SceneKit

// Main ARKit view controller
class ViewController: UIViewController, ARSCNViewDelegate, ARSessionDelegate {

    var textNode: SCNNode!

    // Update the SCNText geometry attached to textNode.
    func updateText(to string: String) {
        guard let textGeometry = textNode.geometry as? SCNText else { return }
        textGeometry.font = UIFont(name: "Helvetica", size: 3)
        textGeometry.string = string
    }
}

// e.g. updateText(to: "I Have Changed The String")
import UIKit
import ARKit
import AVFoundation

class ARTransVC: UIViewController {

    @IBOutlet weak var arSceneView: ARSCNView!
    let configuration = ARWorldTrackingConfiguration()

    private var player: AVPlayer = {
        // Resource name below is a placeholder for the app's own video file.
        guard let url = Bundle.main.url(forResource: "video", withExtension: "mp4")
        else { fatalError("missing video resource") }
        return AVPlayer(url: url)
    }()
}
Trending Discussions on arkit
QUESTION
I'm using ARKit with Scenekit for rendering. I'd like to let users capture videos of the AR session so that they can save it to their photos or share it.
Currently I'm using ARVideoKit for this, but the performance leaves something to be desired and I've run into some difficult-to-work-around bugs. Other libraries I've found haven't been any better.
ReplayKit seems like the ideal solution, but it records my entire app, including the user interface. Is there a way to get ReplayKit to record just the AR content while excluding the user interface?
ANSWER
Answered 2021-May-31 at 02:52
You can use ReplayKit for this, but it isn't very well documented. The key is that you render all of your UI elements in a separate UIWindow that is overlaid on top of a primary UIWindow containing the AR content. ReplayKit only records the primary window, so with this structure the user interface elements will not show up in the recording.
While there may be a better way to do this, here's an example of how I set up this window structure for my SwiftUI-based app. Here I use the UIWindow.level property to mark the AR content as the main window, while putting the UI into its own secondary window at a higher level:
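A minimal sketch of that structure, assuming a scene-delegate-based lifecycle; OverlayView and ARContentViewController are hypothetical stand-ins for the app's real content:

import SwiftUI
import UIKit

// Hypothetical stand-ins for the app's real views:
struct OverlayView: View { var body: some View { Text("UI overlay") } }
class ARContentViewController: UIViewController {}  // hosts the AR scene view

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var arWindow: UIWindow?  // primary window: AR content, recorded by ReplayKit
    var uiWindow: UIWindow?  // secondary window: UI overlay, excluded from recording

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }

        // AR content goes in the key window at the normal level.
        let ar = UIWindow(windowScene: windowScene)
        ar.windowLevel = .normal
        ar.rootViewController = ARContentViewController()
        ar.makeKeyAndVisible()
        arWindow = ar

        // UI elements live in a second, transparent window one level above.
        let ui = UIWindow(windowScene: windowScene)
        ui.windowLevel = UIWindow.Level(rawValue: UIWindow.Level.normal.rawValue + 1)
        ui.rootViewController = UIHostingController(rootView: OverlayView())
        ui.rootViewController?.view.backgroundColor = .clear
        ui.isHidden = false
        uiWindow = ui
    }
}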
QUESTION
I'm working with ARKit / SceneKit and I'm trying to have an arrow point to an arbitrary position I set in the world, but I'm having a bit of trouble. In my sceneView I have a scene set up to load in my arrow.
ANSWER
Answered 2021-May-30 at 13:47
You could do so by using the updateAtTime delegate function, but I strongly recommend using an SCNConstraint.
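A minimal sketch of the constraint approach, assuming arrowNode is the arrow already loaded into the scene and the target position is arbitrary:

import SceneKit

// Point arrowNode at an arbitrary world position using SCNLookAtConstraint.
func pointArrow(_ arrowNode: SCNNode, at worldPosition: SCNVector3, in scene: SCNScene) {
    // An empty node marking the target position.
    let targetNode = SCNNode()
    targetNode.position = worldPosition
    scene.rootNode.addChildNode(targetNode)

    // SceneKit re-orients the arrow toward the target every frame.
    let lookAt = SCNLookAtConstraint(target: targetNode)
    lookAt.isGimbalLockEnabled = true  // keeps the arrow from rolling
    arrowNode.constraints = [lookAt]
}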
QUESTION
I am writing an ARKit app where I need to use camera poses and intrinsics for 3D reconstruction.
The camera intrinsics matrix returned by ARKit seems to use a different image resolution than the mobile screen resolution. Below is one example of this issue.
Intrinsics matrix returned by ARKit is :
[[1569.249512, 0, 931.3638306],[0, 1569.249512, 723.3305664],[0, 0, 1]]
whereas the input image resolution is 750 (width) x 1182 (height). In this case the principal point appears to lie outside the image, which cannot be right; ideally it should be close to the image center. So the intrinsic matrix above is probably based on an image resolution of 1920 (width) x 1440 (height), which is completely different from the original image resolution.
The questions are:
- Whether the returned camera intrinsics belong to 1920x1440 image resolution?
- If yes, how can I get the intrinsics matrix representing original image resolution i.e. 750x1182?
ANSWER
Answered 2021-May-28 at 13:28
The intrinsic camera matrix converts between the 2D camera plane and 3D world coordinate space. It has the form [[fx, s, xO], [0, fy, yO], [0, 0, 1]], where:
- fx and fy are the focal length in pixels
- xO and yO are the principal point offset in pixels
- s is an axis skew
According to Apple Documentation:
The values fx and fy are the pixel focal length, and are identical for square pixels. The values ox and oy are the offsets of the principal point from the top-left corner of the image frame. All values are expressed in pixels.
So let's examine what your data shows:
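For instance, here's a sketch of rescaling the 1920x1440 intrinsics to another resolution. This assumes the smaller image is a uniformly scaled version of the captured frame; a crop would additionally shift the principal point.

import simd

// Rescale camera intrinsics from the capture resolution to a target resolution.
func scaledIntrinsics(_ K: simd_float3x3,
                      from captureSize: SIMD2<Float>,
                      to targetSize: SIMD2<Float>) -> simd_float3x3 {
    let scale = targetSize / captureSize
    var scaled = K
    scaled[0][0] *= scale.x  // fx
    scaled[1][1] *= scale.y  // fy
    scaled[2][0] *= scale.x  // xO (simd matrices are column-major)
    scaled[2][1] *= scale.y  // yO
    return scaled
}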
QUESTION
I have this function named addShapes. I want it to create 3 shapes.
ANSWER
Answered 2021-May-21 at 04:36
Your code works fine; the position was the problem:
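A sketch of a working addShapes along those lines, with each shape offset so all three are visible; the geometry, color, and positions here are illustrative:

import SceneKit
import UIKit

// Add three spheres, each offset along X so they don't overlap.
func addShapes(to scene: SCNScene) {
    for i in 0..<3 {
        let sphere = SCNSphere(radius: 0.05)
        sphere.firstMaterial?.diffuse.contents = UIColor.systemBlue
        let node = SCNNode(geometry: sphere)
        node.position = SCNVector3(Float(i) * 0.2 - 0.2, 0, -0.5)
        scene.rootNode.addChildNode(node)
    }
}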
QUESTION
I'm new to Swift and ARKit. For some reason the SCNNode I'm trying to display is not showing up. I'm working with SwiftUI. In the next code block I defined the function addNode, which should render the node.
ANSWER
Answered 2021-May-18 at 21:05
Use this approach for SceneKitView:
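A minimal sketch of such a SceneKitView, assuming a UIViewRepresentable wrapper; the box node is illustrative:

import SwiftUI
import ARKit

struct SceneKitView: UIViewRepresentable {
    func makeUIView(context: Context) -> ARSCNView {
        let view = ARSCNView(frame: .zero)
        view.scene = SCNScene()
        view.session.run(ARWorldTrackingConfiguration())

        // Add the node once the view exists, half a meter in front of the camera.
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(0, 0, -0.5)
        view.scene.rootNode.addChildNode(box)
        return view
    }

    func updateUIView(_ uiView: ARSCNView, context: Context) {}
}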
QUESTION
I am new to AR and using Unity, ARFoundation, and ARKit.
Will my ARWorldMaps persist in an outdoor or indoor experience, and will it be as effective as Azure? I will only be deploying on iOS, so cross-platform is not important.
ANSWER
Answered 2021-May-17 at 07:21
Saving an ARWorldMap is not rocket science. If this feature is supported in the ARKit extension for Unity, an ARWorldMap will be saved in any AR app the same way as expected. The main difference is that Unity builds for iOS are written in slow Objective-C, not in faster Swift for UIKit, nor in the fastest Swift for SwiftUI. In iOS, to store an ARWorldMap you must use NSKeyedArchiver, and to retrieve ARWorldMap data you must use NSKeyedUnarchiver.
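A minimal sketch of that archiving round trip in Swift; the file URL handling is left to the app:

import ARKit

// Persist an ARWorldMap to disk with NSKeyedArchiver.
func save(worldMap: ARWorldMap, to url: URL) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                requiringSecureCoding: true)
    try data.write(to: url, options: .atomic)
}

// Restore it later with NSKeyedUnarchiver.
func loadWorldMap(from url: URL) throws -> ARWorldMap {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else {
        throw CocoaError(.coderReadCorrupt)
    }
    return map
}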
QUESTION
I'm using QLPreviewController to show AR content. On the newer iPhones with LiDAR, it seems that object occlusion is enabled by default.
Is there any way to disable object occlusion in QLPreviewController without having to build a custom ARKit view controller? Since my models are quite large (life-size buildings), they seem to disappear or get cut off at the end.
ANSWER
Answered 2021-May-13 at 08:44
ARQuickLook is a library built for quick, high-quality AR visualization. It adopts the RealityKit engine, so all of the features supported there, such as occlusion, anchors, raytraced shadows, physics, depth of field, motion blur, and HDR, look the same as they do in RealityKit.
However, you can't turn these features on or off through QuickLook's API; they are on by default if your iPhone supports them. If you want to toggle People Occlusion, you have to use the ARKit/RealityKit frameworks, not QuickLook.
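For example, here's a sketch of the RealityKit-level control that QuickLook does not expose, using a custom ARView:

import ARKit
import RealityKit

// Turn occlusion off in a custom RealityKit ARView.
func disableOcclusion(in arView: ARView) {
    // Remove LiDAR-based scene occlusion.
    arView.environment.sceneUnderstanding.options.remove(.occlusion)

    // Re-run the session with a fresh configuration; no person-segmentation
    // frame semantics are requested, so people won't occlude content either.
    let config = ARWorldTrackingConfiguration()
    arView.session.run(config)
}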
QUESTION
I want to tile the ground with ARKit using a custom polygon built from positions the user selects on a horizontal plane, but the tiles are stretched and won't display properly. Maybe the problem is the texture coordinates. What's wrong with this code?
ANSWER
Answered 2021-May-12 at 12:13
Texture stretching happens due to wrong texture mapping on the UV map. You have to use the m41 (translate X) and m42 (translate Y) elements, contained in the fourth column of SCNMatrix4. Here's what a stretch looks like when the matrix element m41 equals zero:
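A sketch of adjusting those matrix elements on a material's contentsTransform; the repeat counts and offsets here are illustrative:

import SceneKit

// Tile the texture instead of stretching it across the polygon.
func applyTiling(to material: SCNMaterial, repeatX: Float, repeatY: Float) {
    material.diffuse.wrapS = .repeat
    material.diffuse.wrapT = .repeat

    // The scale repeats the texture; m41/m42 translate it in UV space.
    var transform = SCNMatrix4MakeScale(repeatX, repeatY, 1)
    transform.m41 = 0.5  // translate X (illustrative offset)
    transform.m42 = 0.5  // translate Y (illustrative offset)
    material.diffuse.contentsTransform = transform
}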
QUESTION
I have this code that gets X, Y, Z positions from each frame in ARKit.
ANSWER
Answered 2021-May-08 at 21:01
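For context, a minimal sketch of reading per-frame camera X, Y, Z via the standard ARSessionDelegate callback:

import ARKit

class PositionReader: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // The camera's position is the translation column of its 4x4 transform.
        let t = frame.camera.transform.columns.3
        print("x: \(t.x), y: \(t.y), z: \(t.z)")
    }
}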
QUESTION
I have an ARKit app that does the following:
- Render the frame's captured image to a texture.
- Apply this texture to a SceneKit object in the AR scene.
I use this to create a virtual object that perfectly blends into the AR scene.
My current approach works great for well-lit scenes, but in dark scenes the texture on the virtual object becomes subtly different from the current scene background. What causes this, and is there any way to fix it?
Details
I've created this branch on a demo project, which demonstrates the issue.
The project renders a face model textured with the frame's currentImage. The result should be that the face model effectively becomes invisible, even though it is still being rendered. However, in low-light situations you can clearly see where the background image ends and the face model starts.
Here's an overview of the shader I use to capture the texture
ANSWER
Answered 2021-Apr-29 at 13:48
You are using an approximate gamma correction; it's not the correct conversion between RGB and sRGB. What you are really trying to do is circumvent SceneKit's default pixel format (sRGB). In the fragment shader, after the YCbCr-to-RGB conversion you have linear RGB, but when you write that value to a texture from the fragment shader it is interpreted as sRGB, so an RGB-to-sRGB conversion happens (essentially pow 1/2.4, which is why your approximate gamma correction almost works). You need to do the inverse to circumvent this, as if you were going from sRGB to linear. RGB/sRGB conversions can be confusing, because things sometimes happen under the hood that you might not be aware of. So the fix is: instead of your gamma correction, do this:
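That fix is the exact piecewise sRGB-to-linear conversion rather than an approximate power curve. Sketched here in Swift for clarity; in the real project it would live in the fragment shader:

import Foundation

// Exact sRGB-to-linear conversion (the inverse of the conversion applied on
// the texture write), unlike the approximate pow(x, 2.2) gamma curve.
func srgbToLinear(_ s: Float) -> Float {
    return s <= 0.04045 ? s / 12.92 : pow((s + 0.055) / 1.055, 2.4)
}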
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported