lovr-oculus-mobile | Android app for hosting lovr | Augmented Reality library
kandi X-RAY | lovr-oculus-mobile Summary
This is a repository for building LovrApp, a standalone Android app which is based on the LÖVR VR API.
lovr-oculus-mobile Key Features
lovr-oculus-mobile Examples and Code Snippets
abiFilters 'arm64-v8a'
abiFilters 'armeabi-v7a','arm64-v8a'
keytool -genkey -v -keystore YOURNAME-key.jks -keyalg RSA -keysize 2048 -validity 10000 -alias YOURNAME
Community Discussions
Trending Discussions on Augmented Reality
QUESTION
I have an iOS app with deployment target iOS 10+. I need to add some features that depend on RealityKit, which should appear only for users on iOS 13+. The app compiles and runs successfully on a real device, but when archiving for upload to the App Store, Xcode generates a Swift file and says:
...ANSWER
Answered 2022-Mar-10 at 15:04
Do not include Reality Composer's .rcproject files in your archive for distribution. .rcproject bundles contain code with iOS 13.0+ classes, structs and enums. Instead, supply your project with USDZ files.
To allow iOS 13+ users to use RealityKit features, but still allow non-AR users to run this app starting from iOS 10.0, use the following code:
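A minimal sketch of such an availability gate, assuming a hypothetical ARFeatureViewController that hosts the RealityKit content; the @available and #available checks keep RealityKit code off the iOS 10–12 execution path:

```swift
import UIKit

// RealityKit-dependent code is isolated behind an availability annotation,
// so the app still links and runs on iOS 10.
@available(iOS 13.0, *)
class ARFeatureViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Set up ARView / RealityKit content here.
    }
}

class MainViewController: UIViewController {
    func openARFeature() {
        if #available(iOS 13.0, *) {
            // iOS 13+ users get the RealityKit experience.
            present(ARFeatureViewController(), animated: true)
        } else {
            // Pre-iOS 13 users see a graceful fallback instead.
            let alert = UIAlertController(title: "Not supported",
                                          message: "AR features require iOS 13 or later.",
                                          preferredStyle: .alert)
            alert.addAction(UIAlertAction(title: "OK", style: .default))
            present(alert, animated: true)
        }
    }
}
```

The class and message names are placeholders; the essential point is that nothing referencing RealityKit types is reachable below iOS 13.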
QUESTION
I want to use RealityKit's AnchorEntity initialized with an Anchoring Component Target of type Image. Here is what I am trying to do:
...ANSWER
Answered 2022-Jan-08 at 20:36
You can use AnchorEntity(.image(...)) in SwiftUI almost the same way as you used it in UIKit. First, click Assets.xcassets in the Project Navigator pane and create an AR Resources folder. Drag your image there. Set up its physical size in the Inspector. Then copy/paste the code:
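A sketch of such a SwiftUI wrapper, assuming a reference image named "photo" in the AR Resources group (both names are placeholders):

```swift
import SwiftUI
import RealityKit

// SwiftUI wrapper around ARView; the image "photo" is assumed to live in
// the AR Resources group of Assets.xcassets.
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)

        // The anchor tracks the reference image named "photo".
        let anchor = AnchorEntity(.image(group: "AR Resources", name: "photo"))

        // A simple model to visualize the anchor.
        let box = ModelEntity(mesh: .generateBox(size: 0.1))
        anchor.addChild(box)
        arView.scene.anchors.append(anchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```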
QUESTION
Is it possible to import a virtual lamp object into the AR scene, that projects a light cone, which illuminates the surrounding space in the room and the real objects in it, e.g. a table, floor, walls?
For ARKit, I found this SO post.
For ARCore, there is an example of relighting technique. And this source code.
I have also been suggested that post-processing can be used to brighten the whole scene.
However, these examples are from a while ago, and perhaps there is a newer or more straightforward solution to this problem?
...ANSWER
Answered 2021-Dec-16 at 17:25
At the low level, RealityKit is only responsible for rendering virtual objects and overlaying them on top of the camera frame. If you want to illuminate the real scene, you need to post-process the camera frame.
Here are some tutorials on how to do post-processing: Tutorial 1, Tutorial 2.
If all you need is an effect like this, then all you need to do is add a CGImage-based post-processing effect for the virtual object (the lights).
More specifically, add a bloom filter to the rendered image (you can also simulate a bloom filter with Gaussian blur). In this way, the code is all built around UIImage and CGImage, so it's pretty simple.
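A rough sketch of that idea using Core Image's built-in bloom filter; the function name and parameters are illustrative, and `input` is assumed to be the CGImage of the rendered virtual lights:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

// Apply a Core Image bloom filter to a rendered frame.
func applyBloom(to input: CGImage,
                intensity: Float = 1.0,
                radius: Float = 10.0) -> UIImage? {
    let ciInput = CIImage(cgImage: input)

    // CIBloom brightens and blurs highlights, approximating light glow.
    let bloom = CIFilter.bloom()
    bloom.inputImage = ciInput
    bloom.intensity = intensity
    bloom.radius = radius

    guard let output = bloom.outputImage else { return nil }
    let context = CIContext()
    guard let cgOutput = context.createCGImage(output, from: ciInput.extent) else { return nil }
    return UIImage(cgImage: cgOutput)
}
```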
If you want to be more realistic, consider using the depth map provided by LiDAR to calculate which areas can be illuminated for a more detailed brightness.
Or, if you're a true explorer, you can use Metal to create a real-world digital-twin point cloud in real time to simulate occlusion of light.
QUESTION
I want to show an image from the gallery. I am loading the image using imagePicker.
...ANSWER
Answered 2021-Dec-12 at 08:44
Try this. Take into consideration that a tint color is multiplied by an image – so, if the tint's RGBA = [1,1,1,1], the result of the multiplication will be the image itself (without tinting)...
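A small sketch of one way to avoid unwanted tinting in SwiftUI, assuming a hypothetical `pickedImage` obtained from the image picker (the view name is a placeholder):

```swift
import SwiftUI

// Display a picked UIImage without SwiftUI's template tinting.
struct PickedImageView: View {
    let pickedImage: UIImage   // assumed to come from the image picker

    var body: some View {
        Image(uiImage: pickedImage)
            .renderingMode(.original)   // keep original colors, ignore tint
            .resizable()
            .scaledToFit()
    }
}
```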
QUESTION
I am developing a website related to Augmented Reality, but my mobile phone (Samsung Galaxy M02S) does not support AR. When I try to install ARCore, the Google Play Store shows an error.
So I installed ARCore using an APK, and the app was installed. After that, I checked whether my phone supported AR. The notification below was shown when I tried to view AR.
How do I fix this issue, and is there any other way to install ARCore on my phone?
...ANSWER
Answered 2021-Nov-23 at 13:58
ARCore requires some specific hardware to work. You can check the list of supported devices here. No amount of sideloading will help, because this is a hardware requirement issue. Moreover, ARCore is under active development; even if you somehow install a version that might work, that version will soon be deprecated and you will start getting a popup saying you need to update.
Kindly use a device that is on the supported list, or an emulator that supports this. IMHO it is best to develop using a device that is supported by the ARCore team.
QUESTION
I'm facing an issue where SCNView.hitTest does not detect hits against geometry that I'm modifying dynamically on the CPU.
Here's the overview: I have a node that uses an SCNGeometry created from a MTLBuffer of vertices:
ANSWER
Answered 2021-Aug-13 at 15:46
When you perform a hit-test search, SceneKit looks for SCNGeometry objects along the ray you specify. For each intersection between the ray and a geometry, SceneKit creates a hit-test result to provide information about both the SCNNode object containing the geometry and the location of the intersection on the geometry's surface.
The problem in your case is that when you modify the buffer's contents (MTLBuffer) at render time, SceneKit does not know about it, and therefore cannot update the SCNGeometry object that is used for performing the hit-test.
So the only way I can see to solve this issue is to recreate your SCNGeometry object.
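A sketch of that recreation step, assuming `vertexBuffer` and `vertexCount` hold the CPU-updated vertices and `element` is the existing index element (all names are assumptions, not from the original question):

```swift
import SceneKit
import Metal

// Rebuild the node's SCNGeometry from the current vertex data so that
// hit-testing sees the modified mesh.
func rebuildGeometry(for node: SCNNode,
                     vertexBuffer: MTLBuffer,
                     vertexCount: Int,
                     element: SCNGeometryElement) {
    // Copy the buffer's bytes into Data that SceneKit owns and can inspect.
    let data = Data(bytes: vertexBuffer.contents(),
                    count: vertexCount * MemoryLayout<SIMD3<Float>>.stride)

    let source = SCNGeometrySource(data: data,
                                   semantic: .vertex,
                                   vectorCount: vertexCount,
                                   usesFloatComponents: true,
                                   componentsPerVector: 3,
                                   bytesPerComponent: MemoryLayout<Float>.size,
                                   dataOffset: 0,
                                   dataStride: MemoryLayout<SIMD3<Float>>.stride)

    // Recreating the geometry lets SCNView.hitTest use the new vertices.
    node.geometry = SCNGeometry(sources: [source], elements: [element])
}
```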
QUESTION
I am trying to display a .reality file created using Reality Composer. The code below works for .usdz but not for .reality. Here is my code:
ANSWER
Answered 2021-Jul-26 at 06:44
Uploading a .reality model from the web works fine. You can easily check this in the Xcode Simulator:
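A hedged sketch of loading a remote .reality file; the URL and flow are illustrative, not from the original answer. RealityKit's Entity.loadAnchor(contentsOf:) reads the file once it has been downloaded locally:

```swift
import RealityKit
import Foundation

// Illustrative only: download a .reality file, then load it into an ARView.
// The URL is a placeholder; error handling is kept minimal. Note that you
// may need to move the temp file to a path ending in ".reality" first.
func loadRealityScene(into arView: ARView) {
    guard let url = URL(string: "https://example.com/scene.reality") else { return }

    // RealityKit loads from file URLs, so fetch the model to disk first.
    URLSession.shared.downloadTask(with: url) { localURL, _, _ in
        guard let localURL = localURL else { return }
        DispatchQueue.main.async {
            // loadAnchor reads the archived anchor entity from the file.
            if let anchor = try? Entity.loadAnchor(contentsOf: localURL) {
                arView.scene.addAnchor(anchor)
            }
        }
    }.resume()
}
```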
QUESTION
I am writing an ARKit app where I need to use camera poses and intrinsics for 3D reconstruction.
The camera Intrinsics matrix returned by ARKit seems to be using a different image resolution than mobile screen resolution. Below is one example of this issue
Intrinsics matrix returned by ARKit is :
[[1569.249512, 0, 931.3638306],[0, 1569.249512, 723.3305664],[0, 0, 1]]
whereas the input image resolution is 750 (width) x 1182 (height). In this case, the principal point seems to be outside the image, which cannot be possible; it should ideally be close to the image center. So the above intrinsics matrix might be using an image resolution of 1920 (width) x 1440 (height), which is completely different from the original image resolution.
The questions are:
- Do the returned camera intrinsics correspond to a 1920x1440 image resolution?
- If yes, how can I get the intrinsics matrix for the original image resolution, i.e. 750x1182?
ANSWER
Answered 2021-May-28 at 13:28
The intrinsics camera matrix converts between the 2D camera plane and 3D world coordinate space. Here's a decomposition of an intrinsic matrix, where:
- fx and fy are the focal length in pixels
- xO and yO are the principal point offset in pixels
- s is the axis skew
According to Apple Documentation:
The values fx and fy are the pixel focal length, and are identical for square pixels. The values ox and oy are the offsets of the principal point from the top-left corner of the image frame. All values are expressed in pixels.
So let's examine what your data is:
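A small numeric sketch of that examination: the principal point (931, 723) sits close to the center of a 1920x1440 frame, which suggests the intrinsics describe the native capture resolution. A naive rescale to another resolution (ignoring any cropping the view applies, so treat it as an approximation only) looks like this:

```swift
import simd

// The ARKit intrinsics from the question, for the 1920x1440 capture:
let fx: Float = 1569.249512, fy: Float = 1569.249512
let ox: Float = 931.3638306, oy: Float = 723.3305664   // near (960, 720), the 1920x1440 center

// Naive per-axis rescale to the 750x1182 display resolution. The aspect
// ratios differ, so the on-screen frame is also cropped; a full answer
// must account for that crop as well.
let scaleX: Float = 750.0 / 1920.0
let scaleY: Float = 1182.0 / 1440.0

let scaled = simd_float3x3(rows: [
    SIMD3<Float>(fx * scaleX, 0,           ox * scaleX),
    SIMD3<Float>(0,           fy * scaleY, oy * scaleY),
    SIMD3<Float>(0,           0,           1)
])
```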
QUESTION
I have this function named addShapes
. I want it to create 3 shapes
ANSWER
Answered 2021-May-21 at 04:36
Your code works fine (the position was the problem):
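A hypothetical reconstruction of such an addShapes function in RealityKit (the original code is not shown in this excerpt): each entity gets its own distinct position so the three shapes don't stack at the origin.

```swift
import RealityKit

// Hypothetical sketch: place three boxes side by side under one anchor.
func addShapes(in anchor: AnchorEntity) {
    for i in 0..<3 {
        let box = ModelEntity(mesh: .generateBox(size: 0.1),
                              materials: [SimpleMaterial(color: .blue, isMetallic: false)])
        // Offset each shape along X so they don't overlap.
        box.position = SIMD3<Float>(Float(i) * 0.15, 0, 0)
        anchor.addChild(box)
    }
}
```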
QUESTION
I'm converting the ARMeshAnchor data to a mesh using SCNGeometrySource, which works fine, but sometimes (about 3 times out of 10) I get a bad_access from the SceneKit renderer.
...ANSWER
Answered 2021-Feb-07 at 11:58
It occurs because ARMeshAnchors constantly update their data as ARKit refines its understanding of the real world. All ARMeshAnchors are dynamic anchors. However, their mesh's subsequent changes are not intended to be reflected in real time.
If you want to duplicate your ARMeshAnchors collection, use the following code:
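A sketch of one way to take such a snapshot (the function name is an assumption, not from the original answer); copying each anchor gives you data that ARKit's later refinements won't mutate underneath you:

```swift
import ARKit

// Snapshot the current mesh anchors so later updates by ARKit don't
// mutate the data you're converting to SceneKit geometry.
func snapshotMeshAnchors(from session: ARSession) -> [ARMeshAnchor] {
    guard let anchors = session.currentFrame?.anchors else { return [] }
    // ARAnchor adopts NSCopying; copy() yields an independent duplicate
    // of each anchor's current state.
    return anchors
        .compactMap { $0 as? ARMeshAnchor }
        .compactMap { $0.copy() as? ARMeshAnchor }
}
```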
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install lovr-oculus-mobile
The submodule cmakelib/lovr has submodules. If you did not initially clone this repo with --recurse-submodules, you will need to run (cd cmakelib/lovr && git submodule init && git submodule update) before doing anything else.
Install Android Studio
Open Android Studio, go into Preferences, search in the box for "SDK" (or from the "Welcome to Android Studio" box, choose "Configure"->"SDK Manager"). Use the "Android SDK" pane and the "SDK Platforms" tab to download Android API level 23. Next, navigate to the "SDK Tools" tab of the same pane and check "Show Package Details". Under "NDK (Side by Side)" select "21.0.6113669", and under "CMake" check "3.6.4111459" (or whichever CMake is newest). Hit "Apply". Now quit Android Studio (we'll be doing the next part at the command line).
Follow the additional platform steps below:
You need to build the gradle script in cmakelib, then run the installDebug target of the gradle script in LovrApp/Projects/Android. You can do this with the gradlew script in the root, but it will need the Android tools in PATH and the SDK install location in ANDROID_HOME. You can run this at the Bash prompt from the repository root to do all of this:
(export PATH="/Applications/Android Studio.app/Contents/jre/jdk/Contents/Home/bin":~/Library/Android/sdk/platform-tools:$PATH ANDROID_HOME=~/Library/Android/sdk GRADLE=`pwd`/gradlew; (cd cmakelib && $GRADLE build) && (cd LovrApp/Projects/Android && $GRADLE installDebug)) && say "Done"
Unfortunately this is the only way I can get the build to work on Windows right now: Edit cmakelib/lovr/CMakeLists.txt and change LOVR_USE_LUAJIT and LOVR_ENABLE_AUDIO near the top from ON to OFF. This will make things run slightly slower and also disable audio.
Follow the instructions under "creating a signing key" below in this README. (This is done automatically on Mac, but not on Windows.)
Run the following from a cmd.exe window:
set ANDROID_HOME=%LOCALAPPDATA%\Android\Sdk
set JAVA_HOME=C:\Program Files\Android\Android Studio\jre
set PATH=%PATH%;%CD%
pushd cmakelib
gradlew build
popd
pushd LovrApp/Projects/Android
gradlew installDebug
popd
You have to have turned on developer mode on your headset before deploying.
You also have to enable USB debugging for the device. For the Oculus Go, you can do this by plugging in the device, putting it on, and using the controller to accept the "Allow USB Debugging" popup.
If you get a message about "signatures do not match the previously installed version", run this and try again:
PATH="/Applications/Android Studio.app/Contents/jre/jdk/Contents/Home/bin":~/Library/Android/sdk/platform-tools:$PATH adb uninstall org.lovr.appsample
If all you have done is changed the assets, you can upload those by running only the final installDebug gradlew task. For example:
(export PATH="/Applications/Android Studio.app/Contents/jre/jdk/Contents/Home/bin":~/Library/Android/sdk/platform-tools:$PATH ANDROID_HOME=~/Library/Android/sdk GRADLE=`pwd`/gradlew; (cd LovrApp/Projects/Android && $GRADLE installDebug))
To see all the things gradlew can do in a particular directory, run it with "tasks" as the argument.
The reason for the long "export PATH/ANDROID_HOME" line is to bring the Java and Android tools into scope for that one command. It would also work to set the env vars in your .bashrc. But you do have to set the environment variables somehow, or else you could run the wrong version of Java and get confusing errors like "Could not determine java version from '13'".
When built without any changes, this repo produces an "org.lovr.appsample" app that prints a "game not found" message. If you look in the Github "releases" section, however, you'll find an "org.lovr.test" app that loads a game from the SD card, to which you can easily copy files using adb sync. Run the app for full instructions.
git clone a copy of https://github.com/mcclure/lodr. Save the path to the lodr directory.
In the lovr-oculus-mobile repo, in the LovrApp directory, create a file local_assets.txt containing the path to the lodr directory.
Edit the file LovrApp/Projects/build.gradle and change the "archivesBaseName" to "test".