Oculus | The Oculus platform components for the XRTK | Augmented Reality library
kandi X-RAY | Oculus Summary
The Oculus platform components for the XRTK - Mixed Reality Toolkit.
Community Discussions
Trending Discussions on Oculus
QUESTION
I'm trying to create a directed graph with JavaScript code I wrote. By clicking on a node, the list of publications assigned to a keyword should be called up. The currently selected node should be highlighted in the visualisation, and the details should be shown on a separate grey area which I created:
...ANSWER
Answered 2021-May-30 at 16:05 — The error means that persona[keyName] is undefined. We are not shown the declaration of the persona array, so we can't investigate further.
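Since the original persona declaration wasn't shared, here is a minimal sketch (with hypothetical data) of guarding against the undefined lookup before reading properties off it:

```javascript
// Hypothetical shape for the persona data (the original declaration
// wasn't shared in the question).
const persona = { KayO: { publications: ["Paper A", "Paper B"] } };

// Guard against persona[keyName] being undefined before reading
// properties off it, which is what raised the TypeError.
function getPublications(keyName) {
  const entry = persona[keyName];
  if (entry === undefined) {
    return []; // unknown key: return an empty list instead of crashing
  }
  return entry.publications;
}
```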
QUESTION
I would like to create a graph. To do this, I have created a JSON file. The skills (java, python, HTML, json) should be the links and the index entries (KayO, BenBeck) should be the nodes. Also, a node must not fall below a certain minimum size and must not become too large.
After that, I would like to be able to call up the list of publications on the right-hand side by clicking on a node. The currently selected node in the visualisation should be highlighted.
I have already started from this example (https://bl.ocks.org/heybignick/3faf257bbbbc7743bb72310d03b86ee8), but unfortunately I can't get any further.
The error message I always get is:
Uncaught TypeError: Cannot read property 'json' of undefined
This is what my issue currently looks like:
The JSON file:
...ANSWER
Answered 2021-May-15 at 14:59 — Your JSON file should have the following format:
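The answer's example file is elided in this scrape. For reference, the linked d3-force example expects a nodes/links shape along these lines (the ids and values below are illustrative assumptions based on the question's data):

```javascript
// Sketch of the nodes/links shape used by the linked d3-force example;
// ids, groups, and values here are illustrative assumptions.
const graph = {
  nodes: [
    { id: "KayO", group: 1 },
    { id: "BenBeck", group: 1 }
  ],
  links: [
    { source: "KayO", target: "BenBeck", value: 2 }
  ]
};

// d3.forceLink(...).id(d => d.id) resolves each link's source/target
// against node ids, so every link endpoint must match an existing node,
// or d3 throws "node not found" during simulation setup.
const ids = new Set(graph.nodes.map(n => n.id));
const linksValid = graph.links.every(
  l => ids.has(l.source) && ids.has(l.target)
);
```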
QUESTION
How would one go about mirroring or cloning the WebXR 'immersive-xr' view from an HMD like the VIVE or Oculus in the browser, using the same WebGL canvas?
There is much discussion about copying the pixels to a texture2D and then applying that as a render texture, or completely re-drawing the entire scene with an adjusted viewTransform. These approaches work well if you are rendering a different view, such as a remote camera or a third-person spectator view; however, both are a waste of resources if one only wants to mirror the current HMD view on the desktop.
Self-answered below, as there was no solid answer when I ran into this and I'd like to save future devs the time. (Especially if they're not all too savvy with WebGL2 and WebXR.)
Note, that I'm not using any existing frameworks for this project for 'reasons'. It shouldn't change much if you are, you'd just need to perform the steps at the appropriate place in your library's render pipeline.
...ANSWER
Answered 2021-May-11 at 12:46 — The answer is delightfully simple, as it turns out, and barely hits my fps.
- Attach the canvas to the DOM and set it to your desired size. (Mine was fluid, so it had a CSS width of 100% of its parent container, with a height of auto.)
- When you initialize your glContext, be sure to specify that antialiasing is false. This is important if your spectator and HMD views are to be different resolutions: {xrCompatible: true, webgl2: true, antialias: false}
- Create a frameBuffer that will be used to store your rendered HMD view: spectateBuffer
- Draw your immersive-xr layer as usual in your xrSession.requestAnimationFrame(OnXRFrame); callback.
- Just prior to exiting your OnXRFrame method, implement a call to draw the spectator view. I personally used a bool showCanvas to allow me to toggle the spectator mirror on and off as desired:
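The final step above can be sketched as a WebGL2 framebuffer blit. The GL calls are illustrative (they assume spectateBuffer wraps the rendered HMD view, per the steps above, and only run in a browser); the aspect-fit helper is pure math:

```javascript
// Pure helper: letterbox the HMD view into the spectator canvas while
// preserving aspect ratio (runs anywhere, no GL context needed).
function fitRect(srcW, srcH, dstW, dstH) {
  const scale = Math.min(dstW / srcW, dstH / srcH);
  const w = Math.round(srcW * scale);
  const h = Math.round(srcH * scale);
  return { x: (dstW - w) >> 1, y: (dstH - h) >> 1, w, h };
}

// Illustrative WebGL2 blit of the spectateBuffer into the default
// framebuffer (the canvas); called just before leaving OnXRFrame
// when showCanvas is true. Names follow the steps above.
function drawSpectatorView(gl, spectateBuffer, srcW, srcH) {
  const r = fitRect(srcW, srcH, gl.canvas.width, gl.canvas.height);
  gl.bindFramebuffer(gl.READ_FRAMEBUFFER, spectateBuffer);
  gl.bindFramebuffer(gl.DRAW_FRAMEBUFFER, null); // null = the canvas
  gl.blitFramebuffer(0, 0, srcW, srcH,
                     r.x, r.y, r.x + r.w, r.y + r.h,
                     gl.COLOR_BUFFER_BIT, gl.NEAREST);
}
```

Note that blitFramebuffer is why the antialias: false context flag matters: blitting from a multisampled default framebuffer at a different resolution is not permitted.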
QUESTION
I built my first VR app using Unity. I am getting the following error when trying to build and run on a connected Oculus Quest 2 device:
Android device is not responding! Make sure USB debugging has been enabled and that the device has authorized this computer. Check your device, in most cases there should be a small icon in the status bar telling you if the USB connection is up.
I have tried many different ways to get this working but am not able to figure it out. Is there anything else I can try? I have listed the things I tried and the configuration I have:
- I have developer mode ON.
- When I connect via USB to my MacBook Pro, I do get an option to allow in the Oculus Quest with the following message, and I select "Allow":
Allow Access to data, the connected device will be able to access files on this headset. [Deny] [Allow]
- I see my Oculus quest device in the "Run Device" dropdown in build settings and have it selected.
- I also installed Android File Transfer and I do see all folders and files on the quest 2 device.
- I tried a factory reset and restarting the Quest 2 device and my MacBook Pro multiple times.
- The app I have built runs fine when I click the play button, and it also works when I build for macOS and run on the MacBook Pro.
Screenshot of the settings and error: File > Build Settings
Thank you!
...ANSWER
Answered 2021-Apr-20 at 07:06 — I am also developing for Quest with a MacBook; on macOS, Unity requires you to install it via the terminal with the command below.
QUESTION
ANSWER
Answered 2021-Apr-30 at 13:57 — The permissions are added by the dependencies of your app: an Android manifest file is automatically merged with the manifest files of its dependencies.
The top manifest file has priority and can alter the values of the manifest files in its dependencies.
To remove a permission, use the tools:node="remove" attribute. For example, to remove the RECORD_AUDIO permission, add this line to your app's manifest file between the XML tags:
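The snippet itself is cut off in this scrape. Based on the answer's description, the line would look something like the following (the tools namespace must be declared on the manifest element; the package name is a placeholder):

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          xmlns:tools="http://schemas.android.com/tools"
          package="com.example.app">
    <!-- Overrides the RECORD_AUDIO permission merged in from a dependency -->
    <uses-permission android:name="android.permission.RECORD_AUDIO"
                     tools:node="remove" />
</manifest>
```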
QUESTION
So I wanted to modify the VrCubeWorld_Vulkan sample provided on the Facebook website (https://developer.oculus.com/documentation/native/android/mobile-vrapi/) to add a geometry shader for the Oculus Quest 2. However, when I tried to enable the multiviewGeometryShader feature, I was greeted with a VK_ERROR_FEATURE_NOT_PRESENT error. And on http://vulkan.gpuinfo.org/displayreport.php?id=10024#features_extensions the feature is said not to be supported.
I just need a geometry shader to calculate a value for each triangle. What would be a viable alternative? On http://vulkan.gpuinfo.org they say that the geometryShader feature is supported. Therefore, is rendering eye by eye without the multiview extension a possible solution?
...ANSWER
Answered 2021-Apr-19 at 08:32 — If a device supports geometry shaders and also VK_KHR_multiview, that doesn't mean both can be used in conjunction. That only works if the multiviewGeometryShader property is true. When we looked at this in detail last year, we couldn't find any device (besides some NVIDIA Quadro, I think) that supported multiview together with geometry or tessellation shaders.
That means you won't be able to use geometry shaders with the VK_KHR_multiview extension for now. A good alternative could be geometry shader instancing, using the geometry shader instance's invocation id to direct rendering into different layers:
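The answer's code is cut off in this scrape. The pattern it describes — geometry shader instancing with gl_InvocationID routing primitives into per-eye layers — looks roughly like this in GLSL (the uniform block and matrix names are assumptions, not the answer's actual code):

```glsl
#version 450
// One geometry shader invocation per eye instead of VK_KHR_multiview.
layout (triangles, invocations = 2) in;
layout (triangle_strip, max_vertices = 3) out;

layout (binding = 0) uniform UBO {
    mat4 viewProj[2];   // per-eye view-projection matrices (assumed layout)
} ubo;

void main()
{
    for (int i = 0; i < gl_in.length(); i++) {
        gl_Layer    = gl_InvocationID;  // route triangle to left/right layer
        gl_Position = ubo.viewProj[gl_InvocationID] * gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}
```

This renders both eyes in one pass into a layered framebuffer without requiring the multiviewGeometryShader feature.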
QUESTION
I have a types.ts file which looks like:
ANSWER
Answered 2021-Apr-15 at 12:55 — Someone sent a pull request on my repo and I found a simpler solution:
QUESTION
I'm porting a game to Quest and so part of my work is to interface the engine's Vulkan renderer w/ the Oculus Mobile SDK.
I believe I'm setting up the SDK correctly (I'm following the examples and the guidelines from Oculus' docs) but still I'm getting a nasty error when trying to submit a frame.
Here's a high-level list of the things I'm currently doing:
- I initialize the API.
- I create a Vulkan instance and device w/ the expected extensions.
- I acquire per-eye swapchains and get Vulkan handlers for each of their images.
- I set up framebuffers and render passes using those images.
- I acquire a native Android window.
- I enter VR mode (making sure the app is resumed).
Then at the end of my render loop I set up an ovrSubmitFrameDesc and call vrapi_SubmitFrame2(). I'm also making sure I only call vrapi_SubmitFrame2() after all work has been submitted to the GPU (I'm currently using a fence on my work queues).
However, as I mentioned before, the call to vrapi_SubmitFrame2() fails. It currently raises a SIGSEGV inside the Quest's Vulkan driver:
ANSWER
Answered 2021-Mar-15 at 09:23 — Let me start this by asking: who do you think would win, 18 years of experience in software development, or this bad boi here: '&'?
The mystery of the SIGSEGV being raised from vrapi_SubmitFrame2() was nothing more than a stupid mistake when setting OVR's synchronization queue:
QUESTION
I am trying to build a Next.js project with with-mongodb on it, but it's not working and I'm not sure why. The log doesn't appear to be very helpful with what the issue is (as far as I can tell), so I'm not sure how to fix it. Here is a copy of the log:
...ANSWER
Answered 2021-Mar-13 at 14:28 — Today I tried again and received a proper error message.
QUESTION
I'm implementing a feature to get the coordinates on an a-sky while moving the VR controller.
ANSWER
Answered 2021-Feb-15 at 02:53 — It turns out that the raycaster-intersected-cleared event from the parent element is fired, but that does not mean the element is no longer intersected. To confirm whether it is still intersected, I have to check whether the getIntersection result is null.
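The check can be expressed as a small, testable sketch; the raycaster object below is a stub standing in for A-Frame's raycaster component, whose getIntersection(el) returns the intersection object or null:

```javascript
// Stub standing in for A-Frame's raycaster component:
// getIntersection(el) returns the intersection object, or null.
function makeRaycaster(hits) {
  return { getIntersection: (el) => hits.get(el) || null };
}

// The fix from the answer: don't rely on the
// raycaster-intersected-cleared event alone; re-check getIntersection
// and only use the point when the result is non-null.
function currentPoint(raycaster, el) {
  const hit = raycaster.getIntersection(el);
  return hit === null ? null : hit.point;
}
```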
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported