MixedRealityToolkit | components intended to accelerate the development | Augmented Reality library
kandi X-RAY | MixedRealityToolkit Summary
The mixed reality toolkit is a collection of scripts and components intended to accelerate the development of applications targeting Windows Mixed Reality. This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
MixedRealityToolkit Key Features
MixedRealityToolkit Examples and Code Snippets
Community Discussions
Trending Discussions on MixedRealityToolkit
QUESTION
I started a project in Unity 2019.4.24f1, imported MRTK 2.6.1 (Foundation, Tools, etc.) and set everything up according to this guide:
The app starts and shows the elements that I placed in the test scene (buttons, etc.), but the background of the app is black and my Android system never asks for camera access.
The device I am testing on is a OnePlus 3T and the minimum SDK requirements are met. How can I get this to work? I will post my MRTK and Player Settings below:
XR Settings: (note that if I activate the ARCore tick, it displays an error)
Packages:
...ANSWER
Answered 2021-May-06 at 19:43
The screen is black because camera access has not been granted. Check that your manifest file declares the camera permission:
https://developer.android.com/guide/topics/manifest/uses-permission-element
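In addition to the manifest entry, Unity 2018.3+ can also prompt for camera access at runtime via UnityEngine.Android.Permission; a minimal sketch (not part of the original answer):

```csharp
#if UNITY_ANDROID
using UnityEngine;
using UnityEngine.Android;

// Hedged sketch: requests the camera permission at runtime on Android,
// assuming the CAMERA permission is also declared in the manifest.
public class CameraPermissionRequester : MonoBehaviour
{
    private void Start()
    {
        // If camera access has not been granted yet, prompt the user.
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        {
            Permission.RequestUserPermission(Permission.Camera);
        }
    }
}
#endif
```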
QUESTION
I want to implement a feature in a HoloLens 2 app that allows the user to draw/paint on a specific surface, in every spot that was touched.
So we have a flat plane, and when I move my hand over this plane, the spots below my hand should be coloured.
How can I implement such a feature, keeping in mind that the HoloLens has limited processing power and calling Texture.Apply() every frame is unacceptable?
I tried to adapt a script from the Eye Tracking heat map demo scene to use hand touch instead, but didn't succeed.
I've changed
...ANSWER
Answered 2020-Nov-02 at 08:47
"HoloLens has limited processing power and calling Texture.Apply() every frame is unacceptable?"
As the Unity documentation says, Apply is a potentially expensive operation, so you'll want to change as many pixels as possible between Apply calls. In the DemoVisualizer, the solution is the dwellTimeInSec property provided by the EyeTrackingTarget component, which defines how long (in seconds) the user needs to keep looking at the target to select it via dwell activation; the value defaults to 0.8 s. This avoids overuse of the Apply method.
Therefore, you can follow this EyeTrackingTarget practice to improve the underperforming frame rate of your mixed reality application.
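As an illustration of the same idea (not taken from the MRTK demo itself), here is a hedged sketch that buffers pixel writes and only uploads them at a fixed interval, so Apply is no longer called every frame:

```csharp
using UnityEngine;

// Minimal sketch, assuming a Texture2D we paint into: buffer pixel writes with
// SetPixel and only upload them to the GPU at a fixed interval. Names here are
// illustrative, not from MRTK.
public class ThrottledTexturePainter : MonoBehaviour
{
    [SerializeField] private Texture2D paintTexture;          // texture being painted
    [SerializeField] private float applyIntervalInSec = 0.8f; // similar in spirit to dwellTimeInSec

    private float lastApplyTime;
    private bool isDirty;

    public void PaintPixel(int x, int y, Color color)
    {
        paintTexture.SetPixel(x, y, color);  // CPU-side write only
        isDirty = true;
    }

    private void Update()
    {
        // Upload accumulated changes at most once per interval.
        if (isDirty && Time.time - lastApplyTime >= applyIntervalInSec)
        {
            paintTexture.Apply();            // expensive GPU upload, now batched
            lastApplyTime = Time.time;
            isDirty = false;
        }
    }
}
```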
QUESTION
I used the prefab HandMenu_Large_WorldLock_On_GrabAndPull as a basis for my hand menu. If I grab the menu and place it somewhere, it stays there, and this behavior is fine for me. But now I want to walk away, look at my hand again, and have the menu reattach to my hand. How do I set up HandConstraintPalmUp to achieve that?
What I tried:
I read here that I could set the SolverHandler's UpdateSolvers to true or call the method HandConstraintPalmUp.StartWorldLockReattachCheckCoroutine(), but this does not actually set UpdateSolvers to true; it just stays false. If I run it via Holographic Emulation and set the handler to true by clicking in the editor, it works, but not via OnFirstHandDetected or OnHandActivate of HandConstraintPalmUp. Also, the documentation says:
When trying to set the hand constrained object to start following your hand again (based on whether it meets the activation criteria), set the SolverHandler's UpdateSolver to true.
But what are the criteria for the solver to set UpdateSolvers to true? What am I missing here?
Menu with default settings:
Why is, for example, the OnClick event on the button BtnClose working, but my call in OnHandActivate not?
ANSWER
Answered 2020-Sep-23 at 09:35
Actually, the Use Gaze Activation checkbox of the HandConstraintPalmUp component is disabled by default, and the coroutine WorldLockedReattachCheck(), invoked by the StartWorldLockReattachCheckCoroutine() method, has a conditional statement that evaluates the useGazeActivation property to decide whether to execute the statements that follow. If you did not enable Use Gaze Activation, it will never update the UpdateSolver property to true. You can refer to the source code of this class for details: HandConstraintPalmUp.cs, line 392. To fix it, enable Use Gaze Activation for your project or update the value of SolverHandler.UpdateSolvers directly in your code.
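A minimal sketch of that second option, assuming the SolverHandler sits on the hand menu object (the class name and event wiring are illustrative):

```csharp
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

// Hedged sketch: re-enable solver updates directly from code, e.g. from a
// button press or hand event, instead of relying on gaze activation.
public class ReattachHandMenu : MonoBehaviour
{
    [SerializeField] private SolverHandler solverHandler; // the SolverHandler on the hand menu

    // Hook this up to the event of your choice (e.g. a button's OnClick).
    public void ReattachToHand()
    {
        solverHandler.UpdateSolvers = true;
    }
}
```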
Besides, the OnHandActivate event always works fine for me; I can't reproduce your issue on my machine. Could you provide more information about it? And if you have questions about how to use it, you can find examples in the HandMenuLayoutExamples scene under Assets/MRTK/Examples/Experimental/HandMenuLayout/Scenes.
QUESTION
I'm trying to make an animated material with the MRTK in Unity. The goal is an effect that starts as a circle and propagates the texture to the rest of the plane.
For now I use the MixedRealityToolkit Standard shader and its Round Corners option. With an animation I got this:
My problem is that I can't tile the texture to reduce its size and repeat it. Also, on non-square objects the texture is stretched, which does not look nice.
If I change the tile setting, the texture is not repeated (the texture is in Repeat mode; tiling works when I untick the Round Corners option).
(If I display the Unity selection outline, I get the repeated texture, but it's not displayed otherwise...)
Does anyone have a good idea how to do this with the MRTK shaders, or how to write a specific shader for this effect?
...ANSWER
Answered 2020-Aug-02 at 17:41
I found a solution by writing my own shader:
QUESTION
Why does this happen? Has anyone had this happen before? I have tried it numerous times to make sure I am not crazy, and I can't find anyone who has had this problem before when I searched the internet.
This is what I am doing:
Step 1:
Put 3-D object on scene view.
Step 2:
Click Save
Edit:
Here is some more information:
Unity Version: 2019.2.21
What I did specifically:
I am following the directions listed here. One thing that I did that wasn't listed in the link was to add a cube to the scene view.
I did this by right-clicking on "MixedRealityToolkit", choosing "3-D object", and then selecting "Cube".
I then hit save and my cube disappeared.
...ANSWER
Answered 2020-Jun-26 at 07:07
The MixedRealityToolkit object is the MR dev toolkit itself and provides the central configuration entry point for the entire framework.
You should not add any game object under the MixedRealityToolkit object. The best practice is to create an empty object as a root game object and then move all of your scene content under it as children.
QUESTION
I have tried to get hand mesh data from the HoloLens 2 using MRTK v2 and Unity C#. I can get hand mesh data by turning on the Hand Mesh Visualization option and following the MRTK HandTracking guide.
Unfortunately, the visualization (drawing the hand CG) is a heavy workload. So I would like to get the hand mesh without turning on the Hand Mesh Visualization option, but the OnHandMeshUpdated function is not called when that option is turned off.
Does anyone know how to get hand mesh data from the HoloLens 2 without turning on the Hand Mesh Visualization option?
...ANSWER
Answered 2020-Jun-19 at 10:01
MRTK does not directly provide this feature. According to the source code of MRTK-Unity (see line 163 of the BaseHandVisualizer class), the majority of the work is done in the OnHandMeshUpdated event handler. When the current hand mesh is updated based on the passed-in state of the hand, the OnHandMeshUpdated method is invoked with HandMeshInfo event data. Once the Hand Mesh Prefab field in [Input System] -> [Hand Tracking] is set to "None", MRTK will not instantiate the handMeshFilter because of the conditional statement, but the hand-mesh-related data is still easily accessible from the event data. Check out the class definition of HandMeshInfo here.
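A minimal sketch of that approach, registering a handler for the hand mesh events so the data can be read without the visualizer prefab (class name and logging are illustrative):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hedged sketch: subscribe to MRTK's hand mesh events directly instead of
// relying on the hand mesh visualization prefab.
public class HandMeshDataListener : MonoBehaviour, IMixedRealityHandMeshHandler
{
    private void OnEnable()
    {
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityHandMeshHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityHandMeshHandler>(this);
    }

    public void OnHandMeshUpdated(InputEventData<HandMeshInfo> eventData)
    {
        // Raw hand mesh data, without instantiating any visualization prefab.
        HandMeshInfo handMeshInfo = eventData.InputData;
        Vector3[] vertices = handMeshInfo.vertices;
        int[] triangles = handMeshInfo.triangles;
        Debug.Log($"{eventData.Handedness} hand mesh: {vertices.Length} vertices, {triangles.Length / 3} triangles");
    }
}
```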
QUESTION
Slight noob question, and I may have missed something in the MRTK2-Unity docs/samples etc., but I'm just getting back into Unity with the HoloLens 2 and I'm looking for a simple example of dropping holograms onto the world mesh, rather like the original HoloLens 1 MR 250 tutorial that used the old HoloToolkit WorldAnchorManager and TapToPlace approach.
I'm sure this is quite a simple thing to achieve, but I can't seem to find an example now that the HoloToolkit "Manager Prefabs" approach has been replaced by the MRTK services etc.
I want to update some old HL1 projects to HL2, but the Porting Guide is either a bit unclear, or more likely, designed to be interpreted by clever people who know what they are doing...
cheers
...ANSWER
Answered 2020-Apr-07 at 09:19
I think what you are looking for is the new approach with Solvers:
MRLearning-Base-Ch3
Most of the old HoloToolkit managers are now integrated into services, but you will find all the functions that you need.
The Porting Guide explains what changed; I don't really understand what is unclear in this guide... Can you elaborate?
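For reference, a minimal sketch of the Solver route, assuming an MRTK version recent enough to include the TapToPlace solver (added around MRTK 2.4) and an active Spatial Awareness system:

```csharp
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

// Hedged sketch: attach the TapToPlace solver at runtime so the object can be
// picked up and dropped onto the spatial mesh, roughly replacing the old
// HoloToolkit WorldAnchorManager + TapToPlace workflow.
public class PlaceableHologram : MonoBehaviour
{
    private void Start()
    {
        // SolverHandler drives the solver; TapToPlace follows the pointer ray
        // and places the object on the surface hit when selected again.
        gameObject.AddComponent<SolverHandler>();
        gameObject.AddComponent<TapToPlace>();
    }
}
```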
QUESTION
That's it, I'll swallow my pride!
I'm using MRTK v2 and it's working fine, except that at some point I want to turn off the line extending from the motion controller to the object that provides input. After looking around and trying to find it in the MRTK documentation (it's gotta be easy, right?), I'm still banging my head against the wall and it's starting to hurt...
The MRTK documentation explains quite well how to configure it here:
But I'm looking to do this in script, enabling and disabling it as I need it in my application.
Any clue how to do this?
Many thanks!
...ANSWER
Answered 2019-Jun-04 at 00:47
Great question! Here's one way to do this that has worked for me. You can see my solution at this repository: https://github.com/julenka/MixedRealityToolkit-Unity/tree/so/linepointer_off. Open the scene Assets/TurnOffLinePointerTest.unity and then use hand simulation to press the buttons. The code to turn the pointers on/off is in Assets/PointerConfigurationExample.cs.
Note: the reason you need to use this approach of modifying the mediator, instead of directly setting myPointer.IsActive = false, is that the default mediator overwrites these values every frame. Luckily, you can customize this behavior.
Step 1: Update MRTK so that the PointerMediator can be customized. Apply the changes from this commit to your MRTK clone. This change updates the FocusProvider in MRTK to make the PointerMediator publicly accessible, and makes the DefaultPointerMediator extensible by updating fields to be protected instead of private and making methods virtual. See this pull request that implements this change directly in MRTK.
Step 2: Create a custom PointerMediator that will turn off far pointers. Create a custom PointerMediator like the one from this commit.
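As a hedged side note that is not part of the accepted answer: later MRTK releases (2.3 and up, as far as I know) ship a PointerUtils helper that can toggle the far ray pointers without a custom mediator, for example:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Sketch using PointerUtils (available in newer MRTK versions) to turn the
// hand ray and motion controller ray pointers on or off at runtime.
public class LinePointerToggle : MonoBehaviour
{
    public void SetRayPointersActive(bool isActive)
    {
        PointerBehavior behavior = isActive ? PointerBehavior.Default : PointerBehavior.AlwaysOff;
        PointerUtils.SetHandRayPointerBehavior(behavior);
        PointerUtils.SetMotionControllerRayPointerBehavior(behavior);
    }
}
```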
QUESTION
So it would appear that this would be a simple solution, but as I can not find any documentation on how to do exactly this, it's effectively the same as brute-force guessing a password.
Environment:
- Unity Versions: 2019.3.1, 2019.3.4 (current)
- Platform: Universal Windows Platform
- MRTK: 2.2, 2.3 (current)
- HoloLens 2 OS: Windows Holographic Operating System
Goal: I push a button and the file browser/explorer appears inside my Unity scene.
Problem: I can not launch the file browser/explorer on HoloLens 2.
With MRTK 2 / HoloLens 2 you are able to launch external apps without exiting the Unity application, something that HoloLens 1 could not do. Microsoft provides proof of this in their Unity examples package, in the scene Assets/MixedRealityToolkit.Examples/Demos/HandTracking/Scenes/HandInteraction.Examples.unity, once you have loaded the .Foundation and .Examples external packages into your Unity project.
In the provided scene, among all the presented objects there are two buttons off to the right side that, when pressed, will launch the Edge browser or the OS's Settings application. This is accomplished with a launch-URI script that runs .OpenURL on a string provided by the user via the GameObject's inspector.
And here is the code snippet (provided by Microsoft in MRTK2) that runs the user-provided string:
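The snippet itself is not reproduced above; a minimal sketch of what such a launcher might look like, assuming it simply wraps Application.OpenURL around an inspector-provided string (names here are illustrative, not the actual MRTK script):

```csharp
using UnityEngine;

// Hypothetical sketch of a launch-URI script: opens the URI set in the inspector.
public class LaunchUriExample : MonoBehaviour
{
    [SerializeField] private string uri = "https://www.bing.com";

    // Wire this to a button's OnClick event in the inspector.
    public void Launch()
    {
        Application.OpenURL(uri);
    }
}
```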
...ANSWER
Answered 2020-Mar-12 at 15:21
I've been having some back-and-forth with Developer Support at Microsoft, and they've been extremely helpful in revealing more information on this subject; we agreed that it'd be a good idea to document it here in case someone comes across this need/issue in the future.
Currently there is no way to access the native file browser from within a Unity application on HoloLens 2. Before going third-party on a solution, the best native course of action is to use FileOpenPicker:
https://docs.microsoft.com/en-us/windows/mixed-reality/app-model#file-pickers
Outside of fully native solutions, the next best course of action is to use a 3rd-party asset from the Asset Store. Due to my development environment/work restrictions I'm not confident I'll personally be able to use this method, but it is a viable course of action for most everyone else.
I'm considering this the official answer for the current state of HoloLens 2, but I will be happy to revise it if the situation changes in the future.
tl;dr version: Currently (early 2020) File Explorer is natively inaccessible from within an application, and the best/closest native solution is to use FileOpenPicker or 3rd-party assets from the Unity Asset Store.
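A hedged sketch of the FileOpenPicker route from inside a Unity UWP build; it assumes ENABLE_WINMD_SUPPORT is defined and that the picker must be invoked on the UI thread, and the filter and start location are illustrative:

```csharp
using UnityEngine;

// Sketch only: shows the native FileOpenPicker on HoloLens 2 / UWP and logs
// the path of the picked file.
public class FilePickerExample : MonoBehaviour
{
    public void PickFile()
    {
#if ENABLE_WINMD_SUPPORT
        // UWP pickers must run on the UI thread, not Unity's app thread.
        UnityEngine.WSA.Application.InvokeOnUIThread(async () =>
        {
            var picker = new Windows.Storage.Pickers.FileOpenPicker();
            picker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.DocumentsLibrary;
            picker.FileTypeFilter.Add("*");

            Windows.Storage.StorageFile file = await picker.PickSingleFileAsync();
            if (file != null)
            {
                Debug.Log($"Picked file: {file.Path}");
            }
        }, false);
#else
        Debug.Log("FileOpenPicker is only available in UWP builds.");
#endif
    }
}
```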
QUESTION
I need to change the expanding direction of a VerticalLayoutGroup. With the default behavior the group expands downwards. What I want is for the group to expand upwards.
The expected behavior is described in this video. (Link to the answer on Stack Overflow: https://stackoverflow.com/a/43192904/11236801)
The solution at the link is to rotate the LayoutGroup by 180°. This seems more like a workaround, because all children have to be rotated as well. The solution suggested by this one also does not give the expected behavior shown in the video.
I added a ContentSizeFitter to the LayoutGroup as described in this link on Unity Answers. Now the LayoutGroup expands in both directions (upwards and downwards).
Is there any solution to accomplish the desired behavior without rotating the LayoutGroup?
Edit: I also noticed a downside of the rotating approach: the Billboard script from the MixedRealityToolkit forces the LayoutGroup to rotate back to 0°.
...ANSWER
Answered 2020-Feb-18 at 15:35
This works for me:
Note the Child Alignment setting on the Vertical Layout Group, and the Pivot settings on the Rect Transform. These make the layout group expand upwards when more items are added.
Also note the presence of a Content Size Fitter component, with its Vertical Fit set to Preferred Size; that will adjust the height of the layout group when content is added or removed.
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install MixedRealityToolkit
Support