xr3ngine | end-to-end solution for hosting humans | Augmented Reality library

 by xr3ngine | TypeScript | Version: rc- | License: Non-SPDX

kandi X-RAY | xr3ngine Summary

xr3ngine is a TypeScript library typically used in Virtual Reality, Augmented Reality, React, Node.js, and Three.js applications. xr3ngine has no reported bugs or vulnerabilities, and it has low support. However, xr3ngine has a Non-SPDX license. You can download it from GitHub.

An end-to-end solution for hosting humans and AI in a virtual space, built on top of React, Three.js, and Express/Feathers. This repo includes a fully-featured client, API server, realtime gameserver, game engine, and devops tooling for scalable deployment. Pick and choose what you need, or deploy the whole stack and start building your application on top.

            kandi-support Support

              xr3ngine has a low active ecosystem.
              It has 108 star(s) with 30 fork(s). There are 16 watchers for this library.
              It had no major release in the last 12 months.
              There are 37 open issues and 526 have been closed. On average, issues are closed in 59 days. There are 6 open pull requests and 0 closed pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of xr3ngine is rc-

            kandi-Quality Quality

              xr3ngine has no bugs reported.

            kandi-Security Security

              xr3ngine has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.

            kandi-License License

              xr3ngine has a Non-SPDX License.
              A Non-SPDX license may be an open-source license that is not SPDX-compliant, or a non-open-source license; you need to review it closely before use.

            kandi-Reuse Reuse

              xr3ngine releases are available to install and integrate.
              Installation instructions, examples and code snippets are available.

            xr3ngine Key Features

            No Key Features are available at this moment for xr3ngine.

            xr3ngine Examples and Code Snippets

            No Code Snippets are available at this moment for xr3ngine.

            Community Discussions


            RealityKit app and lower iOS deployment target
            Asked 2022-Mar-10 at 15:04

            I have an iOS app with a deployment target of iOS 10+. I need to add some features that depend only on RealityKit and should appear only for users whose iOS version is 13+. The app compiles and runs successfully on a real device, but when archiving for upload to the App Store, it generates a Swift file and says:



            Answered 2022-Mar-10 at 15:04
            Firstly:

            Do not include Reality Composer's .rcproject files in your archive for distribution. .rcproject bundles contain the code with iOS 13.0+ classes, structs and enums. Instead, supply your project with USDZ files.

            Secondly:

            To allow iOS 13+ users to use RealityKit features, but still allow non-AR users to run this app starting from iOS 10.0, use the following code:

            Source https://stackoverflow.com/questions/71000365


            How to use RealityKit's image AnchorEntity in SwiftUI?
            Asked 2022-Jan-08 at 20:36

            I want to use RealityKit's AnchorEntity initialized with an Anchoring Component Target of type Image. Here is what I am trying to do:



            Answered 2022-Jan-08 at 20:36

            You can use AnchorEntity(.image(...)) in SwiftUI almost the same way as you used it in UIKit. At first click Assets.xcassets in Project Navigator pane and create AR Resources folder. Drag your image there. Setup its physical size in Inspector. Then copy/paste the code:

            Source https://stackoverflow.com/questions/70634542


            Augmented Reality – Lighting Real-World objects with Virtual light
            Asked 2021-Dec-16 at 17:25

            Is it possible to import a virtual lamp object into the AR scene, that projects a light cone, which illuminates the surrounding space in the room and the real objects in it, e.g. a table, floor, walls?

            For ARKit, I found this SO post.

            For ARCore, there is an example of relighting technique. And this source code.

            I have also been suggested that post-processing can be used to brighten the whole scene.

            However, these examples are from a while ago, and perhaps there is a newer or more straightforward solution to this problem?



            Answered 2021-Dec-16 at 17:25

            At the low level, RealityKit is only responsible for rendering virtual objects and overlaying them on top of the camera frame. If you want to illuminate the real scene, you need to post-process the camera frame.

            Here are some tutorials on how to do post-processing: Tutorial 1, Tutorial 2.

            If all you need is an effect like this, then all you need to do is add a CGImage-based post-processing effect for the virtual object (the lights).

            More specifically, add a bloom filter to the rendered image (you can also simulate a bloom filter with a Gaussian blur).
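            The bloom idea is language-agnostic: extract the bright pixels, blur them, and add the glow back onto the original image. Here is a minimal sketch in plain Python on a 1D grayscale strip (an illustration of the technique only, not the RealityKit post-processing API; the box-blur passes approximate a Gaussian blur):

```python
def bloom(pixels, threshold=0.8, radius=1, passes=2):
    """Naive bloom: extract bright pixels, blur them, add the glow back."""
    # 1. Keep only values at or above the brightness threshold.
    bright = [p if p >= threshold else 0.0 for p in pixels]
    # 2. Approximate a Gaussian blur with repeated box blurs.
    for _ in range(passes):
        n = len(bright)
        blurred = []
        for i in range(n):
            lo, hi = max(0, i - radius), min(n, i + radius + 1)
            blurred.append(sum(bright[lo:hi]) / (hi - lo))
        bright = blurred
    # 3. Add the glow onto the original image, clamped to 1.0.
    return [min(1.0, p + g) for p, g in zip(pixels, bright)]

image = [0.1, 0.2, 1.0, 0.2, 0.1]
print(bloom(image))  # the bright center pixel now "bleeds" into its neighbors
```

            A real implementation would do the same per-channel on a 2D image, which is why the answer notes the code stays within UIImage/CGImage territory.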

            In this way, the code is all around UIImage and CGImage, so it's pretty simple😎

            If you want to be more realistic, consider using the depth map provided by LiDAR to calculate which areas can be illuminated for a more detailed brightness.

            Or If you're a true explorer, you can use Metal to create a real world Digital Twin point cloud in real time to simulate occlusion of light.

            Source https://stackoverflow.com/questions/70348881


            How to show image from gallery in realitykit?
            Asked 2021-Dec-12 at 08:44

            I want to show image from gallery. i am loading the image using imagePicker.



            Answered 2021-Dec-12 at 08:44

            Try this. Take into consideration that a tint color is multiplied by the image – so, if the tint's RGBA = [1,1,1,1], the result of the multiplication will be the image itself (without tinting)...
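            The multiplication the answer describes is plain per-channel arithmetic. A tiny Python sketch (illustrative only; `apply_tint` is a hypothetical name, not a RealityKit call) shows why a white tint is a no-op:

```python
def apply_tint(pixel, tint):
    """Per-channel multiply of an RGBA pixel by an RGBA tint (values 0.0-1.0)."""
    return [p * t for p, t in zip(pixel, tint)]

pixel = [0.8, 0.4, 0.2, 1.0]
# A white tint [1, 1, 1, 1] leaves the pixel unchanged...
assert apply_tint(pixel, [1.0, 1.0, 1.0, 1.0]) == pixel
# ...while a reddish tint suppresses the green and blue channels.
print(apply_tint(pixel, [1.0, 0.5, 0.5, 1.0]))
```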

            Source https://stackoverflow.com/questions/70321744


            How to fix AR connected issue in my device?
            Asked 2021-Nov-23 at 13:58

            I am developing a website related to Augmented Reality, but my mobile phone (Samsung Galaxy M02S) does not support AR. When I try to install ARCore, the Google Play Store shows an error.

            So I installed ARCore from an APK, and the app was installed. After that, I checked whether my phone supported AR; the notification below was shown when I tried to view AR.

            How do I fix this issue, and is there any other way to install ARCore on my phone?



            Answered 2021-Nov-23 at 13:58

            ARCore requires specific hardware to work. You can check the list of supported devices here. No amount of sideloading will help, because this is a hardware requirement. Moreover, ARCore is under active development; even if you somehow install a version that might work, that version will soon be deprecated and you will start getting the popup saying you need to update.

            Kindly use a device that is on the supported list, or an emulator that supports AR. IMHO it is best to develop using a device that is supported by the ARCore team.

            Source https://stackoverflow.com/questions/70081990


            SCNKit: Hit test doesn't hit node's dynamically modified geometry
            Asked 2021-Aug-15 at 08:41

            I'm facing an issue where SCNView.hitTest does not detect hits against geometry that I'm modifying dynamically on the CPU.

            Here's the overview: I have a node that uses an SCNGeometry created from a MTLBuffer of vertices:



            Answered 2021-Aug-13 at 15:46

            When you perform a hit-test search, SceneKit looks for SCNGeometry objects along the ray you specify. For each intersection between the ray and a geometry, SceneKit creates a hit-test result to provide information about both the SCNNode object containing the geometry and the location of the intersection on the geometry’s surface.

            The problem in your case is that when you modify the buffer's contents (MTLBuffer) at render time, SceneKit does not know about it, and therefore cannot update the SCNGeometry object that is used for hit-testing.

            So the only way I can see to solve this issue is to recreate your SCNGeometry object.
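            The failure mode generalizes beyond SceneKit: the hit-tester consults a snapshot taken when the geometry was built, not the live buffer. A minimal Python analogy (the `Geometry` class is a stand-in for SCNGeometry, not its real API):

```python
class Geometry:
    """Mimics SCNGeometry: snapshots the vertex data when constructed."""
    def __init__(self, buffer):
        self._vertices = list(buffer)  # a copy, like SceneKit's internal state
    def hit_test(self, point):
        return point in self._vertices

buffer = [(0, 0), (1, 0), (0, 1)]
geom = Geometry(buffer)
buffer.append((2, 2))            # "render-time" edit to the raw buffer
print(geom.hit_test((2, 2)))     # the snapshot is stale: the new vertex is missed
geom = Geometry(buffer)          # recreate the geometry, as the answer suggests
print(geom.hit_test((2, 2)))     # now the hit is detected
```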

            Source https://stackoverflow.com/questions/68723000


            Error in displaying reality file from network
            Asked 2021-Jul-26 at 06:44

            I am trying to display a reality file created using Reality Composer. The below code works for usdz but not for reality. Here is my code



            Answered 2021-Jul-26 at 06:44

            Loading a .reality model from the web works fine. You can easily check this in the Xcode Simulator:

            Source https://stackoverflow.com/questions/68517968


            Camera Intrinsics Resolution vs Real Screen Resolution
            Asked 2021-May-28 at 13:28

            I am writing an ARKit app where I need to use camera poses and intrinsics for 3D reconstruction.

            The camera Intrinsics matrix returned by ARKit seems to be using a different image resolution than mobile screen resolution. Below is one example of this issue

            Intrinsics matrix returned by ARKit is :

            [[1569.249512, 0, 931.3638306],[0, 1569.249512, 723.3305664],[0, 0, 1]]

            whereas the input image resolution is 750 (width) x 1182 (height). In this case, the principal point seems to fall outside the image, which cannot be right; it should ideally be close to the image center. So the intrinsic matrix above might be using an image resolution of 1920 (width) x 1440 (height), which is completely different from the original image resolution.

            The questions are:

            • Do the returned camera intrinsics belong to a 1920x1440 image resolution?
            • If yes, how can I get the intrinsics matrix for the original image resolution, i.e. 750x1182?


            Answered 2021-May-28 at 13:28
            Intrinsics 3x3 matrix

            The intrinsic camera matrix converts between the 2D camera plane and 3D world coordinate space. Here's a decomposition of an intrinsic matrix, where:

            • fx and fy are the focal lengths in pixels
            • xO and yO are the principal point offsets in pixels
            • s is the axis skew

            According to Apple Documentation:

            The values fx and fy are the pixel focal length, and are identical for square pixels. The values ox and oy are the offsets of the principal point from the top-left corner of the image frame. All values are expressed in pixels.

            So let's examine what your data is:
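            A quick numeric check (a sketch in Python; 1920x1440 is the capture resolution the question hypothesizes) shows the principal point sits near the center of a 1920x1440 frame, and that for a uniform resize the intrinsics simply scale with the resolution. Going from 1920x1440 to 750x1182 changes the aspect ratio, so a real conversion also needs crop offsets; in ARKit that mapping is exposed via ARFrame's displayTransform(for:viewportSize:).

```python
fx, fy = 1569.249512, 1569.249512
ox, oy = 931.3638306, 723.3305664
W, H = 1920, 1440

# The principal point should be near the image center -- and it is, for 1920x1440:
print(abs(ox - W / 2), abs(oy - H / 2))  # about 28.6 and 3.3 pixels off-center

def scale_intrinsics(fx, fy, ox, oy, sx, sy):
    """Scale intrinsics by per-axis factors sx, sy (valid only when no crop is involved)."""
    return fx * sx, fy * sy, ox * sx, oy * sy

# For example, a uniform half-resolution resize to 960x720:
print(scale_intrinsics(fx, fy, ox, oy, 0.5, 0.5))
```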

            Source https://stackoverflow.com/questions/66893907


            Adding multiple SCNNode(s) at the same time
            Asked 2021-May-21 at 04:36

            I have this function named addShapes. I want it to create 3 shapes



            Answered 2021-May-21 at 04:36

            Your code works fine (the position was the problem):

            Source https://stackoverflow.com/questions/67629296


            ARMeshAnchor – SceneKit SCNView Renderer EXC_BAD_ACCESS
            Asked 2021-Apr-21 at 00:26

            I'm converting the ARMeshAnchor data to a mesh using SCNGeometrySource, which works fine, but sometimes (3 times out of 10) I get a bad_access from the SceneKit renderer.




            Answered 2021-Feb-07 at 11:58

            It occurs because ARMeshAnchors constantly update their data as ARKit refines its understanding of the real world. All ARMeshAnchors are dynamic anchors. However, their mesh's subsequent changes are not intended to be reflected in real time.

            If you want to duplicate your ARMeshAnchors collection use the following code:

            Source https://stackoverflow.com/questions/66070916

            Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.



            Install xr3ngine

            Getting up and running requires only a few steps.
            1. Install your dependencies:
               cd path/to/xr3ngine
               yarn install
               Error with mediasoup? Optional: https://mediasoup.org/documentation/v3/mediasoup/installation/
               If on WSL2:
               sudo apt-get update
               sudo apt-get install build-essential
               npm install -g node-gyp
               PYTHON=python3 yarn install
               npm config set python /usr/bin/python
               PYTHON=python3 yarn install
            2. Make sure you have a MySQL database installed and running -- our recommendation is MariaDB. We've provided a Docker container for easy setup:
               cd scripts && sudo bash start-db.sh
               This creates a Docker container of MariaDB named xr3ngine_db. You must have Docker installed on your machine for this script to work. If you do not have Docker installed and do not wish to install it, you'll have to create a MariaDB server manually. The default username is 'server', the default password is 'password', the default database name is 'xr3ngine', the default hostname is '', and the default port is '3306'. Seeing errors connecting to the local DB? Shut off your local firewall.
            3. Have Redis installed and running. Redis must be running in order for feathers-sync to function; it coordinates Feathers actions across servers, e.g. the API server can be notified that the gameserver patched a user. scripts/docker-compose is configured to start a Redis container using Docker. Run docker-compose up from the /scripts directory to build and start it; after that, you can run docker start xr3ngine_redis to restart the Redis container.
            4. Open a new tab and start the Agones sidecar in local mode:
               cd scripts
               sudo bash start-agones.sh
               You can also go to vendor/agones/ and run ./sdk-server.linux.amd64 --local. If you are using a Windows machine, run sdk-server.windows.amd64.exe --local; on a Mac, run ./sdk-server.darwin.amd64 --local.
            5. Obtain a .env.local file with configuration variables. Many parts of XR3ngine are configured using environment variables. For simplicity, it's recommended that you create a file called .env.local in the top level of xr3ngine and put all of your ENV_VAR definitions there in the form <VAR_NAME>=<VALUE>. If you are actively working on this project, contact one of the developers for a copy of the file that has all of the development settings and keys in it.
            6. Start the server in database seed mode. Several tables in the database need to be seeded with default values. Run cd packages/server, then run yarn dev-reinit-db. After several seconds, there should be no more logging. Some of the final lines should read like this:
               Executing (default): SELECT 'id', 'name', 'sceneId', 'locationSettingsId', 'slugifiedName', 'maxUsersPerInstance', 'createdAt', 'updatedAt' FROM 'location' AS 'location' WHERE ('location'.'id' = '98cbcc30-fd2d-11ea-bc7c-cd4cac9a8d61') AND 'location'.'id' IN ('98cbcc30-fd2d-11ea-bc7c-cd4cac9a8d61');
               Seeded
               At this point, the database has been seeded. You can shut down the server with CTRL+C.
            7. Open two or three separate tabs and start the API server (non-seeding), the gameserver (if using gameservers), and the client. In /packages/server, run sudo yarn dev. If you are using gameservers, in another tab go to /packages/gameserver and run sudo yarn dev. In the final tab, go to /packages/client and run sudo yarn dev.
            8. Open a new tab and start the local file server (optional). If your .env.local file has the line STORAGE_PROVIDER=local, then the scene editor will save components, models, scenes, etc. locally (as opposed to storing them on S3). You will need to start a local server to serve these files, and make sure that .env.local has the line LOCAL_STORAGE_PROVIDER="localhost:8642". In a new tab, go to packages/server and run yarn serve-local-files. This will start up http-server to serve files from packages/server/upload on localhost:8642. You may have to accept the invalid self-signed certificate for it in the browser; see 'Allow local file http-server connection with invalid certificate' below.
            9. In a browser, navigate to the client. The database seeding process creates an empty test location called 'test'; it can be navigated to by going to ''. See the sections below about invalid certificates if you are encountering errors connecting to the client, API, or gameserver.
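            The .env.local entries use the simple <VAR_NAME>=<VALUE> form, which is easy to read programmatically. A minimal Python sketch (illustrative only; xr3ngine itself loads its environment through its own tooling):

```python
def parse_env(text):
    """Parse simple VAR=VALUE lines, skipping blank lines and '#' comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

sample = 'STORAGE_PROVIDER=local\nLOCAL_STORAGE_PROVIDER="localhost:8642"\n'
print(parse_env(sample))
```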


            As of this writing, the cert provided in the xr3ngine package for local use is not adequately signed. Browsers will throw up warnings about going to insecure pages. You should be able to tell the browser to ignore it, usually by clicking on some sort of 'advanced options' button or link and then something along the lines of 'go there anyway'. Chrome sometimes does not show a clickable option on the warning. If so, just type badidea or thisisunsafe when on that page. You don't enter that into the address bar or into a text box, Chrome is just passively listening for those commands. For more detailed instructions check: https://github.com/FiloSottile/mkcert.
            Find more information at:


          • CLI

            gh repo clone xr3ngine/xr3ngine
