Video-Chat | Video calling and chatting app | Frontend Framework library
kandi X-RAY | Video-Chat Summary
Video calling and chatting app (PWA) built using React.js, WebRTC and Socket.io
Community Discussions
Trending Discussions on Video-Chat
QUESTION
We have built a front end with React and a back end with Django REST Framework and Channels. We are using Heroku Redis as our Redis provider. Our users connect to Channels via a ReconnectingWebSocket.
We are using Python 3.6 and Channels 2.4.
The issue is that our API calls attempt to pass info to the sockets, but the messages don't always make it to the consumer. I logged each step of the call via prints and printed the channel_name it was about to send to, confirming it matched what was returned to the user on connect, but the prints in the consumer never fire, meaning the message is never sent to the user.
If I increase the number of dynos to roughly 1-1 with the users connected to sockets, the problem seems to go away (or at least becomes much less frequent). From my understanding, one dyno should be able to handle many socket connections. Is there a reason my consumer is not receiving the signals? Is there a reason scaling up the number of dynos resolves the problem?
On connect, I have the user join a group called "u_{their id}" to allow for potentially sending the signals to multiple computers logged in as the same user. I have tried sending the message through their channel_name directly and through that group; when messages aren't going through, neither route works. The prints verify the channel_names are correct, yet the consumer still doesn't receive the messages. There don't seem to be any errors occurring. It may not work, then I'll refresh the recipient and it'll work, then I'll refresh the recipient again and it's back to not working.
The socket connection is certainly alive: I made a simple function on the front end that pings the socket, and it responds even when the consumer isn't getting signals from API calls.
I also notice that if I restart my dynos, when they load up and the sockets reconnect, the first user's signals work through API calls for a short time, then they stop coming through again. Also, if I don't use the sockets for a while and then refresh, they seem to start working briefly again.
Procfile
...ANSWER
Answered 2021-Mar-18 at 20:10
The issue ended up being Redis. I converted from channels-redis to channels-rabbitmq and all of my issues went away. I don't know whether the problem was with my Redis provider or with channels-redis, but simply changing the backend resolved everything.
QUESTION
My video chat app works properly on the same network; it generates and connects ICE candidates using STUN as well. But for some reason the peers' videos don't play across different networks, and I am having trouble debugging the issue.
I haven't used any TURN server, but I doubt that is the problem, since peers from different networks already join using STUN; only the videos don't play.
...ANSWER
Answered 2021-Jan-25 at 23:18
WebRTC can connect in a few ways, and falls back progressively to lower-preference choices as its first choices fail.
- Naive direct p2p using the peers' own IPs
- If that fails, use a STUN server to determine what public IP (e.g., the router's) we're behind
- If that fails, true p2p is not possible; use a TURN server instead to relay traffic
WebRTC tries everything it can to make a p2p connection, but there are times when it will fail. The TURN server acts as a last resort so that both peers can connect through it. Obviously this is not a p2p connection, so there will be extra latency, and you will have to make sure your TURN server has enough bandwidth to cover all of the connections you expect.
Usually about 20% of connections require a TURN server. It may work fine for you on your network, but try accessing your WebRTC service from a different network with a firewall and a different configuration (which will usually require TURN), and you'll see that not all connections are equal when it comes to p2p. So basically this is what is happening to you: the peers are on different networks, so you are not getting the peers' video.
Because the peers are on different networks, correct ICE candidate exchange becomes harder, so the media transmission that was negotiated during the SDP exchange never happens. You need a common public server (a TURN server) that acts as a peer to the other peers and relays the media between them. I am sure that in your case the audio, along with the video, is not working for the same reason.
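To make the TURN fallback concrete, here is a minimal sketch of an RTCPeerConnection ICE configuration. The helper name buildIceConfig and the TURN URL and credentials are placeholders of my own; point them at your own TURN server (e.g. a coturn instance):

```javascript
// Builds an ICE configuration with a STUN server for public-IP discovery
// and a TURN server as the relay fallback. All TURN values are placeholders.
function buildIceConfig(turnUrl, username, credential) {
  return {
    iceServers: [
      { urls: 'stun:stun.l.google.com:19302' }, // step 2: discover our public IP
      { urls: turnUrl, username, credential },  // step 3: relay when p2p fails
    ],
  };
}

const config = buildIceConfig('turn:turn.example.com:3478', 'user', 'secret');
// In the browser: const pc = new RTCPeerConnection(config);
```

Adding `iceTransportPolicy: 'relay'` to the configuration forces all traffic through the TURN server, which is a quick way to verify the relay path works before relying on it as a fallback.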
QUESTION
I have a website developed with Symfony and I want to add a live video chat feature. I've read that the best way to do this is to use WebRTC and Socket.io on Node.js. So I created a little project on the side and followed this recommendation: https://dev.to/jeffersonxavier/webrtc-a-simple-video-chat-with-javascript-1686 It works!
Now, when I want to integrate it with my website, I run the Symfony app on port 8000 and the Node server on port 3000.
So I replaced
...ANSWER
Answered 2021-Jan-18 at 19:04
I have used a library called PeerJS to utilize WebRTC in my projects (video/audio/chat). You should look into how WebRTC works in the real world; the basic idea is that you need a STUN server to act as a middle man to set up the peer-to-peer connection. After the setup is done, the peers talk to each other directly.
I highly suggest you use PeerJS; I was able to set up my workflow in about 2 hours thanks to the high-level nature of the API.
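A rough sketch of that high-level PeerJS flow. The function names startCall and answerCalls are illustrative; the Peer instance, the local MediaStream, and the video element are passed in by the caller:

```javascript
// Hypothetical wiring for PeerJS media calls. `peer` is a PeerJS Peer
// instance, `localStream` a MediaStream (e.g. from getUserMedia), and
// `remoteId` the other user's peer ID.
function startCall(peer, remoteId, localStream, remoteVideoEl) {
  const call = peer.call(remoteId, localStream); // dial the other peer
  call.on('stream', remoteStream => {            // their media has arrived
    remoteVideoEl.srcObject = remoteStream;
  });
  return call;
}

function answerCalls(peer, localStream, remoteVideoEl) {
  peer.on('call', call => {
    call.answer(localStream);                    // send our media back
    call.on('stream', remoteStream => {
      remoteVideoEl.srcObject = remoteStream;
    });
  });
}
```

In the browser you would create the peer with `const peer = new Peer()` and obtain `localStream` from `navigator.mediaDevices.getUserMedia({ video: true, audio: true })`.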
QUESTION
I'm trying to embed a 1080p video chat stream into a Unity app on Android.
I've tried using Agora.io for this purpose, but the current version of Agora for Unity runs entirely on the CPU, and copying a 1080p texture from CPU to GPU takes 20ms on my Android device - too slow to maintain a consistent 60fps framerate in my app. Ideally, I either need the copy operation to not block the render thread, or I need the copy to happen in under 10ms, or I need the decoding to happen entirely in hardware.
I've done some digging into how other video player apps achieve this, and in general they use Android MediaCodec or ExoPlayer to decode video directly to a texture in hardware. However, this doesn't seem feasible for a video chat app because the process of encoding a video stream to one of the Android-supported formats would introduce too much latency.
I'm curious as to how apps like Skype and Hangouts achieve this - but I suspect they either run at 30fps, or they limit their video resolution to 720p or lower.
I've also considered creating my own video-chat protocol using something like Basis texture compression to decompress textures on GPU rather than CPU, but there is very little information on how to compile and integrate Basis into an Android application.
So my question is, does anybody know of a video chat SDK that offers this kind of performance & fidelity without an excessive amount of development work?
...ANSWER
Answered 2020-Jun-19 at 07:56
As it turns out, my profiling was totally wrong. The texture copy was not the hold-up at all; in fact, the texture decode and copy-to-GPU were taking about 3ms. My profiler was just reporting incorrect timings.
The real bottleneck was that I was trying to render the image onto a 250K-triangle surface. It turns out that under normal circumstances my Android device could only render a 100K-triangle surface using GLES. However, after switching to Vulkan, ensuring triangles had connected edges (drastically shrinking the mesh index buffer), and performing some minor mesh optimization, I was able to increase my triangle budget to 400K and achieve a steady 60fps frame rate with a 1080p 30fps Agora video chat stream.
QUESTION
I've been following Agora's tutorial for building an augmented reality video chat app: https://www.agora.io/en/blog/video-chat-with-unity3d-the-arfoundation-version/. When I build it onto my iPhone, the cube on which the video plays remains blank. Frustratingly, if I then remove this line of code: mRtcEngine.EnableLocalVideo(false); the videos play but the AR camera freezes! Is there a way I can have both?
I'm using Unity version 2019.3.1
...ANSWER
Answered 2020-May-15 at 21:56
It seems you cannot use two cameras at once. See this SO response from an Agora.io dev (or someone who appears to be one): Can we use Agora video using Unity AR Foundation with simultaniously using back and front camera
QUESTION
I downloaded the Agora.io Video SDK asset from the Asset Store and imported it into a new project. In the demo "SceneHome" scene, I entered the API ID. I clicked Play, and as soon as I click the "join" button, Unity crashes. As far as I can tell, the crash happens on the "app.join(field.text);" line in the TestHome.cs script (line #86).
I tested it in 2019.3.2f1 and 2020.1.0b5; the result was the same. The OS is Catalina 10.15.4.
The demo works on Windows.
I followed this tutorial: https://medium.com/@jake_agora.io/mac-run-video-chat-within-your-unity-application-e001091db62f but used x86_64 dlls instead of x86
Does anyone know what this is about? Or where should I begin to look?
...ANSWER
Answered 2020-Apr-27 at 11:17
As herve nau pointed out, the problem was that Unity did not have permission to use the camera or microphone, and that solution should work. Alternatively, here is another way to add the permission, as described by launzone:
1) Disable SIP: go into recovery mode (hold CMD+R when you restart your Mac). Don't be afraid, we are not doing anything crazy.
2) After that, open Terminal (it should be accessible from one of the menus at the top), type in "csrutil disable" and hit enter. Then reboot your Mac normally.
3) Open Terminal and type in "sqlite3 ~/Library/Application\ Support/com.apple.TCC/TCC.db" and hit enter.
4) For microphone access, type in "INSERT INTO access VALUES('kTCCServiceMicrophone','com.unity3d.unityhub',0,1,1,NULL,NULL,NULL,'UNUSED',NULL,0,1541440109);" and hit enter.
5) For camera access, type in "INSERT INTO access VALUES('kTCCServiceCamera','com.unity3d.unityhub',0,1,1,NULL,NULL,NULL,'UNUSED',NULL,0,1541440109);" and hit enter.
6) Check in System Preferences > Security & Privacy: Unity Hub should now show up under both mic and camera.
7) Reboot into recovery mode again (CMD+R), open Terminal again, and type in "csrutil enable" and hit enter, to re-enable SIP.
8) Reboot normally and enjoy!
Here is the full thread. I hope it helps someone :)
QUESTION
I have implemented Twilio Video in Angular using this tutorial: https://www.twilio.com/blog/video-chat-app-asp-net-core-angular-twilio
It is worth mentioning that I ran into a problem earlier where I wasn't able to show or receive any video in the Safari browser. I fixed that issue by downgrading my zone.js version to 1.0.82.
Now that the video displays correctly on all browsers, including Safari on Mac, I am testing on iOS Safari.
The issue: local video (the video from the iPhone camera) is not showing on iOS Safari. However, remote video does show. And on the other end, both the remote and local video show perfectly.
Twilio video on all browsers except safari iOS
Twilio video on safari iOS
Camera View (html):
...ANSWER
Answered 2019-Oct-20 at 22:04
I finally managed to fix the problem. I will post this for anyone who runs into this issue.
What was the underlying problem? Although laptop, MacBook, and some phone browsers allow you to open multiple webcam captures, the browsers on iPhone don't.
What did I do to solve it? I used the camera which Twilio uses to create or connect to a room. What I was doing wrong was creating a second webcam layer and accessing my webcam from there. (This is a fault in Twilio's code, not mine.) So basically all I did was this:
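The answer's actual snippet is not shown here, but the idea can be sketched as follows. This is a hypothetical reconstruction, not the poster's exact code: the twilio-video module is passed in as Video, and the local preview reuses the track that the room connection already opened, instead of calling getUserMedia a second time:

```javascript
// Connect to a Twilio room and reuse its LocalVideoTrack for the preview,
// so the camera is only opened once (iOS Safari allows a single capture).
async function joinAndPreview(Video, token, roomName, localVideoEl) {
  const room = await Video.connect(token, { name: roomName, video: true, audio: true });
  // localParticipant.videoTracks maps track SIDs to publications;
  // attach the already-open track rather than re-opening the camera.
  room.localParticipant.videoTracks.forEach(publication => {
    localVideoEl.appendChild(publication.track.attach());
  });
  return room;
}
```

The function name joinAndPreview and its parameters are my own; the key point is that only one capture of the camera exists, shared between the room and the preview element.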
QUESTION
E:\React-Projects\video-chat\node_modules\ts-node\src\index.ts:421
    return new TSError(diagnosticText, diagnosticCodes)
           ^
TSError: ⨯ Unable to compile TypeScript:
src/server.ts:1:8 - error TS1259: Module '"E:/React-Projects/video-chat/node_modules/@types/express/index"' can only be default-imported using the 'esModuleInterop' flag
  1 import express, {Application} from "express";
  node_modules/@types/express/index.d.ts:108:1
    108 export = e;
  This module is declared with 'export =', and can only be used with a default import when using the 'esModuleInterop' flag.
src/server.ts:2:8 - error TS1259: Module '"E:/React-Projects/video-chat/node_modules/@types/socket.io/index"' can only be default-imported using the 'esModuleInterop' flag
  2 import socketIO,{Server as SocketIOServer} from "socket.io";
  node_modules/@types/socket.io/index.d.ts:16:1
    16 export = SocketIO;
  This module is declared with 'export =', and can only be used with a default import when using the 'esModuleInterop' flag.
    at createTSError (E:\React-Projects\video-chat\node_modules\ts-node\src\index.ts:421:12)
    at reportTSError (E:\React-Projects\video-chat\node_modules\ts-node\src\index.ts:425:19)
    at getOutput (E:\React-Projects\video-chat\node_modules\ts-node\src\index.ts:553:36)
    at Object.compile (E:\React-Projects\video-chat\node_modules\ts-node\src\index.ts:758:32)
    at Module.m._compile (E:\React-Projects\video-chat\node_modules\ts-node\src\index.ts:837:43)
    at Module._extensions..js (internal/modules/cjs/loader.js:1177:10)
    at Object.require.extensions. [as .ts] (E:\React-Projects\video-chat\node_modules\ts-node\src\index.ts:840:12)
    at Module.load (internal/modules/cjs/loader.js:1001:32)
    at Function.Module._load (internal/modules/cjs/loader.js:900:14)
    at Module.require (internal/modules/cjs/loader.js:1043:19)
[nodemon] app crashed - waiting for file changes before starting...
...ANSWER
Answered 2020-Mar-31 at 12:43
I believe it is because your tsconfig.json file should have esModuleInterop set to true in the compilerOptions section. Hope it helps.
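For reference, the relevant fragment of tsconfig.json would look like this (only the option in question is shown; your other compiler options are unaffected):

```json
{
  "compilerOptions": {
    "esModuleInterop": true
  }
}
```

With esModuleInterop enabled, `import express from "express"` works against the `export =` style declaration files that express and socket.io ship.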
QUESTION
I'm attempting to complete this tutorial and I'm getting this error in the browser console.
...ANSWER
Answered 2020-Feb-17 at 22:40
Check out the tutorial's file list. The part you are referring to is in the socket-connection.ts file, not index.js as the author states.
Or it may be TypeScript abracadabra :)
P.S. You should also remember that "this" is always bound inside a regular function (but not inside an arrow function).
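The P.S. about "this" can be illustrated with a small sketch (the names counter and incLater are illustrative): a regular function gets its own `this` from the call site, while an arrow function inherits `this` from the enclosing scope.

```javascript
const counter = {
  count: 0,
  // Regular method: `this` is bound to whatever is left of the dot.
  incLater() {
    // Arrow function: inherits `this` from incLater, i.e. `counter`.
    return () => { this.count += 1; return this.count; };
  },
};

const inc = counter.incLater();
inc(); // counter.count is now 1
```

Had the returned function been written as `function () { this.count += 1; }`, calling `inc()` standalone would lose the binding to `counter` (and throw in strict mode), which is the kind of surprise the answer is warning about.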
QUESTION
I want to use the video chat from https://websitebeaver.com/insanely-simple-webrtc-video-chat-using-firebase-with-codepen-demo in React. The script worked well when I tried it with plain CSS + JS + HTML (no React syntax), but it doesn't work after I converted it to React.
The script.js file is located at \chatting\public\js\script.js and the Video.js file at \chatting\src\components\Messages\Video.js.
Video.js (I made it so that when I click the button, a modal opens, and then the video call should work):
...ANSWER
Answered 2019-Dec-23 at 06:33
The componentDidMount() lifecycle method is usually used for fetching data from an API, not for importing a js file. Simply import your script.js in Video.js. You should also export the functions you need from script.js; then you can call them in your class by the names you gave them. In the code below I only export showFriendsFace() from script.js and call it in the Video.js constructor:
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported