apprtc | Please use the Dockerfile to run your own test/dev instance | Continuous Deployment library
kandi X-RAY | apprtc Summary
NOTE: This project is no longer served as a hosted instance. See the Docker setup for local dev/testing deployment.
Trending Discussions on apprtc
QUESTION
I'm learning WebRTC and I have found its JavaScript documentation quite good. But now I want to work with WebRTC on Android, so I found this page. After adding the libraries I cannot find documentation like there is for JS. There is AppRTC, but I find it confusing rather than expressive (that might be my own problem, I'm not sure).
So where do I go for documentation on the Android WebRTC library? Is there any tutorial or simple reference that is still valid as of 2020? Much of what I have found is 5 or more years old and seems outdated.
...ANSWER
Answered 2020-Jun-01 at 15:25: I found a very useful GitHub project. It is two years old as of 2020, but very informative and simple. Once you get the basics of WebRTC you can dive into the code. It works well in conjunction with the WebRTC Codelab.
QUESTION
We are developing an app using AppRTC. Audio and video calls from iOS to iOS and Android to Android work fine, but whenever we try to call from Android to iOS or iOS to Android, nothing happens after the call is accepted.
We have tried using the same video codec (H264) on both Android and iOS, but the issue still persists.
Any assistance in this matter is highly appreciated.
...ANSWER
Answered 2020-Jul-13 at 09:07: There are a couple of things you can do to solve this issue:
- Make sure you are using https://appr.tc rather than https://apprtc.appspot.com; use https://appr.tc for the latest AppRTC.
- Make sure you use the "H264 Baseline" or "H264 High" video codec on the Android side, since iOS supports only the H264 codec.
- Keep PeerConnectionClient.java on Android in sync with the AppRTC GitHub code.
- Use the latest AppRTC code on both iOS and Android.
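When a cross-platform call dies silently like this, a quick first check is whether both sides' offers actually contain H264. A minimal sketch of inspecting an SDP blob for its offered codecs (the offeredCodecs helper here is ad hoc for illustration, not part of AppRTC):

```java
import java.util.HashSet;
import java.util.Set;

public class SdpCodecCheck {
    // Collect codec names from "a=rtpmap:<payload> <codec>/<clock rate>" lines.
    static Set<String> offeredCodecs(String sdp) {
        Set<String> codecs = new HashSet<>();
        for (String line : sdp.split("\r?\n")) {
            if (line.startsWith("a=rtpmap:")) {
                String desc = line.substring(line.indexOf(' ') + 1); // e.g. "H264/90000"
                codecs.add(desc.split("/")[0]);
            }
        }
        return codecs;
    }

    public static void main(String[] args) {
        String offer = String.join("\n",
                "v=0",
                "m=video 9 UDP/TLS/RTP/SAVPF 96 100",
                "a=rtpmap:96 VP8/90000",
                "a=rtpmap:100 H264/90000");
        System.out.println(offeredCodecs(offer).contains("H264")); // true
    }
}
```

If only one side's offer lists H264, the negotiation falls back to a codec the other side cannot decode, which matches the "nothing happens after accept" symptom.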
QUESTION
The app is based on WebRTC and WebSockets, built with Android Studio 2.3.2 (latest version at the time).
I already use:
the HTTPS protocol, autoplay, and Android 7.0 (minimum supported > Android 5.0). The app works in all supported browsers; only the Android WebView generates an error.
These are the first lines of the error log in Logcat (Android Studio, latest version):
E/chromium: [ERROR:audio_manager_android.cc(264)] Unable to select audio device!
E/cr_VideoCapture: allocate: manager.openCamera: SecurityException: validateConnectLocked:1112: Caller "com.testwebrtc.nikola.myapplication" cannot open camera "1" without camera permission
    at android.hardware.camera2.CameraManager.throwAsPublicException(CameraManager.java:628)
    at android.hardware.camera2.CameraManager.openCameraDeviceUserAsync(CameraManager.java:347)
    at android.hardware.camera2.CameraManager.openCamera(CameraManager.java:450)
    at org.chromium.media.VideoCaptureCamera2.startCapture(VideoCaptureCamera2.java:661)
Another variant of the error:
[ERROR:web_contents_delegate.cc(199)] WebContentsDelegate::CheckMediaAccessPermission: Not supported.
This is the error log from Chrome/WebView (from the getUserMedia errorCallback):
...ANSWER
Answered 2017-May-25 at 15:00: Your app needs to request camera permission before using the camera.
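For the SecurityException above, the camera (and microphone) permissions must be declared in the manifest, and on Android 6.0+ also granted at runtime via ActivityCompat.requestPermissions; for a WebView you must additionally approve the page's request in WebChromeClient.onPermissionRequest. A minimal manifest sketch:

```xml
<!-- AndroidManifest.xml: permissions needed by the WebView/WebRTC code -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
```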
QUESTION
I am using this library: https://bintray.com/google/webrtc/google-webrtc
What I want to achieve (at least at the beginning of my project) is to render video locally. I am using this tutorial (the only one I could find on the Internet): https://vivekc.xyz/getting-started-with-webrtc-for-android-daab1e268ff4. Unfortunately, the last line of code is no longer up to date. The constructor needs a callback which I have no idea how to implement:
localVideoTrack.addRenderer(new VideoRenderer(i420Frame -> {
// no idea what to put here
}));
My code is exactly the same as in the posted tutorial. This is the very first step in getting familiar with WebRTC technology on Android, and I cannot figure it out. My camera is capturing video, because I can see it in my log:
I/org.webrtc.Logging: CameraStatistics: Camera fps: 28.
The main issue is that I have no idea how to pass it to my SurfaceViewRenderer through a callback. Has anyone met this problem? I'll really appreciate any help or suggestions.
Here is the official example app, which is the only other source, but it is done differently from the tutorial and is much more complicated: https://webrtc.googlesource.com/src/+/master/examples/androidapp/src/org/appspot/apprtc
...ANSWER
Answered 2018-May-20 at 15:25: You are right, the API no longer matches the one in the tutorial, but it's close.
VideoTrack used to have an addRenderer(VideoRenderer renderer) method that required you to create a VideoRenderer with the SurfaceViewRenderer as a parameter. That is no longer possible, so instead you should use VideoTrack's addSink(VideoSink sink) method. The SurfaceViewRenderer object implements the VideoSink onFrame(VideoFrame frame) method, which makes this work.
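The fix can be illustrated with runnable stand-ins for the org.webrtc types. VideoTrack, VideoSink, and VideoFrame below are minimal stubs that mirror the real call shape, not the actual library classes; in a real app you simply call localVideoTrack.addSink(surfaceViewRenderer):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal stand-ins for the org.webrtc types, only to show the call shape.
interface VideoFrame {}
interface VideoSink { void onFrame(VideoFrame frame); }

class VideoTrack {
    private final List<VideoSink> sinks = new ArrayList<>();
    void addSink(VideoSink sink) { sinks.add(sink); }   // replaces the removed addRenderer(...)
    void deliverFrame(VideoFrame frame) {               // called by the capture pipeline
        for (VideoSink s : sinks) s.onFrame(frame);
    }
}

public class AddSinkSketch {
    public static void main(String[] args) {
        VideoTrack localVideoTrack = new VideoTrack();
        // In the real API, SurfaceViewRenderer implements VideoSink;
        // a lambda stands in for it here.
        localVideoTrack.addSink(frame -> System.out.println("frame rendered"));
        localVideoTrack.deliverFrame(new VideoFrame() {});
    }
}
```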
QUESTION
The index.html file:
...ANSWER
Answered 2019-Dec-19 at 23:52: This is the JavaScript same-origin policy.
Perhaps you have encountered "access denied" or similar error messages when using JavaScript to interact with iframes. This occurs when the containing document and the iframed document are not from the same domain and they attempt to reference each other's objects.
The same-origin policy is a security feature of JavaScript that prevents access to properties and methods of documents from different domains. However, there are ways to ease or circumvent this restriction.
Read more: dyn-web.com
QUESTION
My question on GitHub: https://github.com/webrtc/apprtc/issues/615. I can't configure apprtc's signaling server: video calls work over WiFi, but I have no luck over the mobile network. Please review my config; I can't find any example of constants.py anywhere. Here is my config:
...ANSWER
Answered 2019-Apr-04 at 15:48: I ran into this error two years ago while configuring apprtc. Just configure the ICE servers like this:
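The answer's code block was not preserved here, but a working override follows the ICE_SERVER_OVERRIDE pattern that AppRTC's own constants.py documents (the hostnames and credentials below are placeholders you must replace with your TURN/STUN details):

```python
# src/app_engine/constants.py
# Replace the placeholder hostnames and credentials with your own servers.
ICE_SERVER_OVERRIDE = [
    {
        "urls": [
            "turn:hostnameForYourTurnServer:19305?transport=udp",
            "turn:hostnameForYourTurnServer:19305?transport=tcp",
        ],
        "username": "TurnServerUsername",
        "credential": "TurnServerCredentials",
    },
    {"urls": ["stun:hostnameForYourStunServer:19302"]},
]
```

The original symptom (works on WiFi, fails on mobile) is typical of a missing or unreachable TURN server, since mobile-network NATs usually require media to be relayed.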
QUESTION
I'm pretty new to the forum and I really appreciate the passion all of you bring to answering questions.
I'm struggling to integrate GoogleWebRTC into the iOS part of my Xamarin Forms project (I have been able to make a native ObjC test app). I tried to integrate it via CocoaPods with Sharpie Pod and bind it, but I was not able to resolve all the errors the binding created in ApiDefinitions.cs.
After some time I found this project, https://github.com/valentingrigorean/apprtc-ios-xamarin, which has successfully bound the library. After some bug fixes (editing WebRTCBinding.csproj, removing the -lstdc++.6 linker flag, and fixing some errors in the code), when I try to add that binding library to my project and run it, it returns:
...ANSWER
Answered 2019-Mar-01 at 23:27: I was also looking for a solution and spent a lot of time getting it running.
Here is what I did, and what helped me with the project you mentioned:
- Add both projects, WebRTC and AppRTC, to your solution.
- Reference both projects from your own project.
- Delete lstdc++.6, as you already did.
- Add both delegates, IARDAppClientDelegate and IRTCEAGLVideoViewDelegate, to the class where you want to use them. As a test, you can add RTCPeerConnectionFactory.InitializeSSL(); to ViewDidLoad, for example, and compile it for both simulator and device.
Does it work, or do you still get errors?
QUESTION
I used the official sample to create an offer SDP in Android Chrome, and in it we can find a=rtpmap:100 H264/90000,
which means it can support H264.
But if I build AppRTC (the official Android sample) with the official prebuilt libraries, version 1.0.25821, call createOffer, and receive the SDP in SdpObserver::onCreateSuccess,
the SDP does not contain H264.
My test device is an Oppo R15 (with MTK Helio P60, Android 8.1).
So why does WebRTC support H264 in Chrome but not in a native application on some Android devices?
...ANSWER
Answered 2018-Dec-16 at 10:28: The Chrome build uses OpenH264, which is not used by regular WebRTC. (There is a variant with the software H.264 encoder from the Chrome build which you may use, but I wouldn't recommend it.)
On Android, WebRTC supports H.264 only if
- the device hardware supports it, AND
- WebRTC's hardware-encoder glue logic supports that hardware encoder. Currently only QCOM and EXYNOS devices are supported, so on any other device, even one with an H.264 hardware encoder, the encoder won't be used, won't be added to the codec factory, and won't appear in the SDP generated by the WebRTC sample apps.
At the Java level you can see this in HardwareVideoEncoderFactory.java, which checks for QCOM and EXYNOS devices in the isHardwareSupportedInCurrentSdkH264 function.
Interestingly, if you are using native code, even the QCOM and EXYNOS hardware encoders are not supported (there is a bug filed on the WebRTC issue tracker). This is because the HW encoding code is tightly integrated with the JNI code, which is definitely not good modular design.
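A toy sketch of the gatekeeping described above. This is a deliberate simplification: the real check in HardwareVideoEncoderFactory.java inspects the device's MediaCodec encoders and SDK version, not just a name substring, but the effect is the same QCOM/EXYNOS allowlist:

```java
public class H264HwCheck {
    // Simplified illustration of the QCOM/EXYNOS gate: only encoders whose
    // names indicate a Qualcomm or Exynos chip are accepted, so other devices
    // (like the MTK Helio P60 in the question) never offer H264 natively.
    static boolean isHardwareH264Allowed(String encoderName) {
        String name = encoderName.toLowerCase();
        return name.contains("qcom") || name.contains("exynos");
    }

    public static void main(String[] args) {
        System.out.println(isHardwareH264Allowed("OMX.qcom.video.encoder.avc")); // true
        System.out.println(isHardwareH264Allowed("OMX.MTK.VIDEO.ENCODER.AVC"));  // false
    }
}
```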
QUESTION
I am working on a video call app using AppRTC, following the libraries mentioned below.
When I change the URL to my custom server instead of the apprtc server, the video call disconnects after 1 minute; I lose the connection with the server.
To avoid losing the connection with the server, we need to ping the server at a regular interval of about 30 seconds.
But the AppRTC project mentioned above uses a jar file (autobahn.jar) for the WebSocket connection, and in that library the sendPing method is private, so it is not accessible.
Question 1: Is there any way to ping the WebSocket server?
Attempt after replacing the WebSocket library: I replaced the WebSocket library with the libraries mentioned below
- https://github.com/Koredotcom/android-kore-sdk/tree/master/BotsSDK/korebotsdklib/src/main/java/kore/botssdk/autobahn
- https://github.com/martindale/soundtrack.io-android/tree/master/src/de/tavendo/autobahn
After replacing the WebSocket library I am now able to access the sendPing method, but I still lose the connection after 60 seconds during a video call.
Ping method:
...ANSWER
Answered 2018-Aug-21 at 10:04: I changed the WebSocket library to https://github.com/crossbario/autobahn-java
This library can automatically ping the server at a regular interval. After adding it, I modified only one class of the AppRTC demo, WebSocketChannelClient.
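Independent of which WebSocket library you pick, the keepalive pattern itself is simple: run a ping action on a fixed schedule while the call is up. A sketch (pingAction stands in for whatever ping or heartbeat call your library exposes):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class KeepAlive {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    // pingAction is a stand-in for your WebSocket library's ping call;
    // for a ~60s idle timeout, an interval around 30s is a safe choice.
    void start(Runnable pingAction, long intervalMillis) {
        scheduler.scheduleAtFixedRate(pingAction, intervalMillis,
                intervalMillis, TimeUnit.MILLISECONDS);
    }

    void stop() {
        scheduler.shutdownNow();
    }
}
```

With autobahn-java you don't need to schedule this yourself, since the library can ping automatically on an interval, which is what this answer relies on.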
QUESTION
I built a Cordova app and want to use apprtc in the app.
Environment:
...ANSWER
Answered 2018-May-17 at 08:58: I used an iframe instead of InAppBrowser; the iframe works well.
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install apprtc
Clone the AppRTC repository
Do all the steps in the Collider instructions, then continue with step 3.
Install and start a Coturn TURN server according to the instructions on the project page.
Open src/app_engine/constants.py and do the following:
If using Google Cloud Engine VMs for Collider: change WSS_INSTANCE_HOST_KEY, WSS_INSTANCE_NAME_KEY and WSS_INSTANCE_ZONE_KEY to the corresponding values for your VM instances, which can be found in the Google Cloud Engine management console.
Else, if using another VM hosting solution: change WSS_INSTANCE_HOST_KEY to the hostname and port Collider is listening on, e.g. localhost:8089 or otherHost:443.
If using TURN and STUN servers directly, either comment out ICE_SERVER_OVERRIDE = None and uncomment the ICE_SERVER_OVERRIDE = [ { "urls": ...] block three lines below, filling in your TURN server details in src/app_engine/constants.py, e.g.
    ICE_SERVER_OVERRIDE = [
        {
            "urls": [
                "turn:hostnameForYourTurnServer:19305?transport=udp",
                "turn:hostnameForYourTurnServer:19305?transport=tcp"
            ],
            "username": "TurnServerUsername",
            "credential": "TurnServerCredentials"
        },
        {
            "urls": [
                "stun:hostnameForYourStunServer:19302"
            ]
        }
    ]
or set the comma-separated list of STUN servers in app.yaml, e.g.
    ICE_SERVER_URLS: "stun:hostnameForYourStunServer,stun:hostnameForYourSecondStunServer"
Else, if using an ICE server provider [1]: change ICE_SERVER_BASE_URL to your ICE server provider's host; change ICE_SERVER_URL_TEMPLATE to a path or empty string, depending on whether your provider has a specific URL path; change ICE_SERVER_API_KEY to an API key or empty string, depending on whether your provider requires one.
    ICE_SERVER_BASE_URL = 'https://appr.tc'
    ICE_SERVER_URL_TEMPLATE = '%s/v1alpha/iceconfig?key=%s'
    ICE_SERVER_API_KEY = os.environ.get('ICE_SERVER_API_KEY')
If running locally using the Google App Engine dev server (for dev/testing), start it with the dev appserver provided by the Google App Engine SDK:
    pathToGcloudSDK/platform/google_appengine/dev_appserver.py out/app_engine/
Else, if running on Google App Engine in the Google Cloud (production): make sure you have a Google Cloud account with Google App Engine enabled; download the Google Cloud SDK and initialize it; then deploy your AppRTC app by executing the following in the out/app_engine directory (you can find [YOUR_PROJECT_ID] and [YOUR_VERSION_ID] in your Google Cloud console):
    gcloud app deploy --project [YOUR_PROJECT_ID] -v [YOUR_VERSION_ID]
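To make the ICE_SERVER_URL_TEMPLATE step concrete: the template is a printf-style format string into which the app substitutes the base URL and the API key. A minimal sketch, using the default values shown above and a made-up key:

```java
public class IceUrl {
    public static void main(String[] args) {
        String base = "https://appr.tc";
        String template = "%s/v1alpha/iceconfig?key=%s"; // same format string as in constants.py
        String apiKey = "yourApiKey";                    // placeholder, not a real key
        // The app fills the template with the provider host and the key:
        String url = String.format(template, base, apiKey);
        System.out.println(url); // https://appr.tc/v1alpha/iceconfig?key=yourApiKey
    }
}
```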