Node-Media-Server | Node.js implementation of RTMP/HTTP-FLV/WS-FLV/HLS/DASH/MP4 | Video Utils library
kandi X-RAY | Node-Media-Server Summary
A Node.js implementation of RTMP/HTTP-FLV/WS-FLV/HLS/DASH/MP4 Media Server
Trending Discussions on Node-Media-Server
QUESTION
I have a web application I am working on that allows the user to stream video from their browser and simultaneously livestream to both YouTube and Twitch using ffmpeg. The application works fine when I don't need to send any of the audio. Currently I am getting the error below when I try to record video and audio. I am new to using ffmpeg, so any help would be greatly appreciated. Here is also my repo if needed: https://github.com/toshvelaga/livestream
Here is my node.js server with ffmpeg
...
ANSWER
Answered 2021-Jul-26 at 17:51

So I got the audio to work after a little bit of trial and error with ffmpeg. Not sure if this is the optimal approach, but it works for the time being.
Here is also the full file: https://github.com/toshvelaga/livestream/blob/main/server/server.js
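The answer's actual command lives in the linked file rather than here, so as a rough sketch only: a Node process typically spawns ffmpeg with explicit audio flags (AAC is the usual choice for RTMP/FLV) and pipes the browser's media chunks into stdin. The ingest URL, stream key, and the WebSocket feed below are placeholders, not the answer's exact setup.

```javascript
const { spawn } = require('child_process');

// Sketch (not the answer's exact command): relay browser media to an RTMP
// ingest, encoding video as H.264 and audio as AAC.
const ffmpeg = spawn('ffmpeg', [
  '-i', '-',                 // read muxed audio+video from stdin
  '-c:v', 'libx264',         // H.264 video
  '-preset', 'veryfast',
  '-c:a', 'aac',             // AAC audio (FLV/RTMP-compatible)
  '-ar', '44100',            // audio sample rate
  '-b:a', '128k',            // audio bitrate
  '-f', 'flv',               // FLV container for RTMP
  'rtmp://a.rtmp.youtube.com/live2/YOUR_STREAM_KEY', // placeholder key
]);

// ffmpeg logs progress on stderr; surface it for debugging.
ffmpeg.stderr.on('data', (d) => console.log(d.toString()));

// Feed incoming browser chunks into ffmpeg, e.g. from a WebSocket:
// socket.on('message', (chunk) => ffmpeg.stdin.write(chunk));
```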
QUESTION
I have trouble hosting my RTMP server on a DigitalOcean droplet. I have two Node applications: one is just an API written with Hapi.js that runs on port 8000.
The second one is a node-media-server app running on port 8888 for HTTP and 1935 for RTMP, which I have integrated as a Hapi.js plugin, but they run as separate processes. I use Nginx as a reverse proxy to pass requests to the Node apps, and everything works fine. All the endpoints provided by node-media-server work.
But I can't figure out a way to reach my node-media-server's port 1935 and send an RTMP stream to it.
On localhost I use OBS like this: rtmp://localhost:1935/live/{stream_key}, but the same doesn't work for the hosted app.
Please show me a way to get the stream from my OBS to the server.
Maybe I could use the nginx-rtmp module to receive the stream and just push it to my node-media-server app on the server...
/etc/nginx/sites-available/default
ANSWER
Answered 2021-Jun-26 at 19:45

After some research, I got the solution. The solution was painfully obvious.
The node-media-server app listens for RTMP on port 1935, so the natural solution is to configure the firewall to allow TCP connections through port 1935. For an Ubuntu 18.04 Droplet the following does the trick.
First, find your port with lsof -i :1935, then allow TCP connections over it with sudo ufw allow 1935/tcp. If node-media-server is running, congrats! You can now use OBS like this: rtmp://your_ip:1935/live/stream_key
Note: watch which host your app runs on. For me localhost worked, but with some Droplet configurations you might need to set it to 0.0.0.0.
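To sanity-check the firewall change from a machine outside the droplet, a small Node probe can confirm that port 1935 now accepts TCP connections. This is a sketch; 'your_ip' is a placeholder for the droplet's address.

```javascript
const net = require('net');

// Sketch: run from outside the droplet. A timeout usually means the
// firewall is still blocking the port. 'your_ip' is a placeholder.
const socket = net.connect({ host: 'your_ip', port: 1935, timeout: 5000 });
socket.on('connect', () => { console.log('RTMP port is reachable'); socket.end(); });
socket.on('timeout', () => { console.log('Timed out - check ufw rules'); socket.destroy(); });
socket.on('error', (err) => console.log('Connection failed:', err.message));
```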
QUESTION
I'm trying to run node-media-server on an EC2 instance, but I am not able to make OBS connect to the server. Here is my Nginx config:
...
ANSWER
Answered 2021-Apr-16 at 15:57

I found the problem. The first thing is to set up Nginx to listen on port 80 only, as node-media-server takes care of listening on ports 8000 and 1935.
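For context, a minimal node-media-server setup along the lines of the example configuration in the project README shows why those two ports are already taken by the Node process, leaving nginx to serve only regular web traffic on port 80:

```javascript
const NodeMediaServer = require('node-media-server');

// Mirrors the example configuration from the node-media-server README:
// the server itself owns 1935 (RTMP) and 8000 (HTTP-FLV), so nginx should
// not try to listen on either of them.
const config = {
  rtmp: {
    port: 1935,
    chunk_size: 60000,
    gop_cache: true,
    ping: 30,
    ping_timeout: 60,
  },
  http: {
    port: 8000,
    allow_origin: '*',
  },
};

new NodeMediaServer(config).run();
```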
QUESTION
I have a node.js server that uses node-media-server:
...
ANSWER
Answered 2020-Oct-27 at 12:57

You can add text to a video in a number of ways - most common are probably:
- Add a text track to the video container, i.e. the MP4 file. This is usually done server side, and the client then uses this info to display it client side. You can see more info, and an example with a commonly used tool, here: https://www.bento4.com/developers/dash/subtitles/
- Embed the text in the frames themselves - this requires more processing and also burns the text into the video frames, so you can't easily turn it on and off at the client. If you do want to do this then FFMPEG is probably a good place to start.
- Add a text overlay on the client itself - e.g. a text 'div' or element on a browser app, or a TextView on Android etc. You mention that synchronisation may be a problem, but you could take timing events from the video to trigger changing the text. This avoids you having to do any extra processing on the video or video container.
A simple example of using timing to trigger text is below - you would likely want to update it to avoid checking everything on each 'timeupdate' event, and maybe to put the text over the video itself, but this gives an example of how the basic mechanism works:
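(The answer's original snippet wasn't captured here; the following is a minimal reconstruction of the mechanism it describes, with hypothetical element IDs and cue times.)

```javascript
// Sketch: overlay captions driven by the video's own clock. The element IDs
// and the cue list are made up for illustration.
const video = document.getElementById('myVideo');
const overlay = document.getElementById('captionOverlay');

// Each cue shows its text between start and end times (in seconds).
const cues = [
  { start: 2, end: 5, text: 'First caption' },
  { start: 6, end: 9, text: 'Second caption' },
];

video.addEventListener('timeupdate', () => {
  const t = video.currentTime;
  const active = cues.find((cue) => t >= cue.start && t < cue.end);
  overlay.textContent = active ? active.text : '';
});
```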
QUESTION
I have a media server running on port 1935, and I have a subdomain: "streaming.foo.com".
I need to configure Nginx so I can access rtmp://streaming.foo.com.
I tried the Nginx RTMP module, following this blog post:
https://www.nginx.com/blog/video-streaming-for-remote-learning-with-nginx/
and configured the /etc/nginx/nginx.conf file.
...
ANSWER
Answered 2020-Aug-10 at 12:41

You can't have two services listening on the same port.
The common solution is to configure the real service (your media service) on another port (let's say 19350), then configure a reverse proxy on nginx to forward requests from the exposed port (1935 in this case) to the back-end service (19350 in our example).
Sorry, but I don't know nginx (I use this config with Apache), so I can't help you with how to configure a reverse proxy on it.
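For what it's worth, nginx proxies raw TCP with its stream module rather than the http module. On the node-media-server side, moving the RTMP listener onto the internal port from the answer's example is a small config change; a sketch using the hypothetical 19350:

```javascript
const NodeMediaServer = require('node-media-server');

// Sketch: bind RTMP to the internal port (19350, from the answer's example)
// so a front-end proxy can own the public 1935.
const nms = new NodeMediaServer({
  rtmp: { port: 19350, chunk_size: 60000, gop_cache: true, ping: 30, ping_timeout: 60 },
});
nms.run();
```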
QUESTION
I'm trying to make a basic online video editor with Node.js and ffmpeg.
To do this I need 2 steps:
1. Set the in-and-out times of the videos from the client, which requires the client to view the video at specific times and switch the position of the video. Meaning, if a single video is used as an input and split into smaller parts, it needs to replay from the starting time of the next edited segment, if that makes sense.
2. Send the input-output data to Node.js and export it with ffmpeg as a finished video.
At first I wanted to do step 1 purely on the client, then upload the source video(s) to Node.js, generate the same result with ffmpeg, and send back the result.
But there are many problems with video processing on the client side in HTML at the moment, so now I have a change of plans: do all of the processing on the Node.js server, including the video playing.
This is the part I am stuck at now. I'm aware that ffmpeg can be used in many different ways from Node.js, but I have not found a way to play an .mp4 or .webm video in realtime with ffmpeg, at a specific timestamp, and send the streaming video (again, at a certain timestamp) to the client.
I've seen the pipe:1 option of ffmpeg, but I couldn't find any tutorial that gets it working with an .mp4 or .webm video, parses the stdout data somehow with Node.js, and sends it to the client. And even if I could get that part to work, I still have no idea how to play the video, in realtime, at a certain timestamp.
I've also seen ffplay, but that's only for testing as far as I know; I haven't seen any way of getting the video data from it in realtime with Node.js.
So: how can I play a video in Node.js at a specific time (preferably with ffmpeg), and send it back to the client in realtime?
What I have already seen:
Best approach to real time http streaming to HTML5 video client
Live streaming using FFMPEG to web audio api
Ffmpeg - How to force MJPEG output of whole frames?
ffmpeg: Render webm from stdin using NodeJS
No data written to stdin or stderr from ffmpeg
node.js live streaming ffmpeg stdout to res
Realtime video conversion using nodejs and ffmpeg
Pipe output of ffmpeg using nodejs stdout
can't re-stream using FFMPEG to MP4 HTML5 video
FFmpeg live streaming webm video to multiple http clients over Nodejs
http://www.mobiuso.com/blog/2018/04/18/video-processing-with-node-ffmpeg-and-gearman/
stream mp4 video with node fluent-ffmpeg
How to get specific start & end time in ffmpeg by Node JS?
Live streaming: node-media-server + Dash.js configured for real-time low latency
Low Latency (50ms) Video Streaming with NODE.JS and html5
Server node.js for livestreaming
Stream part of the video to the client
Video streaming with HTML 5 via node.js
How to (pseudo) stream H.264 video - in a cross browser and html5 way?
How to stream video data to a video element?
How do I convert an h.264 stream to MP4 using ffmpeg and pipe the result to the client?
https://medium.com/@brianshaler/on-the-fly-video-rendering-with-node-js-and-ffmpeg-165590314f2
...
ANSWER
Answered 2020-Mar-11 at 23:15

This question is a bit broad, but I've built similar things and will try to answer this in pieces for you:
- set the in-and-out times of the videos from the client, which requires the client to view the video at specific times, and switch the position of the video. Meaning, if a single video is used as an input, and split it into smaller parts, it needs to replay from the starting time of the next edited segment, if that makes sense.
Client-side, when you play back, you can simply use multiple HTMLVideoElement instances that reference the same URL.
For the timing, you can manage this yourself using the .currentTime property. However, you'll find that your JavaScript timing isn't going to be perfect. If you know your start/end points at the time of instantiation, you can use Media Fragment URIs:
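(The answer's example wasn't captured here; the sketch below shows both options with placeholder URLs: manual seeking via .currentTime, and a Media Fragment URI where the browser itself restricts playback to the given range.)

```javascript
// Sketch: two ways to play a specific range client-side. URLs are placeholders.

// 1. Manage timing yourself with .currentTime (JS timing is not frame-exact):
const clipA = document.createElement('video');
clipA.src = '/videos/source.mp4';
clipA.muted = true; // avoid autoplay restrictions in this demo
clipA.addEventListener('loadedmetadata', () => {
  clipA.currentTime = 30; // start at 30s
  clipA.play();
});
clipA.addEventListener('timeupdate', () => {
  if (clipA.currentTime >= 40) clipA.pause(); // stop at 40s
});

// 2. Media Fragment URI: '#t=30,40' asks the browser to play only 30s-40s.
const clipB = document.createElement('video');
clipB.src = '/videos/source.mp4#t=30,40';
clipB.controls = true;

document.body.append(clipA, clipB);
```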
QUESTION
We're working on an app that enables live monitoring of your back yard. Each client has a camera connected to the internet, streaming to our public node.js server.
I'm trying to use node-media-server to publish an MPEG-DASH (or HLS) stream to be available for our app clients, on different networks, bandwidths and resolutions around the world.
Our goal is to get as close as possible to live "real-time" so you can monitor what happens in your backyard instantly.
The technical flow already accomplished is:
An ffmpeg process on our server processes the incoming camera stream (a separate child process for each camera) and publishes the stream via RTMP on the local machine for node-media-server to use as an 'input' (we are also saving segmented files, generating thumbnails, etc.). The ffmpeg command responsible for that is:
-c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office
node-media-server is running with what I found as the default configuration for 'live-streaming'
...
ANSWER
Answered 2020-Feb-14 at 13:18

HLS and MPEG-DASH are not particularly low latency as standard, and the figures you are getting are not unusual.
Some examples from a publicly available DASH forum document (linked below) illustrate the latencies typical of major providers. Given the resources of some of these organisations, the results you have achieved are not bad!
There is quite a focus in the streaming industry at this time on enabling lower latency, the target being to come as close as possible to traditional broadcast latency.
One key component of the latency in chunked Adaptive Bit Rate (ABR, see this answer for more info: https://stackoverflow.com/a/42365034/334402 ) is the need for the player to receive and decode one or more segments of the video before it can display it. Traditionally the player had to receive the entire segment before it could start to decode and display it. The diagram from the first linked open source reference below illustrates this:
Low latency DASH and HLS leverage CMAF, the 'Common Media Application Format', which breaks each segment (which might be 6 seconds long, for example) into smaller 'chunks'. These chunks are designed to allow the player to decode and start playing them before it has received the full segment.
Other sources of latency in a typical live stream will be any transcoding from one format to another and any delay in a streaming server receiving the feed, from the webcam in your case, and encoding and packaging it for streaming.
There is quite a lot of good information available on low latency streaming at this time both from standards bodies and open source discussions which I think will really help you appreciate the issues (all links current at time of writing). From open source and standards discussions:
- https://dashif.org/docs/Report%20on%20Low%20Latency%20DASH.pdf (DASH focus)
- https://github.com/video-dev/hlsjs-rfcs/pull/1 (HLS focus)
and from vendors:
- https://bitmovin.com/cmaf-low-latency-streaming/
- https://websites.fraunhofer.de/video-dev/dash-js-low-latency-streaming-with-cmaf/
- https://aws.amazon.com/blogs/media/alhls-apple-low-latency-http-live-streaming-explained/
Note - a common use case often quoted in the broadcast world is the case where someone watching a live event like a game may hear their neighbours celebrating a goal or touchdown before they see it themselves, because their feed has higher latency than their neighbours. While this is a driver for low latency, this is really a synchronisation issue which would require other solutions if a 'perfectly' synchronised solution was the goal.
As you can see, low latency streaming is not a simple challenge, and it may be that you want to consider other approaches depending on the details of your use case, including how many subscribers you have, whether some loss of quality is a fair trade-off for lower latency, etc. As mentioned by @user1390208 in the comments, a more real-time focused video communication technology like WebRTC may be a better match for the solution you are targeting.
If you want to provide a service that offers both live streaming and recordings, you may want to consider using a real-time protocol for the live streaming view and HLS/DASH streaming for anyone looking back through recordings, where latency may not be important but quality may be more key.
Community Discussions and Code Snippets include sources from the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install Node-Media-Server
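Node-Media-Server is published on npm, so the usual install is: npm install node-media-server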