node-media-server | Runtime Environment library

by schaermu | JavaScript | Version: Current | License: No License

kandi X-RAY | node-media-server Summary

node-media-server is a JavaScript library typically used in Server, Runtime Environment, and Node.js applications. node-media-server has no reported bugs or vulnerabilities and has low support. You can download it from GitHub.


Support

              node-media-server has a low active ecosystem.
              It has 15 star(s) with 0 fork(s). There are 3 watchers for this library.
              It had no major release in the last 6 months.
              node-media-server has no issues reported. There are no pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of node-media-server is current.

Quality

              node-media-server has no bugs reported.

Security

              node-media-server has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.

License

              node-media-server does not have a standard license declared.
              Check the repository for any license declaration and review the terms closely.
              Without a license, all rights are reserved, and you cannot use the library in your applications.

Reuse

              node-media-server releases are not available. You will need to build from source code and install.

            Top functions reviewed by kandi - BETA

kandi's functional review helps you automatically verify the functionality of libraries and avoid rework. It currently covers the most popular Java, JavaScript and Python libraries.

            node-media-server Key Features

            No Key Features are available at this moment for node-media-server.

            node-media-server Examples and Code Snippets

            No Code Snippets are available at this moment for node-media-server.

            Community Discussions

            QUESTION

            Nginx setup for Node Media Server
            Asked 2021-Apr-16 at 15:57

I'm trying to run node-media-server on an EC2 instance, but I am not able to make OBS connect to the server. Here is my Nginx config:

            ...

            ANSWER

            Answered 2021-Apr-16 at 15:57

I found the problem: the first thing is to set up Nginx to listen on port 80 only, as node-media-server takes care of listening on ports 8000 and 1935.
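
For context, node-media-server binds those two ports itself. A minimal sketch of such a configuration, along the lines of the npm module's documented example (the exact values here are illustrative, not taken from the question):

const NodeMediaServer = require('node-media-server');

// RTMP ingest on 1935 (where OBS publishes), HTTP playback endpoints on 8000.
// Nginx then only needs to listen on port 80 for the web page itself.
const config = {
  rtmp: { port: 1935, chunk_size: 60000, gop_cache: true, ping: 30, ping_timeout: 60 },
  http: { port: 8000, allow_origin: '*' }
};

new NodeMediaServer(config).run();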

            Source https://stackoverflow.com/questions/67066361

            QUESTION

            How to stream text and video at the same time?
            Asked 2020-Oct-27 at 12:57

            I have node.js server that uses node-media-server:

            ...

            ANSWER

            Answered 2020-Oct-27 at 12:57

            You can add text to a video in a number of ways - most common are probably:

            • Add a text track to the video container, i.e. the MP4 file. This is usually done server side and the client then uses this info to display it client side. You can see more info here and an example with a commonly used tool: https://www.bento4.com/developers/dash/subtitles/

            • Embed the text in the frames themselves - this requires more processing and also adds the text to the video frames themselves, so you can't turn text on and off at the client easily. If you do want to do this then FFMPEG is probably a good place to start.

            • Add a text overlay on the client itself - e.g. a text 'div' or element on a browser App, or a TextView on Android etc. You mention that synchronisation may be a problem, but you could take timing events from the video to trigger changing the text. This avoids you having to do any extra processing on the video or video container.

A simple example of using timing to trigger text is below. You would likely want to update it to avoid checking everything on each 'onTimeUpdate' event, and maybe to put the text over the video itself, but it gives an example of how the basic mechanism works:
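
The original snippet is not included in this excerpt; as a rough sketch of the same mechanism (element ids and cue times below are hypothetical):

// Assumes markup like <video id="player" src="clip.mp4" controls></video> and <div id="caption"></div>
const cues = [
  { start: 0, end: 5, text: 'Intro' },
  { start: 5, end: 12, text: 'Scene one' }
];
const video = document.getElementById('player');
const caption = document.getElementById('caption');

video.addEventListener('timeupdate', () => {
  // Show the cue (if any) that covers the current playback position
  const cue = cues.find(c => video.currentTime >= c.start && video.currentTime < c.end);
  caption.textContent = cue ? cue.text : '';
});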

            Source https://stackoverflow.com/questions/64535985

            QUESTION

            rtmp nodejs server config on NGINX _ nginx: [emerg] bind() to 0.0.0.0:1935 failed (98: Address already in use)
            Asked 2020-Aug-11 at 09:23

I have a media server running on port 1935.

            and I have a subdomain: "streaming.foo.com"

            I need to configure Nginx to access rtmp://streaming.foo.com

I tried the RTMP Nginx plugin with this blog post:

            https://www.nginx.com/blog/video-streaming-for-remote-learning-with-nginx/

            and configured the /etc/nginx/nginx.conf file.

            ...

            ANSWER

            Answered 2020-Aug-10 at 12:41

            You can't have two services listening on the same port.

The common solution is to configure the real service (your media service) on another port (let's say 19350), then configure a reverse proxy on Nginx to forward requests from the exposed port (1935 in this case) to the back-end service (19350 in our example).

Sorry, but I don't know Nginx (I use this config with Apache), so I can't help you with how to configure a reverse proxy on it.
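
On the node-media-server side, moving the RTMP listener off port 1935 is just a configuration change; a sketch (the port number follows the example above, everything else is illustrative):

const NodeMediaServer = require('node-media-server');

// Run the media server on an internal port (19350) so the Nginx RTMP module
// can own the public port 1935 and forward to this back-end service.
const nms = new NodeMediaServer({
  rtmp: { port: 19350, chunk_size: 60000, gop_cache: true, ping: 30, ping_timeout: 60 },
  http: { port: 8000, allow_origin: '*' }
});
nms.run();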

            Source https://stackoverflow.com/questions/63266253

            QUESTION

            nodejs ffmpeg play video at specific time and stream it to client
            Asked 2020-Mar-11 at 23:15

            I'm trying to make a basic online video editor with nodeJS and ffmpeg.

            To do this I need 2 steps:

1. set the in-and-out times of the videos from the client, which requires the client to view the video at specific times and switch the position of the video. Meaning, if a single video is used as an input and split into smaller parts, it needs to replay from the starting time of the next edited segment, if that makes sense.

2. send the input-output data to nodejs and export it with ffmpeg as a finished video.

At first I wanted to do step 1 purely on the client, then upload the source video(s) to nodeJS, generate the same result with ffmpeg, and send back the result.

But there are many problems with video processing on the client side in HTML at the moment, so now I have a change of plans: to do all of the processing on the nodeJS server, including the video playing.

This is the part I am stuck at now. I'm aware that ffmpeg can be used in many different ways from nodeJS, but I have not found a way to play an mp4/webm video in realtime with ffmpeg, at a specific timestamp, and send the streaming video (again, at a certain timestamp) to the client.

I've seen the pipe:1 attribute from ffmpeg, but I couldn't find any tutorials to get it working with an mp4/webm video, or how to parse the stdout data with nodejs and send it to the client. And even if I could get that part to work, I still have no idea how to play the video, in realtime, at a certain timestamp.

            I've also seen ffplay, but that's only for testing as far as I know; I haven't seen any way of getting the video data from it in realtime with nodejs.

            So:

            how can I play a video, in nodeJS, at a specific time (preferably with ffmpeg), and send it back to the client in realtime?

            What I have already seen:

            Best approach to real time http streaming to HTML5 video client

            Live streaming using FFMPEG to web audio api

            Ffmpeg - How to force MJPEG output of whole frames?

            ffmpeg: Render webm from stdin using NodeJS

            No data written to stdin or stderr from ffmpeg

            node.js live streaming ffmpeg stdout to res

            Realtime video conversion using nodejs and ffmpeg

            Pipe output of ffmpeg using nodejs stdout

            can't re-stream using FFMPEG to MP4 HTML5 video

            FFmpeg live streaming webm video to multiple http clients over Nodejs

            http://www.mobiuso.com/blog/2018/04/18/video-processing-with-node-ffmpeg-and-gearman/

            stream mp4 video with node fluent-ffmpeg

            How to get specific start & end time in ffmpeg by Node JS?

            Live streaming: node-media-server + Dash.js configured for real-time low latency

            Low Latency (50ms) Video Streaming with NODE.JS and html5

            Server node.js for livestreaming

            HLS Streaming using node JS

            Stream part of the video to the client

            Video streaming with HTML 5 via node.js

            Streaming a video file to an html5 video player with Node.js so that the video controls continue to work?

            How to (pseudo) stream H.264 video - in a cross browser and html5 way?

            Pseudo Streaming an MP4 file

            How to stream video data to a video element?

            How do I convert an h.264 stream to MP4 using ffmpeg and pipe the result to the client?

            https://medium.com/@brianshaler/on-the-fly-video-rendering-with-node-js-and-ffmpeg-165590314f2

            node.js live streaming ffmpeg stdout to res

            Can Node.js edit video files?

            ...

            ANSWER

            Answered 2020-Mar-11 at 23:15

            This question is a bit broad, but I've built similar things and will try to answer this in pieces for you:

1. set the in-and-out times of the videos from the client, which requires the client to view the video at specific times and switch the position of the video. Meaning, if a single video is used as an input and split into smaller parts, it needs to replay from the starting time of the next edited segment, if that makes sense.

            Client-side, when you play back, you can simply use multiple HTMLVideoElement instances that reference the same URL.

            For the timing, you can manage this yourself using the .currentTime property. However, you'll find that your JavaScript timing isn't going to be perfect. If you know your start/end points at the time of instantiation, you can use Media Fragment URIs:
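
A small illustration of both approaches (the original example is not included here; URLs and timings below are hypothetical):

// Option A: Media Fragment URI - the browser starts playback at 10s and stops at 20s
const clipA = document.createElement('video');
clipA.src = 'https://example.com/source.webm#t=10,20';
clipA.controls = true;
document.body.appendChild(clipA);

// Option B: manage the window yourself with .currentTime
const clipB = document.createElement('video');
clipB.src = 'https://example.com/source.webm';
clipB.controls = true;
clipB.addEventListener('loadedmetadata', () => { clipB.currentTime = 10; });
clipB.addEventListener('timeupdate', () => { if (clipB.currentTime >= 20) clipB.pause(); });
document.body.appendChild(clipB);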

            Source https://stackoverflow.com/questions/60645390

            QUESTION

            Live streaming: node-media-server + Dash.js configured for real-time low latency
            Asked 2020-Feb-14 at 13:18

            We're working on an app that enables live monitoring of your back yard. Each client has a camera connected to the internet, streaming to our public node.js server.

            I'm trying to use node-media-server to publish an MPEG-DASH (or HLS) stream to be available for our app clients, on different networks, bandwidths and resolutions around the world.

            Our goal is to get as close as possible to live "real-time" so you can monitor what happens in your backyard instantly.

            The technical flow already accomplished is:

1. An ffmpeg process on our server processes the incoming camera stream (a separate child process for each camera) and publishes the stream via RTMP on the local machine for node-media-server to use as an 'input' (we are also saving segmented files, generating thumbnails, etc.). The ffmpeg command responsible for that is:

              -c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office

            2. node-media-server is running with what I found as the default configuration for 'live-streaming'

              ...

            ANSWER

            Answered 2020-Feb-14 at 13:18

            HLS and MPEG DASH are not particularly low latency as standard and the figures you are getting are not unusual.

            Some examples from a publicly available DASH forum document (linked below) include:

            Given the resources of some of these organisations, the results you have achieved are not bad!

            There is quite a focus in the streaming industry at this time on enabling lower latency, the target being to come as close as possible to traditional broadcast latency.

            One key component of the latency in chunked Adaptive Bit Rate (ABR, see this answer for more info: https://stackoverflow.com/a/42365034/334402 ) is the need for the player to receive and decode one or more segments of the video before it can display it. Traditionally the player had to receive the entire segment before it could start to decode and display it. The diagram from the first linked open source reference below illustrates this:

            Low latency DASH and HLS leverage CMAF, 'Common Media Application Format' which breaks the segments, which might be 6 seconds long for example, into smaller 'chunks' within each segment. These chunks are designed to allow the player to decode and start playing them before it has received the full segment.

            Other sources of latency in a typical live stream will be any transcoding from one format to another and any delay in a streaming server receiving the feed, from the webcam in your case, and encoding and packaging it for streaming.

            There is quite a lot of good information available on low latency streaming at this time both from standards bodies and open source discussions which I think will really help you appreciate the issues (all links current at time of writing). From open source and standards discussions:

            and from vendors:

            Note - a common use case often quoted in the broadcast world is the case where someone watching a live event like a game may hear their neighbours celebrating a goal or touchdown before they see it themselves, because their feed has higher latency than their neighbours. While this is a driver for low latency, this is really a synchronisation issue which would require other solutions if a 'perfectly' synchronised solution was the goal.

As you can see, low latency streaming is not a simple challenge, and it may be that you want to consider other approaches depending on the details of your use case, including how many subscribers you have, whether some loss of quality is a fair trade-off for lower latency, etc. As mentioned by @user1390208 in the comments, a more real-time focused video communication technology like WebRTC may be a better match for the solution you are targeting.

If you want to provide a service that offers both live streaming and recordings, you may want to consider using a real-time protocol for the live streaming view and HLS/DASH streaming for anyone looking back through recordings, where latency may not be important but quality may be more key.
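
For reference, the node-media-server npm module can remux an incoming RTMP feed into HLS/DASH through its trans section; a sketch roughly following the shape shown in that module's readme (the ffmpeg path and segmenting flags are assumptions to adapt to your setup):

const NodeMediaServer = require('node-media-server');

const config = {
  rtmp: { port: 1935, chunk_size: 60000, gop_cache: true, ping: 30, ping_timeout: 60 },
  http: { port: 8000, mediaroot: './media', allow_origin: '*' },
  trans: {
    ffmpeg: '/usr/bin/ffmpeg',   // path to a local ffmpeg binary (assumption)
    tasks: [
      {
        app: 'live',
        hls: true,
        hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
        dash: true,
        dashFlags: '[f=dash:window_size=3:extra_window_size=5]'
      }
    ]
  }
};

new NodeMediaServer(config).run();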

            Source https://stackoverflow.com/questions/60152228

            QUESTION

            stop or kill node media server
            Asked 2019-Jul-09 at 12:10

I am trying to implement a stop feature for live video streaming using node-media-server.

Basically I want to stop node-media-server completely and restart it later.

            ...

            ANSWER

            Answered 2019-Jul-09 at 12:10

I see on the project GitHub that there is a stop method.

Have you tried using it?

            https://github.com/illuspas/Node-Media-Server/blob/master/node_media_server.js
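
A rough sketch of how that could look; the stop call is the method referenced in the linked source, while the restart-by-recreating pattern is an assumption:

const NodeMediaServer = require('node-media-server');

const config = {
  rtmp: { port: 1935, chunk_size: 60000, gop_cache: true, ping: 30, ping_timeout: 60 },
  http: { port: 8000, allow_origin: '*' }
};

let nms = new NodeMediaServer(config);
nms.run();

// Stop the media server completely (e.g. from an admin route)
function stopStreaming() {
  nms.stop();
}

// Restart later by creating and running a fresh instance
function restartStreaming() {
  nms = new NodeMediaServer(config);
  nms.run();
}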

            Source https://stackoverflow.com/questions/56949566

            QUESTION

            rtl_fm stream with ffmpeg and low bandwith
            Asked 2019-Feb-13 at 12:30

I am currently trying to stream audio from rtl_fm via ffmpeg to node-media-server.

            This is working fine.

            ...

            ANSWER

            Answered 2019-Feb-13 at 12:30

            As per the rtl_fm guide, -s is the output sampling rate, so you need to adjust that in the ffmpeg input parameter.

            Source https://stackoverflow.com/questions/54669197

            QUESTION

            Re-stream RTSP from IP cam with Node Media Server to http/ws and display it with html
            Asked 2018-Dec-26 at 23:01
            Goal

My goal is to display my IP cam's RTSP output stream on a standard HTML page (html5 + css3 + vanilla javascript, no magic = no plugins). The HTML page should be hosted on an NGINX web server on my Raspberry Pi.

            My equipment

The setup I am using is a Raspberry Pi 3 B+ with Raspbian OS, Node.js and the Node-Media-Server package, NGINX (but I do not believe NGINX is important for my problem; I have not made any config for Node-Media-Server in it anyway), an IP camera, and a browser.

            What I have tried

            The readme in the Node-Media-Server-project is detailed and there is a tutorial describing almost exactly what I want to do. Specifically, there is a markup example on how the live stream could be accessed:

            ...

            ANSWER

            Answered 2018-Dec-26 at 23:01

            As you can see from the configuration, your RTSP stream is pushed to the ‘cctv’ application.

            So your playback address should be:

            rtmp://localhost/cctv/uterum
            or
            http://localhost:8000/cctv/uterum.flv
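
To display that stream on a plain HTML page, the HTTP-FLV address can be played with a client-side library such as flv.js; a minimal sketch, assuming the flv.js script and a video element with id "player" are already on the page:

// Assumes flv.js is loaded via a script tag and <video id="player"></video> exists
if (flvjs.isSupported()) {
  const player = flvjs.createPlayer({
    type: 'flv',
    isLive: true,                                  // live stream rather than a file
    url: 'http://localhost:8000/cctv/uterum.flv'   // the playback address from the answer
  });
  player.attachMediaElement(document.getElementById('player'));
  player.load();
  player.play();
}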

            Source https://stackoverflow.com/questions/53934702

            QUESTION

            FLV Video packets sent over rtmp streamed with ffmpeg vs OBS
            Asked 2018-Sep-15 at 03:40

I'm using the node-media-server npm module to host my RTMP server. I captured the video packets from the server and noticed that video packets streamed with ffmpeg -f gdigrab -offset_x 1920 -framerate 60 -video_size hd1080 -i desktop -crf 0 -preset ultrafast -f flv rtmp://localhost start with

            • 2200 0085 or
            • 2200 0084 and
            • 1200 0085, and
            • the very first packet starts with 1200 0084.

            And when I stream to my rtmp server with OBS, I capture video packets that start with

            • 2701 0000 0000 00 and
            • the very first packet starts with 1701 0000 0000.

What I'm trying to do is capture these packets, store them, and send them to 'players' when they connect to my server. I got it working with the packets captured from the ffmpeg stream AND with video converted to FLV format with ffmpeg.

However, the players don't play the video packets streamed with OBS, even though playback works fine when my rtmp server is just 'relaying' what it receives instead of 'replaying' the captured packets. The audio, on the other hand, plays nicely.

I would like to know what those starting hexadecimals represent (and whether they indicate that OBS is not using the FLV file format).

            ...

            ANSWER

            Answered 2018-Sep-15 at 03:40

            (1)

            "...And when I stream to my RTMP server with OBS, I capture video packets that start with

            • 27 01 00 00 00 00 00 and

            • the very first packet starts with 17 01 00 00 00 00

            I would like to know what those starting hexadecimals represent (whether it indicates that OBS is not using the FLV file format)."

Those byte values are correct for the FLV format (see the "Video encoding" section under FLV Structure).

            Let's say packets begin with byte XY 01 00 00...

            • X is frame type... X == 1 for Keyframes (I-frame), and X == 2 for supporting P/B-frames.

            • Y is codec type... Y == 7 for codec H.264 (MPEG).

            You will notice that in your FFmpeg generated FLV the Y codec type is 2. By default FFmpeg outputs FLV with Sorenson Spark codec (which has low picture quality).
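
As an aside (not from the original answer), decoding that first byte takes only a few lines:

// Decode the first byte of an FLV VideoData tag
function decodeFlvVideoTagHeader(firstByte) {
  const frameType = (firstByte >> 4) & 0x0f; // 1 = keyframe, 2 = inter frame
  const codecId = firstByte & 0x0f;          // 7 = AVC/H.264, 2 = Sorenson H.263
  return { frameType, codecId };
}

console.log(decodeFlvVideoTagHeader(0x17)); // { frameType: 1, codecId: 7 } - keyframe, H.264 (OBS)
console.log(decodeFlvVideoTagHeader(0x27)); // { frameType: 2, codecId: 7 } - inter frame, H.264
console.log(decodeFlvVideoTagHeader(0x12)); // { frameType: 1, codecId: 2 } - keyframe, Sorenson H.263 (FFmpeg's FLV default)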

            To get FFmpeg to output H264 inside FLV use -c:v libx264, example:

            Source https://stackoverflow.com/questions/52285075

Community Discussions and Code Snippets contain sources from the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install node-media-server

            You can download it from GitHub.

            Support

For any new features, suggestions, or bugs, create an issue on GitHub. If you have any questions, check and ask on the Stack Overflow community page.

CLONE

• HTTPS: https://github.com/schaermu/node-media-server.git
• CLI: gh repo clone schaermu/node-media-server
• SSH: git@github.com:schaermu/node-media-server.git
