In this article, we will go through the process of RTMP streaming using FFmpeg. For more details about the RTMP streaming protocol, please refer to this article on the basics of RTMP and other low-latency streaming protocols.
What is the RTMP Streaming Protocol?
Real-Time Messaging Protocol (RTMP) is a TCP-based communication protocol. It provides a bidirectional message multiplex service and mainly carries streams of video, audio, and data messages. Adobe Systems created RTMP to stream audio, video, and data efficiently over the internet.
Note: go here to learn more about other low-latency streaming protocols as well.
RTMP is well suited to live broadcasts such as sporting events, concerts, or breaking news, since it maintains a persistent connection and keeps latency to a minimum.
RTMP divides a stream into small chunks and sends them over TCP, which ensures reliable, in-order delivery of the data.
Due to its low latency advantages, RTMP is still frequently used in the initial capture and transmission of live streams despite the development of more contemporary protocols like HLS and MPEG-DASH. Go here to learn more about HLS and MPEG-DASH streaming protocols.
Hence, knowing how to do RTMP streaming using FFmpeg is pretty helpful. If you are interested in other applications of FFmpeg, please check out our Recipes in FFmpeg page.
Steps Involved in RTMP Streaming using FFmpeg
To better understand the complete process of video streaming using RTMP, we can represent the steps involved using the following diagram:
Let’s discuss each of the steps in detail.
Step 1: Encoding
The encoder encodes the audio/video using any supported codecs (what is a video codec?) and transfers the data to the RTMP server using the RTMP protocol.
We will use FFmpeg to encode and send the data to the server. Following is the FFmpeg command to encode and transfer the video to the server.
ffmpeg -re -i crowdrun.mp4 -c:v libx264 -c:a aac -f flv rtmp://localhost/show/stream
Let’s review the meaning of the parameters used in the above command.
-re | It is an input parameter that instructs FFmpeg to read the input at its native frame rate (the framerate of the input video) rather than as fast as possible. It is mostly used for live streaming or with a camera input. |
-i crowdrun.mp4 | It is the input video that we are using for the discussion and can be downloaded from here |
-c:v libx264 | Here we are specifying the encoder to be used for the video as H.264/AVC |
-c:a aac | Here we are specifying the encoder to be used for the audio as AAC |
-f flv | Here we are forcing the output (container) format to FLV, which is the format expected by the RTMP protocol |
rtmp://localhost/show/stream | This is the target RTMP destination for the video. The URL is based on the configuration of the streaming server. |
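As an aside, if you do not have a source file handy, FFmpeg's built-in lavfi test sources can be pushed to the same endpoint. The command below is only a sketch for local testing; it assumes the same server configuration (the show application and stream name) used throughout this article, and the resolution, frame rate, and bitrates are arbitrary choices:
ffmpeg -re -f lavfi -i testsrc2=size=1280x720:rate=30 -re -f lavfi -i sine=frequency=440:sample_rate=44100 -c:v libx264 -preset veryfast -g 60 -c:a aac -b:a 128k -f flv rtmp://localhost/show/stream
Here, testsrc2 generates a moving test pattern and sine generates a test tone, so you can verify the full pipeline without a camera or a source file.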
Running the above command in the terminal, after the streaming server is configured, would look like this:

You will see output like this in your terminal, confirming that the video is being encoded using H.264/AVC and transferred to the server.

Let’s move to the next step.
Step 2: Serving
The RTMP server receives the feed and is responsible for scaling and delivering the content to a large online audience.
We have used the Nginx server with the RTMP module for our test. The server application is installed and configured locally; hence, localhost is used in the URL. It is configured to serve the content to the end user over the same RTMP protocol.
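For reference, here is a minimal sketch of the rtmp block from the nginx.conf we used in our local test. It assumes the nginx-rtmp-module is installed and that the rtmp block sits at the top level of nginx.conf (outside the http block); the application name show is what produces the rtmp://localhost/show/stream URL used above:
rtmp {
    server {
        listen 1935;              # default RTMP port
        chunk_size 4096;
        application show {
            live on;              # accept live streams published to this application
            record off;           # do not write incoming streams to disk
        }
    }
}
Your production configuration will likely differ (authentication, HLS packaging, relays, and so on), but this is enough to accept the FFmpeg publish command shown earlier.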
Step 3: Consuming
The last step of the process is to receive the content and consume it on the end user's device. Since the server is configured to use the RTMP protocol for serving, we will again use the same protocol to demonstrate this process. However, many alternate HTTP-based protocols, such as HLS, are commonly used for delivery due to their advantages (cost, scalability, ease of use, etc.).
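For example, a command along these lines could pull the incoming RTMP feed and repackage it as HLS without re-encoding; this is only a sketch, and the segment duration, playlist length, and output path are arbitrary choices:
ffmpeg -i rtmp://localhost/show/stream -c copy -f hls -hls_time 4 -hls_list_size 6 playlist.m3u8
Since the stream is already H.264/AAC, -c copy avoids a second encode; the resulting playlist.m3u8 and its segments can then be served over plain HTTP.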
As discussed in the previous sections, the target address we used to publish the video stream was: rtmp://localhost/show/stream
We can use the VLC player, which supports RTMP, to demonstrate the playback of the video.

Use the network stream option in VLC and add the RTMP URL. Now, you can see that the stream is being played back in the VLC Player. This confirms that you have a working RTMP streaming setup on your local machine.
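Alternatively, if you prefer the command line, ffplay (which ships with FFmpeg) can play the same stream; this assumes the server and stream name configured earlier:
ffplay rtmp://localhost/show/stream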

Conclusion
In this article, we learned about RTMP streaming using FFmpeg. This opens up a world of opportunities as one can now stream to different video streaming platforms like Facebook and YouTube.
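Publishing to such a platform typically just means replacing the target URL with the platform's RTMP ingest endpoint plus your stream key. As an illustration, a YouTube Live publish command would look roughly like this; your-stream-key is a placeholder, and you should confirm the exact ingest URL in your platform's dashboard:
ffmpeg -re -i crowdrun.mp4 -c:v libx264 -c:a aac -f flv rtmp://a.rtmp.youtube.com/live2/your-stream-key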
I hope you found this helpful, and do come back and read the rest of the articles on OTTVerse.
Dear OTTVerse,
There is a typing error in the command,
the ‘-c:v aac’ should be ‘-c:a aac’ (for audio codec specification).
Oops – thank you for catching it. Fixed.
Hi,
It would be helpful to provide more info about the Nginx RTMP server setup which provides the URL “rtmp://localhost/show/stream” in this article.
We will do that in the next series. Thanks for the suggestion.
I am working with FFmpeg and RTMP camera streaming in a constrained environment. I have an RTMP server running on a cloud instance (say, e.g., 11.23.123.13, on port 1935 for RTMP and 4936 for RTMPS). However, is there any port on the sending end that I should request the team to open, to avoid the stream being blocked between the device end (where the camera streams from) and the RTMP server end? Only port 443 is open on the device where the camera is connected and the FFmpeg commands are running.
It would be helpful if somebody could give insights on how the network connection is established from the FFmpeg encoder to the RTMP server, and which ports need to be open on the device end (where the camera is connected) to start FFmpeg streaming to the RTMP server running in the cloud.