Direct Rendering
This tutorial will show you how to build a media player application using so-called "direct rendering".
The preferred way to use media players is for your application to embed a video surface (a heavyweight component, most likely an AWT Canvas) and have VLC render the video into that.
Sometimes this is not possible; some of the more common situations are:
- You are using JavaFX - you cannot embed a heavyweight AWT component in a JavaFX scene;
- You are using Java 1.7 or later on macOS - there is no heavyweight AWT toolkit on macOS for any version of Java after 1.6;
- You want to process the video buffer in some way before rendering it (maybe adding lightweight overlays, graphics, performing colour filters and so on);
- You want to display video in e.g. a JMonkeyEngine or JOGL texture;
- You want multiple media players in the same application (this will be covered in a different tutorial, but suffice to say at this point that multiple direct rendering media players in the same application may be more stable than multiple embedded media players).
Direct rendering means that your application renders the video directly into whatever component it wants: a BufferedImage, a PixelWriter, a texture, or whatever.
The difference with an EmbeddedMediaPlayer is that in the embedded case VLC fills a native video buffer and renders it itself, whereas with a DirectMediaPlayer VLC still fills a native video buffer but your application renders (or otherwise processes) it.
Let's Get Started
We create a standard Swing application similar to how we did it before.
Unlike with the EmbeddedMediaPlayerComponent, we do not add any media player component to the user interface. Instead we have a lightweight JPanel instance - it does nothing right now except have an opaque black background and a fixed size. We add our panel to the application frame; this could go anywhere in a layout, but here we simply set it as the content pane for the frame.
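A minimal sketch of this starting point might look like the following (class name, window title and size are illustrative, not from the original tutorial):

```java
import java.awt.Color;
import java.awt.Dimension;
import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.SwingUtilities;

public class DirectRenderingTutorial {

    // Fixed size for the video surface; the buffer format later will match this
    private static final int WIDTH = 720;
    private static final int HEIGHT = 480;

    private final JFrame frame;
    private final JPanel videoSurface;

    public DirectRenderingTutorial() {
        // A plain lightweight panel: opaque black background and a fixed size, nothing else yet
        videoSurface = new JPanel();
        videoSurface.setBackground(Color.black);
        videoSurface.setOpaque(true);
        videoSurface.setPreferredSize(new Dimension(WIDTH, HEIGHT));

        frame = new JFrame("Direct Rendering Tutorial");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setContentPane(videoSurface);
        frame.pack();
        frame.setVisible(true);
    }

    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> new DirectRenderingTutorial());
    }
}
```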
If you run this now, you see a frame with black content.
The DirectMediaPlayer Component
Now that we have our basic application, we create an instance of a DirectMediaPlayerComponent.
Start by adding a new class field:
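For example, something like this (the field name is illustrative; DirectMediaPlayerComponent is the vlcj 3.x class from uk.co.caprica.vlcj.component):

```java
private DirectMediaPlayerComponent mediaPlayerComponent;
```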
Change the constructor to create the component instance:
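A sketch of what this might look like, assuming the vlcj 3.x API (BufferFormatCallback, RV32BufferFormat and the onGetRenderCallback template method); the TutorialRenderCallbackAdapter class is an illustrative name and is shown later:

```java
// Tell VLC the buffer format we want - here a fixed size matching our window
BufferFormatCallback bufferFormatCallback = new BufferFormatCallback() {
    @Override
    public BufferFormat getBufferFormat(int sourceWidth, int sourceHeight) {
        return new RV32BufferFormat(WIDTH, HEIGHT);
    }
};

mediaPlayerComponent = new DirectMediaPlayerComponent(bufferFormatCallback) {
    @Override
    protected RenderCallback onGetRenderCallback() {
        // Template method: supply the render callback that will handle each frame
        return new TutorialRenderCallbackAdapter();
    }
};
```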
There are two new concepts introduced in this code fragment: a bufferFormatCallback parameter used when creating the component, and an overridden template method implementation that returns an instance of a RenderCallback. These concepts will be explained shortly.
We must also remember to clean-up the media player component when our application exits:
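For example, releasing the component from a window listener when the frame is closed (a sketch, using the frame and component fields from above):

```java
frame.addWindowListener(new WindowAdapter() {
    @Override
    public void windowClosing(WindowEvent e) {
        // Release the native resources held by the media player component
        mediaPlayerComponent.release();
        System.exit(0);
    }
});
```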
Buffer Format
You must tell VLC what format you want for the native video buffer, i.e. the chroma (colour) format, the width and height, the number of bit-planes and line-pitches.
In theory you can choose any chroma format that VLC supports, although in practice you need to choose a format that you can either process or render in some way. The default buffer format is RV32, implemented as a predefined class in vlcj as RV32BufferFormat. This buffer format is a 24-bit BGR format with 8 bits of padding (no alpha) in a single plane. This format is provided by vlcj since it is easy to render it into a BufferedImage.
You can use whatever supported format you need; just provide your own implementation of BufferFormat.
Here we use the predefined RV32 format, with a width and height that match the dimensions of our application frame. Resizing will be discussed later.
The callback is provided with sourceWidth and sourceHeight parameters. These are the width and height of the source video; you are free to use these values to set up your buffer, or you can ignore them and have the video scaled to whatever size you want.
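If you prefer to render at the intrinsic video size rather than scaling, the callback could instead use the supplied dimensions. This is just a sketch of the alternative, not what the rest of this tutorial does:

```java
BufferFormatCallback bufferFormatCallback = new BufferFormatCallback() {
    @Override
    public BufferFormat getBufferFormat(int sourceWidth, int sourceHeight) {
        // Size the native buffer to the intrinsic dimensions of the source video
        return new RV32BufferFormat(sourceWidth, sourceHeight);
    }
};
```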
Render Callback
The render callback is invoked to draw each frame of video.
It should go without saying that a render callback needs to execute as quickly as possible.
The implementation we use here simply preallocates in its constructor a buffer that is large enough to hold a single frame of video (in this case we match the size of our application window).
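A sketch of such an adapter, assuming the vlcj 3.x RenderCallbackAdapter (whose constructor takes the pre-allocated int array) and the image and videoSurface fields of the enclosing class shown in the final code below; the class name is illustrative:

```java
private class TutorialRenderCallbackAdapter extends RenderCallbackAdapter {

    private TutorialRenderCallbackAdapter() {
        // Pre-allocate a buffer big enough for one full frame of RV32 video
        super(new int[WIDTH * HEIGHT]);
    }

    @Override
    protected void onDisplay(DirectMediaPlayer mediaPlayer, int[] rgbBuffer) {
        // Copy the frame data into the BufferedImage backing the video surface
        image.setRGB(0, 0, WIDTH, HEIGHT, rgbBuffer, 0, WIDTH);
        // Request a repaint of the panel to show this frame
        videoSurface.repaint();
    }
}
```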
The onDisplay method is invoked for every frame of video. Because we extend RenderCallbackAdapter, the contents of the native video buffer have already been copied to our preallocated array. Now it is a simple matter to copy the contents of our preallocated array to our BufferedImage. Again, we match the dimensions of our application window when we set the image data. The last thing to do is to request that the panel containing our image be repainted, thereby showing this frame of video.
Using the RenderCallbackAdapter is a bit of a short-cut and is likely not the most efficient way to render video. There are, after all, multiple copies of full frames of video data - this is potentially a lot of bytes to move around, especially if you want to play full HD video at a decent frame rate.
You can therefore eschew RenderCallbackAdapter and provide your own implementation of RenderCallback instead. The difference here is that the RenderCallback gets passed the direct pointer to the native video buffer, whereas RenderCallbackAdapter has already copied the contents of this native buffer to a secondary on-heap buffer. So clearly you have the opportunity to remove one memory buffer copy for each frame of video you process.
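A sketch of a bare RenderCallback, assuming the vlcj 3.x signature in which the native buffers are passed as JNA Memory objects along with the negotiated BufferFormat (the exact signature may differ between versions; com.sun.jna.Memory and java.nio.ByteBuffer imports are required):

```java
private class TutorialRenderCallback implements RenderCallback {

    @Override
    public void display(DirectMediaPlayer mediaPlayer, Memory[] nativeBuffers, BufferFormat bufferFormat) {
        // Read pixel data straight from the native buffer, avoiding the extra on-heap copy
        ByteBuffer byteBuffer = nativeBuffers[0].getByteBuffer(0, nativeBuffers[0].size());
        // ... process or render the pixels in whatever way the application requires
    }
}
```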
You can also use the lock and unlock methods on the associated DirectMediaPlayer to ensure proper synchronisation of the native video buffer. This can help you make sure you are not rendering the native buffer at the same time as it is being overwritten by VLC.
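For example, something along these lines (a sketch only, assuming lock and unlock methods as described above; renderFrame is a hypothetical application method):

```java
directMediaPlayer.lock();
try {
    // Safe to read the native video buffer here; VLC will not overwrite it
    renderFrame();
} finally {
    directMediaPlayer.unlock();
}
```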
Painting the Video
Everything is ready now to actually paint the video. To do this, we override the paintComponent method in our video surface panel.
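A sketch of this, assuming the image field used by the render callback; the plain JPanel from earlier is promoted to a small subclass (the class name is illustrative):

```java
private class VideoSurfacePanel extends JPanel {

    @Override
    protected void paintComponent(Graphics g) {
        Graphics2D g2 = (Graphics2D) g;
        // Paint the most recently rendered frame of video
        g2.drawImage(image, null, 0, 0);
    }
}
```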
This is pretty simple: we just paint the buffered image which has been filled with the video data.
You are of course free to do whatever you want in this paint method (as long as it is quick enough). You can overlay text, graphics and so on using standard Java2D painting primitives.
If you are doing something else, like using a PixelWriter in JavaFX or a JOGL texture, instead of painting a panel like this you do whatever you need to do to render the video.
Final Code
Here's the final code for our direct rendering media player:
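A consolidated sketch is given below. It assumes the vlcj 3.x API; the class name, window title and fixed 720x480 size are illustrative choices, not requirements:

```java
import java.awt.Color;
import java.awt.Dimension;
import java.awt.Graphics;
import java.awt.Graphics2D;
import java.awt.event.WindowAdapter;
import java.awt.event.WindowEvent;
import java.awt.image.BufferedImage;

import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.SwingUtilities;

import uk.co.caprica.vlcj.component.DirectMediaPlayerComponent;
import uk.co.caprica.vlcj.player.direct.BufferFormat;
import uk.co.caprica.vlcj.player.direct.BufferFormatCallback;
import uk.co.caprica.vlcj.player.direct.DirectMediaPlayer;
import uk.co.caprica.vlcj.player.direct.RenderCallback;
import uk.co.caprica.vlcj.player.direct.RenderCallbackAdapter;
import uk.co.caprica.vlcj.player.direct.format.RV32BufferFormat;

public class DirectRenderingTutorial {

    private static final int WIDTH = 720;
    private static final int HEIGHT = 480;

    private final JFrame frame;
    private final VideoSurfacePanel videoSurface;
    private final BufferedImage image;
    private final DirectMediaPlayerComponent mediaPlayerComponent;

    public DirectRenderingTutorial() {
        // Image that each frame of video will be copied into
        image = new BufferedImage(WIDTH, HEIGHT, BufferedImage.TYPE_INT_RGB);

        videoSurface = new VideoSurfacePanel();
        videoSurface.setBackground(Color.black);
        videoSurface.setOpaque(true);
        videoSurface.setPreferredSize(new Dimension(WIDTH, HEIGHT));

        // Tell VLC the buffer format we want: RV32 at a fixed size matching the window
        BufferFormatCallback bufferFormatCallback = new BufferFormatCallback() {
            @Override
            public BufferFormat getBufferFormat(int sourceWidth, int sourceHeight) {
                return new RV32BufferFormat(WIDTH, HEIGHT);
            }
        };

        mediaPlayerComponent = new DirectMediaPlayerComponent(bufferFormatCallback) {
            @Override
            protected RenderCallback onGetRenderCallback() {
                return new TutorialRenderCallbackAdapter();
            }
        };

        frame = new JFrame("Direct Rendering Tutorial");
        frame.setContentPane(videoSurface);
        frame.pack();
        frame.addWindowListener(new WindowAdapter() {
            @Override
            public void windowClosing(WindowEvent e) {
                // Clean up native resources on exit
                mediaPlayerComponent.release();
                System.exit(0);
            }
        });
        frame.setVisible(true);
    }

    private void start(String mrl) {
        mediaPlayerComponent.getMediaPlayer().playMedia(mrl);
    }

    // Panel that simply paints the current video frame
    private class VideoSurfacePanel extends JPanel {
        @Override
        protected void paintComponent(Graphics g) {
            Graphics2D g2 = (Graphics2D) g;
            g2.drawImage(image, null, 0, 0);
        }
    }

    // Render callback that copies each frame into the BufferedImage and repaints
    private class TutorialRenderCallbackAdapter extends RenderCallbackAdapter {

        private TutorialRenderCallbackAdapter() {
            super(new int[WIDTH * HEIGHT]);
        }

        @Override
        protected void onDisplay(DirectMediaPlayer mediaPlayer, int[] rgbBuffer) {
            image.setRGB(0, 0, WIDTH, HEIGHT, rgbBuffer, 0, WIDTH);
            videoSurface.repaint();
        }
    }

    public static void main(final String[] args) {
        SwingUtilities.invokeLater(() -> new DirectRenderingTutorial().start(args[0]));
    }
}
```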
If you run this, remember to pass the media you want to play as the first command-line argument.
Resizing the Video
Resizing the video is always the responsibility of the client application.
The video will always be rendered, by VLC, at its intrinsic dimensions. If the source video size is 1920x1080, then the size of the video frame buffer matches that of the source video size.
To implement the resize behaviour, the contents of the video frame buffer must be scaled by whatever rendering system you are using.
So with Java2D you are likely using a BufferedImage in a paint method. To scale the video correctly, you apply an AffineTransform that provides the appropriate scaling before you render the image (and perhaps set a rendering hint for pixel interpolation, e.g. BILINEAR).
You must implement this yourself in your own application.
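A sketch of such a scaled paint method, assuming the image field from the earlier code and that the panel is allowed to resize (java.awt.RenderingHints and java.awt.geom.AffineTransform imports are required):

```java
@Override
protected void paintComponent(Graphics g) {
    Graphics2D g2 = (Graphics2D) g;
    // Smooth the scaled pixels
    g2.setRenderingHint(RenderingHints.KEY_INTERPOLATION, RenderingHints.VALUE_INTERPOLATION_BILINEAR);
    // Scale from the intrinsic video size to the current panel size
    double sx = (double) getWidth() / image.getWidth();
    double sy = (double) getHeight() / image.getHeight();
    AffineTransform transform = AffineTransform.getScaleInstance(sx, sy);
    g2.drawImage(image, transform, null);
}
```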
Summary
This tutorial has introduced so-called "direct" media players and described how they can be used.
We have really just scratched the surface of direct rendering media players and probably demonstrated the least efficient video rendering method. There are plenty of examples on the GitHub project pages for vlcj and vlcj-javafx.