Direct Rendering
This tutorial will show you how to build a media player application using so-called "direct rendering".
The preferred way to use media players is for your application to embed a video surface (a heavyweight component, most likely an AWT Canvas) and have VLC render the video into that.
Sometimes this is not possible; some of the more common situations are:
- You are using JavaFX - you can not embed a heavyweight AWT component in a JavaFX scene;
- You are using Java 1.7 or later on macOS - there is no heavyweight AWT toolkit on macOS for any version of Java after 1.6;
- You want to process the video buffer in some way before rendering it (maybe adding lightweight overlays, graphics, performing colour filters and so on);
- You want to display video in e.g. a JMonkeyEngine, LWJGL, or JOGL texture;
- You want multiple media players in the same application (this will be covered in a different tutorial, but suffice it to say at this point that multiple direct rendering media players in the same application may be more stable than multiple embedded media players).
Direct rendering means that your application renders the video directly into whatever component it wants: a BufferedImage, a PixelWriter, a texture or whatever.
Direct rendering is implemented by the CallbackVideoSurface or the associated CallbackMediaPlayerComponent.
The difference with an EmbeddedMediaPlayer is that in the embedded case VLC fills a native video buffer and renders it itself, whereas with a CallbackVideoSurface VLC still fills a native video buffer but your application renders (or otherwise processes) it.
The direct rendering implementation provided by vlcj-4 has significant improvements over that provided by vlcj-3:
- no longer a separate media player implementation, it is now an intrinsic part of EmbeddedMediaPlayer via the CallbackVideoSurface
- the video buffer is backed by a native DirectByteBuffer rather than a Java ByteBuffer
- the video frame buffer is locked to prevent it from being swapped from GPU to CPU
- more efficient implementation in the example applications (one less full frame copy than before)
- direct audio is also available as an intrinsic part of MediaPlayer
Let's Get Started
We create a standard vlcj application similar to how we did it before, except this time we use CallbackMediaPlayerComponent instead of EmbeddedMediaPlayerComponent.
Using CallbackMediaPlayerComponent hides a lot of implementation details and provides a reasonable default implementation for direct rendering, and its behaviour can be configured to an extent.
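A minimal application along those lines might look like the following sketch. It assumes vlcj-4 on the classpath and a native VLC installation; the class name and the media MRL passed on the command line are illustrative:

```java
import javax.swing.JFrame;
import javax.swing.SwingUtilities;

import uk.co.caprica.vlcj.player.component.CallbackMediaPlayerComponent;

public class DirectRenderingTutorial {

    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Direct Rendering Tutorial");
            frame.setBounds(100, 100, 800, 600);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

            // The no-args constructor provides reasonable defaults, including a
            // Java2D painter that renders each frame into a BufferedImage
            CallbackMediaPlayerComponent mediaPlayerComponent = new CallbackMediaPlayerComponent();
            frame.setContentPane(mediaPlayerComponent);
            frame.setVisible(true);

            // Play the media passed on the command line
            mediaPlayerComponent.mediaPlayer().media().play(args[0]);
        });
    }
}
```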
If the component approach does not suit your needs, you are free to use an EmbeddedMediaPlayer and your own implementation of a CallbackVideoSurface - consult the Javadoc for more details.
Resizing
Resizing the video is always the responsibility of the client application.
The video will always be rendered, by VLC, at its intrinsic dimensions. If the source video size is 1920x1080, then the video frame buffer will also be 1920x1080.
To implement the resize behaviour, the contents of the video frame buffer must be scaled by whatever rendering system you are using.
So with Java2D you are likely using a BufferedImage in a paint method. To scale the video correctly, you apply an AffineTransform that provides the appropriate scaling before you render the image (and perhaps set a RenderingHint for pixel interpolation, e.g. BILINEAR).
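As a concrete sketch using only the standard library (the VideoPainter class and its methods are hypothetical names for illustration, not part of vlcj):

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.geom.AffineTransform;
import java.awt.image.BufferedImage;

public class VideoPainter {

    // Scale factor that fits (imageWidth x imageHeight) inside (width x height),
    // preserving the aspect ratio
    static float fitScale(int imageWidth, int imageHeight, int width, int height) {
        return Math.min((float) width / imageWidth, (float) height / imageHeight);
    }

    // Paint the video image scaled to fit the component, centred, aspect preserved
    static void paintVideo(Graphics2D g2, BufferedImage image, int width, int height) {
        float scale = fitScale(image.getWidth(), image.getHeight(), width, height);
        AffineTransform transform = new AffineTransform();
        transform.translate((width - image.getWidth() * scale) / 2f,
                            (height - image.getHeight() * scale) / 2f);
        transform.scale(scale, scale);
        g2.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                            RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        g2.drawImage(image, transform, null);
    }

    public static void main(String[] args) {
        // A 1920x1080 frame scaled into a 960x720 component fits at half size
        System.out.println(fitScale(1920, 1080, 960, 720));

        BufferedImage frame = new BufferedImage(1920, 1080, BufferedImage.TYPE_INT_RGB);
        BufferedImage target = new BufferedImage(960, 720, BufferedImage.TYPE_INT_RGB);
        Graphics2D g2 = target.createGraphics();
        paintVideo(g2, frame, 960, 720);
        g2.dispose();
    }
}
```

In a real application the paint method would be called on each new frame, with the BufferedImage backed by the video frame buffer.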
The CallbackMediaPlayerComponent provides a number of alternate painting methods (you can choose which one you want when you create the component):
- ScaledCallbackImagePainter, scales the video, preserving the aspect ratio
- FilledCallbackImagePainter, scales to fit the window, ignoring the aspect ratio
- FixedCallbackImagePainter, renders the video at its original size, centered
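For example, a component using the fixed painter might be created like the following sketch, where the null arguments accept the component's defaults (check the exact constructor signature against the Javadoc for your vlcj version):

```java
import uk.co.caprica.vlcj.player.component.CallbackMediaPlayerComponent;
import uk.co.caprica.vlcj.player.component.callback.FixedCallbackImagePainter;

public class PainterChoice {

    static CallbackMediaPlayerComponent createComponent() {
        return new CallbackMediaPlayerComponent(
            null,                             // default MediaPlayerFactory
            null,                             // no full-screen strategy
            null,                             // default input events
            true,                             // lock the native video buffer while painting
            new FixedCallbackImagePainter(),  // original size, centered
            null,                             // default render callback
            null                              // default buffer format callback
        );
    }
}
```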
Alternatives
If you do not want to use CallbackMediaPlayerComponent, then you need to use an EmbeddedMediaPlayer with a CallbackVideoSurface.
You must implement your own BufferFormatCallback and RenderCallback; please see the Javadoc for more details.
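A sketch of that arrangement, assuming vlcj-4 (consult the Javadoc for the exact callback signatures in your version):

```java
import java.nio.ByteBuffer;

import uk.co.caprica.vlcj.factory.MediaPlayerFactory;
import uk.co.caprica.vlcj.player.base.MediaPlayer;
import uk.co.caprica.vlcj.player.embedded.EmbeddedMediaPlayer;
import uk.co.caprica.vlcj.player.embedded.videosurface.CallbackVideoSurface;
import uk.co.caprica.vlcj.player.embedded.videosurface.VideoSurfaceAdapters;
import uk.co.caprica.vlcj.player.embedded.videosurface.callback.BufferFormat;
import uk.co.caprica.vlcj.player.embedded.videosurface.callback.BufferFormatCallback;
import uk.co.caprica.vlcj.player.embedded.videosurface.callback.RenderCallback;
import uk.co.caprica.vlcj.player.embedded.videosurface.callback.format.RV32BufferFormat;

public class CustomCallbackExample {

    public static void main(String[] args) {
        BufferFormatCallback bufferFormatCallback = new BufferFormatCallback() {
            @Override
            public BufferFormat getBufferFormat(int sourceWidth, int sourceHeight) {
                // Ask VLC for 32-bit RGB at the video's intrinsic size
                return new RV32BufferFormat(sourceWidth, sourceHeight);
            }

            @Override
            public void allocatedBuffers(ByteBuffer[] buffers) {
                // Invoked once the native buffers have been allocated, in case you
                // need to prepare anything that depends on the buffer size
            }
        };

        RenderCallback renderCallback = new RenderCallback() {
            @Override
            public void display(MediaPlayer mediaPlayer, ByteBuffer[] nativeBuffers, BufferFormat bufferFormat) {
                // Called for every frame: copy or process nativeBuffers[0] here,
                // then hand the pixels to whatever rendering system you are using
            }
        };

        MediaPlayerFactory factory = new MediaPlayerFactory();
        EmbeddedMediaPlayer mediaPlayer = factory.mediaPlayers().newEmbeddedMediaPlayer();
        mediaPlayer.videoSurface().set(new CallbackVideoSurface(
            bufferFormatCallback,
            renderCallback,
            true,                                         // lock the native buffer during display
            VideoSurfaceAdapters.getVideoSurfaceAdapter() // platform-specific adapter
        ));

        // Play the media passed on the command line
        mediaPlayer.media().play(args[0]);
    }
}
```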