1. Java and FFMPEG
FFMPEG is a library widely used for media processing. The Java world's native ability to handle video is rather weak, so there is a strong demand for calling FFMPEG from Java.
There are several ways for Java to call C code: the original JNI approach, JNA, or simply invoking a command line. The command-line approach is the simplest, but it is quite limited, especially for video processing and analysis, for example when you need to pull out an individual packet and work on it.
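To illustrate the command-line approach, here is a minimal sketch that simply spawns the ffmpeg executable as an external process (it assumes an ffmpeg binary on the PATH; input.ts and output.mp4 are placeholder file names). The Java side only ever sees text output and an exit code, which is exactly the limitation mentioned above:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class FfmpegCommandLine {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Transcode input.ts to output.mp4 by launching the ffmpeg executable.
        ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-y", "-i", "input.ts", "output.mp4");
        pb.redirectErrorStream(true);
        Process p = pb.start();
        // All we can observe from Java is ffmpeg's console output and exit code;
        // there is no access to individual packets or frames.
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.out.println("ffmpeg exited with code " + p.waitFor());
    }
}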
Projects based on JavaCPP
1) JavaCV, aimed at graphics and image processing, with open-source algorithms such as face recognition and augmented reality (AR); a very good project.
Project homepage: https://github.com/bytedeco/javacv
2) JavaAV, project homepage: https://github.com/hoary/JavaAV
2. JavaCPP Presets
To make them easier to use, JavaCPP has a presets subproject, with its homepage at https://github.com/bytedeco/javacpp-presets, which ships prebuilt bindings and binaries for a number of commonly used libraries (a minimal usage sketch follows the list below).
The bundled projects include:
• OpenCV 2.4.9 http://opencv.org/downloads.html
• FFmpeg 2.3.x http://ffmpeg.org/download.html
• FlyCapture 2.6.x http://ww2.ptgrey.com/sdk/flycap
• libdc1394 2.1.x or 2.2.x http://sourceforge.net/projects/libdc1394/files/
• libfreenect 0.5 https://github.com/OpenKinect/libfreenect
• videoInput 0.200 https://github.com/ofTheo/videoInput/tree/update2013
• ARToolKitPlus 2.3.0 https://launchpad.net/artoolkitplus
• flandmark 1.07 http://cmp.felk.cvut.cz/~uricamic/flandmark/#download
• FFTW 3.3.4 http://www.fftw.org/download.html
• GSL 1.16 http://www.gnu.org/software/gsl/#downloading
• LLVM 3.4.2 http://llvm.org/releases/download.html
• Leptonica 1.71 http://www.leptonica.org/download.html
• Tesseract 3.03-rc1 https://code.google.com/p/tesseract-ocr/
The supported platforms are: android-arm, android-x86, linux-x86, linux-x86_64, macosx-x86_64, windows-x86, windows-x86_64.
Note, however, that the prebuilt linux-x86_64 binaries are built on Fedora and cannot be used on CentOS.
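When the prebuilt presets jars for a matching platform are on the classpath, using a bundled library takes very little code, because JavaCPP's Loader extracts and links the bundled native libraries automatically. A minimal sketch, assuming the 2.3-era package layout (org.bytedeco.javacpp.avutil):

import org.bytedeco.javacpp.Loader;
import org.bytedeco.javacpp.avutil;

public class PresetsCheck {
    public static void main(String[] args) {
        // Loader.load() extracts the native libraries bundled in the
        // platform-specific presets jar and links them, so no separate
        // FFmpeg installation is needed when the prebuilt jars are usable.
        Loader.load(avutil.class);
        System.out.println("libavutil version: " + avutil.avutil_version());
    }
}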
3. Building on CentOS
1) Install the JDK
2) Install Apache Maven 2 or 3
3) Install gcc, gcc-c++, and yasm
yum -y install yasm gcc gcc-c++
4) Fetch the source code
git clone http://github.com/bytedeco/javacpp-presets
5) Build ffmpeg
cd javacpp-presets/
./cppbuild.sh -platform linux-x86_64 install ffmpeg
6) Build the JNI bindings and the related jars
mvn install --projects ffmpeg
7) When the build finishes, the relevant .so files are generated under ffmpeg's lib directory, and the corresponding jars under javacpp-presets/targets; a quick smoke test is sketched below.
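A small check that the freshly built jars and .so files load together might look like this (the class name is my own, and it assumes the same org.bytedeco.javacpp.avformat package layout as the presets of this era):

import static org.bytedeco.javacpp.avformat.*;

public class FfmpegBuildCheck {
    public static void main(String[] args) {
        // av_register_all() forces the JNI wrapper and the FFmpeg shared
        // libraries built above to load; an UnsatisfiedLinkError here usually
        // means the .so files are not visible on java.library.path.
        av_register_all();
        System.out.println("libavformat version: " + avformat_version());
    }
}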
4. Testing
1) Copy the relevant .so files to /lib64, or point -Djava.library.path at the directory that contains them.
2) Copy one of the sample programs for testing:
import java.io.*;

import org.bytedeco.javacpp.*;

import static org.bytedeco.javacpp.avcodec.*;
import static org.bytedeco.javacpp.avformat.*;
import static org.bytedeco.javacpp.avutil.*;
import static org.bytedeco.javacpp.swscale.*;

public class Tutorial01 {

    static void SaveFrame(AVFrame pFrame, int width, int height, int iFrame) throws IOException {
        // Open file
        OutputStream stream = new FileOutputStream("frame" + iFrame + ".ppm");

        // Write header
        stream.write(("P6\n" + width + " " + height + "\n255\n").getBytes());

        // Write pixel data
        BytePointer data = pFrame.data(0);
        byte[] bytes = new byte[width * 3];
        int l = pFrame.linesize(0);
        for (int y = 0; y < height; y++) {
            data.position(y * l).get(bytes);
            stream.write(bytes);
        }

        // Close file
        stream.close();
    }

    public static void main(String[] args) throws IOException {
        AVFormatContext pFormatCtx = new AVFormatContext(null);
        int i, videoStream;
        AVCodecContext pCodecCtx = null;
        AVCodec pCodec = null;
        AVFrame pFrame = null;
        AVFrame pFrameRGB = null;
        AVPacket packet = new AVPacket();
        int[] frameFinished = new int[1];
        int numBytes;
        BytePointer buffer = null;

        AVDictionary optionsDict = null;
        SwsContext sws_ctx = null;

        if (args.length < 1) {
            args = new String[] { "/root/test.ts" };
            // System.out.println("Please provide a movie file");
            // System.exit(-1);
        }

        // Register all formats and codecs
        av_register_all();

        // Open video file
        if (avformat_open_input(pFormatCtx, args[0], null, null) != 0) {
            System.exit(-1); // Couldn't open file
        }

        // Retrieve stream information
        if (avformat_find_stream_info(pFormatCtx, (PointerPointer) null) < 0) {
            System.exit(-1); // Couldn't find stream information
        }

        // Dump information about file onto standard error
        av_dump_format(pFormatCtx, 0, args[0], 0);

        // Find the first video stream
        videoStream = -1;
        for (i = 0; i < pFormatCtx.nb_streams(); i++) {
            if (pFormatCtx.streams(i).codec().codec_type() == AVMEDIA_TYPE_VIDEO) {
                videoStream = i;
                break;
            }
        }
        if (videoStream == -1) {
            System.exit(-1); // Didn't find a video stream
        }

        // Get a pointer to the codec context for the video stream
        pCodecCtx = pFormatCtx.streams(videoStream).codec();

        // Find the decoder for the video stream
        pCodec = avcodec_find_decoder(pCodecCtx.codec_id());
        if (pCodec == null) {
            System.err.println("Unsupported codec!");
            System.exit(-1); // Codec not found
        }

        // Open codec
        if (avcodec_open2(pCodecCtx, pCodec, optionsDict) < 0) {
            System.exit(-1); // Could not open codec
        }

        // Allocate video frame
        pFrame = av_frame_alloc();

        // Allocate an AVFrame structure
        pFrameRGB = av_frame_alloc();
        if (pFrameRGB == null) {
            System.exit(-1);
        }

        // Determine required buffer size and allocate buffer
        numBytes = avpicture_get_size(AV_PIX_FMT_RGB24, pCodecCtx.width(), pCodecCtx.height());
        buffer = new BytePointer(av_malloc(numBytes));

        sws_ctx = sws_getContext(pCodecCtx.width(), pCodecCtx.height(), pCodecCtx.pix_fmt(),
                pCodecCtx.width(), pCodecCtx.height(), AV_PIX_FMT_RGB24,
                SWS_BILINEAR, null, null, (DoublePointer) null);

        // Assign appropriate parts of buffer to image planes in pFrameRGB
        // Note that pFrameRGB is an AVFrame, but AVFrame is a superset
        // of AVPicture
        avpicture_fill(new AVPicture(pFrameRGB), buffer, AV_PIX_FMT_RGB24,
                pCodecCtx.width(), pCodecCtx.height());

        // Read frames and save first five frames to disk
        i = 0;
        while (av_read_frame(pFormatCtx, packet) >= 0) {
            // Is this a packet from the video stream?
            if (packet.stream_index() == videoStream) {
                // Decode video frame
                avcodec_decode_video2(pCodecCtx, pFrame, frameFinished, packet);

                // Did we get a video frame?
                if (frameFinished[0] != 0) {
                    // Convert the image from its native format to RGB
                    sws_scale(sws_ctx, pFrame.data(), pFrame.linesize(), 0, pCodecCtx.height(),
                            pFrameRGB.data(), pFrameRGB.linesize());

                    // Save the frame to disk
                    if (++i <= 5) {
                        SaveFrame(pFrameRGB, pCodecCtx.width(), pCodecCtx.height(), i);
                    }
                }
            }

            // Free the packet that was allocated by av_read_frame
            av_free_packet(packet);
        }

        // Free the RGB image
        av_free(buffer);
        av_free(pFrameRGB);

        // Free the YUV frame
        av_free(pFrame);

        // Close the codec
        avcodec_close(pCodecCtx);

        // Close the video file
        avformat_close_input(pFormatCtx);

        System.exit(0);
    }
}
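This sample opens the input file, finds the first video stream, decodes it, converts each decoded frame to RGB with sws_scale, and writes the first five frames to frame1.ppm through frame5.ppm in the current directory.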
3) Run the sample; the output is:
Input #0, mpegts, from '/root/test.ts':
Duration: 00:03:20.02, start: 0.056778, bitrate: 1455 kb/s
Program 1
Metadata:
service_name : jVideo
service_provider: jTeam
Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p, 960x540 [SAR 1:1 DAR 16:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
Stream #0:1[0x101]: Audio: aac ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 66 kb/s
Original article: http://blog.csdn.net/maoxiang/article/details/38388387