In a recent project I needed to capture camera images in real time on Mac OS X and convert them into Java BufferedImage objects. The capture itself happens in a native library built on Apple's newer AVFoundation framework, with the camera output pixel format set to kCVPixelFormatType_32ARGB (in my tests the other formats produced no image; the system does not appear to support them). Each frame arrives through a delegate callback as a CMSampleBufferRef, which is converted to a CVImageBufferRef with the CMSampleBufferGetImageBuffer function (this is captured image data, not sampled data, so CMSampleBufferGetDataBuffer cannot be used). The base address of the image buffer is then obtained with CVPixelBufferGetBaseAddress and its size with CVPixelBufferGetDataSize. One important caveat: do not assume this raw data can simply be copied into a Java BufferedImage through its Raster and DataBuffer (assuming TYPE_INT_ARGB here, since it seems to correspond to kCVPixelFormatType_32ARGB). If you try, the colors of the resulting image are completely scrambled. Comparing sizes shows why: for TYPE_INT_ARGB the image data occupies width × height × 4 bytes (each pixel is an int, 4 bytes), but the total size reported by CVPixelBufferGetDataSize was 4 bytes larger than that, so the buffer evidently carries some extra information. Creating a TYPE_INT_ARGB BufferedImage and filling its Raster and DataBuffer directly therefore does not work.
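Whether the extra bytes appear as per-row alignment padding or at the end of the buffer, the robust approach is to treat the value returned by CVPixelBufferGetBytesPerRow as the scanline stride instead of assuming width × 4. The sketch below uses hypothetical numbers to illustrate the per-row-padding case and why a naive copy scrambles the image:

```java
public class StrideCheck {
    public static void main(String[] args) {
        // Hypothetical values as reported by CVPixelBufferGetWidth/Height/BytesPerRow.
        int width = 320, height = 240;
        int bytesPerRow = 1288;          // may exceed width * 4 == 1280 due to alignment padding

        int naiveSize = width * height * 4;     // what a TYPE_INT_ARGB layout assumes
        int bufferSize = bytesPerRow * height;  // what the pixel buffer actually occupies

        System.out.println(naiveSize);   // 307200
        System.out.println(bufferSize);  // 309120
        // Copying bufferSize bytes into a naiveSize layout misaligns every scan line,
        // which scrambles the colors exactly as described above.
    }
}
```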
The BufferedImage has to be built another way: from a DataBufferByte, a ComponentSampleModel, a WritableRaster, a ColorSpace, and a ColorModel. This uses a less common but very efficient BufferedImage constructor. It also requires the number of bytes per scan line, obtained with the CVPixelBufferGetBytesPerRow function, the image height from CVPixelBufferGetHeight, and the image width from CVPixelBufferGetWidth; all three are needed below. The code looks like this:
DataBufferByte dataBufferByte = new DataBufferByte(new byte[][]{dataBytes}, dataSize); // dataBytes holds the image data obtained from the native library; dataSize is its size (always somewhat larger than width * height * bytes-per-pixel)
ComponentSampleModel componentSampleModel = new ComponentSampleModel(DataBuffer.TYPE_BYTE, width, height, 4, bytesPerRow, new int[]{1, 2, 3, 0}); // describes the pixel layout: 4 bytes per pixel, bytesPerRow as the scanline stride, and band offsets mapping R, G, B, A to bytes 1, 2, 3, 0; see the javadoc for the constructor details
WritableRaster writableRaster = Raster.createWritableRaster(componentSampleModel, dataBufferByte, new Point(0, 0)); // create the raster that holds the actual image data
ColorSpace colorSpace = ColorSpace.getInstance(ColorSpace.CS_sRGB); // an sRGB color space
int[] nBits = {8, 8, 8, 8}; // the source data is 32ARGB, i.e. 4 bytes per pixel representing alpha, red, green and blue in sequence
ColorModel colorModel = new ComponentColorModel(colorSpace, nBits, true, false, Transparency.TRANSLUCENT, DataBuffer.TYPE_BYTE); // create the color model (with alpha, not premultiplied)
BufferedImage bufferedImage = new BufferedImage(colorModel, writableRaster, false, null); // build a BufferedImage with this custom pixel layout
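The steps above can be exercised end to end without a camera. The self-contained sketch below fabricates an ARGB buffer with hypothetical dimensions and row padding (in a real program dataBytes, width, height, and bytesPerRow would come from the native capture library), builds the BufferedImage the same way, and reads back a pixel to confirm the band mapping is correct:

```java
import java.awt.Point;
import java.awt.Transparency;
import java.awt.color.ColorSpace;
import java.awt.image.*;

public class PixelBufferDemo {
    // Builds a BufferedImage from a padded 32ARGB byte buffer, mirroring the steps above.
    public static BufferedImage buildImage() {
        // Hypothetical values; a real capture reports them via CVPixelBufferGetWidth/Height/BytesPerRow.
        int width = 4, height = 3;
        int bytesPerRow = width * 4 + 16;   // simulate row alignment padding
        int dataSize = bytesPerRow * height;

        // Synthetic frame: every pixel opaque red (A=255, R=255, G=0, B=0).
        byte[] dataBytes = new byte[dataSize];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int off = y * bytesPerRow + x * 4;
                dataBytes[off] = (byte) 0xFF;     // alpha
                dataBytes[off + 1] = (byte) 0xFF; // red
            }
        }

        DataBufferByte dataBuffer = new DataBufferByte(new byte[][]{dataBytes}, dataSize);
        ComponentSampleModel sampleModel = new ComponentSampleModel(
                DataBuffer.TYPE_BYTE, width, height, 4, bytesPerRow, new int[]{1, 2, 3, 0});
        WritableRaster raster = Raster.createWritableRaster(sampleModel, dataBuffer, new Point(0, 0));
        ColorModel colorModel = new ComponentColorModel(
                ColorSpace.getInstance(ColorSpace.CS_sRGB), new int[]{8, 8, 8, 8},
                true, false, Transparency.TRANSLUCENT, DataBuffer.TYPE_BYTE);
        return new BufferedImage(colorModel, raster, false, null);
    }

    public static void main(String[] args) {
        BufferedImage image = buildImage();
        System.out.println(Integer.toHexString(image.getRGB(0, 0))); // prints "ffff0000"
    }
}
```

Note that the sample model's scanline stride is bytesPerRow, not width * 4, which is what makes this construction tolerate the padded buffer that defeated the direct TYPE_INT_ARGB approach.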
Done this way there is no problem: the camera frames captured in real time by the native library fill the Java BufferedImage correctly, and the approach is very efficient.