capture
capture.h File Reference

Capture Interface, for reading images. More...

Go to the source code of this file.

Macros

#define IMAGE_JPEG
 The media_type for a JPEG stream. (RGB colorspace)
 
#define IMAGE_UNCOMPRESSED
 The media_type for an uncompressed YUV stream. The YUV format can be specified using the media property option "sdk_format=<format>". See VIDEO_FORMAT_*. More...
 
#define VIDEO_H264
 The media_type for an H.264 stream. The stream consists of a sequence of packetized network abstraction layer units (NALUs). The first four bytes in the stream define the length of the packet. Media property option "&spsppsenabled=yes|no" can be used to include SPS (Sequence Parameter Set) and PPS (Picture Parameter Set) in the stream.
 
#define VIDEO_FORMAT_Y800
 
#define VIDEO_FORMAT_I420
 
#define VIDEO_FORMAT_UYVY
 
#define CAPTURE_TIME_FORMAT
 The format used when printing capture_time.
 

Typedefs

typedef struct _stream_info stream_info
 
typedef struct _media_stream media_stream
 
typedef struct _media_frame media_frame
 
typedef unsigned long long capture_time
 The datatype to hold a timestamp for the time of capture for a frame, measured in nanoseconds for the system uptime (CLOCK_MONOTONIC).
 

Functions

media_stream * capture_open_stream (const char *media_type, const char *media_props)
 Opens a new stream of the specified media type with the specified properties. More...
 
media_frame * capture_get_frame (media_stream *stream)
 Read a media_frame from an open stream. More...
 
media_frame ** capture_get_burst (media_stream *stream)
 
void * capture_frame_data (const media_frame *frame)
 Obtain the data from the media_frame. More...
 
size_t capture_frame_size (const media_frame *frame)
 Obtain the data size from the media_frame. More...
 
capture_time capture_frame_timestamp (const media_frame *frame)
 Obtain the timestamp of the media_frame, measured in nanoseconds. More...
 
size_t capture_frame_height (const media_frame *frame)
 Obtain the height of the media_frame. More...
 
size_t capture_frame_width (const media_frame *frame)
 Obtain the width of the media_frame. More...
 
size_t capture_frame_stride (const media_frame *frame)
 Obtain the stride length of the Y800 media_frame. More...
 
void capture_frame_free (media_frame *frame)
 Free the media_frame retrieved from capture_get_frame(). More...
 
void capture_burst_free (media_frame **frames)
 
void capture_close_stream (media_stream *stream)
 Close a running stream. More...
 
stream_info ** capture_stream_info_get ()
 Get information about the streams currently running on the camera. More...
 
char * capture_stream_info_props (const stream_info *info)
 Obtain the properties of the stream. More...
 
char * capture_stream_info_type (const stream_info *info)
 Obtain the type of the stream. More...
 
void capture_stream_info_free (stream_info **info)
 Free the array of stream_info acquired by capture_stream_info_get(). More...
 
char * capture_get_resolutions_list (int channel)
 Returns a list with all available resolutions for the current capture mode. More...
 
char * capture_get_optimal_resolutions_list (int channel)
 Returns a list with the most optimal resolutions with respect to performance for the current capture mode. More...
 

Detailed Description

Capture Interface, for reading images.

The capture interface contains functions for opening and closing an image stream and for reading image frames from the stream.

The media_type parameter of the capture_open_stream function defines what kind of stream to open. It can currently be IMAGE_JPEG for a JPEG stream or IMAGE_UNCOMPRESSED for a stream of uncompressed YUV images. The default YUV type is planar I420, but other formats can be specified using the media_props argument to capture_open_stream(). Note: Retrieval of uncompressed images is only supported on products with major version 0 of the capture interface.

Stream properties

The media_props argument is an extended version of the VAPIX option string. The following options are available for the different stream types.

IMAGE_JPEG

For JPEG images the media_props argument accepts the same kind of parameters as specified in the VAPIX API.

Read more at http://www.axis.com/techsup/cam_servers/dev/cam_http_api_2.php#api_blocks_image_video_mjpg_video

IMAGE_UNCOMPRESSED

Note: Retrieval of uncompressed images is only supported on products with major version 0 of the capture interface.

For uncompressed YUV images the following parameters are defined:

  • fps=<int>
  • sdk_format=<string>
  • resolution=<string>

The fps parameter specifies the frame rate of the images being pushed out on the stream.

The sdk_format string is in FOURCC format. A FOURCC is a sequence of four bytes used to identify data formats. Supported formats include:

  • Y800
  • I420
  • UYVY

For ARTPEC-4 based products, UYVY is the native format and requires no internal conversion.

The resolution string specifies the dimensions of the images in the stream. The syntax is the same as the corresponding VAPIX parameter.
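A minimal sketch of how these properties combine in the media_props string (the specific values are only illustrative, and the check assumes that capture_open_stream() returns NULL on failure):

media_stream *stream = capture_open_stream(IMAGE_UNCOMPRESSED,
        "fps=15&sdk_format=I420&resolution=640x480");
if (stream == NULL) {
    /* Opening may fail, e.g. on products without support for uncompressed capture. */
}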

General properties

Burst mode, capture_get_burst(), is no longer supported. The function will always return NULL.

Cropping

The new VAPIX parameters cropsize and croppos can be used to read a cropped area from an image. To read an area of 160x120 from a 640x480 image, apply the VAPIX parameters:

?resolution=640x480&croppos=100,0&cropsize=160x120
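As a sketch, the same parameters can be passed in the media_props argument when opening the stream (the values are taken from the example above):

media_stream *stream = capture_open_stream(IMAGE_JPEG,
        "resolution=640x480&croppos=100,0&cropsize=160x120");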

Stride

To support the processing enhancements provided by the RAPP library, uncompressed Y800 images consist of aligned image data, and the interface defines a row length called stride. Each uncompressed Y800 image is described by data, width, height and stride:

  • capture_frame_data()

    Pointer to the pixel data. The pixel data buffer is aligned. The alignment number is platform specific and compatible with the RAPP library.

  • capture_frame_width()

    The width of the image in number of pixels.

  • capture_frame_height()

    The height of the image in number of pixels.

  • capture_frame_stride()

    The length of the rows in the image in bytes. The stride value is aligned and can be greater than the image width.
    The stride needs to be taken into account when working with Y800 images: after the first width bytes of image data,
    each row is padded to reach the stride. When a Y800 image with a resolution of 350x288 is requested, each row has 350
    bytes of image data followed by 2 bytes of padding, giving a stride of 352. See the sketch after this list.
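A minimal sketch of handling the stride (the helper name and the tightly packed destination buffer are assumptions made for illustration):

#include <string.h>

/* Copy a Y800 frame into a tightly packed width*height buffer, dropping the row padding. */
static void copy_packed_y800(unsigned char *dst, const unsigned char *src,
                             size_t width, size_t height, size_t stride)
{
    size_t row;

    for (row = 0; row < height; row++) {
        /* Only the first 'width' bytes of each 'stride'-long row are pixel data. */
        memcpy(dst + row * width, src + row * stride, width);
    }
}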

Architecture dependent interface

There is also an architecture dependent interface, called native. It can be used to read YUV images in the format native to the current hardware (UYVY on ARTPEC-3 and ARTPEC-4, NV12 on Ambarella A5S). Through this interface, width and height are specified and a pointer to a buffer of YUV data can be read. The memory for the buffer is reused, so the data needs to be copied if it is to be saved. If the application needs to be fast, and neither the exact YUV format nor VAPIX functionality such as cropping is important, then this interface can be useful.

Examples

This section shows some examples of how to use the capture interface.

Example 1

This will request a stream of JPEG images, with a framerate of 10 images per second.

media_stream *stream = capture_open_stream(IMAGE_JPEG, "fps=10");

Example 2

Note: Retrieval of uncompressed images is only supported on products with major version 0 of the capture interface.

A more complete example of how to use the capture interface. This will request a stream of uncompressed Y800 images. The different media_props options are also documented in the VAPIX API at http://www.axis.com/techsup/cam_servers/dev/cam_http_api_index.php

The code will retrieve a frame of Y800 data and write it as a pgm-file. The stride is needed to make sure the image gets written correctly.

#include <stdio.h>
#include <capture.h>

static void write_pgm(void *data, int width, int height, int stride)
{
    FILE *fp;
    int row, column;

    fp = fopen("test.pgm", "w");
    fprintf(fp, "P5\n");
    fprintf(fp, "# CREATOR: Axis Communications AB\n");
    fprintf(fp, "%d %d\n", width, height);
    fprintf(fp, "%d\n", 255);
    for (row = 0; row < height; row++)
        for (column = 0; column < width; column++)
            fputc(((unsigned char *) data)[row * stride + column], fp);
    fclose(fp);
}
int main(int argc, char **argv)
{
    media_stream *stream;
    media_frame *frame;
    void *data;
    size_t width, height, stride;
    capture_time timestamp;

    stream = capture_open_stream(IMAGE_UNCOMPRESSED,
            "fps=25&sdk_format=Y800&resolution=350x288&rotation=180");
    frame = capture_get_frame(stream);
    data = capture_frame_data(frame);
    width = capture_frame_width(frame);
    height = capture_frame_height(frame);
    stride = capture_frame_stride(frame);
    timestamp = capture_frame_timestamp(frame);
    printf("Frame captured at %" CAPTURE_TIME_FORMAT "\n", timestamp);
    write_pgm(data, (int) width, (int) height, (int) stride);
    capture_frame_free(frame);
    capture_close_stream(stream);
    return 0;
}

Example 3

An example of how to use the architecture dependent interface. Note: The architecture dependent interface is only supported on products with major version 0 of the capture interface.

media_native *nat = capture_open_native (640, 480);
unsigned char *data = capture_get_image_native (nat);
process_data (data);

Example 4

An example of how to use the architecture dependent interface when images from a specific video channel are requested. Note: The architecture dependent interface is only supported on products with major version 0 of the capture interface.

media_native *nat = capture_open_native_channel (2, 640, 480);
unsigned char *data = capture_get_image_native (nat);
process_data (data);

Compilation

To build a program using the capture interface you need to link against the capture library (-lcapture) and include the header file capture.h.

Compilation for a linux host

The capture interface is available for development and debugging in a PC environment. The library is called capturehost (-lcapturehost) and the interface can read existing mjpeg files. You need a Linux PC and an Axis camera or video encoder connected to the PC via LAN.

Restrictions for host

 Supported formats are:
 - Y800
 - I420

Example host

This is an example of how to use the capture interface on the host.

media_frame *frame;
void *data;
size_t size;
media_stream *stream;
capture_time timestamp;
stream = capture_open_stream(IMAGE_JPEG, "capture-cameraIP=<IP>&capture-userpass=<user>:<password>&sdk_format=Y800&resolution=176x144&fps=1");
frame = capture_get_frame(stream);
data = capture_frame_data(frame);
size = capture_frame_size(frame);
timestamp = capture_frame_timestamp(frame);
printf("Frame captured at %" CAPTURE_TIME_FORMAT "\n", timestamp);
process_data(data, size);

Note 1: replace <IP> with the IP address of the camera. Replace <user> with an existing user, and <password> with the password of this user.

Note 2: the capture-cameraIP parameter must be the first item in media_props, capture-userpass the second, followed by all other media_props options.

Note 3: The capturehost interface will use an http proxy whenever the environment variable http_proxy is set using
the same principle as curl.

Example host

This is an example of how to use the capturehost interface with an existing mjpeg file.

First, acquire an mjpeg file from a camera.

% curl -s -S -u <user>:<password> "http://<IP>/mjpg/video.mjpg?duration=5&resolution=176x144&fps=5" >myfile.mjpeg
% # Replace <user> with an existing user, and <password> with the password of the user:
% curl -s -S -u viewer:secretpassword "http://192.168.0.90/mjpg/video.mjpg?duration=5&resolution=176x144&fps=5" >myfile.mjpeg

Then use this mjpeg file to retrieve gray scale frames.

stream = capture_open_stream(IMAGE_UNCOMPRESSED, "capture-cameraIP=myfile.mjpeg&sdk_format=Y800&fps=1");
stream = capture_open_stream(IMAGE_UNCOMPRESSED, "capture-cameraIP=myfile.mjpeg&sdk_format=I420&fps=1");

The capturehost interface is able to reduce the frame rate of an existing mjpeg file by selecting some frames and ignoring the rest; for example, 5 fps can be reduced to 1 fps by picking 1 frame and ignoring 4 frames. The capturehost interface can also simulate real-time timing of the requested fps by blocking the caller for a short while, and it can downscale the resolution by skipping pixels.

On the host it is also possible to read an mjpeg file from STDIN. In this case the call to capture_open_stream() takes no capture-cameraIP parameter.

stream = capture_open_stream(IMAGE_JPEG, "fps=1");

For example, to use the file sunset.mjpeg with the host program 'my_host_viewer' reading from STDIN:

./my_host_viewer < sunset.mjpeg

Macro Definition Documentation

◆ IMAGE_UNCOMPRESSED

#define IMAGE_UNCOMPRESSED

The media_type for an uncompressed YUV stream. The YUV format can be specified using the media property option "sdk_format=<format>". See VIDEO_FORMAT_*.

Deprecated:
Capture of uncompressed frames is not available on all products. It has been removed in major version 1 of the capture interface.

Function Documentation

◆ capture_open_stream()

media_stream* capture_open_stream ( const char *  media_type,
const char *  media_props 
)

Opens a new stream of the specified media type with the specified properties.

The function will open a stream of the specified media type in the camera. The stream will be taken from a universal cache. If a stream of the same media type with the same properties is already running in the camera, the data will be shared.

Parameters
media_type - The specified media type of the stream.
media_props - The properties of the media type, represented as a VAPIX option string.
Returns
A structure associated with the stream.

◆ capture_get_frame()

media_frame* capture_get_frame ( media_stream *  stream)

Read a media_frame from an open stream.

The function will get a frame of data from the stream and return it. The frame contains the data, the size and the timestamp. The frame needs to be freed after use, using capture_frame_free.

Parameters
stream - The structure associated with the stream.
Returns
A pointer to the data frame.

◆ capture_get_burst()

media_frame** capture_get_burst ( media_stream *  stream)
Deprecated:
capture_get_burst is not supported and will always return NULL.

◆ capture_frame_data()

void* capture_frame_data ( const media_frame *  frame)

Obtain the data from the media_frame.

Parameters
frame - The media_frame to obtain data from.
Returns
A pointer to the data, NULL if frame is NULL.

◆ capture_frame_size()

size_t capture_frame_size ( const media_frame *  frame)

Obtain the data size from the media_frame.

Parameters
frame - The media_frame to obtain data size from.
Returns
The size of the data, 0 if frame is NULL.

◆ capture_frame_timestamp()

capture_time capture_frame_timestamp ( const media_frame *  frame)

Obtain the timestamp of the media_frame, measured in nanoseconds.

The returned value is nanoseconds for the system uptime (CLOCK_MONOTONIC).

If the frame is a JPEG, the same information is found in the JPEG header.

Additional note: if two different streams are used, with different resolution or frame rate, there is no guarantee that both streams deliver frames with the same source image (from the sensor or camera). If the timestamps are identical, however, the frames do have the same source image.

Parameters
frame - The media_frame to obtain timestamp from.
Returns
The timestamp of the data, 0 if frame is NULL.
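A minimal sketch of using the timestamps to match frames from two open streams (stream_a and stream_b are assumed to have been opened elsewhere; the matching policy is only for illustration):

media_frame *frame_a = capture_get_frame(stream_a);
media_frame *frame_b = capture_get_frame(stream_b);

/* Identical timestamps mean both frames come from the same source image. */
if (capture_frame_timestamp(frame_a) == capture_frame_timestamp(frame_b)) {
    printf("Matching frames captured at %" CAPTURE_TIME_FORMAT "\n",
           capture_frame_timestamp(frame_a));
}

capture_frame_free(frame_a);
capture_frame_free(frame_b);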

◆ capture_frame_height()

size_t capture_frame_height ( const media_frame *  frame)

Obtain the height of the media_frame.

Parameters
frame - The media_frame to obtain height from.
Returns
The height of the frame, 0 if frame is NULL.

◆ capture_frame_width()

size_t capture_frame_width ( const media_frame *  frame)

Obtain the width of the media_frame.

Parameters
frame - The media_frame to obtain width from.
Returns
The width of the frame, 0 if frame is NULL.

◆ capture_frame_stride()

size_t capture_frame_stride ( const media_frame *  frame)

Obtain the stride length of the Y800 media_frame.

Parameters
frame - The media_frame to obtain stride length from.
Returns
The stride length of the frame, 0 if frame is NULL or not Y800.

◆ capture_frame_free()

void capture_frame_free ( media_frame *  frame)

Free the media_frame retrieved from capture_get_frame().

Parameters
frame - Pointer to the media_frame received from capture_get_frame().
Returns
void

◆ capture_burst_free()

void capture_burst_free ( media_frame **  frames)
Deprecated:
capture_burst_free is no longer supported.

◆ capture_close_stream()

void capture_close_stream ( media_stream *  stream)

Close a running stream.

The function closes the specified stream.

Parameters
stream - The structure associated with the stream.

◆ capture_stream_info_get()

stream_info** capture_stream_info_get ( )

Get information about the streams currently running on the camera.

Returns
A NULL terminated array of stream_info.
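A minimal sketch of walking the NULL terminated array (whether the strings returned by capture_stream_info_type() and capture_stream_info_props() need to be freed separately is not covered here):

stream_info **infos = capture_stream_info_get();

if (infos != NULL) {
    int i;

    /* The array is NULL terminated. */
    for (i = 0; infos[i] != NULL; i++) {
        printf("type=%s props=%s\n",
               capture_stream_info_type(infos[i]),
               capture_stream_info_props(infos[i]));
    }
    capture_stream_info_free(infos);
}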

◆ capture_stream_info_props()

char* capture_stream_info_props ( const stream_info *  info)

Obtain the properties of the stream.

Parameters
info - The stream_info to obtain the properties of.
Returns
The properties as a media_props option string.

◆ capture_stream_info_type()

char* capture_stream_info_type ( const stream_info *  info)

Obtain the type of the stream.

Parameters
info - The stream_info to obtain the type of.
Returns
The type as a media_type string.

◆ capture_stream_info_free()

void capture_stream_info_free ( stream_info **  info)

Free the array of stream_info acquired by capture_stream_info_get().

Parameters
info - The stream_info array.

◆ capture_get_resolutions_list()

char* capture_get_resolutions_list ( int  channel)

Returns a list with all available resolutions for the current capture mode.

Parameters
channel - The video channel.
Returns
A pointer to a string with resolutions. This must be deallocated by the caller using free().
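A minimal sketch, assuming video channel 0:

char *resolutions = capture_get_resolutions_list(0);

if (resolutions != NULL) {
    printf("Available resolutions: %s\n", resolutions);
    free(resolutions); /* The caller owns the returned string. */
}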

◆ capture_get_optimal_resolutions_list()

char* capture_get_optimal_resolutions_list ( int  channel)

Returns a list with the most optimal resolutions with respect to performance for the current capture mode.

This will be a subset of the resolutions returned by capture_get_resolutions_list().

Parameters
channel - The video channel.
Returns
A pointer to a string with resolutions. This must be deallocated by the caller using free().