VizionStreamer Socket Control API
VizionStreamer can be controlled via a Unix Domain Socket interface. This allows external applications to configure camera parameters and stream settings at runtime.
Copyright (c) 2025 Maik Jurischka Licensed under CC BY-NC-SA 4.0 - https://creativecommons.org/licenses/by-nc-sa/4.0/
Socket Connection
- Socket Path: /tmp/vizion_control.sock
- Protocol: Unix Domain Socket (SOCK_STREAM)
- Message Format: JSON
Command Format
All commands follow this JSON structure:
{
"command": "command_name",
"params": {
"param1": "value1",
"param2": "value2"
}
}
Response Format
All responses follow this JSON structure:
Success Response:
{
"status": "success",
"message": "Optional success message"
}
Error Response:
{
"status": "error",
"message": "Error description"
}
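As a minimal sketch of the wire format above (the helper names are hypothetical, not part of the API), building a command and validating a reply looks like this in Python:

```python
import json

def build_command(command, params=None):
    """Serialize a command into the JSON wire format described above."""
    cmd = {"command": command}
    if params is not None:
        cmd["params"] = params
    return json.dumps(cmd)

def check_response(raw):
    """Parse a reply; raise if the server reported an error status."""
    reply = json.loads(raw)
    if reply.get("status") == "error":
        raise RuntimeError(reply.get("message", "unknown error"))
    return reply
```

Send the string returned by build_command over the socket and pass the raw reply to check_response; complete socket round-trip examples appear in the Usage Examples section.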
Available Commands
1. Get Available Formats
Retrieve all supported video formats.
Command:
{
"command": "get_formats"
}
Response:
{
"status": "success",
"formats": [
{
"width": 1920,
"height": 1080,
"framerate": 30,
"format": "YUY2"
},
{
"width": 1280,
"height": 720,
"framerate": 60,
"format": "MJPG"
}
]
}
Supported Formats: YUY2, UYVY, NV12, MJPG, BGR, RGB
2. Set Video Format
Change the video format (resolution, framerate, pixel format).
Note: Cannot be changed while streaming is active.
Command:
{
"command": "set_format",
"params": {
"width": "1920",
"height": "1080",
"framerate": "30",
"format": "YUY2"
}
}
Response:
{
"status": "success",
"message": "Format set successfully"
}
3. Start Streaming
Start video streaming from the camera.
Command:
{
"command": "start_stream"
}
Response:
{
"status": "success",
"message": "Streaming started"
}
4. Stop Streaming
Stop video streaming.
Command:
{
"command": "stop_stream"
}
Response:
{
"status": "success",
"message": "Streaming stopped"
}
5. Set GStreamer Pipeline
Configure the GStreamer pipeline for video output. This determines where and how the video stream is processed/displayed.
Note: Cannot be changed while streaming is active.
Command:
{
"command": "set_pipeline",
"params": {
"pipeline": "videoconvert ! x264enc ! rtph264pay ! udpsink host=192.168.1.100 port=5000"
}
}
Response:
{
"status": "success",
"message": "Pipeline set successfully"
}
Common Pipeline Examples:
- Display locally:
  videoconvert ! autovideosink
- Stream over UDP (H.264):
  videoconvert ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=192.168.1.100 port=5000
- Stream over RTSP (requires gst-rtsp-server):
  videoconvert ! x264enc ! rtph264pay name=pay0
- Save to file:
  videoconvert ! x264enc ! mp4mux ! filesink location=/tmp/output.mp4
- Stream over TCP:
  videoconvert ! x264enc ! h264parse ! mpegtsmux ! tcpserversink host=0.0.0.0 port=5000
- MJPEG over HTTP:
  videoconvert ! jpegenc ! multipartmux ! tcpserversink host=0.0.0.0 port=8080
6. Get Status
Get current streaming status and pipeline configuration.
Command:
{
"command": "get_status"
}
Response:
{
"status": "success",
"streaming": true,
"pipeline": "videoconvert ! autovideosink"
}
7. Set Exposure
Configure camera exposure settings.
Command:
{
"command": "set_exposure",
"params": {
"mode": "manual",
"value": "100"
}
}
Parameters:
- mode: "auto" or "manual"
- value: Exposure value (only used in manual mode)
Response:
{
"status": "success",
"message": "Exposure set successfully"
}
8. Set White Balance
Configure white balance settings.
Command:
{
"command": "set_whitebalance",
"params": {
"mode": "auto",
"temperature": "4500"
}
}
Parameters:
- mode: "auto" or "manual"
- temperature: Color temperature in Kelvin (only used in manual mode)
Response:
{
"status": "success",
"message": "White balance set successfully"
}
9. Set Brightness
Adjust camera brightness.
Command:
{
"command": "set_brightness",
"params": {
"value": "50"
}
}
Response:
{
"status": "success",
"message": "Brightness set successfully"
}
10. Set Contrast
Adjust camera contrast.
Command:
{
"command": "set_contrast",
"params": {
"value": "32"
}
}
Response:
{
"status": "success",
"message": "Contrast set successfully"
}
11. Set Saturation
Adjust color saturation.
Command:
{
"command": "set_saturation",
"params": {
"value": "64"
}
}
Response:
{
"status": "success",
"message": "Saturation set successfully"
}
12. Set Sharpness
Adjust image sharpness.
Command:
{
"command": "set_sharpness",
"params": {
"value": "3"
}
}
Response:
{
"status": "success",
"message": "Sharpness set successfully"
}
13. Set Gamma
Adjust gamma correction.
Command:
{
"command": "set_gamma",
"params": {
"value": "100"
}
}
Response:
{
"status": "success",
"message": "Gamma set successfully"
}
14. Set Gain
Adjust camera gain.
Command:
{
"command": "set_gain",
"params": {
"value": "0"
}
}
Response:
{
"status": "success",
"message": "Gain set successfully"
}
15. Set eHDR Mode
Enable or disable eHDR (Enhanced High Dynamic Range) mode.
Note: eHDR features are only available on specific camera models: VCI-AR0821/AR0822, VCS-AR0821/AR0822, VLS3-AR0821/AR0822, VLS-GM2-AR0821/AR0822, and TEVS-AR0821/AR0822.
Command:
{
"command": "set_ehdr_mode",
"params": {
"mode": "0"
}
}
Parameters:
mode: "0" to enable eHDR, "1" to disable eHDR
Response:
{
"status": "success",
"message": "eHDR mode set successfully"
}
16. Set eHDR Exposure Minimum
Set the minimum number of exposure frames for eHDR.
Command:
{
"command": "set_ehdr_exposure_min",
"params": {
"value": "1"
}
}
Parameters:
value: Minimum exposure frames (range: 1-4, default: 1)
Response:
{
"status": "success",
"message": "eHDR exposure min set successfully"
}
17. Set eHDR Exposure Maximum
Set the maximum number of exposure frames for eHDR.
Command:
{
"command": "set_ehdr_exposure_max",
"params": {
"value": "4"
}
}
Parameters:
value: Maximum exposure frames (range: 1-4, default: 4)
Response:
{
"status": "success",
"message": "eHDR exposure max set successfully"
}
18. Set eHDR Ratio Minimum
Set the minimum exposure ratio for eHDR.
Command:
{
"command": "set_ehdr_ratio_min",
"params": {
"value": "12"
}
}
Parameters:
value: Minimum exposure ratio (range: 1-128, default: 12)
Response:
{
"status": "success",
"message": "eHDR ratio min set successfully"
}
19. Set eHDR Ratio Maximum
Set the maximum exposure ratio for eHDR.
Command:
{
"command": "set_ehdr_ratio_max",
"params": {
"value": "24"
}
}
Parameters:
value: Maximum exposure ratio (range: 1-128, default: 24)
Response:
{
"status": "success",
"message": "eHDR ratio max set successfully"
}
20. Get eHDR Status
Retrieve all current eHDR settings.
Command:
{
"command": "get_ehdr_status"
}
Response:
{
"status": "success",
"ehdr_mode": 0,
"exposure_min": 1,
"exposure_max": 4,
"ratio_min": 12,
"ratio_max": 24
}
21. Enable Shared Memory
Enable shared memory output for direct frame access by external processes. This creates a shared memory region at /dev/shm/<name> where frames are written in parallel to the GStreamer pipeline.
Note: Cannot be enabled while streaming is active. Must be enabled before starting the stream.
Command:
{
"command": "enable_shared_memory",
"params": {
"name": "/vizion_frame",
"buffer_size": "8294528"
}
}
Parameters:
- name: Shared memory object name (optional, default: "/vizion_frame")
- buffer_size: Buffer size in bytes (optional, default: 8294528 for 1080p RGBA + header)
Response:
{
"status": "success",
"message": "Shared memory enabled",
"name": "/vizion_frame",
"size": 8294528
}
Shared Memory Layout:
- Header (128 bytes): Contains frame metadata
- Magic number (0x56495A4E = "VIZN")
- Width, height, format
- Data size, timestamp (nanoseconds)
- Frame sequence counter
- Atomic write sequence (for lock-free synchronization)
- Frame Data: Raw frame bytes
Example Buffer Sizes:
- 1920×1080 RGBA: 8294528 bytes (1920×1080×4 + 128)
- 1280×720 RGBA: 3686528 bytes (1280×720×4 + 128)
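The example sizes above follow directly from width × height × 4 bytes per RGBA pixel plus the 128-byte header; a small helper (the function name is hypothetical) makes the arithmetic explicit:

```python
HEADER_SIZE = 128  # fixed metadata header described above

def rgba_buffer_size(width, height):
    """Bytes needed for one RGBA frame (4 bytes per pixel) plus the header."""
    return width * height * 4 + HEADER_SIZE
```

For example, rgba_buffer_size(1920, 1080) yields 8294528, matching the default buffer_size.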
22. Disable Shared Memory
Disable and cleanup shared memory output.
Command:
{
"command": "disable_shared_memory"
}
Response:
{
"status": "success",
"message": "Shared memory disabled"
}
23. Get Shared Memory Status
Query the current shared memory configuration.
Command:
{
"command": "get_shared_memory_status"
}
Response (when enabled):
{
"status": "success",
"shared_memory_enabled": true,
"name": "/vizion_frame",
"size": 8294528
}
Response (when disabled):
{
"status": "success",
"shared_memory_enabled": false
}
Usage Examples
Complete Workflow Example
# 1. Set GStreamer pipeline for UDP streaming
echo '{"command":"set_pipeline","params":{"pipeline":"videoconvert ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=192.168.1.100 port=5000"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# 2. Set video format
echo '{"command":"set_format","params":{"width":"1920","height":"1080","framerate":"30","format":"YUY2"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# 3. Configure camera settings
echo '{"command":"set_exposure","params":{"mode":"auto"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
echo '{"command":"set_brightness","params":{"value":"50"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# 3a. (Optional) Configure eHDR settings (for compatible cameras)
echo '{"command":"set_ehdr_mode","params":{"mode":"0"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
echo '{"command":"set_ehdr_exposure_min","params":{"value":"1"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
echo '{"command":"set_ehdr_exposure_max","params":{"value":"4"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
echo '{"command":"set_ehdr_ratio_min","params":{"value":"12"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
echo '{"command":"set_ehdr_ratio_max","params":{"value":"24"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# 3b. (Optional) Enable shared memory output
echo '{"command":"enable_shared_memory","params":{"name":"/vizion_frame","buffer_size":"8294528"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# 4. Start streaming
echo '{"command":"start_stream"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# 5. Check status
echo '{"command":"get_status"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
echo '{"command":"get_shared_memory_status"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# 6. Stop streaming when done
echo '{"command":"stop_stream"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# 7. (Optional) Disable shared memory
echo '{"command":"disable_shared_memory"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
GStreamer Pipeline Examples
# Stream to local display
echo '{"command":"set_pipeline","params":{"pipeline":"videoconvert ! autovideosink"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Stream over UDP (H.264)
echo '{"command":"set_pipeline","params":{"pipeline":"videoconvert ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=192.168.1.100 port=5000"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Save to MP4 file
echo '{"command":"set_pipeline","params":{"pipeline":"videoconvert ! x264enc ! mp4mux ! filesink location=/tmp/output.mp4"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# MJPEG HTTP server
echo '{"command":"set_pipeline","params":{"pipeline":"videoconvert ! jpegenc ! multipartmux ! tcpserversink host=0.0.0.0 port=8080"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
Using socat
# Get available formats
echo '{"command":"get_formats"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Set video format
echo '{"command":"set_format","params":{"width":"1920","height":"1080","framerate":"30","format":"YUY2"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Start streaming
echo '{"command":"start_stream"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Set exposure to auto
echo '{"command":"set_exposure","params":{"mode":"auto"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Set brightness
echo '{"command":"set_brightness","params":{"value":"50"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Get status
echo '{"command":"get_status"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Stop streaming
echo '{"command":"stop_stream"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
eHDR Control Examples
# Enable eHDR mode
echo '{"command":"set_ehdr_mode","params":{"mode":"0"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Disable eHDR mode
echo '{"command":"set_ehdr_mode","params":{"mode":"1"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Configure eHDR exposure range
echo '{"command":"set_ehdr_exposure_min","params":{"value":"1"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
echo '{"command":"set_ehdr_exposure_max","params":{"value":"4"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Configure eHDR ratio range
echo '{"command":"set_ehdr_ratio_min","params":{"value":"12"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
echo '{"command":"set_ehdr_ratio_max","params":{"value":"24"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Get current eHDR settings
echo '{"command":"get_ehdr_status"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
Shared Memory Control Examples
# Enable shared memory with default settings
echo '{"command":"enable_shared_memory"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Enable shared memory with custom name and size
echo '{"command":"enable_shared_memory","params":{"name":"/vizion_cam0","buffer_size":"8294528"}}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Get shared memory status
echo '{"command":"get_shared_memory_status"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Disable shared memory
echo '{"command":"disable_shared_memory"}' | socat - UNIX-CONNECT:/tmp/vizion_control.sock
# Verify shared memory file exists (should show ~8.3MB file)
ls -lh /dev/shm/vizion_frame
# Watch shared memory updates in real-time (check modification time)
watch -n 0.1 'ls -l /dev/shm/vizion_frame'
Using nc (netcat with Unix socket support)
echo '{"command":"get_formats"}' | nc -U /tmp/vizion_control.sock
Using Python
import socket
import json
def send_command(command, params=None):
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
sock.connect('/tmp/vizion_control.sock')
cmd = {"command": command}
if params:
cmd["params"] = params
sock.sendall(json.dumps(cmd).encode())
response = sock.recv(4096).decode()
sock.close()
return json.loads(response)
# Examples
print(send_command("get_formats"))
print(send_command("set_format", {
"width": "1920",
"height": "1080",
"framerate": "30",
"format": "YUY2"
}))
print(send_command("set_exposure", {"mode": "auto"}))
print(send_command("start_stream"))
# eHDR control examples (for compatible cameras)
print(send_command("set_ehdr_mode", {"mode": "0"})) # Enable eHDR
print(send_command("set_ehdr_exposure_min", {"value": "1"}))
print(send_command("set_ehdr_exposure_max", {"value": "4"}))
print(send_command("set_ehdr_ratio_min", {"value": "12"}))
print(send_command("set_ehdr_ratio_max", {"value": "24"}))
print(send_command("get_ehdr_status")) # Get current eHDR settings
# Shared memory control examples
print(send_command("enable_shared_memory", {
"name": "/vizion_frame",
"buffer_size": "8294528"
}))
print(send_command("get_shared_memory_status"))
print(send_command("start_stream"))
# ... streaming active, external process can read /dev/shm/vizion_frame ...
print(send_command("stop_stream"))
print(send_command("disable_shared_memory"))
Using C++
#include <sys/socket.h>
#include <sys/un.h>
#include <unistd.h>
#include <cstring>
#include <string>
#include <iostream>
std::string sendCommand(const std::string& command) {
int sock = socket(AF_UNIX, SOCK_STREAM, 0);
struct sockaddr_un addr;
memset(&addr, 0, sizeof(addr));
addr.sun_family = AF_UNIX;
strcpy(addr.sun_path, "/tmp/vizion_control.sock");
connect(sock, (struct sockaddr*)&addr, sizeof(addr));
send(sock, command.c_str(), command.length(), 0);
char buffer[4096];
int bytesRead = recv(sock, buffer, sizeof(buffer) - 1, 0);
buffer[bytesRead] = '\0';
close(sock);
return std::string(buffer);
}
// Example usage
int main() {
std::cout << sendCommand(R"({"command":"get_formats"})") << std::endl;
std::cout << sendCommand(R"({"command":"set_brightness","params":{"value":"50"}})") << std::endl;
// eHDR control examples (for compatible cameras)
std::cout << sendCommand(R"({"command":"set_ehdr_mode","params":{"mode":"0"}})") << std::endl;
std::cout << sendCommand(R"({"command":"set_ehdr_exposure_min","params":{"value":"1"}})") << std::endl;
std::cout << sendCommand(R"({"command":"set_ehdr_exposure_max","params":{"value":"4"}})") << std::endl;
std::cout << sendCommand(R"({"command":"get_ehdr_status"})") << std::endl;
// Shared memory control examples
std::cout << sendCommand(R"({"command":"enable_shared_memory","params":{"name":"/vizion_frame","buffer_size":"8294528"}})") << std::endl;
std::cout << sendCommand(R"({"command":"get_shared_memory_status"})") << std::endl;
std::cout << sendCommand(R"({"command":"start_stream"})") << std::endl;
// ... streaming active, external process can read /dev/shm/vizion_frame ...
std::cout << sendCommand(R"({"command":"stop_stream"})") << std::endl;
std::cout << sendCommand(R"({"command":"disable_shared_memory"})") << std::endl;
return 0;
}
Parameter Value Ranges
The valid ranges for camera parameters depend on the specific camera model. You can query the camera capabilities through the VizionSDK API or experimentally determine valid ranges.
Typical ranges (camera-dependent):
- Brightness: 0-255
- Contrast: 0-255
- Saturation: 0-255
- Sharpness: 0-255
- Gamma: 72-500
- Gain: 0-100
- Exposure: 1-10000 (in auto mode, value is ignored)
- White Balance Temperature: 2800-6500 Kelvin
eHDR ranges (for compatible cameras only):
- eHDR Mode: 0 (enable) or 1 (disable)
- eHDR Exposure Min: 1-4 (default: 1)
- eHDR Exposure Max: 1-4 (default: 4)
- eHDR Ratio Min: 1-128 (default: 12)
- eHDR Ratio Max: 1-128 (default: 24)
Compatible eHDR Camera Models:
- VCI-AR0821/AR0822
- VCS-AR0821/AR0822
- VLS3-AR0821/AR0822
- VLS-GM2-AR0821/AR0822
- TEVS-AR0821/AR0822
Error Handling
Always check the status field in the response:
response = send_command("set_format", {...})
if response["status"] == "error":
print(f"Command failed: {response['message']}")
else:
print("Command successful")
Thread Safety
The socket server handles one client connection at a time. Commands are processed sequentially with mutex protection to ensure thread safety with the camera operations.
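Because the server handles one client at a time, a second client may block until the first disconnects. One way to fail fast instead of hanging (a sketch, not part of the API) is to set a socket timeout before connecting:

```python
import socket

def connect_with_timeout(sock_path="/tmp/vizion_control.sock", timeout=2.0):
    """Open a control connection that raises socket.timeout instead of hanging.

    The timeout applies to connect(), send(), and recv() on the returned socket.
    """
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    s.settimeout(timeout)
    s.connect(sock_path)
    return s
```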
GStreamer Integration
VizionStreamer uses GStreamer for video processing and output. The captured frames from the VizionSDK camera are continuously fed into a GStreamer pipeline in a separate acquisition thread.
How It Works
- Continuous Acquisition Loop: A dedicated thread continuously captures frames from the camera using VxGetImage()
- Frame Buffering: Captured frames are pushed into the GStreamer pipeline via appsrc
- Pipeline Processing: GStreamer processes the frames according to the configured pipeline
- Output: Frames are displayed, saved, or streamed based on the pipeline configuration
Performance Monitoring
The acquisition loop prints FPS statistics every second:
FPS: 30 | Total frames: 1234 | Frame size: 4147200 bytes
Receiving UDP Stream
If you configured a UDP streaming pipeline, receive it with:
# Using GStreamer
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
# Using FFplay
ffplay -fflags nobuffer -flags low_delay -framedrop udp://0.0.0.0:5000
# Using VLC
vlc udp://@:5000
Receiving MJPEG HTTP Stream
If you configured an MJPEG HTTP server pipeline:
# View in browser
firefox http://192.168.1.100:8080
# Using FFplay
ffplay http://192.168.1.100:8080
# Using curl to save frames
curl http://192.168.1.100:8080 > stream.mjpg
Shared Memory Reader Implementation
When shared memory output is enabled, external processes can directly read frame data from /dev/shm/<name>. Here's how to implement a reader:
Shared Memory Structure
struct SharedMemoryHeader {
uint32_t magic; // 0x56495A4E ("VIZN") - for validation
uint32_t width; // Frame width in pixels
uint32_t height; // Frame height in pixels
uint32_t format; // Format enum (VX_IMAGE_FORMAT)
uint32_t data_size; // Frame data size in bytes
uint64_t timestamp_ns; // Timestamp in nanoseconds
uint32_t frame_sequence; // Monotonic frame counter
atomic_uint32_t write_sequence; // Lock-free sync counter
char format_str[16]; // Format string ("YUY2", "MJPG", etc.)
uint8_t reserved[72]; // Reserved (padding to 128 bytes)
};
// Frame data starts at offset 128
Lock-Free Read Protocol
The write_sequence counter enables lock-free synchronization:
- Even values: Write complete, data is consistent
- Odd values: Write in progress, data may be inconsistent
Reader Algorithm:
- Read write_sequence (must be even)
- Read the header and frame data
- Read write_sequence again
- If the values match → data is consistent
- If the values differ → retry
C++ Reader Example
#include <sys/mman.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <atomic>
#include <cstring>
#include <iostream>
struct SharedMemoryHeader {
uint32_t magic;
uint32_t width;
uint32_t height;
uint32_t format;
uint32_t data_size;
uint64_t timestamp_ns;
uint32_t frame_sequence;
std::atomic<uint32_t> write_sequence;
char format_str[16];
uint8_t reserved[72];
};
class SharedMemoryReader {
private:
int fd_;
void* ptr_;
size_t size_;
uint32_t last_sequence_;
public:
SharedMemoryReader(const char* name, size_t size)
: fd_(-1), ptr_(nullptr), size_(size), last_sequence_(0) {
// Open shared memory
fd_ = shm_open(name, O_RDONLY, 0666);
if (fd_ < 0) {
throw std::runtime_error("Failed to open shared memory");
}
// Map memory
ptr_ = mmap(nullptr, size_, PROT_READ, MAP_SHARED, fd_, 0);
if (ptr_ == MAP_FAILED) {
close(fd_);
throw std::runtime_error("Failed to map shared memory");
}
}
~SharedMemoryReader() {
if (ptr_ != nullptr && ptr_ != MAP_FAILED) {
munmap(ptr_, size_);
}
if (fd_ >= 0) {
close(fd_);
}
}
bool readFrame(uint8_t* buffer, size_t buffer_size,
uint32_t* width, uint32_t* height,
char* format, uint64_t* timestamp) {
auto* header = static_cast<SharedMemoryHeader*>(ptr_);
auto* frame_data = static_cast<uint8_t*>(ptr_) + sizeof(SharedMemoryHeader);
uint32_t seq1, seq2;
do {
// Read sequence number
seq1 = header->write_sequence.load(std::memory_order_acquire);
// Wait if write is in progress (odd sequence)
while (seq1 & 1) {
seq1 = header->write_sequence.load(std::memory_order_acquire);
}
// Validate magic number
if (header->magic != 0x56495A4E) {
std::cerr << "Invalid magic number" << std::endl;
return false;
}
// Check if this is a new frame
if (header->frame_sequence <= last_sequence_) {
return false; // Already seen this frame
}
// Check buffer size
if (header->data_size > buffer_size) {
std::cerr << "Buffer too small" << std::endl;
return false;
}
// Read metadata
*width = header->width;
*height = header->height;
*timestamp = header->timestamp_ns;
strncpy(format, header->format_str, 15);
format[15] = '\0';
// Copy frame data
memcpy(buffer, frame_data, header->data_size);
// Verify sequence hasn't changed
seq2 = header->write_sequence.load(std::memory_order_acquire);
} while (seq1 != seq2);
last_sequence_ = header->frame_sequence;
return true;
}
};
// Usage example
int main() {
try {
SharedMemoryReader reader("/vizion_frame", 8294528);
std::vector<uint8_t> frame_buffer(8294400); // 1920x1080x4
uint32_t width, height;
char format[16];
uint64_t timestamp;
while (true) {
if (reader.readFrame(frame_buffer.data(), frame_buffer.size(),
&width, &height, format, &timestamp)) {
std::cout << "New frame: " << width << "x" << height
<< " format=" << format
<< " timestamp=" << timestamp << std::endl;
// Process frame data here...
}
usleep(10000); // Poll every 10ms
}
} catch (const std::exception& e) {
std::cerr << "Error: " << e.what() << std::endl;
return 1;
}
return 0;
}
Python Reader Example
import mmap
import struct
import time
from pathlib import Path
class SharedMemoryReader:
HEADER_SIZE = 128
MAGIC = 0x56495A4E # "VIZN"
def __init__(self, name, size):
self.name = name
self.size = size
self.last_sequence = 0
# Open shared memory file
shm_path = Path(f"/dev/shm{name}")
self.fd = open(shm_path, "rb")
self.mmap = mmap.mmap(self.fd.fileno(), size, access=mmap.ACCESS_READ)
def close(self):
if self.mmap:
self.mmap.close()
if self.fd:
self.fd.close()
def read_frame(self):
while True:
# Read write_sequence (offset 28)
self.mmap.seek(28)
seq1 = struct.unpack('I', self.mmap.read(4))[0]
# Wait if write in progress (odd)
while seq1 & 1:
time.sleep(0.0001)
self.mmap.seek(28)
seq1 = struct.unpack('I', self.mmap.read(4))[0]
# Read header
self.mmap.seek(0)
header_bytes = self.mmap.read(self.HEADER_SIZE)
magic, width, height, fmt, data_size, timestamp, frame_seq = \
struct.unpack('IIIIIQII', header_bytes[:40])
format_str = header_bytes[40:56].decode('utf-8').strip('\x00')
# Validate magic
if magic != self.MAGIC:
return None
# Check if new frame
if frame_seq <= self.last_sequence:
return None
# Read frame data
frame_data = self.mmap.read(data_size)
# Verify sequence
self.mmap.seek(28)
seq2 = struct.unpack('I', self.mmap.read(4))[0]
if seq1 == seq2:
self.last_sequence = frame_seq
return {
'width': width,
'height': height,
'format': format_str,
'timestamp': timestamp,
'sequence': frame_seq,
'data': frame_data
}
# Usage
reader = SharedMemoryReader("/vizion_frame", 8294528)
try:
while True:
frame = reader.read_frame()
if frame:
print(f"New frame: {frame['width']}x{frame['height']} "
f"format={frame['format']} seq={frame['sequence']}")
# Process frame['data'] here...
time.sleep(0.01) # Poll every 10ms
finally:
reader.close()
Performance Considerations
Polling vs. Blocking:
- The reader examples use polling (checking periodically)
- For lower CPU usage, increase poll interval
- For lower latency, decrease poll interval
- Alternative: Use inotify to watch /dev/shm/<name> for modifications
Memory Bandwidth:
- Reading shared memory creates an additional memcpy
- For zero-copy processing, process data in-place (requires careful synchronization)
- Multiple readers can access the same shared memory simultaneously
Frame Rate:
- The reader must keep up with the writer's frame rate
- Use frame_sequence to detect dropped frames
- Monitor the difference between last_sequence and header->frame_sequence
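Dropped frames can be counted from the gap between consecutive frame_sequence values; a minimal helper (hypothetical name) for the monitoring described above:

```python
def dropped_frames(last_sequence, new_sequence):
    """Frames missed between two successive reads of frame_sequence.

    Consecutive values (e.g. 10 then 11) mean nothing was dropped;
    a jump (e.g. 10 then 14) means the frames in between were missed.
    """
    if new_sequence <= last_sequence:
        return 0  # no new frame since the last read
    return new_sequence - last_sequence - 1
```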
Notes
- The socket file is automatically created when VizionStreamer starts
- The socket file is removed when VizionStreamer exits cleanly
- Format and pipeline changes require streaming to be stopped first
- The acquisition loop runs continuously while streaming is active
- Some parameters may not be supported on all camera models
- Invalid parameter values will return an error response
- GStreamer pipeline errors will be reported when starting the stream
- Default pipeline: videoconvert ! autovideosink (display locally)
- eHDR features require compatible camera models (VCI/VCS/VLS3/VLS-GM2/TEVS-AR0821/AR0822)
- eHDR settings may be reset to defaults when the camera starts streaming (driver behavior)
- Shared memory output is independent of GStreamer pipeline (both run in parallel)
- Shared memory must be enabled before starting the stream
- Shared memory files are automatically cleaned up when disabled or on clean exit
- Multiple processes can read from the same shared memory region simultaneously