MiXiD XCFramework API Documentation
Structured developer documentation for MiXiD.xcframework, organized with a left navigation menu and collapsible API reference sections. The full method reference remains available, but categories and classes stay compact until opened.
Documentation Menu
- About XCFrameworks
- How to Consume
- Swift and Objective-C Examples
- Platform Slices
- API Reference
- Getting Started
- Capture
- Compositing and Mixing
- RTMP
- SRT
- HLS
- RTSP and ONVIF
- Media Files and Replay
- NDI
- WebRTC and WHIP
Reference size:
- 52 headers
- 260 properties
- 354 methods and callbacks
About XCFrameworks
An XCFramework is Apple’s distributable binary package format for shipping framework or library variants across platforms and architectures, including simulator builds. MiXiD uses this format to package iOS, macOS, Mac Catalyst, and tvOS slices with public Objective-C headers.
Why MiXiD Uses XCFrameworks
- One package can contain multiple platform and architecture variants.
- Device and simulator binaries stay separate, avoiding older fat-framework limitations.
- Xcode can select the correct slice during build and archive.
- Public headers are bundled with the framework for Objective-C and Swift bridging.
How to Consume MiXiD.xcframework
Add MiXiD.xcframework to your Xcode target under Frameworks, Libraries, and Embedded Content. Use Embed & Sign for app targets that load the framework at runtime.
```objc
// Objective-C
#import <MiXiD/MiXiD.h>

// Swift bridging header
#import <MiXiD/MiXiD.h>
```
Swift projects import MiXiD through a bridging header, or directly with import MiXiD when the framework is exposed as a module in your project setup. The examples below show both Swift and Objective-C call shapes; exact imported Swift names can vary slightly with Xcode importer rules.
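When the framework is exposed as a module, the bridging header is unnecessary. A minimal Swift sketch, assuming the module name matches the framework name:

```swift
import MiXiD

// Objective-C classes from the public headers keep their MXD prefix
// when imported into Swift.
let camera = MXDCameraSource()
```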
Platform Slices
| Identifier | Platform | Architectures | Use |
|---|---|---|---|
| ios-arm64 | iOS device | arm64 | iPhone and iPad deployment. |
| ios-x86_64-simulator | iOS Simulator | x86_64 | Simulator testing on Intel Macs. |
| ios-arm64_x86_64-maccatalyst | Mac Catalyst | arm64, x86_64 | Catalyst apps built from an iOS target. |
| macos-arm64_x86_64 | macOS | arm64, x86_64 | Native macOS apps. |
| tvos-arm64 | tvOS | arm64 | Apple TV apps. |
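Xcode selects the matching slice at build time, so one source tree can target every row above. Platform-specific code can still be separated with standard Swift conditional compilation; this sketch uses only platform checks and assumes nothing beyond the table:

```swift
#if os(iOS) && targetEnvironment(macCatalyst)
// ios-arm64_x86_64-maccatalyst slice
import UIKit
#elseif os(iOS)
// ios-arm64 (device) or ios-x86_64-simulator slice
import UIKit
#elseif os(macOS)
// macos-arm64_x86_64 slice
import AppKit
#elseif os(tvOS)
// tvos-arm64 slice
import UIKit
#endif
```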
Swift and Objective-C Examples
Camera to RTMP
Swift
```swift
final class LivePublisher: NSObject, MXDCameraSourceDelegate {
    private let camera = MXDCameraSource()
    private let pusher: MXDRTMPStreamPusher

    init(serverURL: URL, streamKey: String) {
        let config = MXDRTMPStreamPusherConfiguration.default()
        config.videoWidth = 1920
        config.videoHeight = 1080
        config.videoFPS = 30
        self.pusher = MXDRTMPStreamPusher(serverURL: serverURL, streamKey: streamKey, configuration: config)
        super.init()
        camera.delegate = self
    }

    func start() {
        pusher.start()
        camera.start()
    }

    // MARK: MXDCameraSourceDelegate

    func cameraSourceDidOutputVideoFrame(_ imageBuffer: CVImageBuffer, presentationTime: CMTime) {
        pusher.appendVideoFrame(imageBuffer, presentationTime: presentationTime)
    }

    func cameraSourceDidOutputAudioBuffer(_ audioBuffer: AVAudioPCMBuffer) {
        pusher.appendAudioBuffer(audioBuffer)
    }
}
```
Objective-C
```objc
@interface LivePublisher : NSObject <MXDCameraSourceDelegate>
@property (nonatomic, strong) MXDCameraSource *camera;
@property (nonatomic, strong) MXDRTMPStreamPusher *pusher;
@end

@implementation LivePublisher

- (instancetype)initWithServerURL:(NSURL *)serverURL streamKey:(NSString *)streamKey {
    self = [super init];
    if (!self) { return nil; }
    MXDRTMPStreamPusherConfiguration *config = [MXDRTMPStreamPusherConfiguration defaultConfiguration];
    config.videoWidth = 1920;
    config.videoHeight = 1080;
    config.videoFPS = 30;
    _camera = [MXDCameraSource new];
    _camera.delegate = self;
    _pusher = [[MXDRTMPStreamPusher alloc] initWithServerURL:serverURL
                                                   streamKey:streamKey
                                              configuration:config];
    return self;
}

- (void)start {
    [self.pusher start];
    [self.camera start];
}

- (void)cameraSourceDidOutputVideoFrame:(CVImageBufferRef)imageBuffer
                       presentationTime:(CMTime)presentationTime {
    [self.pusher appendVideoFrame:imageBuffer presentationTime:presentationTime];
}

- (void)cameraSourceDidOutputAudioBuffer:(AVAudioPCMBuffer *)audioBuffer {
    [self.pusher appendAudioBuffer:audioBuffer];
}

@end
```
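Teardown should mirror startup in reverse. The sketch below assumes the camera and pusher expose stop() counterparts to the start() methods shown above; those names are assumptions, not part of the documented reference:

```swift
extension LivePublisher {
    func stop() {
        // Stop the capture source first so no frames arrive
        // after the pusher has torn down its connection.
        camera.stop()   // assumed API, mirroring start()
        pusher.stop()   // assumed API, mirroring start()
    }
}
```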
Four-slot Mixer
Swift
```swift
final class MixerController: NSObject, MXDCompositeMediaMixerDelegate {
    private let mixer = MXDCompositeMediaMixer()

    override init() {
        super.init()
        mixer.delegate = self
        mixer.setAudioGain(1.0, forSlot: 0)
        mixer.setOutputAudioGain(1.0)
    }

    func start() { mixer.start() }

    func appendCameraFrame(_ imageBuffer: CVImageBuffer, time: CMTime) {
        mixer.appendVideoFrame(imageBuffer, forSlot: 0, presentationTime: time)
    }
}
```
Objective-C
```objc
@interface MixerController : NSObject <MXDCompositeMediaMixerDelegate>
@property (nonatomic, strong) MXDCompositeMediaMixer *mixer;
@end

@implementation MixerController

- (instancetype)init {
    self = [super init];
    if (!self) { return nil; }
    _mixer = [MXDCompositeMediaMixer new];
    _mixer.delegate = self;
    [_mixer setAudioGain:1.0f forSlot:0];
    [_mixer setOutputAudioGain:1.0f];
    return self;
}

- (void)start {
    [self.mixer start];
}

- (void)appendCameraFrame:(CVImageBufferRef)imageBuffer time:(CMTime)time {
    [self.mixer appendVideoFrame:imageBuffer forSlot:0 presentationTime:time];
}

- (void)compositeMediaMixerDidOutputAudioBuffer:(AVAudioPCMBuffer *)audioBuffer {
    // Forward mixed audio to a recorder or stream pusher.
}

- (void)compositeMediaMixerDidUpdateAudioLevels:(NSArray<NSNumber *> *)slotLevels
                                    masterLevel:(Float32)masterLevel {
    // Update mixer UI meters.
}

@end
```
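The mixer's audio callback pairs naturally with the RTMP pusher from the earlier example. This Swift sketch combines only the calls already shown above (the mixer's audio-output delegate callback and the pusher's appendAudioBuffer); it forwards mixed program audio straight into an RTMP stream:

```swift
import AVFoundation

final class MixerPublisher: NSObject, MXDCompositeMediaMixerDelegate {
    let mixer = MXDCompositeMediaMixer()
    private let pusher: MXDRTMPStreamPusher

    init(pusher: MXDRTMPStreamPusher) {
        self.pusher = pusher
        super.init()
        mixer.delegate = self
    }

    // Mixed program audio goes straight to the RTMP pusher.
    func compositeMediaMixerDidOutputAudioBuffer(_ audioBuffer: AVAudioPCMBuffer) {
        pusher.appendAudioBuffer(audioBuffer)
    }
}
```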
Instant Replay
Swift
```swift
let config = MXDReplayBufferRecorderConfiguration.default()
config.durationSeconds = 30
config.previewEnabled = true

let replay = MXDReplayBufferRecorder(configuration: config)
replay.start()

// outputURL is a destination file URL chosen by your app.
replay.saveReplay(to: outputURL, durationSeconds: 10) { url, error in
    if let url {
        print("Saved replay: \(url)")
    } else if let error {
        print("Replay save failed: \(error)")
    }
}
```
Objective-C
```objc
MXDReplayBufferRecorderConfiguration *config = [MXDReplayBufferRecorderConfiguration defaultConfiguration];
config.durationSeconds = 30;
config.previewEnabled = YES;

MXDReplayBufferRecorder *replay = [[MXDReplayBufferRecorder alloc] initWithConfiguration:config];
[replay start];

// outputURL is a destination file URL chosen by your app.
[replay saveReplayToURL:outputURL
        durationSeconds:10
             completion:^(NSURL * _Nullable url, NSError * _Nullable error) {
    if (url) {
        NSLog(@"Saved replay: %@", url);
    } else if (error) {
        NSLog(@"Replay save failed: %@", error);
    }
}];
```
API Reference
Open a category, then open an individual header or class to inspect its properties and methods. Each method lists its Objective-C signature, kind, return type, parameters, details, and practical usage guidance. The full API reference is split into focused pages; choose a category below, then open headers and methods inside that page.
- Getting Started — 1 header
- Capture — 1 header, 9 methods/callbacks
- Compositing and Mixing — 4 headers, 25 methods/callbacks
- RTMP — 9 headers, 66 methods/callbacks
- SRT — 10 headers, 51 methods/callbacks
- HLS — 9 headers, 57 methods/callbacks
- RTSP and ONVIF — 5 headers, 35 methods/callbacks
- Media Files and Replay — 4 headers, 41 methods/callbacks
- NDI — 2 headers, 20 methods/callbacks
- WebRTC and WHIP — 7 headers, 50 methods/callbacks
Source: public headers in the MiXiD source tree used to build MiXiD.xcframework.
