New in 2.0.0
Fixes
- Improved Android SDK playback experience following simulcast layer switches.
- Fixed an extended frame freeze during switches between simulcast layers, especially prevalent during frequent switches.
- Greatly reduced the number of freezes when packet loss is experienced.
- Improved Android SDK playback in constrained bandwidth scenarios.
- Added MCCMSampleBufferVideoRendererDelegate to MCCMSampleBufferVideoRenderer.
  - Supports multiple MCSampleBufferVideoUIView instances using the same renderer object.
- Fixed an issue where an H264 stream encoded with the zerolatency and sliced-threads options enabled would show artifacts in the decoded frames when packet loss occurred.
- Fixed jagged lines when using the Metal video renderer (Apple platforms).
- Updated H264 decoders to add support for missing HIGH profiles and 1080p.
- On macOS, the SDK now checks for microphone/camera/screen-share permissions and requests them if they have not previously been requested.
- Fixed a crash in the C++ publisher sample app when credentials are not provided.
- Improved the stability of the C++ AudioCapturer.
- Introduced RTCIceCandidatePair statistics in the stats report.
- Improved the naming of certain public APIs in ObjC/Swift.
- Improved the logger: when an application provides its own logging callback via the set_logger function, default logging to stderr is disabled (unless force-enabled by the app):
// C++ API example
MILLICAST_API static void set_logger(
    std::function<void(const std::string& component,
                       const std::string& msg,
                       LogLevel)> f,
    std::vector<Logger::LogComponent>&& components,
    bool force_stderr_log = false);
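As a sketch of what wiring up such a callback can look like, here is a self-contained mock that mirrors the signature above. The Logger, LogLevel, and LogComponent types below are simplified stand-ins for illustration, not the SDK's actual definitions:

```cpp
#include <functional>
#include <string>
#include <utility>
#include <vector>

// Simplified stand-ins mirroring the set_logger signature above.
enum class LogLevel { Debug, Warning, Error };

struct Logger {
  enum class LogComponent { Sdk, WebRtc };
  using Callback = std::function<void(const std::string& component,
                                      const std::string& msg, LogLevel)>;
  static inline Callback callback;

  static void set_logger(Callback f,
                         std::vector<LogComponent>&& components,
                         bool force_stderr_log = false) {
    // The real SDK also disables its default stderr logging here unless
    // force_stderr_log is true.
    callback = std::move(f);
  }
};

// Route log lines into an app-owned sink, dropping Debug noise.
inline std::string demo() {
  std::string sink;
  Logger::set_logger(
      [&sink](const std::string& component, const std::string& msg,
              LogLevel level) {
        if (level != LogLevel::Debug)
          sink += "[" + component + "] " + msg + "\n";
      },
      {Logger::LogComponent::Sdk});
  Logger::callback("sdk", "connected", LogLevel::Warning);
  Logger::callback("sdk", "noisy detail", LogLevel::Debug);  // filtered out
  return sink;
}
```

The key point is that once the app's lambda is installed, it receives every log line (component, message, level) and decides where it goes.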
Breaking Changes
Viewer API improvement.
The APIs related to projecting sources on the subscriber have changed. Subscribers
no longer need to manage WebRTC remote tracks, project sources, and track their
state. Instead, subscribers receive remote tracks, each either audio or video,
which manage their own state. These tracks can be enabled with a renderer/view,
queried for the source they belong to, and they emit events such as activity
status (Inactive/Active), layers (for video), etc.
C++
An example of using the new APIs in C++:
std::vector<millicast::EventConnectionPtr> handlers;
handlers.push_back(viewer->add_event_handler(
    [](const millicast::Viewer::RtsTrackAdded& evt) {
      if (evt.track.as_video()) {
        evt.track.as_video()->enable(track_manager.add_renderer())
            .on_result([]() {});
      } else {
        evt.track.as_audio()->enable().on_result([]() {});
      }
    }));
To see these APIs, refer to the subscriber sample app which is shipped as part of the package.
Android
An example of using the new APIs in Kotlin:
subscriber.onRemoteTrack.collect { track ->
    if (track is RemoteVideoTrack) {
        track.enableAsync(videoSink)
    }
}
For further examples, please refer to the Android Getting Started sample application or the sample application in the package.
ObjC/Swift
An example for using the new APIs in Swift:
let subscriber = MCSubscriber()
let renderer = MCAcceleratedVideoRenderer() // or MCCMSampleBufferVideoRenderer
Task {
    for await track in subscriber.rtsRemoteTrackAdded() {
        // Only use one renderer per video track.
        if let videoTrack = track.asVideo() {
            try await videoTrack.enable(renderer: renderer)
        }
        if let audioTrack = track.asAudio() {
            try await audioTrack.enable()
        }
    }
}
- Please refer to the Swift Getting Started sample application for a running example.
- iOS views now take renderers in the initializer instead of tracks directly. Keep
a reference to the renderer so that it can be attached to tracks to receive media.
General improvements
- The synchronous APIs have been fully removed on the publisher and subscriber.
- Added target fps/height/width/bitrate members to layer events.
- The logger methods have been removed. Only a method to set the logger callback remains, which is the one that also takes the component being logged in the lambda:
class MCLoggerOSLog: NSObject, MCLoggerDelegate {
    func onLog(withMessage message: String, level: MCLogLevel) {
        switch level {
        case .LOG:
            Logger.sdk.info("\(message)")
        case .DEBUG, .VERBOSE:
            Logger.sdk.debug("\(message)")
        case .WARNING:
            Logger.sdk.warning("\(message)")
        case .ERROR:
            Logger.sdk.error("\(message)")
        case .OFF:
            break
        }
    }
}
- Improved the API to control the playout delay enforced on the client:
/**
* @brief Structure describing playout delay to be enforced on the client.
* https://webrtc.googlesource.com/src/+/refs/heads/main/docs/native-code/rtp-hdrext/playout-delay
* Setting both of these values to 0,0 will ensure that there is no
* playout delay added.
*/
struct ForcePlayoutDelay {
int min; /** The minimum playout delay desired. */
int max; /** The maximum playout delay desired. */
};
/**
* @brief The playout delay to enforce on the client side.
*/
std::optional<ForcePlayoutDelay> force_playout_delay;
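For illustration, a self-contained sketch of setting this option. ForcePlayoutDelay is reproduced from above; the ViewerOptions struct here is a hypothetical stand-in for wherever force_playout_delay lives in the SDK's option types:

```cpp
#include <optional>

// Reproduced from the declaration above (MILLICAST_API macro omitted).
struct ForcePlayoutDelay {
  int min;  // The minimum playout delay desired.
  int max;  // The maximum playout delay desired.
};

// Hypothetical stand-in for the SDK's option struct.
struct ViewerOptions {
  std::optional<ForcePlayoutDelay> force_playout_delay;
};

// Request the lowest possible latency: 0,0 ensures no playout
// delay is added on the client.
inline ViewerOptions lowest_latency_options() {
  ViewerOptions opts;
  opts.force_playout_delay = ForcePlayoutDelay{0, 0};
  return opts;
}
```

Leaving the optional unset keeps the default behavior; setting it to {0, 0} asks the client to add no playout delay at all.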
- Exposed stats information for: freeze_count, pause_count, total_freezes_duration, total_pauses_duration.
- For debugging and analytics purposes, the Stats object now contains a StreamDetails variant which holds either StreamViewDetails or StreamPublishDetails, based on whether you are viewing or publishing a stream.
  - The StreamViewDetails object contains strings describing the stream ID, cluster ID, stream view ID, and subscriber ID.
  - The StreamPublishDetails object contains strings describing the stream ID, cluster ID, uuid, feed ID, and publisher ID.
Android
On Android, the RTSReport object now contains the StreamDetails field:
data class StreamPublishDetails(
    val publisherId: String,
    val clusterId: String,
    val streamId: String,
    val feedId: String,
    val uuid: String
) : StreamDetails()

data class StreamViewDetails(
    val subscriberId: String,
    val clusterId: String,
    val streamId: String,
    val streamViewId: String
) : StreamDetails()
iOS
On iOS, the MCStatsReport object now includes a StreamDetails object:
MILLICAST_API @interface MCStreamPublishDetails: NSObject
@property(nonatomic, copy) NSString * publisherId;
@property(nonatomic, copy) NSString * clusterId;
@property(nonatomic, copy) NSString * streamId;
@property(nonatomic, copy) NSString * feedId;
@property(nonatomic, copy) NSString * uuid;
@end
MILLICAST_API @interface MCStreamViewDetails: NSObject
@property(nonatomic, copy) NSString * subscriberId;
@property(nonatomic, copy) NSString * clusterId;
@property(nonatomic, copy) NSString * streamId;
@property(nonatomic, copy) NSString * streamViewId;
@end
MILLICAST_API @interface MCStreamDetails: NSObject
-(MCStreamPublishDetails* _Nullable) asPublishDetails;
-(MCStreamViewDetails* _Nullable) asViewDetails;
@end
C++
In C++, StreamDetails is a std::variant containing one type of stream details:
struct MILLICAST_API StreamViewDetails {
std::string stream_view_id;
std::string subscriber_id;
std::string cluster_id;
std::string stream_id;
};
struct MILLICAST_API StreamPublishDetails {
std::string uuid;
std::string feed_id;
std::string publisher_id;
std::string cluster_id;
std::string stream_id;
};
using StreamDetails = std::variant<StreamViewDetails, StreamPublishDetails>;
They are exposed in the millicast::StatsReport
via:
virtual StreamDetails stream_details() const = 0;
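Because StreamDetails is a std::variant, consumers can branch on the held alternative with std::get_if or std::visit. A self-contained sketch, with the structs reproduced locally (MILLICAST_API macro omitted):

```cpp
#include <string>
#include <variant>

// Reproduced from the SDK definitions above so the variant access
// pattern can be shown on its own.
struct StreamViewDetails {
  std::string stream_view_id;
  std::string subscriber_id;
  std::string cluster_id;
  std::string stream_id;
};

struct StreamPublishDetails {
  std::string uuid;
  std::string feed_id;
  std::string publisher_id;
  std::string cluster_id;
  std::string stream_id;
};

using StreamDetails = std::variant<StreamViewDetails, StreamPublishDetails>;

// Produce a one-line description from whichever alternative is held.
inline std::string describe(const StreamDetails& details) {
  if (const auto* view = std::get_if<StreamViewDetails>(&details))
    return "viewing " + view->stream_id + " on " + view->cluster_id;
  const auto& pub = std::get<StreamPublishDetails>(details);
  return "publishing " + pub.stream_id + " as " + pub.publisher_id;
}
```

An application would obtain the variant from the report (e.g. via stream_details()) and branch the same way.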
ObjC/Swift
- Obj-C enums now utilise NS_ENUM rather than being global C-based enums.
- Removed OpenGL as an option in favor of using Metal.
- Enabled the Metal video renderer on macOS. MCIosVideoRenderer has been renamed to MCAcceleratedVideoRenderer, and the corresponding header file ios_renderer.h to accelerated_video_renderer.h.
Android
- Fixed a crash related to JNI layer overflow when reporting statistics.
- Fixed incorrect frameWidth, frameHeight, and timestamp stats values.