1.11.0 Release #547

Merged Β· 12 commits Β· Sep 27, 2024
2 changes: 2 additions & 0 deletions .github/workflows/release-publish.yml
@@ -18,6 +18,8 @@ jobs:
          ssh-private-key: ${{ secrets.BOT_SSH_PRIVATE_KEY }}

      - uses: actions/[email protected]
+       with:
+         fetch-depth: 0

      - uses: ./.github/actions/ruby-cache

22 changes: 22 additions & 0 deletions CHANGELOG.md
@@ -6,13 +6,35 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

### πŸ”„ Changed

# [1.11.0](https://github.com/GetStream/stream-video-swift/releases/tag/1.11.0)
_September 26, 2024_

### βœ… Added
- You can now pass your `customData` when initializing a `CallViewModel`; see the sketch below. [#530](https://github.com/GetStream/stream-video-swift/pull/530)
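
A minimal sketch of what this enables. The `customData` label and the `[String: RawJSON]` payload shown here are assumptions based on the changelog wording, so check PR #530 for the exact initializer signature:

```swift
import StreamVideo
import StreamVideoSwiftUI

// Assumed usage: the parameter label and payload type may differ from the shipped API.
let callViewModel = CallViewModel(
    customData: ["campaign": .string("spring-launch")]
)
```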

### πŸ”„ Changed
- Updated the default sorting for participants during a call to minimize the movement of already visible tiles [#515](https://github.com/GetStream/stream-video-swift/pull/515)
- **Breaking** The `StreamDeviceOrientation` values are now `.portrait(isUpsideDown: Bool)` & `.landscape(isLeft: Bool)`; see the sketch below. [#534](https://github.com/GetStream/stream-video-swift/pull/534)
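
A short sketch of switching over the new associated-value cases after this breaking change; the `orientation` value and the module import are placeholders rather than code from the PR:

```swift
import StreamVideo

// Placeholder value; in an app this would come from the SDK's orientation updates.
let orientation: StreamDeviceOrientation = .landscape(isLeft: true)

switch orientation {
case let .portrait(isUpsideDown):
    print("Portrait (upside down: \(isUpsideDown))")
case let .landscape(isLeft):
    print("Landscape (left: \(isLeft))")
}
```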

### 🐞 Fixed
- A `MissingPermissions` error was thrown when creating a `StreamVideo` client with the anonymous user type; a sketch of that setup follows. [#525](https://github.com/GetStream/stream-video-swift/pull/525)
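
For context, a hypothetical setup that previously triggered the error. The API key and token values are placeholders, and the `.anonymous` user factory plus the `UserToken(rawValue:)` initializer are assumptions:

```swift
import StreamVideo

// Placeholder credentials; creating the client for an anonymous user
// should no longer throw MissingPermissions.
let streamVideo = StreamVideo(
    apiKey: "your_api_key",
    user: .anonymous,
    token: UserToken(rawValue: "<anonymous-user-jwt>")
)
```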

# [1.10.0](https://github.com/GetStream/stream-video-swift/releases/tag/1.10.0)
_August 29, 2024_

### βœ… Added
- The participant count (regular and anonymous) can be accessed, before or after joining a call, via `Call.state.participantCount` and `Call.state.anonymousParticipantCount` respectively. [#496](https://github.com/GetStream/stream-video-swift/pull/496)
- You can now provide the `CallSettings` when you start a ringing call; both additions are shown in the sketch below. [#497](https://github.com/GetStream/stream-video-swift/pull/497)
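
A sketch combining both additions. The `create(memberIds:callSettings:ring:)` labels are assumptions inferred from the changelog wording, not a confirmed signature:

```swift
import Foundation
import StreamVideo

func startRingingCall(using streamVideo: StreamVideo) async throws {
    let call = streamVideo.call(callType: "default", callId: UUID().uuidString)

    // Participant counts are readable before joining the call.
    print("Total: \(call.state.participantCount), anonymous: \(call.state.anonymousParticipantCount)")

    // Start ringing the members with explicit call settings (assumed labels).
    try await call.create(
        memberIds: ["alice", "bob"],
        callSettings: CallSettings(audioOn: true, videoOn: false),
        ring: true
    )
}
```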

### πŸ”„ Changed
- The following `Call` APIs are now marked as `async` to provide better observability (see the sketch after this list):
- `func focus(at point: CGPoint)`
- `func addCapturePhotoOutput(_ capturePhotoOutput: AVCapturePhotoOutput)`
- `func removeCapturePhotoOutput(_ capturePhotoOutput: AVCapturePhotoOutput)`
- `func addVideoOutput(_ videoOutput: AVCaptureVideoDataOutput)`
- `func removeVideoOutput(_ videoOutput: AVCaptureVideoDataOutput)`
- `func zoom(by factor: CGFloat)`
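
Call sites for these APIs now need `await`. Below is a brief sketch of adopting the change from a synchronous context; the focus point and zoom factor are arbitrary values:

```swift
import CoreGraphics
import StreamVideo

// Wrapping the now-async camera controls in a Task; errors are logged
// rather than rethrown.
func adjustCamera(on call: Call) {
    Task {
        do {
            try await call.focus(at: CGPoint(x: 0.5, y: 0.5))
            try await call.zoom(by: 1.5)
        } catch {
            print("Camera adjustment failed: \(error)")
        }
    }
}
```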

# [1.0.9](https://github.com/GetStream/stream-video-swift/releases/tag/1.0.9)
_July 19, 2024_

2 changes: 1 addition & 1 deletion DemoApp/Sources/Components/AppEnvironment.swift
@@ -363,7 +363,7 @@ extension AppEnvironment {
    static var tokenExpiration: TokenExpiration = {
        switch configuration {
        case .debug:
-           return .oneMinute
+           return .never
        case .test:
            return .oneMinute
        case .release:
6 changes: 5 additions & 1 deletion DemoApp/Sources/Components/Router.swift
@@ -231,7 +231,11 @@ final class Router: ObservableObject {
        utils.userListProvider = appState
        streamVideoUI = StreamVideoUI(streamVideo: streamVideo, utils: utils)

-       appState.connectUser()
+       if user?.type != .anonymous {
+           appState.connectUser()
+       } else {
+           appState.loading = false
+       }
    }

    private func refreshToken(
@@ -10,49 +10,51 @@ import UIKit

final class LocalParticipantSnapshotViewModel: NSObject, AVCapturePhotoCaptureDelegate,
AVCaptureVideoDataOutputSampleBufferDelegate {

private actor State {
private(set) var isCapturingVideoFrame = false
private(set) var zoomFactor: Float = 1

func setIsCapturingVideoFrame(_ value: Bool) {
isCapturingVideoFrame = value
}

func setZoomFactor(_ value: Float) {
zoomFactor = value
}
}

private lazy var photoOutput: AVCapturePhotoOutput = .init()
private lazy var videoOutput: AVCaptureVideoDataOutput = .init()
private var state = State()

    weak var call: Call? {
        didSet {
            guard call?.cId != oldValue?.cId else { return }
-           do {
-               #if !targetEnvironment(simulator)
-               if #available(iOS 16.0, *) {
-                   try call?.addVideoOutput(videoOutput)
-                   /// Following Apple guidelines for videoOutputs from here:
-                   /// https://developer.apple.com/library/archive/technotes/tn2445/_index.html
-                   videoOutput.alwaysDiscardsLateVideoFrames = true
-               } else {
-                   try call?.addCapturePhotoOutput(photoOutput)
+           Task {
+               do {
+                   #if !targetEnvironment(simulator)
+                   if #available(iOS 16.0, *) {
+                       try await call?.addVideoOutput(videoOutput)
+                       /// Following Apple guidelines for videoOutputs from here:
+                       /// https://developer.apple.com/library/archive/technotes/tn2445/_index.html
+                       videoOutput.alwaysDiscardsLateVideoFrames = true
+                   } else {
+                       try await call?.addCapturePhotoOutput(photoOutput)
+                   }
+                   #endif
+               } catch {
+                   log.error("Failed to setup for localParticipant snapshot", error: error)
+               }
-               #endif
-           } catch {
-               log.error("Failed to setup for localParticipant snapshot", error: error)
-           }
        }
    }

func capturePhoto() {
guard !photoOutput.connections.isEmpty else { return }
photoOutput.capturePhoto(with: .init(), delegate: self)
}

func captureVideoFrame() {
guard !videoOutput.connections.isEmpty else { return }
videoOutput.setSampleBufferDelegate(
@@ -61,25 +63,25 @@ final class LocalParticipantSnapshotViewModel: NSObject, AVCapturePhotoCaptureDe
)
Task { await state.setIsCapturingVideoFrame(true) }
}

    func zoom() {
        Task {
            do {
                if await state.zoomFactor > 1 {
                    await state.setZoomFactor(1)
-                   try call?.zoom(by: 1)
+                   try await call?.zoom(by: 1)
                } else {
                    await state.setZoomFactor(1.5)
-                   try call?.zoom(by: 1.5)
+                   try await call?.zoom(by: 1.5)
                }
            } catch {
                log.error(error)
            }
        }
    }

// MARK: - Private Helpers

private func sendImageData(_ data: Data) async {
defer { videoOutput.setSampleBufferDelegate(nil, queue: nil) }
guard
@@ -89,7 +91,7 @@ final class LocalParticipantSnapshotViewModel: NSObject, AVCapturePhotoCaptureDe
else {
return
}

do {
try await call?.sendCustomEvent([
"snapshot": .string(snapshotData.base64EncodedString())
Expand All @@ -98,7 +100,7 @@ final class LocalParticipantSnapshotViewModel: NSObject, AVCapturePhotoCaptureDe
log.error("Failed to send image.", error: error)
}
}

private func resize(
image: UIImage,
to targetSize: CGSize
@@ -108,13 +110,13 @@ final class LocalParticipantSnapshotViewModel: NSObject, AVCapturePhotoCaptureDe
else {
return image
}

let widthRatio = targetSize.width / image.size.width
let heightRatio = targetSize.height / image.size.height

// Determine the scale factor that preserves aspect ratio
let scaleFactor = min(widthRatio, heightRatio)

let scaledWidth = image.size.width * scaleFactor
let scaledHeight = image.size.height * scaleFactor
let targetRect = CGRect(
@@ -123,19 +125,19 @@ final class LocalParticipantSnapshotViewModel: NSObject, AVCapturePhotoCaptureDe
width: scaledWidth,
height: scaledHeight
)

// Create a new image context
UIGraphicsBeginImageContextWithOptions(targetSize, false, 0)
image.draw(in: targetRect)

let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

return newImage
}

// MARK: - AVCapturePhotoCaptureDelegate

func photoOutput(
_ output: AVCapturePhotoOutput,
didFinishProcessingPhoto photo: AVCapturePhoto,
@@ -149,24 +151,24 @@ final class LocalParticipantSnapshotViewModel: NSObject, AVCapturePhotoCaptureDe
}
}
}

// MARK: - AVCaptureVideoDataOutputSampleBufferDelegate

func captureOutput(
_ output: AVCaptureOutput,
didOutput sampleBuffer: CMSampleBuffer,
from connection: AVCaptureConnection
) {
Task {
guard await state.isCapturingVideoFrame else { return }

if let imageBuffer = sampleBuffer.imageBuffer {
let ciImage = CIImage(cvPixelBuffer: imageBuffer)
if let data = UIImage(ciImage: ciImage).jpegData(compressionQuality: 1) {
await sendImageData(data)
}
}

await state.setIsCapturingVideoFrame(false)
}
}