diff --git a/.github/ISSUE_TEMPLATE/BUG_REPORT.yml b/.github/ISSUE_TEMPLATE/BUG_REPORT.yml
index fe4a241b..fecd1f21 100644
--- a/.github/ISSUE_TEMPLATE/BUG_REPORT.yml
+++ b/.github/ISSUE_TEMPLATE/BUG_REPORT.yml
@@ -5,9 +5,9 @@ labels: [🐛 bug]
body:
- type: textarea
attributes:
- label: What were you trying to do?
- description: Explain what you are trying to do.
- placeholder: I wanted to take a picture.
+ label: What's happening?
+ description: Explain what you are trying to do and what happened instead. Be as precise as possible; I can't help you if I don't understand your issue.
+ placeholder: I wanted to take a picture, but the method failed with this error "[capture/photo-not-enabled] Failed to take photo, photo is not enabled!"
validations:
required: true
- type: textarea
@@ -15,18 +15,16 @@ body:
label: Reproduceable Code
description: Share a small reproduceable code snippet here (or the entire file if necessary). This will be automatically formatted into code, so no need for backticks.
render: tsx
- - type: textarea
- attributes:
- label: What happened instead?
- description: Explain what happened instead of the desired outcome. Did something crash?
- placeholder: The app crashes with an `InvalidPhotoCodec` error.
- validations:
- required: true
- type: textarea
attributes:
label: Relevant log output
description: Please copy and paste any relevant log output (Xcode Logs/Android Studio Logcat). This will be automatically formatted into code, so no need for backticks.
render: shell
+ - type: textarea
+ attributes:
+ label: Camera Device
+ description: Please paste the Camera `device` that was used, as JSON (`console.log(JSON.stringify(device, null, 2))`). This will be automatically formatted into code, so no need for backticks.
+ render: shell
- type: input
attributes:
label: Device
@@ -38,15 +36,22 @@ body:
attributes:
label: VisionCamera Version
description: Which version of react-native-vision-camera are you using?
- placeholder: ex. 2.0.1-beta.1
+ placeholder: ex. 3.1.6
validations:
required: true
+ - type: checkboxes
+ attributes:
+ label: Can you reproduce this issue in the VisionCamera Example app?
+ description: Run the example app (`package/example/`) and check whether the issue is reproducible there.
+ options:
+ - label: I can reproduce the issue in the VisionCamera Example app.
- type: checkboxes
attributes:
label: Additional information
description: Please check all the boxes that apply
options:
- label: I am using Expo
+ - label: I have enabled Frame Processors (react-native-worklets-core)
- label: I have read the [Troubleshooting Guide](https://react-native-vision-camera.com/docs/guides/troubleshooting)
required: true
- label: I agree to follow this project's [Code of Conduct](https://github.com/mrousavy/react-native-vision-camera/blob/main/CODE_OF_CONDUCT.md)
diff --git a/.github/ISSUE_TEMPLATE/BUILD_ERROR.yml b/.github/ISSUE_TEMPLATE/BUILD_ERROR.yml
new file mode 100644
index 00000000..18b80fb8
--- /dev/null
+++ b/.github/ISSUE_TEMPLATE/BUILD_ERROR.yml
@@ -0,0 +1,64 @@
+name: 🔧 Build Error
+description: File a build error bug report
+title: "🔧 "
+labels: [🔧 build error]
+body:
+ - type: textarea
+ attributes:
+ label: How were you trying to build the app?
+ description: Explain how you tried to build the app, e.g. through Xcode, `yarn ios`, a CI, or something else. Be as precise as possible; I can't help you if I don't understand your issue.
+ placeholder: I tried to build my app with react-native-vision-camera using the `yarn ios` command, and it failed.
+ validations:
+ required: true
+ - type: textarea
+ attributes:
+ label: Full build logs
+ description: Share the full build logs that appear in the console. Don't just paste the last few lines; include everything from start to finish.
+ render: tsx
+ - type: textarea
+ attributes:
+ label: Project dependencies
+ description: Share all of your project's dependencies and their versions from `package.json`. This helps identify any conflicting libraries.
+ render: tsx
+ validations:
+ required: true
+ - type: dropdown
+ attributes:
+ label: Target platforms
+ description: Select the platforms where the build error occurs.
+ multiple: true
+ options:
+ - iOS
+ - Android
+ validations:
+ required: true
+ - type: dropdown
+ attributes:
+ label: Operating system
+ description: Select the operating system you are trying to build on.
+ multiple: true
+ options:
+ - macOS
+ - Windows
+ - Linux
+ validations:
+ required: true
+ - type: checkboxes
+ attributes:
+ label: Can you build the VisionCamera Example app?
+ description: Try to build the example app (`package/example/`) and check whether the issue is reproducible there.
+ options:
+ - label: I can build the VisionCamera Example app.
+ - type: checkboxes
+ attributes:
+ label: Additional information
+ description: Please check all the boxes that apply
+ options:
+ - label: I am using Expo
+ - label: I have enabled Frame Processors (react-native-worklets-core)
+ - label: I have read the [Troubleshooting Guide](https://react-native-vision-camera.com/docs/guides/troubleshooting)
+ required: true
+ - label: I agree to follow this project's [Code of Conduct](https://github.com/mrousavy/react-native-vision-camera/blob/main/CODE_OF_CONDUCT.md)
+ required: true
+ - label: I searched for [similar issues in this repository](https://github.com/mrousavy/react-native-vision-camera/issues) and found none.
+ required: true
diff --git a/.github/workflows/build-android.yml b/.github/workflows/build-android.yml
index 9c95baf8..9608bc04 100644
--- a/.github/workflows/build-android.yml
+++ b/.github/workflows/build-android.yml
@@ -6,6 +6,7 @@ on:
- main
paths:
- '.github/workflows/build-android.yml'
+ - 'cpp/**'
- 'android/**'
- 'example/android/**'
- 'yarn.lock'
@@ -13,22 +14,26 @@ on:
pull_request:
paths:
- '.github/workflows/build-android.yml'
+ - 'cpp/**'
- 'android/**'
- 'example/android/**'
- 'yarn.lock'
- 'example/yarn.lock'
jobs:
- build_example:
+ build:
name: Build Android Example App
runs-on: ubuntu-latest
+ defaults:
+ run:
+ working-directory: ./package
steps:
- uses: actions/checkout@v2
- - name: Setup JDK 1.8
+ - name: Setup JDK 11
uses: actions/setup-java@v1
with:
- java-version: 1.8
+ java-version: 11
- name: Get yarn cache directory path
id: yarn-cache-dir-path
@@ -55,7 +60,49 @@ jobs:
key: ${{ runner.os }}-gradle-${{ hashFiles('**/*.gradle*', '**/gradle-wrapper.properties') }}
restore-keys: |
${{ runner.os }}-gradle-
- - name: Run Gradle Build for android/
- run: cd android && ./gradlew assembleDebug --build-cache && cd ..
+ - name: Run Gradle Build for example/android/
+ run: cd example/android && ./gradlew assembleDebug --build-cache && cd ../..
+
+ build-no-frame-processors:
+ name: Build Android Example App (without Frame Processors)
+ runs-on: ubuntu-latest
+ defaults:
+ run:
+ working-directory: ./package
+ steps:
+ - uses: actions/checkout@v2
+
+ - name: Setup JDK 11
+ uses: actions/setup-java@v1
+ with:
+ java-version: 11
+
+ - name: Get yarn cache directory path
+ id: yarn-cache-dir-path
+ run: echo "::set-output name=dir::$(yarn cache dir)"
+ - name: Restore node_modules from cache
+ uses: actions/cache@v2
+ id: yarn-cache
+ with:
+ path: ${{ steps.yarn-cache-dir-path.outputs.dir }}
+ key: ${{ runner.os }}-yarn-${{ hashFiles('**/yarn.lock') }}
+ restore-keys: |
+ ${{ runner.os }}-yarn-
+ - name: Install node_modules
+ run: yarn install --frozen-lockfile
+ - name: Install node_modules for example/
+ run: yarn install --frozen-lockfile --cwd example
+ - name: Remove react-native-worklets-core
+ run: yarn remove react-native-worklets-core --cwd example
+
+ - name: Restore Gradle cache
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.gradle/caches
+ ~/.gradle/wrapper
+ key: ${{ runner.os }}-gradle-${{ hashFiles('**/*.gradle*', '**/gradle-wrapper.properties') }}
+ restore-keys: |
+ ${{ runner.os }}-gradle-
- name: Run Gradle Build for example/android/
run: cd example/android && ./gradlew assembleDebug --build-cache && cd ../..
diff --git a/.github/workflows/build-ios.yml b/.github/workflows/build-ios.yml
index 8fbf15aa..d40d27ad 100644
--- a/.github/workflows/build-ios.yml
+++ b/.github/workflows/build-ios.yml
@@ -6,12 +6,14 @@ on:
- main
paths:
- '.github/workflows/build-ios.yml'
+ - 'cpp/**'
- 'ios/**'
- '*.podspec'
- 'example/ios/**'
pull_request:
paths:
- '.github/workflows/build-ios.yml'
+ - 'cpp/**'
- 'ios/**'
- '*.podspec'
- 'example/ios/**'
@@ -22,7 +24,7 @@ jobs:
runs-on: macOS-latest
defaults:
run:
- working-directory: example/ios
+ working-directory: package/example/ios
steps:
- uses: actions/checkout@v2
@@ -47,9 +49,8 @@ jobs:
- name: Setup Ruby (bundle)
uses: ruby/setup-ruby@v1
with:
- ruby-version: 2.6
+ ruby-version: 2.6.10
bundler-cache: true
- working-directory: example/ios
- name: Restore Pods cache
uses: actions/cache@v2
@@ -62,7 +63,68 @@ jobs:
restore-keys: |
${{ runner.os }}-pods-
- name: Install Pods
- run: bundle exec pod check || bundle exec pod install
+ run: pod install
+ - name: Install xcpretty
+ run: gem install xcpretty
+ - name: Build App
+ run: "set -o pipefail && xcodebuild \
+ CC=clang CPLUSPLUS=clang++ LD=clang LDPLUSPLUS=clang++ \
+ -derivedDataPath build -UseModernBuildSystem=YES \
+ -workspace VisionCameraExample.xcworkspace \
+ -scheme VisionCameraExample \
+ -sdk iphonesimulator \
+ -configuration Debug \
+ -destination 'platform=iOS Simulator,name=iPhone 11 Pro' \
+ build \
+ CODE_SIGNING_ALLOWED=NO | xcpretty"
+
+ build-no-frame-processors:
+ name: Build iOS Example App without Frame Processors
+ runs-on: macOS-latest
+ defaults:
+ run:
+ working-directory: package/example/ios
+ steps:
+ - uses: actions/checkout@v2
+
+ - name: Get yarn cache directory path
+ id: yarn-cache-dir-path
+ run: echo "::set-output name=dir::$(yarn cache dir)"
+ - name: Restore node_modules from cache
+ uses: actions/cache@v2
+ id: yarn-cache
+ with:
+ path: ${{ steps.yarn-cache-dir-path.outputs.dir }}
+ key: ${{ runner.os }}-yarn-${{ hashFiles('**/yarn.lock') }}
+ restore-keys: |
+ ${{ runner.os }}-yarn-
+ - name: Install node_modules for example/
+ run: yarn install --frozen-lockfile --cwd ..
+ - name: Remove react-native-worklets-core
+ run: yarn remove react-native-worklets-core --cwd ..
+
+ - name: Restore buildcache
+ uses: mikehardy/buildcache-action@v1
+ continue-on-error: true
+
+ - name: Setup Ruby (bundle)
+ uses: ruby/setup-ruby@v1
+ with:
+ ruby-version: 2.6.10
+ bundler-cache: true
+
+ - name: Restore Pods cache
+ uses: actions/cache@v2
+ with:
+ path: |
+ example/ios/Pods
+ ~/Library/Caches/CocoaPods
+ ~/.cocoapods
+ key: ${{ runner.os }}-pods-${{ hashFiles('**/Podfile.lock') }}
+ restore-keys: |
+ ${{ runner.os }}-pods-
+ - name: Install Pods
+ run: pod install
- name: Install xcpretty
run: gem install xcpretty
- name: Build App
diff --git a/.github/workflows/notice-yarn-changes.yml b/.github/workflows/notice-yarn-changes.yml
deleted file mode 100644
index 6d155e73..00000000
--- a/.github/workflows/notice-yarn-changes.yml
+++ /dev/null
@@ -1,19 +0,0 @@
-name: Notice yarn.lock changes
-on: [pull_request]
-
-jobs:
- check:
- runs-on: ubuntu-latest
- permissions:
- pull-requests: write
- steps:
- - name: Checkout
- uses: actions/checkout@v2
- - name: Notice yarn.lock changes
- uses: Simek/yarn-lock-changes@main
- with:
- token: ${{ secrets.GITHUB_TOKEN }}
- collapsibleThreshold: '25'
- failOnDowngrade: 'false'
- path: 'yarn.lock'
- updateComment: 'true'
diff --git a/.github/workflows/validate-android.yml b/.github/workflows/validate-android.yml
index e7054512..1f569290 100644
--- a/.github/workflows/validate-android.yml
+++ b/.github/workflows/validate-android.yml
@@ -20,13 +20,13 @@ jobs:
runs-on: ubuntu-latest
defaults:
run:
- working-directory: ./android
+ working-directory: ./package/android
steps:
- uses: actions/checkout@v2
- - name: Setup JDK 1.8
+ - name: Setup JDK 11
uses: actions/setup-java@v1
with:
- java-version: 1.8
+ java-version: 11
- name: Get yarn cache directory path
id: yarn-cache-dir-path
diff --git a/.github/workflows/validate-cpp.yml b/.github/workflows/validate-cpp.yml
index fd3a32f4..13848ab5 100644
--- a/.github/workflows/validate-cpp.yml
+++ b/.github/workflows/validate-cpp.yml
@@ -6,30 +6,32 @@ on:
- main
paths:
- '.github/workflows/validate-cpp.yml'
- - 'cpp/**'
- - 'android/src/main/cpp/**'
+ - 'package/cpp/**'
+ - 'package/android/src/main/cpp/**'
+ - 'package/ios/**'
pull_request:
paths:
- '.github/workflows/validate-cpp.yml'
- - 'cpp/**'
- - 'android/src/main/cpp/**'
+ - 'package/cpp/**'
+ - 'package/android/src/main/cpp/**'
+ - 'package/ios/**'
jobs:
lint:
- name: cpplint
+ name: Check clang-format
runs-on: ubuntu-latest
+ strategy:
+ matrix:
+ path:
+ - 'package/cpp'
+ - 'package/android/src/main/cpp'
+ - 'package/ios'
steps:
- uses: actions/checkout@v2
- - uses: reviewdog/action-cpplint@master
+ - name: Run clang-format style check
+ uses: mrousavy/clang-format-action@v1
with:
- github_token: ${{ secrets.github_token }}
- reporter: github-pr-review
- flags: --linelength=230 --exclude "android/src/main/cpp/reanimated-headers"
- targets: --recursive cpp android/src/main/cpp
- filter: "-legal/copyright\
- ,-readability/todo\
- ,-build/namespaces\
- ,-whitespace/comments\
- ,-build/include_order\
- ,-build/c++11\
- "
+ clang-format-version: '16'
+ check-path: ${{ matrix.path }}
+ clang-format-style-path: package/cpp/.clang-format
+
diff --git a/.github/workflows/validate-ios.yml b/.github/workflows/validate-ios.yml
index 395050ff..82fe5a74 100644
--- a/.github/workflows/validate-ios.yml
+++ b/.github/workflows/validate-ios.yml
@@ -15,6 +15,9 @@ on:
jobs:
SwiftLint:
runs-on: ubuntu-latest
+ defaults:
+ run:
+ working-directory: ./package
steps:
- uses: actions/checkout@v2
- name: Run SwiftLint GitHub Action (--strict)
@@ -27,7 +30,7 @@ jobs:
runs-on: macOS-latest
defaults:
run:
- working-directory: ./ios
+ working-directory: ./package/ios
steps:
- uses: actions/checkout@v2
diff --git a/.github/workflows/validate-js.yml b/.github/workflows/validate-js.yml
index f430c2ba..53f1c175 100644
--- a/.github/workflows/validate-js.yml
+++ b/.github/workflows/validate-js.yml
@@ -6,32 +6,35 @@ on:
- main
paths:
- '.github/workflows/validate-js.yml'
- - 'src/**'
- - '*.json'
- - '*.js'
- - '*.lock'
- - 'example/src/**'
- - 'example/*.json'
- - 'example/*.js'
- - 'example/*.lock'
- - 'example/*.tsx'
+ - 'package/src/**'
+ - 'package/*.json'
+ - 'package/*.js'
+ - 'package/*.lock'
+ - 'package/example/src/**'
+ - 'package/example/*.json'
+ - 'package/example/*.js'
+ - 'package/example/*.lock'
+ - 'package/example/*.tsx'
pull_request:
paths:
- '.github/workflows/validate-js.yml'
- - 'src/**'
- - '*.json'
- - '*.js'
- - '*.lock'
- - 'example/src/**'
- - 'example/*.json'
- - 'example/*.js'
- - 'example/*.lock'
- - 'example/*.tsx'
+ - 'package/src/**'
+ - 'package/*.json'
+ - 'package/*.js'
+ - 'package/*.lock'
+ - 'package/example/src/**'
+ - 'package/example/*.json'
+ - 'package/example/*.js'
+ - 'package/example/*.lock'
+ - 'package/example/*.tsx'
jobs:
compile:
name: Compile JS (tsc)
runs-on: ubuntu-latest
+ defaults:
+ run:
+ working-directory: ./package
steps:
- uses: actions/checkout@v2
@@ -70,6 +73,9 @@ jobs:
lint:
name: Lint JS (eslint, prettier)
runs-on: ubuntu-latest
+ defaults:
+ run:
+ working-directory: ./package
steps:
- uses: actions/checkout@v2
diff --git a/.gitignore b/.gitignore
index 86e5443f..5b5824fb 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,70 +1,6 @@
-# OSX
-#
.DS_Store
-# XDE
-.expo/
-
-# VSCode
-jsconfig.json
-
-# Xcode
-#
-build/
-*.pbxuser
-!default.pbxuser
-*.mode1v3
-!default.mode1v3
-*.mode2v3
-!default.mode2v3
-*.perspectivev3
-!default.perspectivev3
-xcuserdata
-*.xccheckout
-*.moved-aside
-DerivedData
-*.hmap
-*.ipa
-*.xcuserstate
-project.xcworkspace
-
-# Android/IJ
-#
-.idea
-.gradle
-local.properties
-android.iml
-*.hprof
-
-# Cocoapods
-#
-example/ios/Pods
-
-# node.js
-#
-node_modules/
-npm-debug.log
-yarn-debug.log
-yarn-error.log
-
-# BUCK
-buck-out/
-\.buckd/
-android/app/libs
-android/keystores/debug.keystore
-
-# Expo
-.expo/*
-
-# generated by bob
-lib/
-
-# we only use yarn
-package-lock.json
-
-# TypeDoc/Docusaurus stuff
-docs/docs/api
-
-# External native build folder generated in Android Studio 2.2 and later
-.externalNativeBuild
-.cxx/
+# no yarn/npm in the root repo!
+/package-lock.json
+/yarn.lock
+**/node_modules/
diff --git a/.vscode/settings.json b/.vscode/settings.json
deleted file mode 100644
index 3662b370..00000000
--- a/.vscode/settings.json
+++ /dev/null
@@ -1,3 +0,0 @@
-{
- "typescript.tsdk": "node_modules/typescript/lib"
-}
\ No newline at end of file
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index d9a3f65b..186de9f4 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -2,7 +2,8 @@
## Guidelines
-1. Don't be rude.
+1. Don't be an asshole.
+2. Don't waste anyone's time.
## Get started
@@ -10,6 +11,7 @@
2. Install dependencies
```
cd react-native-vision-camera
+ cd package
yarn bootstrap
```
@@ -39,7 +41,11 @@ Read the READMEs in [`android/`](android/README.md) and [`ios/`](ios/README.md)
1. Open the `example/android/` folder with Android Studio
2. Start the metro bundler in the `example/` directory using `yarn start`
3. Select your device in the devices drop-down
-4. Hit run
+4. Once your device is connected, make sure it can reach the Metro bundler's port:
+ ```
+ adb reverse tcp:8081 tcp:8081
+ ```
+5. Hit run
> Run `yarn check-android` to validate codestyle
diff --git a/README.md b/README.md
index 863b8d03..1f07ee63 100644
--- a/README.md
+++ b/README.md
@@ -1,56 +1,46 @@
-
+
-
Vision Camera
-
-
-
-
-
+
-### ‼️‼️‼️‼️‼️ ✨ VisionCamera V3 ‼️‼️‼️‼️‼️
-
-**See [this discussion](https://github.com/mrousavy/react-native-vision-camera/issues/1376) for the latest upcoming version of VisionCamera**
-
-### Documentation
-
-* [Guides](https://react-native-vision-camera.com/docs/guides)
-* [API](https://react-native-vision-camera.com/docs/api)
-* [Example](./example/)
-
### Features
-* Photo, Video and Snapshot capture
-* Customizable devices and multi-cameras (smoothly zoom out to "fish-eye" camera)
-* Customizable FPS
+VisionCamera is a powerful and fast Camera component for React Native. It features:
+
+* Photo and Video capture
+* Customizable devices and multi-cameras ("fish-eye" zoom)
+* Customizable resolutions and aspect-ratios (4k/8k images)
+* Customizable FPS (30..240 FPS)
* [Frame Processors](https://react-native-vision-camera.com/docs/guides/frame-processors) (JS worklets to run QR-Code scanning, facial recognition, AI object detection, realtime video chats, ...)
* Smooth zooming (Reanimated)
* Fast pause and resume
* HDR & Night modes
+* Custom C++/GPU accelerated video pipeline (OpenGL)
-> See the [example](./example/) app
+Install VisionCamera from npm:
+
+```sh
+yarn add react-native-vision-camera
+cd ios && pod install
+```
+
+...and get started by [setting up permissions](https://react-native-vision-camera.com/docs/guides)!
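+
+For a quick start, you can request permissions before rendering the camera. This is only a minimal sketch (the exact permission status strings are an assumption here and may differ between versions); see the permissions guide above for details:
+
+```tsx
+import { Camera } from 'react-native-vision-camera'
+
+// Request camera + microphone permission before showing the <Camera> view.
+// Note: the resolved status strings ('granted' / 'denied') are an assumption,
+// check the docs of the version you have installed.
+async function requestPermissions(): Promise<boolean> {
+  const camera = await Camera.requestCameraPermission()
+  const microphone = await Camera.requestMicrophonePermission()
+  return camera === 'granted' && microphone === 'granted'
+}
+```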
+
+### Documentation
+
+* [Guides](https://react-native-vision-camera.com/docs/guides)
+* [API](https://react-native-vision-camera.com/docs/api)
+* [Example](./package/example/)
+* [Frame Processor Plugins](https://react-native-vision-camera.com/docs/guides/frame-processor-plugin-list)
### Example
@@ -70,6 +60,8 @@ function App() {
}
```
+> See the [example](./package/example/) app
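+
+Frame Processors let you run a small JS worklet on every camera frame. Below is a minimal sketch, assuming the `useFrameProcessor` hook and an installed `react-native-worklets-core`; see the Frame Processors guide for the full API:
+
+```tsx
+import React from 'react'
+import { StyleSheet } from 'react-native'
+import { Camera, useFrameProcessor } from 'react-native-vision-camera'
+import type { CameraDevice } from 'react-native-vision-camera'
+
+function CameraScreen({ device }: { device: CameraDevice }) {
+  // Runs as a worklet on every frame, off the main JS thread.
+  const frameProcessor = useFrameProcessor((frame) => {
+    'worklet'
+    console.log(`Frame: ${frame.width}x${frame.height}`)
+  }, [])
+
+  return (
+    <Camera
+      style={StyleSheet.absoluteFill}
+      device={device}
+      isActive={true}
+      frameProcessor={frameProcessor}
+    />
+  )
+}
+```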
+
### Adopting at scale
@@ -80,6 +72,9 @@ VisionCamera is provided _as is_, I work on it in my free time.
If you're integrating VisionCamera in a production app, consider [funding this project](https://github.com/sponsors/mrousavy) and contact me to receive premium enterprise support, help with issues, prioritize bugfixes, request features, help at integrating VisionCamera and/or Frame Processors, and more.
-
+### Socials
-#### 🚀 Get started by [setting up permissions](https://react-native-vision-camera.com/docs/guides/)!
+* 🐦 [**Follow me on Twitter**](https://twitter.com/mrousavy) for updates
+* 📝 [**Check out my blog**](https://mrousavy.com/blog) for examples and experiments
+* 💖 [**Sponsor me on GitHub**](https://github.com/sponsors/mrousavy) to support my work
+* 🍪 [**Buy me a Ko-Fi**](https://ko-fi.com/mrousavy) to support my work
diff --git a/VisionCamera.podspec b/VisionCamera.podspec
deleted file mode 100644
index cbdf58e8..00000000
--- a/VisionCamera.podspec
+++ /dev/null
@@ -1,67 +0,0 @@
-require "json"
-
-package = JSON.parse(File.read(File.join(__dir__, "package.json")))
-
-reactVersion = '0.0.0'
-begin
- reactVersion = JSON.parse(File.read(File.join(__dir__, "..", "react-native", "package.json")))["version"]
-rescue
- reactVersion = '0.66.0'
-end
-rnVersion = reactVersion.split('.')[1]
-
-folly_flags = '-DFOLLY_NO_CONFIG -DFOLLY_MOBILE=1 -DFOLLY_USE_LIBCPP=1 -DRNVERSION=' + rnVersion
-folly_compiler_flags = folly_flags + ' ' + '-Wno-comma -Wno-shorten-64-to-32'
-folly_version = '2021.04.26.00'
-boost_compiler_flags = '-Wno-documentation'
-
-Pod::Spec.new do |s|
- s.name = "VisionCamera"
- s.version = package["version"]
- s.summary = package["description"]
- s.homepage = package["homepage"]
- s.license = package["license"]
- s.authors = package["author"]
-
- s.platforms = { :ios => "11.0" }
- s.source = { :git => "https://github.com/mrousavy/react-native-vision-camera.git", :tag => "#{s.version}" }
-
- s.pod_target_xcconfig = {
- "USE_HEADERMAP" => "YES",
- "HEADER_SEARCH_PATHS" => "\"$(PODS_TARGET_SRCROOT)/ReactCommon\" \"$(PODS_TARGET_SRCROOT)\" \"$(PODS_ROOT)/RCT-Folly\" \"$(PODS_ROOT)/boost\" \"$(PODS_ROOT)/boost-for-react-native\" \"$(PODS_ROOT)/DoubleConversion\" \"$(PODS_ROOT)/Headers/Private/React-Core\" "
- }
- s.compiler_flags = folly_compiler_flags + ' ' + boost_compiler_flags
- s.xcconfig = {
- "CLANG_CXX_LANGUAGE_STANDARD" => "c++17",
- "HEADER_SEARCH_PATHS" => "\"$(PODS_ROOT)/boost\" \"$(PODS_ROOT)/boost-for-react-native\" \"$(PODS_ROOT)/glog\" \"$(PODS_ROOT)/RCT-Folly\" \"${PODS_ROOT}/Headers/Public/React-hermes\" \"${PODS_ROOT}/Headers/Public/hermes-engine\"",
- "OTHER_CFLAGS" => "$(inherited)" + " " + folly_flags
- }
-
- s.requires_arc = true
-
- # All source files that should be publicly visible
- # Note how this does not include headers, since those can nameclash.
- s.source_files = [
- "ios/**/*.{m,mm,swift}",
- "ios/CameraBridge.h",
- "ios/Frame Processor/Frame.h",
- "ios/Frame Processor/FrameProcessorCallback.h",
- "ios/Frame Processor/FrameProcessorRuntimeManager.h",
- "ios/Frame Processor/FrameProcessorPluginRegistry.h",
- "ios/Frame Processor/FrameProcessorPlugin.h",
- "ios/React Utils/RCTBridge+runOnJS.h",
- "ios/React Utils/JSConsoleHelper.h",
- "cpp/**/*.{cpp}",
- ]
- # Any private headers that are not globally unique should be mentioned here.
- # Otherwise there will be a nameclash, since CocoaPods flattens out any header directories
- # See https://github.com/firebase/firebase-ios-sdk/issues/4035 for more details.
- s.preserve_paths = [
- "cpp/**/*.h",
- "ios/**/*.h"
- ]
-
- s.dependency "React-callinvoker"
- s.dependency "React"
- s.dependency "React-Core"
-end
diff --git a/android/CMakeLists.txt b/android/CMakeLists.txt
deleted file mode 100644
index d400df36..00000000
--- a/android/CMakeLists.txt
+++ /dev/null
@@ -1,260 +0,0 @@
-project(VisionCamera)
-cmake_minimum_required(VERSION 3.4.1)
-
-set (CMAKE_VERBOSE_MAKEFILE ON)
-set (CMAKE_CXX_STANDARD 14)
-
-if(${REACT_NATIVE_VERSION} GREATER_EQUAL 71)
- include("${NODE_MODULES_DIR}/react-native/ReactAndroid/cmake-utils/folly-flags.cmake")
- add_compile_options(${folly_FLAGS})
-else()
- set (CMAKE_CXX_FLAGS "-DFOLLY_NO_CONFIG=1 -DFOLLY_HAVE_CLOCK_GETTIME=1 -DFOLLY_HAVE_MEMRCHR=1 -DFOLLY_USE_LIBCPP=1 -DFOLLY_MOBILE=1 -DON_ANDROID -DONANDROID -DFOR_HERMES=${FOR_HERMES}")
-endif()
-
-
-set (PACKAGE_NAME "VisionCamera")
-set (BUILD_DIR ${CMAKE_SOURCE_DIR}/build)
-if(${REACT_NATIVE_VERSION} GREATER_EQUAL 71)
- # Consume shared libraries and headers from prefabs
- find_package(fbjni REQUIRED CONFIG)
- find_package(ReactAndroid REQUIRED CONFIG)
-else()
- set (RN_SO_DIR ${NODE_MODULES_DIR}/react-native/ReactAndroid/src/main/jni/first-party/react/jni)
-endif()
-# VisionCamera shared
-
-if(${REACT_NATIVE_VERSION} LESS 66)
- set (
- INCLUDE_JSI_CPP
- "${NODE_MODULES_DIR}/react-native/ReactCommon/jsi/jsi/jsi.cpp"
- )
- set (
- INCLUDE_JSIDYNAMIC_CPP
- "${NODE_MODULES_DIR}/react-native/ReactCommon/jsi/jsi/JSIDynamic.cpp"
- )
-endif()
-
-add_library(
- ${PACKAGE_NAME}
- SHARED
- src/main/cpp/VisionCamera.cpp
- src/main/cpp/JSIJNIConversion.cpp
- src/main/cpp/FrameHostObject.cpp
- src/main/cpp/FrameProcessorRuntimeManager.cpp
- src/main/cpp/CameraView.cpp
- src/main/cpp/VisionCameraScheduler.cpp
- src/main/cpp/java-bindings/JFrameProcessorPlugin.cpp
- src/main/cpp/java-bindings/JImageProxy.cpp
- src/main/cpp/java-bindings/JHashMap.cpp
-)
-
-# includes
-if(${REACT_NATIVE_VERSION} GREATER_EQUAL 71)
- target_include_directories(
- ${PACKAGE_NAME}
- PRIVATE
- "${NODE_MODULES_DIR}/react-native/ReactAndroid/src/main/jni/react/turbomodule"
- "${NODE_MODULES_DIR}/react-native/ReactCommon"
- "${NODE_MODULES_DIR}/react-native/ReactCommon/callinvoker"
- "${NODE_MODULES_DIR}/react-native/ReactCommon/jsi"
- "${NODE_MODULES_DIR}/react-native/ReactCommon/react/renderer/graphics/platform/cxx"
- "${NODE_MODULES_DIR}/react-native/ReactCommon/runtimeexecutor"
- "${NODE_MODULES_DIR}/react-native/ReactCommon/yoga"
- # --- Reanimated ---
- # New
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/AnimatedSensor"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/Tools"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/SpecTools"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/SharedItems"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/Registries"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/LayoutAnimations"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/hidden_headers"
- "src/main/cpp"
- )
-else()
- file (GLOB LIBFBJNI_INCLUDE_DIR "${BUILD_DIR}/fbjni-*-headers.jar/")
-
- target_include_directories(
- ${PACKAGE_NAME}
- PRIVATE
- # --- fbjni ---
- "${LIBFBJNI_INCLUDE_DIR}"
- # --- Third Party (required by RN) ---
- "${BUILD_DIR}/third-party-ndk/boost"
- "${BUILD_DIR}/third-party-ndk/double-conversion"
- "${BUILD_DIR}/third-party-ndk/folly"
- "${BUILD_DIR}/third-party-ndk/glog"
- # --- React Native ---
- "${NODE_MODULES_DIR}/react-native/React"
- "${NODE_MODULES_DIR}/react-native/React/Base"
- "${NODE_MODULES_DIR}/react-native/ReactAndroid/src/main/jni"
- "${NODE_MODULES_DIR}/react-native/ReactAndroid/src/main/java/com/facebook/react/turbomodule/core/jni"
- "${NODE_MODULES_DIR}/react-native/ReactCommon"
- "${NODE_MODULES_DIR}/react-native/ReactCommon/callinvoker"
- "${NODE_MODULES_DIR}/react-native/ReactCommon/jsi"
- "${NODE_MODULES_DIR}/hermes-engine/android/include/"
- ${INCLUDE_JSI_CPP} # only on older RN versions
- ${INCLUDE_JSIDYNAMIC_CPP} # only on older RN versions
- # --- Reanimated ---
- # Old
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/headers/AnimatedSensor"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/headers/Tools"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/headers/SpecTools"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/headers/SharedItems"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/headers/Registries"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/headers/LayoutAnimations"
- # New
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/AnimatedSensor"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/Tools"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/SpecTools"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/SharedItems"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/Registries"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/LayoutAnimations"
- "${NODE_MODULES_DIR}/react-native-reanimated/Common/cpp/hidden_headers"
- "src/main/cpp"
- )
-endif()
-
-
-
-# find libraries
-
-file (GLOB LIBRN_DIR "${BUILD_DIR}/react-native-0*/jni/${ANDROID_ABI}")
-
-if(${FOR_HERMES})
- string(APPEND CMAKE_CXX_FLAGS " -DFOR_HERMES=1")
-
- if(${REACT_NATIVE_VERSION} GREATER_EQUAL 71)
- find_package(hermes-engine REQUIRED CONFIG)
- elseif(${REACT_NATIVE_VERSION} GREATER_EQUAL 69)
- # Bundled Hermes from module `com.facebook.react:hermes-engine` or project `:ReactAndroid:hermes-engine`
- target_include_directories(
- ${PACKAGE_NAME}
- PRIVATE
- "${JS_RUNTIME_DIR}/API"
- "${JS_RUNTIME_DIR}/public"
- )
- else()
- # From `hermes-engine` npm package
- target_include_directories(
- ${PACKAGE_NAME}
- PRIVATE
- "${JS_RUNTIME_DIR}/android/include"
- )
- endif()
-
- if(${REACT_NATIVE_VERSION} GREATER_EQUAL 71)
- target_link_libraries(
- ${PACKAGE_NAME}
- "hermes-engine::libhermes"
- )
- else()
- target_link_libraries(
- ${PACKAGE_NAME}
- "${BUILD_DIR}/third-party-ndk/hermes/jni/${ANDROID_ABI}/libhermes.so"
- )
- endif()
- file (GLOB LIBREANIMATED_DIR "${BUILD_DIR}/react-native-reanimated-*-hermes.aar/jni/${ANDROID_ABI}")
-else()
- file (GLOB LIBJSC_DIR "${BUILD_DIR}/android-jsc*.aar/jni/${ANDROID_ABI}")
-
- if(${REACT_NATIVE_VERSION} GREATER_EQUAL 71)
- set(JS_ENGINE_LIB ReactAndroid::jscexecutor)
- else()
- # Use JSC
- find_library(
- JS_ENGINE_LIB
- jscexecutor
- PATHS ${LIBRN_DIR}
- NO_CMAKE_FIND_ROOT_PATH
- )
- endif()
- target_link_libraries(
- ${PACKAGE_NAME}
- ${JS_ENGINE_LIB}
- )
-
- # Use Reanimated JSC
- file (GLOB LIBREANIMATED_DIR "${BUILD_DIR}/react-native-reanimated-*-jsc.aar/jni/${ANDROID_ABI}")
-endif()
-
-if(${REACT_NATIVE_VERSION} LESS 71)
- find_library(
- FBJNI_LIB
- fbjni
- PATHS ${LIBRN_DIR}
- NO_CMAKE_FIND_ROOT_PATH
- )
-endif()
-
-if(${REACT_NATIVE_VERSION} LESS 69)
- find_library(
- FOLLY_LIB
- folly_json
- PATHS ${LIBRN_DIR}
- NO_CMAKE_FIND_ROOT_PATH
- )
-elseif(${REACT_NATIVE_VERSION} LESS 71)
- find_library(
- FOLLY_LIB
- folly_runtime
- PATHS ${LIBRN_DIR}
- NO_CMAKE_FIND_ROOT_PATH
- )
-endif()
-
-if(${REACT_NATIVE_VERSION} LESS 71)
- find_library(
- REACT_NATIVE_JNI_LIB
- reactnativejni
- PATHS ${LIBRN_DIR}
- NO_CMAKE_FIND_ROOT_PATH
- )
-endif()
-
-if(${REACT_NATIVE_VERSION} GREATER_EQUAL 71)
- target_link_libraries(
- ${PACKAGE_NAME}
- ReactAndroid::folly_runtime
- ReactAndroid::glog
- ReactAndroid::jsi
- ReactAndroid::reactnativejni
- fbjni::fbjni
- )
-elseif(${REACT_NATIVE_VERSION} LESS 66)
- # JSI lib didn't exist on RN 0.65 and before. Simply omit it.
- set (JSI_LIB "")
-else()
- # RN 0.66 distributes libjsi.so, can be used instead of compiling jsi.cpp manually.
- find_library(
- JSI_LIB
- jsi
- PATHS ${LIBRN_DIR}
- NO_CMAKE_FIND_ROOT_PATH
- )
-endif()
-
-find_library(
- REANIMATED_LIB
- reanimated
- PATHS ${LIBREANIMATED_DIR}
- NO_CMAKE_FIND_ROOT_PATH
-)
-
-find_library(
- LOG_LIB
- log
-)
-
-# linking
-message(WARNING "VisionCamera linking: FOR_HERMES=${FOR_HERMES}")
-target_link_libraries(
- ${PACKAGE_NAME}
- ${LOG_LIB}
- ${JSI_LIB}
- ${REANIMATED_LIB}
- ${REACT_NATIVE_JNI_LIB}
- ${FBJNI_LIB}
- ${FOLLY_LIB}
- android
-)
diff --git a/android/build.gradle b/android/build.gradle
deleted file mode 100644
index 6162d4ee..00000000
--- a/android/build.gradle
+++ /dev/null
@@ -1,638 +0,0 @@
-import groovy.json.JsonSlurper
-import org.apache.tools.ant.filters.ReplaceTokens
-import java.nio.file.Paths
-
-static def findNodeModules(baseDir) {
- def basePath = baseDir.toPath().normalize()
- // Node's module resolution algorithm searches up to the root directory,
- // after which the base path will be null
- while (basePath) {
- def nodeModulesPath = Paths.get(basePath.toString(), "node_modules")
- def reactNativePath = Paths.get(nodeModulesPath.toString(), "react-native")
- if (nodeModulesPath.toFile().exists() && reactNativePath.toFile().exists()) {
- return nodeModulesPath.toString()
- }
- basePath = basePath.getParent()
- }
- throw new GradleException("VisionCamera: Failed to find node_modules/ path!")
-}
-static def findNodeModulePath(baseDir, packageName) {
- def basePath = baseDir.toPath().normalize()
- // Node's module resolution algorithm searches up to the root directory,
- // after which the base path will be null
- while (basePath) {
- def candidatePath = Paths.get(basePath.toString(), "node_modules", packageName)
- if (candidatePath.toFile().exists()) {
- return candidatePath.toString()
- }
- basePath = basePath.getParent()
- }
- return null
-}
-
-def isNewArchitectureEnabled() {
- // To opt-in for the New Architecture, you can either:
- // - Set `newArchEnabled` to true inside the `gradle.properties` file
- // - Invoke gradle with `-newArchEnabled=true`
- // - Set an environment variable `ORG_GRADLE_PROJECT_newArchEnabled=true`
- return project.hasProperty("newArchEnabled") && project.newArchEnabled == "true"
-}
-
-def nodeModules = findNodeModules(projectDir)
-logger.warn("VisionCamera: node_modules/ found at: ${nodeModules}")
-def reactNative = new File("$nodeModules/react-native")
-def CMAKE_NODE_MODULES_DIR = project.getProjectDir().getParentFile().getParent()
-def reactProperties = new Properties()
-file("$nodeModules/react-native/ReactAndroid/gradle.properties").withInputStream { reactProperties.load(it) }
-def REACT_NATIVE_FULL_VERSION = reactProperties.getProperty("VERSION_NAME")
-def REACT_NATIVE_VERSION = reactProperties.getProperty("VERSION_NAME").split("\\.")[1].toInteger()
-
-def FOR_HERMES = System.getenv("FOR_HERMES") == "True"
-rootProject.getSubprojects().forEach({project ->
- if (project.plugins.hasPlugin("com.android.application")) {
- FOR_HERMES = REACT_NATIVE_VERSION >= 71 && project.hermesEnabled || project.ext.react.enableHermes
- }
-})
-def jsRuntimeDir = {
- if (FOR_HERMES) {
- if (REACT_NATIVE_VERSION >= 69) {
- return Paths.get(CMAKE_NODE_MODULES_DIR, "react-native", "sdks", "hermes")
- } else {
- return Paths.get(CMAKE_NODE_MODULES_DIR, "hermes-engine")
- }
- } else {
- return Paths.get(CMAKE_NODE_MODULES_DIR, "react-native", "ReactCommon", "jsi")
- }
-}.call()
-logger.warn("VisionCamera: Building with ${FOR_HERMES ? "Hermes" : "JSC"}...")
-
-buildscript {
- // Buildscript is evaluated before everything else so we can't use getExtOrDefault
- def kotlin_version = rootProject.ext.has('kotlinVersion') ? rootProject.ext.get('kotlinVersion') : project.properties['VisionCamera_kotlinVersion']
-
- repositories {
- google()
- mavenCentral()
- maven {
- url "https://plugins.gradle.org/m2/"
- }
- }
-
- dependencies {
- classpath 'com.android.tools.build:gradle:4.2.2'
- classpath 'de.undercouch:gradle-download-task:4.1.2'
- // noinspection DifferentKotlinGradleVersion
- classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
- classpath "org.jetbrains.kotlin:kotlin-android-extensions:$kotlin_version"
- }
-}
-
-apply plugin: 'com.android.library'
-apply plugin: 'de.undercouch.download'
-apply plugin: 'kotlin-android'
-apply plugin: 'kotlin-android-extensions'
-
-def getExtOrDefault(name) {
- return rootProject.ext.has(name) ? rootProject.ext.get(name) : project.properties['VisionCamera_' + name]
-}
-
-def getExtOrIntegerDefault(name) {
- return rootProject.ext.has(name) ? rootProject.ext.get(name) : (project.properties['VisionCamera_' + name]).toInteger()
-}
-
-def resolveBuildType() {
- def buildType = "debug"
- tasks.all({ task ->
- if (task.name == "buildCMakeRelease") {
- buildType = "release"
- }
- })
- return buildType
-}
-
-// plugin.js file only exists since REA v2.
-def hasReanimated2 = file("${nodeModules}/react-native-reanimated/plugin.js").exists()
-def disableFrameProcessors = rootProject.ext.has("disableFrameProcessors") ? rootProject.ext.get("disableFrameProcessors").asBoolean() : false
-def ENABLE_FRAME_PROCESSORS = hasReanimated2 && !disableFrameProcessors
-
-if (ENABLE_FRAME_PROCESSORS) {
- logger.warn("VisionCamera: Frame Processors are enabled! Building C++ part...")
-} else {
- if (disableFrameProcessors) {
- logger.warn("VisionCamera: Frame Processors are disabled because the user explicitly disabled it ('disableFrameProcessors=${disableFrameProcessors}'). C++ part will not be built.")
- } else if (!hasReanimated2) {
- logger.warn("VisionCamera: Frame Processors are disabled because REA v2 does not exist. C++ part will not be built.")
- }
-}
-
-android {
- compileSdkVersion getExtOrIntegerDefault('compileSdkVersion')
- buildToolsVersion getExtOrDefault('buildToolsVersion')
- ndkVersion getExtOrDefault('ndkVersion')
-
- if (REACT_NATIVE_VERSION >= 71) {
- buildFeatures {
- prefab true
- }
- }
-
- defaultConfig {
- minSdkVersion 21
- targetSdkVersion getExtOrIntegerDefault('targetSdkVersion')
-
- if (ENABLE_FRAME_PROCESSORS) {
- externalNativeBuild {
- cmake {
- cppFlags "-fexceptions", "-frtti", "-std=c++1y", "-DONANDROID"
- abiFilters 'x86', 'x86_64', 'armeabi-v7a', 'arm64-v8a'
- arguments '-DANDROID_STL=c++_shared',
- "-DREACT_NATIVE_VERSION=${REACT_NATIVE_VERSION}",
- "-DNODE_MODULES_DIR=${nodeModules}",
- "-DFOR_HERMES=${FOR_HERMES}",
- "-DJS_RUNTIME_DIR=${jsRuntimeDir}"
- }
- }
- }
- }
-
- dexOptions {
- javaMaxHeapSize "4g"
- }
-
- if (ENABLE_FRAME_PROCESSORS) {
- externalNativeBuild {
- cmake {
- path "CMakeLists.txt"
- }
- }
- }
-
- packagingOptions {
- // Exclude all Libraries that are already present in the user's app (through React Native or by him installing REA)
- excludes = ["**/libc++_shared.so",
- "**/libfbjni.so",
- "**/libjsi.so",
- "**/libreactnativejni.so",
- "**/libfolly_json.so",
- "**/libreanimated.so",
- "**/libjscexecutor.so",
- "**/libhermes.so",
- "**/libfolly_runtime.so",
- "**/libglog.so",
- ]
- // META-INF is duplicate by CameraX.
- exclude "META-INF/**"
- }
-
- buildTypes {
- release {
- minifyEnabled false
- }
- }
-
- lintOptions {
- disable 'GradleCompatible'
- }
- compileOptions {
- sourceCompatibility JavaVersion.VERSION_1_8
- targetCompatibility JavaVersion.VERSION_1_8
- }
-
- configurations {
- extractHeaders
- extractJNI
- }
-}
-
-repositories {
- mavenCentral()
- google()
-
- def found = false
- def defaultDir = null
- def androidSourcesName = 'React Native sources'
-
- if (rootProject.ext.has('reactNativeAndroidRoot')) {
- defaultDir = rootProject.ext.get('reactNativeAndroidRoot')
- } else {
- defaultDir = new File(
- projectDir,
- '/../../../node_modules/react-native/android'
- )
- }
-
- if (defaultDir.exists()) {
- maven {
- url defaultDir.toString()
- name androidSourcesName
- }
-
- logger.info(":${project.name}:reactNativeAndroidRoot ${defaultDir.canonicalPath}")
- found = true
- } else {
- def parentDir = rootProject.projectDir
-
- 1.upto(5, {
- if (found) return true
- parentDir = parentDir.parentFile
-
- def androidSourcesDir = new File(
- parentDir,
- 'node_modules/react-native'
- )
-
- def androidPrebuiltBinaryDir = new File(
- parentDir,
- 'node_modules/react-native/android'
- )
-
- if (androidPrebuiltBinaryDir.exists()) {
- maven {
- url androidPrebuiltBinaryDir.toString()
- name androidSourcesName
- }
-
- logger.info(":${project.name}:reactNativeAndroidRoot ${androidPrebuiltBinaryDir.canonicalPath}")
- found = true
- } else if (androidSourcesDir.exists()) {
- maven {
- url androidSourcesDir.toString()
- name androidSourcesName
- }
-
- logger.info(":${project.name}:reactNativeAndroidRoot ${androidSourcesDir.canonicalPath}")
- found = true
- }
- })
- }
-
- if (!found) {
- throw new GradleException(
- "${project.name}: unable to locate React Native android sources. " +
- "Ensure you have you installed React Native as a dependency in your project and try again."
- )
- }
-}
-
-def kotlin_version = getExtOrDefault('kotlinVersion')
-
-dependencies {
- if (REACT_NATIVE_VERSION >= 71) {
- implementation "com.facebook.react:react-android:"
- implementation "com.facebook.react:hermes-android:"
- } else {
- // noinspection GradleDynamicVersion
- implementation 'com.facebook.react:react-native:+'
- }
-
- if (ENABLE_FRAME_PROCESSORS) {
- implementation project(':react-native-reanimated')
-
- if (REACT_NATIVE_VERSION < 71) {
- //noinspection GradleDynamicVersion
- extractHeaders("com.facebook.fbjni:fbjni:0.4.0:headers")
- //noinspection GradleDynamicVersion
- extractJNI("com.facebook.fbjni:fbjni:0.4.0")
-
- def rnAarMatcher = "**/react-native/**/*${resolveBuildType()}.aar"
- if (REACT_NATIVE_VERSION < 69) {
- rnAarMatcher = "**/**/*.aar"
- }
-
- def rnAAR = fileTree("$reactNative/android").matching({ it.include rnAarMatcher }).singleFile
- def jscAAR = fileTree("${nodeModules}/jsc-android/dist/org/webkit/android-jsc").matching({ it.include "**/**/*.aar" }).singleFile
- extractJNI(files(rnAAR, jscAAR))
- }
-
- def jsEngine = FOR_HERMES ? "hermes" : "jsc"
- def reaAAR = "${nodeModules}/react-native-reanimated/android/react-native-reanimated-${REACT_NATIVE_VERSION}-${jsEngine}.aar"
- extractJNI(files(reaAAR))
- }
-
- implementation "org.jetbrains.kotlin:kotlin-stdlib:$kotlin_version"
- implementation "org.jetbrains.kotlinx:kotlinx-coroutines-guava:1.5.2"
- implementation "org.jetbrains.kotlinx:kotlinx-coroutines-android:1.5.2"
-
- implementation "androidx.camera:camera-core:1.1.0"
- implementation "androidx.camera:camera-camera2:1.1.0"
- implementation "androidx.camera:camera-lifecycle:1.1.0"
- implementation "androidx.camera:camera-video:1.1.0"
-
- implementation "androidx.camera:camera-view:1.1.0"
- implementation "androidx.camera:camera-extensions:1.1.0"
-
- implementation "androidx.exifinterface:exifinterface:1.3.3"
-}
-
-
-if (ENABLE_FRAME_PROCESSORS) {
- // third-party-ndk deps headers
- // mostly a copy of https://github.com/software-mansion/react-native-reanimated/blob/master/android/build.gradle#L115
- def customDownloadsDir = System.getenv("REACT_NATIVE_DOWNLOADS_DIR")
- def downloadsDir = customDownloadsDir ? new File(customDownloadsDir) : new File("$buildDir/downloads")
- def thirdPartyNdkDir = new File("$buildDir/third-party-ndk")
- def thirdPartyVersionsFile = new File("${nodeModules}/react-native/ReactAndroid/gradle.properties")
- def thirdPartyVersions = new Properties()
- thirdPartyVersions.load(new FileInputStream(thirdPartyVersionsFile))
-
- def BOOST_VERSION = thirdPartyVersions["BOOST_VERSION"]
- def boost_file = new File(downloadsDir, "boost_${BOOST_VERSION}.tar.gz")
- def DOUBLE_CONVERSION_VERSION = thirdPartyVersions["DOUBLE_CONVERSION_VERSION"]
- def double_conversion_file = new File(downloadsDir, "double-conversion-${DOUBLE_CONVERSION_VERSION}.tar.gz")
- def FOLLY_VERSION = thirdPartyVersions["FOLLY_VERSION"]
- def folly_file = new File(downloadsDir, "folly-${FOLLY_VERSION}.tar.gz")
- def GLOG_VERSION = thirdPartyVersions["GLOG_VERSION"]
- def glog_file = new File(downloadsDir, "glog-${GLOG_VERSION}.tar.gz")
-
- task createNativeDepsDirectories {
- doLast {
- downloadsDir.mkdirs()
- thirdPartyNdkDir.mkdirs()
- }
- }
-
- task downloadBoost(dependsOn: createNativeDepsDirectories, type: Download) {
- def transformedVersion = BOOST_VERSION.replace("_", ".")
- def srcUrl = "https://boostorg.jfrog.io/artifactory/main/release/${transformedVersion}/source/boost_${BOOST_VERSION}.tar.gz"
- if (REACT_NATIVE_VERSION < 69) {
- srcUrl = "https://github.com/react-native-community/boost-for-react-native/releases/download/v${transformedVersion}-0/boost_${BOOST_VERSION}.tar.gz"
- }
- src(srcUrl)
- onlyIfNewer(true)
- overwrite(false)
- dest(boost_file)
- }
-
- task prepareBoost(dependsOn: downloadBoost, type: Copy) {
- from(tarTree(resources.gzip(downloadBoost.dest)))
- from("src/main/jni/third-party/boost/Android.mk")
- include("Android.mk", "boost_${BOOST_VERSION}/boost/**/*.hpp", "boost/boost/**/*.hpp")
- includeEmptyDirs = false
- into("$thirdPartyNdkDir") // /boost_X_XX_X
- doLast {
- file("$thirdPartyNdkDir/boost_${BOOST_VERSION}").renameTo("$thirdPartyNdkDir/boost")
- }
- }
-
- task downloadDoubleConversion(dependsOn: createNativeDepsDirectories, type: Download) {
- src("https://github.com/google/double-conversion/archive/v${DOUBLE_CONVERSION_VERSION}.tar.gz")
- onlyIfNewer(true)
- overwrite(false)
- dest(double_conversion_file)
- }
-
- task prepareDoubleConversion(dependsOn: downloadDoubleConversion, type: Copy) {
- from(tarTree(downloadDoubleConversion.dest))
- from("src/main/jni/third-party/double-conversion/Android.mk")
- include("double-conversion-${DOUBLE_CONVERSION_VERSION}/src/**/*", "Android.mk")
- filesMatching("*/src/**/*", { fname -> fname.path = "double-conversion/${fname.name}" })
- includeEmptyDirs = false
- into("$thirdPartyNdkDir/double-conversion")
- }
-
- task downloadFolly(dependsOn: createNativeDepsDirectories, type: Download) {
- src("https://github.com/facebook/folly/archive/v${FOLLY_VERSION}.tar.gz")
- onlyIfNewer(true)
- overwrite(false)
- dest(folly_file)
- }
-
- task prepareFolly(dependsOn: downloadFolly, type: Copy) {
- from(tarTree(downloadFolly.dest))
- from("src/main/jni/third-party/folly/Android.mk")
- include("folly-${FOLLY_VERSION}/folly/**/*", "Android.mk")
- eachFile { fname -> fname.path = (fname.path - "folly-${FOLLY_VERSION}/") }
- includeEmptyDirs = false
- into("$thirdPartyNdkDir/folly")
- }
-
- task downloadGlog(dependsOn: createNativeDepsDirectories, type: Download) {
- src("https://github.com/google/glog/archive/v${GLOG_VERSION}.tar.gz")
- onlyIfNewer(true)
- overwrite(false)
- dest(glog_file)
- }
-
- task prepareGlog(dependsOn: downloadGlog, type: Copy) {
- from(tarTree(downloadGlog.dest))
- from("src/main/jni/third-party/glog/")
- include("glog-${GLOG_VERSION}/src/**/*", "Android.mk", "config.h")
- includeEmptyDirs = false
- filesMatching("**/*.h.in") {
- filter(ReplaceTokens, tokens: [
- ac_cv_have_unistd_h : "1",
- ac_cv_have_stdint_h : "1",
- ac_cv_have_systypes_h : "1",
- ac_cv_have_inttypes_h : "1",
- ac_cv_have_libgflags : "0",
- ac_google_start_namespace : "namespace google {",
- ac_cv_have_uint16_t : "1",
- ac_cv_have_u_int16_t : "1",
- ac_cv_have___uint16 : "0",
- ac_google_end_namespace : "}",
- ac_cv_have___builtin_expect : "1",
- ac_google_namespace : "google",
- ac_cv___attribute___noinline : "__attribute__ ((noinline))",
- ac_cv___attribute___noreturn : "__attribute__ ((noreturn))",
- ac_cv___attribute___printf_4_5: "__attribute__((__format__ (__printf__, 4, 5)))"
- ])
- it.path = (it.name - ".in")
- }
- into("$thirdPartyNdkDir/glog")
-
- doLast {
- copy {
- from(fileTree(dir: "$thirdPartyNdkDir/glog", includes: ["stl_logging.h", "logging.h", "raw_logging.h", "vlog_is_on.h", "**/src/glog/log_severity.h"]).files)
- includeEmptyDirs = false
- into("$thirdPartyNdkDir/glog/exported/glog")
- }
- }
- }
-
- task prepareThirdPartyNdkHeaders {
- if (!boost_file.exists()) {
- dependsOn(prepareBoost)
- }
- if (!double_conversion_file.exists()) {
- dependsOn(prepareDoubleConversion)
- }
- if (!folly_file.exists()) {
- dependsOn(prepareFolly)
- }
- if (!glog_file.exists()) {
- dependsOn(prepareGlog)
- }
- }
-
- prepareThirdPartyNdkHeaders.mustRunAfter createNativeDepsDirectories
-
- /*
- COPY-PASTE from react-native-reanimated.
- Vision Camera includes "hermes/hermes.h" header file in `NativeProxy.cpp`.
- Previously, we used header files from `hermes-engine` package in `node_modules`.
- Starting from React Native 0.69, Hermes is no longer distributed as package on NPM.
- On the new architecture, Hermes is downloaded from GitHub and then compiled from sources.
- However, on the old architecture, we need to download Hermes header files on our own
- as well as unzip Hermes AAR in order to obtain `libhermes.so` shared library.
- For more details, see https://reactnative.dev/architecture/bundled-hermes
- or https://github.com/reactwg/react-native-new-architecture/discussions/4
- */
- if (REACT_NATIVE_VERSION >= 69 && !isNewArchitectureEnabled()) {
- // copied from `react-native/ReactAndroid/hermes-engine/build.gradle`
-
- def customDownloadDir = System.getenv("REACT_NATIVE_DOWNLOADS_DIR")
- def downloadDir = customDownloadDir ? new File(customDownloadDir) : new File(reactNative, "sdks/download")
-
- // By default we are going to download and unzip hermes inside the /sdks/hermes folder
- // but you can provide an override for where the hermes source code is located.
- def hermesDir = System.getenv("REACT_NATIVE_OVERRIDE_HERMES_DIR") ?: new File(reactNative, "sdks/hermes")
-
- def hermesVersion = "main"
- def hermesVersionFile = new File(reactNative, "sdks/.hermesversion")
- if (hermesVersionFile.exists()) {
- hermesVersion = hermesVersionFile.text
- }
-
- task downloadHermes(type: Download) {
- src("https://github.com/facebook/hermes/tarball/${hermesVersion}")
- onlyIfNewer(true)
- overwrite(false)
- dest(new File(downloadDir, "hermes.tar.gz"))
- }
-
- task unzipHermes(dependsOn: downloadHermes, type: Copy) {
- from(tarTree(downloadHermes.dest)) {
- eachFile { file ->
- // We flatten the unzip as the tarball contains a `facebook-hermes-`
- // folder at the top level.
- if (file.relativePath.segments.size() > 1) {
- file.relativePath = new org.gradle.api.file.RelativePath(!file.isDirectory(), file.relativePath.segments.drop(1))
- }
- }
- }
- into(hermesDir)
- }
- }
-
- task prepareHermes() {
- if (REACT_NATIVE_VERSION >= 69) {
- if (!isNewArchitectureEnabled()) {
- dependsOn(unzipHermes)
- }
-
- doLast {
- def hermesAAR = file("$reactNative/android/com/facebook/react/hermes-engine/$REACT_NATIVE_FULL_VERSION/hermes-engine-$REACT_NATIVE_FULL_VERSION-${resolveBuildType()}.aar") // e.g. hermes-engine-0.70.0-rc.1-debug.aar
- if (!hermesAAR.exists()) {
- throw new GradleScriptException("Could not find hermes-engine AAR", null)
- }
-
- def soFiles = zipTree(hermesAAR).matching({ it.include "**/*.so" })
-
- copy {
- from soFiles
- from "$reactNative/ReactAndroid/src/main/jni/first-party/hermes/Android.mk"
- into "$thirdPartyNdkDir/hermes"
- }
- }
- } else {
- doLast {
- def hermesPackagePath = findNodeModulePath(projectDir, "hermes-engine")
- if (!hermesPackagePath) {
- throw new GradleScriptException("Could not find the hermes-engine npm package", null)
- }
-
- def hermesAAR = file("$hermesPackagePath/android/hermes-debug.aar")
- if (!hermesAAR.exists()) {
- throw new GradleScriptException("The hermes-engine npm package is missing \"android/hermes-debug.aar\"", null)
- }
-
- def soFiles = zipTree(hermesAAR).matching({ it.include "**/*.so" })
-
- copy {
- from soFiles
- from "$reactNative/ReactAndroid/src/main/jni/first-party/hermes/Android.mk"
- into "$thirdPartyNdkDir/hermes"
- }
- }
- }
- }
-
- prepareHermes.mustRunAfter prepareThirdPartyNdkHeaders
-
- task prepareJSC {
- doLast {
- def jscPackagePath = file("${nodeModules}/jsc-android")
- if (!jscPackagePath.exists()) {
- throw new GradleScriptException("Could not find the jsc-android npm package", null)
- }
-
- def jscDist = file("$jscPackagePath/dist")
- if (!jscDist.exists()) {
- throw new GradleScriptException("The jsc-android npm package is missing its \"dist\" directory", null)
- }
-
- def jscAAR = fileTree(jscDist).matching({ it.include "**/android-jsc/**/*.aar" }).singleFile
- def soFiles = zipTree(jscAAR).matching({ it.include "**/*.so" })
-
- def headerFiles = fileTree(jscDist).matching({ it.include "**/include/*.h" })
-
- copy {
- from(soFiles)
- from(headerFiles)
- from("$reactNative/ReactAndroid/src/main/jni/third-party/jsc/Android.mk")
-
- filesMatching("**/*.h", { it.path = "JavaScriptCore/${it.name}" })
-
- includeEmptyDirs(false)
- into("$thirdPartyNdkDir/jsc")
- }
- }
- }
-
- prepareJSC.mustRunAfter prepareHermes
-
- task extractAARHeaders {
- doLast {
- configurations.extractHeaders.files.each {
- def file = it.absoluteFile
- copy {
- from zipTree(file)
- into "$buildDir/$file.name"
- include "**/*.h"
- }
- }
- }
- }
-
- extractAARHeaders.mustRunAfter prepareJSC
-
- task extractJNIFiles {
- doLast {
- configurations.extractJNI.files.each {
- def file = it.absoluteFile
-
- copy {
- from zipTree(file)
- into "$buildDir/$file.name"
- include "jni/**/*"
- }
- }
- }
- }
-
- extractJNIFiles.mustRunAfter extractAARHeaders
-
- // pre-native build pipeline
-
- tasks.whenTaskAdded { task ->
- if (!task.name.contains('Clean') && (task.name.contains('externalNative') || task.name.contains('CMake'))) {
- task.dependsOn(extractJNIFiles)
- if (REACT_NATIVE_VERSION < 71) {
- task.dependsOn(extractAARHeaders)
- task.dependsOn(prepareThirdPartyNdkHeaders)
- task.dependsOn(prepareJSC)
- task.dependsOn(prepareHermes)
- }
- }
- }
-}
diff --git a/android/gradlew b/android/gradlew
deleted file mode 100755
index 2fe81a7d..00000000
--- a/android/gradlew
+++ /dev/null
@@ -1,183 +0,0 @@
-#!/usr/bin/env sh
-
-#
-# Copyright 2015 the original author or authors.
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# https://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-##############################################################################
-##
-## Gradle start up script for UN*X
-##
-##############################################################################
-
-# Attempt to set APP_HOME
-# Resolve links: $0 may be a link
-PRG="$0"
-# Need this for relative symlinks.
-while [ -h "$PRG" ] ; do
- ls=`ls -ld "$PRG"`
- link=`expr "$ls" : '.*-> \(.*\)$'`
- if expr "$link" : '/.*' > /dev/null; then
- PRG="$link"
- else
- PRG=`dirname "$PRG"`"/$link"
- fi
-done
-SAVED="`pwd`"
-cd "`dirname \"$PRG\"`/" >/dev/null
-APP_HOME="`pwd -P`"
-cd "$SAVED" >/dev/null
-
-APP_NAME="Gradle"
-APP_BASE_NAME=`basename "$0"`
-
-# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
-DEFAULT_JVM_OPTS='"-Xmx64m" "-Xms64m"'
-
-# Use the maximum available, or set MAX_FD != -1 to use that value.
-MAX_FD="maximum"
-
-warn () {
- echo "$*"
-}
-
-die () {
- echo
- echo "$*"
- echo
- exit 1
-}
-
-# OS specific support (must be 'true' or 'false').
-cygwin=false
-msys=false
-darwin=false
-nonstop=false
-case "`uname`" in
- CYGWIN* )
- cygwin=true
- ;;
- Darwin* )
- darwin=true
- ;;
- MINGW* )
- msys=true
- ;;
- NONSTOP* )
- nonstop=true
- ;;
-esac
-
-CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
-
-# Determine the Java command to use to start the JVM.
-if [ -n "$JAVA_HOME" ] ; then
- if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
- # IBM's JDK on AIX uses strange locations for the executables
- JAVACMD="$JAVA_HOME/jre/sh/java"
- else
- JAVACMD="$JAVA_HOME/bin/java"
- fi
- if [ ! -x "$JAVACMD" ] ; then
- die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
-
-Please set the JAVA_HOME variable in your environment to match the
-location of your Java installation."
- fi
-else
- JAVACMD="java"
- which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
-
-Please set the JAVA_HOME variable in your environment to match the
-location of your Java installation."
-fi
-
-# Increase the maximum file descriptors if we can.
-if [ "$cygwin" = "false" -a "$darwin" = "false" -a "$nonstop" = "false" ] ; then
- MAX_FD_LIMIT=`ulimit -H -n`
- if [ $? -eq 0 ] ; then
- if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
- MAX_FD="$MAX_FD_LIMIT"
- fi
- ulimit -n $MAX_FD
- if [ $? -ne 0 ] ; then
- warn "Could not set maximum file descriptor limit: $MAX_FD"
- fi
- else
- warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
- fi
-fi
-
-# For Darwin, add options to specify how the application appears in the dock
-if $darwin; then
- GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
-fi
-
-# For Cygwin or MSYS, switch paths to Windows format before running java
-if [ "$cygwin" = "true" -o "$msys" = "true" ] ; then
- APP_HOME=`cygpath --path --mixed "$APP_HOME"`
- CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
- JAVACMD=`cygpath --unix "$JAVACMD"`
-
- # We build the pattern for arguments to be converted via cygpath
- ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
- SEP=""
- for dir in $ROOTDIRSRAW ; do
- ROOTDIRS="$ROOTDIRS$SEP$dir"
- SEP="|"
- done
- OURCYGPATTERN="(^($ROOTDIRS))"
- # Add a user-defined pattern to the cygpath arguments
- if [ "$GRADLE_CYGPATTERN" != "" ] ; then
- OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
- fi
- # Now convert the arguments - kludge to limit ourselves to /bin/sh
- i=0
- for arg in "$@" ; do
- CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
- CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
-
- if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
- eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
- else
- eval `echo args$i`="\"$arg\""
- fi
- i=`expr $i + 1`
- done
- case $i in
- 0) set -- ;;
- 1) set -- "$args0" ;;
- 2) set -- "$args0" "$args1" ;;
- 3) set -- "$args0" "$args1" "$args2" ;;
- 4) set -- "$args0" "$args1" "$args2" "$args3" ;;
- 5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
- 6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
- 7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
- 8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
- 9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
- esac
-fi
-
-# Escape application args
-save () {
- for i do printf %s\\n "$i" | sed "s/'/'\\\\''/g;1s/^/'/;\$s/\$/' \\\\/" ; done
- echo " "
-}
-APP_ARGS=`save "$@"`
-
-# Collect all arguments for the java command, following the shell quoting and substitution rules
-eval set -- $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS "\"-Dorg.gradle.appname=$APP_BASE_NAME\"" -classpath "\"$CLASSPATH\"" org.gradle.wrapper.GradleWrapperMain "$APP_ARGS"
-
-exec "$JAVACMD" "$@"
diff --git a/android/settings.gradle b/android/settings.gradle
deleted file mode 100644
index b02ca4a9..00000000
--- a/android/settings.gradle
+++ /dev/null
@@ -1,6 +0,0 @@
-rootProject.name = 'VisionCamera'
-
-include ':react-native-reanimated'
-project(':react-native-reanimated').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-reanimated/android/')
-
-include ':VisionCamera'
diff --git a/android/src/main/cpp/CameraView.cpp b/android/src/main/cpp/CameraView.cpp
deleted file mode 100644
index 665bd312..00000000
--- a/android/src/main/cpp/CameraView.cpp
+++ /dev/null
@@ -1,58 +0,0 @@
-//
-// Created by Marc Rousavy on 14.06.21.
-//
-
-#include "CameraView.h"
-
-#include
-#include
-#include
-
-#include
-#include
-#include
-
-namespace vision {
-
-using namespace facebook;
-using namespace jni;
-
-using TSelf = local_ref;
-
-TSelf CameraView::initHybrid(alias_ref jThis) {
- return makeCxxInstance(jThis);
-}
-
-void CameraView::registerNatives() {
- registerHybrid({
- makeNativeMethod("initHybrid", CameraView::initHybrid),
- makeNativeMethod("frameProcessorCallback", CameraView::frameProcessorCallback),
- });
-}
-
-void CameraView::frameProcessorCallback(const alias_ref& frame) {
- if (frameProcessor_ == nullptr) {
- __android_log_write(ANDROID_LOG_WARN, TAG, "Called Frame Processor callback, but `frameProcessor` is null!");
- return;
- }
-
- try {
- frameProcessor_(frame);
- } catch (const jsi::JSError& error) {
- // TODO: jsi::JSErrors cannot be caught on Hermes. They crash the entire app.
- auto stack = std::regex_replace(error.getStack(), std::regex("\n"), "\n ");
- __android_log_print(ANDROID_LOG_ERROR, TAG, "Frame Processor threw an error! %s\nIn: %s", error.getMessage().c_str(), stack.c_str());
- } catch (const std::exception& exception) {
- __android_log_print(ANDROID_LOG_ERROR, TAG, "Frame Processor threw a C++ error! %s", exception.what());
- }
-}
-
-void CameraView::setFrameProcessor(const TFrameProcessor&& frameProcessor) {
- frameProcessor_ = frameProcessor;
-}
-
-void vision::CameraView::unsetFrameProcessor() {
- frameProcessor_ = nullptr;
-}
-
-} // namespace vision
diff --git a/android/src/main/cpp/CameraView.h b/android/src/main/cpp/CameraView.h
deleted file mode 100644
index a1ca070d..00000000
--- a/android/src/main/cpp/CameraView.h
+++ /dev/null
@@ -1,43 +0,0 @@
-//
-// Created by Marc Rousavy on 14.06.21.
-//
-
-#pragma once
-
-#include
-#include
-
-#include
-
-#include "java-bindings/JImageProxy.h"
-
-namespace vision {
-
-using namespace facebook;
-using TFrameProcessor = std::function)>;
-
-class CameraView : public jni::HybridClass {
- public:
- static auto constexpr kJavaDescriptor = "Lcom/mrousavy/camera/CameraView;";
- static auto constexpr TAG = "VisionCamera";
- static jni::local_ref initHybrid(jni::alias_ref jThis);
- static void registerNatives();
-
- // TODO: Use template<> to avoid heap allocation for std::function<>
- void setFrameProcessor(const TFrameProcessor&& frameProcessor);
- void unsetFrameProcessor();
-
- private:
- friend HybridBase;
- jni::global_ref javaPart_;
- TFrameProcessor frameProcessor_;
-
- void frameProcessorCallback(const jni::alias_ref& frame);
-
- explicit CameraView(jni::alias_ref jThis) :
- javaPart_(jni::make_global(jThis)),
- frameProcessor_(nullptr)
- {}
-};
-
-} // namespace vision
diff --git a/android/src/main/cpp/FrameHostObject.cpp b/android/src/main/cpp/FrameHostObject.cpp
deleted file mode 100644
index b7d02cdc..00000000
--- a/android/src/main/cpp/FrameHostObject.cpp
+++ /dev/null
@@ -1,100 +0,0 @@
-//
-// Created by Marc on 19/06/2021.
-//
-
-#include "FrameHostObject.h"
-#include
-#include
-#include
-#include
-#include
-
-namespace vision {
-
-using namespace facebook;
-
-FrameHostObject::FrameHostObject(jni::alias_ref image): frame(make_global(image)) { }
-
-FrameHostObject::~FrameHostObject() {
- // Hermes' Garbage Collector (Hades GC) calls destructors on a separate Thread
- // which might not be attached to JNI. Ensure that we use the JNI class loader when
- // deallocating the `frame` HybridClass, because otherwise JNI cannot call the Java
- // destroy() function.
- jni::ThreadScope::WithClassLoader([&] { frame.reset(); });
-}
-
-std::vector FrameHostObject::getPropertyNames(jsi::Runtime& rt) {
- std::vector result;
- result.push_back(jsi::PropNameID::forUtf8(rt, std::string("toString")));
- result.push_back(jsi::PropNameID::forUtf8(rt, std::string("isValid")));
- result.push_back(jsi::PropNameID::forUtf8(rt, std::string("width")));
- result.push_back(jsi::PropNameID::forUtf8(rt, std::string("height")));
- result.push_back(jsi::PropNameID::forUtf8(rt, std::string("bytesPerRow")));
- result.push_back(jsi::PropNameID::forUtf8(rt, std::string("planesCount")));
- result.push_back(jsi::PropNameID::forUtf8(rt, std::string("close")));
- return result;
-}
-
-jsi::Value FrameHostObject::get(jsi::Runtime& runtime, const jsi::PropNameID& propNameId) {
- auto name = propNameId.utf8(runtime);
-
- if (name == "toString") {
- auto toString = [this] (jsi::Runtime& runtime, const jsi::Value&, const jsi::Value*, size_t) -> jsi::Value {
- if (!this->frame) {
- return jsi::String::createFromUtf8(runtime, "[closed frame]");
- }
- auto width = this->frame->getWidth();
- auto height = this->frame->getHeight();
- auto str = std::to_string(width) + " x " + std::to_string(height) + " Frame";
- return jsi::String::createFromUtf8(runtime, str);
- };
- return jsi::Function::createFromHostFunction(runtime, jsi::PropNameID::forUtf8(runtime, "toString"), 0, toString);
- }
- if (name == "close") {
- auto close = [this] (jsi::Runtime& runtime, const jsi::Value&, const jsi::Value*, size_t) -> jsi::Value {
- if (!this->frame) {
- throw jsi::JSError(runtime, "Trying to close an already closed frame! Did you call frame.close() twice?");
- }
- this->close();
- return jsi::Value::undefined();
- };
- return jsi::Function::createFromHostFunction(runtime, jsi::PropNameID::forUtf8(runtime, "close"), 0, close);
- }
-
- if (name == "isValid") {
- return jsi::Value(this->frame && this->frame->getIsValid());
- }
- if (name == "width") {
- this->assertIsFrameStrong(runtime, name);
- return jsi::Value(this->frame->getWidth());
- }
- if (name == "height") {
- this->assertIsFrameStrong(runtime, name);
- return jsi::Value(this->frame->getHeight());
- }
- if (name == "bytesPerRow") {
- this->assertIsFrameStrong(runtime, name);
- return jsi::Value(this->frame->getBytesPerRow());
- }
- if (name == "planesCount") {
- this->assertIsFrameStrong(runtime, name);
- return jsi::Value(this->frame->getPlanesCount());
- }
-
- return jsi::Value::undefined();
-}
-
-void FrameHostObject::assertIsFrameStrong(jsi::Runtime& runtime, const std::string& accessedPropName) const {
- if (!this->frame) {
- auto message = "Cannot get `" + accessedPropName + "`, frame is already closed!";
- throw jsi::JSError(runtime, message.c_str());
- }
-}
-
-void FrameHostObject::close() {
- if (this->frame) {
- this->frame->close();
- }
-}
-
-} // namespace vision
diff --git a/android/src/main/cpp/FrameHostObject.h b/android/src/main/cpp/FrameHostObject.h
deleted file mode 100644
index 90fcf019..00000000
--- a/android/src/main/cpp/FrameHostObject.h
+++ /dev/null
@@ -1,39 +0,0 @@
-//
-// Created by Marc on 19/06/2021.
-//
-
-#pragma once
-
-#include
-#include
-#include
-#include
-#include
-
-#include "java-bindings/JImageProxy.h"
-
-namespace vision {
-
-using namespace facebook;
-
-class JSI_EXPORT FrameHostObject : public jsi::HostObject {
- public:
- explicit FrameHostObject(jni::alias_ref image);
- ~FrameHostObject();
-
- public:
- jsi::Value get(jsi::Runtime &, const jsi::PropNameID &name) override;
- std::vector getPropertyNames(jsi::Runtime &rt) override;
-
- void close();
-
- public:
- jni::global_ref frame;
-
- private:
- static auto constexpr TAG = "VisionCamera";
-
- void assertIsFrameStrong(jsi::Runtime& runtime, const std::string& accessedPropName) const; // NOLINT(runtime/references)
-};
-
-} // namespace vision
diff --git a/android/src/main/cpp/FrameProcessorRuntimeManager.cpp b/android/src/main/cpp/FrameProcessorRuntimeManager.cpp
deleted file mode 100644
index ebbf3ef2..00000000
--- a/android/src/main/cpp/FrameProcessorRuntimeManager.cpp
+++ /dev/null
@@ -1,278 +0,0 @@
-//
-// Created by Marc Rousavy on 11.06.21.
-//
-
-#include "FrameProcessorRuntimeManager.h"
-#include
-#include
-#include
-#include
-
-#include "RuntimeDecorator.h"
-#include "RuntimeManager.h"
-#include "reanimated-headers/AndroidScheduler.h"
-#include "reanimated-headers/AndroidErrorHandler.h"
-
-#include "MakeJSIRuntime.h"
-#include "CameraView.h"
-#include "FrameHostObject.h"
-#include "JSIJNIConversion.h"
-#include "VisionCameraScheduler.h"
-#include "java-bindings/JImageProxy.h"
-#include "java-bindings/JFrameProcessorPlugin.h"
-
-namespace vision {
-
-// type aliases
-using TSelf = local_ref::jhybriddata>;
-using TJSCallInvokerHolder = jni::alias_ref;
-using TAndroidScheduler = jni::alias_ref;
-
-// JNI binding
-void vision::FrameProcessorRuntimeManager::registerNatives() {
- registerHybrid({
- makeNativeMethod("initHybrid",
- FrameProcessorRuntimeManager::initHybrid),
- makeNativeMethod("installJSIBindings",
- FrameProcessorRuntimeManager::installJSIBindings),
- makeNativeMethod("initializeRuntime",
- FrameProcessorRuntimeManager::initializeRuntime),
- makeNativeMethod("registerPlugin",
- FrameProcessorRuntimeManager::registerPlugin),
- });
-}
-
-// JNI init
-TSelf vision::FrameProcessorRuntimeManager::initHybrid(
- alias_ref jThis,
- jlong jsRuntimePointer,
- TJSCallInvokerHolder jsCallInvokerHolder,
- TAndroidScheduler androidScheduler) {
- __android_log_write(ANDROID_LOG_INFO, TAG,
- "Initializing FrameProcessorRuntimeManager...");
-
- // cast from JNI hybrid objects to C++ instances
- auto runtime = reinterpret_cast(jsRuntimePointer);
- auto jsCallInvoker = jsCallInvokerHolder->cthis()->getCallInvoker();
- auto scheduler = std::shared_ptr(androidScheduler->cthis());
- scheduler->setJSCallInvoker(jsCallInvoker);
-
- return makeCxxInstance(jThis, runtime, jsCallInvoker, scheduler);
-}
-
-void vision::FrameProcessorRuntimeManager::initializeRuntime() {
- __android_log_write(ANDROID_LOG_INFO, TAG,
- "Initializing Vision JS-Runtime...");
-
- // create JSI runtime and decorate it
- auto runtime = makeJSIRuntime();
- reanimated::RuntimeDecorator::decorateRuntime(*runtime, "FRAME_PROCESSOR");
- runtime->global().setProperty(*runtime, "_FRAME_PROCESSOR",
- jsi::Value(true));
-
- // create REA runtime manager
- auto errorHandler = std::make_shared(scheduler_);
- _runtimeManager = std::make_unique(std::move(runtime),
- errorHandler,
- scheduler_);
-
- __android_log_write(ANDROID_LOG_INFO, TAG,
- "Initialized Vision JS-Runtime!");
-}
-
-global_ref FrameProcessorRuntimeManager::findCameraViewById(int viewId) {
- static const auto findCameraViewByIdMethod = javaPart_->getClass()->getMethod("findCameraViewById");
- auto weakCameraView = findCameraViewByIdMethod(javaPart_.get(), viewId);
- return make_global(weakCameraView);
-}
-
-void FrameProcessorRuntimeManager::logErrorToJS(const std::string& message) {
- if (!this->jsCallInvoker_) {
- return;
- }
-
- this->jsCallInvoker_->invokeAsync([this, message]() {
- if (this->runtime_ == nullptr) {
- return;
- }
-
- auto& runtime = *this->runtime_;
- auto consoleError = runtime
- .global()
- .getPropertyAsObject(runtime, "console")
- .getPropertyAsFunction(runtime, "error");
- consoleError.call(runtime, jsi::String::createFromUtf8(runtime, message));
- });
-}
-
-void FrameProcessorRuntimeManager::setFrameProcessor(jsi::Runtime& runtime,
- int viewTag,
- const jsi::Value& frameProcessor) {
- __android_log_write(ANDROID_LOG_INFO, TAG,
- "Setting new Frame Processor...");
-
- if (!_runtimeManager || !_runtimeManager->runtime) {
- throw jsi::JSError(runtime,
- "setFrameProcessor(..): VisionCamera's RuntimeManager is not yet initialized!");
- }
-
- // find camera view
- auto cameraView = findCameraViewById(viewTag);
- __android_log_write(ANDROID_LOG_INFO, TAG, "Found CameraView!");
-
- // convert jsi::Function to a ShareableValue (can be shared across runtimes)
- __android_log_write(ANDROID_LOG_INFO, TAG,
- "Adapting Shareable value from function (conversion to worklet)...");
- auto worklet = reanimated::ShareableValue::adapt(runtime,
- frameProcessor,
- _runtimeManager.get());
- __android_log_write(ANDROID_LOG_INFO, TAG, "Successfully created worklet!");
-
- scheduler_->scheduleOnUI([=]() {
- // cast worklet to a jsi::Function for the new runtime
- auto& rt = *_runtimeManager->runtime;
- auto function = std::make_shared(worklet->getValue(rt).asObject(rt).asFunction(rt));
-
- // assign lambda to frame processor
- cameraView->cthis()->setFrameProcessor([this, &rt, function](jni::alias_ref frame) {
- try {
- // create HostObject which holds the Frame (JImageProxy)
- auto hostObject = std::make_shared(frame);
- function->callWithThis(rt, *function, jsi::Object::createFromHostObject(rt, hostObject));
- } catch (jsi::JSError& jsError) {
- auto message = "Frame Processor threw an error: " + jsError.getMessage();
- __android_log_write(ANDROID_LOG_ERROR, TAG, message.c_str());
- this->logErrorToJS(message);
- }
- });
-
- __android_log_write(ANDROID_LOG_INFO, TAG, "Frame Processor set!");
- });
-}
-
-void FrameProcessorRuntimeManager::unsetFrameProcessor(int viewTag) {
- __android_log_write(ANDROID_LOG_INFO, TAG, "Removing Frame Processor...");
-
- // find camera view
- auto cameraView = findCameraViewById(viewTag);
-
- // call Java method to unset frame processor
- cameraView->cthis()->unsetFrameProcessor();
-
- __android_log_write(ANDROID_LOG_INFO, TAG, "Frame Processor removed!");
-}
-
-// actual JSI installer
-void FrameProcessorRuntimeManager::installJSIBindings() {
- __android_log_write(ANDROID_LOG_INFO, TAG, "Installing JSI bindings...");
-
- if (runtime_ == nullptr) {
- __android_log_write(ANDROID_LOG_ERROR, TAG,
- "JS-Runtime was null, Frame Processor JSI bindings could not be installed!");
- return;
- }
-
- auto& jsiRuntime = *runtime_;
-
- auto setFrameProcessor = [this](jsi::Runtime &runtime,
- const jsi::Value &thisValue,
- const jsi::Value *arguments,
- size_t count) -> jsi::Value {
- __android_log_write(ANDROID_LOG_INFO, TAG,
- "Setting new Frame Processor...");
-
- if (!arguments[0].isNumber()) {
- throw jsi::JSError(runtime,
- "Camera::setFrameProcessor: First argument ('viewTag') must be a number!");
- }
- if (!arguments[1].isObject()) {
- throw jsi::JSError(runtime,
- "Camera::setFrameProcessor: Second argument ('frameProcessor') must be a function!");
- }
-
- double viewTag = arguments[0].asNumber();
- const jsi::Value& frameProcessor = arguments[1];
- this->setFrameProcessor(runtime, static_cast(viewTag), frameProcessor);
-
- return jsi::Value::undefined();
- };
- jsiRuntime.global().setProperty(jsiRuntime,
- "setFrameProcessor",
- jsi::Function::createFromHostFunction(
- jsiRuntime,
- jsi::PropNameID::forAscii(jsiRuntime,
- "setFrameProcessor"),
- 2, // viewTag, frameProcessor
- setFrameProcessor));
-
-
- auto unsetFrameProcessor = [this](jsi::Runtime &runtime,
- const jsi::Value &thisValue,
- const jsi::Value *arguments,
- size_t count) -> jsi::Value {
- __android_log_write(ANDROID_LOG_INFO, TAG, "Removing Frame Processor...");
- if (!arguments[0].isNumber()) {
- throw jsi::JSError(runtime,
- "Camera::unsetFrameProcessor: First argument ('viewTag') must be a number!");
- }
-
- auto viewTag = arguments[0].asNumber();
- this->unsetFrameProcessor(static_cast(viewTag));
-
- return jsi::Value::undefined();
- };
- jsiRuntime.global().setProperty(jsiRuntime,
- "unsetFrameProcessor",
- jsi::Function::createFromHostFunction(
- jsiRuntime,
- jsi::PropNameID::forAscii(jsiRuntime,
- "unsetFrameProcessor"),
- 1, // viewTag
- unsetFrameProcessor));
-
- __android_log_write(ANDROID_LOG_INFO, TAG, "Finished installing JSI bindings!");
-}
-
-void FrameProcessorRuntimeManager::registerPlugin(alias_ref plugin) {
- // _runtimeManager should never be null here, but we can never be too sure.
- if (!_runtimeManager || !_runtimeManager->runtime) {
- throw std::runtime_error("Tried to register plugin before initializing JS runtime! Call `initializeRuntime()` first.");
- }
-
- auto& runtime = *_runtimeManager->runtime;
-
- // we need a strong reference on the plugin, make_global does that.
- auto pluginGlobal = make_global(plugin);
- // name is always prefixed with two underscores (__)
- auto name = "__" + pluginGlobal->getName();
-
- __android_log_print(ANDROID_LOG_INFO, TAG, "Installing Frame Processor Plugin \"%s\"...", name.c_str());
-
- auto callback = [pluginGlobal](jsi::Runtime& runtime,
- const jsi::Value& thisValue,
- const jsi::Value* arguments,
- size_t count) -> jsi::Value {
- // Unbox object and get typed HostObject
- auto boxedHostObject = arguments[0].asObject(runtime).asHostObject(runtime);
- auto frameHostObject = static_cast(boxedHostObject.get());
-
- // parse params - we are offset by `1` because the frame is the first parameter.
- auto params = JArrayClass::newArray(count - 1);
- for (size_t i = 1; i < count; i++) {
- params->setElement(i - 1, JSIJNIConversion::convertJSIValueToJNIObject(runtime, arguments[i]));
- }
-
- // call implemented virtual method
- auto result = pluginGlobal->callback(frameHostObject->frame, params);
-
- // convert result from JNI to JSI value
- return JSIJNIConversion::convertJNIObjectToJSIValue(runtime, result);
- };
-
- runtime.global().setProperty(runtime, name.c_str(), jsi::Function::createFromHostFunction(runtime,
- jsi::PropNameID::forAscii(runtime, name),
- 1, // frame
- callback));
-}
-
-} // namespace vision
diff --git a/android/src/main/cpp/FrameProcessorRuntimeManager.h b/android/src/main/cpp/FrameProcessorRuntimeManager.h
deleted file mode 100644
index 14d37018..00000000
--- a/android/src/main/cpp/FrameProcessorRuntimeManager.h
+++ /dev/null
@@ -1,64 +0,0 @@
-//
-// Created by Marc Rousavy on 11.06.21.
-//
-
-#pragma once
-
-#include
-#include
-#include
-#include
-#include
-
-#include "RuntimeManager.h"
-#include "reanimated-headers/AndroidScheduler.h"
-
-#include "CameraView.h"
-#include "VisionCameraScheduler.h"
-#include "java-bindings/JFrameProcessorPlugin.h"
-
-namespace vision {
-
-using namespace facebook;
-
-class FrameProcessorRuntimeManager : public jni::HybridClass {
- public:
- static auto constexpr kJavaDescriptor = "Lcom/mrousavy/camera/frameprocessor/FrameProcessorRuntimeManager;";
- static auto constexpr TAG = "VisionCamera";
- static jni::local_ref initHybrid(jni::alias_ref jThis,
- jlong jsContext,
- jni::alias_ref jsCallInvokerHolder,
- jni::alias_ref androidScheduler);
- static void registerNatives();
-
- explicit FrameProcessorRuntimeManager(jni::alias_ref jThis,
- jsi::Runtime* runtime,
- std::shared_ptr jsCallInvoker,
- std::shared_ptr scheduler) :
- javaPart_(jni::make_global(jThis)),
- runtime_(runtime),
- jsCallInvoker_(jsCallInvoker),
- scheduler_(scheduler)
- {}
-
- private:
- friend HybridBase;
- jni::global_ref javaPart_;
- jsi::Runtime* runtime_;
- std::shared_ptr jsCallInvoker_;
- std::shared_ptr _runtimeManager;
- std::shared_ptr scheduler_;
-
- jni::global_ref findCameraViewById(int viewId);
- void initializeRuntime();
- void installJSIBindings();
- void registerPlugin(alias_ref plugin);
- void logErrorToJS(const std::string& message);
-
- void setFrameProcessor(jsi::Runtime& runtime, // NOLINT(runtime/references)
- int viewTag,
- const jsi::Value& frameProcessor);
- void unsetFrameProcessor(int viewTag);
-};
-
-} // namespace vision
diff --git a/android/src/main/cpp/JSIJNIConversion.cpp b/android/src/main/cpp/JSIJNIConversion.cpp
deleted file mode 100644
index a3a0d708..00000000
--- a/android/src/main/cpp/JSIJNIConversion.cpp
+++ /dev/null
@@ -1,199 +0,0 @@
-//
-// Created by Marc Rousavy on 22.06.21.
-//
-
-#include "JSIJNIConversion.h"
-
-#include
-#include
-#include
-#include
-
-#include
-#include
-#include
-
-#include
-#include
-#include
-
-#include
-#include
-
-#include "FrameHostObject.h"
-#include "java-bindings/JImageProxy.h"
-#include "java-bindings/JArrayList.h"
-#include "java-bindings/JHashMap.h"
-
-namespace vision {
-
-using namespace facebook;
-
-jobject JSIJNIConversion::convertJSIValueToJNIObject(jsi::Runtime &runtime, const jsi::Value &value) {
- if (value.isBool()) {
- // jsi::Bool
-
- auto boolean = jni::JBoolean::valueOf(value.getBool());
- return boolean.release();
-
- } else if (value.isNumber()) {
- // jsi::Number
-
- auto number = jni::JDouble::valueOf(value.getNumber());
- return number.release();
-
- } else if (value.isNull() || value.isUndefined()) {
- // jsi::undefined
-
- return nullptr;
-
- } else if (value.isString()) {
- // jsi::String
-
- auto string = jni::make_jstring(value.getString(runtime).utf8(runtime));
- return string.release();
-
- } else if (value.isObject()) {
- // jsi::Object
-
- auto object = value.asObject(runtime);
-
- if (object.isArray(runtime)) {
- // jsi::Array
-
- auto dynamic = jsi::dynamicFromValue(runtime, value);
- auto nativeArray = react::ReadableNativeArray::newObjectCxxArgs(std::move(dynamic));
- return nativeArray.release();
-
- } else if (object.isHostObject(runtime)) {
- // jsi::HostObject
-
- auto boxedHostObject = object.getHostObject(runtime);
- auto hostObject = dynamic_cast(boxedHostObject.get());
- if (hostObject != nullptr) {
- // return jni local_ref to the JImageProxy
- return hostObject->frame.get();
- } else {
- // it's a different kind of HostObject. We don't support it.
- throw std::runtime_error("Received an unknown HostObject! Cannot convert to a JNI value.");
- }
-
- } else if (object.isFunction(runtime)) {
- // jsi::Function
-
- // TODO: Convert Function to Callback
- throw std::runtime_error("Cannot convert a JS Function to a JNI value (yet)!");
-
- } else {
- // jsi::Object
-
- auto dynamic = jsi::dynamicFromValue(runtime, value);
- auto map = react::ReadableNativeMap::createWithContents(std::move(dynamic));
- return map.release();
-
- }
- } else {
- // unknown jsi type!
-
- auto stringRepresentation = value.toString(runtime).utf8(runtime);
- auto message = "Received unknown JSI value! (" + stringRepresentation + ") Cannot convert to a JNI value.";
- throw std::runtime_error(message);
-
- }
-}
-
-jsi::Value JSIJNIConversion::convertJNIObjectToJSIValue(jsi::Runtime &runtime, const jni::local_ref& object) {
- if (object == nullptr) {
- // null
-
- return jsi::Value::undefined();
-
- } else if (object->isInstanceOf(jni::JBoolean::javaClassStatic())) {
- // Boolean
-
- static const auto getBooleanFunc = jni::findClassLocal("java/lang/Boolean")->getMethod("booleanValue");
- auto boolean = getBooleanFunc(object.get());
- return jsi::Value(boolean == true);
-
- } else if (object->isInstanceOf(jni::JDouble::javaClassStatic())) {
- // Double
-
- static const auto getDoubleFunc = jni::findClassLocal("java/lang/Double")->getMethod("doubleValue");
- auto d = getDoubleFunc(object.get());
- return jsi::Value(d);
-
- } else if (object->isInstanceOf(jni::JInteger::javaClassStatic())) {
- // Integer
-
- static const auto getIntegerFunc = jni::findClassLocal("java/lang/Integer")->getMethod("intValue");
- auto i = getIntegerFunc(object.get());
- return jsi::Value(i);
-
- } else if (object->isInstanceOf(jni::JString::javaClassStatic())) {
- // String
-
- return jsi::String::createFromUtf8(runtime, object->toString());
-
- } else if (object->isInstanceOf(JArrayList::javaClassStatic())) {
- // ArrayList
-
- auto arrayList = static_ref_cast>(object);
- auto size = arrayList->size();
-
- auto result = jsi::Array(runtime, size);
- size_t i = 0;
- for (const auto& item : *arrayList) {
- result.setValueAtIndex(runtime, i, convertJNIObjectToJSIValue(runtime, item));
- i++;
- }
- return result;
-
- } else if (object->isInstanceOf(react::ReadableArray::javaClassStatic())) {
- // ReadableArray
-
- static const auto toArrayListFunc = react::ReadableArray::javaClassLocal()->getMethod()>("toArrayList");
-
- // call recursively, this time with an ArrayList
- auto array = toArrayListFunc(object.get());
- return convertJNIObjectToJSIValue(runtime, array);
-
- } else if (object->isInstanceOf(JHashMap::javaClassStatic())) {
- // HashMap
-
- auto map = static_ref_cast>(object);
-
- auto result = jsi::Object(runtime);
- for (const auto& entry : *map) {
- auto key = entry.first->toString();
- auto value = entry.second;
- auto jsiValue = convertJNIObjectToJSIValue(runtime, value);
- result.setProperty(runtime, key.c_str(), jsiValue);
- }
- return result;
-
- } else if (object->isInstanceOf(react::ReadableMap::javaClassStatic())) {
- // ReadableMap
-
- static const auto toHashMapFunc = react::ReadableMap::javaClassLocal()->getMethod()>("toHashMap");
-
- // call recursively, this time with a HashMap
- auto hashMap = toHashMapFunc(object.get());
- return convertJNIObjectToJSIValue(runtime, hashMap);
-
- } else if (object->isInstanceOf(JImageProxy::javaClassStatic())) {
- // ImageProxy
-
- auto frame = static_ref_cast(object);
-
- // box into HostObject
- auto hostObject = std::make_shared(frame);
- return jsi::Object::createFromHostObject(runtime, hostObject);
- }
-
- auto type = object->getClass()->toString();
- auto message = "Received unknown JNI type \"" + type + "\"! Cannot convert to jsi::Value.";
- __android_log_write(ANDROID_LOG_ERROR, "VisionCamera", message.c_str());
- throw std::runtime_error(message);
-}
-
-} // namespace vision
diff --git a/android/src/main/cpp/JSIJNIConversion.h b/android/src/main/cpp/JSIJNIConversion.h
deleted file mode 100644
index 4af31266..00000000
--- a/android/src/main/cpp/JSIJNIConversion.h
+++ /dev/null
@@ -1,23 +0,0 @@
-//
-// Created by Marc Rousavy on 22.06.21.
-//
-
-#pragma once
-
-#include
-#include
-#include
-
-namespace vision {
-
-namespace JSIJNIConversion {
-
-using namespace facebook;
-
-jobject convertJSIValueToJNIObject(jsi::Runtime& runtime, const jsi::Value& value); // NOLINT(runtime/references)
-
-jsi::Value convertJNIObjectToJSIValue(jsi::Runtime& runtime, const jni::local_ref& object); // NOLINT(runtime/references)
-
-} // namespace JSIJNIConversion
-
-} // namespace vision
diff --git a/android/src/main/cpp/MakeJSIRuntime.h b/android/src/main/cpp/MakeJSIRuntime.h
deleted file mode 100644
index 39045bde..00000000
--- a/android/src/main/cpp/MakeJSIRuntime.h
+++ /dev/null
@@ -1,30 +0,0 @@
-//
-// Created by Marc Rousavy on 06.07.21.
-//
-
-#pragma once
-
-#include
-#include
-
-#if FOR_HERMES
- // Hermes
- #include
-#else
- // JSC
- #include
-#endif
-
-namespace vision {
-
-using namespace facebook;
-
-static std::unique_ptr makeJSIRuntime() {
- #if FOR_HERMES
- return facebook::hermes::makeHermesRuntime();
- #else
- return facebook::jsc::makeJSCRuntime();
- #endif
-}
-
-} // namespace vision
diff --git a/android/src/main/cpp/VisionCamera.cpp b/android/src/main/cpp/VisionCamera.cpp
deleted file mode 100644
index 6b91b8ef..00000000
--- a/android/src/main/cpp/VisionCamera.cpp
+++ /dev/null
@@ -1,13 +0,0 @@
-#include
-#include
-#include "FrameProcessorRuntimeManager.h"
-#include "CameraView.h"
-#include "VisionCameraScheduler.h"
-
-JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM *vm, void *) {
- return facebook::jni::initialize(vm, [] {
- vision::FrameProcessorRuntimeManager::registerNatives();
- vision::CameraView::registerNatives();
- vision::VisionCameraScheduler::registerNatives();
- });
-}
diff --git a/android/src/main/cpp/VisionCameraScheduler.cpp b/android/src/main/cpp/VisionCameraScheduler.cpp
deleted file mode 100644
index 3f065a58..00000000
--- a/android/src/main/cpp/VisionCameraScheduler.cpp
+++ /dev/null
@@ -1,42 +0,0 @@
-//
-// Created by Marc Rousavy on 25.07.21.
-//
-
-#include "VisionCameraScheduler.h"
-#include
-
-namespace vision {
-
-using namespace facebook;
-using TSelf = jni::local_ref;
-
-TSelf VisionCameraScheduler::initHybrid(jni::alias_ref jThis) {
- return makeCxxInstance(jThis);
-}
-
-void VisionCameraScheduler::scheduleOnUI(std::function job) {
- // 1. add job to queue
- uiJobs.push(job);
- scheduleTrigger();
-}
-
-void VisionCameraScheduler::scheduleTrigger() {
- // 2. schedule `triggerUI` to be called on the java thread
- static auto method = javaPart_->getClass()->getMethod("scheduleTrigger");
- method(javaPart_.get());
-}
-
-void VisionCameraScheduler::triggerUI() {
- // 3. call job we enqueued in step 1.
- auto job = uiJobs.pop();
- job();
-}
-
-void VisionCameraScheduler::registerNatives() {
- registerHybrid({
- makeNativeMethod("initHybrid", VisionCameraScheduler::initHybrid),
- makeNativeMethod("triggerUI", VisionCameraScheduler::triggerUI),
- });
-}
-
-} // namespace vision
\ No newline at end of file
diff --git a/android/src/main/cpp/VisionCameraScheduler.h b/android/src/main/cpp/VisionCameraScheduler.h
deleted file mode 100644
index a05c3c9d..00000000
--- a/android/src/main/cpp/VisionCameraScheduler.h
+++ /dev/null
@@ -1,38 +0,0 @@
-//
-// Created by Marc Rousavy on 25.07.21.
-//
-
-#pragma once
-
-#include "Scheduler.h"
-#include
-#include
-#include
-
-namespace vision {
-
-using namespace facebook;
-
-class VisionCameraScheduler : public reanimated::Scheduler, public jni::HybridClass {
-public:
- static auto constexpr kJavaDescriptor = "Lcom/mrousavy/camera/frameprocessor/VisionCameraScheduler;";
- static jni::local_ref initHybrid(jni::alias_ref jThis);
- static void registerNatives();
-
- // schedules the given job to be run on the VisionCamera FP Thread at some future point in time
- void scheduleOnUI(std::function job) override;
-
-private:
- friend HybridBase;
- jni::global_ref javaPart_;
-
- explicit VisionCameraScheduler(jni::alias_ref jThis):
- javaPart_(jni::make_global(jThis)) {}
-
- // Schedules a call to `triggerUI` on the VisionCamera FP Thread
- void scheduleTrigger();
- // Calls the latest job in the job queue
- void triggerUI() override;
-};
-
-} // namespace vision
\ No newline at end of file
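
Note: the VisionCameraScheduler removed above implements a plain queue-and-trigger dispatch: scheduleOnUI() enqueues a job, scheduleTrigger() asks the Java side to call back into triggerUI(), and triggerUI() pops and runs the oldest job. A minimal, self-contained Kotlin sketch of the same pattern (the class and function names here are illustrative, not part of this codebase):

```kotlin
import java.util.concurrent.ConcurrentLinkedQueue
import java.util.concurrent.Executors

// Queue-and-trigger dispatcher mirroring the flow of the deleted C++ scheduler.
class SimpleScheduler {
  private val jobs = ConcurrentLinkedQueue<() -> Unit>()
  private val thread = Executors.newSingleThreadExecutor()

  // 1. add the job to the queue, 2. schedule a trigger on the worker thread
  fun schedule(job: () -> Unit) {
    jobs.add(job)
    thread.execute { trigger() }
  }

  // 3. run the job that was enqueued in step 1
  private fun trigger() {
    jobs.poll()?.invoke()
  }

  // let queued jobs finish, then stop the worker thread
  fun shutdown() = thread.shutdown()
}

fun main() {
  val scheduler = SimpleScheduler()
  scheduler.schedule { println("ran on the scheduler thread") }
  scheduler.shutdown()
}
```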
diff --git a/android/src/main/cpp/java-bindings/JArrayList.h b/android/src/main/cpp/java-bindings/JArrayList.h
deleted file mode 100644
index 1b961348..00000000
--- a/android/src/main/cpp/java-bindings/JArrayList.h
+++ /dev/null
@@ -1,21 +0,0 @@
-//
-// Created by Marc Rousavy on 24.06.21.
-//
-
-#pragma once
-
-#include
-#include
-
-namespace vision {
-
-using namespace facebook;
-using namespace jni;
-
-// TODO: Remove when fbjni 0.2.3 releases.
-template
-struct JArrayList : JavaClass, JList> {
- constexpr static auto kJavaDescriptor = "Ljava/util/ArrayList;";
-};
-
-} // namespace vision
diff --git a/android/src/main/cpp/java-bindings/JFrameProcessorPlugin.cpp b/android/src/main/cpp/java-bindings/JFrameProcessorPlugin.cpp
deleted file mode 100644
index 54a162b9..00000000
--- a/android/src/main/cpp/java-bindings/JFrameProcessorPlugin.cpp
+++ /dev/null
@@ -1,30 +0,0 @@
-//
-// Created by Marc Rousavy on 29.09.21.
-//
-
-#include "JFrameProcessorPlugin.h"
-
-#include
-#include
-
-namespace vision {
-
-using namespace facebook;
-using namespace jni;
-
-using TCallback = jobject(alias_ref, alias_ref>);
-
-local_ref JFrameProcessorPlugin::callback(alias_ref image,
- alias_ref> params) const {
- auto callbackMethod = getClass()->getMethod("callback");
-
- auto result = callbackMethod(self(), image, params);
- return make_local(result);
-}
-
-std::string JFrameProcessorPlugin::getName() const {
- auto getNameMethod = getClass()->getMethod("getName");
- return getNameMethod(self())->toStdString();
-}
-
-} // namespace vision
diff --git a/android/src/main/cpp/java-bindings/JHashMap.cpp b/android/src/main/cpp/java-bindings/JHashMap.cpp
deleted file mode 100644
index affd4647..00000000
--- a/android/src/main/cpp/java-bindings/JHashMap.cpp
+++ /dev/null
@@ -1,20 +0,0 @@
-//
-// Created by Marc Rousavy on 25.06.21.
-//
-
-#include "JHashMap.h"
-
-#include
-#include
-
-namespace vision {
-
-using namespace facebook;
-using namespace jni;
-
-template
-local_ref> JHashMap::create() {
- return JHashMap::newInstance();
-}
-
-} // namespace vision
diff --git a/android/src/main/cpp/java-bindings/JHashMap.h b/android/src/main/cpp/java-bindings/JHashMap.h
deleted file mode 100644
index b11d3c83..00000000
--- a/android/src/main/cpp/java-bindings/JHashMap.h
+++ /dev/null
@@ -1,23 +0,0 @@
-//
-// Created by Marc Rousavy on 25.06.21.
-//
-
-#pragma once
-
-#include
-#include
-
-namespace vision {
-
-using namespace facebook;
-using namespace jni;
-
-// TODO: Remove when fbjni 0.2.3 releases.
-template
-struct JHashMap : JavaClass, JMap> {
- constexpr static auto kJavaDescriptor = "Ljava/util/HashMap;";
-
- static local_ref> create();
-};
-
-} // namespace vision
diff --git a/android/src/main/cpp/java-bindings/JImageProxy.cpp b/android/src/main/cpp/java-bindings/JImageProxy.cpp
deleted file mode 100644
index 0aec4d55..00000000
--- a/android/src/main/cpp/java-bindings/JImageProxy.cpp
+++ /dev/null
@@ -1,53 +0,0 @@
-//
-// Created by Marc Rousavy on 22.06.21.
-//
-
-#include "JImageProxy.h"
-
-#include
-#include
-
-namespace vision {
-
-using namespace facebook;
-using namespace jni;
-
-int JImageProxy::getWidth() const {
- static const auto getWidthMethod = getClass()->getMethod("getWidth");
- return getWidthMethod(self());
-}
-
-int JImageProxy::getHeight() const {
- static const auto getWidthMethod = getClass()->getMethod("getHeight");
- return getWidthMethod(self());
-}
-
-alias_ref getUtilsClass() {
- static const auto ImageProxyUtilsClass = findClassStatic("com/mrousavy/camera/frameprocessor/ImageProxyUtils");
- return ImageProxyUtilsClass;
-}
-
-bool JImageProxy::getIsValid() const {
- auto utilsClass = getUtilsClass();
- static const auto isImageProxyValidMethod = utilsClass->getStaticMethod("isImageProxyValid");
- return isImageProxyValidMethod(utilsClass, self());
-}
-
-int JImageProxy::getPlanesCount() const {
- auto utilsClass = getUtilsClass();
- static const auto getPlanesCountMethod = utilsClass->getStaticMethod("getPlanesCount");
- return getPlanesCountMethod(utilsClass, self());
-}
-
-int JImageProxy::getBytesPerRow() const {
- auto utilsClass = getUtilsClass();
- static const auto getBytesPerRowMethod = utilsClass->getStaticMethod("getBytesPerRow");
- return getBytesPerRowMethod(utilsClass, self());
-}
-
-void JImageProxy::close() {
- static const auto closeMethod = getClass()->getMethod("close");
- closeMethod(self());
-}
-
-} // namespace vision
diff --git a/android/src/main/cpp/java-bindings/JImageProxy.h b/android/src/main/cpp/java-bindings/JImageProxy.h
deleted file mode 100644
index c293cd8b..00000000
--- a/android/src/main/cpp/java-bindings/JImageProxy.h
+++ /dev/null
@@ -1,27 +0,0 @@
-//
-// Created by Marc on 19/06/2021.
-//
-
-#pragma once
-
-#include
-#include
-
-namespace vision {
-
-using namespace facebook;
-using namespace jni;
-
-struct JImageProxy : public JavaClass {
- static constexpr auto kJavaDescriptor = "Landroidx/camera/core/ImageProxy;";
-
- public:
- int getWidth() const;
- int getHeight() const;
- bool getIsValid() const;
- int getPlanesCount() const;
- int getBytesPerRow() const;
- void close();
-};
-
-} // namespace vision
diff --git a/android/src/main/cpp/reanimated-headers/AndroidErrorHandler.h b/android/src/main/cpp/reanimated-headers/AndroidErrorHandler.h
deleted file mode 100644
index fd45154f..00000000
--- a/android/src/main/cpp/reanimated-headers/AndroidErrorHandler.h
+++ /dev/null
@@ -1,30 +0,0 @@
-// copied from https://github.com/software-mansion/react-native-reanimated/blob/master/android/src/main/cpp/headers/AndroidErrorHandler.h
-
-#pragma once
-
-#include "ErrorHandler.h"
-#include "AndroidScheduler.h"
-#include "Scheduler.h"
-#include
-#include
-#include
-#include "Logger.h"
-
-namespace reanimated
-{
-
- class AndroidErrorHandler : public JavaClass, public ErrorHandler {
- std::shared_ptr error;
- std::shared_ptr scheduler;
- void raiseSpec() override;
- public:
- static auto constexpr kJavaDescriptor = "Lcom/swmansion/reanimated/AndroidErrorHandler;";
- AndroidErrorHandler(
- std::shared_ptr scheduler);
- std::shared_ptr getScheduler() override;
- std::shared_ptr getError() override;
- void setError(std::string message) override;
- virtual ~AndroidErrorHandler() {}
- };
-
-}
diff --git a/android/src/main/cpp/reanimated-headers/AndroidScheduler.h b/android/src/main/cpp/reanimated-headers/AndroidScheduler.h
deleted file mode 100644
index e96887cb..00000000
--- a/android/src/main/cpp/reanimated-headers/AndroidScheduler.h
+++ /dev/null
@@ -1,37 +0,0 @@
-// copied from https://github.com/software-mansion/react-native-reanimated/blob/master/android/src/main/cpp/headers/AndroidScheduler.h
-
-#pragma once
-
-#include
-#include
-#include
-#include
-#include
-#include "Scheduler.h"
-
-namespace reanimated {
-
- using namespace facebook;
-
- class AndroidScheduler : public jni::HybridClass {
- public:
- static auto constexpr kJavaDescriptor = "Lcom/swmansion/reanimated/Scheduler;";
- static jni::local_ref initHybrid(jni::alias_ref jThis);
- static void registerNatives();
-
- std::shared_ptr getScheduler() { return scheduler_; }
-
- void scheduleOnUI();
-
- private:
- friend HybridBase;
-
- void triggerUI();
-
- jni::global_ref javaPart_;
- std::shared_ptr scheduler_;
-
- explicit AndroidScheduler(jni::alias_ref jThis);
- };
-
-}
diff --git a/android/src/main/java/com/mrousavy/camera/CameraView+Focus.kt b/android/src/main/java/com/mrousavy/camera/CameraView+Focus.kt
deleted file mode 100644
index c945c0b6..00000000
--- a/android/src/main/java/com/mrousavy/camera/CameraView+Focus.kt
+++ /dev/null
@@ -1,29 +0,0 @@
-package com.mrousavy.camera
-
-import androidx.camera.core.FocusMeteringAction
-import com.facebook.react.bridge.ReadableMap
-import kotlinx.coroutines.guava.await
-import kotlinx.coroutines.withContext
-import java.util.concurrent.TimeUnit
-
-suspend fun CameraView.focus(pointMap: ReadableMap) {
- val cameraControl = camera?.cameraControl ?: throw CameraNotReadyError()
- if (!pointMap.hasKey("x") || !pointMap.hasKey("y")) {
- throw InvalidTypeScriptUnionError("point", pointMap.toString())
- }
-
- val dpi = resources.displayMetrics.density
- val x = pointMap.getDouble("x") * dpi
- val y = pointMap.getDouble("y") * dpi
-
- // Getting the point from the previewView needs to be run on the UI thread
- val point = withContext(coroutineScope.coroutineContext) {
- previewView.meteringPointFactory.createPoint(x.toFloat(), y.toFloat());
- }
-
- val action = FocusMeteringAction.Builder(point, FocusMeteringAction.FLAG_AF or FocusMeteringAction.FLAG_AE)
- .setAutoCancelDuration(5, TimeUnit.SECONDS) // auto-reset after 5 seconds
- .build()
-
- cameraControl.startFocusAndMetering(action).await()
-}
diff --git a/android/src/main/java/com/mrousavy/camera/CameraView+RecordVideo.kt b/android/src/main/java/com/mrousavy/camera/CameraView+RecordVideo.kt
deleted file mode 100644
index 14ee7472..00000000
--- a/android/src/main/java/com/mrousavy/camera/CameraView+RecordVideo.kt
+++ /dev/null
@@ -1,122 +0,0 @@
-package com.mrousavy.camera
-
-import android.Manifest
-import android.annotation.SuppressLint
-import android.content.pm.PackageManager
-import androidx.camera.video.FileOutputOptions
-import androidx.camera.video.VideoRecordEvent
-import androidx.core.content.ContextCompat
-import androidx.core.util.Consumer
-import com.facebook.react.bridge.*
-import com.mrousavy.camera.utils.makeErrorMap
-import java.io.File
-import java.text.SimpleDateFormat
-import java.util.*
-
-data class TemporaryFile(val path: String)
-
-fun CameraView.startRecording(options: ReadableMap, onRecordCallback: Callback) {
- if (videoCapture == null) {
- if (video == true) {
- throw CameraNotReadyError()
- } else {
- throw VideoNotEnabledError()
- }
- }
-
- // check audio permission
- if (audio == true) {
- if (ContextCompat.checkSelfPermission(context, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
- throw MicrophonePermissionError()
- }
- }
-
- if (options.hasKey("flash")) {
- val enableFlash = options.getString("flash") == "on"
- // overrides current torch mode value to enable flash while recording
- camera!!.cameraControl.enableTorch(enableFlash)
- }
-
- val id = SimpleDateFormat("yyyyMMdd_HHmmss", Locale.US).format(Date())
- val file = File.createTempFile("VisionCamera-${id}", ".mp4")
- val fileOptions = FileOutputOptions.Builder(file).build()
-
- val recorder = videoCapture!!.output
- var recording = recorder.prepareRecording(context, fileOptions)
-
- if (audio == true) {
- @SuppressLint("MissingPermission")
- recording = recording.withAudioEnabled()
- }
-
- activeVideoRecording = recording.start(ContextCompat.getMainExecutor(context), object : Consumer {
- override fun accept(event: VideoRecordEvent?) {
- if (event is VideoRecordEvent.Finalize) {
- if (event.hasError()) {
- // error occurred!
- val error = when (event.error) {
- VideoRecordEvent.Finalize.ERROR_ENCODING_FAILED -> VideoEncoderError(event.cause)
- VideoRecordEvent.Finalize.ERROR_FILE_SIZE_LIMIT_REACHED -> FileSizeLimitReachedError(event.cause)
- VideoRecordEvent.Finalize.ERROR_INSUFFICIENT_STORAGE -> InsufficientStorageError(event.cause)
- VideoRecordEvent.Finalize.ERROR_INVALID_OUTPUT_OPTIONS -> InvalidVideoOutputOptionsError(event.cause)
- VideoRecordEvent.Finalize.ERROR_NO_VALID_DATA -> NoValidDataError(event.cause)
- VideoRecordEvent.Finalize.ERROR_RECORDER_ERROR -> RecorderError(event.cause)
- VideoRecordEvent.Finalize.ERROR_SOURCE_INACTIVE -> InactiveSourceError(event.cause)
- else -> UnknownCameraError(event.cause)
- }
- val map = makeErrorMap("${error.domain}/${error.id}", error.message, error)
- onRecordCallback(null, map)
- } else {
- // recording saved successfully!
- val map = Arguments.createMap()
- map.putString("path", event.outputResults.outputUri.toString())
- map.putDouble("duration", /* seconds */ event.recordingStats.recordedDurationNanos.toDouble() / 1000000.0 / 1000.0)
- map.putDouble("size", /* kB */ event.recordingStats.numBytesRecorded.toDouble() / 1000.0)
- onRecordCallback(map, null)
- }
-
- // reset the torch mode
- camera!!.cameraControl.enableTorch(torch == "on")
- }
- }
- })
-}
-
-@SuppressLint("RestrictedApi")
-fun CameraView.pauseRecording() {
- if (videoCapture == null) {
- throw CameraNotReadyError()
- }
- if (activeVideoRecording == null) {
- throw NoRecordingInProgressError()
- }
-
- activeVideoRecording!!.pause()
-}
-
-@SuppressLint("RestrictedApi")
-fun CameraView.resumeRecording() {
- if (videoCapture == null) {
- throw CameraNotReadyError()
- }
- if (activeVideoRecording == null) {
- throw NoRecordingInProgressError()
- }
-
- activeVideoRecording!!.resume()
-}
-
-@SuppressLint("RestrictedApi")
-fun CameraView.stopRecording() {
- if (videoCapture == null) {
- throw CameraNotReadyError()
- }
- if (activeVideoRecording == null) {
- throw NoRecordingInProgressError()
- }
-
- activeVideoRecording!!.stop()
-
- // reset torch mode to original value
- camera!!.cameraControl.enableTorch(torch == "on")
-}
diff --git a/android/src/main/java/com/mrousavy/camera/CameraView+TakePhoto.kt b/android/src/main/java/com/mrousavy/camera/CameraView+TakePhoto.kt
deleted file mode 100644
index cb7854e4..00000000
--- a/android/src/main/java/com/mrousavy/camera/CameraView+TakePhoto.kt
+++ /dev/null
@@ -1,114 +0,0 @@
-package com.mrousavy.camera
-
-import android.annotation.SuppressLint
-import android.hardware.camera2.*
-import android.util.Log
-import androidx.camera.camera2.interop.Camera2CameraInfo
-import androidx.camera.core.ImageCapture
-import androidx.camera.core.ImageProxy
-import androidx.exifinterface.media.ExifInterface
-import com.facebook.react.bridge.Arguments
-import com.facebook.react.bridge.ReadableMap
-import com.facebook.react.bridge.WritableMap
-import com.mrousavy.camera.utils.*
-import kotlinx.coroutines.*
-import java.io.File
-import kotlin.system.measureTimeMillis
-
-@SuppressLint("UnsafeOptInUsageError")
-suspend fun CameraView.takePhoto(options: ReadableMap): WritableMap = coroutineScope {
- if (fallbackToSnapshot) {
- Log.i(CameraView.TAG, "takePhoto() called, but falling back to Snapshot because 1 use-case is already occupied.")
- return@coroutineScope takeSnapshot(options)
- }
-
- val startFunc = System.nanoTime()
- Log.i(CameraView.TAG, "takePhoto() called")
- if (imageCapture == null) {
- if (photo == true) {
- throw CameraNotReadyError()
- } else {
- throw PhotoNotEnabledError()
- }
- }
-
- if (options.hasKey("flash")) {
- val flashMode = options.getString("flash")
- imageCapture!!.flashMode = when (flashMode) {
- "on" -> ImageCapture.FLASH_MODE_ON
- "off" -> ImageCapture.FLASH_MODE_OFF
- "auto" -> ImageCapture.FLASH_MODE_AUTO
- else -> throw InvalidTypeScriptUnionError("flash", flashMode ?: "(null)")
- }
- }
- // All those options are not yet implemented - see https://github.com/mrousavy/react-native-vision-camera/issues/75
- if (options.hasKey("photoCodec")) {
- // TODO photoCodec
- }
- if (options.hasKey("qualityPrioritization")) {
- // TODO qualityPrioritization
- }
- if (options.hasKey("enableAutoRedEyeReduction")) {
- // TODO enableAutoRedEyeReduction
- }
- if (options.hasKey("enableDualCameraFusion")) {
- // TODO enableDualCameraFusion
- }
- if (options.hasKey("enableAutoStabilization")) {
- // TODO enableAutoStabilization
- }
- if (options.hasKey("enableAutoDistortionCorrection")) {
- // TODO enableAutoDistortionCorrection
- }
- val skipMetadata = if (options.hasKey("skipMetadata")) options.getBoolean("skipMetadata") else false
-
- val camera2Info = Camera2CameraInfo.from(camera!!.cameraInfo)
- val lensFacing = camera2Info.getCameraCharacteristic(CameraCharacteristics.LENS_FACING)
-
- val results = awaitAll(
- async(coroutineContext) {
- Log.d(CameraView.TAG, "Taking picture...")
- val startCapture = System.nanoTime()
- val pic = imageCapture!!.takePicture(takePhotoExecutor)
- val endCapture = System.nanoTime()
- Log.i(CameraView.TAG_PERF, "Finished image capture in ${(endCapture - startCapture) / 1_000_000}ms")
- pic
- },
- async(Dispatchers.IO) {
- Log.d(CameraView.TAG, "Creating temp file...")
- File.createTempFile("mrousavy", ".jpg", context.cacheDir).apply { deleteOnExit() }
- }
- )
- val photo = results.first { it is ImageProxy } as ImageProxy
- val file = results.first { it is File } as File
-
- val exif: ExifInterface?
- @Suppress("BlockingMethodInNonBlockingContext")
- withContext(Dispatchers.IO) {
- Log.d(CameraView.TAG, "Saving picture to ${file.absolutePath}...")
- val milliseconds = measureTimeMillis {
- val flipHorizontally = lensFacing == CameraCharacteristics.LENS_FACING_FRONT
- photo.save(file, flipHorizontally)
- }
- Log.i(CameraView.TAG_PERF, "Finished image saving in ${milliseconds}ms")
- // TODO: Read Exif from existing in-memory photo buffer instead of file?
- exif = if (skipMetadata) null else ExifInterface(file)
- }
-
- val map = Arguments.createMap()
- map.putString("path", file.absolutePath)
- map.putInt("width", photo.width)
- map.putInt("height", photo.height)
- map.putBoolean("isRawPhoto", photo.isRaw)
-
- val metadata = exif?.buildMetadataMap()
- map.putMap("metadata", metadata)
-
- photo.close()
-
- Log.d(CameraView.TAG, "Finished taking photo!")
-
- val endFunc = System.nanoTime()
- Log.i(CameraView.TAG_PERF, "Finished function execution in ${(endFunc - startFunc) / 1_000_000}ms")
- return@coroutineScope map
-}
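
The takePhoto() extension removed above captures the image and creates the temporary output file concurrently before saving and reading EXIF. The same fan-out/fan-in shape, reduced to a self-contained sketch with plain kotlinx.coroutines on the classpath (the capture is stubbed out and the names are illustrative):

```kotlin
import kotlinx.coroutines.*
import java.io.File

// Run the (stubbed) capture and the temp-file creation in parallel, then combine,
// mirroring the concurrency structure of the deleted takePhoto() extension.
fun main() = runBlocking {
    val capture = async {
        delay(50)                 // stand-in for the actual image capture call
        "captured-image-bytes"
    }
    val tempFile = async(Dispatchers.IO) {
        File.createTempFile("photo", ".jpg").apply { deleteOnExit() }
    }
    println("saving ${capture.await()} to ${tempFile.await().absolutePath}")
}
```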
diff --git a/android/src/main/java/com/mrousavy/camera/CameraView+TakeSnapshot.kt b/android/src/main/java/com/mrousavy/camera/CameraView+TakeSnapshot.kt
deleted file mode 100644
index 3f3299bf..00000000
--- a/android/src/main/java/com/mrousavy/camera/CameraView+TakeSnapshot.kt
+++ /dev/null
@@ -1,60 +0,0 @@
-package com.mrousavy.camera
-
-import android.graphics.Bitmap
-import androidx.exifinterface.media.ExifInterface
-import com.facebook.react.bridge.Arguments
-import com.facebook.react.bridge.ReadableMap
-import com.facebook.react.bridge.WritableMap
-import com.mrousavy.camera.utils.buildMetadataMap
-import kotlinx.coroutines.Dispatchers
-import kotlinx.coroutines.coroutineScope
-import kotlinx.coroutines.withContext
-import java.io.File
-import java.io.FileOutputStream
-import kotlinx.coroutines.guava.await
-
-suspend fun CameraView.takeSnapshot(options: ReadableMap): WritableMap = coroutineScope {
- val camera = camera ?: throw com.mrousavy.camera.CameraNotReadyError()
- val enableFlash = options.getString("flash") == "on"
-
- try {
- if (enableFlash) {
- camera.cameraControl.enableTorch(true).await()
- }
-
- val bitmap = withContext(coroutineScope.coroutineContext) {
- previewView.bitmap ?: throw CameraNotReadyError()
- }
-
- val quality = if (options.hasKey("quality")) options.getInt("quality") else 100
-
- val file: File
- val exif: ExifInterface
- @Suppress("BlockingMethodInNonBlockingContext")
- withContext(Dispatchers.IO) {
- file = File.createTempFile("mrousavy", ".jpg", context.cacheDir).apply { deleteOnExit() }
- FileOutputStream(file).use { stream ->
- bitmap.compress(Bitmap.CompressFormat.JPEG, quality, stream)
- }
- exif = ExifInterface(file)
- }
-
- val map = Arguments.createMap()
- map.putString("path", file.absolutePath)
- map.putInt("width", bitmap.width)
- map.putInt("height", bitmap.height)
- map.putBoolean("isRawPhoto", false)
-
- val skipMetadata =
- if (options.hasKey("skipMetadata")) options.getBoolean("skipMetadata") else false
- val metadata = if (skipMetadata) null else exif.buildMetadataMap()
- map.putMap("metadata", metadata)
-
- return@coroutineScope map
- } finally {
- if (enableFlash) {
- // reset to `torch` property
- camera.cameraControl.enableTorch(this@takeSnapshot.torch == "on")
- }
- }
-}
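
The snapshot path above reduces to: take the PreviewView's current Bitmap, compress it into a JPEG temp file at the requested quality, then read EXIF from that file. A minimal sketch of the save step using only standard Android APIs (the function name and defaults are illustrative):

```kotlin
import android.graphics.Bitmap
import java.io.File
import java.io.FileOutputStream

// Compress a Bitmap into a JPEG temp file, as the deleted takeSnapshot() does
// before building its result map and reading EXIF metadata.
fun saveSnapshot(bitmap: Bitmap, cacheDir: File, quality: Int = 100): File {
    val file = File.createTempFile("snapshot", ".jpg", cacheDir).apply { deleteOnExit() }
    FileOutputStream(file).use { stream ->
        bitmap.compress(Bitmap.CompressFormat.JPEG, quality, stream)
    }
    return file
}
```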
diff --git a/android/src/main/java/com/mrousavy/camera/CameraView.kt b/android/src/main/java/com/mrousavy/camera/CameraView.kt
deleted file mode 100644
index e666cb51..00000000
--- a/android/src/main/java/com/mrousavy/camera/CameraView.kt
+++ /dev/null
@@ -1,549 +0,0 @@
-package com.mrousavy.camera
-
-import android.Manifest
-import android.annotation.SuppressLint
-import android.content.Context
-import android.content.pm.PackageManager
-import android.content.res.Configuration
-import android.hardware.camera2.*
-import android.util.Log
-import android.util.Range
-import android.view.*
-import android.view.View.OnTouchListener
-import android.widget.FrameLayout
-import androidx.camera.camera2.interop.Camera2Interop
-import androidx.camera.core.*
-import androidx.camera.core.impl.*
-import androidx.camera.extensions.*
-import androidx.camera.lifecycle.ProcessCameraProvider
-import androidx.camera.video.*
-import androidx.camera.video.VideoCapture
-import androidx.camera.view.PreviewView
-import androidx.core.content.ContextCompat
-import androidx.lifecycle.*
-import com.facebook.jni.HybridData
-import com.facebook.proguard.annotations.DoNotStrip
-import com.facebook.react.bridge.*
-import com.facebook.react.uimanager.events.RCTEventEmitter
-import com.mrousavy.camera.frameprocessor.FrameProcessorPerformanceDataCollector
-import com.mrousavy.camera.frameprocessor.FrameProcessorRuntimeManager
-import com.mrousavy.camera.utils.*
-import kotlinx.coroutines.*
-import kotlinx.coroutines.guava.await
-import java.lang.IllegalArgumentException
-import java.util.concurrent.ExecutorService
-import java.util.concurrent.Executors
-import kotlin.math.floor
-import kotlin.math.max
-import kotlin.math.min
-
-//
-// TODOs for the CameraView which are currently too hard to implement, either because of CameraX's limitations or my brain capacity.
-//
-// CameraView
-// TODO: Actually use correct sizes for video and photo (currently it's both the video size)
-// TODO: Configurable FPS higher than 30
-// TODO: High-speed video recordings (export in CameraViewModule::getAvailableVideoDevices(), and set in CameraView::configurePreview()) (120FPS+)
-// TODO: configureSession() enableDepthData
-// TODO: configureSession() enableHighQualityPhotos
-// TODO: configureSession() enablePortraitEffectsMatteDelivery
-// TODO: configureSession() colorSpace
-
-// CameraView+RecordVideo
-// TODO: Better startRecording()/stopRecording() (promise + callback, wait for TurboModules/JSI)
-// TODO: videoStabilizationMode
-// TODO: Return Video size/duration
-
-// CameraView+TakePhoto
-// TODO: Mirror selfie images
-// TODO: takePhoto() depth data
-// TODO: takePhoto() raw capture
-// TODO: takePhoto() photoCodec ("hevc" | "jpeg" | "raw")
-// TODO: takePhoto() qualityPrioritization
-// TODO: takePhoto() enableAutoRedEyeReduction
-// TODO: takePhoto() enableAutoStabilization
-// TODO: takePhoto() enableAutoDistortionCorrection
-// TODO: takePhoto() return with jsi::Value Image reference for faster capture
-
-@Suppress("KotlinJniMissingFunction") // I use fbjni, Android Studio is not smart enough to realize that.
-@SuppressLint("ClickableViewAccessibility", "ViewConstructor")
-class CameraView(context: Context, private val frameProcessorThread: ExecutorService) : FrameLayout(context), LifecycleOwner {
- companion object {
- const val TAG = "CameraView"
- const val TAG_PERF = "CameraView.performance"
-
- private val propsThatRequireSessionReconfiguration = arrayListOf("cameraId", "format", "fps", "hdr", "lowLightBoost", "photo", "video", "enableFrameProcessor")
- private val arrayListOfZoom = arrayListOf("zoom")
- }
-
- // react properties
- // props that require reconfiguring
- var cameraId: String? = null // this is actually not a react prop directly, but the result of setting device={}
- var enableDepthData = false
- var enableHighQualityPhotos: Boolean? = null
- var enablePortraitEffectsMatteDelivery = false
- // use-cases
- var photo: Boolean? = null
- var video: Boolean? = null
- var audio: Boolean? = null
- var enableFrameProcessor = false
- // props that require format reconfiguring
- var format: ReadableMap? = null
- var fps: Int? = null
- var hdr: Boolean? = null // nullable bool
- var colorSpace: String? = null
- var lowLightBoost: Boolean? = null // nullable bool
- // other props
- var isActive = false
- var torch = "off"
- var zoom: Float = 1f // in "factor"
- var orientation: String? = null
- var enableZoomGesture = false
- set(value) {
- field = value
- setOnTouchListener(if (value) touchEventListener else null)
- }
- var frameProcessorFps = 1.0
- set(value) {
- field = value
- actualFrameProcessorFps = if (value == -1.0) 30.0 else value
- lastFrameProcessorPerformanceEvaluation = System.currentTimeMillis()
- frameProcessorPerformanceDataCollector.clear()
- }
-
- // private properties
- private var isMounted = false
- private val reactContext: ReactContext
- get() = context as ReactContext
-
- @Suppress("JoinDeclarationAndAssignment")
- internal val previewView: PreviewView
- private val cameraExecutor = Executors.newSingleThreadExecutor()
- internal val takePhotoExecutor = Executors.newSingleThreadExecutor()
- internal val recordVideoExecutor = Executors.newSingleThreadExecutor()
- internal var coroutineScope = CoroutineScope(Dispatchers.Main)
-
- internal var camera: Camera? = null
- internal var imageCapture: ImageCapture? = null
- internal var videoCapture: VideoCapture? = null
- private var imageAnalysis: ImageAnalysis? = null
- private var preview: Preview? = null
-
- internal var activeVideoRecording: Recording? = null
-
- private var lastFrameProcessorCall = System.currentTimeMillis()
-
- private var extensionsManager: ExtensionsManager? = null
-
- private val scaleGestureListener: ScaleGestureDetector.SimpleOnScaleGestureListener
- private val scaleGestureDetector: ScaleGestureDetector
- private val touchEventListener: OnTouchListener
-
- private val lifecycleRegistry: LifecycleRegistry
- private var hostLifecycleState: Lifecycle.State
-
- private val inputRotation: Int
- get() {
- return context.displayRotation
- }
- private val outputRotation: Int
- get() {
- if (orientation != null) {
- // user is overriding output orientation
- return when (orientation!!) {
- "portrait" -> Surface.ROTATION_0
- "landscapeRight" -> Surface.ROTATION_90
- "portraitUpsideDown" -> Surface.ROTATION_180
- "landscapeLeft" -> Surface.ROTATION_270
- else -> throw InvalidTypeScriptUnionError("orientation", orientation!!)
- }
- } else {
- // use same as input rotation
- return inputRotation
- }
- }
-
- private var minZoom: Float = 1f
- private var maxZoom: Float = 1f
-
- private var actualFrameProcessorFps = 30.0
- private val frameProcessorPerformanceDataCollector = FrameProcessorPerformanceDataCollector()
- private var lastSuggestedFrameProcessorFps = 0.0
- private var lastFrameProcessorPerformanceEvaluation = System.currentTimeMillis()
- private val isReadyForNewEvaluation: Boolean
- get() {
- val lastPerformanceEvaluationElapsedTime = System.currentTimeMillis() - lastFrameProcessorPerformanceEvaluation
- return lastPerformanceEvaluationElapsedTime > 1000
- }
-
- @DoNotStrip
- private var mHybridData: HybridData? = null
-
- @Suppress("LiftReturnOrAssignment", "RedundantIf")
- internal val fallbackToSnapshot: Boolean
- @SuppressLint("UnsafeOptInUsageError")
- get() {
- if (video != true && !enableFrameProcessor) {
- // Both use-cases are disabled, so `photo` is the only use-case anyways. Don't need to fallback here.
- return false
- }
- cameraId?.let { cameraId ->
- val cameraManger = reactContext.getSystemService(Context.CAMERA_SERVICE) as? CameraManager
- cameraManger?.let {
- val characteristics = cameraManger.getCameraCharacteristics(cameraId)
- val hardwareLevel = characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL)
- if (hardwareLevel == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) {
- // Camera only supports a single use-case at a time
- return true
- } else {
- if (video == true && enableFrameProcessor) {
- // Camera supports max. 2 use-cases, but both are occupied by `frameProcessor` and `video`
- return true
- } else {
- // Camera supports max. 2 use-cases and only one is occupied (either `frameProcessor` or `video`), so we can add `photo`
- return false
- }
- }
- }
- }
- return false
- }
-
- init {
- if (FrameProcessorRuntimeManager.enableFrameProcessors) {
- mHybridData = initHybrid()
- }
-
- previewView = PreviewView(context)
- previewView.layoutParams = LayoutParams(LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT)
- previewView.installHierarchyFitter() // If this is not called correctly, view finder will be black/blank
- addView(previewView)
-
- scaleGestureListener = object : ScaleGestureDetector.SimpleOnScaleGestureListener() {
- override fun onScale(detector: ScaleGestureDetector): Boolean {
- zoom = max(min((zoom * detector.scaleFactor), maxZoom), minZoom)
- update(arrayListOfZoom)
- return true
- }
- }
- scaleGestureDetector = ScaleGestureDetector(context, scaleGestureListener)
- touchEventListener = OnTouchListener { _, event -> return@OnTouchListener scaleGestureDetector.onTouchEvent(event) }
-
- hostLifecycleState = Lifecycle.State.INITIALIZED
- lifecycleRegistry = LifecycleRegistry(this)
- reactContext.addLifecycleEventListener(object : LifecycleEventListener {
- override fun onHostResume() {
- hostLifecycleState = Lifecycle.State.RESUMED
- updateLifecycleState()
- // workaround for https://issuetracker.google.com/issues/147354615, preview must be bound on resume
- update(propsThatRequireSessionReconfiguration)
- }
- override fun onHostPause() {
- hostLifecycleState = Lifecycle.State.CREATED
- updateLifecycleState()
- }
- override fun onHostDestroy() {
- hostLifecycleState = Lifecycle.State.DESTROYED
- updateLifecycleState()
- cameraExecutor.shutdown()
- takePhotoExecutor.shutdown()
- recordVideoExecutor.shutdown()
- reactContext.removeLifecycleEventListener(this)
- }
- })
- }
-
- override fun onConfigurationChanged(newConfig: Configuration?) {
- super.onConfigurationChanged(newConfig)
- updateOrientation()
- }
-
- @SuppressLint("RestrictedApi")
- private fun updateOrientation() {
- preview?.targetRotation = inputRotation
- imageCapture?.targetRotation = outputRotation
- videoCapture?.targetRotation = outputRotation
- imageAnalysis?.targetRotation = outputRotation
- }
-
- private external fun initHybrid(): HybridData
- private external fun frameProcessorCallback(frame: ImageProxy)
-
- override fun getLifecycle(): Lifecycle {
- return lifecycleRegistry
- }
-
- /**
- * Updates the custom Lifecycle to match the host activity's lifecycle, and if it's active we narrow it down to the [isActive] and [isAttachedToWindow] fields.
- */
- private fun updateLifecycleState() {
- val lifecycleBefore = lifecycleRegistry.currentState
- if (hostLifecycleState == Lifecycle.State.RESUMED) {
- // Host Lifecycle (Activity) is currently active (RESUMED), so we narrow it down to the view's lifecycle
- if (isActive && isAttachedToWindow) {
- lifecycleRegistry.currentState = Lifecycle.State.RESUMED
- } else {
- lifecycleRegistry.currentState = Lifecycle.State.CREATED
- }
- } else {
- // Host Lifecycle (Activity) is currently inactive (STARTED or DESTROYED), so that overrules our view's lifecycle
- lifecycleRegistry.currentState = hostLifecycleState
- }
- Log.d(TAG, "Lifecycle went from ${lifecycleBefore.name} -> ${lifecycleRegistry.currentState.name} (isActive: $isActive | isAttachedToWindow: $isAttachedToWindow)")
- }
-
- override fun onAttachedToWindow() {
- super.onAttachedToWindow()
- updateLifecycleState()
- if (!isMounted) {
- isMounted = true
- invokeOnViewReady()
- }
- }
-
- override fun onDetachedFromWindow() {
- super.onDetachedFromWindow()
- updateLifecycleState()
- }
-
- /**
- * Invalidate all React Props and reconfigure the device
- */
- fun update(changedProps: ArrayList<String>) = previewView.post {
- // TODO: Does this introduce too much overhead?
- // I need to .post on the previewView because it might've not been initialized yet
- // I need to use CoroutineScope.launch because of the suspend fun [configureSession]
- coroutineScope.launch {
- try {
- val shouldReconfigureSession = changedProps.containsAny(propsThatRequireSessionReconfiguration)
- val shouldReconfigureZoom = shouldReconfigureSession || changedProps.contains("zoom")
- val shouldReconfigureTorch = shouldReconfigureSession || changedProps.contains("torch")
- val shouldUpdateOrientation = shouldReconfigureSession || changedProps.contains("orientation")
-
- if (changedProps.contains("isActive")) {
- updateLifecycleState()
- }
- if (shouldReconfigureSession) {
- configureSession()
- }
- if (shouldReconfigureZoom) {
- val zoomClamped = max(min(zoom, maxZoom), minZoom)
- camera!!.cameraControl.setZoomRatio(zoomClamped)
- }
- if (shouldReconfigureTorch) {
- camera!!.cameraControl.enableTorch(torch == "on")
- }
- if (shouldUpdateOrientation) {
- updateOrientation()
- }
- } catch (e: Throwable) {
- Log.e(TAG, "update() threw: ${e.message}")
- invokeOnError(e)
- }
- }
- }
-
- /**
- * Configures the camera capture session. This should only be called when the camera device changes.
- */
- @SuppressLint("RestrictedApi")
- private suspend fun configureSession() {
- try {
- val startTime = System.currentTimeMillis()
- Log.i(TAG, "Configuring session...")
- if (ContextCompat.checkSelfPermission(context, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
- throw CameraPermissionError()
- }
- if (cameraId == null) {
- throw NoCameraDeviceError()
- }
- if (format != null)
- Log.i(TAG, "Configuring session with Camera ID $cameraId and custom format...")
- else
- Log.i(TAG, "Configuring session with Camera ID $cameraId and default format options...")
-
- // Used to bind the lifecycle of cameras to the lifecycle owner
- val cameraProvider = ProcessCameraProvider.getInstance(reactContext).await()
-
- var cameraSelector = CameraSelector.Builder().byID(cameraId!!).build()
-
- val tryEnableExtension: (suspend (extension: Int) -> Unit) = lambda@ { extension ->
- if (extensionsManager == null) {
- Log.i(TAG, "Initializing ExtensionsManager...")
- extensionsManager = ExtensionsManager.getInstanceAsync(context, cameraProvider).await()
- }
- if (extensionsManager!!.isExtensionAvailable(cameraSelector, extension)) {
- Log.i(TAG, "Enabling extension $extension...")
- cameraSelector = extensionsManager!!.getExtensionEnabledCameraSelector(cameraSelector, extension)
- } else {
- Log.e(TAG, "Extension $extension is not available for the given Camera!")
- throw when (extension) {
- ExtensionMode.HDR -> HdrNotContainedInFormatError()
- ExtensionMode.NIGHT -> LowLightBoostNotContainedInFormatError()
- else -> Error("Invalid extension supplied! Extension $extension is not available.")
- }
- }
- }
-
- val previewBuilder = Preview.Builder()
- .setTargetRotation(inputRotation)
-
- val imageCaptureBuilder = ImageCapture.Builder()
- .setTargetRotation(outputRotation)
- .setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
-
- val videoRecorderBuilder = Recorder.Builder()
- .setExecutor(cameraExecutor)
-
- val imageAnalysisBuilder = ImageAnalysis.Builder()
- .setTargetRotation(outputRotation)
- .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
- .setBackgroundExecutor(frameProcessorThread)
-
- if (format == null) {
- // let CameraX automatically find best resolution for the target aspect ratio
- Log.i(TAG, "No custom format has been set, CameraX will automatically determine best configuration...")
- val aspectRatio = aspectRatio(previewView.height, previewView.width) // flipped because it's in sensor orientation.
- previewBuilder.setTargetAspectRatio(aspectRatio)
- imageCaptureBuilder.setTargetAspectRatio(aspectRatio)
- // TODO: Aspect Ratio for Video Recorder?
- imageAnalysisBuilder.setTargetAspectRatio(aspectRatio)
- } else {
- // User has selected a custom format={}. Use that
- val format = DeviceFormat(format!!)
- Log.i(TAG, "Using custom format - photo: ${format.photoSize}, video: ${format.videoSize} @ $fps FPS")
- if (video == true) {
- previewBuilder.setTargetResolution(format.videoSize)
- } else {
- previewBuilder.setTargetResolution(format.photoSize)
- }
- imageCaptureBuilder.setTargetResolution(format.photoSize)
- imageAnalysisBuilder.setTargetResolution(format.photoSize)
-
- // TODO: Ability to select resolution exactly depending on format? Just like on iOS...
- when (min(format.videoSize.height, format.videoSize.width)) {
- in 0..480 -> videoRecorderBuilder.setQualitySelector(QualitySelector.from(Quality.SD))
- in 480..720 -> videoRecorderBuilder.setQualitySelector(QualitySelector.from(Quality.HD, FallbackStrategy.lowerQualityThan(Quality.HD)))
- in 720..1080 -> videoRecorderBuilder.setQualitySelector(QualitySelector.from(Quality.FHD, FallbackStrategy.lowerQualityThan(Quality.FHD)))
- in 1080..2160 -> videoRecorderBuilder.setQualitySelector(QualitySelector.from(Quality.UHD, FallbackStrategy.lowerQualityThan(Quality.UHD)))
- in 2160..4320 -> videoRecorderBuilder.setQualitySelector(QualitySelector.from(Quality.HIGHEST, FallbackStrategy.lowerQualityThan(Quality.HIGHEST)))
- }
-
- fps?.let { fps ->
- if (format.frameRateRanges.any { it.contains(fps) }) {
- // Camera supports the given FPS (frame rate range)
- val frameDuration = (1_000_000_000.0 / fps.toDouble()).toLong()
-
- Log.i(TAG, "Setting AE_TARGET_FPS_RANGE to $fps-$fps, and SENSOR_FRAME_DURATION to $frameDuration")
- Camera2Interop.Extender(previewBuilder)
- .setCaptureRequestOption(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, Range(fps, fps))
- .setCaptureRequestOption(CaptureRequest.SENSOR_FRAME_DURATION, frameDuration)
- // TODO: Frame Rate/FPS for Video Recorder?
- } else {
- throw FpsNotContainedInFormatError(fps)
- }
- }
- if (hdr == true) {
- tryEnableExtension(ExtensionMode.HDR)
- }
- if (lowLightBoost == true) {
- tryEnableExtension(ExtensionMode.NIGHT)
- }
- }
-
-
- // Unbind use cases before rebinding
- videoCapture = null
- imageCapture = null
- imageAnalysis = null
- cameraProvider.unbindAll()
-
- // Bind use cases to camera
- val useCases = ArrayList<UseCase>()
- if (video == true) {
- Log.i(TAG, "Adding VideoCapture use-case...")
-
- val videoRecorder = videoRecorderBuilder.build()
- videoCapture = VideoCapture.withOutput(videoRecorder)
- videoCapture!!.targetRotation = outputRotation
- useCases.add(videoCapture!!)
- }
- if (photo == true) {
- if (fallbackToSnapshot) {
- Log.i(TAG, "Tried to add photo use-case (`photo={true}`) but the Camera device only supports " +
- "a single use-case at a time. Falling back to Snapshot capture.")
- } else {
- Log.i(TAG, "Adding ImageCapture use-case...")
- imageCapture = imageCaptureBuilder.build()
- useCases.add(imageCapture!!)
- }
- }
- if (enableFrameProcessor) {
- Log.i(TAG, "Adding ImageAnalysis use-case...")
- imageAnalysis = imageAnalysisBuilder.build().apply {
- setAnalyzer(cameraExecutor, { image ->
- val now = System.currentTimeMillis()
- val intervalMs = (1.0 / actualFrameProcessorFps) * 1000.0
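- // e.g. at actualFrameProcessorFps = 30 this interval is ~33.3 ms, so at most ~30 frames per second reach the frame processor.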
- if (now - lastFrameProcessorCall > intervalMs) {
- lastFrameProcessorCall = now
-
- val perfSample = frameProcessorPerformanceDataCollector.beginPerformanceSampleCollection()
- frameProcessorCallback(image)
- perfSample.endPerformanceSampleCollection()
- }
- image.close()
-
- if (isReadyForNewEvaluation) {
- // last evaluation was more than a second ago, evaluate again
- evaluateNewPerformanceSamples()
- }
- })
- }
- useCases.add(imageAnalysis!!)
- }
-
- preview = previewBuilder.build()
- Log.i(TAG, "Attaching ${useCases.size} use-cases...")
- camera = cameraProvider.bindToLifecycle(this, cameraSelector, preview, *useCases.toTypedArray())
- preview!!.setSurfaceProvider(previewView.surfaceProvider)
-
- minZoom = camera!!.cameraInfo.zoomState.value?.minZoomRatio ?: 1f
- maxZoom = camera!!.cameraInfo.zoomState.value?.maxZoomRatio ?: 1f
-
- val duration = System.currentTimeMillis() - startTime
- Log.i(TAG_PERF, "Session configured in $duration ms! Camera: ${camera!!}")
- invokeOnInitialized()
- } catch (exc: Throwable) {
- Log.e(TAG, "Failed to configure session: ${exc.message}")
- throw when (exc) {
- is CameraError -> exc
- is IllegalArgumentException -> {
- if (exc.message?.contains("too many use cases") == true) {
- ParallelVideoProcessingNotSupportedError(exc)
- } else {
- InvalidCameraDeviceError(exc)
- }
- }
- else -> UnknownCameraError(exc)
- }
- }
- }
-
- private fun evaluateNewPerformanceSamples() {
- lastFrameProcessorPerformanceEvaluation = System.currentTimeMillis()
- val maxFrameProcessorFps = 30 // TODO: Get maxFrameProcessorFps from ImageAnalyser
- val averageFps = 1.0 / frameProcessorPerformanceDataCollector.averageExecutionTimeSeconds
- val suggestedFrameProcessorFps = floor(min(averageFps, maxFrameProcessorFps.toDouble()))
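- // e.g. an average execution time of 50 ms per frame gives averageFps = 1 / 0.05 = 20,
- // so suggestedFrameProcessorFps becomes 20 (capped at the assumed 30 FPS maximum).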
-
- if (frameProcessorFps == -1.0) {
- // frameProcessorFps="auto"
- actualFrameProcessorFps = suggestedFrameProcessorFps
- } else {
- // frameProcessorFps={someCustomFpsValue}
- if (suggestedFrameProcessorFps != lastSuggestedFrameProcessorFps && suggestedFrameProcessorFps != frameProcessorFps) {
- invokeOnFrameProcessorPerformanceSuggestionAvailable(frameProcessorFps, suggestedFrameProcessorFps)
- lastSuggestedFrameProcessorFps = suggestedFrameProcessorFps
- }
- }
- }
-}
diff --git a/android/src/main/java/com/mrousavy/camera/CameraViewModule.kt b/android/src/main/java/com/mrousavy/camera/CameraViewModule.kt
deleted file mode 100644
index fb85308a..00000000
--- a/android/src/main/java/com/mrousavy/camera/CameraViewModule.kt
+++ /dev/null
@@ -1,403 +0,0 @@
-package com.mrousavy.camera
-
-import android.Manifest
-import android.content.Context
-import android.content.pm.PackageManager
-import android.hardware.camera2.CameraCharacteristics
-import android.hardware.camera2.CameraManager
-import android.os.Build
-import android.util.Log
-import android.util.Size
-import androidx.camera.core.CameraSelector
-import androidx.camera.extensions.ExtensionMode
-import androidx.camera.extensions.ExtensionsManager
-import androidx.camera.lifecycle.ProcessCameraProvider
-import androidx.camera.video.QualitySelector
-import androidx.core.content.ContextCompat
-import com.facebook.react.bridge.*
-import com.facebook.react.module.annotations.ReactModule
-import com.facebook.react.modules.core.PermissionAwareActivity
-import com.facebook.react.modules.core.PermissionListener
-import com.facebook.react.uimanager.UIManagerHelper
-import com.facebook.react.bridge.ReactApplicationContext
-import com.facebook.react.turbomodule.core.CallInvokerHolderImpl
-import com.mrousavy.camera.CameraView
-import com.mrousavy.camera.ViewNotFoundError
-import java.util.concurrent.ExecutorService
-import com.mrousavy.camera.frameprocessor.FrameProcessorRuntimeManager
-import com.mrousavy.camera.parsers.*
-import com.mrousavy.camera.utils.*
-import kotlinx.coroutines.*
-import kotlinx.coroutines.guava.await
-import java.util.concurrent.Executors
-
-@ReactModule(name = CameraViewModule.TAG)
-@Suppress("unused")
-class CameraViewModule(reactContext: ReactApplicationContext) : ReactContextBaseJavaModule(reactContext) {
- companion object {
- const val TAG = "CameraView"
- var RequestCode = 10
-
- fun parsePermissionStatus(status: Int): String {
- return when (status) {
- PackageManager.PERMISSION_DENIED -> "denied"
- PackageManager.PERMISSION_GRANTED -> "authorized"
- else -> "not-determined"
- }
- }
- }
-
- var frameProcessorThread: ExecutorService = Executors.newSingleThreadExecutor()
- private val coroutineScope = CoroutineScope(Dispatchers.Default) // TODO: or Dispatchers.Main?
- private var frameProcessorManager: FrameProcessorRuntimeManager? = null
-
- private fun cleanup() {
- if (coroutineScope.isActive) {
- coroutineScope.cancel("CameraViewModule has been destroyed.")
- }
- frameProcessorManager = null
- }
-
- override fun initialize() {
- super.initialize()
-
- if (frameProcessorManager == null) {
- frameProcessorThread.execute {
- frameProcessorManager = FrameProcessorRuntimeManager(reactApplicationContext, frameProcessorThread)
- }
- }
- }
-
- override fun onCatalystInstanceDestroy() {
- super.onCatalystInstanceDestroy()
- cleanup()
- }
-
- override fun invalidate() {
- super.invalidate()
- cleanup()
- }
-
- override fun getName(): String {
- return TAG
- }
-
- private fun findCameraView(viewId: Int): CameraView {
- Log.d(TAG, "Finding view $viewId...")
- val view = if (reactApplicationContext != null) UIManagerHelper.getUIManager(reactApplicationContext, viewId)?.resolveView(viewId) as CameraView? else null
- Log.d(TAG, if (view != null) "Found view $viewId!" else "Couldn't find view $viewId!")
- return view ?: throw ViewNotFoundError(viewId)
- }
-
- @ReactMethod
- fun takePhoto(viewTag: Int, options: ReadableMap, promise: Promise) {
- coroutineScope.launch {
- withPromise(promise) {
- val view = findCameraView(viewTag)
- view.takePhoto(options)
- }
- }
- }
-
- @Suppress("unused")
- @ReactMethod
- fun takeSnapshot(viewTag: Int, options: ReadableMap, promise: Promise) {
- coroutineScope.launch {
- withPromise(promise) {
- val view = findCameraView(viewTag)
- view.takeSnapshot(options)
- }
- }
- }
-
- // TODO: startRecording() cannot be awaited, because I can't have a Promise and a onRecordedCallback in the same function. Hopefully TurboModules allows that
- @ReactMethod
- fun startRecording(viewTag: Int, options: ReadableMap, onRecordCallback: Callback) {
- coroutineScope.launch {
- val view = findCameraView(viewTag)
- try {
- view.startRecording(options, onRecordCallback)
- } catch (error: CameraError) {
- val map = makeErrorMap("${error.domain}/${error.id}", error.message, error)
- onRecordCallback(null, map)
- } catch (error: Throwable) {
- val map = makeErrorMap("capture/unknown", "An unknown error occurred while trying to start a video recording!", error)
- onRecordCallback(null, map)
- }
- }
- }
-
- @ReactMethod
- fun pauseRecording(viewTag: Int, promise: Promise) {
- withPromise(promise) {
- val view = findCameraView(viewTag)
- view.pauseRecording()
- return@withPromise null
- }
- }
-
- @ReactMethod
- fun resumeRecording(viewTag: Int, promise: Promise) {
- withPromise(promise) {
- val view = findCameraView(viewTag)
- view.resumeRecording()
- return@withPromise null
- }
- }
-
- @ReactMethod
- fun stopRecording(viewTag: Int, promise: Promise) {
- withPromise(promise) {
- val view = findCameraView(viewTag)
- view.stopRecording()
- return@withPromise null
- }
- }
-
- @ReactMethod
- fun focus(viewTag: Int, point: ReadableMap, promise: Promise) {
- coroutineScope.launch {
- withPromise(promise) {
- val view = findCameraView(viewTag)
- view.focus(point)
- return@withPromise null
- }
- }
- }
-
- // TODO: This uses the Camera2 API to list all characteristics of a camera device and therefore doesn't work with Camera1. Find a way to use CameraX for this
- // https://issuetracker.google.com/issues/179925896
- @ReactMethod
- fun getAvailableCameraDevices(promise: Promise) {
- val startTime = System.currentTimeMillis()
- coroutineScope.launch {
- withPromise(promise) {
- val cameraProvider = ProcessCameraProvider.getInstance(reactApplicationContext).await()
- val extensionsManager = ExtensionsManager.getInstanceAsync(reactApplicationContext, cameraProvider).await()
- ProcessCameraProvider.getInstance(reactApplicationContext).await()
-
- val manager = reactApplicationContext.getSystemService(Context.CAMERA_SERVICE) as? CameraManager
- ?: throw CameraManagerUnavailableError()
-
- val cameraDevices: WritableArray = Arguments.createArray()
-
- manager.cameraIdList.filter{ id -> id.toIntOrNull() != null }.forEach loop@{ id ->
- val cameraSelector = CameraSelector.Builder().byID(id).build()
-
- val characteristics = manager.getCameraCharacteristics(id)
- val hardwareLevel = characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL)!!
-
- val capabilities = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)!!
- val isMultiCam = Build.VERSION.SDK_INT >= Build.VERSION_CODES.P &&
- capabilities.contains(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_LOGICAL_MULTI_CAMERA)
- val deviceTypes = characteristics.getDeviceTypes()
-
- val cameraConfig = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)!!
- val lensFacing = characteristics.get(CameraCharacteristics.LENS_FACING)!!
- val hasFlash = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE)!!
- val maxScalerZoom = characteristics.get(CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM)!!
- val supportsDepthCapture = Build.VERSION.SDK_INT >= Build.VERSION_CODES.M &&
- capabilities.contains(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_DEPTH_OUTPUT)
- val supportsRawCapture = capabilities.contains(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_RAW)
- val isoRange = characteristics.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE)
- val digitalStabilizationModes = characteristics.get(CameraCharacteristics.CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES)
- val opticalStabilizationModes = characteristics.get(CameraCharacteristics.LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION)
- val zoomRange = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R)
- characteristics.get(CameraCharacteristics.CONTROL_ZOOM_RATIO_RANGE)
- else null
- val name = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P)
- characteristics.get(CameraCharacteristics.INFO_VERSION)
- else null
- val fpsRanges = characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES)!!
-
- val supportsHdr = extensionsManager.isExtensionAvailable(cameraSelector, ExtensionMode.HDR)
- val supportsLowLightBoost = extensionsManager.isExtensionAvailable(cameraSelector, ExtensionMode.NIGHT)
- // see https://developer.android.com/reference/android/hardware/camera2/CameraDevice#regular-capture
- val supportsParallelVideoProcessing = hardwareLevel != CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY && hardwareLevel != CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED
-
- val fieldOfView = characteristics.getFieldOfView()
-
- val map = Arguments.createMap()
- map.putString("id", id)
- map.putArray("devices", deviceTypes)
- map.putString("position", parseLensFacing(lensFacing))
- map.putString("name", name ?: "${parseLensFacing(lensFacing)} ($id)")
- map.putBoolean("hasFlash", hasFlash)
- map.putBoolean("hasTorch", hasFlash)
- map.putBoolean("isMultiCam", isMultiCam)
- map.putBoolean("supportsParallelVideoProcessing", supportsParallelVideoProcessing)
- map.putBoolean("supportsRawCapture", supportsRawCapture)
- map.putBoolean("supportsDepthCapture", supportsDepthCapture)
- map.putBoolean("supportsLowLightBoost", supportsLowLightBoost)
- map.putBoolean("supportsFocus", true) // I believe every device here supports focussing
- if (zoomRange != null) {
- map.putDouble("minZoom", zoomRange.lower.toDouble())
- map.putDouble("maxZoom", zoomRange.upper.toDouble())
- } else {
- map.putDouble("minZoom", 1.0)
- map.putDouble("maxZoom", maxScalerZoom.toDouble())
- }
- map.putDouble("neutralZoom", 1.0)
-
- val supportedVideoResolutions: List<Size>
- val cameraInfos = cameraSelector.filter(cameraProvider.availableCameraInfos)
- if (cameraInfos.size > 0) {
- supportedVideoResolutions = QualitySelector
- .getSupportedQualities(cameraInfos[0])
- .map { QualitySelector.getResolution(cameraInfos[0], it)!! }
- } else {
- supportedVideoResolutions = emptyList()
- }
-
- // TODO: Optimize?
- val maxImageOutputSize = cameraConfig.outputFormats
- .flatMap { cameraConfig.getOutputSizes(it).toList() }
- .maxByOrNull { it.width * it.height }!!
-
- val formats = Arguments.createArray()
-
- cameraConfig.outputFormats.forEach { formatId ->
- val formatName = parseImageFormat(formatId)
-
- cameraConfig.getOutputSizes(formatId).forEach { size ->
- val isHighestPhotoQualitySupported = areUltimatelyEqual(size, maxImageOutputSize)
-
- // Get the number of seconds that each frame will take to process
- val secondsPerFrame = try {
- cameraConfig.getOutputMinFrameDuration(formatId, size) / 1_000_000_000.0
- } catch (error: Throwable) {
- Log.e(TAG, "Minimum Frame Duration for MediaRecorder Output cannot be calculated, format \"$formatName\" is not supported.")
- null
- }
-
- val frameRateRanges = Arguments.createArray()
- if (secondsPerFrame != null && secondsPerFrame > 0) {
- val fps = (1.0 / secondsPerFrame).toInt()
- val frameRateRange = Arguments.createMap()
- frameRateRange.putInt("minFrameRate", 1)
- frameRateRange.putInt("maxFrameRate", fps)
- frameRateRanges.pushMap(frameRateRange)
- }
- fpsRanges.forEach { range ->
- val frameRateRange = Arguments.createMap()
- frameRateRange.putInt("minFrameRate", range.lower)
- frameRateRange.putInt("maxFrameRate", range.upper)
- frameRateRanges.pushMap(frameRateRange)
- }
-
- val colorSpaces = Arguments.createArray()
- colorSpaces.pushString(formatName)
-
- val videoStabilizationModes = Arguments.createArray()
- videoStabilizationModes.pushString("off")
- if (digitalStabilizationModes != null) {
- if (digitalStabilizationModes.contains(CameraCharacteristics.CONTROL_VIDEO_STABILIZATION_MODE_ON)) {
- videoStabilizationModes.pushString("auto")
- videoStabilizationModes.pushString("standard")
- }
- }
- if (opticalStabilizationModes != null) {
- if (opticalStabilizationModes.contains(CameraCharacteristics.LENS_OPTICAL_STABILIZATION_MODE_ON)) {
- videoStabilizationModes.pushString("cinematic")
- }
- }
-
- // TODO: Get the pixel format programmatically rather than assuming a default of 420v
- val pixelFormat = "420v"
-
- val format = Arguments.createMap()
- format.putDouble("photoHeight", size.height.toDouble())
- format.putDouble("photoWidth", size.width.toDouble())
- // since supportedVideoResolutions is sorted from highest resolution to lowest,
- // videoResolution will be the highest supported video resolution lower than or equal to photo resolution
- // TODO: Somehow integrate with CamcorderProfileProxy?
- val videoResolution = supportedVideoResolutions.find { it.width <= size.width && it.height <= size.height }
- format.putDouble("videoHeight", videoResolution?.height?.toDouble())
- format.putDouble("videoWidth", videoResolution?.width?.toDouble())
- format.putBoolean("isHighestPhotoQualitySupported", isHighestPhotoQualitySupported)
- format.putInt("maxISO", isoRange?.upper)
- format.putInt("minISO", isoRange?.lower)
- format.putDouble("fieldOfView", fieldOfView) // TODO: Revisit getAvailableCameraDevices (is fieldOfView accurate?)
- format.putDouble("maxZoom", (zoomRange?.upper ?: maxScalerZoom).toDouble())
- format.putArray("colorSpaces", colorSpaces)
- format.putBoolean("supportsVideoHDR", false) // TODO: supportsVideoHDR
- format.putBoolean("supportsPhotoHDR", supportsHdr)
- format.putArray("frameRateRanges", frameRateRanges)
- format.putString("autoFocusSystem", "none") // TODO: Revisit getAvailableCameraDevices (autoFocusSystem) (CameraCharacteristics.CONTROL_AF_AVAILABLE_MODES or CameraCharacteristics.LENS_INFO_FOCUS_DISTANCE_CALIBRATION)
- format.putArray("videoStabilizationModes", videoStabilizationModes)
- format.putString("pixelFormat", pixelFormat)
- formats.pushMap(format)
- }
- }
-
- map.putArray("formats", formats)
- cameraDevices.pushMap(map)
- }
-
- val difference = System.currentTimeMillis() - startTime
- Log.w(TAG, "CameraViewModule::getAvailableCameraDevices took: $difference ms")
- return@withPromise cameraDevices
- }
- }
- }
-
- @ReactMethod
- fun getCameraPermissionStatus(promise: Promise) {
- val status = ContextCompat.checkSelfPermission(reactApplicationContext, Manifest.permission.CAMERA)
- promise.resolve(parsePermissionStatus(status))
- }
-
- @ReactMethod
- fun getMicrophonePermissionStatus(promise: Promise) {
- val status = ContextCompat.checkSelfPermission(reactApplicationContext, Manifest.permission.RECORD_AUDIO)
- promise.resolve(parsePermissionStatus(status))
- }
-
- @ReactMethod
- fun requestCameraPermission(promise: Promise) {
- if (Build.VERSION.SDK_INT < Build.VERSION_CODES.M) {
- // Below Android M (API 23), permissions are granted at install time
- return promise.resolve("authorized")
- }
-
- val activity = reactApplicationContext.currentActivity
- if (activity is PermissionAwareActivity) {
- val currentRequestCode = RequestCode++
- val listener = PermissionListener { requestCode: Int, _: Array<String>, grantResults: IntArray ->
- if (requestCode == currentRequestCode) {
- val permissionStatus = if (grantResults.isNotEmpty()) grantResults[0] else PackageManager.PERMISSION_DENIED
- promise.resolve(parsePermissionStatus(permissionStatus))
- return@PermissionListener true
- }
- return@PermissionListener false
- }
- activity.requestPermissions(arrayOf(Manifest.permission.CAMERA), currentRequestCode, listener)
- } else {
- promise.reject("NO_ACTIVITY", "No PermissionAwareActivity was found! Make sure the app has launched before calling this function.")
- }
- }
-
- @ReactMethod
- fun requestMicrophonePermission(promise: Promise) {
- if (Build.VERSION.SDK_INT < Build.VERSION_CODES.M) {
- // Below Android M (API 23), permissions are granted at install time
- return promise.resolve("authorized")
- }
-
- val activity = reactApplicationContext.currentActivity
- if (activity is PermissionAwareActivity) {
- val currentRequestCode = RequestCode++
- val listener = PermissionListener { requestCode: Int, _: Array<String>, grantResults: IntArray ->
- if (requestCode == currentRequestCode) {
- val permissionStatus = if (grantResults.isNotEmpty()) grantResults[0] else PackageManager.PERMISSION_DENIED
- promise.resolve(parsePermissionStatus(permissionStatus))
- return@PermissionListener true
- }
- return@PermissionListener false
- }
- activity.requestPermissions(arrayOf(Manifest.permission.RECORD_AUDIO), currentRequestCode, listener)
- } else {
- promise.reject("NO_ACTIVITY", "No PermissionAwareActivity was found! Make sure the app has launched before calling this function.")
- }
- }
-}
diff --git a/android/src/main/java/com/mrousavy/camera/Errors.kt b/android/src/main/java/com/mrousavy/camera/Errors.kt
deleted file mode 100644
index 06aefa33..00000000
--- a/android/src/main/java/com/mrousavy/camera/Errors.kt
+++ /dev/null
@@ -1,112 +0,0 @@
-package com.mrousavy.camera
-
-import android.graphics.ImageFormat
-import androidx.camera.video.VideoRecordEvent.Finalize.VideoRecordError
-
-abstract class CameraError(
- /**
- * The domain of the error. Error domains are used to group errors.
- *
- * Example: "permission"
- */
- val domain: String,
- /**
- * The id of the error. Errors are uniquely identified under a given domain.
- *
- * Example: "microphone-permission-denied"
- */
- val id: String,
- /**
- * A detailed error description of "what went wrong".
- *
- * Example: "The microphone permission was denied!"
- */
- message: String,
- /**
- * A throwable that caused this error.
- */
- cause: Throwable? = null
-) : Throwable("[$domain/$id] $message", cause)
-
-val CameraError.code: String
- get() = "$domain/$id"
-
-class MicrophonePermissionError : CameraError("permission", "microphone-permission-denied", "The Microphone permission was denied! If you want to record Video without sound, pass `audio={false}`.")
-class CameraPermissionError : CameraError("permission", "camera-permission-denied", "The Camera permission was denied!")
-
-class InvalidTypeScriptUnionError(unionName: String, unionValue: String) : CameraError("parameter", "invalid-parameter", "The given value for $unionName could not be parsed! (Received: $unionValue)")
-
-class NoCameraDeviceError : CameraError("device", "no-device", "No device was set! Use `getAvailableCameraDevices()` to select a suitable Camera device.")
-class InvalidCameraDeviceError(cause: Throwable) : CameraError("device", "invalid-device", "The given Camera device could not be found for use-case binding!", cause)
-class ParallelVideoProcessingNotSupportedError(cause: Throwable) : CameraError("device", "parallel-video-processing-not-supported", "The given LEGACY Camera device does not support parallel " +
- "video processing (`video={true}` + `frameProcessor={...}`). Disable either `video` or `frameProcessor`. To find out if a device supports parallel video processing, check the `supportsParallelVideoProcessing` property on the CameraDevice. " +
- "See https://react-native-vision-camera.com/docs/guides/devices#the-supportsparallelvideoprocessing-prop for more information.", cause)
-
-class FpsNotContainedInFormatError(fps: Int) : CameraError("format", "invalid-fps", "The given FPS were not valid for the currently selected format. Make sure you select a format which `frameRateRanges` includes $fps FPS!")
-class HdrNotContainedInFormatError() : CameraError(
- "format", "invalid-hdr",
- "The currently selected format does not support HDR capture! " +
- "Make sure you select a format which `frameRateRanges` includes `supportsPhotoHDR`!"
-)
-class LowLightBoostNotContainedInFormatError() : CameraError(
- "format", "invalid-low-light-boost",
- "The currently selected format does not support low-light boost (night mode)! " +
- "Make sure you select a format which includes `supportsLowLightBoost`."
-)
-
-class CameraNotReadyError : CameraError("session", "camera-not-ready", "The Camera is not ready yet! Wait for the onInitialized() callback!")
-
-class VideoNotEnabledError : CameraError("capture", "video-not-enabled", "Video capture is disabled! Pass `video={true}` to enable video recordings.")
-class PhotoNotEnabledError : CameraError("capture", "photo-not-enabled", "Photo capture is disabled! Pass `photo={true}` to enable photo capture.")
-
-class InvalidFormatError(format: Int) : CameraError("capture", "invalid-photo-format", "The Photo has an invalid format! Expected ${ImageFormat.YUV_420_888}, actual: $format")
-
-class VideoEncoderError(cause: Throwable?) : CameraError("capture", "encoder-error", "The recording failed while encoding.\n" +
- "This error may be generated when the video or audio codec encounters an error during encoding. " +
- "When this happens and the output file is generated, the output file is not properly constructed. " +
- "The application will need to clean up the output file, such as deleting the file.",
- cause)
-
-class InvalidVideoOutputOptionsError(cause: Throwable?) : CameraError("capture", "invalid-video-options",
- "The recording failed due to invalid output options.\n" +
- "This error is generated when invalid output options have been used while preparing a recording",
- cause)
-
-class RecorderError(cause: Throwable?) : CameraError("capture", "recorder-error",
- "The recording failed because the Recorder is in an unrecoverable error state.\n" +
- "When this happens and the output file is generated, the output file is not properly constructed. " +
- "The application will need to clean up the output file, such as deleting the file. " +
- "Such an error will usually require creating a new Recorder object to start a new recording.",
- cause)
-
-class NoValidDataError(cause: Throwable?) : CameraError("capture", "no-valid-data",
- "The recording failed because no valid data was produced to be recorded.\n" +
- "This error is generated when the essential data for a recording to be played correctly is missing, for example, " +
- "a recording must contain at least one key frame. The application will need to clean up the output file, such as deleting the file.",
- cause)
-
-class InactiveSourceError(cause: Throwable?) : CameraError("capture", "inactive-source",
- "The recording failed because the source becomes inactive and stops sending frames.\n" +
- "One case is that if camera is closed due to lifecycle stopped, the active recording will be finalized with this error, " +
- "and the output will be generated, containing the frames produced before camera closing. " +
- "Attempting to start a new recording will be finalized immediately if the source remains inactive and no output will be generated.",
- cause)
-
-class InsufficientStorageError(cause: Throwable?) : CameraError("capture", "insufficient-storage",
- "The recording failed due to insufficient storage space.\n" +
- "There are two possible cases that will cause this error.\n" +
- "1. The storage is already full before the recording starts, so no output file will be generated.\n" +
- "2. The storage becomes full during recording, so the output file will be generated.",
- cause)
-
-class FileSizeLimitReachedError(cause: Throwable?) : CameraError("capture", "file-size-limit-reached",
- "The recording failed due to file size limitation.\n" +
- "The file size limitation will refer to OutputOptions.getFileSizeLimit(). The output file will still be generated with this error.",
- cause)
-
-class NoRecordingInProgressError : CameraError("capture", "no-recording-in-progress", "No active recording in progress!")
-
-class CameraManagerUnavailableError : CameraError("system", "no-camera-manager", "The Camera manager instance was unavailable for the current Application!")
-class ViewNotFoundError(viewId: Int) : CameraError("system", "view-not-found", "The given view (ID $viewId) was not found in the view manager.")
-
- class UnknownCameraError(cause: Throwable?) : CameraError("unknown", "unknown", cause?.message ?: "An unknown camera error occurred.", cause)
diff --git a/android/src/main/java/com/mrousavy/camera/frameprocessor/FrameProcessorPerformanceDataCollector.kt b/android/src/main/java/com/mrousavy/camera/frameprocessor/FrameProcessorPerformanceDataCollector.kt
deleted file mode 100644
index 80cb4281..00000000
--- a/android/src/main/java/com/mrousavy/camera/frameprocessor/FrameProcessorPerformanceDataCollector.kt
+++ /dev/null
@@ -1,38 +0,0 @@
-package com.mrousavy.camera.frameprocessor
-
-data class PerformanceSampleCollection(val endPerformanceSampleCollection: () -> Unit)
-
-// keep a maximum of `maxSampleSize` historical performance data samples cached.
-private const val maxSampleSize = 15
-
-class FrameProcessorPerformanceDataCollector {
- private var counter = 0
- private var performanceSamples: ArrayList<Double> = ArrayList()
-
- val averageExecutionTimeSeconds: Double
- get() = performanceSamples.average()
-
- fun beginPerformanceSampleCollection(): PerformanceSampleCollection {
- val begin = System.currentTimeMillis()
-
- return PerformanceSampleCollection {
- val end = System.currentTimeMillis()
- val seconds = (end - begin) / 1_000.0
-
- val index = counter % maxSampleSize
-
- if (performanceSamples.size > index) {
- performanceSamples[index] = seconds
- } else {
- performanceSamples.add(seconds)
- }
-
- counter++
- }
- }
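- // Usage (as in CameraView's analyzer): val sample = beginPerformanceSampleCollection(),
- // run the frame processor, then sample.endPerformanceSampleCollection() stores the elapsed seconds.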
-
- fun clear() {
- counter = 0
- performanceSamples.clear()
- }
-}
diff --git a/android/src/main/java/com/mrousavy/camera/frameprocessor/FrameProcessorPlugin.java b/android/src/main/java/com/mrousavy/camera/frameprocessor/FrameProcessorPlugin.java
deleted file mode 100644
index 8b733451..00000000
--- a/android/src/main/java/com/mrousavy/camera/frameprocessor/FrameProcessorPlugin.java
+++ /dev/null
@@ -1,53 +0,0 @@
-package com.mrousavy.camera.frameprocessor;
-
-import androidx.annotation.Keep;
-import androidx.annotation.NonNull;
-import androidx.annotation.Nullable;
-import androidx.camera.core.ImageProxy;
-import com.facebook.proguard.annotations.DoNotStrip;
-
-/**
- * Declares a Frame Processor Plugin.
- */
-@DoNotStrip
-@Keep
-public abstract class FrameProcessorPlugin {
- private final @NonNull String mName;
-
- /**
- * The actual Frame Processor plugin callback. Called for every frame the ImageAnalyzer receives.
- * @param image The CameraX ImageProxy. Don't call .close() on this, as VisionCamera handles that.
- * @return You can return any primitive, map or array you want. See the
- * "Types" table in the VisionCamera Frame Processors documentation
- * for a list of supported types.
- */
- @DoNotStrip
- @Keep
- public abstract @Nullable Object callback(@NonNull ImageProxy image, @NonNull Object[] params);
-
- /**
- * Initializes the native plugin part.
- * @param name Specifies the Frame Processor Plugin's name in the Runtime.
- * The actual name in the JS Runtime will be prefixed with two underscores (`__`)
- */
- protected FrameProcessorPlugin(@NonNull String name) {
- mName = name;
- }
-
- /**
- * Get the user-defined name of the Frame Processor Plugin.
- */
- @DoNotStrip
- @Keep
- public @NonNull String getName() {
- return mName;
- }
-
- /**
- * Registers the given plugin in the Frame Processor Runtime.
- * @param plugin An instance of a plugin.
- */
- public static void register(@NonNull FrameProcessorPlugin plugin) {
- FrameProcessorRuntimeManager.Companion.getPlugins().add(plugin);
- }
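- // Example (hypothetical plugin class): FrameProcessorPlugin.register(new MyPlugin()); where MyPlugin's
- // constructor calls super("myPlugin"), making it callable from a Frame Processor as `__myPlugin(frame)`.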
-}
diff --git a/android/src/main/java/com/mrousavy/camera/frameprocessor/FrameProcessorRuntimeManager.kt b/android/src/main/java/com/mrousavy/camera/frameprocessor/FrameProcessorRuntimeManager.kt
deleted file mode 100644
index ebad5f6f..00000000
--- a/android/src/main/java/com/mrousavy/camera/frameprocessor/FrameProcessorRuntimeManager.kt
+++ /dev/null
@@ -1,79 +0,0 @@
-package com.mrousavy.camera.frameprocessor
-
-import android.util.Log
-import androidx.annotation.Keep
-import com.facebook.jni.HybridData
-import com.facebook.proguard.annotations.DoNotStrip
-import com.facebook.react.bridge.ReactApplicationContext
-import com.facebook.react.turbomodule.core.CallInvokerHolderImpl
-import com.facebook.react.uimanager.UIManagerHelper
-import com.mrousavy.camera.CameraView
-import com.mrousavy.camera.ViewNotFoundError
-import java.lang.ref.WeakReference
-import java.util.concurrent.ExecutorService
-
-@Suppress("KotlinJniMissingFunction") // I use fbjni, Android Studio is not smart enough to realize that.
-class FrameProcessorRuntimeManager(context: ReactApplicationContext, frameProcessorThread: ExecutorService) {
- companion object {
- const val TAG = "FrameProcessorRuntime"
- val Plugins: ArrayList<FrameProcessorPlugin> = ArrayList()
- var enableFrameProcessors = true
-
- init {
- try {
- System.loadLibrary("reanimated")
- System.loadLibrary("VisionCamera")
- } catch (e: UnsatisfiedLinkError) {
- Log.w(TAG, "Failed to load Reanimated/VisionCamera C++ library. Frame Processors are disabled!")
- enableFrameProcessors = false
- }
- }
- }
-
- @DoNotStrip
- private var mHybridData: HybridData? = null
- private var mContext: WeakReference<ReactApplicationContext>? = null
- private var mScheduler: VisionCameraScheduler? = null
-
- init {
- if (enableFrameProcessors) {
- val holder = context.catalystInstance.jsCallInvokerHolder as CallInvokerHolderImpl
- mScheduler = VisionCameraScheduler(frameProcessorThread)
- mContext = WeakReference(context)
- mHybridData = initHybrid(context.javaScriptContextHolder.get(), holder, mScheduler!!)
- initializeRuntime()
-
- Log.i(TAG, "Installing Frame Processor Plugins...")
- Plugins.forEach { plugin ->
- registerPlugin(plugin)
- }
- Log.i(TAG, "Successfully installed ${Plugins.count()} Frame Processor Plugins!")
-
- Log.i(TAG, "Installing JSI Bindings on JS Thread...")
- context.runOnJSQueueThread {
- installJSIBindings()
- }
- }
- }
-
- @Suppress("unused")
- @DoNotStrip
- @Keep
- fun findCameraViewById(viewId: Int): CameraView {
- Log.d(TAG, "Finding view $viewId...")
- val ctx = mContext?.get()
- val view = if (ctx != null) UIManagerHelper.getUIManager(ctx, viewId)?.resolveView(viewId) as CameraView? else null
- Log.d(TAG, if (view != null) "Found view $viewId!" else "Couldn't find view $viewId!")
- return view ?: throw ViewNotFoundError(viewId)
- }
-
- // private C++ funcs
- private external fun initHybrid(
- jsContext: Long,
- jsCallInvokerHolder: CallInvokerHolderImpl,
- scheduler: VisionCameraScheduler
- ): HybridData
- private external fun initializeRuntime()
- private external fun registerPlugin(plugin: FrameProcessorPlugin)
- private external fun installJSIBindings()
-}
diff --git a/android/src/main/java/com/mrousavy/camera/frameprocessor/ImageProxyUtils.java b/android/src/main/java/com/mrousavy/camera/frameprocessor/ImageProxyUtils.java
deleted file mode 100644
index e03a55a7..00000000
--- a/android/src/main/java/com/mrousavy/camera/frameprocessor/ImageProxyUtils.java
+++ /dev/null
@@ -1,42 +0,0 @@
-package com.mrousavy.camera.frameprocessor;
-
-import android.annotation.SuppressLint;
-import android.media.Image;
-
-import androidx.annotation.Keep;
-import androidx.camera.core.ImageProxy;
-import com.facebook.proguard.annotations.DoNotStrip;
-
-@SuppressWarnings("unused") // used through JNI
-@DoNotStrip
-@Keep
-public class ImageProxyUtils {
- @SuppressLint("UnsafeOptInUsageError")
- @DoNotStrip
- @Keep
- public static boolean isImageProxyValid(ImageProxy imageProxy) {
- try {
- Image image = imageProxy.getImage();
- if (image == null) return false;
- // will throw an exception if the image is already closed
- imageProxy.getImage().getCropRect();
- // no exception thrown, image must still be valid.
- return true;
- } catch (Exception e) {
- // exception thrown, image has already been closed.
- return false;
- }
- }
-
- @DoNotStrip
- @Keep
- public static int getPlanesCount(ImageProxy imageProxy) {
- return imageProxy.getPlanes().length;
- }
-
- @DoNotStrip
- @Keep
- public static int getBytesPerRow(ImageProxy imageProxy) {
- return imageProxy.getPlanes()[0].getRowStride();
- }
-}
diff --git a/android/src/main/java/com/mrousavy/camera/parsers/ImageFormat+String.kt b/android/src/main/java/com/mrousavy/camera/parsers/ImageFormat+String.kt
deleted file mode 100644
index b37b3465..00000000
--- a/android/src/main/java/com/mrousavy/camera/parsers/ImageFormat+String.kt
+++ /dev/null
@@ -1,48 +0,0 @@
-package com.mrousavy.camera.parsers
-
-import android.graphics.ImageFormat
-
-/**
- * Parses ImageFormat/PixelFormat int to a string representation useable for the TypeScript types.
- */
-fun parseImageFormat(imageFormat: Int): String {
- return when (imageFormat) {
- ImageFormat.YUV_420_888 -> "yuv"
- ImageFormat.YUV_422_888 -> "yuv"
- ImageFormat.YUV_444_888 -> "yuv"
- ImageFormat.JPEG -> "jpeg"
- ImageFormat.DEPTH_JPEG -> "jpeg-depth"
- ImageFormat.RAW_SENSOR -> "raw"
- ImageFormat.RAW_PRIVATE -> "raw"
- ImageFormat.HEIC -> "heic"
- ImageFormat.PRIVATE -> "private"
- ImageFormat.DEPTH16 -> "depth-16"
- else -> "unknown"
- /*
- ImageFormat.UNKNOWN -> "TODOFILL"
- ImageFormat.RGB_565 -> "TODOFILL"
- ImageFormat.YV12 -> "TODOFILL"
- ImageFormat.Y8 -> "TODOFILL"
- ImageFormat.NV16 -> "TODOFILL"
- ImageFormat.NV21 -> "TODOFILL"
- ImageFormat.YUY2 -> "TODOFILL"
- ImageFormat.FLEX_RGB_888 -> "TODOFILL"
- ImageFormat.FLEX_RGBA_8888 -> "TODOFILL"
- ImageFormat.RAW10 -> "TODOFILL"
- ImageFormat.RAW12 -> "TODOFILL"
- ImageFormat.DEPTH_POINT_CLOUD -> "TODOFILL"
- @Suppress("DUPLICATE_LABEL_IN_WHEN")
- PixelFormat.UNKNOWN -> "TODOFILL"
- PixelFormat.TRANSPARENT -> "TODOFILL"
- PixelFormat.TRANSLUCENT -> "TODOFILL"
- PixelFormat.RGBX_8888 -> "TODOFILL"
- PixelFormat.RGBA_F16 -> "TODOFILL"
- PixelFormat.RGBA_8888 -> "TODOFILL"
- PixelFormat.RGBA_1010102 -> "TODOFILL"
- PixelFormat.OPAQUE -> "TODOFILL"
- @Suppress("DUPLICATE_LABEL_IN_WHEN")
- PixelFormat.RGB_565 -> "TODOFILL"
- PixelFormat.RGB_888 -> "TODOFILL"
- */
- }
-}
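-// e.g. parseImageFormat(ImageFormat.YUV_420_888) == "yuv", parseImageFormat(ImageFormat.JPEG) == "jpeg".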
diff --git a/android/src/main/java/com/mrousavy/camera/parsers/LenseFacing+String.kt b/android/src/main/java/com/mrousavy/camera/parsers/LenseFacing+String.kt
deleted file mode 100644
index 335ed6e3..00000000
--- a/android/src/main/java/com/mrousavy/camera/parsers/LenseFacing+String.kt
+++ /dev/null
@@ -1,15 +0,0 @@
-package com.mrousavy.camera.parsers
-
-import android.hardware.camera2.CameraCharacteristics
-
-/**
- * Parses Lens Facing int to a string representation useable for the TypeScript types.
- */
-fun parseLensFacing(lensFacing: Int?): String? {
- return when (lensFacing) {
- CameraCharacteristics.LENS_FACING_BACK -> "back"
- CameraCharacteristics.LENS_FACING_FRONT -> "front"
- CameraCharacteristics.LENS_FACING_EXTERNAL -> "external"
- else -> null
- }
-}
diff --git a/android/src/main/java/com/mrousavy/camera/parsers/Size+easy.kt b/android/src/main/java/com/mrousavy/camera/parsers/Size+easy.kt
deleted file mode 100644
index c80d7120..00000000
--- a/android/src/main/java/com/mrousavy/camera/parsers/Size+easy.kt
+++ /dev/null
@@ -1,20 +0,0 @@
-package com.mrousavy.camera.parsers
-
-import android.util.Size
-import android.util.SizeF
-import kotlin.math.max
-import kotlin.math.min
-
-val Size.bigger: Int
- get() = max(this.width, this.height)
-val Size.smaller: Int
- get() = min(this.width, this.height)
-
-val SizeF.bigger: Float
- get() = max(this.width, this.height)
-val SizeF.smaller: Float
- get() = min(this.width, this.height)
-
-fun areUltimatelyEqual(size1: Size, size2: Size): Boolean {
- return size1.width * size1.height == size2.width * size2.height
-}
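-// e.g. areUltimatelyEqual(Size(4032, 3024), Size(3024, 4032)) == true, since both describe the same pixel count.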
diff --git a/android/src/main/java/com/mrousavy/camera/utils/AspectRatio.kt b/android/src/main/java/com/mrousavy/camera/utils/AspectRatio.kt
deleted file mode 100644
index 8fb366aa..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/AspectRatio.kt
+++ /dev/null
@@ -1,28 +0,0 @@
-package com.mrousavy.camera.utils
-
-import androidx.camera.core.AspectRatio
-import kotlin.math.abs
-import kotlin.math.max
-import kotlin.math.min
-
-private const val RATIO_4_3_VALUE = 4.0 / 3.0
-private const val RATIO_16_9_VALUE = 16.0 / 9.0
-
-/**
- * [androidx.camera.core.ImageAnalysisConfig] requires enum value of
- * [androidx.camera.core.AspectRatio]. Currently it has values of 4:3 & 16:9.
- *
- * Detects the most suitable ratio for the dimensions provided in @params by comparing the absolute
- * difference between the preview ratio and each of the provided values.
- *
- * @param width - preview width
- * @param height - preview height
- * @return suitable aspect ratio
- */
-fun aspectRatio(width: Int, height: Int): Int {
- val previewRatio = max(width, height).toDouble() / min(width, height)
- if (abs(previewRatio - RATIO_4_3_VALUE) <= abs(previewRatio - RATIO_16_9_VALUE)) {
- return AspectRatio.RATIO_4_3
- }
- return AspectRatio.RATIO_16_9
-}
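-// e.g. aspectRatio(1080, 1920): previewRatio = 1920 / 1080 ≈ 1.78, which is closer to 16/9 (≈ 1.78)
-// than to 4/3 (≈ 1.33), so RATIO_16_9 is returned.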
diff --git a/android/src/main/java/com/mrousavy/camera/utils/CameraCharacteristicsUtils.kt b/android/src/main/java/com/mrousavy/camera/utils/CameraCharacteristicsUtils.kt
deleted file mode 100644
index f9a39f5f..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/CameraCharacteristicsUtils.kt
+++ /dev/null
@@ -1,58 +0,0 @@
-package com.mrousavy.camera.utils
-
-import android.hardware.camera2.CameraCharacteristics
-import android.util.Size
-import com.facebook.react.bridge.Arguments
-import com.facebook.react.bridge.ReadableArray
-import com.mrousavy.camera.parsers.bigger
-import kotlin.math.PI
-import kotlin.math.atan
-
-// 35mm is 135 film format, a standard in which focal lengths are usually measured
-val Size35mm = Size(36, 24)
-
-/**
- * Convert a given array of focal lengths to the corresponding TypeScript union type name.
- *
- * Possible values for single cameras:
- * * `"wide-angle-camera"`
- * * `"ultra-wide-angle-camera"`
- * * `"telephoto-camera"`
- *
- * Sources for the focal length categories:
- * * [Telephoto Lens (wikipedia)](https://en.wikipedia.org/wiki/Telephoto_lens)
- * * [Normal Lens (wikipedia)](https://en.wikipedia.org/wiki/Normal_lens)
- * * [Wide-Angle Lens (wikipedia)](https://en.wikipedia.org/wiki/Wide-angle_lens)
- * * [Ultra-Wide-Angle Lens (wikipedia)](https://en.wikipedia.org/wiki/Ultra_wide_angle_lens)
- */
-fun CameraCharacteristics.getDeviceTypes(): ReadableArray {
- // TODO: Check if getDeviceType() works correctly, even for logical multi-cameras
- val focalLengths = this.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS)!!
- val sensorSize = this.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE)!!
-
- // To get valid focal length standards we have to upscale to the 35mm measurement (film standard)
- val cropFactor = Size35mm.bigger / sensorSize.bigger
-
- val deviceTypes = Arguments.createArray()
-
- val containsTelephoto = focalLengths.any { l -> (l * cropFactor) > 35 } // TODO: Telephoto lenses are > 85mm, but we don't have anything between that range..
- // val containsNormalLens = focalLengths.any { l -> (l * cropFactor) > 35 && (l * cropFactor) <= 55 }
- val containsWideAngle = focalLengths.any { l -> (l * cropFactor) >= 24 && (l * cropFactor) <= 35 }
- val containsUltraWideAngle = focalLengths.any { l -> (l * cropFactor) < 24 }
-
- if (containsTelephoto)
- deviceTypes.pushString("telephoto-camera")
- if (containsWideAngle)
- deviceTypes.pushString("wide-angle-camera")
- if (containsUltraWideAngle)
- deviceTypes.pushString("ultra-wide-angle-camera")
-
- return deviceTypes
-}
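-// Example (assumed values): a 4.25 mm lens on a sensor ~6.3 mm along its longer side has a crop factor of
-// 36 / 6.3 ≈ 5.7, i.e. a ~24 mm 35mm-equivalent focal length, so it is classified as "wide-angle-camera".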
-
-fun CameraCharacteristics.getFieldOfView(): Double {
- val focalLengths = this.get(CameraCharacteristics.LENS_INFO_AVAILABLE_FOCAL_LENGTHS)!!
- val sensorSize = this.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE)!!
-
- return 2 * atan(sensorSize.bigger / (focalLengths[0] * 2)) * (180 / PI)
-}
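-// Example (same assumed values): 2 * atan(6.3 / (4.25 * 2)) * (180 / PI) ≈ 73°, the field of view along the sensor's longer axis.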
diff --git a/android/src/main/java/com/mrousavy/camera/utils/CameraSelector+byID.kt b/android/src/main/java/com/mrousavy/camera/utils/CameraSelector+byID.kt
deleted file mode 100644
index 4bc2a0c2..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/CameraSelector+byID.kt
+++ /dev/null
@@ -1,25 +0,0 @@
-package com.mrousavy.camera.utils
-
-import android.annotation.SuppressLint
-import androidx.camera.camera2.interop.Camera2CameraInfo
-import androidx.camera.core.CameraSelector
-import java.lang.IllegalArgumentException
-
-/**
- * Create a new [CameraSelector] which selects the camera with the given [cameraId]
- */
-@SuppressLint("UnsafeOptInUsageError")
-fun CameraSelector.Builder.byID(cameraId: String): CameraSelector.Builder {
- return this.addCameraFilter { cameras ->
- cameras.filter { cameraInfoX ->
- try {
- val cameraInfo = Camera2CameraInfo.from(cameraInfoX)
- return@filter cameraInfo.cameraId == cameraId
- } catch (e: IllegalArgumentException) {
- // Occurs when the [cameraInfoX] is not castable to a Camera2 Info object.
- // We can ignore this error because the [getAvailableCameraDevices()] func only returns Camera2 devices.
- return@filter false
- }
- }
- }
-}
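-// Usage sketch: CameraSelector.Builder().byID("0").build() selects the Camera2 device with ID "0";
-// this is how CameraView builds its cameraSelector from the `cameraId` prop.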
diff --git a/android/src/main/java/com/mrousavy/camera/utils/Context+displayRotation.kt b/android/src/main/java/com/mrousavy/camera/utils/Context+displayRotation.kt
deleted file mode 100644
index 92cde740..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/Context+displayRotation.kt
+++ /dev/null
@@ -1,36 +0,0 @@
-package com.mrousavy.camera.utils
-
-import android.content.Context
-import android.os.Build
-import android.view.Surface
-import android.view.WindowManager
-import com.facebook.react.bridge.ReactContext
-
-val Context.displayRotation: Int
- get() {
- if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
- // Context.display
- this.display?.let { display ->
- return display.rotation
- }
-
- // ReactContext.currentActivity.display
- if (this is ReactContext) {
- currentActivity?.display?.let { display ->
- return display.rotation
- }
- }
- }
-
- // WindowManager.defaultDisplay
- val windowManager = getSystemService(Context.WINDOW_SERVICE) as? WindowManager
- if (windowManager != null) {
- @Suppress("DEPRECATION") // deprecated since SDK 30
- windowManager.defaultDisplay?.let { display ->
- return display.rotation
- }
- }
-
- // 0
- return Surface.ROTATION_0
- }
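-// e.g. this returns Surface.ROTATION_0 in the device's natural portrait orientation,
-// and Surface.ROTATION_90 when the device is rotated 90° counter-clockwise from it.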
diff --git a/android/src/main/java/com/mrousavy/camera/utils/DeviceFormat.kt b/android/src/main/java/com/mrousavy/camera/utils/DeviceFormat.kt
deleted file mode 100644
index 3364d8c5..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/DeviceFormat.kt
+++ /dev/null
@@ -1,33 +0,0 @@
-package com.mrousavy.camera.utils
-
-import android.util.Range
-import android.util.Size
-import com.facebook.react.bridge.ReadableMap
-
-class DeviceFormat(map: ReadableMap) {
- val frameRateRanges: List<Range<Int>>
- val photoSize: Size
- val videoSize: Size
-
- init {
- frameRateRanges = map.getArray("frameRateRanges")!!.toArrayList().map { range ->
- if (range is HashMap<*, *>)
- rangeFactory(range["minFrameRate"], range["maxFrameRate"])
- else
- throw IllegalArgumentException("DeviceFormat: frameRateRanges contained a Range that was not of type HashMap<*,*>! Actual Type: ${range?.javaClass?.name}")
- }
- photoSize = Size(map.getInt("photoWidth"), map.getInt("photoHeight"))
- videoSize = Size(map.getInt("videoWidth"), map.getInt("videoHeight"))
- }
-}
-
-fun rangeFactory(minFrameRate: Any?, maxFrameRate: Any?): Range<Int> {
- return when (minFrameRate) {
- is Int -> Range(minFrameRate, maxFrameRate as Int)
- is Double -> Range(minFrameRate.toInt(), (maxFrameRate as Double).toInt())
- else -> throw IllegalArgumentException(
- "DeviceFormat: frameRateRanges contained a Range that didn't have minFrameRate/maxFrameRate of types Int/Double! " +
- "Actual Type: ${minFrameRate?.javaClass?.name} & ${maxFrameRate?.javaClass?.name}"
- )
- }
-}
diff --git a/android/src/main/java/com/mrousavy/camera/utils/ExifInterface+buildMetadataMap.kt b/android/src/main/java/com/mrousavy/camera/utils/ExifInterface+buildMetadataMap.kt
deleted file mode 100644
index e4a8c45a..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/ExifInterface+buildMetadataMap.kt
+++ /dev/null
@@ -1,62 +0,0 @@
-package com.mrousavy.camera.utils
-
-import androidx.exifinterface.media.ExifInterface
-import com.facebook.react.bridge.Arguments
-import com.facebook.react.bridge.WritableMap
-
-fun ExifInterface.buildMetadataMap(): WritableMap {
- val metadataMap = Arguments.createMap()
- metadataMap.putInt("Orientation", this.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL))
-
- val tiffMap = Arguments.createMap()
- tiffMap.putInt("ResolutionUnit", this.getAttributeInt(ExifInterface.TAG_RESOLUTION_UNIT, 0))
- tiffMap.putString("Software", this.getAttribute(ExifInterface.TAG_SOFTWARE))
- tiffMap.putString("Make", this.getAttribute(ExifInterface.TAG_MAKE))
- tiffMap.putString("DateTime", this.getAttribute(ExifInterface.TAG_DATETIME))
- tiffMap.putDouble("XResolution", this.getAttributeDouble(ExifInterface.TAG_X_RESOLUTION, 0.0))
- tiffMap.putString("Model", this.getAttribute(ExifInterface.TAG_MODEL))
- tiffMap.putDouble("YResolution", this.getAttributeDouble(ExifInterface.TAG_Y_RESOLUTION, 0.0))
- metadataMap.putMap("{TIFF}", tiffMap)
-
- val exifMap = Arguments.createMap()
- exifMap.putString("DateTimeOriginal", this.getAttribute(ExifInterface.TAG_DATETIME_ORIGINAL))
- exifMap.putDouble("ExposureTime", this.getAttributeDouble(ExifInterface.TAG_EXPOSURE_TIME, 0.0))
- exifMap.putDouble("FNumber", this.getAttributeDouble(ExifInterface.TAG_F_NUMBER, 0.0))
- val lensSpecificationArray = Arguments.createArray()
- this.getAttributeRange(ExifInterface.TAG_LENS_SPECIFICATION)?.forEach { lensSpecificationArray.pushInt(it.toInt()) }
- exifMap.putArray("LensSpecification", lensSpecificationArray)
- exifMap.putDouble("ExposureBiasValue", this.getAttributeDouble(ExifInterface.TAG_EXPOSURE_BIAS_VALUE, 0.0))
- exifMap.putInt("ColorSpace", this.getAttributeInt(ExifInterface.TAG_COLOR_SPACE, ExifInterface.COLOR_SPACE_S_RGB))
- exifMap.putInt("FocalLenIn35mmFilm", this.getAttributeInt(ExifInterface.TAG_FOCAL_LENGTH_IN_35MM_FILM, 0))
- exifMap.putDouble("BrightnessValue", this.getAttributeDouble(ExifInterface.TAG_BRIGHTNESS_VALUE, 0.0))
- exifMap.putInt("ExposureMode", this.getAttributeInt(ExifInterface.TAG_EXPOSURE_MODE, ExifInterface.EXPOSURE_MODE_AUTO.toInt()))
- exifMap.putString("LensModel", this.getAttribute(ExifInterface.TAG_LENS_MODEL))
- exifMap.putInt("SceneType", this.getAttributeInt(ExifInterface.TAG_SCENE_TYPE, ExifInterface.SCENE_TYPE_DIRECTLY_PHOTOGRAPHED.toInt()))
- exifMap.putInt("PixelXDimension", this.getAttributeInt(ExifInterface.TAG_PIXEL_X_DIMENSION, 0))
- exifMap.putDouble("ShutterSpeedValue", this.getAttributeDouble(ExifInterface.TAG_SHUTTER_SPEED_VALUE, 0.0))
- exifMap.putInt("SensingMethod", this.getAttributeInt(ExifInterface.TAG_SENSING_METHOD, ExifInterface.SENSOR_TYPE_NOT_DEFINED.toInt()))
- val subjectAreaArray = Arguments.createArray()
- this.getAttributeRange(ExifInterface.TAG_SUBJECT_AREA)?.forEach { subjectAreaArray.pushInt(it.toInt()) }
- exifMap.putArray("SubjectArea", subjectAreaArray)
- exifMap.putDouble("ApertureValue", this.getAttributeDouble(ExifInterface.TAG_APERTURE_VALUE, 0.0))
- exifMap.putString("SubsecTimeDigitized", this.getAttribute(ExifInterface.TAG_SUBSEC_TIME_DIGITIZED))
- exifMap.putDouble("FocalLength", this.getAttributeDouble(ExifInterface.TAG_FOCAL_LENGTH, 0.0))
- exifMap.putString("LensMake", this.getAttribute(ExifInterface.TAG_LENS_MAKE))
- exifMap.putString("SubsecTimeOriginal", this.getAttribute(ExifInterface.TAG_SUBSEC_TIME_ORIGINAL))
- exifMap.putString("OffsetTimeDigitized", this.getAttribute(ExifInterface.TAG_OFFSET_TIME_DIGITIZED))
- exifMap.putInt("PixelYDimension", this.getAttributeInt(ExifInterface.TAG_PIXEL_Y_DIMENSION, 0))
- val isoSpeedRatingsArray = Arguments.createArray()
- this.getAttributeRange(ExifInterface.TAG_PHOTOGRAPHIC_SENSITIVITY)?.forEach { isoSpeedRatingsArray.pushInt(it.toInt()) }
- exifMap.putArray("ISOSpeedRatings", isoSpeedRatingsArray)
- exifMap.putInt("WhiteBalance", this.getAttributeInt(ExifInterface.TAG_WHITE_BALANCE, 0))
- exifMap.putString("DateTimeDigitized", this.getAttribute(ExifInterface.TAG_DATETIME_DIGITIZED))
- exifMap.putString("OffsetTimeOriginal", this.getAttribute(ExifInterface.TAG_OFFSET_TIME_ORIGINAL))
- exifMap.putString("ExifVersion", this.getAttribute(ExifInterface.TAG_EXIF_VERSION))
- exifMap.putString("OffsetTime", this.getAttribute(ExifInterface.TAG_OFFSET_TIME))
- exifMap.putInt("Flash", this.getAttributeInt(ExifInterface.TAG_FLASH, ExifInterface.FLAG_FLASH_FIRED.toInt()))
- exifMap.putInt("ExposureProgram", this.getAttributeInt(ExifInterface.TAG_EXPOSURE_PROGRAM, ExifInterface.EXPOSURE_PROGRAM_NOT_DEFINED.toInt()))
- exifMap.putInt("MeteringMode", this.getAttributeInt(ExifInterface.TAG_METERING_MODE, ExifInterface.METERING_MODE_UNKNOWN.toInt()))
- metadataMap.putMap("{Exif}", exifMap)
-
- return metadataMap
-}
diff --git a/android/src/main/java/com/mrousavy/camera/utils/ImageCapture+suspendables.kt b/android/src/main/java/com/mrousavy/camera/utils/ImageCapture+suspendables.kt
deleted file mode 100644
index 5bbe16bd..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/ImageCapture+suspendables.kt
+++ /dev/null
@@ -1,41 +0,0 @@
-package com.mrousavy.camera.utils
-
-import androidx.camera.core.ImageCapture
-import androidx.camera.core.ImageCaptureException
-import androidx.camera.core.ImageProxy
-import java.util.concurrent.Executor
-import kotlin.coroutines.resume
-import kotlin.coroutines.resumeWithException
-import kotlin.coroutines.suspendCoroutine
-
-suspend inline fun ImageCapture.takePicture(options: ImageCapture.OutputFileOptions, executor: Executor) = suspendCoroutine<ImageCapture.OutputFileResults> { cont ->
- this.takePicture(
- options, executor,
- object : ImageCapture.OnImageSavedCallback {
- override fun onImageSaved(outputFileResults: ImageCapture.OutputFileResults) {
- cont.resume(outputFileResults)
- }
-
- override fun onError(exception: ImageCaptureException) {
- cont.resumeWithException(exception)
- }
- }
- )
-}
-
-suspend inline fun ImageCapture.takePicture(executor: Executor) = suspendCoroutine<ImageProxy> { cont ->
- this.takePicture(
- executor,
- object : ImageCapture.OnImageCapturedCallback() {
- override fun onCaptureSuccess(image: ImageProxy) {
- super.onCaptureSuccess(image)
- cont.resume(image)
- }
-
- override fun onError(exception: ImageCaptureException) {
- super.onError(exception)
- cont.resumeWithException(exception)
- }
- }
- )
-}
diff --git a/android/src/main/java/com/mrousavy/camera/utils/ImageProxy+isRaw.kt b/android/src/main/java/com/mrousavy/camera/utils/ImageProxy+isRaw.kt
deleted file mode 100644
index b86e6c6d..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/ImageProxy+isRaw.kt
+++ /dev/null
@@ -1,12 +0,0 @@
-package com.mrousavy.camera.utils
-
-import android.graphics.ImageFormat
-import androidx.camera.core.ImageProxy
-
-val ImageProxy.isRaw: Boolean
- get() {
- return when (format) {
- ImageFormat.RAW_SENSOR, ImageFormat.RAW10, ImageFormat.RAW12, ImageFormat.RAW_PRIVATE -> true
- else -> false
- }
- }
diff --git a/android/src/main/java/com/mrousavy/camera/utils/ImageProxy+save.kt b/android/src/main/java/com/mrousavy/camera/utils/ImageProxy+save.kt
deleted file mode 100644
index 73deec4c..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/ImageProxy+save.kt
+++ /dev/null
@@ -1,127 +0,0 @@
-package com.mrousavy.camera.utils
-
-import android.graphics.Bitmap
-import android.graphics.BitmapFactory
-import android.graphics.ImageFormat
-import android.graphics.Matrix
-import android.util.Log
-import androidx.camera.core.ImageProxy
-import androidx.exifinterface.media.ExifInterface
-import com.mrousavy.camera.CameraView
-import com.mrousavy.camera.InvalidFormatError
-import java.io.ByteArrayOutputStream
-import java.io.File
-import java.io.FileOutputStream
-import java.nio.ByteBuffer
-import kotlin.system.measureTimeMillis
-
-// TODO: Fix this flip() function (this outputs a black image)
-fun flip(imageBytes: ByteArray, imageWidth: Int): ByteArray {
- // separate out the sub arrays
- var holder = ByteArray(imageBytes.size)
- var subArray = ByteArray(imageWidth)
- var subCount = 0
- for (i in imageBytes.indices) {
- subArray[subCount] = imageBytes[i]
- subCount++
- if (i % imageWidth == 0) {
- subArray.reverse()
- if (i == imageWidth) {
- holder = subArray
- } else {
- holder += subArray
- }
- subCount = 0
- subArray = ByteArray(imageWidth)
- }
- }
- subArray = ByteArray(imageWidth)
- System.arraycopy(imageBytes, imageBytes.size - imageWidth, subArray, 0, subArray.size)
- return holder + subArray
-}
-
-// TODO: This function is slow. Figure out a faster way to flip images, preferably via directly manipulating the byte[] Exif flags
-fun flipImage(imageBytes: ByteArray): ByteArray {
- val bitmap = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.size)
- val matrix = Matrix()
-
- val exif = ExifInterface(imageBytes.inputStream())
- val orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_UNDEFINED)
-
- when (orientation) {
- ExifInterface.ORIENTATION_ROTATE_180 -> {
- matrix.setRotate(180f)
- matrix.postScale(-1f, 1f)
- }
- ExifInterface.ORIENTATION_FLIP_VERTICAL -> {
- matrix.setRotate(180f)
- }
- ExifInterface.ORIENTATION_TRANSPOSE -> {
- matrix.setRotate(90f)
- }
- ExifInterface.ORIENTATION_ROTATE_90 -> {
- matrix.setRotate(90f)
- matrix.postScale(-1f, 1f)
- }
- ExifInterface.ORIENTATION_TRANSVERSE -> {
- matrix.setRotate(-90f)
- }
- ExifInterface.ORIENTATION_ROTATE_270 -> {
- matrix.setRotate(-90f)
- matrix.postScale(-1f, 1f)
- }
- }
-
- val newBitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.width, bitmap.height, matrix, true)
- val stream = ByteArrayOutputStream()
- newBitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream)
- return stream.toByteArray()
-}
-
-fun ImageProxy.save(file: File, flipHorizontally: Boolean) {
- when (format) {
- // TODO: ImageFormat.RAW_SENSOR
- // TODO: ImageFormat.DEPTH_JPEG
- ImageFormat.JPEG -> {
- val buffer = planes[0].buffer
- var bytes = ByteArray(buffer.remaining())
-
- // copy image from buffer to byte array
- buffer.get(bytes)
-
- if (flipHorizontally) {
- val milliseconds = measureTimeMillis {
- bytes = flipImage(bytes)
- }
- Log.i(CameraView.TAG_PERF, "Flipping Image took $milliseconds ms.")
- }
-
- val output = FileOutputStream(file)
- output.write(bytes)
- output.close()
- }
- ImageFormat.YUV_420_888 -> {
- // "prebuffer" simply contains the meta information about the following planes.
- val prebuffer = ByteBuffer.allocate(16)
- prebuffer.putInt(width)
- .putInt(height)
- .putInt(planes[1].pixelStride)
- .putInt(planes[1].rowStride)
-
- val output = FileOutputStream(file)
- output.write(prebuffer.array()) // write meta information to file
- // Now write the actual planes.
- var buffer: ByteBuffer
- var bytes: ByteArray
-
- for (i in 0..2) {
- buffer = planes[i].buffer
- bytes = ByteArray(buffer.remaining()) // makes byte array large enough to hold image
- buffer.get(bytes) // copies image from buffer to byte array
- output.write(bytes) // write the byte array to file
- }
- output.close()
- }
- else -> throw InvalidFormatError(format)
- }
-}
diff --git a/android/src/main/java/com/mrousavy/camera/utils/Size+rotated.kt b/android/src/main/java/com/mrousavy/camera/utils/Size+rotated.kt
deleted file mode 100644
index 2c530b40..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/Size+rotated.kt
+++ /dev/null
@@ -1,17 +0,0 @@
-package com.mrousavy.camera.utils
-
-import android.util.Size
-import android.view.Surface
-
-/**
- * Rotate by a given Surface Rotation
- */
-fun Size.rotated(surfaceRotation: Int): Size {
- return when (surfaceRotation) {
- Surface.ROTATION_0 -> Size(width, height)
- Surface.ROTATION_90 -> Size(height, width)
- Surface.ROTATION_180 -> Size(width, height)
- Surface.ROTATION_270 -> Size(height, width)
- else -> Size(width, height)
- }
-}
diff --git a/android/src/main/java/com/mrousavy/camera/utils/WritableArray+Nullables.kt b/android/src/main/java/com/mrousavy/camera/utils/WritableArray+Nullables.kt
deleted file mode 100644
index e573e69b..00000000
--- a/android/src/main/java/com/mrousavy/camera/utils/WritableArray+Nullables.kt
+++ /dev/null
@@ -1,24 +0,0 @@
-package com.mrousavy.camera.utils
-
-import com.facebook.react.bridge.WritableArray
-
-fun WritableArray.pushInt(value: Int?) {
- if (value == null)
- this.pushNull()
- else
- this.pushInt(value)
-}
-
-fun WritableArray.pushDouble(value: Double?) {
- if (value == null)
- this.pushNull()
- else
- this.pushDouble(value)
-}
-
-fun WritableArray.pushBoolean(value: Boolean?) {
- if (value == null)
- this.pushNull()
- else
- this.pushBoolean(value)
-}
diff --git a/docs/docs/api/_category_.yml b/docs/docs/api/_category_.yml
new file mode 100644
index 00000000..24a46026
--- /dev/null
+++ b/docs/docs/api/_category_.yml
@@ -0,0 +1 @@
+label: "API"
\ No newline at end of file
diff --git a/docs/docs/api/classes/Camera.md b/docs/docs/api/classes/Camera.md
new file mode 100644
index 00000000..11c8ab96
--- /dev/null
+++ b/docs/docs/api/classes/Camera.md
@@ -0,0 +1,383 @@
+---
+id: "Camera"
+title: "Camera"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+### A powerful `<Camera>` component.
+
+Read the [VisionCamera documentation](https://react-native-vision-camera.com/) for more information.
+
+The `<Camera>` component's most important (and therefore _required_) properties are:
+
+* [`device`](../interfaces/CameraProps.md#device): Specifies the [`CameraDevice`](../interfaces/CameraDevice.md) to use. Get a [`CameraDevice`](../interfaces/CameraDevice.md) by using the [`useCameraDevices()`](../#usecameradevices) hook, or manually by using the [`Camera.getAvailableCameraDevices()`](Camera.md#getavailablecameradevices) function.
+* [`isActive`](../interfaces/CameraProps.md#isactive): A boolean value that specifies whether the Camera should actively stream video frames or not. This can be compared to a Video component, where `isActive` specifies whether the video is paused or not. If you fully unmount the `<Camera>` component instead of using `isActive={false}`, the Camera will take a bit longer to start again.
+
+**`Example`**
+
+```tsx
+function App() {
+ const devices = useCameraDevices('wide-angle-camera')
+ const device = devices.back
+
+ if (device == null) return <LoadingView />
+ return (
+  <Camera device={device} isActive={true} />
+ )
+}
+```
+
+**`Component`**
+
+## Hierarchy
+
+- `PureComponent`<[`CameraProps`](../interfaces/CameraProps.md)\>
+
+ ↳ **`Camera`**
+
+## Methods
+
+### focus
+
+▸ **focus**(`point`): `Promise`<`void`\>
+
+Focus the camera to a specific point in the coordinate system.
+
+**`Throws`**
+
+[`CameraRuntimeError`](CameraRuntimeError.md) When any kind of error occurred while focusing. Use the [`code`](CameraRuntimeError.md#code) property to get the actual error
+
+**`Example`**
+
+```ts
+await camera.current.focus({
+ x: tapEvent.x,
+ y: tapEvent.y
+})
+```
+
+#### Parameters
+
+| Name | Type | Description |
+| :------ | :------ | :------ |
+| `point` | [`Point`](../interfaces/Point.md) | The point to focus to. This should be relative to the Camera view's coordinate system, and expressed in Pixel on iOS and Points on Android. * `(0, 0)` means **top left**. * `(CameraView.width, CameraView.height)` means **bottom right**. Make sure the value doesn't exceed the CameraView's dimensions. |
+
+#### Returns
+
+`Promise`<`void`\>
+
+#### Defined in
+
+[Camera.tsx:250](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L250)
+
+___
+
+### pauseRecording
+
+▸ **pauseRecording**(): `Promise`<`void`\>
+
+Pauses the current video recording.
+
+**`Throws`**
+
+[`CameraCaptureError`](CameraCaptureError.md) When any kind of error occurred while pausing the video recording. Use the [`code`](CameraCaptureError.md#code) property to get the actual error
+
+**`Example`**
+
+```ts
+// Start
+await camera.current.startRecording()
+await timeout(1000)
+// Pause
+await camera.current.pauseRecording()
+await timeout(500)
+// Resume
+await camera.current.resumeRecording()
+await timeout(2000)
+// Stop
+const video = await camera.current.stopRecording()
+```
+
+#### Returns
+
+`Promise`<`void`\>
+
+#### Defined in
+
+[Camera.tsx:175](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L175)
+
+___
+
+### resumeRecording
+
+▸ **resumeRecording**(): `Promise`<`void`\>
+
+Resumes a currently paused video recording.
+
+**`Throws`**
+
+[`CameraCaptureError`](CameraCaptureError.md) When any kind of error occurred while resuming the video recording. Use the [`code`](CameraCaptureError.md#code) property to get the actual error
+
+**`Example`**
+
+```ts
+// Start
+await camera.current.startRecording()
+await timeout(1000)
+// Pause
+await camera.current.pauseRecording()
+await timeout(500)
+// Resume
+await camera.current.resumeRecording()
+await timeout(2000)
+// Stop
+const video = await camera.current.stopRecording()
+```
+
+#### Returns
+
+`Promise`<`void`\>
+
+#### Defined in
+
+[Camera.tsx:203](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L203)
+
+___
+
+### startRecording
+
+▸ **startRecording**(`options`): `void`
+
+Start a new video recording.
+
+Records in the following formats:
+* **iOS**: QuickTime (`.mov`)
+* **Android**: MPEG4 (`.mp4`)
+
+**`Blocking`**
+
+This function is synchronized/blocking.
+
+**`Throws`**
+
+[`CameraCaptureError`](CameraCaptureError.md) When any kind of error occurred while starting the video recording. Use the [`code`](CameraCaptureError.md#code) property to get the actual error
+
+**`Example`**
+
+```ts
+camera.current.startRecording({
+ onRecordingFinished: (video) => console.log(video),
+ onRecordingError: (error) => console.error(error),
+})
+setTimeout(() => {
+ camera.current.stopRecording()
+}, 5000)
+```
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `options` | [`RecordVideoOptions`](../interfaces/RecordVideoOptions.md) |
+
+#### Returns
+
+`void`
+
+#### Defined in
+
+[Camera.tsx:138](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L138)
+
+___
+
+### stopRecording
+
+▸ **stopRecording**(): `Promise`<`void`\>
+
+Stop the current video recording.
+
+**`Throws`**
+
+[`CameraCaptureError`](CameraCaptureError.md) When any kind of error occurred while stopping the video recording. Use the [`code`](CameraCaptureError.md#code) property to get the actual error
+
+**`Example`**
+
+```ts
+await camera.current.startRecording()
+setTimeout(async () => {
+ const video = await camera.current.stopRecording()
+}, 5000)
+```
+
+#### Returns
+
+`Promise`<`void`\>
+
+#### Defined in
+
+[Camera.tsx:224](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L224)
+
+___
+
+### takePhoto
+
+▸ **takePhoto**(`options?`): `Promise`<[`PhotoFile`](../interfaces/PhotoFile.md)\>
+
+Take a single photo and write its content to a temporary file.
+
+**`Throws`**
+
+[`CameraCaptureError`](CameraCaptureError.md) When any kind of error occurred while capturing the photo. Use the [`code`](CameraCaptureError.md#code) property to get the actual error
+
+**`Example`**
+
+```ts
+const photo = await camera.current.takePhoto({
+ qualityPrioritization: 'quality',
+ flash: 'on',
+ enableAutoRedEyeReduction: true
+})
+```
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `options?` | [`TakePhotoOptions`](../interfaces/TakePhotoOptions.md) |
+
+#### Returns
+
+`Promise`<[`PhotoFile`](../interfaces/PhotoFile.md)\>
+
+#### Defined in
+
+[Camera.tsx:108](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L108)
+
+___
+
+### getAvailableCameraDevices
+
+▸ `Static` **getAvailableCameraDevices**(): `Promise`<[`CameraDevice`](../interfaces/CameraDevice.md)[]\>
+
+Get a list of all available camera devices on the current phone.
+
+**`Throws`**
+
+[`CameraRuntimeError`](CameraRuntimeError.md) When any kind of error occurred while getting all available camera devices. Use the [`code`](CameraRuntimeError.md#code) property to get the actual error
+
+**`Example`**
+
+```ts
+const devices = await Camera.getAvailableCameraDevices()
+const filtered = devices.filter((d) => matchesMyExpectations(d))
+const sorted = devices.sort(sortDevicesByAmountOfCameras)
+return {
+ back: sorted.find((d) => d.position === "back"),
+ front: sorted.find((d) => d.position === "front")
+}
+```
+
+#### Returns
+
+`Promise`<[`CameraDevice`](../interfaces/CameraDevice.md)[]\>
+
+#### Defined in
+
+[Camera.tsx:276](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L276)
+
+___
+
+### getCameraPermissionStatus
+
+▸ `Static` **getCameraPermissionStatus**(): `Promise`<[`CameraPermissionStatus`](../#camerapermissionstatus)\>
+
+Gets the current Camera Permission Status. Check this before mounting the Camera to ensure
+the user has permitted the app to use the camera.
+
+To actually prompt the user for camera permission, use [`requestCameraPermission()`](Camera.md#requestcamerapermission).
+
+**`Throws`**
+
+[`CameraRuntimeError`](CameraRuntimeError.md) When any kind of error occurred while getting the current permission status. Use the [`code`](CameraRuntimeError.md#code) property to get the actual error
+
+#### Returns
+
+`Promise`<[`CameraPermissionStatus`](../#camerapermissionstatus)\>
+
+#### Defined in
+
+[Camera.tsx:291](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L291)
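+
+A minimal usage sketch (the fallback handling here is illustrative):
+
+```ts
+const status = await Camera.getCameraPermissionStatus()
+if (status !== 'granted') {
+  // Fall back to requestCameraPermission() or explain to the user why the camera is needed
+  console.log(`Camera permission is currently: ${status}`)
+}
+```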
+
+___
+
+### getMicrophonePermissionStatus
+
+▸ `Static` **getMicrophonePermissionStatus**(): `Promise`<[`CameraPermissionStatus`](../#camerapermissionstatus)\>
+
+Gets the current Microphone-Recording Permission Status. Check this before mounting the Camera to ensure
+the user has permitted the app to use the microphone.
+
+To actually prompt the user for microphone permission, use [`requestMicrophonePermission()`](Camera.md#requestmicrophonepermission).
+
+**`Throws`**
+
+[`CameraRuntimeError`](CameraRuntimeError.md) When any kind of error occurred while getting the current permission status. Use the [`code`](CameraRuntimeError.md#code) property to get the actual error
+
+#### Returns
+
+`Promise`<[`CameraPermissionStatus`](../#camerapermissionstatus)\>
+
+#### Defined in
+
+[Camera.tsx:306](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L306)
+
+___
+
+### requestCameraPermission
+
+▸ `Static` **requestCameraPermission**(): `Promise`<[`CameraPermissionRequestResult`](../#camerapermissionrequestresult)\>
+
+Shows a "request permission" alert to the user, and resolves with the new camera permission status.
+
+If the user has previously blocked the app from using the camera, the alert will not be shown
+and `"denied"` will be returned.
+
+**`Throws`**
+
+[`CameraRuntimeError`](CameraRuntimeError.md) When any kind of error occurred while requesting permission. Use the [`code`](CameraRuntimeError.md#code) property to get the actual error
+
+#### Returns
+
+`Promise`<[`CameraPermissionRequestResult`](../#camerapermissionrequestresult)\>
+
+#### Defined in
+
+[Camera.tsx:321](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L321)
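+
+A minimal request-flow sketch (`Linking` is React Native's Linking module; opening the system settings is just one possible fallback):
+
+```ts
+const permission = await Camera.requestCameraPermission()
+if (permission === 'denied') {
+  // The user blocked camera access earlier, so the alert will not be shown again
+  await Linking.openSettings()
+}
+```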
+
+___
+
+### requestMicrophonePermission
+
+▸ `Static` **requestMicrophonePermission**(): `Promise`<[`CameraPermissionRequestResult`](../#camerapermissionrequestresult)\>
+
+Shows a "request permission" alert to the user, and resolves with the new microphone permission status.
+
+If the user has previously blocked the app from using the microphone, the alert will not be shown
+and `"denied"` will be returned.
+
+**`Throws`**
+
+[`CameraRuntimeError`](CameraRuntimeError.md) When any kind of error occurred while requesting permission. Use the [`code`](CameraRuntimeError.md#code) property to get the actual error
+
+#### Returns
+
+`Promise`<[`CameraPermissionRequestResult`](../#camerapermissionrequestresult)\>
+
+#### Defined in
+
+[Camera.tsx:336](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L336)
diff --git a/docs/docs/api/classes/CameraCaptureError.md b/docs/docs/api/classes/CameraCaptureError.md
new file mode 100644
index 00000000..e397f1e5
--- /dev/null
+++ b/docs/docs/api/classes/CameraCaptureError.md
@@ -0,0 +1,88 @@
+---
+id: "CameraCaptureError"
+title: "CameraCaptureError"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+Represents any kind of error that occurred while trying to capture a video or photo.
+
+See the ["Camera Errors" documentation](https://react-native-vision-camera.com/docs/guides/errors) for more information about Camera Errors.
+
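+A minimal handling sketch (the `camera` ref and `takePhoto()` call are borrowed from the [`Camera`](Camera.md) examples):
+
+```ts
+try {
+  const photo = await camera.current.takePhoto()
+} catch (e) {
+  if (e instanceof CameraCaptureError) {
+    // `code` identifies the failure, e.g. "capture/file-io-error"
+    console.error(`Capture failed (${e.code}): ${e.message}`)
+  } else {
+    throw e
+  }
+}
+```
+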
+## Hierarchy
+
+- `CameraError`<[`CaptureError`](../#captureerror)\>
+
+ ↳ **`CameraCaptureError`**
+
+## Accessors
+
+### cause
+
+• `get` **cause**(): `undefined` \| `Error`
+
+#### Returns
+
+`undefined` \| `Error`
+
+#### Inherited from
+
+CameraError.cause
+
+#### Defined in
+
+[CameraError.ts:132](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L132)
+
+___
+
+### code
+
+• `get` **code**(): `TCode`
+
+#### Returns
+
+`TCode`
+
+#### Inherited from
+
+CameraError.code
+
+#### Defined in
+
+[CameraError.ts:126](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L126)
+
+___
+
+### message
+
+• `get` **message**(): `string`
+
+#### Returns
+
+`string`
+
+#### Inherited from
+
+CameraError.message
+
+#### Defined in
+
+[CameraError.ts:129](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L129)
+
+## Methods
+
+### toString
+
+▸ **toString**(): `string`
+
+#### Returns
+
+`string`
+
+#### Inherited from
+
+CameraError.toString
+
+#### Defined in
+
+[CameraError.ts:150](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L150)
diff --git a/docs/docs/api/classes/CameraRuntimeError.md b/docs/docs/api/classes/CameraRuntimeError.md
new file mode 100644
index 00000000..eb967858
--- /dev/null
+++ b/docs/docs/api/classes/CameraRuntimeError.md
@@ -0,0 +1,88 @@
+---
+id: "CameraRuntimeError"
+title: "CameraRuntimeError"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+Represents any kind of error that occurred in the Camera View Module.
+
+See the ["Camera Errors" documentation](https://react-native-vision-camera.com/docs/guides/errors) for more information about Camera Errors.
+
+## Hierarchy
+
+- `CameraError`<[`PermissionError`](../#permissionerror) \| [`ParameterError`](../#parametererror) \| [`DeviceError`](../#deviceerror) \| [`FormatError`](../#formaterror) \| [`SessionError`](../#sessionerror) \| [`SystemError`](../#systemerror) \| [`UnknownError`](../#unknownerror)\>
+
+ ↳ **`CameraRuntimeError`**
+
+## Accessors
+
+### cause
+
+• `get` **cause**(): `undefined` \| `Error`
+
+#### Returns
+
+`undefined` \| `Error`
+
+#### Inherited from
+
+CameraError.cause
+
+#### Defined in
+
+[CameraError.ts:132](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L132)
+
+___
+
+### code
+
+• `get` **code**(): `TCode`
+
+#### Returns
+
+`TCode`
+
+#### Inherited from
+
+CameraError.code
+
+#### Defined in
+
+[CameraError.ts:126](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L126)
+
+___
+
+### message
+
+• `get` **message**(): `string`
+
+#### Returns
+
+`string`
+
+#### Inherited from
+
+CameraError.message
+
+#### Defined in
+
+[CameraError.ts:129](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L129)
+
+## Methods
+
+### toString
+
+▸ **toString**(): `string`
+
+#### Returns
+
+`string`
+
+#### Inherited from
+
+CameraError.toString
+
+#### Defined in
+
+[CameraError.ts:150](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L150)
diff --git a/docs/docs/api/classes/_category_.yml b/docs/docs/api/classes/_category_.yml
new file mode 100644
index 00000000..55c7980a
--- /dev/null
+++ b/docs/docs/api/classes/_category_.yml
@@ -0,0 +1,2 @@
+label: "Classes"
+position: 3
\ No newline at end of file
diff --git a/docs/docs/api/index.md b/docs/docs/api/index.md
new file mode 100644
index 00000000..78db55b9
--- /dev/null
+++ b/docs/docs/api/index.md
@@ -0,0 +1,636 @@
+---
+id: "index"
+title: "VisionCamera"
+sidebar_label: "Overview"
+sidebar_position: 0.5
+custom_edit_url: null
+---
+
+## Classes
+
+- [Camera](classes/Camera.md)
+- [CameraCaptureError](classes/CameraCaptureError.md)
+- [CameraRuntimeError](classes/CameraRuntimeError.md)
+
+## Interfaces
+
+- [CameraDevice](interfaces/CameraDevice.md)
+- [CameraDeviceFormat](interfaces/CameraDeviceFormat.md)
+- [CameraProps](interfaces/CameraProps.md)
+- [ErrorWithCause](interfaces/ErrorWithCause.md)
+- [PhotoFile](interfaces/PhotoFile.md)
+- [Point](interfaces/Point.md)
+- [RecordVideoOptions](interfaces/RecordVideoOptions.md)
+- [TakePhotoOptions](interfaces/TakePhotoOptions.md)
+- [TemporaryFile](interfaces/TemporaryFile.md)
+- [VideoFile](interfaces/VideoFile.md)
+
+## Type Aliases
+
+### AutoFocusSystem
+
+Ƭ **AutoFocusSystem**: ``"contrast-detection"`` \| ``"phase-detection"`` \| ``"none"``
+
+Indicates a format's autofocus system.
+
+* `"none"`: Indicates that autofocus is not available
+* `"contrast-detection"`: Indicates that autofocus is achieved by contrast detection. Contrast detection performs a focus scan to find the optimal position
+* `"phase-detection"`: Indicates that autofocus is achieved by phase detection. Phase detection has the ability to achieve focus in many cases without a focus scan. Phase detection autofocus is typically less visually intrusive than contrast detection autofocus
+
+#### Defined in
+
+[CameraDevice.ts:53](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L53)
+
+___
+
+### CameraDevices
+
+Ƭ **CameraDevices**: { [key in CameraPosition]: CameraDevice \| undefined }
+
+#### Defined in
+
+[hooks/useCameraDevices.ts:7](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/hooks/useCameraDevices.ts#L7)
+
+___
+
+### CameraPermissionRequestResult
+
+Ƭ **CameraPermissionRequestResult**: ``"granted"`` \| ``"denied"``
+
+#### Defined in
+
+[Camera.tsx:15](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L15)
+
+___
+
+### CameraPermissionStatus
+
+Ƭ **CameraPermissionStatus**: ``"granted"`` \| ``"not-determined"`` \| ``"denied"`` \| ``"restricted"``
+
+#### Defined in
+
+[Camera.tsx:14](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Camera.tsx#L14)
+
+___
+
+### CameraPosition
+
+Ƭ **CameraPosition**: ``"front"`` \| ``"back"`` \| ``"unspecified"`` \| ``"external"``
+
+Represents the camera device position.
+
+* `"back"`: Indicates that the device is physically located on the back of the system hardware
+* `"front"`: Indicates that the device is physically located on the front of the system hardware
+
+#### iOS only
+* `"unspecified"`: Indicates that the device's position relative to the system hardware is unspecified
+
+#### Android only
+* `"external"`: The camera device is an external camera, and has no fixed facing relative to the device's screen. (Android only)
+
+#### Defined in
+
+[CameraPosition.ts:13](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraPosition.ts#L13)
+
+___
+
+### CaptureError
+
+Ƭ **CaptureError**: ``"capture/invalid-photo-format"`` \| ``"capture/encoder-error"`` \| ``"capture/muxer-error"`` \| ``"capture/recording-in-progress"`` \| ``"capture/no-recording-in-progress"`` \| ``"capture/file-io-error"`` \| ``"capture/create-temp-file-error"`` \| ``"capture/invalid-video-options"`` \| ``"capture/create-recorder-error"`` \| ``"capture/recorder-error"`` \| ``"capture/no-valid-data"`` \| ``"capture/inactive-source"`` \| ``"capture/insufficient-storage"`` \| ``"capture/file-size-limit-reached"`` \| ``"capture/invalid-photo-codec"`` \| ``"capture/not-bound-error"`` \| ``"capture/capture-type-not-supported"`` \| ``"capture/video-not-enabled"`` \| ``"capture/photo-not-enabled"`` \| ``"capture/aborted"`` \| ``"capture/unknown"``
+
+#### Defined in
+
+[CameraError.ts:31](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L31)
+
+___
+
+### DeviceError
+
+Ƭ **DeviceError**: ``"device/configuration-error"`` \| ``"device/no-device"`` \| ``"device/invalid-device"`` \| ``"device/torch-unavailable"`` \| ``"device/microphone-unavailable"`` \| ``"device/pixel-format-not-supported"`` \| ``"device/low-light-boost-not-supported"`` \| ``"device/focus-not-supported"`` \| ``"device/camera-not-available-on-simulator"``
+
+#### Defined in
+
+[CameraError.ts:8](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L8)
+
+___
+
+### FormatError
+
+Ƭ **FormatError**: ``"format/invalid-fps"`` \| ``"format/invalid-hdr"`` \| ``"format/invalid-low-light-boost"`` \| ``"format/invalid-format"`` \| ``"format/invalid-color-space"``
+
+#### Defined in
+
+[CameraError.ts:18](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L18)
+
+___
+
+### FrameProcessor
+
+Ƭ **FrameProcessor**: `Object`
+
+#### Type declaration
+
+| Name | Type |
+| :------ | :------ |
+| `frameProcessor` | (`frame`: `Frame`) => `void` |
+| `type` | ``"frame-processor"`` |
+
+#### Defined in
+
+[CameraProps.ts:7](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L7)
+
+___
+
+### LogicalCameraDeviceType
+
+Ƭ **LogicalCameraDeviceType**: ``"dual-camera"`` \| ``"dual-wide-camera"`` \| ``"triple-camera"``
+
+Identifiers for a logical camera (combinations of multiple physical cameras that create a single logical camera).
+
+* `"dual-camera"`: A combination of wide-angle and telephoto cameras that creates a capture device.
+* `"dual-wide-camera"`: A device that consists of two cameras of fixed focal length, one ultrawide angle and one wide angle.
+* `"triple-camera"`: A device that consists of three cameras of fixed focal length, one ultrawide angle, one wide angle, and one telephoto.
+
+#### Defined in
+
+[CameraDevice.ts:21](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L21)
+
+___
+
+### ParameterError
+
+Ƭ **ParameterError**: ``"parameter/invalid-parameter"`` \| ``"parameter/unsupported-os"`` \| ``"parameter/unsupported-output"`` \| ``"parameter/unsupported-input"`` \| ``"parameter/invalid-combination"``
+
+#### Defined in
+
+[CameraError.ts:2](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L2)
+
+___
+
+### PermissionError
+
+Ƭ **PermissionError**: ``"permission/microphone-permission-denied"`` \| ``"permission/camera-permission-denied"``
+
+#### Defined in
+
+[CameraError.ts:1](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L1)
+
+___
+
+### PhysicalCameraDeviceType
+
+Ƭ **PhysicalCameraDeviceType**: ``"ultra-wide-angle-camera"`` \| ``"wide-angle-camera"`` \| ``"telephoto-camera"``
+
+Identifiers for a physical camera (one that actually exists on the back/front of the device)
+
+* `"ultra-wide-angle-camera"`: A built-in camera with a shorter focal length than that of a wide-angle camera. (focal length below 24mm)
+* `"wide-angle-camera"`: A built-in wide-angle camera. (focal length between 24mm and 35mm)
+* `"telephoto-camera"`: A built-in camera device with a longer focal length than a wide-angle camera. (focal length above 85mm)
+
+#### Defined in
+
+[CameraDevice.ts:12](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L12)
+
+___
+
+### SessionError
+
+Ƭ **SessionError**: ``"session/camera-not-ready"`` \| ``"session/camera-cannot-be-opened"`` \| ``"session/camera-has-been-disconnected"`` \| ``"session/audio-session-setup-failed"`` \| ``"session/audio-in-use-by-other-app"`` \| ``"session/audio-session-failed-to-activate"``
+
+#### Defined in
+
+[CameraError.ts:24](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L24)
+
+___
+
+### SystemError
+
+Ƭ **SystemError**: ``"system/camera-module-not-found"`` \| ``"system/no-camera-manager"`` \| ``"system/frame-processors-unavailable"`` \| ``"system/view-not-found"``
+
+#### Defined in
+
+[CameraError.ts:53](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L53)
+
+___
+
+### UnknownError
+
+Ƭ **UnknownError**: ``"unknown/unknown"``
+
+#### Defined in
+
+[CameraError.ts:58](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L58)
+
+___
+
+### VideoStabilizationMode
+
+Ƭ **VideoStabilizationMode**: ``"off"`` \| ``"standard"`` \| ``"cinematic"`` \| ``"cinematic-extended"`` \| ``"auto"``
+
+Indicates a format's supported video stabilization mode. Enabling video stabilization may introduce additional latency into the video capture pipeline.
+
+* `"off"`: No video stabilization. Indicates that video should not be stabilized
+* `"standard"`: Standard software-based video stabilization. Standard video stabilization reduces the field of view by about 10%.
+* `"cinematic"`: Advanced software-based video stabilization. This applies more aggressive cropping or transformations than standard.
+* `"cinematic-extended"`: Extended software- and hardware-based stabilization that aggressively crops and transforms the video to apply a smooth cinematic stabilization.
+* `"auto"`: Indicates that the most appropriate video stabilization mode for the device and format should be chosen automatically
+
+#### Defined in
+
+[CameraDevice.ts:64](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L64)
+
+## Variables
+
+### VisionCameraProxy
+
+• `Const` **VisionCameraProxy**: `TVisionCameraProxy` = `proxy`
+
+#### Defined in
+
+[FrameProcessorPlugins.ts:95](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/FrameProcessorPlugins.ts#L95)
+
+## Functions
+
+### createFrameProcessor
+
+▸ **createFrameProcessor**(`frameProcessor`, `type`): [`FrameProcessor`](#frameprocessor)
+
+Create a new Frame Processor function which you can pass to the `<Camera>`.
+(See ["Frame Processors"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/frame-processors))
+
+Make sure to add the `'worklet'` directive to the top of the Frame Processor function, otherwise it will not get compiled into a worklet.
+
+Also make sure to memoize the returned object, so that the Camera doesn't reset the Frame Processor Context each time.
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `frameProcessor` | (`frame`: `Frame`) => `void` |
+| `type` | ``"frame-processor"`` |
+
+#### Returns
+
+[`FrameProcessor`](#frameprocessor)
+
+#### Defined in
+
+[hooks/useFrameProcessor.ts:13](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/hooks/useFrameProcessor.ts#L13)
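+
+A memoization sketch following the note above (`useMemo` comes from React; `width`/`height` are Frame properties):
+
+```ts
+const frameProcessor = useMemo(
+  () =>
+    createFrameProcessor((frame) => {
+      'worklet'
+      // Runs on every Camera frame once passed to the Camera's frameProcessor prop
+      console.log(`New frame: ${frame.width}x${frame.height}`)
+    }, 'frame-processor'),
+  []
+)
+```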
+
+___
+
+### isErrorWithCause
+
+▸ **isErrorWithCause**(`error`): error is ErrorWithCause
+
+Checks if the given `error` is of type [`ErrorWithCause`](interfaces/ErrorWithCause.md)
+
+#### Parameters
+
+| Name | Type | Description |
+| :------ | :------ | :------ |
+| `error` | `unknown` | Any unknown object to validate |
+
+#### Returns
+
+error is ErrorWithCause
+
+`true` if the given `error` is of type [`ErrorWithCause`](interfaces/ErrorWithCause.md)
+
+#### Defined in
+
+[CameraError.ts:176](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L176)
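+
+A small sketch of the type guard in use:
+
+```ts
+function logNativeError(error: unknown): void {
+  if (isErrorWithCause(error)) {
+    // Narrowed to ErrorWithCause, so the nested cause chain can be inspected safely
+    console.log(error.message, error.cause?.message)
+  }
+}
+```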
+
+___
+
+### parsePhysicalDeviceTypes
+
+▸ **parsePhysicalDeviceTypes**(`physicalDeviceTypes`): [`PhysicalCameraDeviceType`](#physicalcameradevicetype) \| [`LogicalCameraDeviceType`](#logicalcameradevicetype)
+
+Parses an array of physical device types into a single [`PhysicalCameraDeviceType`](#physicalcameradevicetype) or [`LogicalCameraDeviceType`](#logicalcameradevicetype), depending on what matches.
+
+**`Method`**
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `physicalDeviceTypes` | [`PhysicalCameraDeviceType`](#physicalcameradevicetype)[] |
+
+#### Returns
+
+[`PhysicalCameraDeviceType`](#physicalcameradevicetype) \| [`LogicalCameraDeviceType`](#logicalcameradevicetype)
+
+#### Defined in
+
+[CameraDevice.ts:27](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L27)
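+
+An illustrative sketch (the exact return values here are an assumption based on the device type descriptions above):
+
+```ts
+parsePhysicalDeviceTypes(['wide-angle-camera'])
+// -> 'wide-angle-camera' (a single physical camera stays physical)
+parsePhysicalDeviceTypes(['ultra-wide-angle-camera', 'wide-angle-camera', 'telephoto-camera'])
+// -> 'triple-camera' (matching physical cameras combine into a logical device type)
+```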
+
+___
+
+### runAsync
+
+▸ **runAsync**(`frame`, `func`): `void`
+
+Runs the given function asynchronously, while keeping a strong reference to the Frame.
+
+For example, if you want to run a heavy face detection algorithm
+while still drawing to the screen at 60 FPS, you can use `runAsync(...)`
+to offload the face detection algorithm to a separate thread.
+
+**`Example`**
+
+```ts
+const frameProcessor = useFrameProcessor((frame) => {
+ 'worklet'
+ console.log('New Frame')
+ runAsync(frame, () => {
+ 'worklet'
+ const faces = detectFaces(frame)
+ const face = faces[0]
+ console.log(`Detected a new face: ${face}`)
+ })
+})
+```
+
+#### Parameters
+
+| Name | Type | Description |
+| :------ | :------ | :------ |
+| `frame` | `Frame` | The current Frame of the Frame Processor. |
+| `func` | () => `void` | The function to execute. |
+
+#### Returns
+
+`void`
+
+#### Defined in
+
+[FrameProcessorPlugins.ts:177](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/FrameProcessorPlugins.ts#L177)
+
+___
+
+### runAtTargetFps
+
+▸ **runAtTargetFps**<`T`\>(`fps`, `func`): `T` \| `undefined`
+
+Runs the given function at the given target FPS rate.
+
+For example, if you want to run a heavy face detection algorithm
+only once per second, you can use `runAtTargetFps(1, ...)` to
+throttle it to 1 FPS.
+
+**`Example`**
+
+```ts
+const frameProcessor = useFrameProcessor((frame) => {
+ 'worklet'
+ console.log('New Frame')
+ runAtTargetFps(5, () => {
+ 'worklet'
+ const faces = detectFaces(frame)
+ console.log(`Detected a new face: ${faces[0]}`)
+ })
+})
+```
+
+#### Type parameters
+
+| Name |
+| :------ |
+| `T` |
+
+#### Parameters
+
+| Name | Type | Description |
+| :------ | :------ | :------ |
+| `fps` | `number` | The target FPS rate at which the given function should be executed |
+| `func` | () => `T` | The function to execute. |
+
+#### Returns
+
+`T` \| `undefined`
+
+The result of the function if it was executed, or `undefined` otherwise.
+
+#### Defined in
+
+[FrameProcessorPlugins.ts:136](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/FrameProcessorPlugins.ts#L136)
+
+___
+
+### sortDevices
+
+▸ **sortDevices**(`left`, `right`): `number`
+
+Compares two devices by the following criteria:
+* `wide-angle-camera`s are ranked higher than others
+* Devices with more physical cameras are ranked higher than ones with fewer. (e.g. "Triple Camera" > "Wide-Angle Camera")
+
+> Note that this makes the `sort()` function descending, so the first element (`[0]`) is the "best" device.
+
+**`Example`**
+
+```ts
+const devices = camera.devices.sort(sortDevices)
+const bestDevice = devices[0]
+```
+
+**`Method`**
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `left` | [`CameraDevice`](interfaces/CameraDevice.md) |
+| `right` | [`CameraDevice`](interfaces/CameraDevice.md) |
+
+#### Returns
+
+`number`
+
+#### Defined in
+
+[utils/FormatFilter.ts:18](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/utils/FormatFilter.ts#L18)
+
+___
+
+### sortFormats
+
+▸ **sortFormats**(`left`, `right`): `number`
+
+Sort formats by resolution and aspect ratio difference (to the Screen size).
+
+> Note that this makes the `sort()` function descending, so the first element (`[0]`) is the "best" format.
+
+#### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `left` | [`CameraDeviceFormat`](interfaces/CameraDeviceFormat.md) |
+| `right` | [`CameraDeviceFormat`](interfaces/CameraDeviceFormat.md) |
+
+#### Returns
+
+`number`
+
+#### Defined in
+
+[utils/FormatFilter.ts:72](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/utils/FormatFilter.ts#L72)
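+
+A usage sketch mirroring the `sortDevices` example above (`device` is assumed to be a [`CameraDevice`](interfaces/CameraDevice.md)):
+
+```ts
+const sortedFormats = device.formats.sort(sortFormats)
+const bestFormat = sortedFormats[0] // descending sort, so [0] is the best match
+```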
+
+___
+
+### tryParseNativeCameraError
+
+▸ **tryParseNativeCameraError**<`T`\>(`nativeError`): [`CameraCaptureError`](classes/CameraCaptureError.md) \| [`CameraRuntimeError`](classes/CameraRuntimeError.md) \| `T`
+
+Tries to parse an error coming from native to a typed JS camera error.
+
+**`Method`**
+
+#### Type parameters
+
+| Name |
+| :------ |
+| `T` |
+
+#### Parameters
+
+| Name | Type | Description |
+| :------ | :------ | :------ |
+| `nativeError` | `T` | The native error instance. This is a JSON in the legacy native module architecture. |
+
+#### Returns
+
+[`CameraCaptureError`](classes/CameraCaptureError.md) \| [`CameraRuntimeError`](classes/CameraRuntimeError.md) \| `T`
+
+A [`CameraRuntimeError`](classes/CameraRuntimeError.md) or [`CameraCaptureError`](classes/CameraCaptureError.md), or the `nativeError` itself if it's not parsable
+
+#### Defined in
+
+[CameraError.ts:202](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L202)
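+
+A sketch of normalizing a raw native error (where `nativeError` comes from is assumed):
+
+```ts
+function normalizeCameraError(nativeError: unknown): void {
+  const parsed = tryParseNativeCameraError(nativeError)
+  if (parsed instanceof CameraRuntimeError || parsed instanceof CameraCaptureError) {
+    console.error(`Typed camera error: ${parsed.code}`)
+  } else {
+    console.error('Unrecognized native error:', parsed)
+  }
+}
+```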
+
+___
+
+### useCameraDevices
+
+▸ **useCameraDevices**(): [`CameraDevices`](#cameradevices)
+
+Gets the best available [`CameraDevice`](interfaces/CameraDevice.md). Devices with more cameras are preferred.
+
+**`Throws`**
+
+[`CameraRuntimeError`](classes/CameraRuntimeError.md) if no device was found.
+
+**`Example`**
+
+```tsx
+const device = useCameraDevice()
+// ...
+return <Camera device={device} />
+```
+
+#### Returns
+
+[`CameraDevices`](#cameradevices)
+
+The best matching [`CameraDevice`](interfaces/CameraDevice.md).
+
+#### Defined in
+
+[hooks/useCameraDevices.ts:29](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/hooks/useCameraDevices.ts#L29)
+
+▸ **useCameraDevices**(`deviceType`): [`CameraDevices`](#cameradevices)
+
+Gets a [`CameraDevice`](interfaces/CameraDevice.md) for the requested device type.
+
+**`Throws`**
+
+[`CameraRuntimeError`](classes/CameraRuntimeError.md) if no device was found.
+
+**`Example`**
+
+```tsx
+const device = useCameraDevice('wide-angle-camera')
+// ...
+return <Camera device={device} />
+```
+
+#### Parameters
+
+| Name | Type | Description |
+| :------ | :------ | :------ |
+| `deviceType` | [`PhysicalCameraDeviceType`](#physicalcameradevicetype) \| [`LogicalCameraDeviceType`](#logicalcameradevicetype) | Specifies a device type which will be used as a device filter. |
+
+#### Returns
+
+[`CameraDevices`](#cameradevices)
+
+A [`CameraDevice`](interfaces/CameraDevice.md) for the requested device type.
+
+#### Defined in
+
+[hooks/useCameraDevices.ts:44](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/hooks/useCameraDevices.ts#L44)
+
+___
+
+### useCameraFormat
+
+▸ **useCameraFormat**(`device?`): [`CameraDeviceFormat`](interfaces/CameraDeviceFormat.md) \| `undefined`
+
+Returns the best format for the given camera device.
+
+This function tries to choose a format with the highest possible photo-capture resolution and best matching aspect ratio.
+
+#### Parameters
+
+| Name | Type | Description |
+| :------ | :------ | :------ |
+| `device?` | [`CameraDevice`](interfaces/CameraDevice.md) | The Camera Device |
+
+#### Returns
+
+[`CameraDeviceFormat`](interfaces/CameraDeviceFormat.md) \| `undefined`
+
+The best matching format for the given camera device, or `undefined` if the camera device is `undefined`.
+
+#### Defined in
+
+[hooks/useCameraFormat.ts:14](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/hooks/useCameraFormat.ts#L14)
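+
+A usage sketch combining this hook with [`useCameraDevices`](#usecameradevices) and the Camera's [`format`](interfaces/CameraProps.md#format) prop:
+
+```tsx
+const devices = useCameraDevices()
+const device = devices.back
+const format = useCameraFormat(device)
+
+if (device == null) return null
+return <Camera device={device} format={format} isActive={true} />
+```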
+
+___
+
+### useFrameProcessor
+
+▸ **useFrameProcessor**(`frameProcessor`, `dependencies`): [`FrameProcessor`](#frameprocessor)
+
+Returns a memoized Frame Processor function which you can pass to the `<Camera>`.
+(See ["Frame Processors"](https://mrousavy.github.io/react-native-vision-camera/docs/guides/frame-processors))
+
+Make sure to add the `'worklet'` directive to the top of the Frame Processor function, otherwise it will not get compiled into a worklet.
+
+**`Example`**
+
+```ts
+const frameProcessor = useFrameProcessor((frame) => {
+ 'worklet'
+ const qrCodes = scanQRCodes(frame)
+ console.log(`QR Codes: ${qrCodes}`)
+}, [])
+```
+
+#### Parameters
+
+| Name | Type | Description |
+| :------ | :------ | :------ |
+| `frameProcessor` | (`frame`: `Frame`) => `void` | The Frame Processor |
+| `dependencies` | `DependencyList` | The React dependencies which will be copied into the VisionCamera JS-Runtime. |
+
+#### Returns
+
+[`FrameProcessor`](#frameprocessor)
+
+The memoized Frame Processor.
+
+#### Defined in
+
+[hooks/useFrameProcessor.ts:49](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/hooks/useFrameProcessor.ts#L49)
diff --git a/docs/docs/api/interfaces/CameraDevice.md b/docs/docs/api/interfaces/CameraDevice.md
new file mode 100644
index 00000000..e84d9eaf
--- /dev/null
+++ b/docs/docs/api/interfaces/CameraDevice.md
@@ -0,0 +1,246 @@
+---
+id: "CameraDevice"
+title: "CameraDevice"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+Represents a camera device discovered by the [`Camera.getAvailableCameraDevices()`](../classes/Camera.md#getavailablecameradevices) function
+
+## Properties
+
+### devices
+
+• **devices**: [`PhysicalCameraDeviceType`](../#physicalcameradevicetype)[]
+
+The physical devices this `CameraDevice` contains.
+
+* If this camera device is a **logical camera** (combination of multiple physical cameras), there are multiple cameras in this array.
+* If this camera device is a **physical camera**, there is only a single element in this array.
+
+You can check if the camera is a logical multi-camera by using the `isMultiCam` property.
+
+#### Defined in
+
+[CameraDevice.ts:149](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L149)
+
+___
+
+### formats
+
+• **formats**: [`CameraDeviceFormat`](CameraDeviceFormat.md)[]
+
+All available formats for this camera device. Use this to find the best format for your use case and set it as the Camera's [`format`](CameraProps.md#format) property.
+
+See [the Camera Formats documentation](https://react-native-vision-camera.com/docs/guides/formats) for more information about Camera Formats.
+
+#### Defined in
+
+[CameraDevice.ts:203](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L203)
+
+___
+
+### hardwareLevel
+
+• **hardwareLevel**: ``"legacy"`` \| ``"limited"`` \| ``"full"``
+
+The hardware level of the Camera.
+- On Android, some older devices are running at a `legacy` or `limited` level which means they are running in a backwards compatible mode.
+- On iOS, all devices are `full`.
+
+#### Defined in
+
+[CameraDevice.ts:229](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L229)
+
+___
+
+### hasFlash
+
+• **hasFlash**: `boolean`
+
+Specifies whether this camera supports enabling flash for photo capture.
+
+#### Defined in
+
+[CameraDevice.ts:161](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L161)
+
+___
+
+### hasTorch
+
+• **hasTorch**: `boolean`
+
+Specifies whether this camera supports continuously enabling the flash to act like a torch (flash with video capture)
+
+#### Defined in
+
+[CameraDevice.ts:165](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L165)
+
+___
+
+### id
+
+• **id**: `string`
+
+The native ID of the camera device instance.
+
+#### Defined in
+
+[CameraDevice.ts:140](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L140)
+
+___
+
+### isMultiCam
+
+• **isMultiCam**: `boolean`
+
+A property indicating whether the device is a virtual multi-camera consisting of multiple combined physical cameras.
+
+Examples:
+* The Dual Camera, which supports seamlessly switching between a wide and telephoto camera while zooming and generating depth data from the disparities between the different points of view of the physical cameras.
+* The TrueDepth Camera, which generates depth data from disparities between a YUV camera and an Infrared camera pointed in the same direction.
+
+#### Defined in
+
+[CameraDevice.ts:173](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L173)
+
+___
+
+### maxZoom
+
+• **maxZoom**: `number`
+
+Maximum available zoom factor (e.g. `128`)
+
+#### Defined in
+
+[CameraDevice.ts:181](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L181)
+
+___
+
+### minZoom
+
+• **minZoom**: `number`
+
+Minimum available zoom factor (e.g. `1`)
+
+#### Defined in
+
+[CameraDevice.ts:177](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L177)
+
+___
+
+### name
+
+• **name**: `string`
+
+A friendly localized name describing the camera.
+
+#### Defined in
+
+[CameraDevice.ts:157](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L157)
+
+___
+
+### neutralZoom
+
+• **neutralZoom**: `number`
+
+The zoom factor where the camera is "neutral".
+
+* For single-physical cameras this property is always `1.0`.
+* For multi cameras this property is a value between `minZoom` and `maxZoom`, where the camera is in _wide-angle_ mode and hasn't switched to the _ultra-wide-angle_ ("fish-eye") or telephoto camera yet.
+
+Use this value as an initial value for the zoom property if you implement custom zoom. (e.g. reanimated shared value should be initially set to this value)
+
+**`Example`**
+
+```ts
+const device = ...
+
+const zoom = useSharedValue(device.neutralZoom) // <-- initial value so it doesn't start at ultra-wide
+const cameraProps = useAnimatedProps(() => ({
+ zoom: zoom.value
+}))
+```
+
+#### Defined in
+
+[CameraDevice.ts:197](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L197)
+
+___
+
+### position
+
+• **position**: [`CameraPosition`](../#cameraposition)
+
+Specifies the physical position of this camera. (back or front)
+
+#### Defined in
+
+[CameraDevice.ts:153](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L153)
+
+___
+
+### sensorOrientation
+
+• **sensorOrientation**: `Orientation`
+
+Represents the sensor's orientation relative to the phone.
+For most phones this will be landscape, as Camera sensors are usually rotated by 90 degrees (i.e. width and height are flipped).
+
+#### Defined in
+
+[CameraDevice.ts:234](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L234)
+
+___
+
+### supportsDepthCapture
+
+• **supportsDepthCapture**: `boolean`
+
+Whether this camera supports taking photos with depth data.
+
+**! Work in Progress !**
+
+#### Defined in
+
+[CameraDevice.ts:213](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L213)
+
+___
+
+### supportsFocus
+
+• **supportsFocus**: `boolean`
+
+Specifies whether this device supports focusing ([`Camera.focus(...)`](../classes/Camera.md#focus))
+
+#### Defined in
+
+[CameraDevice.ts:223](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L223)
+
+___
+
+### supportsLowLightBoost
+
+• **supportsLowLightBoost**: `boolean`
+
+Whether this camera device supports low light boost.
+
+#### Defined in
+
+[CameraDevice.ts:207](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L207)
+
+___
+
+### supportsRawCapture
+
+• **supportsRawCapture**: `boolean`
+
+Whether this camera supports taking photos in RAW format
+
+**! Work in Progress !**
+
+#### Defined in
+
+[CameraDevice.ts:219](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L219)
diff --git a/docs/docs/api/interfaces/CameraDeviceFormat.md b/docs/docs/api/interfaces/CameraDeviceFormat.md
new file mode 100644
index 00000000..41b1f4d9
--- /dev/null
+++ b/docs/docs/api/interfaces/CameraDeviceFormat.md
@@ -0,0 +1,189 @@
+---
+id: "CameraDeviceFormat"
+title: "CameraDeviceFormat"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+A Camera Device's video format. Do not create instances of this type yourself, only use [`Camera.getAvailableCameraDevices()`](../classes/Camera.md#getavailablecameradevices).
+
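+**`Example`**
+
+A minimal sketch of reading the formats of a device, assuming `device` was obtained via `Camera.getAvailableCameraDevices()`:
+
+```ts
+const formats = device.formats
+console.log(`${device.name} exposes ${formats.length} formats`)
+```
+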
+## Properties
+
+### autoFocusSystem
+
+• **autoFocusSystem**: [`AutoFocusSystem`](../#autofocussystem)
+
+Specifies this format's auto focus system.
+
+#### Defined in
+
+[CameraDevice.ts:121](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L121)
+
+___
+
+### fieldOfView
+
+• **fieldOfView**: `number`
+
+The video field of view in degrees
+
+#### Defined in
+
+[CameraDevice.ts:97](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L97)
+
+___
+
+### maxFps
+
+• **maxFps**: `number`
+
+The maximum frame rate this Format is able to run at. High resolution formats often run at lower frame rates.
+
+#### Defined in
+
+[CameraDevice.ts:117](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L117)
+
+___
+
+### maxISO
+
+• **maxISO**: `number`
+
+Maximum supported ISO value
+
+#### Defined in
+
+[CameraDevice.ts:89](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L89)
+
+___
+
+### maxZoom
+
+• **maxZoom**: `number`
+
+The maximum zoom factor (e.g. `128`)
+
+#### Defined in
+
+[CameraDevice.ts:101](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L101)
+
+___
+
+### minFps
+
+• **minFps**: `number`
+
+The minimum frame rate this Format needs to run at. High resolution formats often run at lower frame rates.
+
+#### Defined in
+
+[CameraDevice.ts:113](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L113)
+
+___
+
+### minISO
+
+• **minISO**: `number`
+
+Minimum supported ISO value
+
+#### Defined in
+
+[CameraDevice.ts:93](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L93)
+
+___
+
+### photoHeight
+
+• **photoHeight**: `number`
+
+The height of the highest resolution a still image (photo) can be produced in
+
+#### Defined in
+
+[CameraDevice.ts:73](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L73)
+
+___
+
+### photoWidth
+
+• **photoWidth**: `number`
+
+The width of the highest resolution a still image (photo) can be produced in
+
+#### Defined in
+
+[CameraDevice.ts:77](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L77)
+
+___
+
+### pixelFormats
+
+• **pixelFormats**: `PixelFormat`[]
+
+Specifies this format's supported pixel-formats.
+In most cases, this is `['native', 'yuv']`.
+
+#### Defined in
+
+[CameraDevice.ts:130](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L130)
+
+___
+
+### supportsPhotoHDR
+
+• **supportsPhotoHDR**: `boolean`
+
+Specifies whether this format supports HDR mode for photo capture
+
+#### Defined in
+
+[CameraDevice.ts:109](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L109)
+
+___
+
+### supportsVideoHDR
+
+• **supportsVideoHDR**: `boolean`
+
+Specifies whether this format supports HDR mode for video capture
+
+#### Defined in
+
+[CameraDevice.ts:105](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L105)
+
+___
+
+### videoHeight
+
+• **videoHeight**: `number`
+
+The video resolution's height
+
+#### Defined in
+
+[CameraDevice.ts:81](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L81)
+
+___
+
+### videoStabilizationModes
+
+• **videoStabilizationModes**: [`VideoStabilizationMode`](../#videostabilizationmode)[]
+
+All supported video stabilization modes
+
+#### Defined in
+
+[CameraDevice.ts:125](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L125)
+
+___
+
+### videoWidth
+
+• **videoWidth**: `number`
+
+The video resolution's width
+
+#### Defined in
+
+[CameraDevice.ts:85](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraDevice.ts#L85)
diff --git a/docs/docs/api/interfaces/CameraProps.md b/docs/docs/api/interfaces/CameraProps.md
new file mode 100644
index 00000000..464e75dd
--- /dev/null
+++ b/docs/docs/api/interfaces/CameraProps.md
@@ -0,0 +1,406 @@
+---
+id: "CameraProps"
+title: "CameraProps"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+## Hierarchy
+
+- `ViewProps`
+
+ ↳ **`CameraProps`**
+
+## Properties
+
+### audio
+
+• `Optional` **audio**: `boolean`
+
+Enables **audio capture** for video recordings (see ["Recording Videos"](https://react-native-vision-camera.com/docs/guides/capturing/#recording-videos))
+
+#### Defined in
+
+[CameraProps.ts:61](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L61)
+
+___
+
+### device
+
+• **device**: [`CameraDevice`](CameraDevice.md)
+
+The Camera Device to use.
+
+See the [Camera Devices](https://react-native-vision-camera.com/docs/guides/devices) section in the documentation for more information about Camera Devices.
+
+**`Example`**
+
+```tsx
+const devices = useCameraDevices('wide-angle-camera')
+const device = devices.back
+
+return (
+  <Camera device={device} isActive={true} />
+)
+```
+
+#### Defined in
+
+[CameraProps.ts:37](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L37)
+
+___
+
+### enableDepthData
+
+• `Optional` **enableDepthData**: `boolean`
+
+Also captures data from depth-perception sensors. (e.g. disparity maps)
+
+**`Default`**
+
+false
+
+#### Defined in
+
+[CameraProps.ts:145](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L145)
+
+___
+
+### enableFpsGraph
+
+• `Optional` **enableFpsGraph**: `boolean`
+
+If `true`, show a debug view to display the FPS of the Camera session.
+This is useful for debugging your Frame Processor's speed.
+
+**`Default`**
+
+false
+
+#### Defined in
+
+[CameraProps.ts:173](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L173)
+
+___
+
+### enableHighQualityPhotos
+
+• `Optional` **enableHighQualityPhotos**: `boolean`
+
+Indicates whether the Camera should prepare the photo pipeline to provide maximum quality photos.
+
+This enables:
+* High Resolution Capture ([`isHighResolutionCaptureEnabled`](https://developer.apple.com/documentation/avfoundation/avcapturephotooutput/1648721-ishighresolutioncaptureenabled))
+* Virtual Device fusion for greater detail ([`isVirtualDeviceConstituentPhotoDeliveryEnabled`](https://developer.apple.com/documentation/avfoundation/avcapturephotooutput/3192189-isvirtualdeviceconstituentphotod))
+* Dual Device fusion for greater detail ([`isDualCameraDualPhotoDeliveryEnabled`](https://developer.apple.com/documentation/avfoundation/avcapturephotosettings/2873917-isdualcameradualphotodeliveryena))
+* Sets the maximum quality prioritization to `.quality` ([`maxPhotoQualityPrioritization`](https://developer.apple.com/documentation/avfoundation/avcapturephotooutput/3182995-maxphotoqualityprioritization))
+
+**`Default`**
+
+false
+
+#### Defined in
+
+[CameraProps.ts:166](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L166)
+
+___
+
+### enablePortraitEffectsMatteDelivery
+
+• `Optional` **enablePortraitEffectsMatteDelivery**: `boolean`
+
+A boolean specifying whether the photo render pipeline is prepared for portrait effects matte delivery.
+
+When enabling this, you must also set `enableDepthData` to `true`.
+
+**`Platform`**
+
+iOS 12.0+
+
+**`Default`**
+
+false
+
+#### Defined in
+
+[CameraProps.ts:154](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L154)
+
+___
+
+### enableZoomGesture
+
+• `Optional` **enableZoomGesture**: `boolean`
+
+Enables or disables the native pinch to zoom gesture.
+
+If you want to implement a custom zoom gesture, see [the Zooming with Reanimated documentation](https://react-native-vision-camera.com/docs/guides/animated).
+
+**`Default`**
+
+false
+
+#### Defined in
+
+[CameraProps.ts:106](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L106)
+
+___
+
+### format
+
+• `Optional` **format**: [`CameraDeviceFormat`](CameraDeviceFormat.md)
+
+Selects a given format. By default, the best matching format is chosen.
+
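+**`Example`**
+
+A minimal sketch of picking the format with the highest photo resolution (assumes `device.formats` is available and non-empty; `props` is a placeholder for your other Camera props):
+
+```tsx
+const format = device.formats.reduce((best, f) =>
+  f.photoWidth * f.photoHeight > best.photoWidth * best.photoHeight ? f : best
+)
+
+return <Camera {...props} device={device} format={format} />
+```
+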
+#### Defined in
+
+[CameraProps.ts:113](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L113)
+
+___
+
+### fps
+
+• `Optional` **fps**: `number`
+
+Specify the frames per second this camera should use. Make sure the given `format` includes a frame rate range with the given `fps`.
+
+Requires `format` to be set.
+
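+**`Example`**
+
+A minimal sketch of running at 60 FPS only when the selected format supports it (assumes `format` was already chosen):
+
+```tsx
+// fall back to 30 FPS if the format can't reach 60
+const fps = format.maxFps >= 60 ? 60 : 30
+
+return <Camera {...props} device={device} format={format} fps={fps} />
+```
+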
+#### Defined in
+
+[CameraProps.ts:119](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L119)
+
+___
+
+### frameProcessor
+
+• `Optional` **frameProcessor**: [`FrameProcessor`](../#frameprocessor)
+
+A worklet which will be called for every frame the Camera "sees".
+
+> See [the Frame Processors documentation](https://mrousavy.github.io/react-native-vision-camera/docs/guides/frame-processors) for more information
+
+**`Example`**
+
+```tsx
+const frameProcessor = useFrameProcessor((frame) => {
+ 'worklet'
+ const qrCodes = scanQRCodes(frame)
+ console.log(`Detected QR Codes: ${qrCodes}`)
+}, [])
+
+return <Camera {...props} frameProcessor={frameProcessor} />
+```
+
+#### Defined in
+
+[CameraProps.ts:204](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L204)
+
+___
+
+### hdr
+
+• `Optional` **hdr**: `boolean`
+
+Enables or disables HDR on this camera device. Make sure the given `format` supports HDR mode.
+
+Requires `format` to be set.
+
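+**`Example`**
+
+A minimal sketch of only enabling HDR when the chosen format supports it (assumes `format` was already chosen):
+
+```tsx
+return <Camera {...props} device={device} format={format} hdr={format.supportsVideoHDR} />
+```
+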
+#### Defined in
+
+[CameraProps.ts:125](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L125)
+
+___
+
+### isActive
+
+• **isActive**: `boolean`
+
+Whether the Camera should actively stream video frames, or not. See the [documentation about the `isActive` prop](https://react-native-vision-camera.com/docs/guides/lifecycle#the-isactive-prop) for more information.
+
+This can be compared to a Video component, where `isActive` specifies whether the video is paused or not.
+
+> Note: If you fully unmount the `<Camera>` component instead of using `isActive={false}`, the Camera will take a bit longer to start again. In return, it will use fewer resources since the Camera will be completely destroyed when unmounted.
+
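+**`Example`**
+
+A minimal sketch of pausing the Camera when the screen loses focus; `useIsFocused` from `@react-navigation/native` is an assumption about your navigation setup:
+
+```tsx
+const isFocused = useIsFocused()
+
+return <Camera {...props} device={device} isActive={isFocused} />
+```
+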
+#### Defined in
+
+[CameraProps.ts:45](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L45)
+
+___
+
+### lowLightBoost
+
+• `Optional` **lowLightBoost**: `boolean`
+
+Enables or disables low-light boost on this camera device. Make sure the given `format` supports low-light boost.
+
+Requires `format` to be set.
+
+#### Defined in
+
+[CameraProps.ts:131](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L131)
+
+___
+
+### onError
+
+• `Optional` **onError**: (`error`: [`CameraRuntimeError`](../classes/CameraRuntimeError.md)) => `void`
+
+#### Type declaration
+
+▸ (`error`): `void`
+
+Called when any kind of runtime error occurred.
+
+##### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `error` | [`CameraRuntimeError`](../classes/CameraRuntimeError.md) |
+
+##### Returns
+
+`void`
+
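+**`Example`**
+
+A minimal sketch of logging runtime errors; the handler body is illustrative only:
+
+```tsx
+const onError = useCallback((error: CameraRuntimeError) => {
+  console.error(`${error.code}: ${error.message}`)
+}, [])
+
+return <Camera {...props} device={device} isActive={true} onError={onError} />
+```
+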
+#### Defined in
+
+[CameraProps.ts:183](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L183)
+
+___
+
+### onInitialized
+
+• `Optional` **onInitialized**: () => `void`
+
+#### Type declaration
+
+▸ (): `void`
+
+Called when the camera was successfully initialized.
+
+##### Returns
+
+`void`
+
+#### Defined in
+
+[CameraProps.ts:187](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L187)
+
+___
+
+### orientation
+
+• `Optional` **orientation**: `Orientation`
+
+Represents the orientation of all Camera Outputs (Photo, Video, and Frame Processor). If this value is not set, the device orientation is used.
+
+#### Defined in
+
+[CameraProps.ts:177](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L177)
+
+___
+
+### photo
+
+• `Optional` **photo**: `boolean`
+
+Enables **photo capture** with the `takePhoto` function (see ["Taking Photos"](https://react-native-vision-camera.com/docs/guides/capturing#taking-photos))
+
+#### Defined in
+
+[CameraProps.ts:51](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L51)
+
+___
+
+### pixelFormat
+
+• `Optional` **pixelFormat**: ``"yuv"`` \| ``"rgb"`` \| ``"native"``
+
+Specifies the pixel format for the video pipeline.
+
+Frames from a [Frame Processor](https://mrousavy.github.io/react-native-vision-camera/docs/guides/frame-processors) will be streamed in the pixel format specified here.
+
+While `native` and `yuv` are the most efficient formats, some ML models (such as MLKit Barcode detection) require input Frames to be in RGB colorspace, otherwise they just output nonsense.
+
+- `native`: The hardware native GPU buffer format. This is the most efficient format. (`PRIVATE` on Android, sometimes YUV on iOS)
+- `yuv`: The YUV (Y'CbCr 4:2:0 or NV21, 8-bit) format, either video- or full-range, depending on hardware capabilities. This is the second most efficient format.
+- `rgb`: The RGB (RGB, RGBA or ABGRA, 8-bit) format. This is the least efficient format and requires explicit conversion.
+
+**`Default`**
+
+`native`
+
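+**`Example`**
+
+A minimal sketch of requesting RGB frames for an ML model; `detectObjects` is a hypothetical frame-processor plugin:
+
+```tsx
+const frameProcessor = useFrameProcessor((frame) => {
+  'worklet'
+  const objects = detectObjects(frame) // assumed plugin that expects RGB input
+  console.log(objects)
+}, [])
+
+return <Camera {...props} device={device} pixelFormat="rgb" frameProcessor={frameProcessor} />
+```
+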
+#### Defined in
+
+[CameraProps.ts:75](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L75)
+
+___
+
+### torch
+
+• `Optional` **torch**: ``"off"`` \| ``"on"``
+
+Set the current torch mode.
+
+Note: The torch is only available on `"back"` cameras, and isn't supported by every phone.
+
+**`Default`**
+
+"off"
+
+#### Defined in
+
+[CameraProps.ts:86](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L86)
+
+___
+
+### video
+
+• `Optional` **video**: `boolean`
+
+Enables **video capture** with the `startRecording` function (see ["Recording Videos"](https://react-native-vision-camera.com/docs/guides/capturing/#recording-videos))
+
+Note: If both the `photo` and `video` properties are enabled at the same time and the device is running at a `hardwareLevel` of `'legacy'` or `'limited'`, VisionCamera _might_ use a lower resolution for video capture due to hardware constraints.
+
+#### Defined in
+
+[CameraProps.ts:57](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L57)
+
+___
+
+### videoStabilizationMode
+
+• `Optional` **videoStabilizationMode**: [`VideoStabilizationMode`](../#videostabilizationmode)
+
+Specifies the video stabilization mode to use.
+
+Requires a `format` to be set that contains the given `videoStabilizationMode`.
+
+#### Defined in
+
+[CameraProps.ts:137](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L137)
+
+___
+
+### zoom
+
+• `Optional` **zoom**: `number`
+
+Specifies the zoom factor of the current camera, as a scale factor.
+
+This value ranges from `minZoom` (e.g. `1`) to `maxZoom` (e.g. `128`). It is recommended to set this value
+to the CameraDevice's `neutralZoom` by default and let the user zoom out to the fish-eye (ultra-wide) camera
+on demand (if available).
+
+**Note:** Linearly increasing this value always appears logarithmic to the user.
+
+**`Default`**
+
+1.0
+
+#### Defined in
+
+[CameraProps.ts:98](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraProps.ts#L98)
diff --git a/docs/docs/api/interfaces/ErrorWithCause.md b/docs/docs/api/interfaces/ErrorWithCause.md
new file mode 100644
index 00000000..1078d022
--- /dev/null
+++ b/docs/docs/api/interfaces/ErrorWithCause.md
@@ -0,0 +1,98 @@
+---
+id: "ErrorWithCause"
+title: "ErrorWithCause"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+Represents a JSON-style error cause. This contains native `NSError`/`Throwable` information, and can have recursive [`.cause`](ErrorWithCause.md#cause) properties until the ultimate cause has been found.
+
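+**`Example`**
+
+A minimal sketch of walking the `.cause` chain down to the ultimate cause:
+
+```ts
+function findUltimateCause(error: ErrorWithCause): ErrorWithCause {
+  let current = error
+  // follow nested causes until none remain
+  while (current.cause != null) current = current.cause
+  return current
+}
+```
+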
+## Properties
+
+### cause
+
+• `Optional` **cause**: [`ErrorWithCause`](ErrorWithCause.md)
+
+Optional additional cause for nested errors
+
+* iOS: N/A
+* Android: `Throwable.cause`
+
+#### Defined in
+
+[CameraError.ts:105](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L105)
+
+___
+
+### code
+
+• `Optional` **code**: `number`
+
+The native error's code.
+
+* iOS: `NSError.code`
+* Android: N/A
+
+#### Defined in
+
+[CameraError.ts:70](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L70)
+
+___
+
+### details
+
+• `Optional` **details**: `Record`<`string`, `unknown`\>
+
+Optional additional details
+
+* iOS: `NSError.userInfo`
+* Android: N/A
+
+#### Defined in
+
+[CameraError.ts:91](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L91)
+
+___
+
+### domain
+
+• `Optional` **domain**: `string`
+
+The native error's domain.
+
+* iOS: `NSError.domain`
+* Android: N/A
+
+#### Defined in
+
+[CameraError.ts:77](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L77)
+
+___
+
+### message
+
+• **message**: `string`
+
+The native error description
+
+* iOS: `NSError.message`
+* Android: `Throwable.message`
+
+#### Defined in
+
+[CameraError.ts:84](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L84)
+
+___
+
+### stacktrace
+
+• `Optional` **stacktrace**: `string`
+
+Optional Java stacktrace
+
+* iOS: N/A
+* Android: `Throwable.stacktrace.toString()`
+
+#### Defined in
+
+[CameraError.ts:98](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/CameraError.ts#L98)
diff --git a/docs/docs/api/interfaces/Frame.md b/docs/docs/api/interfaces/Frame.md
new file mode 100644
index 00000000..eb1f3549
--- /dev/null
+++ b/docs/docs/api/interfaces/Frame.md
@@ -0,0 +1,118 @@
+---
+id: "Frame"
+title: "Frame"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+A single frame, as seen by the camera.
+
+## Properties
+
+### bytesPerRow
+
+• **bytesPerRow**: `number`
+
+Returns the amount of bytes per row.
+
+#### Defined in
+
+[Frame.ts:20](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/Frame.ts#L20)
+
+___
+
+### height
+
+• **height**: `number`
+
+Returns the height of the frame, in pixels.
+
+#### Defined in
+
+[Frame.ts:16](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/Frame.ts#L16)
+
+___
+
+### isValid
+
+• **isValid**: `boolean`
+
+Whether the underlying buffer is still valid or not. The buffer will be released after the frame processor returns, or `close()` is called.
+
+#### Defined in
+
+[Frame.ts:8](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/Frame.ts#L8)
+
+___
+
+### planesCount
+
+• **planesCount**: `number`
+
+Returns the number of planes this frame contains.
+
+#### Defined in
+
+[Frame.ts:24](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/Frame.ts#L24)
+
+___
+
+### width
+
+• **width**: `number`
+
+Returns the width of the frame, in pixels.
+
+#### Defined in
+
+[Frame.ts:12](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/Frame.ts#L12)
+
+## Methods
+
+### close
+
+▸ **close**(): `void`
+
+Closes and disposes the Frame.
+Only close frames that you have created yourself, e.g. by copying the frame you receive in a frame processor.
+
+**`Example`**
+
+```ts
+const frameProcessor = useFrameProcessor((frame) => {
+ const smallerCopy = resize(frame, 480, 270)
+ // run AI ...
+ smallerCopy.close()
+ // don't close `frame`!
+})
+```
+
+#### Returns
+
+`void`
+
+#### Defined in
+
+[Frame.ts:48](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/Frame.ts#L48)
+
+___
+
+### toString
+
+▸ **toString**(): `string`
+
+Returns a string representation of the frame.
+
+**`Example`**
+
+```ts
+console.log(frame.toString()) // -> "3840 x 2160 Frame"
+```
+
+#### Returns
+
+`string`
+
+#### Defined in
+
+[Frame.ts:33](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/Frame.ts#L33)
diff --git a/docs/docs/api/interfaces/FrameProcessorPerformanceSuggestion.md b/docs/docs/api/interfaces/FrameProcessorPerformanceSuggestion.md
new file mode 100644
index 00000000..33464ed3
--- /dev/null
+++ b/docs/docs/api/interfaces/FrameProcessorPerformanceSuggestion.md
@@ -0,0 +1,26 @@
+---
+id: "FrameProcessorPerformanceSuggestion"
+title: "FrameProcessorPerformanceSuggestion"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+## Properties
+
+### suggestedFrameProcessorFps
+
+• **suggestedFrameProcessorFps**: `number`
+
+#### Defined in
+
+[CameraProps.ts:9](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/CameraProps.ts#L9)
+
+___
+
+### type
+
+• **type**: ``"can-use-higher-fps"`` \| ``"should-use-lower-fps"``
+
+#### Defined in
+
+[CameraProps.ts:8](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/CameraProps.ts#L8)
diff --git a/docs/docs/api/interfaces/FrameRateRange.md b/docs/docs/api/interfaces/FrameRateRange.md
new file mode 100644
index 00000000..89ee22ef
--- /dev/null
+++ b/docs/docs/api/interfaces/FrameRateRange.md
@@ -0,0 +1,26 @@
+---
+id: "FrameRateRange"
+title: "FrameRateRange"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+## Properties
+
+### maxFrameRate
+
+• **maxFrameRate**: `number`
+
+#### Defined in
+
+[CameraDevice.ts:104](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/CameraDevice.ts#L104)
+
+___
+
+### minFrameRate
+
+• **minFrameRate**: `number`
+
+#### Defined in
+
+[CameraDevice.ts:103](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/CameraDevice.ts#L103)
diff --git a/docs/docs/api/interfaces/PhotoFile.md b/docs/docs/api/interfaces/PhotoFile.md
new file mode 100644
index 00000000..e88bf544
--- /dev/null
+++ b/docs/docs/api/interfaces/PhotoFile.md
@@ -0,0 +1,178 @@
+---
+id: "PhotoFile"
+title: "PhotoFile"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+Represents a Photo taken by the Camera written to the local filesystem.
+
+See [`Camera.takePhoto()`](../classes/Camera.md#takephoto)
+
+## Hierarchy
+
+- [`TemporaryFile`](TemporaryFile.md)
+
+ ↳ **`PhotoFile`**
+
+## Properties
+
+### height
+
+• **height**: `number`
+
+The height of the photo, in pixels.
+
+#### Defined in
+
+[PhotoFile.ts:62](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L62)
+
+___
+
+### isMirrored
+
+• **isMirrored**: `boolean`
+
+Whether this photo is mirrored (selfies) or not.
+
+#### Defined in
+
+[PhotoFile.ts:76](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L76)
+
+___
+
+### isRawPhoto
+
+• **isRawPhoto**: `boolean`
+
+Whether this photo is in RAW format or not.
+
+#### Defined in
+
+[PhotoFile.ts:66](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L66)
+
+___
+
+### metadata
+
+• `Optional` **metadata**: `Object`
+
+Metadata information describing the captured image. (iOS only)
+
+**`See`**
+
+[AVCapturePhoto.metadata](https://developer.apple.com/documentation/avfoundation/avcapturephoto/2873982-metadata)
+
+**`Platform`**
+
+iOS
+
+#### Type declaration
+
+| Name | Type | Description |
+| :------ | :------ | :------ |
+| `DPIHeight` | `number` | **`Platform`** iOS |
+| `DPIWidth` | `number` | **`Platform`** iOS |
+| `Orientation` | `number` | Orientation of the EXIF Image. * 1 = 0 degrees: the correct orientation, no adjustment is required. * 2 = 0 degrees, mirrored: image has been flipped back-to-front. * 3 = 180 degrees: image is upside down. * 4 = 180 degrees, mirrored: image has been flipped back-to-front and is upside down. * 5 = 90 degrees: image has been flipped back-to-front and is on its side. * 6 = 90 degrees, mirrored: image is on its side. * 7 = 270 degrees: image has been flipped back-to-front and is on its far side. * 8 = 270 degrees, mirrored: image is on its far side. |
+| `{Exif}` | { `ApertureValue`: `number` ; `BrightnessValue`: `number` ; `ColorSpace`: `number` ; `DateTimeDigitized`: `string` ; `DateTimeOriginal`: `string` ; `ExifVersion`: `string` ; `ExposureBiasValue`: `number` ; `ExposureMode`: `number` ; `ExposureProgram`: `number` ; `ExposureTime`: `number` ; `FNumber`: `number` ; `Flash`: `number` ; `FocalLenIn35mmFilm`: `number` ; `FocalLength`: `number` ; `ISOSpeedRatings`: `number`[] ; `LensMake`: `string` ; `LensModel`: `string` ; `LensSpecification`: `number`[] ; `MeteringMode`: `number` ; `OffsetTime`: `string` ; `OffsetTimeDigitized`: `string` ; `OffsetTimeOriginal`: `string` ; `PixelXDimension`: `number` ; `PixelYDimension`: `number` ; `SceneType`: `number` ; `SensingMethod`: `number` ; `ShutterSpeedValue`: `number` ; `SubjectArea`: `number`[] ; `SubsecTimeDigitized`: `string` ; `SubsecTimeOriginal`: `string` ; `WhiteBalance`: `number` } | - |
+| `{Exif}.ApertureValue` | `number` | - |
+| `{Exif}.BrightnessValue` | `number` | - |
+| `{Exif}.ColorSpace` | `number` | - |
+| `{Exif}.DateTimeDigitized` | `string` | - |
+| `{Exif}.DateTimeOriginal` | `string` | - |
+| `{Exif}.ExifVersion` | `string` | - |
+| `{Exif}.ExposureBiasValue` | `number` | - |
+| `{Exif}.ExposureMode` | `number` | - |
+| `{Exif}.ExposureProgram` | `number` | - |
+| `{Exif}.ExposureTime` | `number` | - |
+| `{Exif}.FNumber` | `number` | - |
+| `{Exif}.Flash` | `number` | - |
+| `{Exif}.FocalLenIn35mmFilm` | `number` | - |
+| `{Exif}.FocalLength` | `number` | - |
+| `{Exif}.ISOSpeedRatings` | `number`[] | - |
+| `{Exif}.LensMake` | `string` | - |
+| `{Exif}.LensModel` | `string` | - |
+| `{Exif}.LensSpecification` | `number`[] | - |
+| `{Exif}.MeteringMode` | `number` | - |
+| `{Exif}.OffsetTime` | `string` | - |
+| `{Exif}.OffsetTimeDigitized` | `string` | - |
+| `{Exif}.OffsetTimeOriginal` | `string` | - |
+| `{Exif}.PixelXDimension` | `number` | - |
+| `{Exif}.PixelYDimension` | `number` | - |
+| `{Exif}.SceneType` | `number` | - |
+| `{Exif}.SensingMethod` | `number` | - |
+| `{Exif}.ShutterSpeedValue` | `number` | - |
+| `{Exif}.SubjectArea` | `number`[] | - |
+| `{Exif}.SubsecTimeDigitized` | `string` | - |
+| `{Exif}.SubsecTimeOriginal` | `string` | - |
+| `{Exif}.WhiteBalance` | `number` | - |
+| `{MakerApple}?` | `Record`<`string`, `unknown`\> | Represents any data Apple cameras write to the metadata **`Platform`** iOS |
+| `{TIFF}` | { `DateTime`: `string` ; `HostComputer?`: `string` ; `Make`: `string` ; `Model`: `string` ; `ResolutionUnit`: `number` ; `Software`: `string` ; `XResolution`: `number` ; `YResolution`: `number` } | - |
+| `{TIFF}.DateTime` | `string` | - |
+| `{TIFF}.HostComputer?` | `string` | **`Platform`** iOS |
+| `{TIFF}.Make` | `string` | - |
+| `{TIFF}.Model` | `string` | - |
+| `{TIFF}.ResolutionUnit` | `number` | - |
+| `{TIFF}.Software` | `string` | - |
+| `{TIFF}.XResolution` | `number` | - |
+| `{TIFF}.YResolution` | `number` | - |
+
+#### Defined in
+
+[PhotoFile.ts:85](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L85)
+
+___
+
+### orientation
+
+• **orientation**: `Orientation`
+
+Display orientation of the photo, relative to the Camera's sensor orientation.
+
+Note that Camera sensors are landscape, so e.g. "portrait" photos will have a value of "landscape-left", etc.
+
+#### Defined in
+
+[PhotoFile.ts:72](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L72)
+
+___
+
+### path
+
+• **path**: `string`
+
+The path of the file.
+
+* **Note:** If you want to consume this file (e.g. for displaying it in an `<Image>` component), you might have to add the `file://` prefix.
+
+* **Note:** This file might get deleted once the app closes because it lives in the temp directory.
+
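+**`Example`**
+
+A minimal sketch of displaying the photo with React Native's `<Image>` component; the `camera` ref and the inline style are assumptions for illustration:
+
+```tsx
+const photo = await camera.current.takePhoto()
+const source = { uri: `file://${photo.path}` }
+
+return <Image source={source} style={{ width: 300, height: 400 }} />
+```
+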
+#### Inherited from
+
+[TemporaryFile](TemporaryFile.md).[path](TemporaryFile.md#path)
+
+#### Defined in
+
+[TemporaryFile.ts:12](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/TemporaryFile.ts#L12)
+
+___
+
+### thumbnail
+
+• `Optional` **thumbnail**: `Record`<`string`, `unknown`\>
+
+#### Defined in
+
+[PhotoFile.ts:77](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L77)
+
+___
+
+### width
+
+• **width**: `number`
+
+The width of the photo, in pixels.
+
+#### Defined in
+
+[PhotoFile.ts:58](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L58)
diff --git a/docs/docs/api/interfaces/Point.md b/docs/docs/api/interfaces/Point.md
new file mode 100644
index 00000000..f2ed6d42
--- /dev/null
+++ b/docs/docs/api/interfaces/Point.md
@@ -0,0 +1,32 @@
+---
+id: "Point"
+title: "Point"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+Represents a Point in a 2 dimensional coordinate system.
+
+## Properties
+
+### x
+
+• **x**: `number`
+
+The X coordinate of this Point. (double)
+
+#### Defined in
+
+[Point.ts:8](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Point.ts#L8)
+
+___
+
+### y
+
+• **y**: `number`
+
+The Y coordinate of this Point. (double)
+
+#### Defined in
+
+[Point.ts:12](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/Point.ts#L12)
diff --git a/docs/docs/api/interfaces/RecordVideoOptions.md b/docs/docs/api/interfaces/RecordVideoOptions.md
new file mode 100644
index 00000000..2747daec
--- /dev/null
+++ b/docs/docs/api/interfaces/RecordVideoOptions.md
@@ -0,0 +1,96 @@
+---
+id: "RecordVideoOptions"
+title: "RecordVideoOptions"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+## Properties
+
+### fileType
+
+• `Optional` **fileType**: ``"mov"`` \| ``"mp4"``
+
+Specifies the output file type to record videos into.
+
+#### Defined in
+
+[VideoFile.ts:12](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/VideoFile.ts#L12)
+
+___
+
+### flash
+
+• `Optional` **flash**: ``"off"`` \| ``"auto"`` \| ``"on"``
+
+Set the video flash mode. Natively, this just enables the torch while recording.
+
+#### Defined in
+
+[VideoFile.ts:8](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/VideoFile.ts#L8)
+
+___
+
+### onRecordingError
+
+• **onRecordingError**: (`error`: [`CameraCaptureError`](../classes/CameraCaptureError.md)) => `void`
+
+#### Type declaration
+
+▸ (`error`): `void`
+
+Called when there was an unexpected runtime error while recording the video.
+
+##### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `error` | [`CameraCaptureError`](../classes/CameraCaptureError.md) |
+
+##### Returns
+
+`void`
+
+#### Defined in
+
+[VideoFile.ts:16](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/VideoFile.ts#L16)
+
+___
+
+### onRecordingFinished
+
+• **onRecordingFinished**: (`video`: [`VideoFile`](VideoFile.md)) => `void`
+
+#### Type declaration
+
+▸ (`video`): `void`
+
+Called when the recording has been successfully saved to file.
+
+##### Parameters
+
+| Name | Type |
+| :------ | :------ |
+| `video` | [`VideoFile`](VideoFile.md) |
+
+##### Returns
+
+`void`
+
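+**`Example`**
+
+A minimal sketch of starting a recording with both callbacks (assumes a `camera` ref to the `<Camera>` component):
+
+```ts
+camera.current.startRecording({
+  flash: 'off',
+  onRecordingFinished: (video) => console.log(`Saved ${video.duration}s video to ${video.path}`),
+  onRecordingError: (error) => console.error(error),
+})
+```
+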
+#### Defined in
+
+[VideoFile.ts:20](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/VideoFile.ts#L20)
+
+___
+
+### videoCodec
+
+• `Optional` **videoCodec**: ``"h265"``
+
+The Video Codec to record in.
+- `h264`: Widely supported, but might be less efficient, especially with larger sizes or framerates.
+- `h265`: The HEVC (High Efficiency Video Coding) codec for more efficient video recordings.
+
+#### Defined in
+
+[VideoFile.ts:26](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/VideoFile.ts#L26)
diff --git a/docs/docs/api/interfaces/TakePhotoOptions.md b/docs/docs/api/interfaces/TakePhotoOptions.md
new file mode 100644
index 00000000..2eca26ae
--- /dev/null
+++ b/docs/docs/api/interfaces/TakePhotoOptions.md
@@ -0,0 +1,111 @@
+---
+id: "TakePhotoOptions"
+title: "TakePhotoOptions"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+## Properties
+
+### enableAutoDistortionCorrection
+
+• `Optional` **enableAutoDistortionCorrection**: `boolean`
+
+Specifies whether the photo output should use content aware distortion correction on this photo request.
+For example, the algorithm may not apply correction to faces in the center of a photo, but may apply it to faces near the photo’s edges.
+
+**`Platform`**
+
+iOS
+
+**`Default`**
+
+false
+
+#### Defined in
+
+[PhotoFile.ts:40](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L40)
+
+___
+
+### enableAutoRedEyeReduction
+
+• `Optional` **enableAutoRedEyeReduction**: `boolean`
+
+Specifies whether red-eye reduction should be applied automatically on flash captures.
+
+**`Default`**
+
+false
+
+#### Defined in
+
+[PhotoFile.ts:26](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L26)
+
+___
+
+### enableAutoStabilization
+
+• `Optional` **enableAutoStabilization**: `boolean`
+
+Indicates whether still image stabilization will be employed when capturing the photo
+
+**`Default`**
+
+false
+
+#### Defined in
+
+[PhotoFile.ts:32](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L32)
+
+___
+
+### enableShutterSound
+
+• `Optional` **enableShutterSound**: `boolean`
+
+Whether to play the default shutter "click" sound when taking a picture or not.
+
+**`Default`**
+
+true
+
+#### Defined in
+
+[PhotoFile.ts:46](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L46)
+
+___
+
+### flash
+
+• `Optional` **flash**: ``"off"`` \| ``"auto"`` \| ``"on"``
+
+Whether the Flash should be enabled or disabled
+
+**`Default`**
+
+"auto"
+
+#### Defined in
+
+[PhotoFile.ts:20](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L20)
+
+___
+
+### qualityPrioritization
+
+• `Optional` **qualityPrioritization**: ``"quality"`` \| ``"balanced"`` \| ``"speed"``
+
+Indicates how photo quality should be prioritized against speed.
+
+* `"quality"` Indicates that photo quality is paramount, even at the expense of shot-to-shot time
+* `"balanced"` Indicates that photo quality and speed of delivery are balanced in priority
+* `"speed"` Indicates that speed of photo delivery is most important, even at the expense of quality
+
+**`Default`**
+
+"balanced"
+
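+**`Example`**
+
+A minimal sketch of taking a speed-prioritized photo with flash (assumes a `camera` ref to the `<Camera>` component):
+
+```ts
+const photo = await camera.current.takePhoto({
+  flash: 'on',
+  qualityPrioritization: 'speed',
+})
+```
+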
+#### Defined in
+
+[PhotoFile.ts:14](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/PhotoFile.ts#L14)
diff --git a/docs/docs/api/interfaces/TakeSnapshotOptions.md b/docs/docs/api/interfaces/TakeSnapshotOptions.md
new file mode 100644
index 00000000..42289502
--- /dev/null
+++ b/docs/docs/api/interfaces/TakeSnapshotOptions.md
@@ -0,0 +1,62 @@
+---
+id: "TakeSnapshotOptions"
+title: "TakeSnapshotOptions"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+## Properties
+
+### flash
+
+• `Optional` **flash**: ``"off"`` \| ``"on"``
+
+Whether the Flash should be enabled or disabled
+
+**`Default`**
+
+"off"
+
+#### Defined in
+
+[Snapshot.ts:16](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/Snapshot.ts#L16)
+
+___
+
+### quality
+
+• `Optional` **quality**: `number`
+
+Specifies the quality of the JPEG. (0-100, where 100 means best quality (no compression))
+
+It is recommended to set this to `90` or even `80`, since the user probably won't notice a difference between `90`/`80` and `100`.
+
+**`Default`**
+
+100
+
+#### Defined in
+
+[Snapshot.ts:9](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/Snapshot.ts#L9)
+
+___
+
+### skipMetadata
+
+• `Optional` **skipMetadata**: `boolean`
+
+When set to `true`, metadata reading and mapping will be skipped. ([`metadata`](PhotoFile.md#metadata) will be `null`)
+
+This might result in a faster capture, as metadata reading and mapping requires File IO.
+
+**`Default`**
+
+false
+
+**`Platform`**
+
+Android
+
+#### Defined in
+
+[Snapshot.ts:27](https://github.com/mrousavy/react-native-vision-camera/blob/c2fb5bf1/src/Snapshot.ts#L27)
diff --git a/docs/docs/api/interfaces/TemporaryFile.md b/docs/docs/api/interfaces/TemporaryFile.md
new file mode 100644
index 00000000..40804e81
--- /dev/null
+++ b/docs/docs/api/interfaces/TemporaryFile.md
@@ -0,0 +1,32 @@
+---
+id: "TemporaryFile"
+title: "TemporaryFile"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+Represents a temporary file in the local filesystem.
+
+## Hierarchy
+
+- **`TemporaryFile`**
+
+ ↳ [`PhotoFile`](PhotoFile.md)
+
+ ↳ [`VideoFile`](VideoFile.md)
+
+## Properties
+
+### path
+
+• **path**: `string`
+
+The path of the file.
+
+* **Note:** If you want to consume this file (e.g. for displaying it in an `<Image>` component), you might have to add the `file://` prefix.
+
+* **Note:** This file might get deleted once the app closes because it lives in the temp directory.
+
+#### Defined in
+
+[TemporaryFile.ts:12](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/TemporaryFile.ts#L12)
diff --git a/docs/docs/api/interfaces/VideoFile.md b/docs/docs/api/interfaces/VideoFile.md
new file mode 100644
index 00000000..06b4dc7b
--- /dev/null
+++ b/docs/docs/api/interfaces/VideoFile.md
@@ -0,0 +1,48 @@
+---
+id: "VideoFile"
+title: "VideoFile"
+sidebar_position: 0
+custom_edit_url: null
+---
+
+Represents a Video taken by the Camera written to the local filesystem.
+
+Related: [`Camera.startRecording()`](../classes/Camera.md#startrecording), [`Camera.stopRecording()`](../classes/Camera.md#stoprecording)
+
+## Hierarchy
+
+- [`TemporaryFile`](TemporaryFile.md)
+
+ ↳ **`VideoFile`**
+
+## Properties
+
+### duration
+
+• **duration**: `number`
+
+Represents the duration of the video, in seconds.
+
+#### Defined in
+
+[VideoFile.ts:38](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/VideoFile.ts#L38)
+
+___
+
+### path
+
+• **path**: `string`
+
+The path of the file.
+
+* **Note:** If you want to consume this file (e.g. for displaying it in an `<Image>` component), you might have to add the `file://` prefix.
+
+* **Note:** This file might get deleted once the app closes because it lives in the temp directory.
+
+#### Inherited from
+
+[TemporaryFile](TemporaryFile.md).[path](TemporaryFile.md#path)
+
+#### Defined in
+
+[TemporaryFile.ts:12](https://github.com/mrousavy/react-native-vision-camera/blob/c66550ed/package/src/TemporaryFile.ts#L12)
diff --git a/docs/docs/api/interfaces/_category_.yml b/docs/docs/api/interfaces/_category_.yml
new file mode 100644
index 00000000..43bec88c
--- /dev/null
+++ b/docs/docs/api/interfaces/_category_.yml
@@ -0,0 +1,2 @@
+label: "Interfaces"
+position: 4
\ No newline at end of file
diff --git a/docs/docs/guides/CAPTURING.mdx b/docs/docs/guides/CAPTURING.mdx
index a77fe2ea..ae137d44 100644
--- a/docs/docs/guides/CAPTURING.mdx
+++ b/docs/docs/guides/CAPTURING.mdx
@@ -34,7 +34,6 @@ function App() {
The most important actions are:
* [Taking Photos](#taking-photos)
- - [Taking Snapshots](#taking-snapshots)
* [Recording Videos](#recording-videos)
## Taking Photos
@@ -57,25 +56,6 @@ You can customize capture options such as [automatic red-eye reduction](/docs/ap
This function returns a [`PhotoFile`](/docs/api/interfaces/PhotoFile) which contains a [`path`](/docs/api/interfaces/PhotoFile#path) property you can display in your App using an `<Image>` or `<FastImage>`.
-### Taking Snapshots
-
-Compared to iOS, Cameras on Android tend to be slower in image capture. If you care about speed, you can use the Camera's [`takeSnapshot(...)`](/docs/api/classes/Camera#takesnapshot) function (Android only) which simply takes a snapshot of the Camera View instead of actually taking a photo through the Camera lens.
-
-```ts
-const snapshot = await camera.current.takeSnapshot({
- quality: 85,
- skipMetadata: true
-})
-```
-
-:::note
-While taking snapshots is faster than taking photos, the resulting image has way lower quality. You can combine both functions to create a snapshot to present to the user at first, then deliver the actual high-res photo afterwards.
-:::
-
-:::note
-The `takeSnapshot` function also works with `photo={false}`. For this reason VisionCamera will automatically fall-back to snapshot capture if you are trying to use more use-cases than the Camera natively supports. (see ["The `supportsParallelVideoProcessing` prop"](/docs/guides/devices#the-supportsparallelvideoprocessing-prop))
-:::
-
## Recording Videos
To start a video recording you first have to enable video capture:
@@ -108,6 +88,14 @@ await camera.current.stopRecording()
Once a recording has been stopped, the `onRecordingFinished` callback passed to the `startRecording` function will be invoked with a [`VideoFile`](/docs/api/interfaces/VideoFile) which you can then use to display in a [`