diff --git a/programming/android/user-guide-v2.6.1000.md b/programming/android/user-guide-v2.6.1000.md
new file mode 100644
index 0000000..771122d
--- /dev/null
+++ b/programming/android/user-guide-v2.6.1000.md
@@ -0,0 +1,404 @@
+---
+layout: default-layout
+title: Detect and Normalize Document - Android User Guide
+description: This page introduces how to detect and normalize documents with the Dynamsoft Capture Vision Android SDK.
+keywords: user guide, android, document scanner
+needAutoGenerateSidebar: true
+needGenerateH4Content: true
+noTitleIndex: true
+permalink: /programming/android/user-guide.html
+---
+
+# Android User Guide for Document Scanner Integration
+
+In this guide, you will learn step by step how to build a document scanner application with the Dynamsoft Capture Vision Android SDK.
+
+- [Android User Guide for Document Scanner Integration](#android-user-guide-for-document-scanner-integration)
+ - [Requirements](#requirements)
+ - [Add the SDK](#add-the-sdk)
+ - [Build Your First Application](#build-your-first-application)
+ - [Create a New Project](#create-a-new-project)
+ - [Include the Library](#include-the-library)
+ - [Initialize License](#initialize-license)
+ - [MainActivity for Realtime Document Normalization](#mainactivity-for-realtime-document-normalization)
+ - [Initialize Camera Module](#initialize-camera-module)
+ - [Initialize Capture Vision Router](#initialize-capture-vision-router)
+ - [Add a Captured Result Receiver and Filter](#add-a-captured-result-receiver-and-filter)
+ - [Start and Stop Video Document Normalization](#start-and-stop-video-document-normalization)
+ - [Additional Steps in MainActivity](#additional-steps-in-mainactivity)
+ - [ResultActivity for Displaying the Normalized Image](#resultactivity-for-displaying-the-normalized-image)
+ - [Display the Normalized Image](#display-the-normalized-image)
+ - [Build and Run the Project](#build-and-run-the-project)
+
+## Requirements
+
+- Supported OS: Android 5.0 (API Level 21) or higher.
+- Supported ABI: **armeabi-v7a**, **arm64-v8a**, **x86** and **x86_64**.
+- Development Environment: Android Studio 2022.2.1 or higher.
+
+## Add the SDK
+
+1. Open the file `[App Project Root Path]\app\build.gradle` and add the Maven repository:
+
+ ```groovy
+ allprojects {
+ repositories {
+ maven {
+ url "https://download2.dynamsoft.com/maven/aar"
+ }
+ }
+ }
+ ```
+
+2. Add the references in the dependencies:
+
+ ```groovy
+ dependencies {
+ implementation 'com.dynamsoft:dynamsoftcapturevisionbundle:2.6.1000'
+ }
+ ```
+
+ > Read more about the modules of [dynamsoftcapturevisionbundle]({{site.dcv_android_api}}index.html)
+
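+If your project was created with a recent Android Studio template, repositories may be managed centrally in `settings.gradle` instead of `build.gradle`. A minimal sketch of the equivalent configuration (the `google()` and `mavenCentral()` entries stand for whatever repositories your project already declares):
+
+```groovy
+dependencyResolutionManagement {
+    repositories {
+        google()
+        mavenCentral()
+        // Dynamsoft Maven repository hosting the capture vision bundle
+        maven {
+            url "https://download2.dynamsoft.com/maven/aar"
+        }
+    }
+}
+```
+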
+## Build Your First Application
+
+In this section, let's see how to create a HelloWorld app for detecting and normalizing documents from camera video input.
+
+>Note:
+>
+> - Android Studio 2022.2.1 is used here in this guide.
+> - You can find the complete source code of a similar HelloWorld app at the following links:
+> - [Java](https://github.com/Dynamsoft/capture-vision-mobile-samples/tree/main/android/DocumentScanner/AutoNormalize){:target="_blank"}.
+> - [Kotlin](https://github.com/Dynamsoft/capture-vision-mobile-samples/tree/main/android/DocumentScanner/AutoNormalizeKt){:target="_blank"}.
+
+### Create a New Project
+
+1. Open Android Studio, select **File > New > New Project**.
+
+2. Choose the correct template for your project. In this sample, we use **Empty Activity**.
+
+3. When prompted, choose your app name 'HelloWorld' and set the **Save** location, **Language**, and **Minimum SDK** (we use 21 here).
+ > Note:
+ >
+ > - With **minSdkVersion** set to 21, your app is compatible with more than 94.1% of devices on the Google Play Store (last update: March 2021).
+
+### Include the Library
+
+Add the SDK to your new project. Please read the [Add the SDK](#add-the-sdk) section for more details.
+
+### Initialize License
+
+1. Create a class `MyApplication` that extends `Application`. Import the `LicenseManager` class and initialize the license in the file `MyApplication.java`.
+
+ ```java
+    import android.app.Application;
+    import android.util.Log;
+
+    import com.dynamsoft.license.LicenseManager;
+
+    public class MyApplication extends Application {
+        @Override
+        public void onCreate() {
+            super.onCreate();
+            // Initialize the license. Replace the string with your own license key.
+            LicenseManager.initLicense("DLS2eyJvcmdhbml6YXRpb25JRCI6IjIwMDAwMSJ9", this, (isSuccess, error) -> {
+                if (!isSuccess) {
+                    Log.e("License", "InitLicense Error: " + error);
+                }
+            });
+        }
+    }
+ ```
+
+ >Note:
+ >
+    >- The license string here grants a time-limited free trial which requires a network connection to work.
+    >- You can request a 30-day trial license via the [Request a Trial License](https://www.dynamsoft.com/customer/license/trialLicense?product=ddn&utm_source=guide&package=android){:target="_blank"} link.
+
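+2. The `MyApplication` class only takes effect if it is registered in `AndroidManifest.xml` via the `android:name` attribute. A minimal sketch (the other attributes stand for whatever your project template already generated):
+
+    ```xml
+    <application
+        android:name=".MyApplication"
+        ... >
+        ...
+    </application>
+    ```
+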
+### MainActivity for Realtime Document Normalization
+
+#### Initialize Camera Module
+
+1. In the Project window, open **app > res > layout > `activity_main.xml`** and create a DCE camera view section under the root node.
+
+    ```xml
+    <!-- The layout attributes below are typical defaults; adjust them to fit your layout. -->
+    <com.dynamsoft.dce.CameraView
+        android:id="@+id/camera_view"
+        android:layout_width="match_parent"
+        android:layout_height="match_parent" />
+    ```
+
+2. Import the camera module, initialize the camera view and bind to the created Camera Enhancer instance in the file `MainActivity.java`.
+
+ ```java
+ ...
+
+ import com.dynamsoft.dce.CameraView;
+ import com.dynamsoft.dce.CameraEnhancer;
+ import com.dynamsoft.dce.CameraEnhancerException;
+ import com.dynamsoft.dce.utils.PermissionUtil;
+
+ public class MainActivity extends AppCompatActivity {
+ private CameraEnhancer mCamera;
+
+ @Override
+ protected void onCreate(Bundle savedInstanceState) {
+
+ ...
+
+ CameraView cameraView = findViewById(R.id.camera_view);
+ mCamera = new CameraEnhancer(cameraView, MainActivity.this);
+
+ PermissionUtil.requestCameraPermission(MainActivity.this);
+ }
+ }
+ ```
+
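+    > Note:
+    >
+    > - `PermissionUtil.requestCameraPermission` handles the runtime permission request. The `CAMERA` permission itself must also end up in the merged `AndroidManifest.xml`; if the camera fails to open on your device, declare it explicitly as shown below.
+
+    ```xml
+    <uses-permission android:name="android.permission.CAMERA" />
+    ```
+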
+#### Initialize Capture Vision Router
+
+1. Import and initialize the capture vision router, and set the created Camera Enhancer instance as the input image source.
+
+ ```java
+ ...
+
+ import com.dynamsoft.cvr.CaptureVisionRouter;
+ import com.dynamsoft.cvr.CaptureVisionRouterException;
+
+ public class MainActivity extends AppCompatActivity {
+
+ ...
+
+ private CaptureVisionRouter mRouter;
+
+ @Override
+ protected void onCreate(Bundle savedInstanceState) {
+
+ ...
+
+ mRouter = new CaptureVisionRouter(MainActivity.this);
+ try {
+ mRouter.setInput(mCamera);
+ } catch (CaptureVisionRouterException e) {
+ e.printStackTrace();
+ }
+ }
+ }
+ ```
+
+#### Add a Captured Result Receiver and Filter
+
+1. Add a result receiver to get the normalized image results.
+
+ ```java
+ ...
+
+    import android.content.Intent;
+
+    import com.dynamsoft.core.basic_structures.CapturedResultReceiver;
+    // EnumCrossVerificationStatus and NormalizedImageResultItem are used below; their package names may vary slightly between SDK releases.
+    import com.dynamsoft.core.basic_structures.EnumCrossVerificationStatus;
+    import com.dynamsoft.core.basic_structures.ImageData;
+    import com.dynamsoft.cvr.EnumPresetTemplate;
+    import com.dynamsoft.ddn.NormalizedImageResultItem;
+    import com.dynamsoft.ddn.NormalizedImagesResult;
+
+ public class MainActivity extends AppCompatActivity {
+
+ ...
+
+ public static ImageData mNormalizedImageData;
+ private boolean mJumpToOtherActivity = false;
+
+ @Override
+ protected void onCreate(Bundle savedInstanceState) {
+
+ ...
+
+ mRouter.addResultReceiver(new CapturedResultReceiver() {
+ @Override
+ public void onNormalizedImagesReceived(NormalizedImagesResult result) {
+ if (result.getItems().length > 0) {
+ NormalizedImageResultItem normalizedImageResultItem = result.getItems()[0];
+ if (normalizedImageResultItem.getCrossVerificationStatus() == EnumCrossVerificationStatus.CVS_PASSED || mJumpToOtherActivity)
+ {
+ mJumpToOtherActivity = false;
+ mNormalizedImageData = result.getItems()[0].getImageData();
+
+ Intent intent = new Intent(MainActivity.this, ResultActivity.class);
+ startActivity(intent);
+ }
+ }
+ }
+ });
+ }
+ }
+ ```
+
+2. Add a result cross filter to validate the normalized image result across multiple frames.
+
+ ```java
+ ...
+ import com.dynamsoft.core.basic_structures.EnumCapturedResultItemType;
+ import com.dynamsoft.utility.MultiFrameResultCrossFilter;
+
+ public class MainActivity extends AppCompatActivity {
+
+ ...
+
+ @Override
+ protected void onCreate(Bundle savedInstanceState) {
+
+ ...
+ MultiFrameResultCrossFilter filter = new MultiFrameResultCrossFilter();
+ filter.enableResultCrossVerification(EnumCapturedResultItemType.CRIT_NORMALIZED_IMAGE, true);
+ mRouter.addResultFilter(filter);
+ }
+ }
+ ```
+
+#### Start and Stop Video Document Normalization
+
+1. Override `MainActivity.onResume` to open the camera and start video document normalization, and override `MainActivity.onPause` to close the camera and stop video document normalization.
+
+ ```java
+    // CompletionListener is assumed to come from the core basic_structures package; adjust the import if your SDK version places it elsewhere.
+    import com.dynamsoft.core.basic_structures.CompletionListener;
+
+    public class MainActivity extends AppCompatActivity {
+        private static final String TAG = "MainActivity";
+
+        ...
+
+ @Override
+ public void onResume() {
+ super.onResume();
+ try {
+ mCamera.open();
+ mRouter.startCapturing(EnumPresetTemplate.PT_DETECT_AND_NORMALIZE_DOCUMENT, new CompletionListener() {
+ @Override
+ public void onSuccess() {
+
+ }
+ @Override
+ public void onFailure(int i, String s) {
+ Log.e(TAG, "onFailure: "+s);
+ }
+ });
+ } catch (CameraEnhancerException e) {
+ e.printStackTrace();
+ }
+ }
+
+ @Override
+ public void onPause() {
+ super.onPause();
+ try {
+ mCamera.close();
+ } catch (CameraEnhancerException e) {
+ e.printStackTrace();
+ }
+
+ mRouter.stopCapturing();
+ }
+ }
+ ```
+
+2. Add the `onCaptureBtnClick` function to confirm the capture. While capturing is running, the SDK processes the video frames from the Camera Enhancer and sends the normalized image results to the registered result receiver; tapping the button makes the receiver accept the next result even if it has not passed cross verification.
+
+ ```java
+ public class MainActivity extends AppCompatActivity {
+
+ ...
+
+ public void onCaptureBtnClick(View v) {
+ mJumpToOtherActivity = true;
+ }
+ }
+ ```
+
+#### Additional Steps in MainActivity
+
+1. In the Project window, open **app > res > layout > `activity_main.xml`** and add a button under the root node so the user can capture the quads detected on the image.
+
+    ```xml
+    ...
+    <!-- The layout attributes are placeholders; the essential part is binding android:onClick to onCaptureBtnClick. -->
+    <Button
+        android:id="@+id/btn_capture"
+        android:layout_width="wrap_content"
+        android:layout_height="wrap_content"
+        android:onClick="onCaptureBtnClick"
+        android:text="Capture" />
+    ```
+
+### ResultActivity for Displaying the Normalized Image
+
+#### Display the Normalized Image
+
+1. Create a new empty activity named `ResultActivity`.
+
+2. In the AndroidManifest.xml file, declare the `ResultActivity`.
+
+    ```xml
+    <application ... >
+
+        ...
+
+        <!-- Declare the result activity alongside your existing MainActivity entry. -->
+        <activity android:name=".ResultActivity" />
+
+    </application>
+    ```
+
+3. In the Project window, open **app > res > layout > `activity_result.xml`** and create an image view under the root node to display the result image (the attributes below are typical values; the `iv_normalize` id must match the code in `ResultActivity`).
+
+    ```xml
+    <ImageView
+        android:id="@+id/iv_normalize"
+        android:layout_width="match_parent"
+        android:layout_height="match_parent" />
+    ```
+
+4. Display the normalized image.
+
+ ```java
+ import android.os.Bundle;
+ import android.widget.ImageView;
+
+ import androidx.annotation.Nullable;
+ import androidx.appcompat.app.AppCompatActivity;
+ import com.dynamsoft.core.basic_structures.CoreException;
+
+ public class ResultActivity extends AppCompatActivity {
+
+ @Override
+ protected void onCreate(@Nullable Bundle savedInstanceState) {
+ super.onCreate(savedInstanceState);
+ setContentView(R.layout.activity_result);
+
+ ImageView ivNormalize = findViewById(R.id.iv_normalize);
+
+ try {
+ ivNormalize.setImageBitmap(MainActivity.mNormalizedImageData.toBitmap());
+ } catch (CoreException e) {
+ e.printStackTrace();
+ }
+ }
+ }
+ ```
+
+### Build and Run the Project
+
+1. Select the device that you want to run your app on from the target device drop-down menu in the toolbar.
+
+2. Click the **Run app** button. Android Studio installs your app on your connected device and starts it.
+
+You can find the source code of a similar HelloWorld app here:
+
+- [Java](https://github.com/Dynamsoft/capture-vision-mobile-samples/tree/main/android/DocumentScanner/AutoNormalize){:target="_blank"}.
+- [Kotlin](https://github.com/Dynamsoft/capture-vision-mobile-samples/tree/main/android/DocumentScanner/AutoNormalizeKt){:target="_blank"}.
diff --git a/programming/android/user-guide.md b/programming/android/user-guide.md
index 771122d..2b097a2 100644
--- a/programming/android/user-guide.md
+++ b/programming/android/user-guide.md
@@ -54,7 +54,7 @@ In this guide, you will learn step by step on how to build a document scanner ap
```groovy
dependencies {
- implementation 'com.dynamsoft:dynamsoftcapturevisionbundle:2.6.1000'
+ implementation 'com.dynamsoft:dynamsoftcapturevisionbundle:2.6.1001'
}
```
diff --git a/programming/ios/user-guide-v2.6.1000.md b/programming/ios/user-guide-v2.6.1000.md
new file mode 100644
index 0000000..9b5a860
--- /dev/null
+++ b/programming/ios/user-guide-v2.6.1000.md
@@ -0,0 +1,606 @@
+---
+layout: default-layout
+title: Detect and Normalize Document - iOS User Guide
+description: This page introduces how to detect and normalize documents with the Dynamsoft Capture Vision iOS SDK.
+keywords: user guide, iOS, document scanner
+needAutoGenerateSidebar: true
+needGenerateH4Content: true
+noTitleIndex: true
+multiProgrammingLanguage: true
+enableLanguageSelection: true
+---
+
+# iOS User Guide for Document Scanner Integration
+
+- [iOS User Guide for Document Scanner Integration](#ios-user-guide-for-document-scanner-integration)
+ - [System Requirements](#system-requirements)
+ - [Add the SDK](#add-the-sdk)
+ - [Add the xcframeworks via CocoaPods](#add-the-xcframeworks-via-cocoapods)
+ - [Add the xcframeworks via Swift Package Manager](#add-the-xcframeworks-via-swift-package-manager)
+ - [Build Your First Application](#build-your-first-application)
+ - [Create a New Project](#create-a-new-project)
+ - [Include the Library](#include-the-library)
+ - [Initialize License](#initialize-license)
+ - [Main ViewController for Realtime Detection of Quads](#main-viewcontroller-for-realtime-detection-of-quads)
+ - [Get Prepared with the Camera Module](#get-prepared-with-the-camera-module)
+ - [Initialize Capture Vision Router](#initialize-capture-vision-router)
+ - [Set up Result Receiver](#set-up-result-receiver)
+ - [Configure the methods viewDidLoad, viewWillAppear, and viewWillDisappear](#configure-the-methods-viewdidload-viewwillappear-and-viewwilldisappear)
+ - [Display the Normalized Image](#display-the-normalized-image)
+ - [Configure Camera Permissions](#configure-camera-permissions)
+ - [Additional Steps for iOS 12.x or Lower Versions](#additional-steps-for-ios-12x-or-lower-versions)
+ - [Build and Run the Project](#build-and-run-the-project)
+
+## System Requirements
+
+- Supported OS: iOS 11 or higher (iOS 13 and higher recommended).
+- Supported ABI: arm64 and x86_64.
+- Development Environment: Xcode 13 and above (Xcode 14.1+ recommended).
+
+## Add the SDK
+
+There are two ways to add the SDK to your project: via **CocoaPods** or via **Swift Package Manager**.
+
+### Add the xcframeworks via CocoaPods
+
+1. Add the frameworks in your **Podfile**.
+
+ ```sh
+ target 'HelloWorld' do
+ use_frameworks!
+
+ pod 'DynamsoftCaptureVisionBundle','2.6.1000'
+
+ end
+ ```
+
+2. Execute the pod command to install the frameworks and generate the workspace (**HelloWorld.xcworkspace**):
+
+ ```sh
+ pod install
+ ```
+
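+    > Note: after `pod install` completes, open the generated **HelloWorld.xcworkspace** (not the `.xcodeproj`) so the installed pods are visible to the project.
+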
+### Add the xcframeworks via Swift Package Manager
+
+1. In your Xcode project, go to **File > Add Packages**.
+
+2. In the top-right section of the window, search "https://github.com/Dynamsoft/capture-vision-spm"
+
+3. Select `capture-vision-spm`, choose `Exact version`, enter **2.6.1000**, then click **Add Package**.
+
+4. Check all the frameworks and add them to your project.
+
+
+
+## Build Your First Application
+
+This guide will walk you through the process of creating a HelloWorld app for detecting and normalizing documents via a camera video input.
+
+>Note:
+>
+> - Xcode 14.0 is used in this guide.
+> - You can get the source code of the HelloWorld app from the following link
+> - [Objective-C](https://github.com/Dynamsoft/capture-vision-mobile-samples/tree/main/ios/DocumentScanner/AutoNormalizeObjc){:target="_blank"}.
+> - [Swift](https://github.com/Dynamsoft/capture-vision-mobile-samples/tree/main/ios/DocumentScanner/AutoNormalize){:target="_blank"}.
+
+### Create a New Project
+
+1. Open Xcode and select **Create a new Xcode project**.
+
+2. Select **iOS -> App** for your application.
+
+3. Input your product name (HelloWorld), interface (Storyboard), and language (Objective-C/Swift).
+
+4. Click on the **Next** button and select the location to save the project.
+
+5. Click on the **Create** button to finish.
+
+### Include the Library
+
+Add the SDK to your new project. Please read the [Add the SDK](#add-the-sdk) section for more details.
+
+### Initialize License
+
+In your **ViewController** file, add the following code to initialize the license.
+
+
+>- Objective-C
+>- Swift
+>
+>1.
+```objc
+// Import the DynamsoftLicense module to init license
+#import <DynamsoftLicense/DynamsoftLicense.h>
+// Add LicenseVerificationListener to the interface
+@interface ViewController () <DSLicenseVerificationListener>
+- (void)setLicense{
+ [DSLicenseManager initLicense:@"DLS2eyJvcmdhbml6YXRpb25JRCI6IjIwMDAwMSJ9" verificationDelegate:self];
+}
+-(void)onLicenseVerified:(BOOL)isSuccess error:(NSError *)error
+{
+ NSLog(@"On License Verified");
+ if (!isSuccess)
+ {
+        NSLog(@"%@", error.localizedDescription);
+    } else
+ {
+ NSLog(@"License approved");
+ }
+}
+...
+@end
+```
+2.
+```swift
+// Import the DynamsoftLicense module to init license
+import DynamsoftLicense
+// Add LicenseVerificationListener to the interface
+class ViewController: UIViewController, LicenseVerificationListener {
+ func setLicense(){
+ LicenseManager.initLicense("DLS2eyJvcmdhbml6YXRpb25JRCI6IjIwMDAwMSJ9", verificationDelegate: self)
+ }
+ func onLicenseVerified(_ isSuccess: Bool, error: Error?) {
+ // Add your code to do when license server returns.
+ if let error = error, !isSuccess{
+ print(error.localizedDescription)
+ }
+ }
+ ...
+}
+```
+
+>Note:
+>
+>- Network connection is required for the license to work.
+>- The license string here will grant you a time-limited trial license.
+>- You can request a 30-day trial license via the [Request a Trial License](https://www.dynamsoft.com/customer/license/trialLicense?product=ddn&utm_source=guide&package=ios){:target="_blank"} link
+
+
+
+### Main ViewController for Realtime Detection of Quads
+
+In the main view controller, your app will scan documents via video streaming and display the detected quadrilateral area on the screen. First of all, import the headers in the ViewController file.
+
+
+ >- Objective-C
+ >- Swift
+ >
+ >1.
+ ```objc
+ #import <DynamsoftCore/DynamsoftCore.h>
+ #import <DynamsoftCaptureVisionRouter/DynamsoftCaptureVisionRouter.h>
+ #import <DynamsoftDocumentNormalizer/DynamsoftDocumentNormalizer.h>
+ #import <DynamsoftUtility/DynamsoftUtility.h>
+ #import <DynamsoftCameraEnhancer/DynamsoftCameraEnhancer.h>
+ ```
+ 2.
+ ```swift
+ import DynamsoftCore
+ import DynamsoftCaptureVisionRouter
+ import DynamsoftDocumentNormalizer
+ import DynamsoftUtility
+ import DynamsoftCameraEnhancer
+ ```
+
+#### Get Prepared with the Camera Module
+
+Create the instances of `CameraEnhancer` and `CameraView`.
+
+
+>- Objective-C
+>- Swift
+>
+>1.
+```objc
+@property (nonatomic, strong) DSCameraEnhancer *dce;
+@property (nonatomic, strong) DSCameraView *cameraView;
+...
+- (void)setUpCamera
+{
+ _cameraView = [[DSCameraView alloc] initWithFrame:self.view.bounds];
+ [_cameraView setAutoresizingMask:UIViewAutoresizingFlexibleWidth];
+ [self.view addSubview:_cameraView];
+ [_cameraView addSubview:_captureButton];
+ _dce = [[DSCameraEnhancer alloc] init];
+ [_dce setCameraView:_cameraView];
+ DSDrawingLayer * layer = [_cameraView getDrawingLayer:DSDrawingLayerIdDDN];
+ [layer setVisible:true];
+ // You can enable the frame filter feature of Dynamsoft Camera Enhancer.
+ //[_dce enableEnhancedFeatures:DSEnhancerFeatureFrameFilter];
+}
+```
+2.
+```swift
+var cameraView:CameraView!
+var dce:CameraEnhancer!
+...
+func setUpCamera() {
+ // Create a camera view and add it as a sub view of the current view.
+ cameraView = .init(frame: view.bounds)
+ cameraView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
+ view.insertSubview(cameraView, at: 0)
+ // Bind the camera enhancer with the camera view.
+ dce = CameraEnhancer()
+ dce.cameraView = cameraView
+ // Additional step: Highlight the detected document boundary.
+ let layer = cameraView.getDrawingLayer(DrawingLayerId.DDN.rawValue)
+ layer?.visible = true
+ // You can enable the frame filter feature of Dynamsoft Camera Enhancer.
+ // dce.enableEnhancedFeatures(.frameFilter)
+}
+```
+
+#### Initialize Capture Vision Router
+
+Once the camera component is set up, declare and create an instance of `CaptureVisionRouter`; you will then set its input to the Camera Enhancer object created in the previous step.
+
+
+>- Objective-C
+>- Swift
+>
+>1.
+```objc
+@property (nonatomic, strong) DSCaptureVisionRouter *cvr;
+...
+- (void)setUpCvr
+{
+ _cvr = [[DSCaptureVisionRouter alloc] init];
+}
+```
+2.
+```swift
+var cvr:CaptureVisionRouter!
+func setUpCvr() {
+ cvr = CaptureVisionRouter()
+}
+```
+
+Bind your `CaptureVisionRouter` instance with the created `CameraEnhancer` instance.
+
+
+>- Objective-C
+>- Swift
+>
+>1.
+```objc
+- (void)setUpCvr
+{
+ ...
+ NSError *cvrError;
+ [_cvr setInput:_dce error:&cvrError];
+}
+```
+2.
+```swift
+func setUpCvr() {
+    ...
+    try? cvr.setInput(dce)
+}
+```
+
+#### Set up Result Receiver
+
+1. Add `CapturedResultReceiver` to your ViewController.
+
+
+ >- Objective-C
+ >- Swift
+ >
+ >1.
+ ```objc
+    @interface ViewController () <DSLicenseVerificationListener, DSCapturedResultReceiver>
+ ```
+ 2.
+ ```swift
+ class ViewController: UIViewController, LicenseVerificationListener, CapturedResultReceiver {
+ ...
+ }
+ ```
+
+2. Implement the `onNormalizedImagesReceived` method to receive the normalized images as the captured results.
+
+
+ >- Objective-C
+ >- Swift
+ >
+ >1.
+ ```objc
+ // import ImageViewController.h. It will be implemented later.
+ #import "ImageViewController.h"
+ -(void)onNormalizedImagesReceived:(DSNormalizedImagesResult *)result
+ {
+    if (result != nil && result.items.count > 0 && result.items[0].imageData != nil && (_implementCapture || result.items[0].crossVerificationStatus == DSCrossVerificationStatusPassed))
+ {
+ NSLog(@"Capture confirmed");
+ _implementCapture = false;
+ dispatch_async(dispatch_get_main_queue(), ^{
+ [self.cvr stopCapturing];
+ ImageViewController *imageViewController = [[ImageViewController alloc] init];
+ NSError * error;
+ imageViewController.normalizedImage = [result.items[0].imageData toUIImage:&error];
+ NSLog(@"UIImage set");
+ [self.navigationController pushViewController:imageViewController animated:YES];
+ });
+ }
+ }
+ ```
+ 2.
+ ```swift
+ func onNormalizedImagesReceived(_ result: NormalizedImagesResult) {
+ if let item = result.items?.first {
+ if item.crossVerificationStatus == .passed || implementCapture {
+ guard let data = item.imageData else { return }
+ implementCapture = false
+ cvr.stopCapturing()
+ DispatchQueue.main.async {
+ let resultView = ImageViewController()
+ resultView.normalizedImage = try? data.toUIImage()
+ self.navigationController?.pushViewController(resultView, animated: true)
+ }
+ }
+ }
+ }
+ ```
+
+3. Add the result receiver and a multi-frame result cross filter to the `CaptureVisionRouter`.
+
+
+ >- Objective-C
+ >- Swift
+ >
+ >1.
+ ```objc
+ - (void)setUpCvr
+ {
+ ...
+ [_cvr addResultReceiver:self];
+ DSMultiFrameResultCrossFilter *filter = [[DSMultiFrameResultCrossFilter alloc] init];
+ [filter enableResultCrossVerification:DSCapturedResultItemTypeNormalizedImage isEnabled:true];
+ [_cvr addResultFilter:filter];
+ }
+ ```
+ 2.
+ ```swift
+ func setUpCvr() {
+ ...
+ cvr.addResultReceiver(self)
+ let filter = MultiFrameResultCrossFilter.init()
+ filter.enableResultCrossVerification(.normalizedImage, isEnabled: true)
+ cvr.addResultFilter(filter)
+ }
+ ```
+
+4. Add a `confirmCapture` button to confirm the result.
+
+
+ >- Objective-C
+ >- Swift
+ >
+ >1.
+ ```objc
+ @property (nonatomic, strong) UIButton *captureButton;
+ @property (nonatomic) BOOL implementCapture;
+ ...
+ - (void)addCaptureButton {
+ [self.view addSubview:self.captureButton];
+ }
+ - (UIButton *)captureButton {
+ NSLog(@"Start adding button");
+ CGFloat screenWidth = [UIScreen mainScreen].bounds.size.width;
+ CGFloat screenHeight = [UIScreen mainScreen].bounds.size.height;
+ if (!_captureButton) {
+ _captureButton = [UIButton buttonWithType:UIButtonTypeCustom];
+ _captureButton.frame = CGRectMake((screenWidth - 150) / 2.0, screenHeight - 100, 150, 50);
+ _captureButton.backgroundColor = [UIColor grayColor];
+ _captureButton.layer.cornerRadius = 10;
+ _captureButton.layer.borderColor = [UIColor darkGrayColor].CGColor;
+ [_captureButton setTitle:@"Capture" forState:UIControlStateNormal];
+ [_captureButton setTitleColor:[UIColor whiteColor] forState:UIControlStateNormal];
+ [_captureButton addTarget:self action:@selector(setCapture) forControlEvents:UIControlEventTouchUpInside];
+ }
+ return _captureButton;
+ }
+ - (void)setCapture
+ {
+ _implementCapture = true;
+ }
+ ```
+ 2.
+ ```swift
+ var captureButton:UIButton!
+ var implementCapture:Bool = false
+ ...
+ func addCaptureButton()
+ {
+ let w = UIScreen.main.bounds.size.width
+ let h = UIScreen.main.bounds.size.height
+ let SafeAreaBottomHeight: CGFloat = UIApplication.shared.statusBarFrame.size.height > 20 ? 34 : 0
+ let photoButton = UIButton(frame: CGRect(x: w / 2 - 60, y: h - 100 - SafeAreaBottomHeight, width: 120, height: 60))
+ photoButton.setTitle("Capture", for: .normal)
+ photoButton.backgroundColor = UIColor.green
+ photoButton.addTarget(self, action: #selector(confirmCapture), for: .touchUpInside)
+ DispatchQueue.main.async(execute: { [self] in
+ view.addSubview(photoButton)
+ })
+ }
+ @objc func confirmCapture()
+ {
+ implementCapture = true
+ }
+ ```
+
+#### Configure the methods viewDidLoad, viewWillAppear, and viewWillDisappear
+
+
+>- Objective-C
+>- Swift
+>
+>1.
+```objc
+- (void)viewDidLoad {
+ [super viewDidLoad];
+ [self setLicense];
+ [self setUpCamera];
+ [self setUpCvr];
+ [self addCaptureButton];
+}
+- (void)viewWillAppear:(BOOL)animated
+{
+ [super viewWillAppear:animated];
+ [_dce open];
+ [_cvr startCapturing:DSPresetTemplateDetectAndNormalizeDocument completionHandler:^(BOOL isSuccess, NSError * _Nullable error) {
+ if (!isSuccess && error != nil) {
+ NSLog(@"%@", error.localizedDescription);
+ }
+ }];
+}
+- (void)viewWillDisappear:(BOOL)animated
+{
+    [super viewWillDisappear:animated];
+ [_dce close];
+}
+```
+2.
+```swift
+override func viewDidLoad() {
+ super.viewDidLoad()
+ setLicense()
+ setUpCamera()
+ setUpCvr()
+ addCaptureButton()
+}
+override func viewWillAppear(_ animated: Bool) {
+ super.viewWillAppear(animated)
+ dce.open()
+ cvr.startCapturing(PresetTemplate.detectAndNormalizeDocument.rawValue){ isSuccess, error in
+ if let error = error, !isSuccess {
+ print("Capture start failed")
+ print(error.localizedDescription)
+ }
+ }
+}
+override func viewWillDisappear(_ animated: Bool) {
+ super.viewWillDisappear(animated)
+ dce.close()
+}
+```
+
+#### Display the Normalized Image
+
+1. Create a new `UIViewController` class `ImageViewController`.
+
+2. Add a property `normalizedImage` to the header file of `ImageViewController` (Objective-C only).
+
+ ```objc
+    #import <UIKit/UIKit.h>
+
+ NS_ASSUME_NONNULL_BEGIN
+
+ @interface ImageViewController : UIViewController
+
+ @property (nonatomic, strong) UIImage *normalizedImage;
+
+ @end
+
+ NS_ASSUME_NONNULL_END
+ ```
+
+3. Configure the `ImageViewController` to display the normalized image.
+
+
+ >- Objective-C
+ >- Swift
+ >
+ >1.
+ ```objc
+ #import "ImageViewController.h"
+ @interface ImageViewController()
+ @property (nonatomic, strong) UIImageView *imageView;
+ @end
+ @implementation ImageViewController
+ -(void)viewDidLoad
+ {
+ NSLog(@"ImageViewController loaded");
+ [super viewDidLoad];
+ [self setUpView];
+ }
+ - (void)setUpView
+ {
+ _imageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
+ [self.view addSubview:_imageView];
+ [_imageView setContentMode:UIViewContentModeScaleAspectFit];
+ dispatch_async(dispatch_get_main_queue(), ^{
+ [self.imageView setImage:self.normalizedImage];
+ });
+ }
+ @end
+ ```
+ 2.
+ ```swift
+ import UIKit
+ import DynamsoftCore
+ import DynamsoftCaptureVisionRouter
+ import DynamsoftDocumentNormalizer
+ class ImageViewController: UIViewController {
+ var normalizedImage:UIImage!
+ var imageView:UIImageView!
+ override func viewDidLoad() {
+ super.viewDidLoad()
+ setUpView()
+ }
+ func setUpView() {
+ imageView = UIImageView.init(frame: view.bounds)
+ imageView.contentMode = .scaleAspectFit
+ view.addSubview(imageView)
+ DispatchQueue.main.async { [self] in
+ imageView.image = normalizedImage
+ }
+ }
+ }
+ ```
+
+4. Go to your **Main.storyboard** and embed the main ViewController in a **Navigation Controller** (Editor > Embed In > Navigation Controller) so that the result page can be pushed onto the navigation stack.
+
+
+
+### Configure Camera Permissions
+
+Add **Privacy - Camera Usage Description** to the `Info.plist` of your project to request camera permission. An easy way to do this is to open your project settings, go to the *Info* tab, and add this privacy property to the iOS target properties list.
+
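+If you prefer editing the plist source directly, the same setting corresponds to the `NSCameraUsageDescription` key (the description string below is only an example):
+
+```xml
+<key>NSCameraUsageDescription</key>
+<string>The camera is used to scan documents.</string>
+```
+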
+### Additional Steps for iOS 12.x or Lower Versions
+
+If your iOS version is 12.x or lower, take the following additional steps:
+
+1. Remove the methods `application:didDiscardSceneSessions:` and `application:configurationForConnectingSceneSession:options:` from your `AppDelegate` file.
+2. Remove the `SceneDelegate.swift` file (`SceneDelegate.h` & `SceneDelegate.m` for Objective-C).
+3. Remove the `Application Scene Manifest` entry from your `Info.plist` file.
+4. Declare the window in your `AppDelegate.swift` file (`AppDelegate.h` for Objective-C).
+
+
+ >- Objective-C
+ >- Swift
+ >
+ >1.
+ ```objc
+    @interface AppDelegate : UIResponder <UIApplicationDelegate>
+ @property (strong, nonatomic) UIWindow *window;
+ @end
+ ```
+ 2.
+ ```swift
+ import UIKit
+ @main
+ class AppDelegate: UIResponder, UIApplicationDelegate {
+ var window: UIWindow?
+ }
+ ```
+
+### Build and Run the Project
+
+1. Select the device that you want to run your app on.
+2. Run the project; Xcode will install and launch the app on your device.
+
+> Note:
+>
+> - You can get the source code of the HelloWorld app from the following link
+> - [Objective-C](https://github.com/Dynamsoft/capture-vision-mobile-samples/tree/main/ios/DocumentScanner/AutoNormalizeObjc){:target="_blank"}.
+> - [Swift](https://github.com/Dynamsoft/capture-vision-mobile-samples/tree/main/ios/DocumentScanner/AutoNormalize){:target="_blank"}.
diff --git a/programming/ios/user-guide.md b/programming/ios/user-guide.md
index 9b5a860..36bd3c5 100644
--- a/programming/ios/user-guide.md
+++ b/programming/ios/user-guide.md
@@ -49,7 +49,7 @@ There are two ways to add the SDK into your project - **CocoaPods**, or via **Sw
target 'HelloWorld' do
use_frameworks!
- pod 'DynamsoftCaptureVisionBundle','2.6.1000'
+ pod 'DynamsoftCaptureVisionBundle','2.6.1001'
end
```
@@ -66,7 +66,7 @@ There are two ways to add the SDK into your project - **CocoaPods**, or via **Sw
2. In the top-right section of the window, search "https://github.com/Dynamsoft/capture-vision-spm"
-3. Select `capture-vision-spm`, choose `Exact version`, enter **2.6.1000**, then click **Add Package**.
+3. Select `capture-vision-spm`, choose `Exact version`, enter **2.6.1001**, then click **Add Package**.
4. Check all the frameworks and add.